As it stands, Unity* software does not provide hardware-accelerated decode for video playback, so playing video on a low-end device can suffer from dropped frames, low frame rates, and poor quality. The provided project illustrates how to integrate a hardware decode path into Unity software. The main components of this solution are the creation of an external video texture mapped into Unity software, the creation of the media player, and the calls exposed to Unity software.
Video Texture Creation
To leverage the media player component, an external texture is created and shared with the Unity software rendering component. To gain access to the Unity software rendering component, we use IUnityInterface to retrieve the IUnityGraphicsD3D11 interface, which exposes the D3D11 device Unity software renders with. That device is used to call CreateTexture2D, creating a texture on the Unity software device; CreateTexture2D returns the ID3D11Texture2D interface for the new texture.
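As a rough sketch (function names, dimensions, and the BGRA format here are illustrative assumptions, not the project's exact code), this step might look like the following: retrieve the Unity software D3D11 device through the plugin interfaces, then create a shareable texture on it.

```cpp
#include <d3d11.h>
#include "IUnityInterface.h"
#include "IUnityGraphicsD3D11.h"

static ID3D11Device* s_device = nullptr;

// Unity software calls this when the plugin loads; grab the D3D11 device.
extern "C" void UNITY_INTERFACE_EXPORT UNITY_INTERFACE_API
UnityPluginLoad(IUnityInterfaces* unityInterfaces)
{
    IUnityGraphicsD3D11* d3d11 = unityInterfaces->Get<IUnityGraphicsD3D11>();
    s_device = d3d11->GetDevice();
}

// Create a texture that can be shared with the media playback pipeline.
HRESULT CreatePlaybackTexture(UINT width, UINT height, ID3D11Texture2D** outTexture)
{
    D3D11_TEXTURE2D_DESC desc = {};
    desc.Width = width;
    desc.Height = height;
    desc.MipLevels = 1;
    desc.ArraySize = 1;
    desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    desc.SampleDesc.Count = 1;
    desc.Usage = D3D11_USAGE_DEFAULT;
    desc.BindFlags = D3D11_BIND_SHADER_RESOURCE | D3D11_BIND_RENDER_TARGET;
    desc.MiscFlags = D3D11_RESOURCE_MISC_SHARED; // required for cross-device sharing
    return s_device->CreateTexture2D(&desc, nullptr, outTexture);
}
```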
The D3D device is then used to call CreateShaderResourceView, passing in the ID3D11Texture2D that will serve as the input to the shader. CreateShaderResourceView returns an ID3D11ShaderResourceView, which provides access to the shader-resource-view interface used to read the data of the passed-in resource; in this case, the ID3D11Texture2D texture.
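A minimal sketch of that step, assuming the device and texture from the previous step (names are illustrative):

```cpp
#include <d3d11.h>

// Wrap the playback texture in a shader resource view so a shader can
// sample it; the format must match the texture's format.
HRESULT CreatePlaybackSRV(ID3D11Device* device, ID3D11Texture2D* texture,
                          ID3D11ShaderResourceView** outView)
{
    D3D11_SHADER_RESOURCE_VIEW_DESC desc = {};
    desc.Format = DXGI_FORMAT_B8G8R8A8_UNORM;
    desc.ViewDimension = D3D11_SRV_DIMENSION_TEXTURE2D;
    desc.Texture2D.MipLevels = 1;
    return device->CreateShaderResourceView(texture, &desc, outView);
}
```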
At this point, a shared texture has been created and a handle to it is returned. The handle is used to open the shared resource (ID3D11Texture2D), from which the texture's IDXGISurface surface is retrieved. CreateDirect3D11SurfaceFromDXGISurface is then called to create an instance of IDirect3DSurface from the IDXGISurface. As each video frame becomes available, it is copied to this IDirect3DSurface video surface which, as stated, is shared with the Unity software rendering pipeline.
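The sharing step might look like the sketch below (C++/WinRT, with illustrative names): retrieve the shared handle from the Unity software texture, open the resource on the media player's device, and wrap its DXGI surface as an IDirect3DSurface.

```cpp
#include <d3d11.h>
#include <dxgi.h>
#include <inspectable.h>
#include <windows.graphics.directx.direct3d11.interop.h>
#include <winrt/Windows.Graphics.DirectX.Direct3D11.h>

using winrt::Windows::Graphics::DirectX::Direct3D11::IDirect3DSurface;

// Retrieve the shared handle from the texture created on the Unity device.
HRESULT GetSharedTextureHandle(ID3D11Texture2D* unityTexture, HANDLE* outHandle)
{
    winrt::com_ptr<IDXGIResource> resource;
    HRESULT hr = unityTexture->QueryInterface(resource.put());
    if (FAILED(hr)) return hr;
    return resource->GetSharedHandle(outHandle);
}

// Open the shared texture on the media player's device and wrap its DXGI
// surface as an IDirect3DSurface that decoded frames can be copied into.
HRESULT CreateVideoSurface(ID3D11Device* mediaDevice, HANDLE sharedHandle,
                           IDirect3DSurface& outSurface)
{
    winrt::com_ptr<ID3D11Texture2D> texture;
    HRESULT hr = mediaDevice->OpenSharedResource(
        sharedHandle, __uuidof(ID3D11Texture2D), texture.put_void());
    if (FAILED(hr)) return hr;

    winrt::com_ptr<IDXGISurface> dxgiSurface = texture.as<IDXGISurface>();

    winrt::com_ptr<::IInspectable> inspectable;
    hr = CreateDirect3D11SurfaceFromDXGISurface(dxgiSurface.get(), inspectable.put());
    if (FAILED(hr)) return hr;

    outSurface = inspectable.as<IDirect3DSurface>();
    return S_OK;
}
```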
Download [ZIP 7.5 MB]
Media Player
To begin, we first create a new instance of the Windows Media* Playback MediaPlayer interface, which provides access to media playback functionality such as play, pause, and seek. Once we have the interface, we load content by calling MediaSource.CreateFromUri, which creates a MediaSource from the passed-in Uri; MediaSource provides a means of accessing media data.
We then associate the MediaSource with a MediaPlaybackItem, which acts as a wrapper around a MediaSource and exposes the audio and video tracks within it.
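Using C++/WinRT, the creation and loading steps might look like this sketch (the URL is a placeholder):

```cpp
#include <winrt/Windows.Foundation.h>
#include <winrt/Windows.Media.Core.h>
#include <winrt/Windows.Media.Playback.h>

using namespace winrt;
using namespace Windows::Foundation;
using namespace Windows::Media::Core;
using namespace Windows::Media::Playback;

int main()
{
    init_apartment();

    // Create the media player; it exposes playback control
    // (Play, Pause, Position for seeking, PlaybackRate, and so on).
    MediaPlayer player;

    // Create a MediaSource from a URI, then wrap it in a
    // MediaPlaybackItem to expose its audio and video tracks.
    MediaSource source = MediaSource::CreateFromUri(
        Uri(L"https://example.com/video.mp4")); // placeholder URL
    MediaPlaybackItem item(source);

    player.Source(item);
    player.Play();
}
```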
Unity* Software Integration
The purpose of this project is to detail the integration of hardware-accelerated video in Unity software, how it is achieved, and how you can leverage the code to integrate accelerated video into your own project. A wrapper is created to expose the relevant calls to create, load, and interact with the media. While not within this article’s scope, more information on creating and exposing native dynamic libraries to Unity software can be found here. The project does not expose every available feature of the media player, but it does expose the core functionality needed to provide a jumping-off point for further expansion with new features. The following functionality is currently exposed and available for integration (a sketch of a possible export surface follows the list):
- CreateMediaPlayback: Create media player instance
- CreatePlaybackTexture: Return a pointer to the native playback texture
- LoadContent: Load from Uri
- SetPosition: Set current playback position (seek)
- GetPosition: Get current playback position
- GetDuration: Retrieve the video duration
- GetPlaybackRate: Retrieve current playback speed
- StateChangedCallback: Media playback change event callback
- Play: Play video
- Stop: Stop video
- Pause: Pause video
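For orientation, the native plugin's export surface for these calls might look like the sketch below; the signatures are illustrative assumptions, not the project's exact API, and Unity software scripts would bind to them via [DllImport].

```cpp
#include <cstdint>

extern "C" {
    __declspec(dllexport) int32_t CreateMediaPlayback();
    __declspec(dllexport) int32_t CreatePlaybackTexture(uint32_t width,
                                                        uint32_t height,
                                                        void** texturePtr);
    __declspec(dllexport) int32_t LoadContent(const wchar_t* sourceUrl);
    __declspec(dllexport) int32_t Play();
    __declspec(dllexport) int32_t Pause();
    __declspec(dllexport) int32_t Stop();
    __declspec(dllexport) int32_t SetPosition(int64_t position);
    __declspec(dllexport) int32_t GetPosition(int64_t* position);
    __declspec(dllexport) int32_t GetDuration(int64_t* duration);
    __declspec(dllexport) int32_t GetPlaybackRate(double* rate);

    // Callback registration for media playback state-change events.
    typedef void (*StateChangedCallback)(int32_t newState);
    __declspec(dllexport) void RegisterStateChangedCallback(StateChangedCallback callback);
}
```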
*Other names and brands may be claimed as the property of others.