

We have regular requests from users to be able to play back videos or image sequences in an accurate way. Usage can be the playback of 2D animations or 2D effects (for example when mixing 2D and 3D animation), or to record a video plate in sync with graphics in Unity. We currently have not started any work to have built-in support for that in our video player. The gist of it is that many platform-supplied playback engines follow their own internal clock and don't give the option to sync to an external source. To get this capability, we need a different implementation that is likely to have lower performance. It's a performance-vs-flexibility trade-off.

In the meantime, you have various solutions depending on your use case:

- HAP playback (again Keijiro's amazing contributions!):
- Use the trick described at the bottom of this thread to sync any video player with your rendering: (the video source frame rate and the target frame rate need to match then)
- Convert your video to an image sequence and use the new Streaming Image Sequence package (Preview):

We would be interested to get your use cases, your needs, and your feedback on the alternatives above.

Hello! I can share some of our video playback experiences in our B2B projects. We're constantly balancing between providing striking visuals and the often quite low-end business laptops that the applications are usually run on. We have a skilled VFX and mo-graph team in-house, and we would really benefit if we could utilize them to the fullest.

Our latest use cases (done on 2019.2) have involved:

- Looping video in the background and showing canvas UI on top. This mostly works, but I do wish the video player could render directly to RawImage without requiring RenderTextures to be set up. Every clip requires its own RenderTexture asset, because you sometimes see a frame of another clip before the right one loads. It's a hassle and prone to mistakes during development.
- Using WebM with alpha to display animations that are too complicated to build by other means. I recall this causing performance issues, so it's off the table for now.
- Using PNG sequences for infographic animations. A common challenge was packing the sprites optimally and keeping the text from blurring. We're experimenting with slicing each frame into smaller chunks, checking whether each chunk has changed from the previous frame, and reusing previous chunks as much as possible. (We got the atlas count down by about 60%, but there are still issues to be sorted out.)
- Matching video and timeline animations. A common need is to be able to reliably play, loop and pause between sections in a film and control the flow and pacing. This, well, did not work too well as you might guess, but we will look into the HAP repo you linked. Also, if we had a frame-perfect match with Timeline, we could use the Unity text rendering to support more screen sizes (anything between 800px wide and 4K).

In an earlier project, we experimented with SVG sequences. I recall the scene loading improved compared to PNG sequences in that case, but the lack of anti-aliasing was a bit jarring to the eye.

Those are some use cases that I can recall. Hope this helps.

Thanks, very good list of use cases! If possible, could you give more context on the project(s) themselves for us to get the full picture?
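The chunk-reuse idea mentioned in the thread (slice each frame into tiles, detect which tiles changed, and share unchanged tiles across frames) can be sketched roughly like this. This is a hypothetical illustration, not the poster's actual pipeline: frames are simplified to flat grayscale byte buffers, tiles are deduplicated by hash, and each frame keeps only a layout mapping tile positions to indices in a shared atlas.

```python
import hashlib

TILE = 64  # tile edge length in pixels (an assumed value)

def tiles(frame, w, h):
    # frame: bytes-like grayscale buffer of size w*h (simplified;
    # real frames would be RGBA). Yields ((x, y), tile_bytes) pairs,
    # clamping tiles at the right and bottom edges.
    for y in range(0, h, TILE):
        for x in range(0, w, TILE):
            rows = [frame[(y + dy) * w + x : (y + dy) * w + x + min(TILE, w - x)]
                    for dy in range(min(TILE, h - y))]
            yield (x, y), b"".join(rows)

def pack_sequence(frames, w, h):
    """Return (atlas, per-frame layouts). Tiles seen in an earlier
    frame are not stored again: each layout maps a tile position to
    an index into the shared atlas."""
    atlas, seen, layouts = [], {}, []
    for frame in frames:
        layout = {}
        for pos, data in tiles(frame, w, h):
            key = hashlib.sha1(data).hexdigest()
            if key not in seen:          # first time this tile content appears
                seen[key] = len(atlas)
                atlas.append(data)
            layout[pos] = seen[key]      # reuse the stored tile
        layouts.append(layout)
    return atlas, layouts
```

For a sequence where many frames repeat large static regions, the atlas holds each distinct tile once, which is the effect behind the atlas-count reduction described in the thread.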
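The sync trick alluded to above boils down to not letting the player free-run: pause its internal clock and derive the frame index from your own render clock each tick. A minimal sketch of that mapping, in Python with a hypothetical helper (in Unity you would assign the result to the paused `VideoPlayer.frame` property each frame); as the thread notes, this only stays in lockstep when the source and target frame rates match:

```python
def frame_for_time(t_seconds, video_fps, frame_count, loop=True):
    """Map an external clock time (seconds) to a video frame index.

    With loop=True the sequence wraps around, matching a looping
    background video; otherwise it clamps to the last frame.
    """
    idx = int(t_seconds * video_fps)
    if loop:
        return idx % frame_count
    return min(idx, frame_count - 1)
```

Driving playback this way also gives the reliable play/loop/pause-between-sections control asked for in the thread, since the app's clock, not the platform decoder, decides which frame is shown.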
