# Virtual Production
## Virtual Production Software

### Assimilate LiveFX

https://www.assimilateinc.com/products/livefx/

### Unreal Engine

https://www.unrealengine.com/en-US/virtual-production

### Unreal nDisplay

https://docs.unrealengine.com/4.26/en-US/WorkingWithMedia/nDisplay/Overview/

### Unreal Composure

https://docs.unrealengine.com/en-US/Engine/Composure/index.html

### OWL Streaming Toolkit for Unreal
Off World produces a streaming toolkit that makes Unreal-based virtual production workflows more artist-friendly: low-latency NDI video streams rendered in Unreal, complete with RGBA data, can be passed directly into Assimilate LiveFX.
https://offworld.live/products/unreal-engine-live-streaming-toolkit
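The reason an RGBA (alpha-carrying) stream matters is that it lets the downstream application composite the Unreal render over live camera video. As a generic illustration (not OWL or LiveFX code), here is the standard straight-alpha "over" operator applied to a single pixel:

```python
def over(fg_rgba, bg_rgb):
    """Composite a straight-alpha RGBA foreground pixel over an opaque RGB background.

    fg_rgba: (r, g, b, a) with r/g/b in 0..255 and alpha a in 0.0..1.0
    bg_rgb:  (r, g, b) in 0..255
    """
    r, g, b, a = fg_rgba
    # Each output channel blends foreground and background by the alpha coverage.
    return tuple(round(fc * a + bc * (1.0 - a)) for fc, bc in zip((r, g, b), bg_rgb))

# A half-transparent red CG pixel composited over a blue background plate:
print(over((255, 0, 0, 0.5), (0, 0, 255)))  # → (128, 0, 128)
```

Without the alpha channel, the compositor would have to key the CG layer itself; carrying RGBA in the stream moves that decision into the renderer.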
### Disguise

### Unity

https://unity.com/roadmap/virtual-production

### TouchDesigner

https://derivative.ca/UserGuide/TouchDesigner

### Notch

https://www.notch.one/virtualproduction/

### NVIDIA Omniverse
At the base level, Omniverse is a DCC platform that helps users perform tasks like next-generation OpenUSD workflow design/collaboration, virtual production asset prep, ML synthetic training data generation, and metaverse content creation.
https://www.nvidia.com/en-us/omniverse/
End users have humorously likened the Omniverse toolset to the "pick and shovel" gear manufacturers of the Klondike gold rush era. The point of the comparison is that Omniverse serves a specific utilitarian purpose: helping the end user get work done quickly and efficiently. The Omniverse customer then takes the pick-and-shovel tools they bought off to large-scale undertakings, to stake their gold claim and do something big and impactful.
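Since Omniverse workflows revolve around OpenUSD, a small taste of the format may help. The sketch below writes a minimal hand-authored USD ASCII (`.usda`) layer using only the Python standard library; it is a generic OpenUSD illustration, not Omniverse-specific code, and the prim names are made up:

```python
# A minimal OpenUSD ASCII layer: a transform prim containing a sphere.
# Hand-written here for illustration; real pipelines would use the pxr (USD) API.
usda_layer = """\
#usda 1.0
(
    defaultPrim = "World"
)

def Xform "World"
{
    def Sphere "Ball"
    {
        double radius = 0.5
    }
}
"""

# Save the layer so a USD-aware tool (e.g. Omniverse) could open it.
with open("scene.usda", "w") as f:
    f.write(usda_layer)
```

Because `.usda` is plain text, layers like this diff cleanly in version control, which is part of what makes USD attractive for collaborative scene assembly.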
### Aximmetry

## Virtual Production Hardware

### Brompton LED Video Processors

### ROE Visual LED Display Panels

### AOTO LED Display Panels

### 7th Sense Design Media Servers

## Virtual Production MoCap, MoCo, and Camera Tracking Solutions
### Vicon MoCap

### Xsens MoCap

### Faceware MoCap

### Move.ai MoCap

### Unreal Metahuman

https://www.unrealengine.com/en-US/metahuman

### Ziva Dynamics

### nCam Camera Tracking

### VIVE Mars Cam Tracking

### EZtrack Tracking Hub

### Gyroflow Flowshutter IMU Data Logger

https://github.com/gyroflow/flowshutter

### Mark Roberts Motion Control

## Virtual Production Lens Metadata
A key aspect of virtual production is to have cine lenses that support the passing of lens metadata.
### Cooke Optics
Cooke Optics cinema lenses use the /i technology lens metadata protocol to store important lens information. The /i technology lens data is passed through a PL mount lens connector to the cinema camera body.
This is a YAML-based metadata encoding format that holds 4D time-based data for the lens parameters, accurately synced to the current video frame. Having the unique lens serial number on record is also helpful in post-production: it gives artists access to the original Cooke Optics factory lens calibration information, which is useful when working with anamorphic lenses that exhibit extensive lens breathing.
Blackmagic Design URSA 12K camera bodies, for example, are able to store the /i technology metadata in native BRAW format media.
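To make the frame-synced idea concrete, here is a sketch of how per-frame lens metadata might be represented and looked up in a post pipeline. The field names, serial number, and values are hypothetical illustrations, not the actual /i technology schema:

```python
from dataclasses import dataclass

@dataclass
class LensSample:
    """One per-frame lens metadata record (illustrative fields, not the /i schema)."""
    frame: int               # video frame this sample is synced to
    focal_length_mm: float   # current focal length (changes on zoom lenses)
    focus_distance_mm: float # focus ring position as a distance
    iris_t_stop: float       # iris setting
    lens_serial: str         # unique serial, keys into factory calibration data

# Hypothetical samples as they might be recovered from recorded media.
samples = [
    LensSample(1001, 50.0, 1200.0, 2.8, "CK-7012-0042"),
    LensSample(1002, 50.0, 1180.0, 2.8, "CK-7012-0042"),
]

def sample_for_frame(samples, frame):
    """Return the lens sample recorded for a given video frame, or None."""
    return next((s for s in samples if s.frame == frame), None)

print(sample_for_frame(samples, 1002).focus_distance_mm)  # → 1180.0
```

Because every record carries the lens serial number, a post-production tool can use exactly this kind of lookup to pair each frame's lens state with the matching factory calibration profile.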