Hardware Control Surfaces and HID Devices


BMD Control Surfaces for Resolve/Fairlight

Blackmagic Design has a wide range of control surfaces to meet the needs of video editors, audio professionals, and colorists.

image107.jpg

For more information:

Graphics Tablets

Fusion's user interface, including the hotkeys used to navigate the viewport inside the Fusion 3D workspace, is optimized for artists working with a graphics tablet.

For more information:

In addition to remapping the buttons on the side of the stylus, artists can customize what the extra buttons and control strips on their tablet do on a per-application basis.

This makes it possible to pair the extra buttons on a graphics tablet with FuScript-based command-line scripting in Resolve/Fusion, so a single button press can carry out just about any operation you can imagine: running scripts, loading media into the viewer windows, adding nodes, bypassing nodes, rendering footage, opening the Fusion Render Manager/Console/Bin windows, or toggling the visibility of views like the Nodes view or the Inspector controls. A minimal example of this approach is sketched below.
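
As a rough illustration, the Python sketch below shows the kind of snippet a tablet button could launch through the fuscript command-line interpreter to toggle the bypass (pass-through) state of whichever nodes are currently selected. It assumes Fusion (or the Resolve Fusion page) is already running with a composition open; the file name toggle_bypass.py is simply a placeholder, and the exact way fuscript exposes the bmd bootstrap module can vary between installs.

```python
# toggle_bypass.py
# A tablet button can be mapped to run:  fuscript toggle_bypass.py
# The fuscript interpreter exposes the "bmd" module, which is used here
# to attach to the running Fusion/Resolve instance.

fusion = bmd.scriptapp("Fusion")   # connect to the running Fusion application
comp = fusion.GetCurrentComp()     # the composition that is currently open

if comp is not None:
    # GetToolList(True) returns only the nodes that are currently selected.
    for tool in comp.GetToolList(True).values():
        # TOOLB_PassThrough is the attribute behind the node "bypass" toggle.
        bypassed = tool.GetAttrs().get("TOOLB_PassThrough", False)
        tool.SetAttrs({"TOOLB_PassThrough": not bypassed})
```

The same pattern applies to the other operations mentioned above: swap the loop body for a call such as comp.AddTool() or comp.Render(), and bind each variant to its own tablet button.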

Logitech MX-Master Options Mice Driver

For more information:

3dconnexion SpaceMouse

3dconnexion SpaceMouse Enterprise

https://3dconnexion.com/uk/

Kartaverse/Immersive Pipeline Integration Guide/img/image356.png

SpaceMouse Dev Resources:

VR/HMD Based Haptic Interfaces

An interesting consideration when re-creating virtual environments is that the current Meta Quest HMDs support optical hand-tracking. Hand tracking brings accurate, real-time hand-gesture capture into a virtual world.

Hand tracking works without the need for any third-party plastic VR hand-controller gadgets or nunchucks. One simply reaches out and uses one's own fingers to interact directly with objects inside the digital-twin location.

You can touch, pick up, carry, and interact with the props, tools, and natural objects in the virtual environment. Forces like gravity act upon the objects, so dropping or setting down a prop kicks off a rigid body dynamics simulation that lets the object settle to a resting state.

These images show the Meta Quest HMD's hand gesture training content:

Kartaverse/Immersive Pipeline Integration Guide/img/image112.png
Kartaverse/Immersive Pipeline Integration Guide/img/image249.png
Kartaverse/Immersive Pipeline Integration Guide/img/image113.png