PLUGINS

OptiTrack Unreal Engine Plugin

For our streaming applications, Unreal Engine 4 and 5 have essentially the same setup. The main difference is the UI and where to find the appropriate settings and buttons. All of our guides on this Wiki have been updated to feature Unreal Engine 5. If you need assistance with Unreal Engine 4, please feel free to reach out to our support team.

Plugin Overview

The OptiTrack Unreal Engine plugins allow you to stream real-time tracking data from Motive into Unreal Engine. This includes tracking data for Rigid Bodies, Skeletons, and HMDs that are tracked within Motive. This article focuses on the organization of those plugins. For basic instructions on setting up a motion capture system, please refer to the Getting Started guide instead.

Included Plugins:

  • OptiTrack - Live Link

  • OptiTrack - Streaming Client

Both of these plugins are included in the OptiTrack Unreal plugin ZIP file.

HMD Compatibility

  • A variety of head mounted displays (HMDs) can be integrated using the OptiTrack OpenVR Driver.

  • For plugin version 1.23 and above, support for Oculus HMDs has been deprecated.

Motive Data Streaming Setup (Server)

First, you'll want to follow the below instructions to set up the data streaming settings in Motive. Once this is configured, Motive will be broadcasting tracking data onto a designated network interface where client applications can receive them.

Streaming Settings

Open the Data Streaming pane in Motive's Settings window and configure the following settings:

  • Enable - Turn on the Enable setting at the top of the NatNet section.

  • Local Interface - Choose the desired IP network address from this dropdown to stream data over.

    • Loopback

Additional Tips

  • For best results, it is advised to run Motive and Unreal Engine separately on different computers, so that they are not competing for processing resources.

  • When streaming data over a Wi-Fi network, Unicast transmission must be used.
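The Unicast recommendation above can be illustrated with a plain UDP socket. This is a minimal loopback sketch, not the NatNet protocol: one sender addresses exactly one receiver, which is what keeps Wi-Fi traffic low compared to multicast.

```python
import socket

# Minimal unicast sketch (NOT the NatNet protocol): one sender addresses
# one receiver directly, here over the loopback interface.
receiver = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
receiver.bind(("127.0.0.1", 0))      # let the OS pick a free port
receiver.settimeout(2.0)
addr = receiver.getsockname()        # the one address we will send to

sender = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
# Unicast: the packet goes to exactly one subscriber, so a Wi-Fi network
# never has to carry multicast group traffic for every client.
sender.sendto(b"frame 1: rigid body at (0.0, 1.7, 0.0)", addr)

data, _ = receiver.recvfrom(1024)
print(data.decode())

sender.close()
receiver.close()
```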

Integrating HMDs

OptiTrack motion capture systems can be used to track head mounted displays (HMD) and integrate the tracking data into Unreal Engine for VR applications. For instructions on integrating HMD tracking data into Unreal Engine, please refer to the corresponding page:

Supported HMDs

At the time of writing, the following HMDs are supported:

  • HTC VIVE

  • HTC VIVE Pro 1/2

  • Valve Index

  • HP Reverb G2

Deprecated support for Oculus HMDs:

  • Support for the Oculus integration has been deprecated starting from UE plugin version 1.23; plugin version 1.22 or below must be used for Oculus HMDs.

  • Vive and Valve Index HMDs are supported through the OpenVR driver.

Wireless Multiplayer Setup

When setting up multiplayer games with wireless clients, it is best for each client to connect directly to both the tracking server (Motive) and the game server, rather than rebroadcasting the streamed tracking data through the game server. Any game actions that interact with the tracking data can then be processed on the game server, which sends the corresponding updates to the wireless clients. The wireless clients then only receive tracking data and updates without having to send anything back; in other words, the number of data transfers is minimized. If wireless clients are sending data, there will be a minimum of two transfers on the wireless network, and each transfer over the wireless network is at risk of latency or lost packets.

This is the local computer IP address (127.0.0.1 or Localhost).

  • Used for streaming data locally on the PC that is running Motive; this does not interact with the LAN.

  • Good option for testing network issues.

  • 192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)

    • This IP address is the LAN interface, connected either by Wi-Fi or Ethernet.

    • This will be the same address the Client application will use to connect to Motive.

  • Transmission Type

    • For streaming over a Wi-Fi network, setting the Transmission Type to Unicast is strongly advised.

  • Select desired data types to stream under streaming options:

    • Rigid Bodies - Enabled (required).

    • Skeletons - Optional for Skeleton tracking.

    • Markers (Labeled, Unlabeled, Asset) - Disabled for HMDs (advised).

    • Devices - Disabled.

  • Skeleton Coordinates

    • Set to Local.

  • Bone Naming Convention

    • Set the appropriate bone naming convention for the client application. For example, if the character uses the FBX naming convention, this will need to be set to FBX.

  • In order to stream data from Edit mode, a capture recording must be playing back in Motive.

  • For additional information on data streaming in general, read through the Data Streaming page.
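To illustrate why the Bone Naming Convention setting matters, here is a small sketch of a client remapping streamed bone names onto the names its rig expects. All bone names below are hypothetical examples, not Motive's actual conventions.

```python
# Hypothetical illustration of the Bone Naming Convention setting: the
# client looks bones up by name, so the streamed names must match what
# the character rig expects. These names are examples only.
MOTIVE_TO_FBX = {
    "Skeleton_Hips": "Hips",
    "Skeleton_Spine": "Spine",
    "Skeleton_Head": "Head",
}

def rename_bones(streamed, table):
    """Map streamed bone names onto the client rig's naming convention."""
    return {table.get(name, name): xform for name, xform in streamed.items()}

frame = {"Skeleton_Hips": (0.0, 1.0, 0.0), "Skeleton_Head": (0.0, 1.7, 0.0)}
print(rename_bones(frame, MOTIVE_TO_FBX))
```

If the convention in Motive does not match the client, the lookup fails and the character's bones simply do not animate, which is why the setting must agree on both ends.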



    Unreal Engine VCS Inputs

    This page provides instructions on how to configure VCS inputs in Unreal Engine. The basic configuration is similar to configuring any other input triggers in Unreal Engine. Please note that due to current limitations only one VCS controller can be connected and configured; having two controllers connected at the same time is not supported.

    Setup Steps

    Create VCS Rigid Body in Motive

    Create a Rigid Body from your tracking controller's markers using the Builder pane, or by selecting the markers and pressing the keyboard hotkey CTRL + T. Orient the controller along the +Z axis during creation to define the 'neutral' or 'zero' orientation.

    Orientation of VCS in the physical space.
    Line up the controller with the Z axis in Motive to mimic the orientation in the physical space.
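The 'zeroing' idea above can be sketched with quaternions: if the controller's orientation at creation time is taken as neutral, later orientations can be expressed relative to it. The convention below (Hamilton product, (w, x, y, z) order) is an assumption for illustration, not the plugin's internal math.

```python
def q_mul(a, b):
    # Hamilton product of quaternions in (w, x, y, z) order.
    aw, ax, ay, az = a
    bw, bx, by, bz = b
    return (aw*bw - ax*bx - ay*by - az*bz,
            aw*bx + ax*bw + ay*bz - az*by,
            aw*by - ax*bz + ay*bw + az*bx,
            aw*bz + ax*by - ay*bx + az*bw)

def q_conj(q):
    w, x, y, z = q
    return (w, -x, -y, -z)

# If the controller is created while aligned with +Z, its orientation at
# creation time becomes the "zero" orientation; later poses can then be
# expressed relative to it: q_rel = q_now * conj(q_neutral).
q_neutral = (0.7071, 0.0, 0.7071, 0.0)   # arbitrary creation-time pose
q_now = q_neutral                         # controller back at neutral
q_rel = q_mul(q_now, q_conj(q_neutral))
print(q_rel)  # ~ (1, 0, 0, 0): identity, i.e. "zeroed"
```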

    Configure Data Streaming settings

    In Motive, configure the data streaming settings. Use the Data Streaming pane to configure streamed packets. Make sure Rigid Body data is streamed out in order to use VCS.

    Create/load a UE project

    Start up a project in Unreal Engine (UE).

    Enable Windows RawInput plugin

    Go to Edit tab → Plugins to open the plugins panel. Enable the Windows RawInput plugin under the Input Devices group.

    Connect VCS plugin through the enabled plugin

    In Edit tab → Project Settings, scroll to the bottom of the left side panel until you see Raw Input under the Plugins group. Here you will tell the UE project which input devices to use.

    Find Hardware ID and Product ID of the VCS controllers

    To find these IDs, you will need to look at the Windows device properties. Go to Windows Control Panel → Devices and Printers, then right-click the VCS controller to access its properties. In the properties, go to the Hardware tab and click Properties for "HID-compliant game controller".

    Once you access the controller properties, go to the Details tab. Select Hardware ID in the drop-down menu, and the hardware ID (HID) and product ID (PID) will be shown in the highlighted section.
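As a sketch of what those IDs look like, a HID hardware ID string can be parsed for its VID and PID values. The ID string below is made up for illustration; use the values shown in your own device properties.

```python
import re

# A Windows hardware ID for a HID game controller typically looks like
# "HID\VID_xxxx&PID_xxxx&..." — the example values here are invented.
def parse_hardware_id(hwid):
    m = re.search(r"VID_([0-9A-Fa-f]{4}).*PID_([0-9A-Fa-f]{4})", hwid)
    if not m:
        raise ValueError(f"no VID/PID in {hwid!r}")
    return int(m.group(1), 16), int(m.group(2), 16)

vid, pid = parse_hardware_id(r"HID\VID_0483&PID_5750&REV_0200")
print(hex(vid), hex(pid))  # 0x483 0x5750
```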

    Input the IDs in UE

    Under the Raw Input plugin properties in the Project Settings panel, input both the vendor ID (Hardware ID) and the product ID (PID) that were found under the controller properties.

    Register the Input Buttons

    Now that the project has the IDs to identify the controllers, the next step is to set up and register the input buttons. To do so, play the project scene and press the buttons to register them.

    In UE, hit Play and press (~) to access the console. In the console, enter the command "ShowDebug INPUT". This will list all of the input actions on the left side of the viewport.

    Use the keys to register

    Use all of the keys on the controller to register the inputs: three axes and seven buttons in total. Please note that these keys may not exactly match the keys on your controller.

    • Axis 1: Joystick left/right

    • Axis 2: Joystick up/down

    • Axis 3: Knob rotate

    • Button 1: Blue

    • Button 2: Black

    • Button 3: White

    • Button 4: Red

    • Button 6: Joystick click

    • Button 7: Knob click
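The layout above can be captured as a simple lookup table once the raw inputs are registered. The action names below are examples of what you might define, not required names.

```python
# Sketch of an input table for the controller layout listed above. Raw
# indices come from the RawInput registration; action names are examples
# of what you would define under Project Settings → Input.
AXIS_MAP = {1: "Joystick_LR", 2: "Joystick_UD", 3: "Knob_Rotate"}
BUTTON_MAP = {1: "Blue", 2: "Black", 3: "White", 4: "Red",
              6: "Joystick_Click", 7: "Knob_Click"}
# (the controller layout above lists no Button 5)

def describe(event_kind, index):
    table = AXIS_MAP if event_kind == "axis" else BUTTON_MAP
    return table.get(index, "unmapped")

print(describe("button", 4))  # Red
print(describe("axis", 3))    # Knob_Rotate
```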

    Map the Registered Inputs

    Now that the buttons have been registered, the next step is to map the keys. They are mapped under Edit → Project Settings → Input. Choose either Axis Mappings or Action Mappings to map the controls to the desired actions.

    Use the Registered Inputs

    Now that all of the buttons are set up, use them to control the VCS in UE.

    Autodesk MotionBuilder

    Autodesk Maya

    External Plugins

    Houdini 19 Integration

    Houdini 3D Animation

    SideFX's Houdini 3D animation software has created a plugin for use with an OptiTrack system.

    This plugin was not designed by OptiTrack. For additional support regarding this plugin, please reach out to SideFX's support directly.

    For more information regarding their plugin designed for OptiTrack's Motive software, please visit the link below.

    Houdini 19.5 Mocap Stream Geometry Node

    Autodesk MotionBuilder: Timecode Data

    Overview

    Timecode for each captured frame can also be streamed into MotionBuilder using the plugin. When the timecode data is available in Motive, you can view the streamed data under the properties of the connected OptiTrack devices.

    To monitor the timecode, select the connected OptiTrack I/O devices under the Navigator tab and open the Properties tab of the Resources panel while the tracking data is being streamed to MotionBuilder. Among the properties, timecode values will be reported. This is available for any type of device (Optical device, Skeleton device, Insight VCS device) that is connected to Motive. Please note that timecode is streamed out only if it is available in the capture. A timecode signal can be integrated into the capture data only when an SMPTE timecode signal is fed into the camera system via the eSync synchronization hub.

    • Timecode data is presented to MotionBuilder on each device with the following properties, which can be keyed during recording.

    • Use the Timecode Key Interpolation property to set the interpolation method:

      • 0 : Constant (aka Stepping)

      • 1 : Linear

      • 2 : Cubic (MoBu default)

    Timecode on Assets

    In addition to timecode on each device, the Skeleton device also writes timecode onto each Skeleton asset root as an animatable user property. This allows you to export that asset only, while keeping the original timecode.

    The Skeleton Device: Keying to Timecode

    This is for the Skeleton device plugin ONLY.

    The Skeleton Device has the option of writing keys using the timecode contained in the Motive data stream. This requires:

    • A timecode generator

    • An OptiTrack Ethernet Series camera system

    • An OptiTrack eSync connected to the timecode generator and configured in Motive for timecode.

    See the Timecode setup Wiki page for how to do this.

    To enable keying to timecode, set the following Device properties:

    • Use Motive Timecode

      • Check this property to enable timestamping keys using the timecode contained in the Motive data packet.

    • Timecode Format

      • Use this property to specify the timecode format.

      • This must be specified correctly, and must match the format of the timecode generator that is connected to Motive via the OptiTrack eSync device.

    MotionBuilder defines the following formats, and the numeric value to use for the property:

      • NTSC_DROP : 29.97

      • NTSC_FULL : -29.97f

      • PAL_25 : -25.0f

      • FILM_24 : -24.0f

      • FILM_23976 : -23.976f

      • FRAMES_30 : -30.0f

      • FRAMES_5994 : -59.94f
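For the non-drop formats listed above, the format is just a frame rate, so converting a timecode to an absolute frame count is straightforward. A sketch (drop-frame NTSC is intentionally excluded, since it skips frame numbers and needs extra handling):

```python
def timecode_to_frames(tc, fps):
    """Convert a non-drop hh:mm:ss:ff timecode to an absolute frame count.

    Covers only integer-rate, non-drop formats (e.g. FRAMES_30, FILM_24).
    Drop-frame NTSC (29.97) skips frame numbers and is not handled here.
    """
    hh, mm, ss, ff = (int(part) for part in tc.split(":"))
    if ff >= fps:
        raise ValueError("frame field exceeds the frame rate")
    return ((hh * 60 + mm) * 60 + ss) * fps + ff

print(timecode_to_frames("01:00:00:00", 30))  # 108000
print(timecode_to_frames("00:00:01:12", 24))  # 36
```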


    OptiTrack Unity Plugin

    Overview

    The OptiTrack Unity3D Plugin allows you to stream real-time Rigid Body, Skeleton, and HMD tracking data from Motive into Unity. Using the streamed data, objects and characters in the scene can be animated. The plugin contents are distributed in unitypackage format, and you can simply load this file into Unity projects to import its contents. Once imported, included C# scripts can be used for instantiating a client origin and receiving the tracking data. This article focuses on how to set up and use the plugin.

    Version Requirements

    • Unity Version: 2017.2 / 2017.1 or above. (2020.3+ recommended)

    • Visual Studio 2019 or latest Visual C++ Redistributable

    Notes on HMD Integration

    • The HTC VIVE, VIVE Pro, VIVE Pro 2, Valve Index, and HP Reverb HMDs can be integrated through the OptiTrack OpenVR Driver.

    Motive Setup (Server)

    Streaming Setup

    From Motive, the tracking data can be streamed in real-time either from a live capture (Live Mode) or recorded data (Edit Mode). The streaming settings are configured by modifying the Streaming settings in Motive. NatNet streaming must be enabled and the correct IP address must be set.

    Streaming in Motive

    Open the Streaming settings in Motive and configure the settings below:

    • Enable - Turn on the Enable setting at the top of the NatNet section.

    • Local Interface - Choose the desired IP network address from this dropdown to stream data over.

    • Transmission Type - Typically you will want to set this to Unicast since it subscribes only to the data you wish to use and normally uses less network bandwidth. This is especially advised if streaming data over WiFi.

    Additional Tips

    • In order to stream data from Edit mode, a capture recording must be playing back in Motive.

    • For best results, it is advised to run Motive and Unity on separate computers, so that they are not competing for processing resources.

    Unity Setup (Client)

    Import Plugin Package

    While in the Unity project, double-click on the plugin unitypackage file and import the plugin assets into the project. When the package has been successfully imported, the following contents will be available within the project:

    Plugin Contents

    Folder
    Content Description

    Setting Up the Client Object

    In order to receive tracking data from a server application (e.g. Motive), a client object must be set up. The OptitrackStreamingClient.cs script can be attached to any object to stream data relative to that object. Typically, this script is attached to an empty object or loaded in using the "Client - OptiTrack" prefab object in the Assets/Optitrack/Prefabs folder.

    • [Motive] In the Streaming settings, configure the desired connection settings.

    • [Unity] Under the Prefabs folder, import the "Client - OptiTrack" prefab object into the scene, or attach OptitrackStreamingClient.cs script onto an empty object.

    • [Unity] In the streaming Client object, configure the connection settings to match the streaming settings in Motive.

    Position Data in Unity

    Although it is not strictly necessary, you may find it helpful to organize your tracked objects as children of the streaming Client object. This will allow you to adjust the position of the Client object to adjust the position of all streamed objects relative to the Client object.
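The parenting tip above amounts to a simple transform composition: a child's world position is the Client object's transform applied to the child's local position, so moving the Client moves every streamed object at once. A 2D sketch for illustration (not Unity API code):

```python
import math

# Sketch of why parenting tracked objects under the streaming Client
# helps: moving or rotating the Client offsets every child in one place.
def to_world(client_pos, client_yaw_deg, local_pos):
    """Compose a child's local position with the client's transform."""
    t = math.radians(client_yaw_deg)
    x, y = local_pos
    wx = client_pos[0] + x * math.cos(t) - y * math.sin(t)
    wy = client_pos[1] + x * math.sin(t) + y * math.cos(t)
    return (round(wx, 6), round(wy, 6))

# Shifting the client by (10, 0) shifts every streamed object with it.
print(to_world((0, 0), 0, (1, 2)))   # (1.0, 2.0)
print(to_world((10, 0), 0, (1, 2)))  # (11.0, 2.0)
```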

    Animating Rigid Body

    1. [Unity] On an object that you wish to animate, attach the OptitrackRigidBody.cs script.

    2. [Unity] In the Streaming Client entry, link the Client object to which the OptitrackStreamingClient.cs script is attached. By default, it searches for an existing client instance, but this must be specified when there is more than one streaming client object.

    3. [Unity] For the Rigid Body ID entry, input the Streaming ID of the corresponding Rigid Body asset in Motive. The Streaming ID can be found, and changed, under the Rigid Body properties.
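The ID matching described above amounts to routing each incoming Rigid Body sample to whichever object registered that Streaming ID. A sketch of that pattern (the class, IDs, and poses are invented for illustration, not part of the plugin):

```python
# Sketch of Streaming ID matching: each incoming rigid-body sample
# carries the Streaming ID set in Motive, and the client routes it to
# whichever object registered that ID. Samples with no match are dropped.
class RigidBodyRouter:
    def __init__(self):
        self._handlers = {}

    def register(self, streaming_id, handler):
        self._handlers[streaming_id] = handler

    def on_sample(self, streaming_id, pose):
        handler = self._handlers.get(streaming_id)
        if handler is None:
            return False  # no object bound to this ID; sample ignored
        handler(pose)
        return True

poses = []
router = RigidBodyRouter()
router.register(101, poses.append)          # object bound to ID 101
router.on_sample(101, (0.0, 1.0, 0.0))      # delivered
router.on_sample(999, (5.0, 5.0, 5.0))      # unknown ID, dropped
print(poses)  # [(0.0, 1.0, 0.0)]
```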

    Animating Skeleton

    By integrating with Unity's animation system, Mecanim, the Unity3D plugin allows Motive to stream full body Skeleton data. The Skeleton tracking data from Motive is streamed out as hierarchical bone segment orientations, and this data is fed into Unity's Mecanim system, which allows animating characters with different proportions.

    Note: At the time of writing, Mecanim does not support explicit goals for inverse kinematics end-effectors when using real-time retargeting. In addition, you may observe a difference in the overall scale of the position data between the retargeted skeletal animations and streamed Rigid Bodies. These two limitations may lead to inconsistencies with actors interacting with Rigid Body props, and will hopefully be addressed in a future version of the integration.

    Steps

    1. [Unity] On Unity characters, attach OptitrackSkeletonAnimator.cs script as one of its components.

    2. [Unity] For the Streaming Client entry, link the Client object to which the OptitrackStreamingClient.cs script is attached. By default, it searches for an existing client instance, but this must be specified when there is more than one streaming client object.

    3. [Unity] Enter the Skeleton Asset Name that is assigned in Motive.

    Animating Markers, Etc.

    1. [Unity] On the OptiTrack Streaming instance, enable the Draw Markers, "Draw Cameras", or "Draw Force Plates" setting(s).

    2. [Motive] Make sure that marker streaming is enabled in Motive if you wish to visualize markers.

    3. [Unity] Make sure the streaming setting is set up correctly, and play the scene.

    Integrating HMDs


    OptiTrack motion capture systems can be used to track head mounted displays (HMDs) and integrate the tracking data into Unity for unique VR applications. For instructions on integrating HMD tracking data into Unity, please refer to the corresponding page: Unity: HMD Setup.

    Supported HMDs

    At the time of writing, the following HMDs are supported:

    • HTC VIVE

    • HTC VIVE Pro

    • HTC VIVE Pro 2

    • Valve Index

    • HP Reverb

    Wireless Multiplayer Setup

    When setting up multiplayer games with wireless clients, it is best for each client to connect directly to both the tracking server (Motive) and the game server, rather than rebroadcasting the streamed tracking data through the game server. Any game actions that interact with the tracking data can then be processed on the game server, which sends the corresponding updates to the wireless clients. The wireless clients then only receive tracking data and updates without having to send anything back; in other words, the number of data transfers is minimized. If wireless clients are sending data, there will be a minimum of two transfers on the wireless network, and each transfer over the wireless network is at risk of latency or lost packets.

    OptiTrack Peripheral API

    The OptiTrack Peripheral API is an open C++ API that can be used to create 'plugin' devices. Custom-built plugin DLLs allow you to initialize and synchronize external devices with the OptiTrack motion capture system in Motive. After building a custom plugin device using the API, the library must be placed in the \device folder within the Motive install directory in order to initialize and integrate the desired peripheral devices. For integrating AMTI and Bertec force plates and NI-DAQ devices, the existing BiomechDevicePlugin.dll that installs with the peripheral module can be used.

    Note: The OptiTrack Peripheral API is available in Motive versions 1.10 and above.

    The following features are supported by the Peripheral API:

    • Real-time synchronized data collection from peripheral hardware device and the OptiTrack motion capture system into Motive Take (TAK) files and the open standard C3D file format.

    • Motive UI access to device properties and event settings under the Devices pane, allowing users to configure the device in Motive.

    • Real-time display of live device data in Motive's scope view in the Graph View pane.

    • Captured data review in Motive's 2D graphing windows in the Graph View pane.

    Contents

    The Peripheral API folder is installed with the Motive software. It can be found in the Motive install directory, at C:\Program Files\OptiTrack\Motive\PeripheralAPI by default.

    Folder
    Contents

    Requirements

    The following components are required for using the OptiTrack Peripheral API.

    • Microsoft Windows 7, 8, or 10.

    • Microsoft Visual Studio 2013 (recommended)

    If you plan to use a newer version of Microsoft Visual Studio, you'll need to download Windows SDK 8.1 and the VC++ 2015.3v140 toolset (x86,x64).

    Retarget Solution

    Downloads/Installs

    • You can download the Windows SDK 8.1 from this link.

    • To download the v140 Toolset, you can modify your existing install of VS, or upon first downloading add the toolset from the Individual Components tab.

    • Once you have the Windows SDK 8.1 and v140 Toolset downloaded, you'll need to retarget the OptiTrackPeripheralExample solution.

    Retargeting Steps

    1. In Visual Studio, right-click Solution 'OptiTrackPeripheralExample' (or whatever you named the copy).

    2. Select 'Retarget Solution'

    3. For the Windows SDK Version dropdown, select 8.1.

    Usage

    The following guideline can be used to create and apply custom device plugin DLLs in Motive:

    1. Copy the OptiTrackPeripheralExample project and modify the code in the lines where indicated in the sample.

    2. Build the sample, which produces a plugin DLL. Copy this DLL to the <Motive install folder>\devices subfolder.
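The Peripheral API itself is C++; as a language-neutral sketch of the plugin pattern the guideline above describes (Motive discovers the device library, initializes the device, then pulls synchronized samples each frame), here is a Python illustration with invented names:

```python
# Illustration only: the real Peripheral API is a C++ plugin interface.
# All class, method, and device names here are invented to show the
# lifecycle: load -> initialize -> sample once per mocap frame.
class ExamplePeripheralDevice:
    def __init__(self, name):
        self.name = name
        self.initialized = False

    def initialize(self):
        # In a real plugin this would open the hardware connection.
        self.initialized = True

    def sample(self, frame_number):
        # Return one synchronized reading per mocap frame.
        return {"frame": frame_number, "value": frame_number * 0.1}

device = ExamplePeripheralDevice("demo-force-plate")
device.initialize()
samples = [device.sample(f) for f in range(3)]
print(samples[-1])  # {'frame': 2, 'value': 0.2}
```

Sampling once per frame is what lets the collected data land in the same Take (TAK) and C3D timeline as the camera data.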

    Class Diagram

    (Optional) If using Multicast, then enable/disable the desired data types. For tracking HMDs, disabling the Marker streaming is advised.

    When streaming the data over a WiFi network, Unicast transmission must be used.

    Server Address - IP address of the PC that the server application (Motive) is running on.

  • Local Address - Local IP Address of the PC that the client application (Unity) is running on. (Typically, this looks similar to the Server Address except maybe the last digits.)

  • Connection Type - Must match Motive. Unicast is recommended.

  • [Unity] If you wish to receive tracking data from more than one server instances, you may create multiple objects with the client script attached.

  • [Motive] Make sure Motive is tracking and streaming the data.

  • [Unity] Play the scene. The linked object will be animated according to the associated Rigid Body movement in Motive.

  • [Unity] For the Destination Avatar entry, link to the character's avatar component.

  • [Motive] Make sure Motive is tracking and streaming the data.

  • [Unity] Play the scene. When everything is set up properly, the linked avatar in Unity will be animated according to the streamed Skeleton in Motive. The position of the actor will be in its reference position as explained above.

  • [Unity] Each marker, camera, or force plate will be drawn in the scene, as shown in the screenshot below. (Note: Only markers will animate.)\

    HTC VIVE Pro 2

  • Valve Index

  • HP Reverb

  • Assets/OptiTrack

    All of the Unity plugin contents are included in this folder.

  • Assets/OptiTrack: All of the Unity plugin contents are included in this folder.

  • Assets/OptiTrack/Scripts: This is the folder that you will mainly use. It contains the plugin C# script components that can be imported into Unity objects for receiving streamed data.

  • Assets/OptiTrack/Plugins: This folder contains the plugin libraries and header files.

  • Assets/OptiTrack/Prefabs: This is the easiest place to get started. This folder contains premade objects for setting up a streaming client, tracking a Rigid Body, and retargeting a Skeleton.

  • Assets/OptiTrack/Scenes: This folder contains a sample Unity scene that includes pre-configured client, Rigid Body, and Skeleton objects.
    Streaming Settings
    Rigid Body properties
    Mecanim
    Unity: HMD Setup
    Streaming tracking data into Unity.
    Data Streaming settings in Motive.
    Unity plugin files.
    Client object in Unity and the corresponding Motive data streaming network settings. Click image to enlarge.
    Position data in unity. Click image to enlarge.
    OptiTrack Rigid Body configuration along with the Rigid Body properties in Motive. Configured Streaming ID must match the Rigid Body ID designated from the client side. Click image to enlarge.
    OptiTrack Skeleton Animator script configuration from a character in Unity.
    Skeleton labeled markers drawn in Unity scene.
    For the Platform Toolset dropdown, select v140 or No Upgrade (if VS has already loaded v140 as the default).
  • Click OK

  • This should now be ready to build without errors.

  • If your plugin has external dependencies (e.g. driver / SDK dlls), make sure these are on your system path, or in the Motive install directory.
  • Launch Motive. Your device should appear in the Devices pane in Motive. If it does not, check the Motive Log pane for error notifications.

  • When a peripheral device is detected in Motive, real-time collected data can be monitored from the real-time plotting of the Graph View pane in the Live mode.

  • \example

    Visual Studio Example Device project and source code showing Peripheral API usage.

    \include

    API headers (include in your project)

    \lib

    Peripheral API / Motive import library (link to your project)

    ClassDiagram

    Class relationship diagram of Plugin Devices to Motive Plugin Library.

    Readme.txt

    Peripheral API release notes.

    Devices pane
    Graph View pane
    Graph View pane
    link.
    Edit the 'REQUIRED' and 'OPTIONAL' sections.
    Changes made in the .cpp files are reflected in the Devices and Properties panes.
    Click image to enlarge.

    Unreal Engine: OptiTrack Live Link Plugin

    Overview

    This page provides instructions on how to use the OptiTrack Unreal Engine Live Link plugin. The plugin communicates with Unreal's built-in Live Link system by providing a Live Link source for receiving tracking data streamed from Motive. This plugin can be used for controlling cameras and objects in virtual production applications. When needed, the OptiTrack Unreal Engine Plugin can also be alongside this plugin. For a specific guide to InCamera VFX (i.e. LED Wall Virtual Production) please see this wiki page Unreal Engine: OptiTrack InCamera VFX.

    Setup

    1. [Motive] Setup Rigid Body streaming in Motive.

    Get Motive streaming with at least one Rigid Body or Skeleton asset. Make sure the settings are configured correctly, and the asset is active under the .

    2. [UE] Install the OptiTrack plugins in Unreal Engine (UE).

    You can install the OptiTrack Unreal Engine plugin by putting the plugin files into one of the following directories:

    • A global engine plugin can be placed in C:\Program Files\Epic Games\[Engine Version]\Engine\Plugins

    • A project-specific plugin can be placed in [Project Directory]\Plugins

    3. [UE] Enable the plugins in UE project.

    Go to Edit → Plugins and enable two of the required plugins. First one is the OptiTrack - Live Link plugin under Installed group, and the second one is the built-in Live Link plugin under Built-In group.

    4. [UE] Open the LiveLink pane

    Open the LiveLink pane from Window → Virtual Production → Live Link in the toolbar.

    5. [UE] Configure and create a new OptiTrack source

    In the LiveLink pane under Source options, go to the OptiTrack Source menu and configure the proper connection settings and click Create. Please make sure to use matching network settings configured from the Streaming pane in Motive.

    6. [UE] Check the Connection.

    If the streaming settings are correct and the connection to Motive server is successful, then the plugin will list out all of the detected assets. They should have green dots next to them indicating that the corresponding asset has been created and is receiving data. If the dots are yellow, then it means that the client has stopped receiving data. In this case, check if Motive is still tracking or if there is a connection error.

    Using the Plugin

    Static Meshes or Camera Actors

    1. Add the camera object or static mesh object that you wish to move

    Add a camera actor from the Place Actors pane or a static mesh from the project into your scene. For the static meshes, make sure their Mobility setting is set to Movable under the Transform properties.

    2. Add a LiveLinkController Component

    Select an actor you want to animate. In the Details tab select your "actor" (Instance). In the search bar, type in Live Link. Then click on the Live Link Controller from the populated list.

    3. Select the target Rigid Body

    Under the Live Link properties in the Details tab click in the Subject Representation box and select the target Rigid Body.

    4. Check

    Once the target Rigid Body is selected, each object with the Live Link Controller component attached and configured will be animated in the scene.

    Timecode Setup

    When the camera system is synchronized to another master sync device and a timecode signal is feeding into , then the received timecode can be used in UE project through the plugin.

    1. Set Timecode Provider under project settings

    From Edit → Project Settings, search timecode and under Engine - General settings, you should find settings for the timecode. Here, set the the Timecode Provider to LiveLinkTimeCodeProvider.

    2. Set OptiTrack source in the Live Link pane as the Timecode Provider

    Open the Live Link pane, and select the OptiTrack subject that we created when first setting up the plugin connection. Then, under its properties, check the Timecode Provider box.

    3. Check

    The timecode from Motive should now be seen in the Take Recorder pane. Take Recorder pane can be found under Window → Cinematics → Take Recorder in the toolbar.

    Skeletons

    1. Create a new Animation Blueprint

    Right click the mesh you would like to use and select "Create > Anim Blueprint"

    2. Name and Open the Animation Blueprint

    Name the animation blueprint something appropriate, then double click it to open the blueprint.

    3. Hook up your Blueprint

    Create a "Live Link Pose" component and connect it to the "Output Pose". Assign the "Live Link Subject Name" to the Skeleton that you would like to use.

    Change the "Retarget Asset" property in the Details pane of the blueprint editor to "OptiTrackLiveLinkRetarget"

    4. Getting the Skeleton to Animate

    To animate the Skeleton in real time click the Animation Blueprint from earlier. In the Details pane under the skelteonLive Link Skeleton Animation". After you add that component the mesh should start animating.

    To animate the Skeleton in a game, just press the play button. Adding the "Live Link Skeleton Animation" object is not necessary to animate in play mode.

    Debugging Note

    If the retargeting doesn't match the mesh correctly, then you can create a new OptiTrackLiveLinkRetarget blueprint from scratch and modify the bone mapping names.

    MetaHumans

    Animating a MetaHuman follows basically the same steps as another Skeleton, but requires hooking into the Skeleton at a very specific location. For more information about MetaHuman setup outside of our scope, please visit .

    1. Obtain MetaHuman from the Quixel Bridge Plugin.

    First, you'll want to verify that the Quixel Bridge plugin is installed with Unreal Engine 5. You can install the Quixel Bridge plugin from the Epic Games Launcher by clicking Unreal Engine then the Library Tab.

    You'll also wan to make sure that the Quixel Bridge plugin is enabled. To do this go to Edit > Plugins > Editor and click the check box. You may have to restart Unreal after enabling this plugin.

    Navigate to the Cube + icon in the top toolbar, select the dropdown and choose Quixel Bridge.

    From here, log into your account and select the MetaHuman you want to download/add to your project.

    2. Create an Animation Blueprint

    In your newly created MetaHuman folder (note: the folder will be labeled with your MetaHuman's name), create an animation blueprint in the Content Browser by right-clicking and navigating to Animation > Animation Blueprint.

    For the skeleton asset, choose the metahuman_base_skel. There is no need to choose anything different for the parent class; keep it as AnimInstance.

    Click Create and name it "ABP_[MetaHuman Name]".

    3. Setting up Post Processing inside the mesh

    Each MetaHuman will have a "Body" mesh attached to the MetaHuman blueprint. This mesh has a post-processing animation blueprint automatically attached when you import it into your project. We want to remove this from the mesh because it can cause crashes if used improperly.

    Open your MetaHuman blueprint by double-clicking it. Click Body in the Components section, then in the Details panel go to Mesh > Skeletal Mesh and click to navigate to the mesh in the Content Browser. Open the mesh by double-clicking it.

    Now in the mesh, scroll down to the Skeletal Mesh section and clear the “Post Process Anim Blueprint”.

    With this done, you can now use multiple MetaHumans in your project while all of them are using the same skeleton.

    Removing this Post Process Anim Blueprint disables the mesh itself from receiving animations directly.

    Now we will drive all the animation in the animation blueprint we made earlier.

    Alternatively, you can assign the animation blueprint you made here instead; however, this may cause the blueprint to be called multiple times in multiple areas, so be cautious.

    4. Attaching Live Link in AnimBP

    Open the animation blueprint that we made in step 2.

    You should now see the Animation Graph, if you don’t, navigate to the left in the Animation Graphs and click on “AnimGraph”. In the AnimGraph, right click and type “Live Link Pose”.

    Now attach it to the Output Pose node.

    You can add an input pose if you’d like to keep your blueprint free of any compile notes or errors.

    While you have the “Live Link Pose” node selected, navigate to the Details panel and under retarget, select Retarget Asset > OptiTrackLiveLink.

    Now to choose an actor.

    Make sure you have added a live link source streaming over from Motive.

    Choose an actor for the selected dropdown.

    You can select a different actor per AnimBlueprint, as long as you have the proper Post Process Blueprint Animation settings. Referenced in step 3.

    Click Compile and Save.

    5. Changing LOD, attaching the AnimBP and adding Skeletal Animation

    Next, navigate to your MetaHuman blueprint and open it.

    In the “Components” section on the left, scroll down and select LODSync.

    Now on the right go to LOD > Forced LOD and choose any LOD that works for your use-case.

    Do not use -1 as the Forced LOD, as this will crash Unreal.

    Go back to the Components panel and click on Body.

    On the right in the Details panel, go to Animation > Animation Mode > Use Animation Blueprint.

    In the Anim Class choose the MetaHuman AnimBP that you created earlier.

    Lastly, in the Components panel, click the '+ Add' button, type in "Live Link Skeletal Animation", and then click Compile.

    If you go to your main Map, you can click and drag your MetaHuman into the scene and watch them animate in real time.

    Standalone Game Mode

    For testing the project in standalone game mode, or when developing an nDisplay application, the Live Link plugin settings must be saved out and selected as the default preset to load in the project. If this is not done, the configured settings may not be applied. After configuring the Live Link plugin settings, save out the preset from the Live Link pane first. Then open the Project Settings and find the Live Link section in the sidebar. Here, you can select the default Live Link preset to load in the project, as shown in the screenshot below. Once the preset is properly saved and loaded, the corresponding plugin settings will be applied to the standalone game mode.

    If all the configuration is correct, the actors will get animated in the newly opened game window when playing the project in the standalone game mode.

    MotionBuilder Retargeting

    Another path to get data into Unreal Engine is to stream data from Motive -> MotionBuilder (using the OptiTrack MotionBuilder Plugin) -> Unreal Engine (using the Live Link plugin for MotionBuilder). This has the benefit of using the Human IK (HIK) retargeting system in MotionBuilder, which will scale characters of very different sizes/dimensions better than the base Live Link plugin. More information can be found by consulting .

    Troubleshooting

    Q - Trying to add more than 64 frames in the same frame. Oldest frames will be discarded.

    A - This notification message may appear at the bottom of the Live Link pane if the frame rate of the data stream doesn't match the rendering frame rate inside UE. This notification is internal to the Engine only, so it should not interfere with the project. If the notification must be removed, go to Project Settings → Engine → General Settings → Framerate, check the Use Fixed Frame Rate option, and set the Fixed Frame Rate to match the Motive frame rate.
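The warning itself describes a bounded buffer discarding its oldest entries. The Python sketch below is an illustrative model only: the 64-frame cap comes from the message, everything else is assumed for demonstration.

```python
from collections import deque

# A bounded buffer like the one behind the warning: once it holds 64
# frames, each new frame silently pushes out the oldest one.
buffer = deque(maxlen=64)

# Suppose 100 streamed frames arrive before the render loop drains any.
for frame_id in range(100):
    buffer.append(frame_id)

print(len(buffer), buffer[0])  # prints: 64 36  (frames 0-35 were discarded)
```

Matching the fixed frame rate to Motive's rate keeps arrivals and consumption in step, so the buffer never overflows and the message stops appearing.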

    Unreal Engine: OptiTrack Streaming Client Plugin

    This page provides instructions on how to set up the OptiTrack Streaming Client Unreal Engine plugin. This plugin is intended for Virtual Reality customers, but can be used with many other applications.

    Streaming Client Setup (Client)

    The next step is to configure the client. Follow the instructions below to install and configure the OptiTrack Unreal Engine plugin to receive the streamed tracking data.

    Enable the Plugin

    OptiTrack - Streaming Client Plugin (required)

    1. Download the .

    2. Extract the contents from the ZIP file.

    3. Open the extracted OptiTrack folder and transfer the entire "OptiTrack" folder into Unreal Engine's plugin directory, located at C:\Program Files\Epic Games\5.#\Engine\Plugins (there will be other plugins in that folder already).

    Set up the Client Origin

    Once the OptiTrack - Streaming Client plugin is enabled, the OptiTrack Client Origin actor will be available in Unreal Engine.

    OptiTrack Client Origin

    The OptiTrack Client Origin class enables the Unreal Engine (client) to receive the Rigid Body, Skeleton, and HMD tracking data streamed from Motive.

    To add the client origin, simply drag-and-drop the OptiTrack Client Origin from the Place Actors panel into the level. Once the client origin is placed within the level, its position and orientation will represent the global origin of Motive within Unreal Engine. In other words, the tracking data will be represented relative to where this Client Origin object is positioned and oriented.

    Global Origin: Both position and orientation of the OptiTrackClientOrigin will represent the global origin of the tracking volume within Motive.

    Connecting Unreal Engine to Motive

    1. [Motive] Make sure that NatNet streaming is enabled in the in Motive.

    2. [Unreal] Once the plugin is added and enabled in the project, the OptiTrack Client Origin class will be available from the Place Actors panel.

    3. [Unreal] Drag and drop the OptiTrack Client Origin into the scene.

    Connecting to a designated IP address

    If you wish to connect to a server on a specific network address, you can uncheck the Auto Connect setting and manually enter the Server IP Address chosen in the in Motive, the Client IP Address, and the Connection Type associated with Motive. You may need to run the ipconfig command in the command prompt to obtain an appropriate IP address for the client.
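When entering addresses manually, a common mistake is picking a client address that is not on the same network as the server. A small standard-library Python check can illustrate the constraint; the /24 prefix is an assumption about a typical LAN, so substitute your actual subnet.

```python
import ipaddress

def same_subnet(server_ip, client_ip, prefix=24):
    """Check whether the Motive server and the Unreal client addresses
    sit on the same network. The /24 prefix is an assumption about a
    typical LAN setup; adjust it to your subnet mask."""
    network = ipaddress.ip_network(f"{server_ip}/{prefix}", strict=False)
    return ipaddress.ip_address(client_ip) in network

print(same_subnet("192.168.1.5", "192.168.1.20"))  # True
print(same_subnet("192.168.1.5", "10.0.0.3"))      # False
```

If the check fails for your chosen pair of addresses, the plugin will not receive data even though both machines are otherwise configured correctly.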

    Advanced settings: Auto-initialize

    By default, the auto-initialize feature is enabled and the client origin will be initialized automatically whenever the scene is played. When needed, you can disable this and set up the project so that the client origin is initialized when a user-defined event is triggered.

    Animate Rigid Bodies

    OptiTrack Rigid Body Actor

    Actor objects in Unreal Engine can be animated using Rigid Body tracking data from Motive. Once the OptiTrack - Streaming Client plugin is enabled in the project, the OptiTrack Rigid Body component will be available to use. By attaching this component onto an actor, you can animate its child actors according to the movement of a Rigid Body in Motive. Each Rigid Body component is given a Tracking ID value which associates it with the Streaming ID of a Rigid Body in Motive. Once associated, data from the corresponding Rigid Body will be used to update the transform of the target actor in Unreal Engine.

    Set Up Steps

    1. [Unreal] From the Place Actors panel, search for OptiTrack Rigid Body Actor, then drag-and-drop the actor into the scene.

    2. [Unreal] With this Rigid Body actor selected, attach the target actor that you wish to animate using the Details panel. Make sure the target actor's mobility is set to movable.

    3. [Unreal] Set the relative locations and rotations to all zeroes on this target actor. This actor should be listed as a child of the Rigid Body actor.

    RigidBodyComponent Properties

    Tracking ID

    ID of the Rigid Body used to derive the position and orientation transform of the attached actor. This ID must match the Streaming ID of the respective Rigid Body in Motive.

    Hide on Invalid Definition

    When this is checked, the corresponding Rigid Body actor will be hidden from the level until the associated Rigid Body data is streamed out from Motive and received by the plugin.

    Disable Low Latency Update

    The low latency update feature allows the Rigid Body position and orientation transform to be updated immediately before rendering, minimizing latency. This is enabled by default. For debugging, you can check this setting to disable the behavior.

    Tracking Origin

    This sets a specific client origin to use for receiving tracking data. When this is unset, the plugin will default to the first client origin that it finds in the scene.

    Respect Parent Transform

    When this is set to true, the Rigid Body transform data from Motive will be applied with respect to the parent actor's pivot coordinates. By default, this is set to false, and all of the tracking data will be applied with respect to the pivot axis of the client origin.

    Drawing Markers

    When needed, you can also draw labeled marker data from Motive into the scene in UE. In most applications, you do not have to draw the markers, as the Rigid Body and Skeleton data will be used instead; however, having the markers generated in the scene may be helpful for debugging. To enable drawing of the markers:

    • [UE4] Expand the OptiTrackClientOrigin (Instance) properties, and enable the Draw Markers checkbox.

    • [Motive] The Labeled Markers setting in the data streaming pane must be enabled.

    Animate Skeletons

    Skeleton streaming is supported only in plugin versions 1.9 or above.

    Tutorial Video

    Setup

    Follow the steps below to set up Skeleton streaming into Unreal Engine.

    1. Create an Animation Blueprint in the 3D View

    Step 1. Navigate to a character folder. With sample characters, it is located in Characters → Heros → [Character Name] → Meshes.

    Step 2. Right-click the blank space in the Content Browser pane, then select Animation → Animation Blueprint.

    Step 3. On the pop-up window, select the OptiTrackAnimInstance at the parent class section at the top and click on the target Skeleton name at the bottom. Then click OK.

    Step 4. In the content browser, assign a name to the created animation blueprint.

    Step 5. Drag the character blueprint into the scene.

    Step 6. Select the character blueprint in the 3D View

    • In the Details pane, select "+ Add" and create a new "OptiTrack Skeleton Instance" component on the model.

    • Set the “Source Skeleton Asset” equal to the Skeleton name in Motive.

    2. Setup the Blueprint

    Step 1. Double-click the animation blueprint in the content browser to open its editor.

    Step 2. Right-click the animation graph, then create a new "OptiTrack Skeleton" node.

    Step 3. Right-click the animation graph, then create a new "Get Streaming Client Origin" node and connect its output to the Streaming Client Origin input.

    Step 4. Right-click the animation graph, then create a new "Get Source Skeleton Asset Name" node and connect its output to the Source Skeleton Asset Name input.

    Step 5. Right-click the animation graph, then create a new "Component To Local" node and connect the output from "OptiTrack Skeleton" into its input.

    Step 6. Connect all of the nodes together. The basic animation flow chart should look like the following.

    Bone Transformation

    Within the animation blueprint, you can utilize other blueprint utility tools from UE4 to modify the streamed data. For example, Transform (Modify) Bone nodes can be included after the OptiTrack Skeleton node to apply a transform to specific Skeleton bones as needed. Please refer to for more information on using animation blueprints.

    Roll Bone Interpolation

    For characters with unmapped shoulder roll bones, the Skeleton plugin will detect their existence and apply a slight twist to the roll bones to keep the swinging motion of the arms smooth. In the OptiTrack Skeleton blueprint, you can enable/disable this feature with the Roll Bone Interpolation checkbox, and you can adjust how much twist is applied by setting the Roll Bone Blending parameter. When this parameter is set to 0, the plugin will not adjust the roll bone motion; when it is set to 1, the plugin will attempt to adjust the motion to keep the shoulder steady on the character.

    Please note that this feature may not work on some characters.
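Conceptually, the Roll Bone Blending parameter behaves like a linear blend weight between the streamed roll and the plugin's corrected roll. The Python sketch below is an illustrative model of that idea only, not the plugin's actual code.

```python
def blend_roll(raw_roll_deg, corrected_roll_deg, blending):
    """Linear blend between the streamed roll (blending=0) and a
    plugin-corrected roll (blending=1). Illustrative model of the
    Roll Bone Blending parameter, not the plugin's implementation."""
    blending = max(0.0, min(1.0, blending))  # clamp to the valid 0..1 range
    return (1.0 - blending) * raw_roll_deg + blending * corrected_roll_deg

print(blend_roll(40.0, 10.0, 0.0))  # 40.0 (no adjustment)
print(blend_roll(40.0, 10.0, 1.0))  # 10.0 (fully corrected)
print(blend_roll(40.0, 10.0, 0.5))  # 25.0 (halfway blend)
```

Intermediate values trade off between the raw motion and a steady shoulder, which is why tuning the parameter per character is sometimes necessary.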

    3. Assign Bone Mapping

    Step 1. Select the OptiTrack Skeleton plugin in the blueprint graph area.

    Step 2. Drop down the Bone Mappings property in the Details Pane.

    Step 3. Click “Auto Fill Bone Mapping” to automatically assign the bones in the Skeleton to the OptiTrack Skeleton names.

    Note: There is no standard for bone naming conventions, so bone names may vary between characters. After using auto-fill, review the list and double-check that the auto-assigned names are correct. You may need to use the drop-down menus manually to adjust the mapping for missing or incorrect items.

    Step 4. Hit "Compile" in the top left to build the blueprint.

    4. Setup OptiTrack Streaming

    Step 1. Open the 3D View

    Step 2. Search for OptiTrack Client Origin in the Modes pane.

    Step 3. Drag the OptiTrack Client Origin into the 3D scene, then select it to access its properties.

    • (Optional) Place it at the 0,0,0 location.

    • Make sure that streaming settings on both Motive and Unreal match.

    See: page for more instructions on setting up the client origin.

    5. Click _Play_

    Notes on bone mapping

    The OptiTrack Unreal Engine Skeleton Plugin uses bone mapping, not retargeting. This means that the bone segments in Motive map directly to the character model (bone mapping), instead of being translated into something that is usable by a more abstract biped model (retargeting). Because of this, non-anatomical Skeletons will not map correctly without some additional tweaking.

    Practically, this means that you will need to do things like turn off the toe mapping for characters with high heels, adjust the pelvis bone in Motive or in the model for characters with non-anatomical hip bones, and avoid bipeds that are too anatomically different from humans, such as a gorilla or swamp monster.

    For example, the character sample below has both a non-anatomical pelvis and high heels. It is preferable to use character models that are more anatomically correct, but in this case, you can do a couple of things to mitigate these issues:

    1. Turn off toe streaming

    In the example below, since this character is wearing heels, any actor providing data for this model will also need to be wearing heels. To get around this you can just turn off the toe mapping in the OptiTrack Unreal Engine Skeleton Plugin.

    2. Adjust the bone segments in Motive

    The hip segment on the Countess actor is centered in the stomach rather than in the pelvis, the neck bone in Motive is a bit too long for the model, and the shoulders in Motive do not match the width of the character's shoulders. By adjusting bone positions and lengths in Motive, you can make the streamed Skeleton better match the model; however, please note that there are limits to how far you can take this.

    Bone Scaling

    When streaming Skeleton data to animate characters that have different bone lengths compared to the mocap actor, the UE character will need to be scaled accordingly. In this case, the "Scale Bones" feature in the OptiTrack Skeleton node automatically scales the character bones to match the mocap actor. This setting is enabled by default.
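The scaling itself reduces to a per-bone ratio between the mocap actor's segment length and the character's bone length. The toy Python illustration below uses made-up lengths and is a conceptual sketch only, not the plugin's implementation.

```python
def bone_scale(character_bone_len, actor_bone_len):
    """Per-bone scale factor so a character bone matches the length of
    the mocap actor's corresponding segment (conceptual sketch only;
    lengths are in meters and are made up for illustration)."""
    return actor_bone_len / character_bone_len

# Character forearm is 0.25 m; the mocap actor's forearm is 0.30 m,
# so the character bone is stretched by 20% to match.
print(bone_scale(0.25, 0.30))  # 1.2
```

A ratio of 1.0 means the character and actor segments already match, so no scaling is applied to that bone.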

    Aligning Bones

    The OptiTrack Unreal Engine Skeleton Plugin uses bone mapping, not retargeting. This means that the bone segments in Motive map directly to the character model (bone mapping), instead of being translated into something that is usable by a more abstract biped model (retargeting). Because of this, non-anatomical Skeletons will not map correctly without some additional tweaking. Starting from plugin version 1.23, you can tweak the alignment of the bone mapping by adding sockets to the Skeleton blueprint:

    Adding Sockets to the Bone Mapping

    1. Open the Skeleton of the character you wish to modify.

    2. Under the Skeleton tree, right-click on the bone that you wish to add the socket to.

    3. Select Add Socket.

    4. Go to the Animation blueprint and change the bone mapping of the bone for which you created the socket, mapping it to the socket that was just created.

    Unreal Engine: HMD Setup

    This page provides instructions on setting up the OptiTrack OpenVR driver for integrating an OptiTrack system with Vive HMDs in SteamVR applications, including Unreal Engine and Unity.

    Overview

    For integrating Vive HMDs, the OptiTrack OpenVR Driver must be used. This driver lets you track the head-mounted display (HMD) and the VR controllers using an OptiTrack motion capture system and stream the tracking data from Motive directly into SteamVR. In other words, it overrides the tracking from the lighthouse base stations. The plugin ships as an installer package (MSI) which sets up the driver along with a utility tool for configuring client streaming settings. Once integrated, the streamed tracking data can be used in any application platform that utilizes SteamVR. For tracking objects other than HMDs, please read through the OptiTrack Unreal Engine Plugin page for details.

    Supported Systems

    • Vive

    • Vive Pro 1/2

    • Valve Index

    • HP Reverb G2

    Unreal Engine Plugin

    Unreal Engine 4

    When developing SteamVR applications that use the OpenVR Driver to track the HMD in Unreal Engine 4, the OptiTrack - Streaming Client version 2.27 must be used, and the OptiTrack - VR Latency Optimization version 2.27 plugin is suggested. The OptiTrack - VR Latency Optimization plugin provides HMD render compensation that helps minimize latency in VR applications.

    Unreal Engine 5

    The latest plugins that support Unreal Engine 5 are OptiTrack - Live Link version 3.0 and OptiTrack - Streaming Client version 3.0.

    HMD Setup

    First of all, set up and optimize the motion capture volume as explained in the or the documentation. If you plan to install any obstacles (e.g. walls) within the capture volume, make sure they are non-reflective, and place and orient the cameras so that every corner is thoroughly captured by multiple cameras.

    General Setup Steps

    1. Attach the markers on the HMD

    2. Create a Rigid Body asset

    3. Calibrate the Pivot Point of the Rigid Body

    4. Configure the Rigid Body settings in Motive

    Place Markers on the HMD

    For the camera system to track the HMD, a set of markers must be attached to the HMD. You can use either active markers (an Active HMD Clip or Active Tags) or passive markers. Passive markers are retroreflective markers that reflect infrared light emitted by the IR LEDs on the camera. Active markers, on the other hand, are LED markers that emit IR light and can be uniquely identified by the system.

    In general, for most VR applications, using active markers is recommended for better tracking stability and ease of use. Active markers also have advantages over passive markers when tracking a large number of objects. For applications that are sensitive to the accuracy of the tracking data, using passive markers may have more benefits. To get more help with finding the best solution for your tracking application, please .

    When using the active markers, you can conveniently put a set of 8 markers onto the HMD by using the HMD Clip, or you can attach the markers from the Tag manually onto the HMD using adhesives and marker posts.

    Active HMD Clip

    The Active HMD Clip is an HMD enclosure with a total of 8 active markers embedded for tracking. At the time of writing, active HMD clips for Vive Pro / Valve Index HMDs are available on the webstore. The clip can be mounted easily by pushing it onto the HMD until the latches click, and detached by gently lifting the three latches located at the top, left, and right sides of the clip.

    Once the clip has been mounted, the next step is to import the provided into Motive and refine the definition to get the calibrated pivot point position and orientation, which is explained in the next section.


    Create HMD Rigid Body in Motive

    This feature can be used only with HMDs that have the clips mounted.

    To use an OptiTrack system for VR applications, it is important that the pivot point of the HMD Rigid Body is placed at the appropriate location: at the root of the nose, in between the eyes. When using the HMD clips, you can utilize the HMD creation tools in the Builder pane to have Motive estimate this spot and place the pivot point accordingly. Motive uses the known marker configuration on the clip to precisely position the pivot point and set the desired orientation.

    HMDs with passive markers can utilize the tool to calibrate the pivot point.

    Steps

    1. First, make sure Motive is configured for tracking .

    2. Open the under and click Rigid Bodies.

    3. Under the Type drop-down menu, select HMD. This will bring up the options for defining an HMD Rigid Body.

    This is supported only in Motive versions 2.1.2 or above. If you are using any other version of Motive 2.1, please update to 2.1.2, or use a template to create the Rigid Body definition; instructions are provided on the following page: .

    Setting up the OpenVR driver

    SteamVR Required: The VR driver streams tracking data through SteamVR. Please make sure SteamVR is installed on the computer before setting up the driver.

    Setup Steps

    Download and run the installer

    Download the OpenVR driver from the page. Once downloaded, launch the installer and follow the prompts to set up the driver. On the last window, make sure to select Launch Configuration Utility before clicking Finish. This will open the configuration options to set up your HMD with Motive.

    You may receive a warning window prior to the installation wizard. To circumvent this, select More info and then Run Anyway.

    Open the configuration program

    Once the driver has been successfully installed, launch the configuration utility software (C:\Program Files\OptiTrack\OpenVR Driver\ConfigUtil). Using this tool, you can load and check existing configurations and make changes to the settings as needed. To import current settings, click Load and to save out the changes, click Save.

    Please make sure you are running this tool with admin privileges; if not, it may not be able to modify the settings properly. If the configuration software detects a running instance of SteamVR through OpenVR, it will be indicated as Initialized at the very top, as shown in the image. Please note that when the settings are modified while SteamVR is running, SteamVR must be restarted to apply the changes.

    Configure connection settings

    First, configure the connection settings so that the driver listens to the Motive server where the tracking data is streamed from. The server address must match the address where Motive is streaming the data to, and the local address must match the IP address of the computer on the network where the driver is installed.

    Set up the HMD

    In the HMD section, enable the HMD and input the Rigid Body ID of the HMD. This ID must match the Streaming ID property of the HMD Rigid Body definition in Motive.

    Save out the configuration

    Save the configuration by clicking Save. This modifies the set of configurations in the steamvr.settings file in the Steam installation directory, which will override the HMD tracking with the tracking data from Motive. If you already had an instance of OpenVR or SteamVR running, restart the application to apply the changes.

    Configuration File

    The configuration tool imports and modifies the contents of the steamvr.settings file (C:\Program Files (x86)\Steam\config\steamvr.settings). When needed, driver-related settings can also be changed directly in this file, but it is easier to configure them using the provided configuration tool.
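Since steamvr.settings is a JSON file, an edit amounts to rewriting a key in a JSON object. The Python sketch below shows the general shape of such an edit using only the standard library; the "driver_optitrack" section and "serverAddress" key are placeholder names for illustration, not the driver's actual schema.

```python
import json

def set_server_address(settings_json, address):
    """Rewrite one key in a steamvr.settings-style JSON document.

    "driver_optitrack" and "serverAddress" are hypothetical names used
    for illustration; consult the real file for the actual section and
    key names the driver writes.
    """
    settings = json.loads(settings_json)
    settings.setdefault("driver_optitrack", {})["serverAddress"] = address
    return json.dumps(settings, indent=2)

original = '{"driver_optitrack": {"serverAddress": "127.0.0.1"}}'
print(set_server_address(original, "192.168.1.5"))
```

In practice the configuration utility is the safer route, since a hand edit with a typo can leave the file unparseable.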

    Confirm the setup

    Launch SteamVR. If the driver is successfully set up, you should see a tracker icon added to the right of the HMD icon and the HMD will now be using the motion capture system instead of the base stations. Here, please make sure all of the lighthouse base stations are powered off.

    VIVE Controllers

    VIVE controllers are a Beta feature and may not work for every device. Support for this particular feature is limited.

    Setting up the controller (optional)

    When needed, the Vive controllers can be configured as well. To do so, open the configuration utility tool while SteamVR is running. At the top of the configuration tool, it should indicate the OpenVR status as Initialized, and the controllers must be showing up in SteamVR. Then, in the controller sections, enable the controllers, specify the override device using the drop-down menu, and input the corresponding Streaming ID of the controller Rigid Bodies in Motive. Once everything has been configured, save the changes and restart SteamVR. When the override is configured properly, SteamVR will show an additional tracker icon for each enabled controller.

    Data Streaming

    Now that the driver is set up, the HMD tracking will be overridden by tracking data from the mocap camera system, and you can integrate HMDs into the game engine through their own VR integration.

    Streaming settings in Motive

    First, make sure the streaming settings are configured in Motive for streaming out the data. For more information regarding streaming in Motive please visit our wiki page:

    • Enable must be toggled on.

    • Local interface must be set to the desired IP address to stream the tracking data from.

    • Streaming of Rigid Bodies must be set to True.

    Confirm the Connection

    Once Motive is configured for streaming, launch SteamVR Home to check the connection. If everything is set up correctly, you should be able to move around, or translate, within the scene freely. You may also need to check the ground plane to make sure it's well aligned.

    If you experience any unexpected rotations in the view as you move your head, it could indicate that the HMD pivot point has not been calibrated properly. Please revisit the HMD Setup section and make sure the HMD Rigid Body pivot point is positioned and oriented as expected: at the root of the nose, with z-forward.

    Developing in Unreal Engine

    Make sure Unreal Engine is configured for SteamVR development. Please refer to the Unreal Engine's documentation for more information on developing for SteamVR.

    Streaming Rigid Body/Skeleton data

    This driver is designed for streaming of HMD and controller tracking data only. For streaming tracking data of other Rigid Body objects, you will need to use the corresponding plugins ( or ). In other words, the HMD tracking data will be streamed through the SteamVR using the driver you've installed, and all other tracking data will be streamed through the plugin.

    Aligning world coordinates with the UE plugin

    Client Origin

    When using the OpenVR driver for the HMD and the game engine plugins (UE/Unity) for other types of tracking data, including Rigid Body data, the client origin object must be located at the global origin without any rotations. In other words, both the position and the rotation must be set to (0,0,0) on the client origin. This is important because it aligns the coordinate system of the (UE/Unity) plugin with the coordinate system of the OpenVR pipeline.

    Notes for Unreal Engine Users

    When using the Unreal Engine plugin, you will need to additionally create a custom pawn for properly aligning the coordinate systems between SteamVR and OptiTrack UE plugin:

    1. Create an "OptTrack Client Origin" to the scene and set the relevant connection info. Refer to the page for more information on setting up the client origin.

    2. Create a new pawn. Right-click in Content Browser, and from the context menu, select Blueprint → Blueprint Class → Pawn.

    3. Open the created Blueprint in the editor and add a camera component.

    Autodesk MotionBuilder Plugin

    Overview

    The OptiTrack MotionBuilder Plugin is a collection of MotionBuilder devices, scripts, and samples used for working with OptiTrack Motive data inside MotionBuilder. The device plugins allow users to stream live Motive data or recorded .tak data into MotionBuilder. This page is designed to help you get started with the general download and setup process, and organizes the three MotionBuilder plugin pages for quick reference.

    For basic instructions on setting up a motion capture system, please refer to the guide.

    Included Plugins:

  • Open/Create a new Unreal Engine project.
  • Under the Edit menu, click Plugins to open up the panel where all of the available plugins are listed.

  • Browse to the OptiTrack section and enable the "OptiTrack - Streaming Client".

  • Click Apply to submit the changes. This will require the Unreal Engine project to be restarted.

  • [Unreal] Place the OptiTrack Client Origin at the desired location within the scene.
  • [Unreal] Select the instantiated OptiTrackClientOrigin object from the World Outliner panel.

  • [Unreal] In the Details panel, make sure its Auto Connect setting is checked. This configures the client origin to automatically search the network and connect to Motive.

  • Now that the client origin is set, the client origin will attempt to connect to Motive and start receiving the tracking data whenever the scene is played.

  • [Motive] In Motive, assign a value to Streaming ID property for the target Rigid Body.

  • [Unreal] In the properties of the OptiTrack Rigid Body Actor component, match the Tracking ID with the Streaming ID of the Rigid Body asset in Motive.

  • Make sure both Motive and the OptiTrack Client Origin are set up for streaming, hit Play, and the attached actor object will be animated according to the live-streamed Rigid Body tracking data.

  • Play the scene, and adjust the socket location from the Skeleton Editor to adjust alignment of the bone.
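Conceptually, the ID matching in the steps above works like a dictionary lookup: each incoming Rigid Body sample carries the Streaming ID assigned in Motive, and the actor whose Tracking ID matches that value receives the pose. A minimal Python sketch (all names are illustrative, not part of the plugin API):

```python
from dataclasses import dataclass

@dataclass
class RigidBodySample:
    streaming_id: int     # Streaming ID assigned to the Rigid Body in Motive
    position: tuple       # (x, y, z)
    rotation: tuple       # quaternion (qx, qy, qz, qw)

class TrackedActor:
    """Stand-in for a scene actor with an OptiTrack Rigid Body component."""
    def __init__(self, name, tracking_id):
        self.name = name
        self.tracking_id = tracking_id  # must equal the Motive Streaming ID
        self.pose = None

def apply_frame(actors, samples):
    """Route each sample to the actor whose Tracking ID matches its Streaming ID."""
    by_id = {a.tracking_id: a for a in actors}
    for s in samples:
        actor = by_id.get(s.streaming_id)
        if actor is not None:           # no match, no update
            actor.pose = (s.position, s.rotation)
```

An actor whose Tracking ID has no matching Streaming ID in the frame simply stays where it is, which is also how a mismatched ID manifests in the engine: the actor never animates.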

    Using active markers is recommended, especially for applications that involve tracking multiple HMDs in the scene.

    Marker Placement

    • Make sure the markers are attached securely and do not move. If the markers happen to move even slightly after a Rigid Body is defined, it will negatively affect the tracking and the Rigid Body definition may need to be updated.

    • Avoid placing multiple markers in close vicinity as they may overlap in the camera view in certain orientations.

    • Using marker posts to extend out the markers is recommended to improve marker visibility from more angles.

    • If you are using the active markers, there is an extra USB port on the HMD that you could draw the power from.

    • Please read through the page for additional information on the marker placement on a Rigid Body.

    If the selected marker matches one of the Active clips, it will indicate which type of Active Clip is being used.
  • Under the Orientation drop-down menu, select the desired orientation of the HMD. The orientation used for streaming to Unity is +Z forward and to Unreal Engine is +X forward; alternatively, you can specify the expected orientation axis on the client plugin side.

  • Hold the HMD at the center of the tracking volume where all of the active markers are tracked well.

  • Select the 8 active markers in the 3D viewport.

  • Click Create. An HMD Rigid Body will be created from the selected markers and it will initiate the calibration process.

  • During calibration, slowly rotate the HMD to collect data samples in different orientations.

  • Once all necessary samples are collected, the calibrated HMD Rigid Body will be created.

  • For wireless streaming, use the Unicast streaming type.
    (optional) Double-check that the “Lock to HMD” property is set to true under the camera component properties in the details pane.
  • Select the pawn and set the “Base Eye Height” property to 0 in the details pane.

  • Compile the pawn then add it to the scene.

  • Select the pawn and set the “Auto Possess Player” to “Player 0”.

  • The HMD should now be working for Levels built for VR.

  • OptiTrack - Skeleton

  • OptiTrack - Optical

  • OptiTrack - Insight VCS

    Motive Data Streaming Setup (Server)

    First, you'll want to follow the instructions below to set up the data streaming settings in Motive. Once this is configured, Motive will be broadcasting tracking data onto a designated network interface where client applications can receive them.

    Streaming Settings


    Open the Data Streaming pane in Motive's Settings window and configure the following settings:

    • Enable - Turn on the Enable setting at the top of the NatNet section.

    • Local Interface - Choose the desired IP network address from this dropdown to stream data over.

      • Loopback

        • This is the local computer IP address (127.0.0.1 or Localhost).

        • Used for streaming data locally on the PC you are running Motive on that does not interact with the LAN.

        • Good option for testing network issues.

      • 192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)

        • This IP address is the interface of the LAN either by Wi-Fi or Ethernet.

        • This will be the same address the Client application will use to connect to Motive.

    • Transmission Type

      • For streaming over a Wi-Fi network, setting the Transmission Type to Unicast is strongly advised.

    • Select desired data types to stream under streaming options:

      • Rigid Bodies - Enabled (required).

      • Skeletons - Optional for Skeleton tracking.

      • Markers (Labeled, Unlabeled, Asset)

    • Skeleton Coordinates

      • Set to Local.

    • Bone Naming Convention

      • When streaming Skeletons, set to FBX.
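To illustrate what happens on the client side of these settings, here is a minimal Python sketch of a UDP socket set up to receive NatNet frame data. Port 1511 and the 239.255.42.99 multicast group are NatNet's documented defaults; the function name is our own, and this is only a sketch, not a replacement for the NatNet SDK:

```python
import socket
import struct

# NatNet defaults: frame data arrives on UDP port 1511; in Multicast mode
# Motive sends to the group address 239.255.42.99.
NATNET_DATA_PORT = 1511
NATNET_MULTICAST_GROUP = "239.255.42.99"

def make_natnet_data_socket(local_iface, multicast=True):
    """Create a UDP socket ready to receive NatNet frame data.

    local_iface should be the same address chosen under Local Interface in
    Motive's streaming settings (e.g. "127.0.0.1" for Loopback).
    """
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", NATNET_DATA_PORT))
    if multicast:
        # Join the NatNet multicast group on the chosen interface.
        mreq = struct.pack("4s4s",
                           socket.inet_aton(NATNET_MULTICAST_GROUP),
                           socket.inet_aton(local_iface))
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
    return sock

# Unicast (recommended over Wi-Fi) receives on the same port without
# joining a multicast group:
# sock = make_natnet_data_socket("192.168.0.10", multicast=False)
```

This also explains why the Local Interface choice matters: a multicast membership is joined on a specific interface, so the client must use the same network the server is streaming on.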

    Additional Tips

    • For best results, it is advised to run Motive and MotionBuilder separately on different computers, so that they are not competing for processing resources.

    • When streaming the data over a Wi-Fi network, Unicast transmission must be used.

    • In order to stream data from Edit mode, a capture recording must be playing back in Motive.

    • For additional information on data streaming in general, read through the Data Streaming page.

    Plugins

    OptiTrack - Skeleton

    The OptiTrack Skeleton Device allows you to map Motive 6DOF Skeleton joint angle data directly onto a MotionBuilder character.

    OptiTrack - Optical

    The OptiTrack Optical Plugin device allows you to map motion capture (optical) data onto an animated character within MotionBuilder.

    OptiTrack - Insight VCS

    The Virtual Camera device is specifically designed for creating a Virtual Camera in MotionBuilder. You can use the Insight VCS device with standard OptiTrack applications such as Motive, or you can use the device in "Universal" mode, which works with generic MotionBuilder Optical or RigidBody objects, allowing you to use the Insight VCS device with alternative motion capture systems that support optical or rigid body devices in MotionBuilder.

    OptiTrack MotionBuilder Plugins view in MotionBuilder in the Asset Browser tab under Devices.

    Downloading the Plugins

    After downloading the MotionBuilder plugin from the OptiTrack website, follow the steps below for a successful install.

    Step 1. Installer Wizard

    OptiTrack MotionBuilder Plugin .exe file.
    1. Double click the OptiTrack MotionBuilder Plugin .exe file to open the install wizard.

    2. Once the install wizard is open click Next.

    3. Then read through the End User License Agreement and select I accept the terms in the license agreement then click Next.

    4. The next step is to select the version of the plugin that matches the version of MotionBuilder you have. Typically the installer will auto-detect which version is installed on your computer. However, if all versions have a red 'x', click the dropdown menu for the appropriate version and select This feature will be installed on local hard drive. Then click Next.

    5. Click Install and then Yes for the installation to begin.


    Wireless Multiplayer Setup

    When setting up multiplayer games on wireless clients, it is more beneficial for each client to make a direct connection to both the tracking server (Motive) and the game server, rather than rebroadcasting the streamed tracking data through the game server. Any game-related actions that interact with the tracking data can then be processed on the game server, which sends the corresponding updates to the wireless clients. This allows the wireless clients to receive tracking data and updates without having to send back any information; in other words, it minimizes the number of data transfers needed. If wireless clients are sending data, there will be a minimum of two transfers on the wireless network, and each transfer over the wireless network is at risk of latency or lost packets.

    Getting Started

    Unreal Engine: MotionBuilder Workflow

    This guide will cover how to get motion capture data streamed from Motive to Unreal Engine 5's MetaHumans with the additional help of MotionBuilder.

    This method, commonly used in virtual production pipelines, uses MotionBuilder to manually retarget the bones and source the data streamed from Motive onto a MetaHuman. Additionally, this process allows for more refined cleanup through MotionBuilder if required in the future.

    Motive Setup

    Open the Data Streaming pane in Motive's Settings window and configure the following settings:

  • Rigid Body Tracking - Disabled for HMDs (advised).

  • Devices - Disabled.

  • Data Streaming

    Enable

    Turn on the Enable setting at the top of the NatNet section to enable streaming in Motive.

    Local Interface

    It is recommended that Motive and MotionBuilder should run on the same machine. An additional machine should be used to run Unreal, since UE5 can be resource intensive.

    Set to Loopback. This is the LocalHost IP since Motive and MotionBuilder will be running on the same machine.

    Transmission Type

    Set to Multicast

    Markers

    Select which data types you'd like to stream into UE5; at minimum, Rigid Bodies and Skeletons must be enabled.

    Skeleton Coordinates

    Set to Global.

    Bone Naming Convention

    Set to FBX

    Up Axis

    Set to Y-Axis

    Remote Trigger

    Leave unenabled

    Be sure to either actively play back a looped Take or stream a live Skeleton in Motive so the plugin can capture the data and send it to MotionBuilder.

    Unreal Engine 5 Plugin for MotionBuilder Setup

    Please download the MotionBuilder UE5 plugin here. Currently, the plugin download link is not yet available on Unreal Engine's official documentation website; it is a community-updated plugin for UE5 that works just the same. Download the MobuLiveLink.zip file.

    Plugin Installation Steps

    Once downloaded, you can extract the Zip folder and go to MobuLiveLink > Binaries > MobuVersion.

    You'll then want to copy the three files in this folder and paste them into the C:\Program Files\Autodesk\MotionBuilder 2020\bin\x64\plugins directory.

    Once these files are added to the plugins directory, the device should appear as UE - Livelink under Devices in the Asset Browser in MotionBuilder.

    Unreal Engine Setup

    Plugins

    It's best to enable both the Quixel Bridge and OptiTrack Live Link plugins together, since you'll likely need to restart Unreal Engine each time you enable a plugin.

    Quixel Bridge

    First, you'll want to verify that the Quixel Bridge plugin is installed with Unreal Engine 5. You can install the Quixel Bridge plugin from the Epic Games Launcher by clicking Unreal Engine then the Library Tab.

    You'll also want to make sure that the Quixel Bridge plugin is enabled. To do this, go to Edit > Plugins > Editor and click the check box. You may have to restart Unreal after enabling this plugin.

    OptiTrack Live Link Plugin

    You'll also need to download and enable the Unreal Engine OptiTrack Live Link plugin. Download the plugin from the link above, then open Settings > Plugins, look for OptiTrack in the sidebar or use the search bar to find Live Link, and check the box to enable the OptiTrack Live Link plugin.

    Create a Project

    Next, you'll want to create a project to import your MetaHuman into. Once your project is created, open Quixel Bridge within your Unreal project.

    Load MetaHuman into Project

    Now that you're in the Quixel Bridge window, log in to your MetaHuman account and navigate to MetaHumans > My MetaHumans.

    Select the MetaHuman you wish to load into the project and select the desired quality profile, then click Download. After it has finished downloading, you will be able to add it to the current project.

    Blueprint Settings

    Navigate to the MetaHuman blueprint in the Content Browser under Content > MetaHumans > "MetaHumanName" > BP_MetaHumanName and double click to open the blueprint window.

    In the Components tab on the left, select Body then navigate to the Skeletal Mesh tab of the Details node. Choose which Skeletal Mesh you'd like to use.

    Exit this window and find the mesh in the content browser and right click and select Asset Actions > Export....

    Select FBX Export and deselect Level of Detail in the FBX Export Options.

    MotionBuilder Setup and Characterizing

    Retarget Animation

    Now that the skeleton driving the MetaHuman is exported, we can retarget animation in MotionBuilder from the OptiTrack plugin.

    In MotionBuilder select File > Open and select the MetaHuman you exported earlier.

    Apply a namespace, this will prevent this skeleton from conflicting with any other potential skeletons in the scene.

    You do not need to import any takes with this file.

    The Skeleton should now appear in your scene/viewport.

    Characterize Skeleton

    Under the Character Controls panel, select Skeleton > Define under the Define tab.

    To fill out this definition quickly, select the bone you want to map first in the viewport and then in the Definition panel right click and hold on the bone.

    Hover over Assign Selected Bone then let go of right click.

    Repeat that process for all the Main Bone Hierarchy.

    There’s no need for any extra bones in the definition (for example, “Spine_04_Latissimus_r”), as Motive’s skeleton does not drive any of those bones.

    You'll want to make sure that the MetaHuman skeleton is in a T-pose. If hand tracking will be added too, make sure to pose the hands in the proper pose for said hand tracking application.

    When all the bones are filled out in the definition and the skeleton is in T-pose there should be a green checkmark notifying everything is ready to characterize.

    To lock in the definition, click the lock at the top of the Character controls panel. Now that the character is created, the respective plugins can be added to the scene that will drive the streaming into Unreal.

    MotionBuilder Streaming Setup

    In MotionBuilder, go to the Asset Browser pane and select Devices from the Peripherals dropdown. Drag and drop the OptiTrack Skeleton and UE5 Live Link devices into the main viewport to add them to the scene.

    From the navigator tab on the left, select OptiTrack Skeleton from the I/O Devices dropdown. This will open the Main tab of the OptiTrack Skeleton plugin's settings.

    Settings

    Local and Server Address

    Set both to 127.0.0.1

    Server Type

    Set this to match what was selected in Motive (either Unicast or Multicast).

    Auto Characterize

    Make sure to check this box to enable Auto Characterize

    Online

    Click this box to enable streaming in MotionBuilder:

    • Green - Connected and streaming.

    • Yellow - Connected but not streaming.

    • Red - Not connected or streaming.

    Live

    Check this box if streaming Live data

    Model Binding

    Click Create... and select the actor from Motive you wish to track

    Unreal Engine - Live Link

    Settings

    Online

    Click this box to enable streaming in UE

    • Green - Connected and streaming.

    • Yellow - Connected but not streaming.

    • Red - Not connected or streaming.

    Subject Selector

    Click the dropdown and select the root of your MetaHuman skeleton.

    Click Add.

    Stream Viewport Camera

    Select Stream Viewport Camera if you want to stream your viewport's camera to the preview section in Unreal Engine.

    Character Controls

    In the Character Controls panel, select the MetaHuman as the Character and the Motive Actor as the Source.

    You should now see the MetaHuman skeleton copying the data that the Motive Skeleton is using.

    (Optional) Streaming Across Another Device

    In the Settings tab of the UE Live Link device, go to the Unicast Endpoint section, change the address to the MotionBuilder machine's IPv4 address, and append a port number after a colon following the IPv4 address.

    To find the IP address of your computer, open a Command Prompt window, type 'ipconfig', and hit Enter. This will display all the network adapters on your computer; scroll to find the Ethernet adapter associated with the NIC you'll be using to connect to the network.

    Port 6666 is reserved and cannot be used for endpoints.
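The endpoint string (IPv4 address, a colon, then a port number) and the reserved-port rule can be sketched in Python. The helper names here are our own, purely for illustration:

```python
import socket

RESERVED_PORTS = {6666}  # 6666 is reserved and cannot be used for endpoints

def local_ipv4():
    """Best-effort discovery of the LAN-facing IPv4 address. A UDP connect()
    sends no packets; it only selects the outgoing interface."""
    s = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        s.connect(("192.168.0.1", 9))  # any LAN-style address works here
        return s.getsockname()[0]
    finally:
        s.close()

def endpoint(ip, port):
    """Format the Unicast Endpoint string, rejecting the reserved port."""
    if port in RESERVED_PORTS:
        raise ValueError(f"port {port} is reserved and cannot be used")
    return f"{ip}:{port}"

# e.g. endpoint("192.168.0.12", 1500) -> "192.168.0.12:1500"
```

The same `ip:port` string is what you enter as the Static Endpoint array element on the receiving Unreal machine in the next step.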

    In Unreal on the computer receiving the stream, open:

    Project Settings > UDP Messaging >Advanced > Static Endpoints > + to add an Array Element and enter the IPv4 address and port number from the previous step.


    This will allow MotionBuilder to Live Link to Unreal Engine as normal.

    Final Steps in Unreal

    Open the OptiTrack Live Link plugin downloaded earlier. Navigate to Window > Virtual Production > Live Link.

    In the Live Link tab, create a new MotionBuilder source by selecting Source > Message Bus Source > MoBu Live Link.

    Now that the Live Link connection has been established, the motion data from MotionBuilder can now be streamed into the MetaHuman via an animation blueprint.

    Right click in the content browser where you want to create an animation blueprint and choose Animation> Animation Blueprint.

    Unreal will then prompt you to choose a specific Skeleton asset. Choose metahuman_base_skel, click Create, and name it ABP_”MetaHumanName”.

    Now open the animation blueprint and in the AnimGraph create a Live Link Pose node and select the subject name as UE. Then simply hit Compile and Save.

    You do not need to have an Input Pose node in the anim graph in order for it to work, however, if you would like to place one simply right click and search for Input Pose.

    Lastly, place the MetaHuman into the scene, select it, and go to the Details panel on the right of the viewport. Find and select Body, then scroll down to the Animation tab. In Animation Mode select Use Animation Blueprint, and in Anim Class select the Animation Blueprint that you made.

    If your animation data is streaming in, you should see your MetaHuman snap to a position, indicating that it has updated. If you would like to view the data in the edit viewport, go to the Details panel again and add a Live Link Skeletal Animation component.

    And that’s it! With this method you’ll be able to easily make changes in MotionBuilder where you see fit, or make adjustments in the Animation Blueprint that is driving the streamed data.

    For more information, please visit the individual pages for each of the plugins.

    OptiTrack OpenVR Driver

    Overview

    The OptiTrack VR driver lets you stream tracking data of the head-mounted display (HMD) and the controllers from Motive into SteamVR. The plugin ships in an installer package (MSI), and it will set up the driver along with a tool for configuring streaming settings. Using this, the tracking data from Motive can be used to override the tracking of the VR system in any applications that are compatible with SteamVR.

    Supported HMDs:

    • Vive

    • Vive Pro

    • Vive Pro 2

    • Valve Index

    Requirements

    • Motive version 2.2 or higher

    • SteamVR installed on the host computer.

    HMD Setup

    Place Markers on the HMD

    For the camera system to track the HMD, a set of markers must be attached to the HMD. You can use either active markers (Active HMD Clip or Active Tags) or passive markers. Passive markers are retroreflective markers that reflect infrared light emitted by the IR LEDs on the camera. Active markers, on the other hand, are LED markers that emit IR light and can be uniquely identified.

    In general, for most VR applications, using active markers is recommended for better tracking stability and ease of use. Active markers also have advantages over passive markers when tracking a large number of objects. For applications that are sensitive to the accuracy of the tracking data, using passive markers may have more benefits. To get more help with finding the best solution for your tracking application, please contact us.

    When using the active markers, you can conveniently put a set of 8 markers onto the HMD by using the HMD Clip, or you can attach the markers from the Tag manually onto the HMD using adhesives and marker posts.

    Active HMD Clip

    The Active HMD Clip is an HMD enclosure with a total of 8 active markers embedded for tracking. At the time of writing, active HMD clips for Vive Pro / Valve Index HMDs are available on the webstore. The clip can be mounted easily by pushing it onto the HMD until the latches click, and detached by gently lifting the three latches located at the top, left, and right sides of the clip.

    Once the clip has been mounted, the next step is to import the provided Rigid Body asset into Motive and refine the definition to get the calibrated pivot point position and orientation, which is explained in the next section.

    Marker Types

    You can use either passive retroreflective markers or active LED markers to track the HMD. Passive markers reflect infrared light emitted by the IR LEDs on the camera, while active markers emit IR light that gets uniquely identified in Motive. Either type of marker can be used to track HMDs, but using active markers is recommended, especially for applications that involve tracking multiple HMDs in the scene.

    Create HMD Rigid Body in Motive

    This feature can be used only with HMDs that have the clips mounted.

    When using an OptiTrack system for VR applications, it is important that the pivot point of the HMD Rigid Body is placed at the appropriate location: at the root of the nose, in between the eyes. When using the HMD clips, you can utilize the HMD creation tools in the Builder pane to have Motive estimate this spot and place the pivot point accordingly. These tools utilize the known marker configuration on the clip to precisely position the pivot point and set the desired orientation.

    HMDs with passive markers can utilize the External Pivot Alignment tool to calibrate the pivot point.

    Steps

    1. First of all, make sure Motive is configured for tracking active markers.

    2. Open the Builder pane under the View tab and click Rigid Bodies.

    3. Under the Type drop-down menu, select HMD. This will bring up the options for defining an HMD Rigid Body.

    This is supported only in Motive versions 2.1.2 or above. If you are using any other version of Motive 2.1, please update to 2.1.2, or use a template to create the Rigid Body definition; instructions are provided on the following page: Using a Template File to Create Vive Pro Active Clip Rigid Body.

    Setting up the OpenVR Driver

    SteamVR Required: The VR driver streams tracking data through SteamVR. Please make sure SteamVR is installed on the computer before setting up the driver.

    Setup Steps

    Download and run the installer

    Download the OpenVR driver from the downloads page. Once downloaded, launch the installer and follow the prompts to set up the driver. On the last window, make sure to select Launch Configuration Utility before clicking Finish. This will open the configuration options to set up your HMD with Motive.

    You may receive a warning window prior to the installation wizard. To circumvent this, select More info and then Run Anyway.

    Open the configuration program

    Once the driver has been successfully installed, launch the configuration utility software (C:\Program Files\OptiTrack\OpenVR Driver\ConfigUtil). Using this tool, you can load and check existing configurations and make changes to the settings as needed. To import current settings, click Load and to save out the changes, click Save.

    Please make sure you are running this tool with admin privileges; otherwise, it might not be able to modify the settings properly. If the configuration software detects a running instance of SteamVR through OpenVR, it will be indicated as Initialized at the very top. Please note that when the settings are modified while SteamVR is running, SteamVR must be restarted to apply the changes.

    Configure connection settings

    First, configure the connection settings so that the driver listens to the Motive server where the tracking data is streamed from. The server address must match the address where Motive is streaming the data to, and the local address must match the IP address of the computer on the network where the driver is installed.

    Set up the HMD

    In the HMD section, enable the HMD and input the Rigid Body ID of the HMD. The Rigid Body ID must match the Streaming ID property of the HMD Rigid Body definition in Motive.

    Save out the configuration

    Save the configuration by clicking Save. This will modify the set of configurations in the steamvr.settings file in the Steam installation directory, and they will override the HMD tracking with the tracking data from Motive. If you already had an instance of OpenVR or SteamVR running, restart the application to apply the changes.

    Configuration File

    The configuration tool imports and modifies the contents of the steamvr.settings file (C:\Program Files (x86)\Steam\config\steamvr.settings). When needed, the driver-related settings can be changed directly in this file as well, but it is easier to configure them using the provided configuration tool.

    Confirm the setup

    Launch SteamVR. If the driver is successfully set up, you should see a tracker icon added to the right of the HMD icon and the HMD will now be using the motion capture system instead of the base stations. Here, please make sure all of the lighthouse base stations are powered off.

    VIVE Controllers

    VIVE controllers are a Beta feature and may not work for every device. Support for this particular feature is limited.

    Setting up the controller (optional)

    When needed, the Vive controllers can be configured as well. To do so, open the configuration utility tool while SteamVR is running. At the top of the configuration tool, it should indicate the OpenVR status as Initialized, and the controllers must be showing up in SteamVR. Then, in the controller sections, enable the controllers, specify the override device using the drop-down menu, and input the corresponding Streaming ID of the controller Rigid Bodies in Motive. Once everything has been configured, save the changes and restart SteamVR. When the override is configured properly, SteamVR will show an additional tracker icon for each enabled controller.

    Data Streaming

    Now that the driver is set up, the HMD tracking will be overridden by tracking data from the mocap camera system, and you can integrate HMDs into the game engine through their own VR integration.

    Streaming settings in Motive

    First of all, make sure the streaming settings are configured in Motive for streaming out the data. For more information regarding streaming in Motive, please visit our Data Streaming page.

    • Broadcast Frame Data must be set to true.

    • Local interface must be set to the desired IP address to stream the tracking data from.

    • Streaming of Rigid Bodies must be set to True

    Notes for Unity users

    Please make sure the Unity project is configured for OpenVR development. In Unity, open the player settings from Edit → Project Settings → Player and select OpenVR under the Virtual Reality SDKs list. Once this is set up properly, the scene will play on the HMD.

    Unity OpenVR documentation:

    Unreal Engine

    Make sure Unreal Engine is configured for SteamVR development. Please refer to the Unreal Engine's documentation for more information on developing for SteamVR.

    Unreal Engine-SteamVR:

    Data Port Note

    As of the OpenVR Driver 2.1.0 the auto-detection port default is 1513. In the case where a firewall must configure individual ports to allow or disallow data, this port can be used to allow the OpenVR Driver to connect automatically to Motive.

    Streaming Rigid Body/Skeleton data

    This driver is designed for streaming HMD and controller tracking data only. For streaming tracking data of other Rigid Body objects, you will need to use the corresponding plugin (Unreal Engine or Unity). In other words, the HMD tracking data will be streamed through SteamVR using the driver you've installed, and all other tracking data will be streamed through the plugin's client origin.

    Aligning world coordinates

    Client Origin

    When using both the VR driver and the plugins (UE/Unity), it is important that the client origin object is located at the origin without any rotations. In other words, it must have the position set to (0,0,0) and the rotation set to (0,0,0).
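The requirement above amounts to checking that the client origin's transform is the identity. The helper below is purely illustrative (it is not part of either plugin) and just expresses the rule as code:

```python
def client_origin_ok(position, rotation_euler, tol=1e-6):
    """Return True if the client origin sits at the world origin with no
    rotation, as the VR driver setup requires. Illustrative helper only."""
    return all(abs(v) <= tol for v in (*position, *rotation_euler))

# A correctly placed client origin:
print(client_origin_ok((0, 0, 0), (0, 0, 0)))   # True
# Any offset or rotation breaks the alignment with SteamVR:
print(client_origin_ok((0, 1.5, 0), (0, 0, 0))) # False
```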

    Notes for Unreal Engine Users

    When using the Unreal Engine plugin, you will need to additionally create a custom pawn for properly aligning the coordinate systems between SteamVR and OptiTrack UE plugin:

1. Add an "OptiTrack Client Origin" to the scene and set the relevant connection info. Refer to the OptiTrack Unreal Engine Plugin page for more information on setting up the client origin.

    2. Create a new pawn. Right-click in Content Browser, and from the context menu, select Blueprint → Blueprint Class → Pawn.

3. Open the created Blueprint in the editor and add a camera component.

    Autodesk MotionBuilder: OptiTrack Skeleton Plugin

    Overview

    This page provides instructions on how to use the OptiTrack MotionBuilder Skeleton plugin. This plugin allows you to map Motive 6DOF Skeleton joint angle data directly onto a MotionBuilder character.

    MotionBuilder Skeleton Plugin.

Before following the walkthrough below, please refer to the Autodesk MotionBuilder Plugin page for the initial steps of setting up Motive and downloading the OptiTrack MotionBuilder plugin.

    Motive Data Streaming Setup (Server)

    First, you'll want to follow the instructions below to set up the data streaming settings in Motive. Once this is configured, Motive will be broadcasting tracking data onto a designated network interface where client applications can receive them.

    Streaming Settings

Open the Data Streaming pane in Motive's Settings window and set the following settings:

    • Enable - Turn on the Enable setting at the top of the NatNet section.

    • Local Interface - Choose the desired IP network address from this dropdown to stream data over.

      • Loopback

    Additional Tips

    • For best results, it is advised to run Motive and MotionBuilder separately on different computers, so that they are not competing for processing resources.

    • When streaming the data over a Wi-Fi network, Unicast transmission must be used.

    MotionBuilder Setup (Client)

To get started, drag the OptiTrack Skeleton plugin from the MotionBuilder Asset Browser tab > Devices into the Viewer window. This will create a dropdown menu called I/O Devices in the Navigator tab. Click the + button next to I/O Devices and select OptiTrack Skeleton. This will populate the plugin's settings tab if it hasn't already auto-populated from the earlier drag-and-drop step.

    Device Settings

    • Local address - IP address of the MotionBuilder computer. In situations where multiple network adapter cards are present, select the adapter that is on the same network as the Motive application.

      • 127.0.0.1

        • This is the local computer IP address (127.0.0.1 or Localhost).

Once the above settings are entered, click the box next to Online. This indicates whether or not Motive is successfully streaming to MotionBuilder.

    • Online color indicator

      • Green - Connected and streaming.

      • Yellow - Connected but not streaming.
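If you are unsure which address to enter for the Local address setting, the following Python sketch (an illustrative helper, not part of the plugin) lists this machine's IPv4 addresses and classifies them along the lines described above:

```python
import socket

def candidate_local_addresses():
    """Gather IPv4 addresses assigned to this machine. A rough sketch;
    `ipconfig` / `ip addr` gives the authoritative list."""
    addrs = {"127.0.0.1"}  # loopback is always available
    try:
        for info in socket.getaddrinfo(socket.gethostname(), None, socket.AF_INET):
            addrs.add(info[4][0])
    except socket.gaierror:
        pass  # hostname not resolvable; loopback still applies
    return sorted(addrs)

for ip in candidate_local_addresses():
    if ip.startswith("127."):
        note = "loopback -- use when Motive runs on this same PC"
    elif ip.startswith("169.254."):
        note = "APIPA -- no DHCP server reached; ignore for streaming"
    else:
        note = "LAN interface -- use when Motive runs on another computer"
    print(f"{ip}: {note}")
```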

    Model Binding

    The Skeleton Device plugin uses model binding to map Motive Skeleton data to MotionBuilder animation nodes. There can be multiple model binding templates in a MotionBuilder scene. The active model binding is indicated in the model binding combo box on the Device tab.

    Automatic Binding Update

    The MotionBuilder plugin monitors the tracking model list it is receiving from the server (e.g. Motive). If it detects a change in the tracking model list, it will automatically update the current MotionBuilder Skeleton list to match, creating new Skeletons when necessary, or re-connecting to existing Skeletons where possible, based upon the Skeleton/tracking model name. It will not remove existing MotionBuilder Skeletons.
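The update rule described above can be sketched as a simple diff between the streamed model list and the existing Skeleton list. The function below is an illustrative model of that behavior, not the plugin's actual code:

```python
def plan_binding_updates(server_models, mobu_skeletons):
    """Mirror the automatic-binding rule: create Skeletons for new tracking
    models, reconnect by name where possible, and never remove existing
    MotionBuilder Skeletons. Illustrative sketch only."""
    create = [m for m in server_models if m not in mobu_skeletons]
    reconnect = [m for m in server_models if m in mobu_skeletons]
    # Existing Skeletons that no longer stream are left untouched.
    keep_idle = [s for s in mobu_skeletons if s not in server_models]
    return {"create": create, "reconnect": reconnect, "keep_idle": keep_idle}

plan = plan_binding_updates(["Ann", "Bob"], ["Bob", "OldTake"])
# "Ann" is new, "Bob" reconnects by name, "OldTake" is kept but idle.
print(plan)
```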

    Manual Binding Update

    If the plugin is unable to automatically update the model template, you must update your model binding in the Skeleton Device plugin to a model binding that matches the tracked model. To do this:

    1. [Motive] Change the actively tracked models list by checking/unchecking the desired asset in the Asset pane.

    2. [MotionBuilder Plugin] Press the Update Bindings button on the Skeleton Device tab.

    3. [MotionBuilder Plugin] Select a valid model binding in the Model Binding dropdown.

    If the active model list in Motive changes, the MotionBuilder Plugin Device Information panel will show Tracking Models Changed and the Info Tab will indicate whether a suitable template was found.

    Auto-Characterize

    The Auto-Characterize button on the Skeleton Device will automatically characterize each generated Skeleton based on the scaled neutral pose of the performer. It will create a new MotionBuilder character with the same name as the mocap performer/MotionBuilder Skeleton pairing. The advantages of using this approach are:

    • Does not require Performers to be in T-Pose.

    • Simplifies the number of steps required to map mocap Skeleton onto your rigged MotionBuilder character.

• Can result in improved retargeting results when performers do not present a good T-Pose, since the characterization is done on a scaled neutral, or 'zeroed,' pose.

    When using Auto-Characterize, you must be streaming from Motive before enabling the OptiTrack Skeleton device for correct pose scale detection. Otherwise, when the auto-characterized Skeleton is used as the input to a target character, the character may incorrectly scale. This does not apply if the target character’s Match Source or Actor Space Compensation is adjusted.

    Auto-Characterize is an optional feature and is not required for character retargeting.

    Step-By-Step Example

    Motive Skeleton Streaming Step-by-step


    Recording Skeleton Data

The OptiTrack Skeleton device can record optical data to the current MotionBuilder take. Please refer to the Autodesk MotionBuilder: OptiTrack Optical Plugin page for steps on how to record from devices into MotionBuilder.

    Playing Back Recorded Data

The OptiTrack Skeleton device can be used to show live data or blend live data with a recorded take. Please refer to the Playing Back Recorded Data section for steps on how to play back recorded data in MotionBuilder.

    MotionBuilder Skeleton Character Setup QuickStart

    One approach to quickly stream Motive Skeletons onto MotionBuilder characters:

1. [Motive] - Set up a streaming session as outlined above under Motive Data Streaming Setup (Server). Set Data Streaming > Bone Naming Convention > FBX. If this is not set to FBX, the plugin will change it automatically when connected to the server.

2. [MoBu] - Connect to Motive and create a model binding as described in the Step-By-Step Example above.

    3. [MoBu] - Make sure the Auto-Characterize feature is enabled under the OptiTrack Skeleton Device.

Note for saving into FBX: When a MoBu character is associated with a streamed Skeleton and is set to Active, only the mapped segments of the character will be saved in exported FBX files; segments that were not mapped will not be saved. To fully preserve the character, including unassociated segments, uncheck Active (from Step 8) before saving out the FBX file.

    This is the local computer IP address (127.0.0.1 or Localhost).

• Used for streaming data locally on the PC running Motive, without interacting with the LAN.

  • Good option for testing network issues.

  • 192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)

    • This IP address is the interface of the LAN either by Wi-Fi or Ethernet.

    • This will be the same address the Client application will use to connect to Motive.

  • Transmission Type

    • For streaming over a Wi-Fi network, setting the Transmission Type to Unicast is strongly advised.

  • Select desired data types to stream under streaming options:

    • Rigid Bodies - Enabled (required).

    • Skeletons - Optional for Skeleton tracking.

• Markers (Labeled, Unlabeled, Asset) - Disabled for HMDs (advised).

    • Devices - Disabled.

  • Skeleton Coordinates

    • Set to Local.

  • Bone Naming Convention

    • When streaming Skeletons, set to FBX.

• To stream data from Edit mode, a recorded capture must be playing back in Motive.

  • For additional information on data streaming in general, read through the Data Streaming page.

  • Use this loopback address if Motive is running on the same machine as MotionBuilder.

  • 192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)

    • This IP address is the interface of the LAN either by Wi-Fi or Ethernet.

    • Use this if Motive is running on a different computer, but on the same network as the MotionBuilder computer.

  • 169.xxx.x.xx

    • This address is assigned when a DHCP server could not be reached.

    • This address can be ignored for our application.

• Server Address - IP address of the computer that is running Motive.

    • 127.0.0.1

      • Use this IP when both Motive and MotionBuilder are running on the same computer.

  • Server Type

    • Multicast (default) or Unicast

    • Must match what is selected in the Motive Streaming settings.

    • Multicast is default and recommended.

  • Pause Streaming

    • Pauses the live stream.

    • Useful when characterizing from a Live T-Pose

  • Update Bindings

    • Use to update model bindings when the actively tracked models list in Motive changes.

• Only necessary if the plugin has not automatically detected the tracking list change (e.g., if the tracking list changed while data was not streaming).

  • Auto-Characterize

    • Automatically characterize each generated Skeleton based on the scaled neutral pose of the performer.

    • It will create a new MotionBuilder character with the same name as the mocap performer/MotionBuilder Skeleton pairing.

  • Red - Not connected or streaming.
  • Live

    • Indicates to MotionBuilder that data is coming from a live source (checked) or from a recorded take.

  • Recording

    • Indicates to MotionBuilder that data from this device should be recorded when MotionBuilder is recording.

• [MoBu] - Import a rigged model. Be sure the rigged model is in a T-Pose facing +Z.

  • [MoBu] - Characterize rigged model.

  • [MoBu] - Rigged Model Character > Input Type > Character

  • [MoBu] - Rigged Model Character > Input Source > Select a Motive Character from Step 2.

  • [MoBu] - Rigged Model Character > Check Active. Model should snap to Motive Skeleton position.

  • Begin streaming/playback from Motive. Rigged model should now animate with Motive Skeleton.

  • [Motive]

    Configure Motive for Streaming Data

    From the Motive Streaming Pane:

    • Select Enable

• Set Bone Naming Convention to FBX. (The plugin device will automatically reconfigure this if not already set to FBX.)

• Choose the Local Interface IP address from the dropdown. If the same computer is running both Motive and MotionBuilder, use the Loopback option.

    Be sure to configure any Firewall software first (either disable or permit MotionBuilder as an exception).

    [MoBu]

    Create an OptiTrack Skeleton device

In the MotionBuilder Asset Browser window, open the Devices section. You should see:

    OptiTrack Skeleton

• Within MotionBuilder, drag the OptiTrack Skeleton device into the Navigator (or Viewer) pane. An instance will be created under the Devices node.

    [MoBu]

    Connect Skeleton Device to Motive

    • In the Navigator window, select OptiTrack Skeleton from the Devices node

    • On the OptiTrack Skeleton pane, set the IP address of the OptiTrack server (e.g. Motive).

    • Click on the 'Online' checkbox - it should change from red to yellow (or green if data from the OptiTrack Server is currently streaming).

    [MoBu]

    Create a Motive to MoBu Skeleton binding

    • Click the Live checkbox

    • Click the Model Binding Combo and select Create New

• In the Navigator window, under the Scene node, you should see new Skeleton nodes that match the currently tracked Motive Skeletons.

    [MoBu]

    Begin streaming marker data

    • From Motive, start live capture or data playback

    • From MotionBuilder, ensure the Viewer window is active (MotionBuilder will not update otherwise).

    • The Skeleton(s) should be animating in the MotionBuilder Viewer window.

    • The MotionBuilder Online check boxes should be green, indicating data is live and actively streaming.



    Marker Placement
    • Make sure the markers are attached securely and do not move. If the markers happen to move even slightly after a Rigid Body is defined, it will negatively affect the tracking and the Rigid Body definition may need to be updated.

• Avoid placing multiple markers in close proximity, as they may overlap in the camera view in certain orientations.

    • Using marker posts to extend out the markers is recommended to improve marker visibility from more angles.

• If you are using active markers, there is an extra USB port on the HMD that you can draw power from.

• Please read through the Rigid Body Tracking page for additional information on marker placement on a Rigid Body.

    If the selected marker matches one of the Active clips, it will indicate which type of Active Clip is being used.
• Under the Orientation drop-down menu, select the desired orientation of the HMD. The orientation used for streaming is +Z forward for Unity and +X forward for Unreal Engine; alternatively, you can specify the expected orientation axis on the client plugin side.

  • Hold the HMD at the center of the tracking volume where all of the active markers are tracked well.

  • Select the 8 active markers in the 3D viewport.

  • Click Create. An HMD Rigid Body will be created from the selected markers and it will initiate the calibration process.

  • During calibration, slowly rotate the HMD to collect data samples in different orientations.

  • Once all necessary samples are collected, the calibrated HMD Rigid Body will be created.

  • For wireless streaming, use Unicast streaming type.
    (optional) Double-check that the “Lock to HMD” property is set to true under the camera component properties in the details pane.
  • Select the pawn and set the “Base Eye Height” property to 0 in the details pane.

  • Compile the pawn then add it to the scene.

  • Select the pawn and set the “Auto Possess Player” to “Player 0”.

  • The HMD should now be working for Levels built for VR.


    Unity: HMD Setup

This page provides instructions on setting up the OptiTrack OpenVR driver to integrate an OptiTrack system with Vive HMDs in SteamVR applications, including Unreal Engine and Unity.

    Overview

For integrating Vive / Vive Pro / Valve Index HMDs, the OptiTrack OpenVR Driver must be used. This driver lets you track the head-mounted display (HMD) and the VR controllers using an OptiTrack motion capture system and stream the tracking data from Motive directly into SteamVR; in other words, it overrides the tracking from the Lighthouse base stations. The driver ships as an installer package (MSI) that sets up the driver along with a utility tool for configuring client streaming settings. Once integrated, the streamed tracking data can be used in any application platform that utilizes SteamVR. For tracking objects other than HMDs, please read through the OptiTrack Unity Plugin page for details.

    Supported Systems

    • VIVE

    • VIVE Pro 1/2

    • Valve Index

    • HP Reverb G2

    HMD Setup

First, set up and optimize the motion capture volume as explained in the Getting Started guide or the Hardware Setup documentation. If you plan to install any obstacles (e.g. walls) within the capture volume, make sure they are non-reflective, and place and orient the cameras so that every corner is thoroughly captured by multiple cameras.

    General Setup Steps

    1. Attach the markers on the HMD

    2. Create a Rigid Body asset

    3. Calibrate the Pivot Point of the Rigid Body

    4. Configure the Rigid Body settings in Motive

    Place Markers on the HMD

For the camera system to track the HMD, a set of markers must be attached to it. You can use either active markers (Active HMD Clip or Active Tags) or passive markers. Passive markers are retroreflective markers that reflect infrared light emitted from the IR LEDs on the camera. Active markers, on the other hand, are LED markers that emit IR light and can be uniquely identified.

In general, for most VR applications, using active markers is recommended for better tracking stability and ease of use. Active markers also have advantages over passive markers when tracking a large number of objects. For applications that are sensitive to the accuracy of the tracking data, passive markers may have benefits. To get more help finding the best solution for your tracking application, please contact us.

When using active markers, you can conveniently put a set of 8 markers onto the HMD by using the HMD Clip, or you can manually attach markers from the Tag onto the HMD using adhesives and marker posts.

    Active HMD Clip

The Active HMD Clip is an HMD enclosure with a total of 8 active markers embedded for tracking. At the time of writing, active HMD clips for Vive Pro / Valve Index HMDs are available on the webstore. The clip mounts easily: push it onto the HMD until the latches click; to detach it, gently lift the three latches located at the top, left, and right sides of the clip.

Once the clip has been mounted, the next step is to import the provided Rigid Body asset into Motive and refine the definition to get the calibrated pivot point position and orientation, as explained in the next section.

    Marker Types

You can use either passive retroreflective markers or active LED markers to track the HMD. Passive markers reflect infrared light emitted from the IR LEDs on the camera, while active markers emit IR light that gets uniquely identified in Motive. Either type can be used to track HMDs, but using active markers is recommended, especially for applications that involve tracking multiple HMDs in the same scene.

    Create HMD Rigid Body in Motive

    This feature can be used only with HMDs that have the clips mounted.

When using an OptiTrack system for VR applications, it is important that the pivot point of the HMD Rigid Body is placed at the appropriate location: the root of the nose, in between the eyes. When using the HMD clips, you can utilize the HMD creation tools in the Builder pane to have Motive estimate this spot and place the pivot point accordingly. The tool uses the known marker configuration on the clip to precisely position the pivot point and set the desired orientation.

HMDs with passive markers can utilize the External Pivot Alignment tool to calibrate the pivot point.

    Steps

1. First, make sure Motive is configured for tracking active markers.

2. Open the Builder pane under the View tab and click Rigid Bodies.

    3. Under the Type drop-down menu, select HMD. This will bring up the options for defining an HMD Rigid Body.

This is supported only in Motive versions 2.1.2 or above. If you are using an earlier version of Motive 2.1, please update to 2.1.2, or use a template to create the Rigid Body definition; instructions are provided on the following page: Using a Template File to Create Vive Pro Active Clip Rigid Body.

    Setting up the OpenVR driver

    SteamVR Required: The VR driver streams tracking data through SteamVR. Please make sure SteamVR is installed on the computer before setting up the driver.

    Setup Steps

    Download and run the installer

Download the OpenVR driver from the downloads page. Once downloaded, launch the installer and follow the prompts to set up the driver. On the last window, make sure to select Launch Configuration Utility before clicking Finish. This will open the configuration options to set up your HMD with Motive.

You may receive a warning window prior to the installation wizard. To proceed, select More info and then Run Anyway.

    Open the configuration program

    Once the driver has been successfully installed, launch the configuration utility software (C:\Program Files\OptiTrack\OpenVR Driver\ConfigUtil). Using this tool, you can load and check existing configurations and make changes to the settings as needed. To import current settings, click Load and to save out the changes, click Save.

Please make sure you run this tool with admin privileges; otherwise it may not be able to modify the settings properly. If the configuration software detects a running instance of SteamVR through OpenVR, this is indicated as Initialized at the very top of the window. Note that if settings are modified while SteamVR is running, SteamVR must be restarted to apply the changes.

    Configure connection settings

First, configure the connection settings so that the driver listens to the Motive server from which the tracking data is streamed. The server address must match the address Motive is streaming from, and the local address must match the IP address of the computer the driver is installed on, on the same network.

    Set up the HMD

In the HMD section, enable the HMD and input the Rigid Body ID of the HMD. The Rigid Body ID must match the Streaming ID property of the HMD Rigid Body definition in Motive.

    Save out the configuration

Save the configuration by clicking Save. This modifies the set of configurations in the steamvr.settings file in the Steam installation directory, which will override the HMD tracking with the tracking data from Motive. If an instance of OpenVR or SteamVR is already running, restart the application to apply the changes.

    Configuration File

The configuration tool imports and modifies the contents of the steamvr.settings file (C:\Program Files (x86)\Steam\config\steamvr.settings). When needed, the driver-related settings can also be changed directly in this file, but it is easier to configure them using the provided configuration tool.

    Confirm the setup

Launch SteamVR. If the driver is successfully set up, you should see a tracker icon added to the right of the HMD icon, and the HMD will now use the motion capture system instead of the base stations. At this point, please make sure all of the lighthouse base stations are powered off.

    VIVE Controllers

    VIVE controllers are a Beta feature and may not work for every device. Support for this particular feature is limited.

    Setting up the controller (optional)

When needed, the Vive controllers can be configured as well. To do so, open the configuration utility tool while SteamVR is running. At the top of the configuration tool, the OpenVR status should read Initialized, and the controllers must be showing up in SteamVR. Then, in the controller sections, enable the controllers, specify the override device using the drop-down menu, and input the corresponding streaming IDs of the controller Rigid Bodies in Motive. Once everything has been configured, save the changes and restart SteamVR. When the override is configured properly, SteamVR will show an additional tracker icon for each enabled controller.

    Data Streaming

    Now that the driver is set up, the HMD tracking will be overridden by tracking data from the mocap camera system, and you can integrate HMDs into the game engine through their own VR integration.

    Streaming settings in Motive

First, make sure the streaming settings are configured in Motive for streaming out the data. For more information on streaming in Motive, please visit our Data Streaming page:

    • Broadcast Frame Data must be set to true.

    • Local interface must be set to the desired IP address to stream the tracking data from.

    • Streaming of Rigid Bodies must be set to True

    Confirm the Connection

Once Motive is configured for streaming, launch SteamVR Home to check the connection. If everything is set up correctly, you should be able to move around, or translate, within the scene freely. You may also need to check the ground plane to make sure it is well aligned.

If you experience any unexpected rotations in the view as you move your head, the HMD pivot point may not have been calibrated properly. Please revisit the HMD Setup section and make sure the HMD Rigid Body pivot point is positioned and oriented at the expected pivot: the root of the nose, with Z-forward.
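To confirm that Motive's data stream is actually reaching the client machine, a short Python sketch can listen on NatNet's default multicast group and data port (239.255.42.99, port 1511; adjust these to match your Motive streaming settings). This is only a connectivity probe, not a NatNet parser:

```python
import socket
import struct

MCAST_GRP = "239.255.42.99"   # NatNet default multicast group
DATA_PORT = 1511              # NatNet default data port

def wait_for_motive_frame(timeout=5.0):
    """Return True if any NatNet data packet arrives within `timeout`
    seconds; False on timeout or socket error. Connectivity check only."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    try:
        sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        sock.bind(("", DATA_PORT))
        # Join the multicast group on all interfaces.
        mreq = struct.pack("4sl", socket.inet_aton(MCAST_GRP), socket.INADDR_ANY)
        sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, mreq)
        sock.settimeout(timeout)
        sock.recvfrom(65535)
        return True
    except OSError:   # timeout, port in use, or no multicast route
        return False
    finally:
        sock.close()
```

If this returns False while Motive is streaming, check the firewall and the Local Interface selection; if Motive streams in Unicast mode, this multicast probe will not see the data.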

    Unity SteamVR Setup

    Notes for Unity users

Once this has been set up, the motion capture system will be used to drive the HMD in SteamVR. In Unity, you should be able to set up development for SteamVR applications and use our system.

Starting from Unity version 2019, official support for OpenVR in Unity has been deprecated. However, Valve provides a plugin for the new Unity XR framework that can be used instead. Please follow the steps below to set up the Unity XR plugin and get the HMD working inside the Unity project:

    OpenVR Unity XR plugin setup

1) Download the OpenVR Unity XR package from Valve's Github page.

2) Download the OptiTrack OpenVR driver from our website and configure the settings as described in the section above.

3) Open a Unity project.

4) [Unity] Open the package manager from Window → Package Manager.

5) [Unity] In the package manager, click the "+" icon at the top and choose Add package from tarball. Then select the downloaded OpenVR Unity XR package.

    6) [Unity] Check to make sure that the OpenVR XR plugin has been installed within your project.

7) [Unity] Now, follow the instructions in Unity's documentation to configure your project for XR development. Install the XR Plug-in Manager, which can be found under Edit → Project Settings → XR Plug-in Management.

    Streaming Rigid Body/Skeleton data

This driver is designed for streaming HMD and controller tracking data only. To stream tracking data of other Rigid Body objects, you will need to use the corresponding plugins (Unreal Engine or Unity). In other words, the HMD tracking data will be streamed through SteamVR using the driver you've installed, and all other tracking data will be streamed through the plugin.

    Aligning world coordinates with the plugin

    Client Origin

When using the OpenVR driver along with the OptiTrack Unity Plugin for streaming tracking data other than the HMD, such as Rigid Bodies and/or Skeletons, it is important that the OptiTrack Client Origin object is located at the global origin without any rotations. In other words, the position must be set to (0,0,0) and the rotation must be set to (0,0,0) on the client origin. This is important because the driver does not use the client origin, but instead streams the HMD tracking data directly into SteamVR through a separate channel.


    Marker Placement

    • Make sure the markers are attached securely and do not move. If the markers happen to move even slightly after a Rigid Body is defined, it will negatively affect the tracking and the Rigid Body definition may need to be updated.

• Avoid placing multiple markers in close proximity, as they may overlap in the camera view in certain orientations.

    • Using marker posts to extend out the markers is recommended to improve marker visibility from more angles.

• If you are using active markers, there is an extra USB port on the HMD that you can draw power from.

• Please read through the Rigid Body Tracking page for additional information on marker placement on a Rigid Body.

    If the selected marker matches one of the Active clips, it will indicate which type of Active Clip is being used.
• Under the Orientation drop-down menu, select the desired orientation of the HMD. The orientation used for streaming is +Z forward for Unity and +X forward for Unreal Engine; alternatively, you can specify the expected orientation axis on the client plugin side.

  • Hold the HMD at the center of the tracking volume where all of the active markers are tracked well.

  • Select the 8 active markers in the 3D viewport.

  • Click Create. An HMD Rigid Body will be created from the selected markers and it will initiate the calibration process.

  • During calibration, slowly rotate the HMD to collect data samples in different orientations.

  • Once all necessary samples are collected, the calibrated HMD Rigid Body will be created.

  • For wireless streaming, use Unicast streaming type.

    8) [Unity] Enable the OpenVR Loader under the list of providers in the XR Plug-in Manager. If OpenVR Loader is not listed in there, make sure the plugin was installed properly from step 5) above.

    9) [Unity] Once the plugin is configured, go to GameObject → XR → Add XR Rig.

10) Play the scene, and make sure it renders to the HMD and tracking works correctly.

Please keep in mind that these steps are subject to change by Unity. You can find detailed instructions on the following page: https://docs.unity3d.com/Manual/configuring-project-for-xr.html

In Unity version 2018 and earlier, you can enable SteamVR through the project settings. Go to Edit → Project Settings → Player, open the XR Settings panel, and enable the Virtual Reality Supported property. You can also follow the instructions in the Unity documentation:

    • https://docs.unity3d.com/2018.4/Documentation/Manual/VRDevices-OpenVR.html


    Autodesk Maya: OptiTrack Insight VCS Plugin

    Overview

    The Insight VCS: Maya plugin is an Autodesk® Maya® plugin designed for live virtual camera work directly within the Maya® environment.

    The Insight VCS plugin works in conjunction with Motive and the Insight VCS Controllers to provide real-time 6 DOF camera position, orientation, and virtual camera controls, including:

    Insight VCS Features

    Feature
    Description

    Installation and Licensing

    Supported Platforms

    The Insight VCS Plugin for Maya works on the following platforms:

    • Autodesk Maya 2011®, 32-bit for Microsoft Windows®.

    • Autodesk Maya 2011®, 64-bit for Microsoft Windows®.

    • Autodesk Maya 2014®, 64-bit for Microsoft Windows®.

    • Autodesk Maya 2015®, 64-bit for Microsoft Windows®.

    Installation

    1. [Maya] - Load plugin - Maya -> Window -> Settings/Preferences -> Plug-in Manager -> InsightVCS -> Check

    2. [Maya] - Start plugin - Maya -> Window -> InsightVCS

    Licensing

    The VCS:Maya plugin requires a valid Insight VCS:Maya license to run. This license is managed by your OptiTrack server application (Motive) and should be installed in the same license folder as that application.

    Please refer to your order confirmation and/or the Insight VCS Quick Start Guide for specific licensing instructions. Additional information on licensing can be found in our Licensing and Activation FAQ.

    Using the Insight VCS: Maya Plugin

    The following steps outline the basic process for virtual camera work using the Insight VCS plugin within Maya®:

    1. Load a Maya scene

    2. Create any Maya cameras that will be controlled by the Insight VCS

    3. Connect to an OptiTrack data server for 6-DOF data

    4. Select or create a "Controller Profile", which controls how buttons and axes on the tracking controller are used

    5. Start laying down camera moves!

    Connecting to Motive [Server]

    Refer to the following step-by-step instructions for bringing live mocap data in from the OptiTrack motion capture server application, Motive.

    Connecting to the Mocap Data: Step-by-Step

    [Motive]

    • Create a Rigid Body from your tracking controller's markers.

    • Be sure to orient your tracking controller down the -Z axis. This will be the camera's "Neutral" orientation.

    For more information regarding streaming in Motive, please visit the Data Streaming wiki article.

    Open the Data Streaming pane in Motive's Settings window and set the following settings:

    • Enable - Turn on the Enable setting at the top of the NatNet section.

    • Local Interface - Choose the desired IP network address from this dropdown to stream data over.

      • Loopback

    [Maya]

    • Open the Insight VCS Plugin panel by selecting Window -> Insight VCS from the Maya main menu.

    [Insight VCS Panel]

    • Set the IP address to match Motive's (e.g., 127.0.0.1 if running Motive and Maya on the same machine) using the Server Address Edit Box.

    • Click the Connected Button. If a connection was made, the indicator light on this button will turn bright green.

    • Select the mocap source object from the Rigid Body Dropdown.

    You should now see your Maya camera moving within the Maya viewport:
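
    If the Connected button never turns bright green, it can help to verify that Motive's NatNet command port is reachable from the Maya machine. The sketch below is illustrative only and rests on two assumptions: Motive's default NatNet command port (1510) and the NatNet SDK's 4-byte packet header (uint16 message ID, uint16 payload length, with ping as message ID 0). Verify both against your NatNet SDK version.

```python
import socket
import struct

NAT_PING = 0  # assumed NatNet "ping" message ID; check your NatNet SDK headers


def build_ping_packet() -> bytes:
    # Assumed sPacket header layout: little-endian uint16 message ID,
    # uint16 payload length (no payload for a ping).
    return struct.pack("<HH", NAT_PING, 0)


def ping_motive(server_ip: str, command_port: int = 1510, timeout: float = 1.0) -> bool:
    """Send a NatNet ping and report whether any reply arrives."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.settimeout(timeout)
        sock.sendto(build_ping_packet(), (server_ip, command_port))
        try:
            sock.recvfrom(4096)
            return True
        except OSError:  # timeout or ICMP port-unreachable
            return False
```

    A False result often points to a firewall rule or a wrong Local Interface selection rather than a plugin problem.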

    Connection Settings

    Virtual Camera connection settings are managed by the main interface tab on the Insight VCS plugin panel:

    Controllers

    The Insight VCS plugin supports any DirectInput compatible joystick or USB device. Controllers can then be configured to perform actions or control the camera using Controller Profiles.

    Controller Profiles

    Virtual Camera controls are managed by a Control-to-Event mapping system called the Controller Profile. The controller profile is configured in the Controller Tab. The Insight VCS plugin allows you to create and swap between multiple controller profiles, allowing you to create any number of custom button/axis configurations depending upon the scene, particular move types, different physical VCS controllers or HID devices, etc.

    Profiles can be saved and then later swapped out using the Profile Dropdown. Profiles are saved in the <VCS Maya install folder>\Profiles folder.

    The VCS plugin ships with 2 default profiles:

    • The two-controller VCS Pro (<VCS Maya install folder>\Profiles\VCSProDefault.xml).

    • The Xbox-based VCS Mini (<VCS Maya install folder>\Profiles\VCSMiniDefault.xml).

    When the Insight VCS plugin is first launched, it will attempt to detect any compatible controllers. It will then attempt to match the detected controllers with an existing Controller Profile, beginning with the last used ("preferred") profile.

    Profile Setup

    The VCS plugin supports two types of controller inputs and two types of actions:

    • Axis Inputs / Actions: Axis inputs are analog inputs representing a range of values, scaled to [0, 1000]. Axis inputs can be assigned to Axis actions. PTZ operations (Pan, Tilt, Zoom) are typical examples of Axis actions.

    • Button Inputs / Actions: Button inputs are the buttons on the controller. These are "one-shot" events that occur when a button is pressed. Timeline commands such as Play, Record, and Rewind are typical examples of "one-shot" events.

    Some Insight VCS controllers have a dial that is represented in the Axis list as a "Wheel". This is a special form of an axis, and can be used to modify existing actions, such as zoom speed, pan speed, and motion scale amount.

    Some Insight VCS controllers have a "Button 7". This is an internal, reserved button, and cannot be directly accessed.
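
    The two input types above can be made concrete with a small sketch. This is not plugin code — only the [0, 1000] axis range is taken from the text, and the helper names are invented for illustration: one function recenters an axis reading around its midpoint, and one class turns a raw button state into a "one shot" press event.

```python
def normalize_axis(raw: int) -> float:
    """Map a raw axis reading in [0, 1000] to [-1.0, 1.0], with 500 as neutral."""
    raw = max(0, min(1000, raw))  # clamp to the documented range
    return (raw - 500) / 500.0


class OneShotButton:
    """Fires once on the press edge, ignoring the held state afterwards."""

    def __init__(self):
        self._was_down = False

    def update(self, is_down: bool) -> bool:
        fired = is_down and not self._was_down
        self._was_down = is_down
        return fired
```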

    Insight VCS Profile Grid Columns

    Axes
    Name of the controller's analog input.

    Action Parameters

    Some actions have parameters that modify the way they operate. The following tables list the axis and button actions, and how the parameter value for that action is interpreted.

    VCS Controller - Axis Actions

    Action
    Parameter(s)
    Example

    *Curve Type: See explanation below for a definition of the supported curve types.

    Curve Types

    When mapping a controller thumbstick axis to an animatable camera parameter (pan, zoom), you have the option of specifying how the Insight VCS plugin should interpret controller axis movement as a standard animation curve. Instead of modifying the value over time, however, the motion curve modifies the value over the controller span, from neutral/center position (0) to maximum position (Max). The following diagram describes this relationship:
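
    As an illustration of the idea, suppose the curves are simple power functions (linear, quadratic ease-in, cubic, quartic). That mapping is an assumption — the plugin's actual curve definitions and index values may differ — but it shows how a curve shapes stick deflection into a parameter change over the controller span:

```python
# Hypothetical curve table: exponent applied to the normalized stick deflection.
# Indices and shapes are assumptions for illustration, not the plugin's spec.
CURVES = {
    0: 1,  # linear
    1: 2,  # ease-in (quadratic)
    2: 3,  # cubic
    3: 4,  # quartic
}


def apply_curve(deflection: float, curve_type: int, speed: float = 1.0) -> float:
    """Scale a stick deflection in [-1, 1] by a response curve and a speed factor."""
    exponent = CURVES[curve_type]
    # Higher exponents flatten the response near center, giving finer control
    # for small deflections while preserving full range at maximum deflection.
    magnitude = abs(deflection) ** exponent
    return speed * magnitude * (1 if deflection >= 0 else -1)
```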

    The VCS plugin offers the following built-in curve options:

    VCS Controller - Button Actions

    Action
    Parameter
    Example

    Virtual Camera Settings

    The Insight VCS plugin has several properties that can be used to customize its behavior. The Settings Tab can be used to set these:

    VCS General Settings

    Setting
    Description

    Maya Camera Settings

    A Maya Camera controls how you see the 3D scene. Maya's Camera object allows users to model real-world cameras, including settings such as focal length, aspect ratio, and film format.

    Refer to the Maya documentation for more information on Camera Settings.

    Other Resources

    Autodesk MotionBuilder: OptiTrack Optical Plugin

    Overview

    This page provides instructions on how to use the OptiTrack MotionBuilder Optical plugin. The OptiTrack Optical Plugin device allows you to map motion capture (optical) data onto an animated character within MotionBuilder.

    MotionBuilder Character driven by motion capture data.

    Motive Data Streaming Setup (Server)

    First, you'll want to follow the instructions below to set up the data streaming settings in Motive. Once this is configured, Motive will be broadcasting tracking data onto a designated network interface where client applications can receive them.

    Streaming Settings

    • Enable - Turn on the Enable setting at the top of the NatNet section.

    • Local Interface - Choose the desired IP network address from this dropdown to stream data over.

      • Loopback

    Additional Tips

    • For best results, it is advised to run Motive and MotionBuilder separately on different computers, so that they are not competing for processing resources.

    • When streaming the data over a Wi-Fi network, Unicast transmission must be used.
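
    For context on the transport choice: a NatNet client receiving Multicast data joins a multicast group on the data port. The sketch below assumes NatNet's commonly documented defaults (group 239.255.42.99, data port 1511); confirm the actual values in Motive's Streaming settings.

```python
import socket

MCAST_GROUP = "239.255.42.99"  # assumed NatNet default multicast group
DATA_PORT = 1511               # assumed NatNet default data port


def multicast_membership(group: str = MCAST_GROUP, iface: str = "0.0.0.0") -> bytes:
    """Pack the ip_mreq structure used by IP_ADD_MEMBERSHIP (group + interface)."""
    return socket.inet_aton(group) + socket.inet_aton(iface)


def open_data_socket() -> socket.socket:
    """Open a UDP socket bound to the data port and join the multicast group."""
    sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
    sock.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
    sock.bind(("", DATA_PORT))
    sock.setsockopt(socket.IPPROTO_IP, socket.IP_ADD_MEMBERSHIP, multicast_membership())
    return sock
```

    Multicast delivery over Wi-Fi is unreliable, which is why the tip above calls for Unicast on wireless networks.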

    MotionBuilder Setup (Client)

    To get started, drag the OptiTrack Optical plugin from the MotionBuilder Asset Browser tab > Devices into the Viewer window. This will create a dropdown menu called I/O Devices in the Navigator tab. Click the + button next to I/O Devices and select OptiTrack Optical. This will populate the plugin's settings tab if it hasn't already auto-populated from the earlier drag-and-drop step.

    Device Settings

    • Optical Model

      • Specifies the MotionBuilder "Opticals" model to map the markers to.

    • Generate a new Optical model/Update the current optical model

    Once the above settings are input appropriately, click the box next to Online. This indicates whether or not Motive is successfully streaming to MotionBuilder.

    • Online color indicator

      • Green - Connected and streaming.

      • Yellow - Connected but not streaming.

    Step-By-Step Example

    Motive Skeleton Streaming Step-by-step

    Step
    Details

    Recording Optical Data

    The OptiTrack Optical device can record streamed optical data to the current MotionBuilder take. Note that the looping feature in Motive must be disabled in order to record streamed data in MotionBuilder. The following step-by-step procedure can be used to record data:

    Recording Optical Data Step-by-step

    Step
    Details

    Playing Back Recorded Data

    The OptiTrack Optical device can be used to show live data or to blend live data with a recorded take. To play back recorded optical data, you need to tell MotionBuilder to disable live streaming.

    Playing Back Recorded Data Step-by-step

    Step
    Details

    Optical Data Namespace

    When the opticals are generated, their naming convention depends on whether the Organize Assets option is enabled under the Optical Device properties.

    Organize Assets Enabled (default)

    The device will generate optical data with a colon ( : ) separator (e.g. AssetName:MarkerName), and all of the optical data will be organized under their corresponding root nodes.

    Organize Assets Disabled

    The device will generate optical data with an underscore ( _ ) separator (e.g. AssetName_MarkerName), and all of the optical data will be listed under the same optical node.

    For auto-mapping imported FBX actors to optical data, the Organize Assets setting must be disabled and underscore separators must be used.
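
    The two naming schemes can be separated programmatically. The helper below is hypothetical (not part of the plugin); it simply splits an optical name at the first separator for the active convention:

```python
def split_optical_name(name: str, organize_assets: bool):
    """Split 'AssetName:MarkerName' or 'AssetName_MarkerName' into its parts.

    organize_assets=True  -> colon separator (the plugin default)
    organize_assets=False -> underscore separator (required for FBX auto-mapping)
    """
    sep = ":" if organize_assets else "_"
    asset, _, marker = name.partition(sep)  # split at the first separator only
    return asset, marker
```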

    MotionBuilder Actor/Character Setup

    The following guide is provided as a simplified process for working specifically with Motive, but this is not the only way. For the latest information on setting up and configuring MotionBuilder Actors and Characters, please refer to the MotionBuilder documentation.

    To animate characters in MotionBuilder, you need to create the following data flow (or “mapping”):

    Mocap Marker Data > MotionBuilder “Actor” > Skeleton Data > MotionBuilder “Character”

    The Mocap Marker Data > MotionBuilder Actor step maps Motion Capture data (Markers) to the MotionBuilder Actor object. The MotionBuilder Actor object is a Skeleton solver that creates joint angles from Marker data.

    The MotionBuilder “Actor” > Skeleton Data > MotionBuilder “Character” step is specific to MotionBuilder, and this pipeline maps the MotionBuilder Actor Skeleton onto your final character Skeleton. This step requires a “rigged” character. Refer to the MotionBuilder help for detailed information on this process.

    Actor Setup (Mocap Marker Data > MotionBuilder "Actor")

    There are three ways to create a Motive Marker Set > MoBu Actor mapping.

    1. Create a new marker map from scratch.

    2. Import a Motive-exported FBX ASCII file.

    3. Auto-create an actor from the Motive stream.

    Autodesk has discontinued support for FBX ASCII import in MoBu 2018 and above.

    1. Create a new marker map from scratch

    1. Create OptiTrack Optical device

    2. Connect to Motive

    3. Generate Opticals

    4. Stream a frame of T-Pose data from Motive

    2. Import Existing Marker Map (from File)

    Option 1: Restore Marker Set from HIK file

    1. [MoBu] Import Marker Set definition (.hik file)

    2. [MoBu] Connect to Motive

    3. [MoBu] Generate Opticals

    4. [Motive] Stream a T-Pose frame of data into MotionBuilder

    Option 2. Export FBX from Motive (Mobu 2017 and earlier only)

    1. [Motive] Right-click on the Skeletons in the Assets pane and export Skeleton as FBX. Make sure the namespace separator is set to (_).

    2. [MoBu] Merge FBX from Step 1 (File > Merge)

    3. [MoBu] Connect to Motive

    4. [MoBu] Select the OptiTrack Optical device in the navigator and access its properties under the resources panel.

    3. Auto-Create from Stream

    The OptiTrack MotionBuilder plugin can generate a MoBu Actor for you automatically.

    1. Set the “Create Actors” property to true before bringing the Optical device online.

    2. The OptiTrack MoBu plugin will generate an Actor for each Marker Set in the Motive data stream and correctly map the Motive Marker Set markers to the corresponding MoBu Actor.

    Offline Workflow with Actor Binding

    To work around Autodesk's removal of support for FBX actor import, we recommend the following:

    1. [Motive] Load a TAK file with desired assets

    2. [Mobu] Use Optical Device to Create Actors over stream from TAK file.

    3. [Motive] Export TAK data from Motive (FBX or C3D)

    4. [Mobu] Import C3D/FBX into Mobu

    Character Setup (MotionBuilder Actor > Character)

    1. Do Actor Setup (Above)

    2. Import a rigged Skeleton (File -> Merge -> Skeleton)

    3. If the Skeleton is not "characterized," characterize it:

    4. Create MB Character (Drag onto Skeleton “hips” )

    For more information on setting up and configuring MotionBuilder Actors and Characters, please refer to Autodesk's MotionBuilder documentation.

    Unlabeled Markers

    Unlabeled markers are not commonly needed in the MotionBuilder workflow, but if needed, they can get streamed through the Optical Device plugin.

    1. [Motive] Make sure streaming of unlabeled markers is enabled in Motive’s data streaming panel.

    2. [MoBu] Under the properties of the Optical Device loaded in the scene, make sure the Using Unlabeled Markers option is enabled and the Unlabeled Marker Limit is set to the maximum number of unlabeled markers allowed in the scene.

    3. [MoBu] In Optical Devices, generate or update the optical model.

    The Optical Device has a special property for Arena Expression (viewable from the MotionBuilder Properties tab) that must be checked when used with Arena Expression software:

    This is the local computer IP address (127.0.0.1 or Localhost).

  • Used for streaming data locally on the PC running Motive; it does not interact with the LAN.

  • Good option for testing network issues.

  • 192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)

    • This IP address is the interface of the LAN either by Wi-Fi or Ethernet.

    • This will be the same address the Client application will use to connect to Motive.

  • Transmission Type

    • For streaming over a Wi-Fi network, setting the Transmission Type to Unicast is strongly advised.

  • Select desired data types to stream under streaming options:

    • Rigid Bodies - Enabled (required).

    • Skeletons - Optional for Skeleton tracking.

    • Markers (Labeled, Unlabeled, Asset) - Disabled for HMDs (advised).

    • Devices - Disabled.

  • Skeleton Coordinates

    • Set to Local.

  • Bone Naming Convention

    • When streaming Skeletons, set to FBX.

  • In order to stream data from the Edit mode, a capture-recording must be playing back in Motive.

  • For additional information on data streaming in general, read through the Data Streaming page.

  • Adds/updates the current OptiTrack Marker Set in the MotionBuilder "Opticals" model.

  • Damping Time

    • Device damping time.

  • Local address - IP address of the MotionBuilder computer. In situations where multiple network adapter cards are present, select the adapter that is on the same network as the Motive application.

    • 127.0.0.1

      • This is the local computer IP address (127.0.0.1 or Localhost).

      • Use this loopback address if Motive is running on the same machine as MotionBuilder.

    • 192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)

      • This IP address is the interface of the LAN either by Wi-Fi or Ethernet.

      • Use this if Motive is running on a different computer, but on the same network as the MotionBuilder computer.

    • 169.xxx.x.xx

      • This address is assigned when a DHCP server could not be reached.

      • This address can be ignored for our application.

  • Server Address - IP address of computer that is running Motive

    • 127.0.0.1

      • Use this IP when both Motive and MotionBuilder are running on the same computer.

  • Server Type

    • Multicast (default) or Unicast

    • Must match what is selected in the Motive Streaming settings.

    • Multicast is default and recommended.

  • OptiTrack Marker Set

    • The name of the OptiTrack Marker Set this optical is binding to.

  • Marker Set Scale

    • The global scale factor to be applied to the marker data before mapping to the actor.

  • Red - Not connected or streaming.
  • Live

    • Indicates to MotionBuilder that data is coming from a live source (checked) or from a recorded take.

  • Recording

    • Indicates to MotionBuilder that data from this device should be recorded when MotionBuilder is recording.

  • Model Bindings

    • Unused.

  • Device Information

    • Information about the status of the connection.

  • You should see the Opticals in the MotionBuilder 3D viewer

  • Create MB Actor

  • Fit MB Actor to Opticals

  • Create an Optical > Marker Set > Actor mapping:

    • Import existing mapping

      • Actor > Marker Set > Import > OptiTrack HIK file

      • Drag all opticals (incl root) onto Actor’s “Reference Cell”

    • Create a new mapping:

      • Actor > Marker Set > Create

      • Drag individual opticals to Actor segments

  • Activate Actor (Actor > Activate)

    • Actor snaps to marker cloud pose

    • Actor should now be animating in Viewer

  • [MoBu] Actor Panel : Drag Opticals to Actor Markers (Actor Prop Sheet > Reference)

  • [MoBu] Activate Actor (Actor > Activate)

    • Actor snaps to marker cloud pose

    • Actor should now be animating in Viewer

  • [MoBu] Disable the Organize Assets property to generate Opticals with underscore separators in their names.

  • [MoBu] Generate Opticals.

  • [Motive] Stream a T-Pose frame of data into MotionBuilder

  • [MoBu] Actor Panel : Select all of the streamed Opticals and Drag them onto the Actor Markers (Actor Prop Sheet > Reference). The names under the marker name sheet MUST match the Skeleton labels for auto-mapping.

  • [MoBu] Activate Actor (Actor > Activate)

    • Actor snaps to marker cloud pose

    • Actor should now be animating in Viewer

  • [Mobu] Update Actor binding to imported C3D / FBX

    Map Character to Actor

    • Select Character -> Character Settings -> Input Type -> Actor Input

    • Check “Active”

  • Activate Actor (Actor -> Activate)

    • Skeleton and Actor should now be animating in Viewer

  • [MoBu] Unlabeled markers will show up as orange opticals in the scene.

    [Motive]

    Configure Motive for Streaming Data

    From the Motive Streaming Pane:

    • Select Enable

    • Select Bone Naming Convention to FBX. (The plugin device will automatically reconfigure this if not already set to FBX.)

    • Choose the Local Interface IP address from the dropdown. If the same computer is running both Motive and MotionBuilder, use the Loopback option.

    Be sure to configure any Firewall software first (either disable or permit MotionBuilder as an exception).

    [MoBu]

    Create an OptiTrack Optical device

    In the MotionBuilder Asset Browser window, under Devices, you should see:

    OptiTrack Optical

    • Within MotionBuilder, drag the OptiTrack Optical device into the Navigator (or Viewer) pane. An instance will be created under the Devices node.

    [MoBu]

    Connect Optical Device to Motive

    • In the Navigator window, select OptiTrack Optical from the Devices node

    • On the OptiTrack Optical pane, set the IP address of the OptiTrack server (e.g. Motive).

    • Click on the Online box - it should change from red to yellow (or green if data from the OptiTrack Server is currently streaming).

    [MoBu]

    Create a Marker Set > Opticals Mapping

    • In the 'OptiTrack Marker Set' Dropdown, select the name of a currently defined Marker Set in Motive. You may need to resize the device pane in MotionBuilder to access the features.

    • Press the Generate new optical model button

    • In the Navigator window, under the Opticals node, you should see the new marker list. This indicates the plugin has successfully retrieved the marker list from the OptiTrack server. You should also see the Opticals displayed in the Viewer window if the Server is currently streaming.

    [MoBu]

    Begin streaming marker data

    • From Motive, start live capture or data playback

    • From MotionBuilder, ensure the Viewer window is active (MotionBuilder will not update otherwise).

    • The marker set should be animating in the MotionBuilder Viewer window.

    • The MotionBuilder Online check boxes should be green, indicating data is live and actively streaming.

    Enable Optical Device for recording

    [Mobu] > Optical > Check Recording

    Start Recording

    • [Mobu] > Transport Control > Record (Create new take)

    • [Mobu] > Transport Control > Play ( start recording frames)

    • [Mobu] > Transport Control > Stop

    Disable Live streaming

    • [Mobu] > Optical > Uncheck Recording

    • [Mobu] > Optical > Uncheck Live

    Playback recorded take

    • [Mobu] > Transport Control > Rewind

    • [Mobu] > Transport Control > Play

    Asset Browser tab > Devices view of the plugins.
    Navigator tab.
    OptiTrack MotionBuilder Skeleton Plugin settings.
    Organize Assets enabled.
    Organize Assets disabled.
    Properties tab in MotionBuilder Using Unlabeled Markers.
    Arena Expression checked in Properties tab.

  • Autodesk Maya 2016®, 64-bit for Microsoft Windows®.

  • Autodesk Maya 2017®, 64-bit for Microsoft Windows®.

  • Autodesk Maya 2018®, 64-bit for Microsoft Windows®.


    This is the local computer IP address (127.0.0.1 or Localhost).

  • Used for streaming data locally on the PC running Motive; it does not interact with the LAN.

  • Good option for testing network issues.

  • 192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)

    • This IP address is the interface of the LAN either by Wi-Fi or Ethernet.

    • This will be the same address the Client application will use to connect to Motive.

  • Transmission Type

    • For streaming over a Wi-Fi network, setting the Transmission Type to Unicast is strongly advised.

  • Select desired data types to stream under streaming options:

    • Rigid Bodies - Enabled (required).

    • Skeletons - Optional for Skeleton tracking.

    • Markers (Labeled, Unlabeled, Asset) - Disabled for HMDs (advised).

    • Devices - Disabled.

  • Skeleton Coordinates

    • Set to Local.

  • Bone Naming Convention

    • Set the appropriate bone naming convention for the client application. For example, if the character uses the FBX naming convention, this will need to be set to FBX.

  • Select the Maya camera to be controlled using the Maya Camera Dropdown.
  • Click the Live Button to begin streaming data from the mocap Rigid Body to the Maya camera. If live data is streaming, the indicator light on this button will change to bright yellow.

  • [Focal length change rate] [Curve Type*]

    1.0

    Orbit Offset

    [Orbit offset change rate] [Curve Type*]

    1.0

    Focal Distance

    [Focal distance change rate] [Curve Type*]

    1.0

    Wheel Modifier

    [VCS Dial controls only] Modify an axis' parameter value (e.g. zoom speed, pan speed, translation scale) by a specified increment. Format: [axis name] [increment]

    Examples:
    X Axis .1 (+/- the X Axis parameter by 0.1)
    Y Axis .2 (+/- the Y Axis parameter by 0.2)
    Z Axis .1 (+/- the Z Axis parameter by 0.1)
    Scale All .5 (+/- all translational scale by 0.5)
    Translate All 1.0 (+/- all pan speeds by 1.0)

    Amount to increment/decrement current translation scale

    1.0 [scale up by 1.0] -1.0 [scale down by 1.0]

    FOV +/-

    Amount to increment/decrement current Focal length

    1.0 [increase focal length by 1]

    MelCommand

    Runs a Maya Mel command or script.

    NatNextPrimeLense.mel

    ResetOffset

    [x y z] Optional - specifies the position to reset camera to, otherwise camera is reset to (0.0,0.0,0.0)

    10.0 10.0 0.0 [reset camera offset to 10, 10, 0]

    ToggleAxisAction

    Toggles a specified axis between two actions. Format: [Axis name], [Action1 Index], [Action1 Params], [Action2 Index], [Action2 Params]. The example toggles the Y Axis behavior between Dolly In/Out at speed 1.0 with a Cubic curve and Focal Length at 0.1 speed with a Quartic curve. This action can be used to extend axis functionality without swapping profiles.

    Y Axis, 3, 1.0 1, 4, 0.1 2

    Applies smoothing to the camera position values

    Smooth Rotation

    Applies smoothing to the camera rotation values

    Connection Type

    Indicates the connection interface to use when connecting to an OptiTrack server application. Options are Multicast and Unicast. This setting must match your OptiTrack server application. The default is Multicast.

    Sample Rate

    Indicates the rate, in frames per second (fps), at which the VCS should sample the mocap server application for 6 DOF position/orientation updates. Use this if necessary to match the playback speed of your scene, to ensure consistency of controller pans during timeline and non-timeline playback.

    Pan / Dolly / Boom

    Use VCS controls to Pan Left/Right and Up/Down. Pan in local, world, or a combination of coordinate systems. Adjust pan speeds on the fly with controls or scripts.

    Pitch / Tilt / Roll

    Absolute orientation at all times from the OptiTrack optical system

    Free Move

    Absolute position at all times from the OptiTrack optical system. Scale movement in real-time with controllers or from script.

    Zoom

    Fully control camera zoom / FOV and zoom rates using the controller's analog thumbsticks and speed adjusters.

    Smooth

    Advanced Kalman filtering allows for customizing a "steadicam" feel.

    Play / Record

    Control common actions like recording and playback using the controller.

    Custom Commands

    Connected

    Click this box to connect to Motive.

    Dark Green - Not Connected

    Light Green - Connected and streaming

    Refer to the Maya status window for details about connection errors.

    Live

    Indicates whether camera position/orientation data should be coming from a live mocap source (checked) or from a recorded take. Disable this when playing back recorded camera moves.

    Dark Yellow - Camera not using live data

    Light Yellow - Camera using live mocap data

    Recording

    Starts recording.

    Dark Red - Not Recording

    Light Red - Recording

    Server Address

    IP Address of the OptiTrack Server

    Rigid Body

    Indicates which Motive Rigid Body to use for controlling the camera.

    Maya Camera

    Indicates which Maya camera to control.

    Action

    Action to take or value to change.

    Parameter

    Input parameter used by some actions to modify the action in some way (i.e. speed up or slow down zooming).

    Value

    Current value of the controller input.

    Pan Right/Left

    [Pan Speed] [Curve Type*]

    1.0 [ pan at normal rate, linear curve] 1.0 1 [ pan at normal rate, ease-in curve] 0.5 1 [ pan at half speed, ease-in curve] 2.0 [ pan at 2x speed ]

    Dolly In/Out

    [Pan Speed] [Curve Type*]

    1.0

    Pan Up/Down

    [Pan Speed] [Curve Type*]

    1.0

    Record

    None

    Play

    None

    Rewind

    None

    Enable Axes

    Selectively enable/disable individual mocap movement channels.

    Scale Translation

    Scale the physical movement (when tracking controller is moved).

    Offset Translation

    Can be used for two purposes: (1) to align the center of the physical volume with the virtual scene, and (2) to effectively pan/truck/dolly the camera. This value is updated by the thumbstick controls during pan/dolly/truck operations.

    Offset Translation Mode

    Affects how Offset Translation is applied to the camera:
    0 (Global) - Translates the camera according to the Maya global coordinate system.
    1 (Local) - Translates the camera according to the camera's coordinate system.
    2 (LocalOnStart) - Translates the camera according to the camera's coordinate system at the moment the camera first moves (when the stick first moves), then keeps that axis; the coordinate system is not continuously updated.

    Scale Updates Offset

    Determines whether changes to Scale Translation also update the Offset Translation value to keep the camera in the same position (true), or leave Offset Translation unchanged, causing the camera position to move to the newly scaled amount (false).
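
    Per the feature list, the plugin's smoothing is Kalman-based. As a much simpler stand-in that illustrates the same tradeoff the Smooth Translation/Rotation settings expose — more smoothing means less jitter but more lag — an exponential moving average behaves comparably:

```python
class Smoother:
    """Exponential moving average: a simplified stand-in for the plugin's filter.

    alpha in (0, 1]: 1.0 passes values through unsmoothed; smaller values
    smooth more aggressively at the cost of added lag.
    """

    def __init__(self, alpha: float = 0.3):
        self.alpha = alpha
        self._state = None

    def update(self, value: float) -> float:
        if self._state is None:
            self._state = value  # initialize on the first sample
        else:
            self._state += self.alpha * (value - self._state)
        return self._state
```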

    Insight VCS Support and Supplemental Documentation
    Insight VCS-Pro Quick Start Guide (PDF)
    Insight VCS-Pro LED Identification Key (PDF)
    A Typical Insight VCS Controller Profile
    Controller value modifier curve.
    VCS General Settings

    Customize the controller by mapping controller inputs to execute scripts for complete control and on-person camera operation.

    Focal Length +/-

    Scale Translation

    Smooth Translation

    Autodesk MotionBuilder: OptiTrack Insight VCS Plugin

    Overview

    This page provides instructions on how to use the OptiTrack MotionBuilder Virtual Camera Device (Insight VCS) plugin. The Virtual Camera device is specifically designed for creating a Virtual Camera in MotionBuilder. You can use the Insight VCS device with standard OptiTrack applications such as Motive. Alternatively, you can use the device in "Universal" mode, which works with generic MotionBuilder Optical or Rigid Body objects, allowing you to use the Insight VCS device with alternative motion capture systems that support optical or Rigid Body devices in MotionBuilder.

    Camera Tracking in MotionBuilder.

    Before following the walkthrough below, please refer to the Autodesk MotionBuilder Plugin page for initial steps for setting up Motive and downloading the OptiTrack MotionBuilder plugin.

    Motive Data Streaming Setup (Server)

    First, you'll want to follow the instructions below to set up the data streaming settings in Motive. Once this is configured, Motive will be broadcasting tracking data onto a designated network interface where client applications can receive them.

    Streaming Settings

    • Enable - Turn on the Enable setting at the top of the NatNet section.

    • Local Interface - Choose the desired IP network address from this dropdown to stream data over.

      • Loopback

    Additional Tips

    • For best results, it is advised to run Motive and MotionBuilder separately on different computers, so that they are not competing for processing resources.

    • When streaming the data over a Wi-Fi network, Unicast transmission must be used.

    MotionBuilder Setup (Client)

    To get started, drag the OptiTrack Optical plugin from the MotionBuilder Asset Browser tab > Devices into the Viewer window. This will create a dropdown menu called I/O Devices in the Navigator tab. Click the + button next to I/O Devices and select OptiTrack Optical. This will populate the plugin's settings tab if it hasn't already auto-populated from the earlier drag-and-drop step.

    Device Settings

    • Local address - IP address of the MotionBuilder computer. In situations where multiple network adapter cards are present, select the adapter that is on the same network as the Motive application.

      • 127.0.0.1

        • This is the local computer IP address (127.0.0.1 or Localhost).

    Once the above settings are entered appropriately, click the box next to Online. The Online indicator shows whether or not Motive is successfully streaming to MotionBuilder.

    • Online color indicator

      • Green - Connected and streaming.

      • Yellow - Connected but not streaming.

    Insight VCS Features

    Feature
    Description

    The Virtual Camera also integrates into existing MotionBuilder camera control workflows, including spline/path/constraint animation and custom scripted behaviors.

    Creating a Virtual Camera Device (OptiTrack Server)

    First, you'll need to establish the "neutral" or "zero" orientation of the Rigid Body.

    The “neutral” or “zero” orientation of a Rigid Body is the orientation when it is created in Motive. This will be the camera’s neutral orientation. In addition, for correct interpretation into MotionBuilder’s coordinate system, it is important that you align your Rigid Body with the correct axis and coordinate system convention as follows:

    • Point your tracking controller (e.g. VCS Pro) along the physical volume's -Z axis.

    Step-By-Step

    After correctly orienting your Rigid Body follow the steps below to continue with the setup:

    1. [Motive] Create a Rigid Body from your tracking controller’s markers.

    2. [OptiTrack Server App] Enable network streaming (make sure Rigid Body data is streaming).

    3. [MotionBuilder] Drag the OptiTrack Insight VCS device from the MotionBuilder Asset Browser panel into the Viewer or Navigator window.

    4. [Insight VCS Panel] Connect to an OptiTrack Server (e.g. Motive, Arena, TrackingTools) by clicking the “Online” checkbox. If the connection was successful and data is streaming from your OptiTrack server application, this box will change from red to green.

    You should now see a standard MotionBuilder Camera moving within your 3D scene:

    Creating a Virtual Camera Device (Universal Mode)

    In Universal mode, a MotionBuilder Rigid Body is used to drive a camera position. This position/orientation information is merged with the VCS camera controls and applied to the camera's final state (position, lens settings, etc.). It is assumed the Rigid Body orientation matches the MotionBuilder default camera orientation (camera lens aimed down +X axis). For example, if streaming from Motive, create a Rigid Body in MotionBuilder from the optical data, with the camera lens aimed down +X in MotionBuilder.

    Step-By-Step

    1. [MotionBuilder] Create a Rigid Body or a Marker. For a Marker:

      • Create a bone (or some rigid element) from the geometry your 6DOF system streams into MotionBuilder

      • Create a MotionBuilder "Marker" element, and make this new marker a child of the bone

    Limitations

    • The following VCS features/properties are unavailable when operating in Universal Mode:

    • Scale Rotation

    Controllers

    The Insight VCS plugin supports any DirectInput compatible joystick or USB device. Controllers can then be configured to perform actions or control the camera using Controller Profiles.

    Controller Profiles

    Virtual Camera controls are managed by a Control-to-Event mapping system called the Controller Profile. The controller profile is configured in the Controller Tab. The Insight VCS plugin allows you to create and swap between multiple controller profiles, allowing you to create any number of custom button/axis configurations depending upon the scene, particular move types, different physical VCS controllers or HID devices, etc. Profiles can be saved and then later swapped out using the Profile Dropdown. Profiles are saved into <VCS Mobu install folder>\Profiles folder.

    The VCS plugin ships with 2 default profiles:

    • The two-controller VCS Pro (<VCS Mobu install folder>\Profiles\VCSProDefault.xml).

    • The Xbox-based VCS Mini (<VCS Mobu install folder>\Profiles\VCSMiniDefault.xml).
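Since profiles are plain XML files on disk, swapping between them amounts to pointing the plugin at a different file in the Profiles folder. The sketch below simply enumerates the available profiles the way the Profile Dropdown might; the folder layout is inferred from the shipped defaults above, and the function name is ours.

```python
import glob
import os

def list_profiles(profiles_dir):
    """Return profile names (file stems) for every .xml profile in the folder.

    The on-disk layout is an assumption based on the shipped defaults
    (e.g. Profiles\\VCSProDefault.xml and Profiles\\VCSMiniDefault.xml).
    """
    paths = glob.glob(os.path.join(profiles_dir, "*.xml"))
    return sorted(os.path.splitext(os.path.basename(p))[0] for p in paths)
```

For example, running this against a default install's Profiles folder would yield at least `VCSMiniDefault` and `VCSProDefault`.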

    When the Insight VCS plugin is first launched, it will attempt to detect any compatible controllers. It will then attempt to match the detected controllers with an existing Controller Profile, beginning with the last used ("preferred") profile.

    Profile Setup

    The VCS plugin supports 2 types of controller inputs and 2 types of actions:

    Axis Inputs / Actions

    • Axis inputs are analog inputs and represent a range of values, scaled to [0, 1000]. Axis inputs can be assigned to Axis actions. PTZ operations (Pan, Tilt, Zoom) are good examples of typical Axis Actions.
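To make the [0, 1000] range concrete, here is a minimal sketch that converts a raw axis sample into a signed deflection for downstream pan/tilt/zoom math. Treating 500 as the neutral center is our assumption, as is the function name.

```python
def normalize_axis(raw):
    """Map a raw axis sample in [0, 1000] to a signed deflection in [-1.0, 1.0].

    500 is assumed to be the neutral/center position; out-of-range samples
    are clamped so they cannot produce deflections beyond +/-1.
    """
    raw = max(0, min(1000, raw))
    return (raw - 500) / 500.0
```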

    Button Inputs / Actions

    • Button inputs correspond to the buttons on the controller. These are “one shot” events that occur when a button is pressed. Transport commands such as Play, Record, and Rewind are typical examples of “one shot” events.

    Some Insight VCS controllers have a dial that is represented in the Axis list as a "Wheel". This is a special form of an axis, and can be used to modify existing actions, such as zoom speed, pan speed, and motion scale amount.

    Typical Insight VCS Controller Map

    Insight VCS Inputs/Action Settings

    Action Parameters

    Some actions have parameters that modify the way they operate. The following tables list the axis and button actions, and how the parameter value for that action is interpreted.

    VCS Controller - Axis Actions

    Action
    Parameter(s)
    Example

    Curve Types

    When mapping a controller thumbstick axis to an animatable camera parameter (pan, zoom), you have the option of specifying how the Insight VCS plugin should interpret controller axis movement as a standard animation curve. Instead of modifying the value over time, however, the motion curve modifies the value over the controller span, from neutral/center position (0) to maximum position (Max). The following diagram describes this relationship:
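The relationship above (value shaped over the controller span rather than over time) can be sketched as follows. The exact curve shapes the plugin ships with are not documented here, so the exponents below are illustrative placeholders: 0 is linear and 1 is a quadratic ease-in, giving a softer response near center and a steeper response near full deflection.

```python
def apply_curve(deflection, curve_type=0):
    """Shape a normalized stick deflection (-1..1) with a response curve.

    curve_type 0 = linear, 1 = ease-in (quadratic, chosen for illustration).
    The plugin's actual curve set may differ; treat these as placeholders.
    """
    sign = 1.0 if deflection >= 0 else -1.0
    mag = min(abs(deflection), 1.0)
    exponent = {0: 1, 1: 2}.get(curve_type, 1)
    return sign * (mag ** exponent)

def axis_rate(deflection, speed, curve_type=0):
    """Combine a curve-shaped deflection with a per-action speed parameter."""
    return speed * apply_curve(deflection, curve_type)
```

This mirrors the action-table parameter pattern `[Speed] [Curve Type]`: half deflection on an ease-in curve produces a quarter of the full rate, rather than half.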

    Button Actions

    VCS Controller - Button Actions

    Action
    Parameter
    Example

    Run Script Usage

    When using the Run Script action to map button presses to MotionBuilder scripts, be sure to note the following:

    1. Scripts must be placed in the MotionBuilder scripts folder in order to be correctly located. For a typical MotionBuilder installation this folder is:

      • C:\Program Files\Autodesk\MotionBuilder 2014\bin\config\Scripts

    2. The RunScript Param is the filename of the script, including the .py extension.
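The two rules above can be sketched as a small helper that turns a RunScript parameter into a full script path. The 2014 install path comes from the text; the validation logic and function name are our own illustration, not part of the plugin.

```python
import os

# Scripts root from a typical MotionBuilder 2014 install (see above);
# adjust the version folder for your installation.
SCRIPTS_ROOT = r"C:\Program Files\Autodesk\MotionBuilder 2014\bin\config\Scripts"

def resolve_script(param, scripts_root=SCRIPTS_ROOT):
    """Turn a RunScript parameter (e.g. 'ResetOffset.py') into a full path.

    Raises ValueError if the param is missing the required .py extension.
    """
    if not param.lower().endswith(".py"):
        raise ValueError("RunScript param must include the .py extension: %r" % param)
    return os.path.join(scripts_root, param)
```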

    Virtual Camera Device Settings

    The Insight VCS plugin has several properties that can be used to customize its behavior. These properties can be accessed in the same manner as any other MotionBuilder object property, such as from the Asset Browser or from MotionBuilder's Python scripting environment.

    Action
    Parameter

    MotionBuilder Camera Settings

    A MotionBuilder Camera controls how you see the 3D scene. MotionBuilder’s Camera object allows users to model real-world cameras, including settings such as focal length, aspect ratio, and film format.

    Refer to the MotionBuilder documentation for more information on Camera Settings.

    This is the local computer IP address (127.0.0.1 or Localhost).

  • Used for streaming data locally on the PC running Motive; this traffic does not interact with the LAN.

  • A good option when troubleshooting network issues.

  • 192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)

    • This IP address is the LAN interface, connected either via Wi-Fi or Ethernet.

    • This will be the same address the Client application will use to connect to Motive.

  • Transmission Type

    • For streaming over a Wi-Fi network, setting the Transmission Type to Unicast is strongly advised.

  • Select desired data types to stream under streaming options:

    • Rigid Bodies - Enabled (required).

    • Skeletons - Optional for Skeleton tracking.

    • Markers (Labeled, Unlabeled, Asset) - Disabled for HMDs (advised).

    • Devices - Disabled.

  • Skeleton Coordinates

    • Set to Local.

  • Bone Naming Convention

    • When streaming Skeletons, set to FBX.

  • In order to stream data from Edit mode, a capture recording must be playing back in Motive.

  • For additional information on data streaming in general, read through the Data Streaming page.

  • Use this loopback address if Motive is running on the same machine as MotionBuilder.

  • 192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)

    • This IP address is the LAN interface, connected either via Wi-Fi or Ethernet.

    • Use this if Motive is running on a different computer, but on the same network as the MotionBuilder computer.

  • 169.xxx.x.xx

    • This address is assigned when a DHCP server could not be reached.

    • This address can be ignored for our application.

  • Server Address - IP address of the computer running Motive.

    • 127.0.0.1

      • Use this IP when both Motive and MotionBuilder are running on the same computer.

  • Server Type

    • Multicast (default) or Unicast

    • Must match what is selected in the Motive Streaming settings.

    • Multicast is default and recommended.

  • Red - Not connected or streaming.
  • Live

    • Indicates to MotionBuilder that data is coming from a live source (checked) or from a recorded take.

  • Recording

    • Indicates to MotionBuilder that data from this device should be recorded when MotionBuilder is recording.

  • Model Binding

    • Indicates the MotionBuilder Camera to be controlled by the tracking controller.

  • Device Information

    • Information about the status of the connection.

  • OptiTrack Connection

    • Indicates the data source is an OptiTrack server application.

  • Universal Connection

    • Indicates the data source is a generic MotionBuilder RigidBody.

  • Rigid Body ID

    • [OptiTrack Connection] Name of the OptiTrack server application’s Rigid Body to use for tracking.

  • Rigid Body

    • [Universal Connection] Name of the MotionBuilder RigidBody to use as a position/orientation source.

  • Control common actions like recording and playback using the controller.

    Custom Commands

    Customize the controller by mapping controller inputs to execute scripts for complete control and one-person camera operation.

  • [Insight VCS Panel] Create a new MotionBuilder camera using the Model Binding dropdown.

  • [Insight VCS Panel] [Optional] If tracking more than one Rigid Body object in your OptiTrack server application, select the Rigid Body you wish to use as your tracking source using the Rigid Body ID dropdown on the CameraTracker device panel (Note: the camera tracker will automatically default to the first detected Rigid Body).

  • This new “Marker” should now have the same 6DOF value as the bone. Use this “Marker” in the VCS Universal dropdown to drive the 6DOF data of the VCS.

  • [Insight VCS Panel] Check the "Universal Connection" Radio.

  • [Insight VCS Panel] Check "Online".

  • [Insight VCS Panel] Create a new MotionBuilder camera binding using the Model Binding dropdown.

  • [Insight VCS Panel] Select the Rigid Body you created in step 1 using the Rigid Body dropdown in the Universal Connection group box.

  • Offset Rotation

    [Focal length change rate] [Curve Type]

    1.0

    Orbit Offset

    [Orbit offset change rate] [Curve Type]

    1.0

    Focal Distance

    [Focal distance change rate] [Curve Type]

    1.0

    Wheel Modifier

    [VCS Dial controls only] Modify an axis' parameter value (e.g. zoom speed, pan speed, translation scale) by a specified increment. Format:

    • [axis name] [increment]

    Examples:

    • X Axis .1 (+/- the X Axis parameter by 0.1)

    • Y Axis .2 (+/- the Y Axis parameter by 0.2)

    • Z Axis .1 (+/- the Z Axis parameter by 0.1)
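Since axis names themselves contain spaces ("X Axis", "Scale All"), parsing the `[axis name] [increment]` format means splitting off only the last token. A minimal sketch (the function name and the parsing convention are ours):

```python
def parse_wheel_modifier(param):
    """Split a Wheel Modifier parameter like 'X Axis .1' into (axis, increment).

    Axis names may contain spaces, so the last whitespace-separated token is
    assumed to be the numeric increment and everything before it the axis name.
    """
    parts = param.split()
    if len(parts) < 2:
        raise ValueError("expected '[axis name] [increment]': %r" % param)
    return " ".join(parts[:-1]), float(parts[-1])
```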

    Rotate Right/Left

    [Rotate Speed] [Curve Type]

    • 1.0 [rotate at normal rate, linear curve]

    • 1.0 1 [rotate at normal rate, ease-in curve]

    • 0.5 1 [rotate at half speed, ease-in curve]

    • 2.0 [rotate at 2x speed]

    Rotate Up/Down

    [Rotate Speed] [Curve Type]

    SAME AS ABOVE

    Tilt Right/Left

    [Rotate Speed] [Curve Type]

    SAME AS ABOVE

    Runs a MotionBuilder Python script. This script must be located in your MotionBuilder scripts root folder.

    ResetOffset.py

    ToggleAxisAction

    • Toggles a specified axis between 2 actions.

    • [Axis name],[Action1 Index], [Action1 Params],[Action2 Index],[Action2 Params]

    • The example at right toggles the Y Axis behavior between Dolly In/Out at speed 1.0 with a Cubic Curve and Focal Length at 0.1 speed with a Quartic curve.

    Y Axis, 3, 1.0 1, 4, 0.1 2
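To clarify the five-field format `[Axis name],[Action1 Index],[Action1 Params],[Action2 Index],[Action2 Params]`, here is a small parser for the example string above. The function name and the choice to keep action params as raw strings are our own; the field layout is from the text.

```python
def parse_toggle_axis(param):
    """Parse a ToggleAxisAction parameter such as 'Y Axis, 3, 1.0 1, 4, 0.1 2'.

    Returns (axis, [(action_index, action_params), (action_index, action_params)]).
    Action params are kept as raw strings, since their meaning depends on the
    action they belong to.
    """
    fields = [f.strip() for f in param.split(",")]
    if len(fields) != 5:
        raise ValueError("expected 5 comma-separated fields: %r" % param)
    return fields[0], [(int(fields[1]), fields[2]), (int(fields[3]), fields[4])]
```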

    Pause

    None.

    Stop

    None.

    Rewind

    None.

    Suspend Tracking

    None.

    Scale Translation +/-

    Increment amount.

    .5

    Scale Rotation +/-

    Increment amount.

    .5

    Zoom +/-

    Increment amount.

    .5

    FOV +/-

    Increment amount.

    .5

    Playback Speed +/-

    Enumerated value that matches MoBu transport.

    2

    Reset Zoom

    Focal Length to reset to.

    50.0

    Reset Offset

    [x y z] Optional - specifies the position to reset camera to, otherwise camera is reset to (0.0,0.0,0.0).

    10.0 10.0 0.0 [reset camera offset to 10,10,0]

    Reset Rotation Offset

    [x y z] Optional - specifies the rotation vector to reset to (in degrees), otherwise camera is reset to (0.0,0.0,0.0).

    0.0 90.0 0.0 (reset camera to 90 degrees yaw)

    Reset Orbit Offset

    None.

    Change Camera

    None.

    Play Last Take

    None.

    Reset to Live

    None.

    Controls whether changes to Scale Translation update the Offset Translation value so as to keep the camera in the same position (true), or leave Offset Translation unchanged, allowing the camera position to move by the newly scaled amount (false).
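The compensation can be illustrated with a simple model: if the camera position is computed per axis as camera = scale * tracked + offset, then keeping the camera fixed when the scale changes requires adding (old_scale - new_scale) * tracked to the offset. This model and the function name are our own sketch, not the plugin's internals.

```python
def compensate_offset(offset, tracked_pos, old_scale, new_scale):
    """Recompute Offset Translation so the camera stays put when scale changes.

    Assumes the simple per-axis model: camera = scale * tracked_pos + offset.
    Setting camera_old == camera_new and solving for the new offset gives
    offset' = offset + (old_scale - new_scale) * tracked_pos.
    """
    return [o + (old_scale - new_scale) * p for o, p in zip(offset, tracked_pos)]
```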

    Smooth Translation Amount

    Applies smoothing to the camera position values.

    Smooth Rotation Amount

    Applies smoothing to the camera rotation values.
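The trade-off these smoothing amounts control can be illustrated with a one-pole low-pass filter. Note the plugin itself uses Kalman filtering (see the Smooth feature description); this much simpler filter is only a stand-in to show what a smoothing amount does to a noisy signal.

```python
def smooth(samples, amount):
    """One-pole low-pass filter as a stand-in for the plugin's smoothing.

    amount in [0, 1): 0 means no smoothing (output follows input exactly);
    values near 1 mean heavy smoothing (output lags the input). This is NOT
    the plugin's actual Kalman filter, just an illustration of the knob.
    """
    out, prev = [], None
    for s in samples:
        prev = s if prev is None else amount * prev + (1.0 - amount) * s
        out.append(prev)
    return out
```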

    Dead Zone

    Controller thumbsticks do not typically restore to an exact center value. Dead Zone can be used to specify a value range around thumbstick center that should be ignored. This can be used, for example, to prevent drift in pan/dolly/zoom when thumbsticks are mapped to these actions.
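A typical dead-zone implementation zeroes deflections inside the radius and rescales the remainder so full deflection still maps to 1.0; otherwise the usable range would be truncated. The rescaling choice and function name are our own sketch.

```python
def apply_dead_zone(deflection, dead_zone):
    """Zero out small deflections around center and rescale the rest.

    deflection is a normalized stick value in [-1, 1]; dead_zone is the radius
    around center to ignore. Rescaling keeps full deflection at exactly 1.0.
    """
    mag = abs(deflection)
    if mag <= dead_zone:
        return 0.0
    scaled = (mag - dead_zone) / (1.0 - dead_zone)
    return scaled if deflection > 0 else -scaled
```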

    Pan/Dolly/Boom

    Use VCS controls to Pan Left/Right and Up/Down. Pan in local, world, or a combination of coordinate systems. Adjust pan speeds on the fly with controls or scripts.

    Pitch/Tilt/Roll

    Absolute orientation at all times from the OptiTrack optical system.

    Free Move

    Absolute position at all times from the OptiTrack optical system. Scale movement in real-time with controllers or from script.

    Zoom

    Fully control camera zoom/FOV and zoom rates using the controller's analog thumbsticks and speed adjusters.

    Smooth

    Advanced Kalman filtering allows for customizing a "Steadicam" feeling.

    Axes

    Name of the controller’s analog input.

    Action

    Action to take or value to change.

    Param

    Input parameter used by some actions to modify the action in some way (e.g. speed up or slowdown zooming).

    Value

    Current value of the control input.

    Pan Right/Left

    [Pan Speed] [Curve Type]

    • 1.0 [pan at normal rate, linear curve]

    • 1.0 1 [pan at normal rate, ease-in curve]

    • 0.5 1 [pan at half speed, ease-in curve]

    • 2.0 [pan at 2x speed]

    Dolly In/Out

    [Pan Speed] [Curve Type]

    1.0

    Pan Up/Down

    [Pan Speed] [Curve Type]

    1.0

    Record

    Copy data from previous take.

    true

    Play

    None.

    Fullscreen

    Toggles between Fullscreen and the MotionBuilder GUI. On return to the MotionBuilder GUI, this parameter indicates the number of viewports to show.

    2

    Scale Translation

    Scale the physical movement (when tracking controller is moved).

    Scale Rotation

    Scale the physical movement (when tracking controller is moved).

    Offset Translation

    Can be used for two purposes:

    1. To align the center of the physical volume with the virtual scene.

    2. To effectively pan/truck/dolly the camera. This value is updated by the thumbstick controls for pan/dolly/truck operations.

    OffsetTranslationMode

    Affects how Offset Translation is applied to the camera:

    • 0 (Global): Translates the camera according to the MotionBuilder global coordinate system.

    • 1 (Local): Translates the camera according to the camera’s coordinate system.

    • 2 (LocalOnStart): Translates the camera according to the camera’s coordinate system at the moment the camera first moves (when the stick first moves), then keeps that axis rather than continuously updating the coordinate system.
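The difference between Global and Local can be shown with a yaw-only sketch in the ground plane: Global adds the offset in world axes, while Local rotates the offset into the camera's frame first. This 2D simplification and the function name are our own illustration (LocalOnStart would latch the camera frame once when the stick first moves).

```python
import math

def apply_offset(position, offset, camera_yaw_deg, mode):
    """Apply an (x, z) Offset Translation in Global (0) or Local (1) mode.

    Simplified to yaw-only rotation in the ground plane: Global applies the
    offset along world axes; Local rotates the offset by the camera's yaw
    so "forward" on the stick follows the camera's heading.
    """
    ox, oz = offset
    if mode == 1:  # Local: rotate the offset into the camera's frame
        r = math.radians(camera_yaw_deg)
        ox, oz = (ox * math.cos(r) - oz * math.sin(r),
                  ox * math.sin(r) + oz * math.cos(r))
    return (position[0] + ox, position[1] + oz)
```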

    Boom Global Always

    Always pan the camera up/down along the global Y axis, regardless of the OffsetTranslationMode.

    Figure captions (images not included in this export):
    • Asset Browser tab > Devices view of the plugins.
    • Navigator tab.
    • OptiTrack MotionBuilder Skeleton Plugin settings.
    • Controller orientation with CS-400 ground plane. Any ground plane can be used; just make sure it is oriented correctly along the physical -Z axis (the longest side of the ground plane).
    • Camera in 3D scene.
    • Insight VCS Controller Map tab.
    • VCS Plugin built-in curve options.
    • Insight VCS Properties.

    Play/Record

    Focal Length +/-

    RunScript

    Scale Updates Offset

  • Scale All .5 (+/- all translational scale by .5)

  • Translate All 1.0 (+/- all pan speeds by 1.0)

  • This action can be used to extend axis functionality without swapping profiles.