For our streaming applications, Unreal Engine 4 and 5 have essentially the same setup. The main difference is the UI and where to find the appropriate settings and buttons. All our guides on this Wiki have been updated to feature Unreal Engine 5. If you need assistance with Unreal Engine 4 please feel free to reach out to our support team.
The OptiTrack Unreal Engine Plugins allow you to stream real-time tracking data from Motive into Unreal Engine. This includes tracking data of Rigid Bodies, Skeletons, and HMDs that are tracked within Motive. This article focuses on the organization of those plugins. For basic instructions on setting up a motion capture system, please refer to the Getting Started guide instead.
OptiTrack - Live Link
OptiTrack - Streaming Client
OptiTrack - VR Latency Optimization
A variety of head mounted displays (HMDs) can be integrated using the OptiTrack OpenVR Driver.
For plugin version 1.23 and above, support for Oculus HMDs has been deprecated.
First, you'll want to follow the below instructions to set up the data streaming settings in Motive. Once this is configured, Motive will be broadcasting tracking data onto a designated network interface where client applications can receive them.
Open the Data Streaming Pane in Motive's Settings window and set the following settings:
Enable - Turn on the Enable setting at the top of the NatNet section.
Local Interface - Choose the desired IP network address from this dropdown to stream data over. (A short sketch after this settings list shows how to enumerate the available addresses.)
Loopback
This is the local computer IP address (127.0.0.1 or Localhost).
Used for streaming data locally on the PC running Motive, without interacting with the LAN.
A good option for troubleshooting network issues.
192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)
This IP address is the interface of the LAN either by Wi-Fi or Ethernet.
This will be the same address the Client application will use to connect to Motive.
Transmission Type
For streaming over a Wi-Fi network, setting the Transmission Type to Unicast is strongly advised.
Select desired data types to stream under streaming options:
Rigid Bodies - Enabled (required).
Skeletons - Optional for Skeleton tracking.
Markers (Labeled, Unlabeled, Asset) - Disabled for HMDs (advised).
Devices - Disabled.
Skeleton Coordinates
Set to Local.
Bone Naming Convention
Set the appropriate bone naming convention for the client application. For example, if the character uses the FBX naming convention, this will need to be set to FBX.
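If you're unsure which address to pick for the Local Interface, you can enumerate the machine's network interfaces. The following is a minimal, standalone C# sketch using only standard .NET APIs (no OptiTrack code involved); the loopback entry corresponds to 127.0.0.1, while a LAN entry will look like the 192.168.0.1x address described above.

```csharp
// Minimal sketch: list the IPv4 addresses available on this machine so you
// can pick the correct Local Interface value in Motive's streaming settings.
using System;
using System.Net.NetworkInformation;
using System.Net.Sockets;

class ListInterfaces
{
    static void Main()
    {
        foreach (NetworkInterface nic in NetworkInterface.GetAllNetworkInterfaces())
        {
            foreach (UnicastIPAddressInformation addr in nic.GetIPProperties().UnicastAddresses)
            {
                if (addr.Address.AddressFamily == AddressFamily.InterNetwork)
                {
                    // e.g. "Ethernet: 192.168.0.12" or "Loopback ...: 127.0.0.1"
                    Console.WriteLine($"{nic.Name}: {addr.Address}");
                }
            }
        }
    }
}
```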
Additional Tips
For best results, it is advised to run Motive and Unreal Engine on separate computers, so that they are not competing for processing resources.
When streaming the data over a Wi-Fi network, Unicast transmission must be used.
In order to stream data from Edit mode, a capture recording must be playing back in Motive.
For additional information on data streaming in general, read through the Data Streaming page.
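As an illustration of the transport layer only (not the NatNet protocol itself), the sketch below joins NatNet's documented default multicast group (239.255.42.99) on the default data port (1511) and waits for a packet. This can help confirm that Motive's Multicast stream reaches a client machine through the firewall; it does not apply to Unicast, where the client requests data through the NatNet command channel (handled by the SDK and the plugins), and it does not decode frames — use the NatNet SDK or the engine plugins for that.

```csharp
// Minimal connectivity check, assuming NatNet's documented defaults
// (data port 1511, multicast group 239.255.42.99). This only verifies that
// raw frame packets arrive; it does not parse the NatNet protocol.
using System;
using System.Net;
using System.Net.Sockets;

class MulticastProbe
{
    static void Main()
    {
        using var udp = new UdpClient(1511);                       // NatNet data port
        udp.JoinMulticastGroup(IPAddress.Parse("239.255.42.99"));  // NatNet multicast group
        var remote = new IPEndPoint(IPAddress.Any, 0);
        byte[] packet = udp.Receive(ref remote);                   // blocks until a frame arrives
        Console.WriteLine($"Received {packet.Length} bytes from {remote} — streaming reaches this host.");
    }
}
```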
The Unreal Engine: OptiTrack Live Link Plugin is primarily intended for Virtual Production applications that require use of real-time cameras or Skeleton tracking in Unreal Engine. It interfaces with Unreal Engine's built-in Live Link plugin to allow streaming data from Motive both in real time and at runtime. When timecode data is available in Motive, streamed timestamps can be used to align the data with other components within the project.
The Unreal Engine: OptiTrack Streaming Client Plugin is primarily used for Virtual Reality due to its low-latency optimizations. It provides streaming of Rigid Body and Skeleton tracking data from Motive into Unreal Engine, and it also includes helper functions and actors for quick integration of the plugin into the scene.
The OptiTrack - VR Latency Optimization plugin is an optional plugin for SteamVR projects. When enabled, it provides HMD render compensation to minimize latency in VR applications. For users developing for SteamVR with the OptiTrack plugin, whether or not they use the OptiTrack OpenVR Driver, enabling this plugin is suggested.
This version of the plugin is compatible with Unreal Engine 5.
OptiTrack motion capture systems can be used to track head mounted displays (HMD) and integrate the tracking data into Unreal Engine for VR applications. For instructions on integrating HMD tracking data into Unreal Engine, please refer to the corresponding page:
Supported HMDs
At the time of writing, the following HMDs are supported:
HTC VIVE
HTC VIVE Pro 1/2
Valve Index
HP Reverb G2
Deprecated support for Oculus HMDs:
Support for Oculus integration has been deprecated starting from UE plugin version 1.23; plugin version 1.22 or below must be used for Oculus HMDs.
Vive and Valve Index HMDs are supported through the OpenVR driver.
When setting up multiplayer games with wireless clients, it is more beneficial for each client to make a direct connection to both the tracking server (Motive) and the game server, rather than rebroadcasting the streamed tracking data through the game server. Any game-related actions that interact with the tracking data can then be processed on the game server, which sends the corresponding updates to the wireless clients. This way, the wireless clients only receive tracking data and game updates without having to send information back; in other words, it minimizes the number of data transfers needed. If wireless clients are sending data, there will be a minimum of two transfers on the wireless network, and each transfer over the wireless network is at risk of latency or lost packets.
The OptiTrack VR driver lets you stream tracking data of the head-mounted display (HMD) and the controllers from Motive into SteamVR. The plugin ships in an installer package (MSI), and it will set up the driver along with a tool for configuring streaming settings. Using this, the tracking data from Motive can be used to override the tracking of the VR system in any applications that are compatible with SteamVR.
Supported HMDs:
Vive
Vive Pro
Vive Pro 2
Valve Index
HP Reverb
Motive version 2.2 or higher
SteamVR installed on the host computer.
For the camera system to track the HMD, a set of markers must be attached to the HMD. You can use either active markers (Active HMD clip or Active Tags) or passive markers. Passive markers are retroreflective markers that reflect infrared light emitted by the IR LEDs on the camera. Active markers, on the other hand, are LED markers that emit IR light and can be uniquely identified by the system.
When using the active markers, you can conveniently put a set of 8 markers onto the HMD by using the HMD Clip, or you can attach the markers from the Tag manually onto the HMD using adhesives and marker posts.
Active HMD Clip
The Active HMD Clip is an HMD enclosure with a total of 8 active markers embedded for tracking. At the time of writing, active HMD clips for Vive Pro / Valve Index HMDs are available on the webstore. The clip can be mounted easily by pushing it onto the HMD until the latches click, and you can detach it by gently lifting the three latches located at the top, left, and right sides of the clip.
Marker Types
Marker Placement
Make sure the markers are attached securely and do not move. If the markers happen to move even slightly after a Rigid Body is defined, it will negatively affect the tracking and the Rigid Body definition may need to be updated.
Avoid placing multiple markers in close vicinity as they may overlap in the camera view in certain orientations.
Using marker posts to extend out the markers is recommended to improve marker visibility from more angles.
If you are using the active markers, there is an extra USB port on the HMD that you could draw the power from.
When using an OptiTrack system for VR applications, it is important that the pivot point of the HMD Rigid Body is placed at the appropriate location, which is at the root of the nose in between the eyes. When using the HMD clips, you can utilize the HMD creation tools in the Builder pane to have Motive estimate this spot and place the pivot point accordingly. These tools use the known marker configuration on the clip to precisely position the pivot point and set the desired orientation.
Under the Type drop-down menu, select HMD. This will bring up the options for defining an HMD Rigid Body.
If the selected marker matches one of the Active clips, it will indicate which type of Active Clip is being used.
Under the Orientation drop-down menu, select the desired orientation of the HMD. The orientation used for streaming to Unity is +Z forward, and for Unreal Engine it is +X forward; alternatively, you can specify the expected orientation axis on the client plugin side.
Hold the HMD at the center of the tracking volume where all of the active markers are tracked well.
Click Create. An HMD Rigid Body will be created from the selected markers and it will initiate the calibration process.
During calibration, slowly rotate the HMD to collect data samples in different orientations.
Once all necessary samples are collected, the calibrated HMD Rigid Body will be created.
SteamVR Required: The VR driver streams tracking data through SteamVR. Please make sure SteamVR is installed on the computer before setting up the driver.
Download and run the installer
You may receive a warning window prior to the installation wizard. To circumvent this, select More info and then Run Anyway.
Open the configuration program
Once the driver has been successfully installed, launch the configuration utility software (C:\Program Files\OptiTrack\OpenVR Driver\ConfigUtil). Using this tool, you can load and check existing configurations and make changes to the settings as needed. To import current settings, click Load and to save out the changes, click Save.
Please make sure you are running this tool with admin privileges; if not, it might not be able to modify the settings properly. If the configuration software detects a running instance of SteamVR through OpenVR, it will be indicated as Initialized at the very top, as shown in the image. Please note that if the settings are modified while SteamVR is running, SteamVR must be restarted to apply the changes.
Configure connection settings
First, configure the connection settings so that the driver listens to the Motive server where the tracking data is streamed from. The server address must match the address where Motive is streaming the data to, and the local address must match the IP address of the computer on the network where the driver is installed.
Save the configurations by clicking on Save. This will modify the set of configurations in the steamvr.settings file in the steam installation directory and they will override the HMD tracking with the tracking data from Motive. If you already had an instance of OpenVR or SteamVR running, restart the application to apply the changes.
Configuration File
The configuration tool imports and modifies the contents of the steamvr.settings file (C:\Program Files (x86)\Steam\config\steamvr.settings). When needed, the driver-related settings can also be changed directly in this file, but it is easier to configure them using the provided configuration tool.
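If you want to confirm what the configuration tool actually wrote, you can inspect the file directly. The sketch below is a minimal C# example that parses steamvr.settings with System.Text.Json and prints its top-level sections; it deliberately avoids assuming any particular OptiTrack key names, since those may vary by driver version.

```csharp
// Minimal sketch: dump steamvr.settings to see what the configuration tool
// wrote. The file is plain JSON; this prints its top-level sections without
// assuming any particular OptiTrack key names.
using System;
using System.IO;
using System.Text.Json;

class DumpSteamVrSettings
{
    static void Main()
    {
        string path = @"C:\Program Files (x86)\Steam\config\steamvr.settings";
        using JsonDocument doc = JsonDocument.Parse(File.ReadAllText(path));
        foreach (JsonProperty section in doc.RootElement.EnumerateObject())
        {
            Console.WriteLine(section.Name);   // driver sections, among others
        }
    }
}
```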
Launch SteamVR. If the driver is successfully set up, you should see a tracker icon added to the right of the HMD icon and the HMD will now be using the motion capture system instead of the base stations. Here, please make sure all of the lighthouse base stations are powered off.
VIVE controllers are a Beta feature and may not work for every device. Support for this particular feature is limited.
When needed, the Vive controllers can be configured as well. To do so, open the configuration utility tool while SteamVR is running. At the top of the configuration tool, it should indicate the OpenVR status as Initialized, and the controllers must be showing up in SteamVR. Then, in the controller sections, enable the controllers, specify the override device using the drop-down menu, and input the corresponding streaming ID of the controller Rigid Bodies in Motive. Once everything has been configured, save the changes and restart SteamVR. When the override is configured properly, SteamVR will show an additional tracker icon for each enabled controller.
Now that the driver is set up, the HMD tracking will be overridden by tracking data from the mocap camera system, and you can integrate HMDs into the game engine through their own VR integration.
Broadcast Frame Data must be set to true.
Local interface must be set to the desired IP address to stream the tracking data from.
Streaming of Rigid Bodies must be set to True.
For wireless streaming, use Unicast streaming type.
Please make sure the Unity project is configured for OpenVR development. In Unity, open the player settings from Edit → Project Settings → Player and select OpenVR in the Virtual Reality SDKs list. Once this is set up properly, the scene will play on the HMD.
Make sure Unreal Engine is configured for SteamVR development. Please refer to the Unreal Engine's documentation for more information on developing for SteamVR.
As of OpenVR Driver 2.1.0, the default auto-detection port is 1513. Where a firewall must be configured to allow or disallow individual ports, this port can be opened to allow the OpenVR Driver to connect to Motive automatically.
Client Origin
When using both the VR driver and the plugins (UE/Unity), it is important that the client origin object is located at the origin without any rotations. In other words, it must have its position set to (0,0,0) and its rotation set to (0,0,0).
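For the Unity side, a minimal sketch of enforcing this requirement from script is shown below; ClientOriginGuard is an illustrative name, not part of the OptiTrack plugin. (In Unreal Engine, the equivalent is simply zeroing the Client Origin actor's transform in the Details panel.)

```csharp
// Minimal Unity sketch: force the client origin to the global origin with no
// rotation, as required when combining the OpenVR driver with the plugins.
// "ClientOriginGuard" is an illustrative name, not part of the OptiTrack plugin.
using UnityEngine;

public class ClientOriginGuard : MonoBehaviour
{
    void Awake()
    {
        // Attach this to the client origin object; it resets any accidental offset.
        transform.position = Vector3.zero;
        transform.rotation = Quaternion.identity;
    }
}
```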
Notes for Unreal Engine Users
When using the Unreal Engine plugin, you will need to additionally create a custom pawn for properly aligning the coordinate systems between SteamVR and OptiTrack UE plugin:
Create a new pawn. Right-click in Content Browser, and from the context menu, select Blueprint → Blueprint Class → Pawn.
Load created Blueprint in the editor and add a camera component.
(optional) Double-check that the “Lock to HMD” property is set to true under the camera component properties in the details pane.
Select the pawn and set the “Base Eye Height” property to 0 in the details pane.
Compile the pawn then add it to the scene.
Select the pawn and set the “Auto Possess Player” to “Player 0”.
The HMD should now be working for Levels built for VR.
This page provides instructions on how to use the OptiTrack Unreal Engine Live Link plugin. The plugin communicates with Unreal's built-in Live Link system by providing a Live Link source for receiving tracking data streamed from Motive. This plugin can be used for controlling cameras and objects in virtual production applications. When needed, the OptiTrack Unreal Engine Plugin can also be used alongside this plugin. For a specific guide to InCamera VFX (i.e. LED Wall Virtual Production), please see the wiki page Unreal Engine: OptiTrack InCamera VFX.
1. [Motive] Setup rigid body streaming in Motive.
Get Motive streaming with at least one rigid body or Skeleton asset. Make sure the Streaming settings are configured correctly, and the asset is active under the Assets pane.
2. [UE] Install the OptiTrack plugins in Unreal Engine (UE).
You can install the OptiTrack Unreal Engine plugin by putting the plugin files into one of the following directories:
A global engine plugin can be placed in C:\Program Files\Epic Games\[Engine Version]\Engine\Plugins
A project-specific plugin can be placed in [Project Directory]\Plugins
3. [UE] Enable the plugins in the UE project.
Go to Edit → Plugins and enable the two required plugins: the OptiTrack - Live Link plugin under the Installed group, and the built-in Live Link plugin under the Built-In group.
4. [UE] Open the LiveLink pane
Open the LiveLink pane from Window → Virtual Production → Live Link in the toolbar.
5. [UE] Configure and create a new OptiTrack source
In the LiveLink pane under Source options, go to the OptiTrack Source menu, configure the proper connection settings, and click Create. Please make sure the network settings match those configured in the Streaming pane in Motive.
6. [UE] Check the Connection.
If the streaming settings are correct and the connection to Motive server is successful, then the plugin will list out all of the detected assets. They should have green dots next to them indicating that the corresponding asset has been created and is receiving data. If the dots are yellow, then it means that the client has stopped receiving data. In this case, check if Motive is still tracking or if there is a connection error.
1. Add the camera object or static mesh object that you wish to move
Add a camera actor from the Place Actors pane or a static mesh from the project into your scene. For the static meshes, make sure their Mobility setting is set to Movable under the Transform properties.
2. Add a LiveLinkController Component
Select the actor you want to animate. In the Details tab, select the actor (Instance). In the search bar, type Live Link, then click Live Link Controller in the populated list.
3. Select the target rigid body
Under the Live Link properties in the Details tab click in the Subject Representation box and select the target rigid body.
4. Check
Once the target rigid body is selected, each object with the Live Link Controller component attached and configured will be animated in the scene.
When the camera system is synchronized to another master sync device and a timecode signal is fed into the eSync 2, the received timecode can be used in the UE project through the plugin.
1. Set Timecode Provider under project settings
From Edit → Project Settings, search for timecode; under Engine - General Settings, you should find the timecode settings. Here, set the Timecode Provider to LiveLinkTimeCodeProvider.
2. Set OptiTrack source in the Live Link pane as the Timecode Provider
Open the Live Link pane, and select the OptiTrack subject that we created when first setting up the plugin connection. Then, under its properties, check the Timecode Provider box.
3. Check
The timecode from Motive should now be visible in the Take Recorder pane, which can be found under Window → Cinematics → Take Recorder in the toolbar.
1. Create a new Animation Blueprint
Right-click the mesh you would like to use and select "Create > Anim Blueprint".
2. Name and Open the Animation Blueprint
Name the animation blueprint something appropriate, then double click it to open the blueprint.
3. Hook up your Blueprint
Create a "Live Link Pose" component and connect it to the "Output Pose". Assign the "Live Link Subject Name" to the Skeleton that you would like to use.
Change the "Retarget Asset" property in the Details pane of the blueprint editor to "OptiTrackLiveLinkRetarget"
4. Getting the Skeleton to Animate
To animate the Skeleton in real time, select the Animation Blueprint from earlier. In the Details pane, under the Skeleton, add a "Live Link Skeleton Animation" component. After you add that component, the mesh should start animating.
To animate the Skeleton in a game, just press the play button. Adding the "Live Link Skeleton Animation" object is not necessary to animate in play mode.
Debugging Note
If the retargeting doesn't match the mesh correctly, then you can create a new OptiTrackLiveLinkRetarget blueprint from scratch and modify the bone mapping names.
Animating a MetaHuman follows basically the same steps as any other Skeleton, but requires hooking into the Skeleton at a very specific location. For more information about MetaHuman setup beyond the scope of this guide, please visit Epic Games' website.
1. Find the Skeletal Mesh
Navigate to the Skeletal Mesh for your MetaHuman. This is typically located in a folder such as MetaHumans > [Name] > [Female/Male] > [Height] > [Weight] > Body. Double click the Skeletal Mesh to open the blueprint.
2. Open the AnimGraph Tab
Click the "Blueprint" option on the top bar of new window. In the bottom left corner navigate to My Blueprint > Animation Graphs > AnimGraph and double click.
3. Hook up your Blueprint
You'll see a very complex AnimGraph already set up. Make a new Live Link Pose object as in the Skeleton creation steps. Connect the Input Pose or Control Rig to the input of the Live Link Pose. Connect the output of the Live Link Pose to the Output Pose AnimGraph object.
4. Retarget in the Details Tab
The last step in this window is to set the Retarget Asset to OptiTrackLiveLinkRetarget for the Live Link Pose node. To do this, click the dropdown under Retarget Asset in the Details tab and select OptiTrackLiveLinkRetarget. After it has been set, click Compile in the top left of the window. You may now close this window and move on to the next steps.
5. Level of Detail (LOD)
MetaHumans change their Level of Detail (LOD), i.e. how complex the asset is, based on how far the camera is from the actor, among other factors. In order for the character to animate correctly, the Forced LOD must be a minimum of 1 (the default is -1). To change this setting, click on the (Instance) you wish to change in the Details tab on the main workspace window. Below the selected (Instance), select the LOD tab. From here you can change the value in the Forced LOD Model field.
6. Animate your MetaHuman
At this point, if you drag the base MetaHuman object into the scene, it will animate like other Skeletons.
For testing the project in standalone game mode, or when developing an nDisplay application, the Live Link plugin settings must be saved out and selected as the default preset to be loaded into the project. If this is not done, the configured settings may not be applied. After configuring the Live Link plugin settings, save the preset from the Live Link pane first. Then open the Project Settings and find the Live Link section in the sidebar. Here, you can select the default Live Link preset to load into the project, as shown in the screenshot below. Once the preset is properly saved and loaded, the corresponding plugin settings will be applied to the standalone game mode.
If all the configuration is correct, the actors will get animated in the newly opened game window when playing the project in the standalone game mode.
Another path to get data into Unreal Engine is to stream data from Motive -> MotionBuilder (using the OptiTrack MotionBuilder Plugin) -> Unreal Engine (using the Live Link plugin for MotionBuilder). This has the benefit of using the Human IK (HIK) retargeting system in MotionBuilder, which will scale characters of very different sizes/dimensions better than the base Live Link plugin. More information can be found by consulting Unreal Engine's Live Link to MotionBuilder documentation.
This page provides instructions on how to configure VCS inputs in Unreal Engine. The basic configuration is similar to configuring any other input triggers in Unreal Engine. Please note that only one VCS controller can be connected and configured at a time; having two controllers connected simultaneously is not supported.
Create a Rigid Body from your tracking controller’s markers using the Builder pane or by selecting the markers and using the keyboard hotkey CTRL + T. You'll want to orient the controller along the +Z axis during creation to define the 'neutral' or 'zero' orientation.
Start up a project in Unreal Engine (UE).
Go to Edit tab → Plugins to open the plugins panel. Enable the Windows RawInput plugin under the Input Devices group.
In Edit tab → Project Settings, scroll to the bottom of the left side panel until you see Raw Input under the Plugins group. Here you will let the UE project know which input devices to use.
To find these IDs, you will need to look at the Windows device properties. Go to Windows Control Panel -> Devices and Printers, then right-click the VCS controller to access its properties. In the properties, go to the Hardware tab and click Properties for “HID-compliant game controller”.
Once you access the controller properties, go to the Details tab. Select Hardware ID in the drop-down menu, and the vendor ID (VID) and product ID (PID) will be shown in the highlighted section.
Under the Raw Input plugin properties in the project settings panel, input both the vendor ID and the product ID found in the controller properties.
Register the Input Buttons
Now that the project has the IDs to look for the controllers, the next step is to set up and register the input buttons. To do so, play the project scene and trigger the buttons to register them.
In UE, hit Play and press (~) to access the console. In the console, input the command "ShowDebug INPUT". This will list all of the input actions on the left side of the viewport.
Use all of the keys on the controller to register the inputs; three axes and seven buttons in total. Please note that these keys may not exactly match the keys on your controller.
Axis 1: Joystick left/right
Axis 2: Joystick up/down
Axis 3: Knob rotate
Button 1: Blue
Button 2: Black
Button 3: White
Button 4: Red
Button 6: Joystick click
Button 7: Knob click
Map the Registered Inputs
Now that the buttons have been registered, the next step is to map the keys. They are mapped under Edit → Project Settings → Inputs. Choose either the Axis mapping or the Action mapping to map the controls to the desired actions.
Now that all of the buttons are set up, use them to control the VCS in UE.
The OptiTrack Unity Plugin allows you to stream real-time Rigid Body, Skeleton, and HMD tracking data from Motive into Unity. Using the streamed data, objects and characters in the scene can be animated. The plugin contents are distributed in unitypackage format, and you can simply load this file into Unity projects to import its contents. Once imported, the included C# scripts can be used for instantiating a client origin and receiving the tracking data. This article focuses on how to set up and use the plugin.
Unity Version: 2017.2/2017.1 or above (2020.3+ recommended).
Visual Studio 2019 or latest Visual C++ Redistributable
Notes on HMD Integration
Streaming in Motive
Enable - Turn on the Enable setting at the top of the NatNet section.
Local Interface - Choose the desired IP network address from this dropdown to stream data over.
Transmission Type - Typically you will want to set this to Unicast, since it subscribes only to the data you wish to use and normally uses less network bandwidth. This is especially advised when streaming data over Wi-Fi.
(Optional) If using Multicast, then enable/disable the desired data types. For tracking HMDs, disabling the Marker streaming is advised.
Additional Tips
In order to stream data from Edit mode, a capture recording must be playing back in Motive.
For best results, it is advised to run Motive and Unity on separate computers, so that they are not competing for processing resources.
When streaming the data over a Wi-Fi network, Unicast transmission must be used.
While in the Unity project, double-click on the plugin unitypackage file and import the plugin assets into the project. When the package has been successfully imported, the following contents will be available within the project:
Plugin Contents
In order to receive tracking data from a server application (e.g. Motive), a client object must be set up. The OptitrackStreamingClient.cs script can be attached to any object to stream data relative to that object. Typically, this script is attached to an empty object, or loaded in using the "Client - OptiTrack" prefab object in the Assets/Optitrack/Prefabs folder.
[Unity] Under the Prefabs folder, import the "Client - OptiTrack" prefab object into the scene, or attach OptitrackStreamingClient.cs script onto an empty object.
[Unity] In the streaming Client object, configure the connection settings to match the streaming settings in Motive.
Server Address - IP address of the PC that the server application (Motive) is running on.
Local Address - Local IP address of the PC that the client application (Unity) is running on. (Typically this is on the same subnet as the Server Address, differing only in the last group of digits.)
Connection Type - Must match Motive. Unicast is recommended.
[Unity] If you wish to receive tracking data from more than one server instance, you may create multiple objects with the client script attached.
Although it is not strictly necessary, you may find it helpful to organize your tracked objects as children of the streaming Client object. This allows you to move the Client object to reposition all of the streamed objects relative to it.
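The steps above can also be performed from script. The following is a minimal sketch, assuming the client's inspector fields are exposed as ServerAddress, LocalAddress, and ConnectionType; verify these names against the OptitrackStreamingClient.cs file shipped with your plugin version.

```csharp
// Sketch: create the streaming client from script instead of using the
// "Client - OptiTrack" prefab. Field names are assumptions based on the
// inspector entries described above.
using UnityEngine;

public class ClientBootstrap : MonoBehaviour
{
    void Awake()
    {
        var clientObject = new GameObject("Client - OptiTrack");
        var client = clientObject.AddComponent<OptitrackStreamingClient>();
        client.ServerAddress = "192.168.0.10"; // PC running Motive (example address)
        client.LocalAddress  = "192.168.0.11"; // this PC (example address)
        // client.ConnectionType = ...         // match Motive; Unicast recommended
    }
}
```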
[Unity] On an object that you wish to animate, attach the OptitrackRigidBody.cs script.
[Unity] In the Streaming Client entry, link the Client object to which the OptitrackStreamingClient.cs script is attached. By default, it searches for an existing client instance, but this must be specified when there is more than one streaming client object.
[Motive] Make sure Motive is tracking and streaming the data.
[Unity] Play the scene. The linked object will be animated according to the associated Rigid Body movement in Motive.
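As a scripted alternative to the steps above, here is a minimal sketch; the RigidBodyId field name is an assumption based on the inspector entry and the Streaming ID in Motive, and should be checked against OptitrackRigidBody.cs in your package.

```csharp
// Sketch: wire a scene object to a Motive Rigid Body from script.
// Field names are assumptions; confirm against OptitrackRigidBody.cs.
using UnityEngine;

public class PropSetup : MonoBehaviour
{
    public OptitrackStreamingClient streamingClient; // leave null to auto-find
    public int streamingId = 1;                      // Streaming ID assigned in Motive

    void Awake()
    {
        var rb = gameObject.AddComponent<OptitrackRigidBody>();
        rb.StreamingClient = streamingClient;
        rb.RigidBodyId = streamingId;
    }
}
```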
Note: At the time of writing, Mecanim does not support explicit goals for inverse kinematics end-effectors when using real-time retargeting. In addition, you may observe a difference in the overall scale of the position data between the retargeted skeletal animations and streamed Rigid Bodies. These two limitations may lead to inconsistencies with actors interacting with Rigid Body props, and will hopefully be addressed in a future version of the integration.
[Unity] On Unity characters, attach OptitrackSkeletonAnimator.cs script as one of its components.
[Unity] For the Streaming Client entry, link the Client object to which the OptitrackStreamingClient.cs script is attached. By default, it searches for an existing client instance, but this must be specified when there is more than one streaming client object.
[Unity] Enter the Skeleton Asset Name assigned in Motive.
[Unity] For the Destination Avatar entry, link to the character's avatar component.
[Motive] Make sure Motive is tracking and streaming the data.
[Unity] Play the scene. When everything is set up properly, the linked avatar in Unity will be animated according to the streamed skeleton in Motive. The position of the actor will be in its reference position as explained above.
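A scripted version of the Skeleton setup might look like the sketch below; the StreamingClient, SkeletonAssetName, and DestinationAvatar field names are assumed from the inspector entries described above and should be confirmed in OptitrackSkeletonAnimator.cs.

```csharp
// Sketch: drive a character's Avatar from a Motive Skeleton.
// Field names are assumptions; confirm against OptitrackSkeletonAnimator.cs.
using UnityEngine;

public class CharacterSetup : MonoBehaviour
{
    public OptitrackStreamingClient streamingClient;
    public Avatar destinationAvatar;                // the character's Avatar asset
    public string skeletonAssetName = "Skeleton1";  // name assigned in Motive

    void Awake()
    {
        var anim = gameObject.AddComponent<OptitrackSkeletonAnimator>();
        anim.StreamingClient = streamingClient;
        anim.SkeletonAssetName = skeletonAssetName;
        anim.DestinationAvatar = destinationAvatar;
    }
}
```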
[Unity] On the OptiTrack Streaming Client instance, enable the Draw Markers, Draw Cameras, or Draw Force Plates setting(s).
[Motive] Make sure that marker streaming is enabled in Motive if you wish to visualize markers.
[Unity] Make sure the streaming setting is set up correctly, and play the scene.
[Unity] Each marker, camera, or force plate will be drawn in the scene, as shown in the screenshot below. (Note: Only markers will animate.)
Supported HMDs
At the time of writing, the following HMDs are supported:
HTC VIVE
HTC VIVE Pro
HTC VIVE Pro 2
Valve Index
HP Reverb
When setting up multiplayer games with wireless clients, it is more beneficial for each client to make a direct connection to both the tracking server (Motive) and the game server, rather than rebroadcasting the streamed tracking data through the game server. Any game-related actions that interact with the tracking data can then be processed on the game server, which sends the corresponding updates to the wireless clients. This way, the wireless clients only receive tracking data and game updates without having to send information back; in other words, it minimizes the number of data transfers needed. If wireless clients are sending data, there will be a minimum of two transfers on the wireless network, and each transfer over the wireless network is at risk of latency or lost packets.
This page provides instructions on setting up the OptiTrack OpenVR Driver for integrating an OptiTrack system with Vive HMDs within SteamVR applications, including Unreal Engine and Unity.
For integrating Vive HMDs, the OptiTrack OpenVR Driver must be used. This driver lets you track the head-mounted display (HMD) and the VR controllers using an OptiTrack motion capture system and stream the tracking data from Motive directly into SteamVR. In other words, this will override the tracking from the lighthouse base stations. The plugin ships as an installer package (MSI) which will set up the driver along with a utility tool for configuring client streaming settings. Once integrated, the streamed tracking data can be used in any application platform that utilizes SteamVR. For tracking objects other than HMDs, please read through the corresponding plugin page for details.
Supported Systems
Vive
Vive Pro 1/2
Valve Index
HP Reverb G2
When developing SteamVR applications that use the OpenVR Driver to track the HMD in Unreal Engine 4, the OptiTrack - Streaming Client version 2.27 must be used, and the OptiTrack - VR Latency Optimization version 2.27 plugin is suggested. The OptiTrack - VR Latency Optimization plugin provides HMD render compensation that helps to minimize latency in VR applications.
The latest plugins that support Unreal Engine 5 are OptiTrack - Live Link version 3.0 and OptiTrack - Streaming Client version 3.0.
General Setup Steps
Attach the markers on the HMD
Create a Rigid Body asset
Calibrate the Pivot Point of the Rigid Body
Configure the Rigid Body settings in Motive
For the camera system to track the HMD, a set of markers must be attached to the HMD. You can use either active markers (Active HMD clip or Active Tags) or passive markers. Passive markers are retroreflective markers that reflect infrared light emitted by the IR LEDs on the camera. Active markers, on the other hand, are LED markers that emit IR light and can be uniquely identified by the system.
When using the active markers, you can conveniently put a set of 8 markers onto the HMD by using the HMD Clip, or you can attach the markers from the Tag manually onto the HMD using adhesives and marker posts.
Active HMD Clip
The Active HMD Clip is an HMD enclosure with a total of 8 active markers embedded for tracking. At the time of writing, active HMD clips for Vive Pro / Valve Index HMDs are available on the webstore. The clip can be mounted easily by pushing it onto the HMD until the latches click, and you can detach it by gently lifting the three latches located at the top, left, and right sides of the clip.
Marker Types
Marker Placement
Make sure the markers are attached securely and do not move. If the markers happen to move even slightly after a Rigid Body is defined, it will negatively affect the tracking and the Rigid Body definition may need to be updated.
Avoid placing multiple markers in close vicinity as they may overlap in the camera view in certain orientations.
Using marker posts to extend out the markers is recommended to improve marker visibility from more angles.
If you are using the active markers, there is an extra USB port on the HMD that you could draw the power from.
When using an OptiTrack system for VR applications, it is important that the pivot point of the HMD Rigid Body is placed at the appropriate location, which is at the root of the nose in between the eyes. When using the HMD clips, you can utilize the HMD creation tools in the Builder pane to have Motive estimate this spot and place the pivot point accordingly. These tools use the known marker configuration on the clip to precisely position the pivot point and set the desired orientation.
Steps
Under the Type drop-down menu, select HMD. This will bring up the options for defining an HMD Rigid Body.
If the selected marker matches one of the Active clips, it will indicate which type of Active Clip is being used.
Under the Orientation drop-down menu, select the desired orientation of the HMD. The orientation used for streaming to Unity is +Z forward, and for Unreal Engine it is +X forward; alternatively, you can specify the expected orientation axis on the client plugin side.
Hold the HMD at the center of the tracking volume where all of the active markers are tracked well.
Click Create. An HMD Rigid Body will be created from the selected markers and it will initiate the calibration process.
During calibration, slowly rotate the HMD to collect data samples in different orientations.
Once all necessary samples are collected, the calibrated HMD Rigid Body will be created.
SteamVR Required: The VR driver streams tracking data through SteamVR. Please make sure SteamVR is installed on the computer before setting up the driver.
You may receive a warning window prior to the installation wizard. To circumvent this, select More info and then Run Anyway.
Once the driver has been successfully installed, launch the configuration utility software (C:\Program Files\OptiTrack\OpenVR Driver\ConfigUtil). Using this tool, you can load and check existing configurations and make changes to the settings as needed. To import current settings, click Load and to save out the changes, click Save.
Please make sure you are running this tool with admin privileges; if not, it might not be able to modify the settings properly. If the configuration software detects a running instance of SteamVR through OpenVR, it will be indicated as Initialized at the very top, as shown in the image. Please note that if the settings are modified while SteamVR is running, SteamVR must be restarted to apply the changes.
First, configure the connection settings so that the driver listens to the Motive server where the tracking data is streamed from. The server address must match the address where Motive is streaming the data to, and the local address must match the IP address of the computer on the network where the driver is installed.
Save the configurations by clicking on Save. This will modify the set of configurations in the steamvr.settings file in the steam installation directory and they will override the HMD tracking with the tracking data from Motive. If you already had an instance of OpenVR or SteamVR running, restart the application to apply the changes.
Configuration File
The configuration tool imports and modifies the contents of the steamvr.settings file (C:\Program Files (x86)\Steam\config\steamvr.settings). When needed, the driver-related settings can also be changed directly in this file, but it is easier to configure them using the provided configuration tool.
Confirm the setup
Launch SteamVR. If the driver is successfully set up, you should see a tracker icon added to the right of the HMD icon and the HMD will now be using the motion capture system instead of the base stations. Here, please make sure all of the lighthouse base stations are powered off.
VIVE controllers are a Beta feature and may not work for every device. Support for this particular feature is limited.
When needed, the Vive controllers can be configured as well. To do so, open the configuration utility tool while SteamVR is running. At the top of the configuration tool, it should indicate the OpenVR status as Initialized, and the controllers must be showing up in SteamVR. Then, in the controller sections, enable the controllers, specify the override device using the drop-down menu, and input the corresponding streaming ID of the controller Rigid Bodies in Motive. Once everything has been configured, save the changes and restart SteamVR. When the override is configured properly, SteamVR will show an additional tracker icon for each enabled controller.
Now that the driver is set up, the HMD tracking will be overridden by tracking data from the mocap camera system, and you can integrate HMDs into the game engine through their own VR integration.
The Enable setting must be toggled on.
Local interface must be set to the desired IP address to stream the tracking data from.
Streaming of Rigid Bodies must be set to True.
For wireless streaming, use Unicast streaming type.
Once Motive is configured for streaming, launch SteamVR Home to check the connection. If everything is set up correctly, you should be able to move around, or translate, within the scene freely. You may also need to check the ground plane to make sure it's well aligned.
If you experience any unexpected rotations in the view as you move your head, it could indicate that the HMD pivot point has not been calibrated properly. Please revisit the HMD Setup section and make sure the HMD Rigid Body pivot point is positioned and oriented at the expected pivot, which is at the root of the nose with Z forward.
Make sure Unreal Engine is configured for SteamVR development. Please refer to the Unreal Engine's documentation for more information on developing for SteamVR.
Client Origin
When using the OpenVR driver for the HMD and the game engine plugins (UE/Unity) for other types of tracking data, including Rigid Body data, the client origin object must be located at the global origin without any rotations. In other words, the position must be set to (0,0,0) and the rotation must be set to (0,0,0) on the client origin. This is important because it aligns the coordinate system of the (UE/Unity) plugin with the coordinate system of the OpenVR pipeline.
Notes for Unreal Engine Users
When using the Unreal Engine plugin, you will need to additionally create a custom pawn for properly aligning the coordinate systems between SteamVR and OptiTrack UE plugin:
Create a new pawn. Right-click in Content Browser, and from the context menu, select Blueprint → Blueprint Class → Pawn.
Load created Blueprint in the editor and add a camera component.
(optional) Double-check that the “Lock to HMD” property is set to true under the camera component properties in the details pane.
Select the pawn and set the “Base Eye Height” property to 0 in the details pane.
Compile the pawn then add it to the scene.
Select the pawn and set the “Auto Possess Player” to “Player 0”.
The HMD should now be working for Levels built for VR.
This page provides instructions on setting up the OptiTrack OpenVR Driver for integrating an OptiTrack system with Vive HMDs within SteamVR applications, including Unreal Engine and Unity.
For integrating Vive / Vive Pro / Valve Index HMDs, the OptiTrack OpenVR Driver must be used. This driver lets you track the head-mounted display (HMD) and the VR controllers using an OptiTrack motion capture system and stream the tracking data from Motive directly into SteamVR. In other words, this will override the tracking from the lighthouse base stations. The plugin ships as an installer package (MSI) which will set up the driver along with a utility tool for configuring client streaming settings. Once integrated, the streamed tracking data can be used in any application platform that utilizes SteamVR. For tracking objects other than HMDs, please read through the corresponding plugin page for details.
VIVE
VIVE Pro 1/2
Valve Index
HP Reverb G2
First of all, set up and optimize the motion capture volume as explained in the corresponding setup documentation. If you plan to install any obstacles (e.g. walls) within the capture volume, make sure they are non-reflective, and place and orient the cameras so that every corner is thoroughly captured by multiple cameras.
General Setup Steps
Attach the markers on the HMD
Create a Rigid Body asset
Calibrate the Pivot Point of the Rigid Body
Configure the Rigid Body settings in Motive
For the camera system to track the HMD, a set of markers must be attached to the HMD. You can use either active markers (Active HMD clip or Active Tags) or passive markers. Passive markers are retroreflective markers that reflect infrared light emitted by the IR LEDs on the camera. Active markers, on the other hand, are LED markers that emit IR light and can be uniquely identified by the system.
When using the active markers, you can conveniently put a set of 8 markers onto the HMD by using the HMD Clip, or you can attach the markers from the Tag manually onto the HMD using adhesives and marker posts.
Active HMD Clip
The Active HMD Clip is an HMD enclosure with a total of 8 active markers embedded for tracking. At the time of writing, active HMD clips for Vive Pro / Valve Index HMDs are available on the webstore. The clip can be mounted easily by pushing it onto the HMD until the latches click, and you can detach it by gently lifting the three latches located at the top, left, and right sides of the clip.
Marker Types
Marker Placement
Make sure the markers are attached securely and do not move. If the markers happen to move even slightly after a Rigid Body is defined, it will negatively affect the tracking and the Rigid Body definition may need to be updated.
Avoid placing multiple markers in close vicinity as they may overlap in the camera view in certain orientations.
Using marker posts to extend out the markers is recommended to improve marker visibility from more angles.
If you are using the active markers, there is an extra USB port on the HMD that you could draw the power from.
When using an OptiTrack system for VR applications, it is important that the pivot point of the HMD Rigid Body is placed at the appropriate location, which is at the root of the nose in between the eyes. When using the HMD clips, you can utilize the HMD creation tools in the Builder pane to have Motive estimate this spot and place the pivot point accordingly. These tools use the known marker configuration on the clip to precisely position the pivot point and set the desired orientation.
Under the Type drop-down menu, select HMD. This will bring up the options for defining an HMD Rigid Body.
If the selected marker matches one of the Active clips, it will indicate which type of Active Clip is being used.
Under the Orientation drop-down menu, select the desired orientation of the HMD. The orientation used for streaming to Unity is +Z forward, and for Unreal Engine it is +X forward; alternatively, you can specify the expected orientation axis on the client plugin side.
Hold the HMD at the center of the tracking volume where all of the active markers are tracked well.
Click Create. An HMD Rigid Body will be created from the selected markers and it will initiate the calibration process.
During calibration, slowly rotate the HMD to collect data samples in different orientations.
Once all necessary samples are collected, the calibrated HMD Rigid Body will be created.
SteamVR Required: The VR driver streams tracking data through SteamVR. Please make sure SteamVR is installed on the computer before setting up the driver.
You may receive a warning window prior to the installation wizard. To circumvent this, select More info and then Run Anyway.
Once the driver has been successfully installed, launch the configuration utility software (C:\Program Files\OptiTrack\OpenVR Driver\ConfigUtil). Using this tool, you can load and check existing configurations and make changes to the settings as needed. To import current settings, click Load and to save out the changes, click Save.
Please make sure you are running this tool with admin privileges; if not, it might not be able to modify the settings properly. If the configuration software detects a running instance of SteamVR through OpenVR, it will be indicated as Initialized at the very top, as shown in the image. Please note that if the settings are modified while SteamVR is running, SteamVR must be restarted to apply the changes.
First, configure the connection settings so that the driver listens to the Motive server where the tracking data is streamed from. The server address must match the address where Motive is streaming the data to, and the local address must match the IP address of the computer on the network where the driver is installed.
Save the configurations by clicking on Save. This will modify the set of configurations in the steamvr.settings file in the steam installation directory and they will override the HMD tracking with the tracking data from Motive. If you already had an instance of OpenVR or SteamVR running, restart the application to apply the changes.
Configuration File
The configuration tool imports and modifies the contents of the steamvr.settings file (C:\Program Files (x86)\Steam\config\steamvr.settings). When needed, the driver-related settings can also be changed directly in this file, but it is easier to configure them using the provided configuration tool.
Launch SteamVR. If the driver is successfully set up, you should see a tracker icon added to the right of the HMD icon and the HMD will now be using the motion capture system instead of the base stations. Here, please make sure all of the lighthouse base stations are powered off.
VIVE controllers are a Beta feature and may not work for every device. Support for this particular feature is limited.
Setting up the controller (optional)
When needed, the Vive controllers can be configured as well. To do so, open the configuration utility tool while SteamVR is running. At the top of the configuration tool, it should indicate the OpenVR status as Initialized, and the controllers must be showing up in SteamVR. Then, in the controller sections, enable the controllers, specify the override device using the drop-down menu, and input the corresponding streaming ID of the controller Rigid Bodies in Motive. Once everything has been configured, save the changes and restart SteamVR. When the override is configured properly, SteamVR will show an additional tracker icon for each enabled controller.
Now that the driver is set up, the HMD tracking will be overridden by tracking data from the mocap camera system, and you can integrate HMDs into the game engine through their own VR integration.
Broadcast Frame Data must be set to true.
Local interface must be set to the desired IP address to stream the tracking data from.
Streaming of Rigid Bodies must be set to True.
For wireless streaming, use Unicast streaming type.
Once Motive is configured for streaming, launch SteamVR Home to check the connection. If everything is set up correctly, you should be able to move around, or translate, within the scene freely. You may also need to check the ground plane to make sure it's well aligned.
If you experience any unexpected rotations in the view as you move your head, it could indicate that the HMD pivot point has not been calibrated properly. Please revisit the HMD Setup section and make sure the HMD Rigid Body pivot point is positioned and oriented at the expected pivot, which is at the root of the nose with Z forward.
Once this has been set up, the motion capture system will be used to drive the HMD in SteamVR. In Unity, you should be able to set up development for SteamVR applications and use our system.
Starting with Unity 2019, official support for OpenVR in Unity has been deprecated. However, Valve provides a plugin for the new Unity XR system that can be used instead. Please follow the steps below to set up the Unity XR plugin and get the HMD working inside the Unity project:
OpenVR Unity XR plugin setup
6) [Unity] Check to make sure that the OpenVR XR plugin has been installed within your project.
8) [Unity] Enable the OpenVR Loader under the list of providers in the XR Plug-in Manager. If OpenVR Loader is not listed in there, make sure the plugin was installed properly from step 5) above.
9) [Unity] Once the plugin is configured, go to GameObject → XR → Add XR Rig.
10) Play the scene, and make sure the HMD is playing and tracking as well.
In Unity 2018 and earlier, you can enable SteamVR by configuring the project settings. Go to Edit → Project Settings → Player, open the XR Settings panel, and enable the Virtual Reality Supported property. You can also follow the instructions in the Unity documentation.
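If you prefer to switch devices at runtime rather than through the Player settings, Unity's legacy XR API can load OpenVR explicitly. This is a minimal sketch using UnityEngine.XR, valid for the legacy pipeline in Unity 2018 and earlier:

```csharp
// Sketch for Unity 2018 and earlier: load and enable the OpenVR device at
// runtime, equivalent to ticking "Virtual Reality Supported" with OpenVR in
// the SDK list. Uses Unity's legacy XR API.
using System.Collections;
using UnityEngine;
using UnityEngine.XR;

public class EnableOpenVR : MonoBehaviour
{
    IEnumerator Start()
    {
        XRSettings.LoadDeviceByName("OpenVR"); // takes effect on the next frame
        yield return null;
        XRSettings.enabled = true;
        Debug.Log($"Active XR device: {XRSettings.loadedDeviceName}");
    }
}
```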
Client Origin
When using the OpenVR driver for the HMD and the Unity plugin for other tracking data, the client origin object must be located at the origin without any rotations; that is, position (0,0,0) and rotation (0,0,0).
This page provides instructions on how to set up the OptiTrack Streaming Client Unreal Engine plugin. This plugin is intended for Virtual Reality customers, but can be used with many other applications.
The next step is to configure the client. Follow the instructions below to install and configure the OptiTrack Unreal Engine plugin to receive the streamed tracking data.
OptiTrack - Streaming Client Plugin (required)
Download the plugin ZIP file.
Extract the contents from the ZIP file.
Open the extracted OptiTrack folder and transfer the entire "OptiTrack" folder into Unreal Engine's plugin directory, located at C:\Program Files\Epic Games\5.#\Engine\Plugins (there will be other plugins in that folder already).
Open/Create a new Unreal Engine project.
Under the Edit menu, click Plugins to open up the panel where all of the available plugins are listed.
Browse to OptiTrack section and enable the "OptiTrack - Streaming Client".
Click Apply to submit the changes. This will require the Unreal Engine project to be restarted.
Once the OptiTrack - Streaming Client plugin is enabled, the OptiTrack Client Origin actor will be available in Unreal Engine.
OptiTrack Client Origin
The OptiTrack Client Origin class enables Unreal Engine (the client) to receive the Rigid Body, Skeleton, and HMD tracking data streamed from Motive.
To add the client origin, simply drag-and-drop the OptiTrack Client Origin from the Place Actors panel into the level. Once the client origin is placed within the level, its position and orientation will represent the global origin of Motive within Unreal Engine. In other words, the tracking data will be represented relative to where this Client Origin object is positioned and oriented.
Global Origin: Both position and orientation of the OptiTrackClientOrigin will represent the global origin of the tracking volume within Motive.
[Unreal] Once the plugin is added and enabled in the project, the OptiTrack Client Origin class will be available from the Place Actors panel.
[Unreal] Drag and drop the OptiTrack Client Origin into the scene.
[Unreal] Place the OptiTrack Client Origin at the desired location within the scene.
[Unreal] Select the instantiated OptiTrackClientOrigin object from the World Outliner panel.
[Unreal] In the Details panel, make sure its Auto Connect setting is checked. This configures the client origin to automatically search the network and connect to Motive.
Once the client origin is set up, it will attempt to connect to Motive and start receiving tracking data whenever the scene is played.
Connecting to a designated IP address
Advanced settings: Auto-initialize
By default, the auto-initialize feature is enabled and the client origin will get auto-initialized whenever the scene is played. But when needed, you can disable this and set up the project so the client origin gets initialized when a user-defined event is triggered.
[Unreal] From the Place Actors panel, search for OptiTrack Rigid Body Actor, then drag-and-drop the actor into the scene.
[Unreal] With this Rigid Body actor selected, attach the target actor that you wish to animate using the Details panel. Make sure the target actor's Mobility is set to Movable.
[Unreal] Set the relative locations and rotations to all zeroes on this target actor. This actor should be listed as a child of the Rigid Body actor.
[Motive] In Motive, assign a value to Streaming ID property for the target Rigid Body.
[Unreal] In the properties of the OptiTrack Rigid Body Actor component, match the Tracking ID with the Streaming ID of the Rigid Body asset in Motive.
Make sure both Motive and the OptiTrack Client Origin are set up for streaming, hit Play, and the attached actor object will be animated according to the live-streamed Rigid Body tracking data.
When this is checked, the corresponding Rigid Body actor will be hidden from the level until the associated Rigid Body data is streamed out from Motive and received by the plugin.
The low latency update feature allows the Rigid Body position and orientation to be updated immediately before rendering, minimizing latency. This is enabled by default. For debugging, you can check this setting to disable the behavior.
This sets a specific client origin to use for receiving tracking data. When this is unset, the plugin will default to the first client origin that it finds in the scene.
When this is set to true, the Rigid Body transform data from Motive will be applied with respect to the parent actor's pivot coordinates. By default, this is set to false, and all of the tracking data is applied with respect to the pivot axis of the client origin.
When needed, you can also draw labeled marker data from Motive into the scene in UE. In most applications, you do not need to draw the markers, since the Rigid Body and Skeleton data will be used instead; however, having the markers rendered in the scene may be helpful for debugging purposes. To enable drawing of the markers:
[UE4] Expand the OptiTrackClientOrigin (Instance) properties, and enable the Draw Markers checkbox.
Skeleton streaming is supported only in plugin versions 1.9 or above.
Follow the steps below to set up Skeleton streaming in Unreal Engine.
1. Create an Animation Blueprint
Step 2. Right-click the blank space in the Content Browser pane, then select Animation → Animation Blueprint.
Step 3. On the pop-up window, select the OptiTrackAnimInstance at the parent class section at the top and click on the target Skeleton name at the bottom. Then click OK.
Step 4. In the content browser, assign a name to the created animation blueprint.
Step 5. Drag the character blueprint into the scene.
Step 6. Select the character blueprint in the 3D View
In the Details Pane, select “+ ADD” and create a new “OptiTrack Skeleton Instance” on the model.
Set the “Source Skeleton Asset” equal to the Skeleton name in Motive.
2. Setup the Blueprint
**Step 1.** Double-click the animation blueprint in the content browser to open its editor.
**Step 2.** Right-click the animation graph, then create a new "OptiTrack Skeleton".
**Step 3.** Right-click the animation graph, then create a new "Get Streaming Client Origin" and connect its output to the Streaming Client Origin.
**Step 4.** Right-click the animation graph, then create a new "Get Source Skeleton Asset Name" and connect its output to the Source Skeleton Asset Name.
**Step 5.** Right-click the animation graph, then create a new "Component To Local" and connect the output from "OptiTrack Skeleton" into its input.
**Step 6.** Connect all of the nodes together. The basic animation flow chart should look like the following.
Bone Transformation
Roll Bone Interpolation
For characters with unmapped shoulder roll bones, the Skeleton plugin will detect their existence and apply a slight twist to the roll bones to keep the arm swing smooth. In the OptiTrack Skeleton blueprint, you can enable or disable this feature with the Roll Bone Interpolation checkbox, and you can adjust how much twist is applied by setting the Roll Bone Blending parameter. When this parameter is set to 0, the plugin will not adjust the roll bone motion; when it is set to 1, the plugin will attempt to adjust the motion to keep the shoulder steady on the character.
Please note that this feature may not work on some characters.
3. Assign Bone Mapping
Step 1. Select the OptiTrack Skeleton plugin in the blueprint graph area.
Step 2. Expand the Bone Mappings property in the Details Pane.
Step 3. Click “Auto Fill Bone Mapping” to automatically assign the bones in the Skeleton to the OptiTrack Skeleton names.
Note: There is no standard for bone naming conventions, so bone names may vary depending on the character. After running the auto-fill, review the list and double-check that the auto-assigned names are correct. You may need to use the drop-down menus to manually adjust the mapping for missing or incorrect items.
Step 4. Hit "Compile" in the top left to build the blueprint.
4. Setup OptiTrack Streaming
Step 1. Open the 3D View
Step 2. Search OptiTrack Client Origin in the Modes pane.
Step 3. Drag the OptiTrack Client Origin into the 3D scene, then select it to access its properties.
(Optional) Place it at the (0, 0, 0) location.
Make sure that streaming settings on both Motive and Unreal match.
5. Click _Play_
The OptiTrack Unreal Engine Skeleton Plugin uses bone mapping, not retargeting. This means that the bone segments in Motive map directly to the character model (bone mapping), instead of being translated into something that is usable by a more abstract biped model (retargeting). Because of this, non-anatomical Skeletons will not map correctly without some additional tweaking.
Practically, this means that you will need to do things like turn off the toe mapping for characters with high heels, adjust the pelvis bone in Motive or in the model for characters with non-anatomical hip bones, and avoid bipeds that are too anatomically different from humans, such as a gorilla or swamp monster.
For example, the character sample below has both a non-anatomical pelvis and high heels. It is preferable to use character models that are more anatomically correct, but in this case, you can do a couple things to mitigate these issues:
1. Turn-off toe streaming
In the example below, since this character is wearing heels, any actor providing data for this model would also need to be wearing heels. To get around this, you can simply turn off the toe mapping in the OptiTrack Unreal Engine Skeleton Plugin.
2. Adjust the bone segments in Motive
The hip segment on the Countess actor is centered in the stomach rather than in the pelvis, the neck bone in Motive is a bit too long for the model, and the shoulders in Motive do not match the width of the character’s shoulders. By adjusting the bones' positions and lengths in Motive, you can make the streamed Skeleton better match the model; however, please note that there are limits to how far you can take this.
When streaming Skeleton data to animate characters that have different bone lengths compared to the mocap actor, the UE character will need to be scaled accordingly. In this case, the "Scale Bones" feature in the OptiTrack Skeleton node automatically scales the character bones to match the mocap actor. This setting is enabled by default.
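To make the idea concrete, here is a toy illustration of that per-bone length comparison (plain C++; the bone lengths and the single-bone scope are hypothetical, and the plugin performs this internally):

```cpp
#include <cstdio>

int main()
{
    // Hypothetical bone lengths, in centimeters.
    const double actorForearm     = 26.0; // from the streamed Motive Skeleton
    const double characterForearm = 31.2; // from the character model

    // Factor that brings the character's forearm to the mocap actor's length.
    std::printf("forearm scale factor: %.3f\n", actorForearm / characterForearm);
    return 0;
}
```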
Starting from plugin version 1.23, you can tweak the alignment of the bone mapping by adding sockets to the Skeleton blueprint:
Adding Sockets to the Bone Mapping
Under the Skeleton tree, right-click on the bone that you wish to add the sockets to.
Right-click and select _Add Socket_.
Go to the Animation blueprint and change the bone mapping of the bone you created the socket for, mapping it to the socket that was just created.
Play the scene, and adjust the socket location from the Skeleton Editor to fine-tune the alignment of the bone.
In general, for most VR applications, using active markers is recommended for better tracking stability and ease of use. Active markers also have advantages over passive markers when tracking a large number of objects. For applications that are sensitive to the accuracy of the tracking data, using passive markers may have more benefits. To get more help with finding the best solution for your tracking application, please reach out to our support team.
Once the clip has been mounted, the next step is to import the provided Rigid Body definition into Motive and refine it to get the calibrated pivot point position and orientation, as explained in the next section.
You can use either passive retroreflective markers or active LED markers to track the HMD. Passive markers reflect infrared light emitted from the IR LEDs on the camera. Active markers, on the other hand, are LED markers that emit IR light which gets uniquely identified in Motive. Either type of marker can be used to track HMDs. Using active markers is recommended, especially for applications that involve tracking multiple HMDs in the scene.
Please read through the Rigid Body Tracking page for additional information on marker placement on a Rigid Body.
This feature can be used only with HMDs that have the clips mounted.
HMDs with passive markers can utilize the HMD Calibration tool to calibrate the pivot point.
First of all, make sure Motive is configured for tracking active markers.
Open the Builder pane and click Rigid Bodies.
Select the 8 active markers in the 3D viewport.
This is supported only for Motive versions 2.1.2 or above. If you are using any other version of Motive 2.1, please update to 2.1.2, or use a template to create the Rigid Body definition; instructions are provided on the following page.
Download the OpenVR driver from the downloads page. Once downloaded, launch the installer and follow the prompts to set up the driver. On the last window, make sure to select Launch Configuration Utility before clicking Finish. This will open the configuration options to set up your HMD with Motive.
In the HMD section, enable the HMD and input the Rigid Body ID of the HMD. The Rigid Body ID must match the Streaming ID property of the HMD Rigid Body definition in Motive.
First of all, make sure the streaming settings are configured in Motive for streaming out the data. For more information regarding streaming in Motive, please visit our Data Streaming page:
Unity-OpenVR documentation:
Unreal Engine-SteamVR:
This driver is designed for streaming HMD and controller tracking data only. To stream tracking data for other Rigid Body objects, you will need to use the corresponding plugin (Unity or Unreal Engine). In other words, the HMD tracking data will be streamed through SteamVR using the driver you've installed, and all other tracking data will be streamed through the plugin's client origin.
Create an "Optitrack Client Origin" to the scene and set the relevant connection info. Refer to the page for more information on setting up the client origin.
In Motive, configure the data streaming settings. Use the Data Streaming pane to configure the streamed packets. Make sure Rigid Body data is streamed out in order to use VCS.
The HTC VIVE, VIVE Pro, VIVE Pro 2, Valve Index, and HP Reverb HMDs can be integrated through the OptiTrack OpenVR Driver.
From Motive, the tracking data can be streamed in real-time either from a live capture (Live Mode) or recorded data (Edit Mode). The streaming settings are configured by modifying the settings in the Data Streaming pane. NatNet streaming must be enabled and the correct IP address must be set.
Open the Data Streaming pane in Motive and configure the settings below:
[Motive] In the Data Streaming pane, configure the desired connection settings.
[Unity] For the Rigid Body ID entry, input the Streaming ID of the corresponding Rigid Body asset in Motive. The Streaming ID can be found, and changed, under the Rigid Body properties.
By integrating with Unity's animation system, Mecanim, the Unity3D plugin allows Motive to stream full-body Skeleton data. The Skeleton tracking data from Motive is streamed out as hierarchical bone segment orientations, and this data is fed into Unity's Mecanim system, which allows animating characters with different proportions.
OptiTrack motion capture systems can be used to track head mounted displays (HMD) and integrate the tracking data into Unity for unique VR applications. For instructions on integrating HMD tracking data into Unreal Engine, please refer to the corresponding page.
First of all, set up and optimize the motion capture volume as explained in the Getting Started or Hardware Setup documentation. If you plan to install any obstacles (e.g. walls) within the capture volume, make sure they are non-reflective, and place and orient the cameras so that every corner is thoroughly captured by multiple cameras.
First, make sure the streaming settings are configured in Motive for streaming out the data. For more information on streaming in Motive, please visit our Data Streaming page.
1) Download the OpenVR Unity XR package from the following link.
2) Download the OptiTrack OpenVR driver from our website and configure the settings as described in the section above.
3) Open a Unity project.
4) [Unity] Open the package manager from Window → Package Manager.
5) [Unity] In the package manager, click on the "+" icon at the top and choose Add package from tarball. Then select the downloaded OpenVR Unity XR package.
7) [Unity] Now, follow the instructions in Unity's documentation to configure your project for XR development. Install the XR Plug-in Manager, which can be found under Edit → Project Settings → XR Plug-in Management.
Please keep in mind that these steps are subject to change by Unity. You can find detailed instructions on the following page:
When using the OpenVR driver along with the streaming client plugin for streaming tracking data other than the HMD, such as Rigid Bodies and/or Skeletons, it is important that the OptiTrack Client Origin object is located at the global origin without any rotations. In other words, the position must be set to (0,0,0) and the rotation must be set to (0,0,0) on the client origin. This is important because the driver does not use the client origin but instead streams the HMD tracking data directly onto SteamVR through a separate channel.
[Motive] Make sure that NatNet streaming is enabled in the Data Streaming pane in Motive.
If you wish to connect to a server on a specific network address, you can uncheck the Auto Connect setting and manually enter the Server IP Address chosen in the Data Streaming pane in Motive, the Client IP Address, and the Connection Type associated with Motive. You may need to run the ipconfig command in the command prompt to obtain an appropriate IP address for the client.
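For reference, these are the same connection parameters a raw NatNet client supplies when talking to Motive directly. Below is a minimal sketch using the NatNet SDK's C++ client API; the IP addresses are placeholders for your own network, and exact names may vary slightly between SDK versions:

```cpp
// Minimal NatNet client sketch (assumes the NatNet SDK is installed).
#include <cstdio>
#include <thread>
#include <chrono>
#include <NatNetTypes.h>
#include <NatNetClient.h>

// Called for every frame of mocap data that Motive streams out.
void NATNET_CALLCONV DataHandler(sFrameOfMocapData* data, void* /*userData*/)
{
    for (int i = 0; i < data->nRigidBodies; ++i)
    {
        const sRigidBodyData& rb = data->RigidBodies[i];
        std::printf("RB %d pos=(%.3f, %.3f, %.3f)\n", rb.ID, rb.x, rb.y, rb.z);
    }
}

int main()
{
    NatNetClient client;
    client.SetFrameReceivedCallback(DataHandler, &client);

    sNatNetClientConnectParams params;
    params.connectionType = ConnectionType_Unicast; // Motive's Transmission Type
    params.serverAddress  = "192.168.0.10";         // placeholder: Motive's Local Interface
    params.localAddress   = "192.168.0.20";         // placeholder: this client's address

    if (client.Connect(params) != ErrorCode_OK)
    {
        std::printf("Failed to connect to Motive.\n");
        return 1;
    }

    std::this_thread::sleep_for(std::chrono::seconds(10)); // receive frames
    client.Disconnect();
    return 0;
}
```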
Actor objects in Unreal Engine can be animated using Rigid Body tracking data from Motive. Once the OptiTrack - Streaming Client plugin is enabled in the project, the OptiTrack Rigid Body component will be available to use. By attaching this component to an actor, you can animate its child actors according to the movement of a Rigid Body in Motive. Each Rigid Body component is given a Tracking ID value which associates it with the Streaming ID of a Rigid Body in Motive. Once associated, the data from the corresponding Rigid Body will be used to update the transform of the target actor in Unreal Engine.
ID of the Rigid Body used to derive the position and orientation transform of the attached actor. This ID must match the Streaming ID of the respective Rigid Body in Motive.
[Motive] The Rigid Bodies setting in the Data Streaming pane must be enabled.
Step 1. Navigate to a character folder. With sample characters, it is located in Characters → Heroes → [Character Name] → Meshes.
Within the animation blueprint, you can utilize other blueprint utility tools from UE4 to modify the streamed data. For example, Transform (Modify) Bone nodes can be included after the OptiTrack Skeleton node to apply a transform to specific Skeleton bones as needed. Please refer to the Unreal Engine documentation for more information on using animation blueprints.
See the client origin page for more instructions on setting up the client origin.
Open the animation blueprint of the character you wish to modify.
Assets/OptiTrack
All of the Unity plugin contents are included in this folder.
Assets/OptiTrack/Scripts
This is the folder that you will mainly use. It contains plugin C# script components that can be imported into Unity objects for receiving streamed data.
Assets/OptiTrack/Plugins
This folder contains the plugin libraries and header files.
Assets/OptiTrack/Prefabs
This is the easiest place to get started. This folder contains premade objects for setting up a streaming client, tracking a Rigid Body, and retargeting a skeleton.
Assets/OptiTrack/Scenes
This folder contains a sample Unity scene that includes pre-configured client, Rigid Body, and skeleton objects.
This page briefly goes over the active finger marker set and how it needs to be set up.
The active finger Marker Set utilizes the tracking capability of active markers and their active labeling features to track both the hands and the fingers. These Marker Sets require the active marker tracking solution and the Tag(s). Wired from an active Tag, each active marker must be attached at its expected location. 10 active markers are needed for each hand.
Manus VR Gloves
Alternatively, you can also use Manus VR Gloves for tracking the fingers. For more information, refer to the Manus Glove Setup page.
The Active Base Station
Active Tag(s) with active markers.
A way to attach active markers onto the hands (e.g. gloves).
Baseline + Active Fingers (57)
Core + Active Fingers (62)
Right Hand + Active Fingers (10)
Left Hand + Active Fingers (10)
A total of 11 markers will be attached to each hand.
Tags: Attach Tags to each hand for the active LEDs.
Active Markers (10): Each Tag connects up to 8 active markers. Position the wired active markers at the tips of all five fingers (5), one each on the knuckles of the index finger and the pinky finger (2), and lastly, place the remaining active marker on the inside (medial side) of the wrist axis.
The steps for creating an active finger Marker Set are the same as for the other Skeleton Marker Sets:
Open the Builder pane and select the desired hand Marker Set under the drop-down menu.
Make sure all of the markers are placed in the correct positions. For the Core + Active Fingers (62) Marker Set, please make sure the passive full body tracking markers are also placed on the person's body.
Once the markers have been placed, ask the subject to strike the calibration pose.
Select the finger markers in the 3D viewport.
The Marker Detected count must match the Marker Required count in the Builder pane.
Click Create.
OpenVR Driver
This synchronization setup is not required for integrating an HTC Vive with an OptiTrack system; for that, please use the OptiTrack OpenVR Driver. The driver completely overrides the tracking of an HTC Vive HMD so that the HMD can be tracked using just the OptiTrack system, without the lighthouse base stations, and when using the OpenVR driver, synchronization between the two systems is not necessary. This article is for specific applications where both the lighthouse base stations and the OptiTrack system must run simultaneously.
Notes on the Sync Settings
The sync settings listed on this page have not been tested with the latest version of the firmware. This means that the appropriate sync offset value indicated on this page might not be correct. For integrating into SteamVR, please consider using OptiTrack OpenVR Driver to completely override the tracking.
This article provides instructions on how to synchronize an OptiTrack motion capture system with an HTC Vive virtual reality system, specifically the lighthouse base stations, to avoid overlap of the infrared tracking lights. The HTC Vive system also uses infrared LEDs and lasers for tracking its head-mounted display (HMD) and controllers. When using an OptiTrack mocap system in conjunction with the HTC Vive system, the infrared tracking from the two systems can interfere with each other. For this reason, the two systems must be synchronized so that the two different tracking lights do not temporally overlap. Currently, only sync configurations with OptiTrack Prime series camera systems have been tested.
Let’s go through the synchronization setup. The following instructions assume that the two tracking base stations of the HTC Vive system are set to channels b and c and are optically synchronized. The channel b base station will serve as the master device for synchronizing the two systems. The sync out signal from the channel b station will feed into one of the input ports on the eSync 2, and a sync offset (specified in microseconds, μs) will be applied so that the IR lights from the two systems do not interfere with each other. The following section describes the instructions in detail.
Note:
The eSync 2 is required in order to synchronize HTC Vive lighthouses with the camera system.
Synchronization with Flex camera systems is not supported.
The base stations will synchronize optically in this setup. Refer to the respective documentation for more details on setting up the HTC Vive system. Set the tracking stations to channels b and c, so that they are optically synchronized (in the absence of a sync cable connection between them).
Refer to the Hardware Setup wiki pages for more details.
The sync output of the base stations uses 3.5 mm stereo (TRS) cables, whereas the input ports of the eSync 2 are BNC ports. You will need a stereo female to RCA male adapter (e.g. http://www.monoprice.com/product?p_id=5612) as well as an RCA-to-BNC adapter (included with the eSync 2) to connect the channel b base station to the eSync 2 hub. After attaching the stereo to RCA adapter, connect the red RCA cable into the eSync 2 using the BNC adapter.
Open the Synchronization pane in Motive, and set the synchronization type to Custom Synchronization.
Set the Sync Input to Input 1, the port on the eSync 2 where the sync cable is connected. If the sync cable is properly connected and the HTC Vive system is working properly, the signal monitor at the bottom will display a frequency of approximately 60 Hz detected through the Input 1 port of the eSync 2. Note that this configuration synchronizes the OptiTrack camera system to the sync signal coming through the input port.
Now that the OptiTrack system’s shutter timing is synchronized with the base stations of the HTC Vive system, you will need to introduce a sync offset to avoid overlap of the tracking lights. The following offset sync parameters have been tested to avoid the interference; input these parameters into the Synchronization pane. If you wish to increase the final frame rate of the mocap system, apply a multiplier.
Final Frame Rate: 120 Hz
Sync Multiplier: 2
Sync Offset: 1780 μs
Final Frame Rate: 240 Hz
Sync Multiplier: 4
Sync Offset: 3150 μs
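As a sanity check on these presets, the final frame rate is simply the ~60 Hz base-station sync rate detected on Input 1 multiplied by the Sync Multiplier. A small illustrative snippet (C++; the 60 Hz base rate and the tested offsets are taken from this page):

```cpp
#include <cstdio>

int main()
{
    const double baseHz = 60.0; // lighthouse sync rate detected on Input 1

    // Tested parameter pairs from this page: {multiplier, sync offset in us}.
    const struct { int multiplier; double offsetUs; } presets[] = {
        { 2, 1780.0 },
        { 4, 3150.0 },
    };

    for (const auto& p : presets)
        std::printf("multiplier %d -> final frame rate %.0f Hz (offset %.0f us)\n",
                    p.multiplier, baseHz * p.multiplier, p.offsetUs);
    return 0;
}
```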
Press Apply to employ the sync configuration. The tracking IR lights from both systems will no longer interfere, and the HTC Vive components will be working properly and available in the SteamVR application.
Another important note is that high camera exposure settings may cause IR light from the base stations to be detected by the OptiTrack system. It is suggested to keep the camera exposure below 1000 μs for all of the cameras.
Now that the two systems are synchronized to avoid the IR interference, both systems can be used together to provide immersive VR experiences. Note that the instructions listed on this page are tested to work with HTC Vive system, but alternative approaches may also be possible.
Currently, IMUs are unsupported in Motive 3.x; this page is purely for reference purposes.
This page provides instructions on configuring the active components, Active Tags and/or Active Pucks, that are equipped with inertial measurement unit (IMU) sensors. By fusing the optical tracking data of the active markers with the IMU data, Motive can further stabilize the rotational tracking of Rigid Bodies and compensate for tiny jitters throughout the tracking. This is recommended for applications where the stability of rotation tracking data is important, including tracking cameras in virtual production applications, tracking drones in low camera-count setups, and more.
Motive version 2.2 to 2.3.1
IMU Active Batch Programmer
Base Station: Firmware 2.2.2 or above
IMU Active Tags/Pucks: Firmware 2.2.1 or above
When configured, each Active Base Station can communicate with up to 7 IMU Tags/Pucks.
When needed, multiple Base Stations can be plugged into the same camera network, as long as each communicates through a separate channel.
To use the IMU components, all of the devices, including the Base Station, must have firmware version 2.x or above installed. Please follow the instructions below to check the firmware version on each device.
For the IMU data to be used for tracking in Motive, a unique Uplink ID must be assigned to each IMU Tag or Puck using the active batch programmer. The Uplink IDs will then need to be entered under the properties of the corresponding Rigid Bodies in Motive. Please follow the steps below:
Use the following link to download the batch programmer used for 2.x firmware. Once downloaded, unzip the downloaded file and launch the EXE file.
Batch Programmer Download:
Before setting up the components, we will want to check the existing configurations on the Base Station, Tags, and/or Pucks. To check this, enable the "Read-Only Mode" check box at the bottom-left corner of the active batch programmer, and then connect the devices one at a time via USB. Each time a new active component is detected, its existing configuration, including the firmware version, gets listed under the Log section. If any of the settings do not meet the requirements below, you can re-configure them in the next step.
Important
When checking the configuration, make sure the Read-Only Mode is enabled. If it is not, the active batch programmer will upload its settings onto any newly detected Tags, which could overwrite the existing configuration already on the device.
Firmware Version
In order to use IMU Tags/Pucks, all of the active components must have firmware version 2.2.x or above installed. Please check the firmware version on the active components. If the installed firmware is an older version, please contact us for assistance with upgrading the firmware.
RF Channel
Please make sure that all of the active components communicate through the same channel. Check this and note the channel value, as it will also need to be entered into Motive for the IMU data. The RF channel can be set anywhere between 11 and 26.
Label
Labels are active IDs assigned to each marker; it is important that no overlapping labels, or IDs, are assigned within the same batch of active Tags/Pucks. This setting is applicable to Active Tags/Pucks only.
After checking the settings, you can unplug the device and connect the next one to check the settings.
Now that we understand the existing configurations and what needs to be changed, we can configure the active components. To do this, disable the Read-Only Mode in the batch programmer; the settings configured in the IMU batch programmer will then get applied to each newly connected device. Please note that this could also reset the existing configuration, so make sure to configure only the settings that need to be changed.
Configure Uplink ID
A unique Uplink ID needs to be assigned to each IMU Tag or Puck. To do this, simply check the box next to Uplink ID whenever configuring IMU Tags or Pucks.
RF Channel/Label
Uncheck the boxes next to "Set marker labels" and "Set radio options" unless they also need to be reconfigured.
All of the Tags/Pucks that ship in the same order are pre-configured, and we also make sure none of the Labels overlap. Thus, users should not generally need to use this batch programmer. The cases where you would want to reconfigure the active components are: 1) when you have purchased new Tags/Pucks to add to the system from a previous order, or 2) when the RF communication channel needs to be changed to avoid interference.
In cases where marker labels or RF channels need to be reconfigured, enable the box next to the corresponding setting in the batch programmer. DO NOT check the box unless the settings need to be changed. When reconfiguring marker labels, ALL of the Tags and Pucks will need to be reconfigured at once, so that the labels do not overlap with other components in the system.
Once the settings have been configured in the batch programmer, connect the Active Tags or Pucks one at a time; the programmer will apply the configured settings to each newly connected device and ensure that the IDs do not overlap within the same batch. Once the settings are applied, the details get listed under the Log.
It’s recommended to note the assigned Uplink IDs for each Tag/Puck so that we can track down the settings easily when needed.
Once a device is configured, disconnect and connect the next device to apply the same settings and unique IDs. Repeat this for all of the active components that need to be configured.
When using an IMU Active Tag to create a custom Rigid Body, please make sure both the PCB board of the tag and the LEDs are securely attached and rigidly constrained to the object. The IMU sensor fusion will not work correctly if either the markers or the tag are not secured to the rigid object.
Now that each of the IMU Tags/Pucks has been assigned an Uplink ID, the next step is to input this ID into the properties of the corresponding Rigid Body in Motive, along with the RF channel configured for the active components. Follow the steps below to assign the Active Tag ID and Active RF Channel properties for each Rigid Body:
1) Launch Motive
Make sure the configured Active Base Station is connected to the camera system and launch Motive.
2) Power the IMU Tag/Puck
Click the power button on the Tag/Puck to start the device. When the Tag/Puck first initializes, the status LED will blink red and orange rapidly, indicating that the IMU sensor is attempting to initialize. At this stage, keep the IMU Tag/Puck stationary until the status LED blinks green, indicating that initialization is complete.
Note: Do not hold down the power button when powering up the Tag/Puck, as this will put the device into boot mode. When in boot mode, it will not track in Motive and two of the LEDs will light up orange. If this happens, hold down the power button again to turn off the Tag/Puck, then restart it with a single click of the button.
3) Create a Rigid Body
Select the active markers associated with a Tag/Puck and create a Rigid Body.
4) Access advanced Rigid Body properties
Open the Properties pane and select the Rigid Body. Reveal the advanced properties by clicking on "..." menu at the top-right and selecting Show Advanced.
5) Input IMU Properties: Active Tag ID and RF Channel
Scroll down to the IMU section and input the Uplink ID value for the corresponding Tag/Puck into the Active Tag ID property. Additionally, input the RF Channel used by the corresponding Base Station into the Active RF Channel property. Once this is completed, the Rigid Body will have the correct properties set to use the IMU in the Tag. You can also adjust the Sensor Fusion property to balance the contribution between optical tracking data and IMU data in the Rigid Body solve.
6) Check the status
At this point, the IMU Tags and Pucks are ready to be used. The color of the Rigid Body indicates the status of the IMU data stream.
When the color of Rigid Body is the same as the assigned Rigid Body color, it indicates Motive is connected to the IMU and receiving data.
If the color is orange, it indicates that the IMU is attempting to calibrate. Slowly rotate the object until the IMU finishes calibrating.
If the color is red, it indicates that the Rigid Body is configured to receive IMU data, but no data is coming through the designated RF channel. Make sure the Active Tag ID and RF Channel values match the configuration on the active Tag/Puck.
Confirm IMU Tracking
It may be helpful to double-check that the IMU is working properly. To do this, access the properties of the IMU Rigid Body, set the Min Marker Count Rigid Body property to 2, and set the Tracking Algorithm to Marker Based. Then, cover up all of the active markers on the Tag/Puck except for two, and make sure that Motive is still able to track the Rigid Body rotation.
This page covers manual positioning of HMD Rigid Bodies in Motive. This is an old workflow that takes a bit of time and effort to set up. With the HMD Calibration tool, you can create and auto-calibrate HMD Rigid Bodies much more easily and quickly.
When manually positioning the Rigid Body pivot point, you will need to have landmark markers at specific locations.
When attaching retroreflective markers, make sure markers are securely attached and readily captured by the cameras. For attaching the markers, we recommend using our 20 mm wide and 30 mm tall M4 threaded plastic marker bases with Acrylic adhesives, available at the webstore, to attach the markers onto the HMD.
A markered HMD will be defined as a Rigid Body in Motive. When placing markers, make sure the arrangement on the HMD is asymmetrical. Also, the marker arrangements between multiple HMDs must be incongruent. For more details, read about marker placement on the Rigid Body Tracking page. Also, for tracking the HMD, two landmark markers must be placed in the following locations:
Eye-level Side Markers (2)
Place two markers on the left and right sides of the HMD. These markers will serve two additional purposes. First, they indicate the yaw of the HMD and will be used to align the Rigid Body orientation with the orientation of the actual HMD component; thus, a line interconnecting the two markers must be parallel to the frontal plane, or the display, of the HMD. Second, these markers will be used to locate the elevation of the eyes when creating the Rigid Body in Motive. In summary, the two landmark markers must be carefully placed considering the following:
The markers should align along eye-level of the user when the HMD is mounted.
Most importantly, place these markers in exactly the same locations on the left and right sides so that they form a precisely symmetrical arrangement.
Attachment bases of the same dimensions must be used for both markers.
For the best virtual experience, the pivot point of the HMD Rigid Body in Motive needs to be positioned at the midpoint between the user's eyes when the HMD is worn. To locate this, use the side and top-center landmark markers as references. For more information on adjusting Rigid Body pivot points, please read through the Rigid Body Tracking page.
Gizmo Tool: Translate, Rotate, and Scale
For Motive versions 2.1 and above, setting the pivot point location is much easier using the Gizmo tools. Instructions on adjusting the pivot point location using the Gizmo tool are detailed on the following page: Gizmo Tool: Translate, Rotate, and Scale. Using this tool, you can select markers in the 3D viewport and easily place the Rigid Body pivot point onto a specific landmark marker, or onto a midpoint between the selected markers.
1. Set the pivot point over the landmark marker. Use the Set Pivot Point to Selected Marker feature to assign the pivot point to the marker. This will set the elevation of the pivot point along the eye-level.
3. Translate the pivot point along the z-axis using the translation tool. For the most accurate position, you may need to physically measure the sagittal (z-axis) distance from the landmark marker to the root of the nose and apply the measured offset.
Now that you have translated the pivot point, make detailed adjustments to the orientation using the rotation tool. For best results, align the two front markers along the x-axis grid and roughly center the Rigid Body along the z-axis grid. Then, check that each of the Rigid Body orientation axes is parallel to the grid lines in Motive. If there is any deviation, apply a rotation to correct the offset. If needed, the transparency of the axes and the grids can be adjusted from the application settings.
In Unreal Engine: the X-axis of the HMD Rigid Body must be directed forward.
In Unity: the Z-axis of the HMD Rigid Body must be directed forward.
Tip: Once you have the Rigid Body asset for the HMD configured, you can export the asset to a TRA file for future use. Importing the TRA file (e.g. CV1.tra) will load the Rigid Body (HMD) asset and make it available for use; however, the marker placement must remain unchanged in order to re-load previously created Rigid Bodies.
The Navigation Controller integration allows users to track the Bluetooth navigation controller and receive its input signals in real time. Once the controllers are properly set up, the tracking data and key input data of each controller will be displayed in real time in Motive. The data can be streamed to client applications developed using VRPN.
Requirements
Windows 10 with administrator privileges
Bluetooth dongle (Kinivo BTD-400 Bluetooth USB adapter)
mini-USB cable
Navigation Controller
Please read through the NavigationControllerReadme.txt file before starting with the setup process.
Connect the Bluetooth dongle and all of the Navigation Controllers to the PC using the provided mini-USB cables.
To use the controller, the driver for the navigation controller must be installed. Once the driver is successfully installed, both the Bluetooth dongle and the controllers should be listed in the Windows Device Manager under "Universal Serial Bus Devices".
Run OptiTrackNavController\BluetoothDriver\DPInst.exe as administrator.
Operation of the navigation controllers can be confirmed through two applications: ScpServer.exe and ScpMonitor.exe. When the driver is installed properly, each connected controller will be recognized in these applications:
Run SCPTools\ScpServer.exe. Each Controller should appear as "Pad X"
Run SCPTools\ScpMonitor.exe. Charging status of each controller will be shown.
Once the device connection is confirmed in step 3, you can use ScpMonitor to confirm the key inputs. Make sure the keys work properly in this application in order to use the controller within Motive.
From Windows Task Tray -> SCPMonitor -> Right Click -> Profile Manager.
Check the buttons on the controller and make sure the input signals are properly received in the Profile Manager.
Now that both connection and operation are confirmed, we can start navigating with the controller within Motive. Before doing that, copy OptiTrackNavController\MotivePlugin\WandPeripheral.dll to the <Motive Install Folder>\Devices folder so that this library can be used in Motive.
Before starting Motive, make sure to start up the ScpServer.exe application under the SCPTools folder.
Start Motive. When the program shows up, there will be 4 Controller devices listed under the Devices pane.
Power the controllers. Active controllers will be shown in the Devices pane with a check-mark next to each.
The next step is to import the predefined TRA files. Each controller's physical marker configuration corresponds to one of the predefined TRA Rigid Body definition files (e.g. ControllerA.tra). If the loaded TRA file matches the marker configuration, the controller will be tracked within Motive. A quick way to find out is to load all four Rigid Body definition files, place the controllers within the capture volume, and then remove the definitions that are not tracked. Please check the marker positions on the controller against the PDF configuration files to determine which TRA files to import into Motive.
Once the corresponding Rigid Body file has been imported, rename the Rigid Body to match the name of the controller (e.g. Controller 1). Once this is configured, a rectangular block and an arrow vector will be displayed in the 3D viewport over the Rigid Body.
Step 5. Confirm Operation
When you pull the trigger, the magnitude of the arrow vector will increase correspondingly. Each button click is input through its own channel, which can be plotted in the Graph View pane. You can use the provided Live-DeviceSelectedChannel template to plot the output graph.
The Controller has a rechargeable battery.
To recharge the Controller, simply connect it to the PC using the supplied Mini-USB cable.
The Controller LED flashes slowly while charging and turns solid when charging is complete.
The charge state is displayed in the device's properties in the Properties pane.
The Controller Clip comes in 4 unique configurations (A, B, C, D). Corresponding TRA Rigid Body definition is provided for each configuration (e.g. ControllerA.tra).
The Controller Clip uses a right-handed coordinate system (RHS), with +Z aligned with the controller's forward axis (Marker 3).
Only use the provided Bluetooth dongle and Mini-USB cables.
Controller activity is transient, but IDs are persistent (Controller 1 is always Controller 1).
Controllers perform an initial 'bonding' with the SCP Server. This pairing is remembered, so upon successive connections, the controller number (1-4) is preserved.
Controllers present to the SCP Server as DS3 devices.
4 devices always show in Motive, regardless of their power/connection status. Their IDs are based on their pre-paired/bonded status. This allows devices to easily come and go while remaining mapped to a consistent ID during a session. After pairing, controllers can be physically numbered to simplify visual/physical identification.
Controllers turn on manually, but turn off automatically after a user-definable timeout (default: 5 minutes).
There is a reset button on the bottom for use if the controller is not operating correctly.
Recording, or playback, is not supported.
Smoothing applied to the Rigid Body may be useful, since smooth movement is preferred even at the cost of some latency.
The controller device name and the Rigid Body name must match (e.g. Controller 1) for VRPN to correctly correlate the Rigid Body position and orientation with the controller's axis and button states.
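To illustrate that naming requirement, below is a minimal VRPN client sketch that subscribes to the pose, buttons, and analog trigger of "Controller 1". This is a sketch only: the server address is a placeholder, and mapping the trigger to analog channel 0 is an assumption.

```cpp
// Minimal VRPN client sketch (assumes the VRPN library is available).
// "Controller 1" must match BOTH the device name and the Rigid Body name.
#include <cstdio>
#include <vrpn_Tracker.h>
#include <vrpn_Button.h>
#include <vrpn_Analog.h>

void VRPN_CALLBACK onPose(void*, const vrpn_TRACKERCB t)
{
    std::printf("pos=(%.3f, %.3f, %.3f)\n", t.pos[0], t.pos[1], t.pos[2]);
}

void VRPN_CALLBACK onButton(void*, const vrpn_BUTTONCB b)
{
    std::printf("button %d -> %s\n", b.button, b.state ? "down" : "up");
}

void VRPN_CALLBACK onAnalog(void*, const vrpn_ANALOGCB a)
{
    if (a.num_channel > 0)
        std::printf("trigger = %.3f\n", a.channel[0]); // assumed channel mapping
}

int main()
{
    const char* device = "Controller 1@192.168.0.10"; // placeholder server address

    vrpn_Tracker_Remote tracker(device);
    vrpn_Button_Remote  buttons(device);
    vrpn_Analog_Remote  analogs(device);

    tracker.register_change_handler(nullptr, onPose);
    buttons.register_change_handler(nullptr, onButton);
    analogs.register_change_handler(nullptr, onAnalog);

    for (;;) // pump the connections to receive updates
    {
        tracker.mainloop();
        buttons.mainloop();
        analogs.mainloop();
    }
}
```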
This page provides general instructions on how to use the OptiHub2 for integrating external devices with Flex series mocap systems. The OptiHub2 not only provides power for the USB cameras, but also includes external sync in/out ports for integrating external devices. With proper configuration, you can have another device (parent) control the mocap system, have the mocap system control an external device (child), or both. Setup instructions for both child and parent devices are covered below. Note that the sync setup may vary depending on the type of integrated device as well as the characteristics of the communicated sync signals. Use this guide to understand the general approach to integrating external devices, and apply it to your own needs.
On the OptiHub2, there are one External SYNC In port and one External SYNC Out port for connecting external devices. In general, parent devices connect to the input port to control the mocap system, and child devices connect to the output port to be triggered by the mocap system. Once the devices are connected, the input and output sources need to be configured under the OptiHub2 properties in Motive.
Important Note
Please note that the OptiHub2 is not designed for precise synchronization with external devices; it provides only a rough synchronization to a trigger event on the input/output signal. With an OptiHub2, there will be some amount of time delay between the trigger events and the desired actions, so the OptiHub2 is not suitable for precisely synchronizing to an external device. To accomplish such synchronization, it is recommended to use the eSync 2 along with an Ethernet camera system.
Difference Between OptiSync and Wired Sync
OptiSync
The OptiSync is a custom camera-to-camera synchronization protocol designed for Flex series cameras. The OptiSync protocol sends and receives sync signals over the USB cable, without the need for RCA sync cables. This sync method is only available when using Flex 3 or Flex 13 cameras connected to the OptiHub2.
Wired Sync
Wired Sync is a camera-to-camera synchronization protocol using RCA cables in a daisy chain arrangement. With a master RCA sync cable connecting the master camera to the OptiHub2, each camera in the system is connected in series via RCA sync cables and splitters. The V100:R1 (Legacy) and Slim 3U cameras utilize Wired Sync only, and therefore any OptiTrack system containing these cameras needs to be synchronized through Wired Sync. Wired Sync is optionally available for Flex 3 cameras.
Additional Notes:
Unlike the eSync 2, the OptiHub2 cannot generate an output signal at a higher frequency than the camera capture rate.
When using multiple OptiHub2s, input and output ports of only the parent OptiHub2 can be used. The parent OptiHub2 is the first OptiHub2 within the daisy-chained RCA sync chain.
Duo/Trio Tracking Bars:
Use the Sync In/Out ports on the I/O-X USB hub to drive the tracking bar with an external signal.
Step 1. [Hardware] Connect a parent device into the External SYNC In port of the parent OptiHub2.
Step 2. [Motive] Launch Motive.
Step 3. [Motive] Open the Devices pane and the Properties pane under the view tab.
Step 4. [Motive] Select the parent OptiHub2 in the Devices pane, then its properties will get listed under the Properties pane.
Step 5. [Motive] Under the Sync Input Settings section, configure the source and the corresponding settings. If you are using an external device connected to the External SYNC In port as the sync source, set the input source to Sync In. See more under the Input Source section of this page.
Step 6. [Motive] Once the above is configured, the camera system will capture according to the input signal, and you should see the change in the camera frame rate section of the Devices pane.
Step 1. [Hardware] Connect a child device into the External SYNC Out port of the parent OptiHub2.
Step 2. [Motive] Launch Motive.
Step 3. [Motive] Open the Devices pane and the Properties pane from the view tab.
Step 4. [Motive] Select the parent OptiHub2 in the Devices pane; its properties will be listed in the Properties pane. By modifying these properties, you can configure the output settings for sending sync signals to the child devices.
Step 5. [Motive] Under the output section, set the output signal type. This will determine the signal characteristic that the child device will receive. See the Output Source section of this page for details.
Step 6. [Motive] Once this is set, the OptiHub2 will output configured signal through its output ports.
The Sync Input configuration determines how the camera system is synchronized. Depending on which input source is configured under the custom sync settings, the cameras will shutter at the corresponding frequency. To configure the sync input signals, first define an input Source and then configure the respective trigger settings. The following input sources can be configured with the OptiHub2:
(The camera system will be the parent)
When the sync source is set to Internal/Wired, the camera system uses the OptiHub2 as the sync source. This is the default configuration, and it uses the OptiHub2's sync protocol for synchronizing the cameras. The parent OptiHub2 will generate an internal sync signal which is propagated to the other (child) OptiHub2(s) via the Hub Sync Out Jack and Hub Sync In Jack, and all of the cameras connected to the OptiHub2s will be synchronized. For V100:R1 (Legacy) and Slim 3U cameras, the Wired Sync protocol is used; in this mode, the internal sync signal is still generated but is routed directly to the cameras via daisy-chained sync cables.
When Internal Sync is selected as the sync source, the Internal Sync Freq (Hz) can be set under the Synchronization Control section. This setting determines the rate at which the OptiHub2 triggers the camera exposures, i.e. the camera frame rate.
(The camera system will be the child)
When synchronizing the camera system to an external device, set the source to Sync In; the camera system then references the external signal received through the SYNC In port as the parent sync. In this mode, the Input Trigger event must be defined so that the camera system responds to the incoming signal as desired; available triggers are Either Edge, Rising Edge, Falling Edge, High Gated, and Low Gated. Note that the suitable event will vary depending on the characteristics of the received signal and how you want the system to synchronize with it.
For syncing to input signals with a frequency higher than the supported camera frame rates, the Input Divider can be applied so that the sync source is down-sampled to the supported rate range.
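For example, assuming a hypothetical 240 Hz external signal and a camera system limited to 120 FPS, an Input Divider of 2 halves the trigger rate to a supported value:

```cpp
#include <cstdio>

int main()
{
    const double inputHz = 240.0; // hypothetical external sync frequency
    const int    divider = 2;     // Input Divider setting in Motive

    std::printf("effective camera trigger rate: %.0f Hz\n", inputHz / divider);
    return 0;
}
```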
Duo/Trio Tracking Bars:
With Duo/Trio tracking bars, when an external signal is detected through the I/O-X box, the tracking bar can no longer operate in free-run mode. In this case, the source must be set to Sync In to utilize the external signal, or the external input must be disconnected from the I/O-X box for free-run.
(The camera system will be the child)
This mode is for customers who use the software development kits and would like to have their own software trigger the cameras instead. Using the provided API, the OptiHub2 can receive the trigger signal from the PC via the OptiHub2's USB uplink connection.
Note that the precise moment of the camera exposure does not exactly coincide with the sync input trigger event. There is a fixed latency between these two events due to the specific implementation of the sensors used in the cameras. The delay from the onset of the trigger event to the start of exposure depends on the following:
Camera exposure (measured in scanlines for Flex 3 cameras)
Imager Scan-Rate: Frame Rate Setting defined under the Devices pane.
Flex 13 cameras have a different trigger-to-exposure latency. From the moment the sync trigger event is received, it takes 480 microseconds for the cameras to synchronize and start exposing. After this delay, the Flex 13 cameras expose for the duration defined under the exposure settings in the Devices pane. The camera exposures are measured in microseconds for Flex 13 cameras.
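Putting the Flex 13 numbers together: exposure begins a fixed 480 μs after the trigger and ends after the configured exposure time. A small illustrative calculation (C++; the exposure value is a placeholder, and the Flex 3 scanline-based timing is not reproduced here):

```cpp
#include <cstdio>

int main()
{
    const double syncDelayUs = 480.0; // fixed Flex 13 trigger-to-exposure delay
    const double exposureUs  = 500.0; // placeholder exposure from the Devices pane

    std::printf("exposure starts %.0f us after the trigger\n", syncDelayUs);
    std::printf("exposure ends   %.0f us after the trigger\n",
                syncDelayUs + exposureUs);
    return 0;
}
```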
The External SYNC Out port of the OptiHub2 can be used to set up the camera system as a parent of other systems. The output port sends out sync signals, which can be configured under the External Sync Output section of the OptiHub2 properties. First, understand the characteristics of the sync signal that the connected child device expects, and configure the output source accordingly.
All output signals can be inverted by setting the Output::Polarity to Inverted.
For connecting more than one child device, output signals may be split using a BNC splitter.
The sync hubs are capable of sending out the following signals:
Exposure output signals indicate when the cameras are exposing. When the output type is set to Exposure Time, the sync hub asserts the output signal while the cameras are exposing, or shuttering, in Live mode. When the output is configured to Recording Pulse, the exposure signal is asserted only while Motive is recording.
The Recording Gate signal can be used to tell the child device whether Motive is recording. When configured to Recording Gate, the sync hub outputs a constant high voltage signal while Motive is recording. Please note that the OptiHub2 is not specifically designed for precise synchronization, and there will be a slight delay before it starts outputting the trigger signal; this delay varies with each camera system setup.
Gated Output Signal (OptiHub2) Note:
The gated (while-recording) output signal from the OptiHub2 is not frame-synchronous with the recorded frame data. When the recording trigger is received in the middle of a frame, mocap frames starting from the next one get recorded in the Take. The gate output signal, however, does not respect this and begins outputting as soon as the recording trigger is received. For this reason, there can be a slight offset between the gated output signal and the recorded mocap data.
The Pass-through type outputs the signal that is received through the SYNC In port.
This page provides instructions on how to set up and use the OptiTrack active marker solution.
Additional Note
This guide is for OptiTrack active markers only. Third-party IR LEDs will not work with instructions provided on this page.
This solution is supported for Ethernet camera systems (Slim 13E or Prime series cameras) only. USB camera systems are not supported.
Motive version 2.0 or above is required.
This guide covers active component firmware versions 1.0 and above; this includes all active components that were shipped after September 2017.
For active components that were shipped prior to September 2017, please see the compatibility notes page for more information about the firmware compatibility.
The OptiTrack Active Tracking solution allows synchronized tracking of active LED markers using an OptiTrack camera system. The solution consists of the Base Station and, depending on the user's choice, Active Tags that can be integrated into any object and/or the "Active Puck", which can act as its own single rigid body.
Connected to the camera system, the Base Station emits RF signals to the active markers, allowing precise synchronization between camera exposure and illumination of the LEDs. Each active marker is uniquely labeled in Motive, allowing more stable rigid body tracking: active markers will never be mislabeled, and unique marker placements are no longer required to distinguish multiple rigid bodies.
Sends out radio frequency signals for synchronizing the active markers.
Powered by PoE, connected via Ethernet cable.
Must be connected to one of the switches in the camera network.
Connects to a USB power source and illuminates the active LEDs.
Receives RF signals from the Base Station and correspondingly synchronizes illumination of the connected active LED markers.
Emits 850 nm IR light.
4 active LEDs in each bundle and up to two bundles can be connected to each Tag.
(8 active LEDs per Tag: 4 LEDs per set × 2 sets)
Size: 5 mm (T1 ¾) Plastic Package, half angle ±65°, typ. 12 mW/sr at 100mA
An Active Tag self-contained in a trackable object, providing 6 DoF information for any arbitrary object it is attached to. It carries a factory-installed Active Tag with 8 LEDs and a rechargeable battery with up to 10 hours of run time on a single charge.
Active tracking is supported only with Ethernet camera systems (Prime series or Slim 13E cameras). For instructions on how to set up a camera system, see: Hardware Setup.
Connects to one of the PoE switches within the camera network.
For best performance, place the base station near the center of your tracking space, with unobstructed lines of sight to the areas where your Active Tags will be located during use. Although the wireless signal is capable of traveling through many types of obstructions, there still exists the possibility of reduced range as a result of interference, particularly from metal and other dense materials.
Do not place external electromagnetic or radiofrequency devices near the Base Station.
When the Base Station is working properly, the LED closest to the antenna should blink green while Motive is running.
BaseStation LEDs
Note: Behavior of the LEDs on the base station is subject to change.
Communication Indicator LED: When the BaseStation is successfully sending out data and communicating with the active pucks, the LED closest to the antenna will blink green. If this LED lights red, the BaseStation has failed to establish a connection with Motive.
Interference Indicator LED: The middle LED indicates whether there is other signal traffic on the respective radio channel and PAN ID that might interfere with the active components. This LED should stay dark for the active marker system to work properly. If it flashes red, consider switching both the channel and PAN ID on all of the active components.
Power Indicator LED: The LED located at the corner, furthest from the antenna, indicates power for the BaseStation.
Connect two sets of active markers (4 LEDs in each set) to a Tag.
Connect the battery and/or a micro USB cable to power the Tag. The Tag accepts 3.3 V ~ 5.0 V input from the micro USB cable. To power through the battery, use only batteries supplied by us. To recharge the battery, keep the battery connected to the Tag and then connect the micro USB cable.
To initialize the Tag, press the power switch once. Be careful not to hold the power switch down for more than a second, as this will start the device in firmware update (DFU) mode. If it initializes in DFU mode, indicated by two orange LEDs, simply power off and restart the Tag. To power off the Tag, hold down the power switch until the status LEDs go dark.
Once powered, you should be able to see the illumination of IR LEDs from the 2D reference camera view.
Puck Setup
Press the power button for 1~2 seconds and release. The top-left LED will illuminate orange while the puck initializes. Once initialized, the bottom LED will light up green if the puck has made a successful connection with the base station, and the top-left LED will start blinking green, indicating that sync packets are being received.
For more information, please read through the Active Puck page.
Active Pattern Depth
Settings → Live Pipeline → Solver tab (default value: 12)
This adjusts the complexity of the illumination patterns produced by the active markers. In most applications, the default value gives quality tracking results. If a high number of rigid bodies are tracked simultaneously, this value can be increased to allow more combinations of illumination patterns across the markers. If the value is set too low, duplicate active IDs can be produced; should that error appear, increase this setting.
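As a purely illustrative model (the actual encoding is OptiTrack's own and is not documented here), if each marker's ID were a binary on/off pattern spread over pattern-depth frames, the ID space would grow exponentially with the depth:

```python
# Illustrative only: models the ID space as binary patterns of length `depth`.
# The real active-marker encoding is proprietary; this just shows why a larger
# pattern depth supports more simultaneously tracked markers.

def id_capacity(depth: int) -> int:
    return 2 ** depth

print(id_capacity(12))  # 4096 hypothetical unique patterns at the default depth
```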
Minimum Active Count
Settings → Live Pipeline → Solver tab (default value: 3)
Sets the number of rays required to establish the active ID for each frame of an active marker cycle. If this value is increased and active markers become occluded, it may take longer for the markers to be re-established in the Motive view. Most applications will not need to alter this setting.
Active Marker Color
Settings → Views → 3D tab (default color: blue)
The color assigned to this setting will be used to indicate and distinguish active and passive markers seen in the viewer pane of Motive.
For tracking active LED markers, the following camera settings may need to be adjusted for best tracking results:
For tracking active markers, set the camera exposure somewhat higher than when tracking passive markers. This allows the cameras to better detect the active markers. The optimal value will vary depending on the camera system setup, but in general, set the camera exposure between 400 and 750 microseconds.
When tracking only active markers, the cameras do not need to emit IR light. In this case, you can disable the IR setting in the Devices pane.
Rigid body definitions created from actively labeled reconstructions search for specific marker IDs, along with the marker placements, to track the rigid body. This is further explained in the following section.
Duplicate active frame IDs
For active labeling to work properly, each marker must have a unique active ID. When more than one marker shares the same ID, there may be problems reconstructing those active markers, and the following notification message will appear. If you see this notification, please contact support to change the active IDs on the active markers.
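For illustration, a uniqueness check over a set of streamed marker IDs could look like the following sketch (hypothetical data; in practice, Motive raises the notification described above):

```python
from collections import Counter

# Hypothetical list of active IDs observed in the volume; 102 appears twice.
marker_ids = [101, 102, 103, 102]

duplicates = [mid for mid, count in Counter(marker_ids).items() if count > 1]
if duplicates:
    print(f"Duplicate active IDs detected: {duplicates}")  # -> [102]
```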
In recorded 3D data, the labels of unlabeled active markers will still indicate that they are active markers. As shown in the image below, an Active prefix is assigned in addition to the active ID to indicate that the marker is active. This applies only to individual active markers that are not auto-labeled; markers auto-labeled using a trackable model are assigned their respective labels.
When a trackable asset (e.g. a rigid body) is defined using active markers, its active ID information is stored in the asset along with the marker positions. When auto-labeling markers in the space, the trackable asset additionally searches for reconstructions with matching active IDs, in addition to matching marker arrangements, to auto-label a set of markers. This adds an extra safeguard to the auto-labeler and prevents mislabeling errors.
Rigid body definitions created from actively labeled reconstructions search for the respective marker IDs in order to solve the rigid body. This is a major benefit: active markers can be placed in perfectly symmetrical arrangements across multiple rigid bodies without running into label swaps. With active markers, only the 3D reconstructions whose active IDs are stored under the corresponding rigid body definition will contribute to the solve.
If a rigid body was created from actively labeled reconstructions, the corresponding Active ID is saved under the rigid body properties. For the rigid body to be tracked, reconstructions with matching marker IDs, in addition to matching marker placements, must be tracked in the volume. If the Active ID is set to 0, no particular marker ID is assigned to the rigid body definition and any reconstruction can contribute to the solve.
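The gating behavior described above can be sketched as follows. This is not OptiTrack's actual solver; the `active_id == 0` convention is taken from the description above, and everything else (names, data structures) is hypothetical:

```python
# Minimal sketch of active-ID gating during auto-labeling. Each reconstruction
# is a dict with a 3D position and an active ID (0 = passive / no ID).

reconstructions = [
    {"pos": (0.10, 1.20, 0.30), "active_id": 101},
    {"pos": (0.20, 1.10, 0.30), "active_id": 0},    # passive marker
    {"pos": (0.30, 1.30, 0.20), "active_id": 102},
]

def candidates_for_rigid_body(recons, stored_ids):
    """Keep only reconstructions whose active ID is stored in the rigid body
    definition. If all stored IDs are 0, any reconstruction may contribute."""
    if all(i == 0 for i in stored_ids):
        return recons
    wanted = set(stored_ids)
    return [m for m in recons if m["active_id"] in wanted]

# Only the two markers with IDs 101 and 102 can contribute to this solve.
print(candidates_for_rigid_body(reconstructions, [101, 102]))
```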
This page provides general instructions on how to use the eSync 2 for synchronizing external devices.
The eSync 2 is a synchronization hub that allows advanced users to integrate external systems into OptiTrack motion capture systems. With proper sync chain setups, you can have another (parent) system control the mocap system, have the mocap system control other (child) systems, or both. Note that the setup may change depending on the type and number of the external devices as well as the characteristics of the communicated sync signals. Use this guide to understand the general approach to integrating external devices, and apply it to your needs.
With the eSync 2, Prime series mocap systems can work together with other systems to perform precisely synchronized operations and data collection. This offers benefits in a wide range of applications. Reference video cameras, external devices, and recording triggers are examples of commonly synchronized equipment.
The eSync 2 synchronization hub has multiple sync input and output ports. In general, a parent device connects to the input ports to control the mocap system, and child devices connect to the output ports to be controlled by the mocap system. Once the devices are connected to the eSync 2, the input and output signal characteristics need to be specified and configured under the eSync 2 properties in Motive.
Requirements
Ethernet Camera System (PrimeX series or SlimX 13)
The eSync 2
External Devices (child / parent)
Sync Cables: BNC or other sync cables with BNC adapters.
Child Devices (e.g. Force Plates): Connect the Output ports of the eSync 2 to the sync input ports of the child devices.
Parent Device (e.g. Genlock): Connect the sync output of the parent device to one of the Input ports of the eSync 2. For integrating Genlock, VESA Stereo In, or SMPTE timecode signals, connect them to the correspondingly labeled input ports of the eSync 2.
When you are done changing the settings, press Apply to set the configured sync method.
Internal Free Run
Internal Clock
To use the internal clock of the eSync 2 as the parent sync source for both the camera system and the subsequent child devices, set the sync input source to Internal Clock. When selected, the clock frequency can be adjusted in Sync Input → Clock Freq (Hz) settings.
Input Signal
Once the input source is set, the next step is to define an appropriate trigger event. Under the Sync Input → Input Trigger option, pick a signal morphology (Rising Edge, Falling Edge, or Either Edge) for the desired trigger event. Note that the suitable event will vary depending on the characteristics of the received signal and how you want the system to synchronize with it. The following diagrams show how the camera system responds to received signal triggers. When configured properly, the camera system will expose with respect to the sync signal from the parent device.
Rising Edge:
Every rising edge of the sync input signal defines either the start or the end of a frame, consecutively.
Falling Edge:
Every falling edge of the sync input signal defines either the start or the end of a frame, consecutively.
Either Edge:
Every rising or falling edge of the sync input signal defines either the start or the end of a frame, consecutively. When using both edges as the input trigger, the input signal must have a 50% duty cycle for the cameras to synchronize properly.
The frame rate of the camera system is determined by the selected sync input source. When the frequency of the sync source is higher than the supported frame rate, input dividers and multipliers can be applied to adjust the signal frequency for synchronizing the camera system. The final frame rate is calculated and displayed at the bottom, under Sync Input → Final Frame Rate, and you can monitor this rate while you apply the adjustments. When the customized sync configuration is applied at the end of the setup, the cameras will start to capture at this final frame rate.
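As a quick sanity check of the arithmetic, here is a minimal sketch (function name and values are illustrative, not recommendations):

```python
def final_frame_rate(source_hz: float, multiplier: int = 1, divider: int = 1) -> float:
    """Final camera frame rate after the input multiplier/divider are applied."""
    return source_hz * multiplier / divider

# e.g. dividing a 120 Hz input signal by 2 drives the cameras at 60 FPS:
print(final_frame_rate(120, divider=2))  # 60.0
```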
Camera exposures are always positioned at the center of each frame period, so the precise moment of the camera exposure does not exactly coincide with the trigger event. When synchronizing the camera exposure timing with the sync input signal triggers, make sure this gap is taken into account. To precisely align the input signal trigger with the exposure timing, an offset delay, in microseconds, of half of the frame period plus half of the camera exposure must be applied in Sync Input → Sync Offset (us). Camera exposure is measured in microseconds on the Ethernet cameras.
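A minimal sketch of that offset calculation, following the half-period-plus-half-exposure relationship stated above (the function name and example values are illustrative):

```python
def sync_offset_us(frame_rate_hz: float, exposure_us: float) -> float:
    """Sync Offset (us) that aligns the input trigger with the exposure:
    half of the frame period plus half of the camera exposure."""
    frame_period_us = 1_000_000 / frame_rate_hz
    return frame_period_us / 2 + exposure_us / 2

# e.g. 240 FPS with a 500 us exposure -> 4166.7/2 + 250 = ~2333 us
print(round(sync_offset_us(240, 500)))  # 2333
```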
The eSync 2 can output the following types of signals:
Exposure Time
Recording Gate
Record Start/Stop Pulse
Gated Exposure Time
Gated Internal Clock
Selected Sync
Adjusted Sync
Input signals: Relaying input signals.
Selected Sync: Raw input signal.
Adjusted Sync: Adjusted (offset, multiplier, and divider) signal.
Note: All output signals can be inverted by setting the Output → Polarity to Inverted.
The gated (while-recording) output signal from the eSync 2 is frame-synchronous with the recorded mocap data. In the live mode, the cameras are continuously shuttering and capturing frames at the defined frame rate. If the recording trigger is received in the middle of a frame period, the eSync waits until the next frame to start recording and asserting the gated output signal. This mechanism ensures that the recorded mocap data and the gated output signals are precisely synchronized.
Exposure Time/Gated Exposure Time output signals indicate when the cameras are exposing.
Recording gate/pulse output signals can be used to tell the child device when Motive is recording or not. When configured to Recording Gate, the eSync 2 will output a constant high voltage signal when Motive is recording. When configured to Recording Start/Stop Pulse, the sync hub will output a pulse signal when Motive either starts or stops recording.
When configured to Gated Internal Clock, the eSync 2 outputs its internal clock signal while Motive is recording. The internal clock signal has a 50% duty cycle with the signal frequency defined under the Sync Input → Clock Freq (Hz) section.
Using Internal Clock Signal to drive both the camera system and external devices
To achieve per-frame synchronization between the camera system and an external device (e.g. force plates, NI-DAQ), the internal clock signal from the eSync 2 can be used to drive both the camera system and the external device. This is possible only if the external device is capable of receiving an external clock signal. When the external system runs at a higher sampling rate, a divisor or a multiplier must be applied to the clock signal to achieve the desired frame rate on the camera system. Note that depending on the applied divisor/multiplier, the alignment of the output signal may vary. The exposure timing of the camera system will always be aligned at the center of the divided or multiplied signal, but whether the exposure timing aligns with the rising or falling edge of the output signal may vary depending on the applied divisor; please see the image below.
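For example (hypothetical numbers, not a recommendation), a 1000 Hz internal clock could drive a DAQ directly while a divisor of 4 runs the cameras at 250 Hz, keeping every camera frame aligned with every fourth clock tick:

```python
# Sketch: per-frame alignment between an external device driven at the raw
# internal clock rate and a camera system driven through a divisor.

clock_hz = 1000          # Sync Input -> Clock Freq (Hz), drives the DAQ directly
divider = 4              # applied to the clock for the camera system

camera_hz = clock_hz / divider
ticks_per_frame = clock_hz / camera_hz

print(camera_hz)          # 250.0
print(ticks_per_frame)    # 4.0 -> each camera frame spans exactly 4 clock ticks
```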
By default, the remote trigger source is set to Software, which corresponds to the record start/stop button click events in Motive. When an external trigger source is used, set the trigger source to the corresponding input port (Trigger Source → isolated or input) and select an appropriate trigger edge. Available trigger options include Rising Edge, Falling Edge, High Gated, and Low Gated; the appropriate option depends on the signal morphology of the external trigger. After the trigger settings have been defined, press the recording button in advance. This sets Motive into a standby mode until the trigger signal is detected through the eSync 2. When the trigger signal is detected, Motive starts the actual recording. The recording stops and returns to the armed state when the second trigger signal, or the falling edge of the gated signal, is detected.
Steps
Under the Record Triggering section, set the source to the respective input port where the trigger signal is input.
Choose an appropriate trigger option, depending on the morphology of the trigger signal.
Press the record button in Motive, which prepares Motive for recording. At this stage, Motive awaits an incoming trigger signal.
When the first trigger is detected, Motive starts recording.
When the second trigger is detected, Motive stops recording and awaits the next trigger for repeated recordings. For the High Gated and Low Gated trigger options, Motive records during the respective gated windows.
Once all the recording is finished, press the stop button to disarm Motive.
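The arm/record cycle these steps describe can be summarized as a small state machine. The states and function below are hypothetical (Motive's internal implementation is not public); this only restates the edge-triggered behavior described above:

```python
from enum import Enum, auto

class RecState(Enum):
    IDLE = auto()
    ARMED = auto()       # record button pressed; waiting for a trigger
    RECORDING = auto()

def on_trigger(state: RecState) -> RecState:
    """Advance the recording state on each detected trigger edge."""
    if state is RecState.ARMED:
        return RecState.RECORDING   # first trigger starts recording
    if state is RecState.RECORDING:
        return RecState.ARMED       # second trigger stops and re-arms
    return state                    # triggers are ignored when idle

state = RecState.ARMED              # record button pressed in Motive
state = on_trigger(state)           # -> RECORDING
state = on_trigger(state)           # -> ARMED, ready for the next take
```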
2. Place the pivot point at the midpoint between the two markers. Enable the Two Marker Distance visual aid from the perspective pane and select the two landmark markers in Motive; this will display the distance between the two markers. Then, using this information, translate the pivot point laterally by half of that distance so that it sits exactly at the midpoint between the two markers.
With a BaseStation and active markers communicating on the same RF channel, active markers will be reconstructed and tracked in Motive automatically. From the unique illumination patterns, each active marker is labeled individually, and a unique marker ID is assigned to the corresponding reconstruction in Motive. These IDs can be monitored in the Live-reconstruction mode or in the 2D Mode. To check the marker IDs of the respective reconstructions, enable the Marker Labels option under the visual aids menu, and the IDs of selected markers will be displayed. The marker IDs assigned to active marker reconstructions are unique and can be used to point to a specific marker among many reconstructions in the scene.
For eSync 2 technical specifications please visit our webpage.
For general instructions on setting up the mocap system, refer to the Hardware Setup pages. This guide assumes the camera system and the eSync 2 have already been set up.
Once you have connected the external devices to the eSync 2, the first step is to select and configure the sync source under the eSync 2 properties. The selected source becomes the parent sync of both the camera system and the other external devices. There are multiple sync source options to choose from under the drop-down menu; ultimately, only one sync source is selected and used to synchronize the cameras and subsequent child devices for any particular configuration.
The first step is to define a parent sync source for the camera system. This is configured in the Sync Input → Source entry under the eSync 2 properties:
By default, the sync input source is set to Internal Free Run, meaning the camera system samples at the frame rate defined in the Devices pane. In this mode, Prime series cameras are synchronized by communicating time information with each other through the camera network itself, using a high-precision algorithm for timing synchronization. This is the default synchronization protocol for Ethernet camera systems without an eSync 2.
To use an external sync signal as the parent sync source, set the sync input source to the corresponding input port where the parent device is connected. See the eSync 2 specifications for more details on each of the input ports. The input ports accept the following maximum input-high voltages:
Input ports (1–3): VIH(max): 3.3 V
Isolated Input port: VIH(max): 12 V
VESA Stereo Input port: VIH(max): 3.3 V
Video Genlock In and SMPTE Timecode In: dedicated ports for genlock and timecode signals
If you need to delay the camera exposure relative to the input trigger, a sync offset can be applied. Click the icon at the top, then click Show Advanced to view the advanced settings, where you can set the Sync Offset (in microseconds) to apply the delay. This is typically used to synchronize other infrared systems with the camera system so that they do not interfere with each other's IR.
The eSync 2 has a total of 4 output ports (3.3 V). To set up the camera system as a parent of other systems, connect the child devices to the Output ports of the eSync 2 to receive the reference sync signals. Once the devices are connected, you can configure the output signal source under the eSync 2 properties. Determine what type of sync signals the child devices expect, and configure the sources accordingly so that appropriate signals are output.
When Output → Type is set to Exposure Time, a high voltage (3.3 V) signal will be output from the corresponding output port whenever the camera system is exposing, or shuttering, in the live mode. The Gated Exposure Time signal works similarly, but the signal is sent out only while Motive is recording.
With the eSync 2, external triggering devices (e.g. a remote start/stop button) can be integrated into the camera system and set to trigger the recording start and stop events in Motive. Such devices connect to the input ports of the eSync 2 and are configured under the Record Triggering section of the eSync 2 properties.
Note: When capturing multiple recordings via the recording trigger, only the first Take (TAK) will contain the 3D data. For the subsequent Takes, the 3D data must be reconstructed through the reconstruction pipeline.
Open the Devices pane and the Properties pane to access the eSync 2 settings.