This page provides instructions on how to use the OptiTrack Unreal Engine Live Link plugin. The plugin communicates with Unreal's built-in Live Link system by providing a Live Link source for receiving tracking data streamed from Motive. This plugin can be used for controlling cameras and objects in virtual production applications. When needed, the OptiTrack Unreal Engine Plugin can also be used alongside this plugin. For a specific guide to InCamera VFX (i.e. LED Wall Virtual Production), please see the Unreal Engine: OptiTrack InCamera VFX wiki page.
1. [Motive] Set up rigid body streaming in Motive.
Get Motive streaming with at least one rigid body or Skeleton asset. Make sure the Streaming settings are configured correctly, and the asset is active under the Assets pane.
2. [UE] Install the OptiTrack plugins in Unreal Engine (UE).
You can install the OptiTrack Unreal Engine plugin by putting the plugin files into one of the following directories:
A global engine plugin can be placed in C:\Program Files\Epic Games\[Engine Version]\Engine\Plugins
A project-specific plugin can be placed in [Project Directory]\Plugins
3. [UE] Enable the plugins in the UE project.
Go to Edit → Plugins and enable the two required plugins: the OptiTrack - Live Link plugin under the Installed group, and the built-in Live Link plugin under the Built-In group.
4. [UE] Open the LiveLink pane
Open the LiveLink pane from Window → Virtual Production → Live Link in the toolbar.
5. [UE] Configure and create a new OptiTrack source
In the LiveLink pane under Source options, go to the OptiTrack Source menu, configure the proper connection settings, and click Create. Make sure these settings match the network settings configured in the Streaming pane in Motive.
6. [UE] Check the Connection.
If the streaming settings are correct and the connection to the Motive server is successful, the plugin will list all of the detected assets. They should have green dots next to them, indicating that the corresponding asset has been created and is receiving data. If the dots are yellow, the client has stopped receiving data; in this case, check whether Motive is still tracking or whether there is a connection error.
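If you prefer to verify the connection from code as well (for example while debugging a packaged build), the engine's Live Link client can be queried for the subjects it currently knows about. Below is a minimal sketch, assuming the ILiveLinkClient modular feature from the engine's LiveLinkInterface module; the exact GetSubjects signature can vary slightly between engine versions.

```cpp
#include "CoreMinimal.h"
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "LiveLinkTypes.h"

// Logs every Live Link subject (e.g. Motive Rigid Bodies and Skeletons)
// that the client is currently aware of. Call this after the OptiTrack
// source has been created.
static void LogLiveLinkSubjects()
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        UE_LOG(LogTemp, Warning, TEXT("Live Link client is not available."));
        return;
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    // Include disabled subjects, exclude virtual subjects.
    const TArray<FLiveLinkSubjectKey> Subjects = Client.GetSubjects(true, false);
    for (const FLiveLinkSubjectKey& Key : Subjects)
    {
        UE_LOG(LogTemp, Log, TEXT("Live Link subject: %s"), *Key.SubjectName.Name.ToString());
    }
}
```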
1. Add the camera object or static mesh object that you wish to move
Add a camera actor from the Place Actors pane or a static mesh from the project into your scene. For the static meshes, make sure their Mobility setting is set to Movable under the Transform properties.
2. Add a LiveLinkController Component
Select the actor you want to animate. In the Details tab, select the actor (Instance). In the search bar, type Live Link, then click Live Link Controller in the populated list.
3. Select the target rigid body
Under the Live Link properties in the Details tab click in the Subject Representation box and select the target rigid body.
4. Check
Once the target rigid body is selected, each object with the Live Link Controller component attached and configured will be animated in the scene.
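For actors that are spawned at runtime, the same Live Link Controller setup can also be done from C++. This is a minimal sketch, assuming the ULiveLinkComponentController class from the built-in LiveLinkComponents module; the subject name "RigidBody1" is a placeholder, and the exact property accessors can differ between engine versions.

```cpp
#include "GameFramework/Actor.h"
#include "LiveLinkComponentController.h"
#include "Roles/LiveLinkTransformRole.h"

// Attaches a Live Link Controller to an existing actor and points it at a
// streamed Rigid Body subject, mirroring the Details-tab setup described above.
void BindActorToLiveLinkSubject(AActor* TargetActor)
{
    if (!TargetActor)
    {
        return;
    }

    ULiveLinkComponentController* Controller = NewObject<ULiveLinkComponentController>(TargetActor);
    Controller->RegisterComponent();
    TargetActor->AddInstanceComponent(Controller);

    // Equivalent to picking the Rigid Body in the Subject Representation box.
    FLiveLinkSubjectRepresentation Representation;
    Representation.Subject = FName(TEXT("RigidBody1"));   // placeholder subject name
    Representation.Role = ULiveLinkTransformRole::StaticClass();
    Controller->SetSubjectRepresentation(Representation);
}
```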
When the camera system is synchronized to another master sync device and a timecode signal is fed into the eSync 2, the received timecode can be used in the UE project through the plugin.
1. Set Timecode Provider under project settings
From Edit → Project Settings, search for timecode; under Engine - General Settings you should find the settings for timecode. Here, set the Timecode Provider to LiveLinkTimecodeProvider.
2. Set OptiTrack source in the Live Link pane as the Timecode Provider
Open the Live Link pane and select the OptiTrack source that was created when first setting up the plugin connection. Then, under its properties, check the Timecode Provider box.
3. Check
The timecode from Motive should now be visible in the Take Recorder pane. The Take Recorder pane can be found under Window → Cinematics → Take Recorder in the toolbar.
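You can also confirm from code that the engine clock is being driven by the Live Link timecode. The sketch below only uses the engine's FApp accessors and assumes the Timecode Provider has already been configured as described above.

```cpp
#include "CoreMinimal.h"
#include "Misc/App.h"
#include "Misc/Timecode.h"
#include "Misc/FrameRate.h"

// Logs the engine's current timecode and frame rate. With the
// LiveLinkTimecodeProvider active, this should match the timecode
// coming from Motive / the eSync 2.
void LogEngineTimecode()
{
    const FTimecode Timecode = FApp::GetTimecode();
    const FFrameRate FrameRate = FApp::GetTimecodeFrameRate();

    UE_LOG(LogTemp, Log, TEXT("Timecode: %s (%d/%d fps)"),
        *Timecode.ToString(), FrameRate.Numerator, FrameRate.Denominator);
}
```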
1. Create a new Animation Blueprint
Right-click the mesh you would like to use and select "Create > Anim Blueprint".
2. Name and Open the Animation Blueprint
Name the animation blueprint something appropriate, then double click it to open the blueprint.
3. Hook up your Blueprint
Create a "Live Link Pose" component and connect it to the "Output Pose". Assign the "Live Link Subject Name" to the Skeleton that you would like to use.
Change the "Retarget Asset" property in the Details pane of the blueprint editor to "OptiTrackLiveLinkRetarget"
4. Getting the Skeleton to Animate
To animate the Skeleton in real time, select the skeletal mesh actor that uses the Animation Blueprint from earlier. In the Details pane, under the skeleton, add a "Live Link Skeleton Animation" component. After you add that component, the mesh should start animating.
To animate the Skeleton in a game, just press the play button. Adding the "Live Link Skeleton Animation" object is not necessary to animate in play mode.
Debugging Note
If the retargeting doesn't match the mesh correctly, then you can create a new OptiTrackLiveLinkRetarget blueprint from scratch and modify the bone mapping names.
Animating a MetaHuman follows basically the same steps as any other Skeleton, but requires hooking into the Skeleton at a very specific location. For more information about MetaHuman setup outside of our scope, please visit Epic Games' website.
1. Find the Skeletal Mesh
Navigate to the Skeletal Mesh for your MetaHuman. This is typically located in a folder such as MetaHumans > [Name] > [Female/Male] > [Height] > [Weight] > Body. Double click the Skeletal Mesh to open the blueprint.
2. Open the AnimGraph Tab
Click the "Blueprint" option on the top bar of new window. In the bottom left corner navigate to My Blueprint > Animation Graphs > AnimGraph and double click.
3. Hook up your Blueprint
You'll see a very complex AnimGraph already set up. Make a new Live Link Pose node like in the Skeleton creation steps. Connect the Input Pose or Control Rig to the input of the Live Link Pose. Connect the output of the Live Link Pose to the Output Pose AnimGraph node.
4. Retarget in the Details Tab
The last step in this window is to set the Retarget Asset to OptiTrackLiveLinkRetarget for the Live Link Pose node. To do this, click the dropdown under Retarget Asset in the Details tab and select OptiTrackLiveLinkRetarget. After it has been set, click Compile in the top left of this window. You may now close this window and move on to the next steps.
5. Level of Detail (LOD)
MetaHumans will change their Level of Detail (LOD), i.e. how complex the asset is, based on how far the camera is from the actor, among other factors. In order for the MetaHuman to animate correctly, the Forced LOD must be set to a minimum of 1 (the default is -1). To change this setting, click on the (Instance) you wish to change in the Details tab of the main workspace window. Below the selected (Instance), select the LOD tab. From here you can change the value in the Forced LOD Model field. A code sketch for forcing the LOD at runtime follows these steps.
6. Animate your MetaHuman
At this point, if you drag the base MetaHuman object into the scene, it will animate like other Skeletons.
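If you spawn MetaHumans at runtime or want to enforce the LOD from code rather than through the Details tab, the body's skeletal mesh component can be forced to a specific LOD. This is a minimal sketch, assuming you already have a pointer to the MetaHuman's body USkeletalMeshComponent; the component layout inside a MetaHuman blueprint may vary.

```cpp
#include "Components/SkeletalMeshComponent.h"

// Forces a skeletal mesh component to LOD 1 so that the streamed
// Live Link pose drives the full skeleton correctly.
void ForceMetaHumanBodyLOD(USkeletalMeshComponent* BodyMesh)
{
    if (BodyMesh)
    {
        // SetForcedLOD is 1-based: 1 forces the highest-detail LOD,
        // and passing 0 clears the override again.
        BodyMesh->SetForcedLOD(1);
    }
}
```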
For testing the project in standalone game mode, or when developing an nDisplay application, the Live Link plugin settings must be saved out and selected as the default preset to load in the project. If this is not done, the configured settings may not be applied. After configuring the Live Link plugin settings, save out the preset from the Live Link pane first. Then open the Project Settings and find the Live Link section in the sidebar. Here, you can select the default Live Link preset to load in the project, as shown in the screenshot below. Once the preset is properly saved and loaded, the corresponding plugin settings will be applied to the standalone game mode.
If all the configuration is correct, the actors will get animated in the newly opened game window when playing the project in the standalone game mode.
Another path to get data into Unreal Engine is to stream data from Motive -> MotionBuilder (using the OptiTrack MotionBuilder Plugin) -> Unreal Engine (using the Live Link plugin for MotionBuilder). This has the benefit of using the Human IK (HIK) retargeting system in MotionBuilder, which will scale characters of very different sizes/dimensions better than the base Live Link plugin. More information can be found by consulting Unreal Engine's Live Link to MotionBuilder documentation.
For our streaming applications, Unreal Engine 4 and 5 have essentially the same setup. The main difference is the UI and where to find the appropriate settings and buttons. All our guides on this Wiki have been updated to feature Unreal Engine 5. If you need assistance with Unreal Engine 4 please feel free to reach out to our support team.
The OptiTrack Unreal Engine Plugins allow you to stream real-time tracking data from Motive into Unreal Engine. This includes tracking data of Rigid Bodies, Skeletons, and HMDs that are tracked within Motive. This article focuses on the organization of those plugins. For basic instructions on setting up a motion capture system, please refer to the Getting Started guide instead.
OptiTrack - Live Link
OptiTrack - Streaming Client
OptiTrack - VR Latency Optimization
A variety of head mounted displays (HMDs) can be integrated using the OptiTrack OpenVR Driver.
For plugin version 1.23 and above, support for Oculus HMDs has been deprecated.
First, follow the instructions below to set up the data streaming settings in Motive. Once this is configured, Motive will broadcast tracking data onto a designated network interface where client applications can receive it.
Open the Data Streaming Pane in Motive's Settings window and set the following settings:
Enable - Turn on the Enable setting at the top of the NatNet section.
Local Interface - Choose the desired IP network address from this dropdown to stream data over.
Loopback
This is the local computer IP address (127.0.0.1 or Localhost).
Used for streaming data locally on the PC running Motive; this does not interact with the LAN.
Good option for testing network issues.
192.168.0.1x (typical, but may be different depending on which interface is used to establish a LAN connection)
This IP address is the interface to the LAN, either over Wi-Fi or Ethernet.
This will be the same address the Client application will use to connect to Motive.
Transmission Type
For streaming over a Wi-Fi network, setting the Transmission Type to Unicast is strongly advised.
Select desired data types to stream under streaming options:
Rigid Bodies - Enabled (required).
Skeletons - Optional for Skeleton tracking.
Markers (Labeled, Unlabeled, Asset) - Disabled for HMDs (advised).
Devices - Disabled.
Skeleton Coordinates
Set to Local.
Bone Naming Convention
Set the appropriate bone naming convention for the client application. For example, if the character uses the FBX naming convention, this will need to be set to FBX.
Additional Tips
For best results, it is advised to run Motive and Unreal Engine separately on different computers, so that they are not competing for processing resources.
When streaming data over a Wi-Fi network, Unicast transmission must be used.
In order to stream data from Edit mode, a capture recording must be playing back in Motive.
For additional information on data streaming in general, read through the Data Streaming page.
The Unreal Engine: OptiTrack Live Link Plugin is primarily intended for Virtual Production applications that require use of real-time camera or Skeleton tracking in Unreal Engine. It interfaces with Unreal Engine's built-in Live Link plugin to allow streaming data from Motive both in real time and at runtime. When timecode data is available in Motive, streamed timestamps can be used to align the data with other components within the project.
The Unreal Engine: OptiTrack Streaming Client Plugin is primarily used for Virtual Reality due to its low-latency optimizations. It provides streaming of Rigid Body and Skeleton tracking data from Motive into Unreal Engine, and it also includes helper functions and actors for quick integration of the plugin into the scene.
The OptiTrack - VR Latency Optimization plugin is an optional plugin for SteamVR projects. When enabled, it provides HMD render compensation to minimize latency in VR applications. For users developing for SteamVR using the OptiTrack plugin, whether or not the OptiTrack OpenVR Driver is used, it is suggested to have this plugin enabled.
This version of the plugin is compatible with Unreal Engine 5.
OptiTrack motion capture systems can be used to track head mounted displays (HMD) and integrate the tracking data into Unreal Engine for VR applications. For instructions on integrating HMD tracking data into Unreal Engine, please refer to the corresponding page:
Supported HMDs
At the time of writing, the following HMDs are supported:
HTC VIVE
HTC VIVE Pro 1/2
Valve Index
HP Reverb G2
Deprecated support for Oculus HMDs:
Support for Oculus integration has been deprecated starting from UE plugin version 1.23; plugin version 1.22 or below must be used for Oculus HMDs.
Vive and Valve Index HMDs are supported through the OpenVR driver.
When setting up multiplayer games on wireless clients, it is more beneficial for each client to make a direct connection to both the tracking server (Motive) and the game server, rather than rebroadcasting the streamed tracking data through the game server. Any game-related actions that interact with the tracking data can then be processed on the game server, which sends the corresponding updates to the wireless clients. This allows the wireless clients to only receive tracking data and game updates without having to send back any information; in other words, it minimizes the number of data transfers needed. If wireless clients are sending data, there will be a minimum of two transfers on the wireless network, and each transfer of data through the wireless network is at risk of latency or lost packets.
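One way to realize this on the game server is to keep the actors that mirror OptiTrack data server-authoritative and let Unreal's standard replication push their transforms to the wireless clients. Below is a minimal sketch under that assumption; the function and variable names are placeholders.

```cpp
#include "GameFramework/Actor.h"

// Server-side setup: mark an actor that mirrors OptiTrack tracking data as
// replicated, so wireless clients receive its movement from the game server
// and never have to send tracking data back over the wireless network.
void ConfigureTrackedActorForReplication(AActor* TrackedActor)
{
    if (TrackedActor && TrackedActor->HasAuthority())
    {
        TrackedActor->SetReplicates(true);
        TrackedActor->SetReplicateMovement(true);
    }
}
```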
This page provides instructions on how to configure VCS inputs in Unreal Engine. The basic configuration is similar to configuring any other input triggers in Unreal Engine. Please note that only one VCS controller can be connected and configured due to some limitations. Having two controllers connected at the same time is not supported.
Create a Rigid Body from your tracking controller’s markers using the Builder pane or by selecting the markers and using the keyboard hotkey CTRL + T. You'll want to orient the controller along the +Z axis during creation to define the 'neutral' or 'zero' orientation.
In Motive, configure the data streaming settings. Use the Data Streaming pane to configure streamed packets. Make sure Rigid Body data is streamed out in order to use VCS.
Start up a project in Unreal Engine (UE).
Go to Edit tab → Plugins to open the plugins panel. Enable the Windows RawInput plugin under the Input Devices group.
In Edit tab → Project Settings, scroll to the bottom of the left side panel until you see Raw Input under the Plugins group. Here you will let the UE project know which input devices to use.
To find these IDs, you will need to look at the Windows device properties. Go to Windows Control Panel → Devices and Printers, then right-click the VCS controller to access its properties. In the properties, go to the Hardware tab and click Properties for “HID-compliant game controller”.
Once you access the controller properties, go to the Details tab. Select Hardware Ids in the drop-down menu; the vendor ID (VID) and product ID (PID) will be shown in the highlighted section.
Under the Raw Input plugin properties in the project settings panel, input both the vendor ID (VID) and the product ID (PID) that were found under the controller properties.
Register the Input Buttons
Now that the project has the IDs to identify the controller, the next step is to set up and register the input buttons. To do so, play the project scene and press the buttons to register them.
In UE, hit Play and press the tilde (~) key to access the console. In the console, enter the command "ShowDebug INPUT". This will list all of the input actions on the left side of the viewport.
Use all of the keys on the controller to register the inputs; there are three axes and seven buttons in total. Please note that these keys may not exactly match the keys on your controller.
Axis 1: Joystick left/right
Axis 2: Joystick up/down
Axis 3: Knob rotate
Button 1: Blue
Button 2: Black
Button 3: White
Button 4: Red
Button 6: Joystick click
Button 7: Knob click
Map the Registered Inputs
Now that the buttons have been registered, the next step is to map the keys. They will be mapped under Edit → Project Settings → Input. Choose either Axis Mappings or Action Mappings to map the controls to the desired actions.
Now that all of the buttons are set up, use them to control the VCS in UE.
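Once the mappings exist under Project Settings → Input, they can be bound on a pawn or camera class like any other input. The sketch below assumes axis and action mapping names such as "VCS_JoystickX" and "VCS_ButtonBlue", and a hypothetical pawn class AMyVcsPawn declared elsewhere in the project; your mapping names will depend on how you set them up.

```cpp
#include "MyVcsPawn.h" // hypothetical pawn header declaring the functions below
#include "Components/InputComponent.h"

void AMyVcsPawn::SetupPlayerInputComponent(UInputComponent* PlayerInputComponent)
{
    Super::SetupPlayerInputComponent(PlayerInputComponent);

    // Axis mapping created under Project Settings → Input → Axis Mappings.
    PlayerInputComponent->BindAxis(TEXT("VCS_JoystickX"), this, &AMyVcsPawn::OnJoystickX);

    // Action mapping created under Project Settings → Input → Action Mappings.
    PlayerInputComponent->BindAction(TEXT("VCS_ButtonBlue"), IE_Pressed, this, &AMyVcsPawn::OnBluePressed);
}

void AMyVcsPawn::OnJoystickX(float Value)
{
    // Drive the virtual camera with the axis value (e.g. zoom or focus).
}

void AMyVcsPawn::OnBluePressed()
{
    // Trigger a camera or recording action here.
}
```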
This page provides instructions on how to set up the OptiTrack Streaming Client Unreal Engine plugin. This plugin is intended for Virtual Reality customers, but can be used with many other applications.
The next step is to configure the client. Follow the instructions below to install and configure the OptiTrack Unreal Engine plugin to receive the streamed tracking data.
OptiTrack - Streaming Client Plugin (required)
Download the OptiTrack Unreal Engine plugin package from the OptiTrack website.
Extract the contents from the ZIP file.
Open the extracted OptiTrack folder and transfer the entire "OptiTrack" folder into Unreal Engine's plugin directory, located at C:\Program Files\Epic Games\5.#\Engine\Plugins (there will be other plugins in that folder already).
Open/Create a new Unreal Engine project.
Under the Edit menu, click Plugins to open up the panel where all of the available plugins are listed.
Browse to the OptiTrack section and enable the "OptiTrack - Streaming Client".
Click Apply to submit the changes. This will require the Unreal Engine project to be restarted.
Once the OptiTrack - Streaming Client plugin is enabled, the OptiTrack Client Origin actor will be available in Unreal Engine.
OptiTrack Client Origin
The OptiTrack Client Origin class enables the Unreal Engine (client) to receive the Rigid Body, Skeleton, and HMD tracking data streamed from Motive.
To add the client origin, simply drag-and-drop the OptiTrack Client Origin from the Place Actors panel into the level. Once the client origin is placed within the level, its position and orientation will represent the global origin of Motive in Unreal Engine. In other words, the tracking data will be represented relative to where this Client Origin object is positioned and oriented.
Global Origin: Both position and orientation of the OptiTrackClientOrigin will represent the global origin of the tracking volume within Motive.
[Unreal] Once the plugin is added and enabled in the project, the OptiTrack Client Origin class will be available from the Place Actors panel.
[Unreal] Drag and drop the OptiTrack Client Origin into the scene.
[Unreal] Place the OptiTrack Client Origin at the desired location within the scene.
[Unreal] Select the instantiated OptiTrackClientOrigin object from the World Outliner panel.
[Unreal] In the Details panel, make sure its Auto Connect setting is checked. This configures the client origin to automatically search the network and connect to Motive.
Now that the client origin is set, it will attempt to connect to Motive and start receiving the tracking data whenever the scene is played.
Connecting to a designated IP address
Advanced settings: Auto-initialize
By default, the auto-initialize feature is enabled and the client origin will get auto-initialized whenever the scene is played. But when needed, you can disable this and set up the project so the client origin gets initialized when a user-defined event is triggered.
[Unreal] From the Place Actors panel, search for OptiTrack Rigid Body Actor, then drag-and-drop the actor into the scene.
[Unreal] With this Rigid Body actor selected, attach the target actor that you wish to animate using the Details panel. Make sure the target actor's Mobility is set to Movable.
[Unreal] Set the relative locations and rotations to all zeroes on this target actor. This actor should be listed as a child of the Rigid Body actor.
[Motive] In Motive, assign a value to the Streaming ID property for the target Rigid Body.
[Unreal] In the properties of the OptiTrack Rigid Body Actor component, match the Tracking ID with the Streaming ID of the Rigid Body asset in Motive.
Make sure both Motive and the OptiTrack Client Origin are set up for streaming, hit Play, and the attached actor will be animated according to the live-streamed Rigid Body tracking data.
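The parenting and zero-out steps above can also be done from code when actors are spawned at runtime. The sketch below uses only standard engine attachment calls; RigidBodyActor stands for the OptiTrack Rigid Body Actor instance placed in the level and TargetActor for the mesh actor you want to animate (both names are placeholders).

```cpp
#include "GameFramework/Actor.h"
#include "Components/SceneComponent.h"

// Parents TargetActor under RigidBodyActor and zeroes out its relative
// transform, mirroring the manual setup described in the steps above.
void AttachTargetToRigidBody(AActor* RigidBodyActor, AActor* TargetActor)
{
    if (!RigidBodyActor || !TargetActor)
    {
        return;
    }

    // Keep the target movable so the streamed transform can drive it.
    if (USceneComponent* Root = TargetActor->GetRootComponent())
    {
        Root->SetMobility(EComponentMobility::Movable);
    }

    TargetActor->AttachToActor(RigidBodyActor,
        FAttachmentTransformRules::SnapToTargetNotIncludingScale);

    // Equivalent to entering zeroes for Relative Location / Rotation in the Details panel.
    TargetActor->SetActorRelativeLocation(FVector::ZeroVector);
    TargetActor->SetActorRelativeRotation(FRotator::ZeroRotator);
}
```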
When this is checked, the corresponding Rigid Body actor will be hidden from the level until the associated Rigid Body data is streamed out from Motive and received by the plugin.
The low latency update feature allows the Rigid Body position and orientation to be updated immediately before rendering, minimizing latency. This behavior is enabled by default. For debugging, you can check this setting to disable it.
This sets a specific client origin to use for receiving tracking data. When this is unset, the plugin will default to the first client origin that it finds in the scene.
When this is set to true, the Rigid Body transform data from Motive will be applied with respect to the parent actor's pivot coordinates. By default, this is set to false, and all of the tracking data is applied with respect to the pivot axis of the client origin.
When needed, you can also draw labeled marker data from Motive into the UE scene. In most applications you do not need to draw the markers, since the Rigid Body and Skeleton data will be used instead; however, having the markers generated in the scene may be helpful for debugging purposes. To enable drawing of the markers:
[UE4] Expand the OptiTrackClientOrigin (Instance) properties, and enable the Draw Markers checkbox.
Skeleton streaming is supported only in plugin versions 1.9 or above.
Follow the steps below to set up Skeleton streaming in Unreal Engine.
1. Create an Animation Blueprint in the 3D View
Step 2. Right-click the blank space in the Content Browser pane, then select Animation → Animation Blueprint.
Step 3. In the pop-up window, select OptiTrackAnimInstance as the parent class at the top and click on the target Skeleton name at the bottom. Then click OK.
Step 4. In the content browser, assign a name to the created animation blueprint.
Step 5. Drag the character blueprint into the scene.
Step 6. Select the character blueprint in the 3D View
In the Details Pane, select “+ ADD” and create a new “OptiTrack Skeleton Instance” on the model.
Set the “Source Skeleton Asset” equal to the Skeleton name in Motive.
2. Set up the Blueprint
Step 1. Double-click the animation blueprint in the content browser to open its editor.
Step 2. Right-click the animation graph, then create a new "OptiTrack Skeleton" node.
Step 3. Right-click the animation graph, then create a new "Get Streaming Client Origin" node and connect its output to the Streaming Client Origin input.
Step 4. Right-click the animation graph, then create a new "Get Source Skeleton Asset Name" node and connect its output to the Source Skeleton Asset Name input.
Step 5. Right-click the animation graph, then create a new "Component To Local" node and connect the output from "OptiTrack Skeleton" into its input.
Step 6. Connect all of the nodes together. The basic animation flow chart should look like the following.
Bone Transformation
Roll Bone Interpolation
For characters with unmapped shoulder roll bones, the Skeleton plugin will detect their existence and apply a slight twist to the roll bones to keep the swinging motion of the arms smooth. In the OptiTrack Skeleton blueprint, you can enable/disable this feature from the Roll Bone Interpolation checkbox, and you can also adjust how much twist is applied by setting the Roll Bone Blending parameter. When this parameter is set to 0, the plugin will not adjust the roll bone motion; when it is set to 1, the plugin will attempt to adjust its motion to keep the shoulder steady on the character.
Please note that this feature may not work on some characters.
3. Assign Bone Mapping
Step 1. Select the OptiTrack Skeleton plugin in the blueprint graph area.
Step 2. Drop down the Bone Mappings property in the Details Pane.
Step 3. Click “Auto Fill Bone Mapping” to automatically assign the bones in the Skeleton to the OptiTrack Skeleton names.
Note: There is no standard for bone naming conventions, so bone names may vary depending on the character. After doing the auto-fill, review the list and double-check that the auto-assigned names are correct. You may need to manually use the drop-down menu to adjust the assigned mapping for missing or incorrect items.
Step 4. Hit "Compile" in the top left to build the blueprint.
4. Set up OptiTrack Streaming
Step 1. Open the 3D View
Step 2. Search for OptiTrack Client Origin in the Modes pane.
Step 3. Drag the OptiTrack Client Origin into the 3D scene, then select it to access its properties.
(Optional) put it at 0,0,0 location.
Make sure that streaming settings on both Motive and Unreal match.
5. Click _Play_
The OptiTrack Unreal Engine Skeleton Plugin uses bone mapping, not retargeting. This means that the bone segments in Motive map directly to the character model (bone mapping), instead of being translated into something that is usable by a more abstract biped model (retargeting). Because of this, non-anatomical Skeletons will not map correctly without some additional tweaking.
Practically, this means that you will need to do things like turn off the toe mapping for characters with high heels, adjust the pelvis bone in Motive or in the model for characters with non-anatomical hip bones, and avoid bipeds that are too anatomically different from humans, such as a gorilla or swamp monster.
For example, the character sample below has both a non-anatomical pelvis and high heels. It is preferable to use character models that are more anatomically correct, but in this case, you can do a couple things to mitigate these issues:
1. Turn-off toe streaming
In the example below, since this character is wearing heels, any actor providing data for this model will also need to be wearing heels. To get around this you can just turn off the toe mapping in the OptiTrack Unreal Engine Skeleton Plugin.
2. Adjust the bone segments in Motive
The hip segment on the Countess actor is centered in the stomach rather than in the pelvis, the neck bone in Motive is a bit too long for the model, and the shoulders in Motive do not match the width of the character’s shoulders. By adjusting the bones' positions and lengths in Motive, you can make the streamed Skeleton better match the model; however, please note that there are limits to how far you can take this.
When streaming Skeleton data to animate characters that have different bone lengths compared to the mocap actor, the UE character will need to be scaled accordingly. In this case, the "Scale Bones" feature in the OptiTrack Skeleton node automatically scales the character bones to match the mocap actor. This setting is enabled by default.
The OptiTrack Unreal Engine Skeleton Plugin uses bone mapping, not retargeting. This means that the bone segments in Motive map directly to the character model (bone mapping), instead of being translated into something that is usable by a more abstract biped model (retargeting). Because of this, non-anatomical Skeletons will not map correctly without some additional tweaking. Starting from plugin version 1.23, you can tweak the alignment of the bone mapping by adding sockets to the Skeleton blueprint:
Adding Sockets to the Bone Mapping
Under the Skeleton tree, right-click on the bone that you wish to add the sockets to.
Right-click and select Add Socket.
Go to the Animation blueprint and change the bone mapping of the bone for which you created the socket, mapping it to the socket that was just created.
Play the scene, and adjust the socket location from the Skeleton Editor to adjust alignment of the bone.
[Motive] Make sure that NatNet streaming is enabled in the Data Streaming settings in Motive.
If you wish to connect to a server on a specific network address, you can uncheck the Auto Connect setting and manually enter the Server IP Address chosen in the Data Streaming settings in Motive, the Client IP Address, and the Connection Type associated with Motive. You may need to run the ipconfig command in the command prompt to obtain an appropriate IP address for the client.
Actor objects in Unreal Engine can be animated using Rigid Body tracking data from Motive. Once the OptiTrack - Streaming Client plugin is enabled in the project, the OptiTrack Rigid Body component will be available to use. By attaching this component to an actor, you can animate its child actors according to the movement of a Rigid Body in Motive. Each Rigid Body component is given a Tracking ID value which is associated with the Streaming ID of a Rigid Body in Motive. Once associated, the data from the corresponding Rigid Body will be used to update the transform of the target actor in Unreal Engine.
ID of the Rigid Body used to derive the position and orientation transform of the attached actor. This ID must match the Streaming ID of the respective Rigid Body in Motive.
[Motive] Streaming must be enabled in the Data Streaming pane.
Step 1. Navigate to a character folder. With sample characters, it is located in Characters → Heroes → [Character Name] → Meshes.
Within the animation blueprint, you can utilize other blueprint utility tools from UE to modify the streamed data. For example, Transform (Modify) Bone nodes can be included after the OptiTrack Skeleton node to apply a transform to specific Skeleton bones as needed. Please refer to Unreal Engine's documentation for more information on using animation blueprints.
See the OptiTrack Client Origin section above for more instructions on setting up the client origin.