Unreal Engine: Live Link Camera Stream Setup

A setup guide for streaming video cameras from Motive using the Camera Role in Unreal Engine.

Overview

Motive’s Live Link plugin for Unreal Engine can stream tracked Rigid Bodies for Virtual Production and In-Camera VFX (ICVFX).

Motive tracks video cameras using a CinePuck, an active device equipped with an IMU. The CinePuck mounts directly to the video camera and, when aligned with the camera lens as a Rigid Body, tracks the camera's focal point so it can be replicated virtually in the Unreal Engine environment.

The Camera Role in Unreal Engine can now be used for the tracked camera Rigid Body, which means a lens file can be connected to the associated Live Link Camera Role asset. In Unreal Engine, users can calibrate for lens distortion and nodal offset, and use the camera with a lens encoder, if desired.
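For reference, the distinction between the Transform and Camera roles can also be checked from C++ through the Live Link client interface. The sketch below is a minimal, hedged example: it assumes a Rigid Body subject named HandCam2 (the placeholder name used throughout this guide), and exact method names may vary slightly between engine versions.

```cpp
// Minimal sketch: ask the Live Link client whether a Motive subject can be
// evaluated with the Camera role (rather than only the basic Transform role).
// "HandCam2" is the example Rigid Body name used in this guide.
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkCameraRole.h"
#include "Roles/LiveLinkTransformRole.h"

bool SupportsCameraRole(const FName SubjectName)
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return false; // Live Link is not available in this build/session.
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    const FLiveLinkSubjectName Subject(SubjectName);

    // Every Rigid Body streamed from Motive supports the Transform role;
    // only subjects streamed as cameras also support the Camera role.
    const bool bTransform = Client.DoesSubjectSupportsRole(Subject, ULiveLinkTransformRole::StaticClass());
    const bool bCamera    = Client.DoesSubjectSupportsRole(Subject, ULiveLinkCameraRole::StaticClass());

    return bTransform && bCamera;
}
```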

This document provides instructions to set up and link a Rigid Body in Motive to a Live Link Camera Role in Unreal Engine. For information on ICVFX production in general, please see the page Unreal Engine: OptiTrack InCamera VFX.

We used the standard InCameraVFX template in Unreal Engine for our sample project. The template includes all the macros and assets needed for virtual production.

This template is located under Film / Video & Live Events in the Unreal Project Browser.

Create Camera Asset in Motive

Motive will stream the camera as a Live Link Rigid Body asset.

Connect the CinePuck and BaseStation

  • Plug the BaseStation into one of the Power over Ethernet (PoE) switches on the OptiTrack camera network.

  • Firmly attach the CinePuck to the Studio Camera using the SmallRig NATO Rail and Clamps on the cage of the camera.

  • The CinePuck can be mounted anywhere on the camera, but for best results, place it close to the lens.

  • Power on the CinePuck, and let it calibrate the IMU bias. The lights will flash red and orange during calibration and change to green when done.

We recommend powering the CinePuck with a USB cable while filming to avoid running out of battery power. A light on the CinePuck will indicate when the power is connected.

Create a Rigid Body

A CinePuck is tracked in Motive as a Rigid Body asset. For detailed instructions on creating a Rigid Body asset, please see the Builder pane page. For instructions on pairing the IMU, please see the IMU Sensor Fusion page.

Attach the Camera Role in Unreal Engine

  • In the Live Link panel, add a new source: select OptiTrack Source and check Connect Automatically. Enter the IP address of the Motive PC in the Server Address field and the IP address of the Unreal PC in the Client Address field, or enter 127.0.0.1 in both fields if Motive and Unreal Engine run on the same PC. Leave the Connection Type as Multicast and click Create.

  • This will display a list of assets streaming from Motive. Note that the Rigid Body camera, named HandCam2 in this example, is currently in the Transform role.

  • Click OptiTrack in the Subject list to display the OptiTrack Live Link Properties.

  • In the Live Link Roles section, click the Live Link Subject dropdown. This will open the list of Assets streaming from Motive.

  • Select the camera asset, HandCam2 in this example.

  • Directly below the Live Link Subject field is a check box labeled Stream as Camera (Rigid Bodies Only). Check this box to apply the Camera Role to the camera Rigid Body (HandCam2). A short verification sketch in C++ follows this list.
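Once the box is checked, the subject should evaluate with the Camera role. The following is a hedged verification sketch, assuming the example Rigid Body name HandCam2 and an active Live Link connection; it reads one frame of camera data through the Live Link client, and API details may differ slightly between engine versions.

```cpp
// Verification sketch: evaluate the "HandCam2" subject with the Camera role
// and log its transform and focal length. Assumes the Rigid Body is streaming
// as a camera from Motive.
#include "Features/IModularFeatures.h"
#include "ILiveLinkClient.h"
#include "Roles/LiveLinkCameraRole.h"
#include "Roles/LiveLinkCameraTypes.h"

void LogCameraSubject(const FName SubjectName)
{
    IModularFeatures& Features = IModularFeatures::Get();
    if (!Features.IsModularFeatureAvailable(ILiveLinkClient::ModularFeatureName))
    {
        return;
    }

    ILiveLinkClient& Client =
        Features.GetModularFeature<ILiveLinkClient>(ILiveLinkClient::ModularFeatureName);

    FLiveLinkSubjectFrameData FrameData;
    if (!Client.EvaluateFrame_AnyThread(SubjectName, ULiveLinkCameraRole::StaticClass(), FrameData))
    {
        UE_LOG(LogTemp, Warning, TEXT("%s is not evaluating with the Camera role."), *SubjectName.ToString());
        return;
    }

    // Camera frame data extends transform frame data with lens information.
    if (const FLiveLinkCameraFrameData* Camera = FrameData.FrameData.Cast<FLiveLinkCameraFrameData>())
    {
        UE_LOG(LogTemp, Log, TEXT("%s location: %s, focal length: %f"),
               *SubjectName.ToString(),
               *Camera->Transform.GetLocation().ToString(),
               Camera->FocalLength);
    }
}
```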

Configure Camera for Streaming

Add OptiTrack LiveLinkDisplay

Move all the NDisplay assets to nest under OptitrackLiveLinkDisplay in the Outliner pane.
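If you prefer to script this re-parenting rather than drag actors in the Outliner, the hedged sketch below attaches a set of existing actors under a parent actor. The actor names are assumptions from this guide's example scene; the OptitrackLiveLinkDisplay actor and the nDisplay actors must already exist in the level.

```cpp
// Sketch: parent existing nDisplay-related actors under the
// OptitrackLiveLinkDisplay actor, mirroring the Outliner drag-and-drop step.
#include "GameFramework/Actor.h"

void NestUnderDisplayRoot(AActor* DisplayRoot, const TArray<AActor*>& NDisplayActors)
{
    if (!DisplayRoot)
    {
        return;
    }

    for (AActor* Child : NDisplayActors)
    {
        if (Child)
        {
            // Keep each actor where it is in the world while re-parenting it.
            Child->AttachToActor(DisplayRoot, FAttachmentTransformRules::KeepWorldTransform);
        }
    }
}
```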

Add the LiveLinkController

  • Select the video camera, now nested under OptitrackLiveLinkDisplay in the Outliner pane.

  • In the camera's Details pane, add a component: search for and select the Live Link Controller component.

  • Properties for the new controller will appear in the camera's Details pane.

  • Click the dropdown next to Subject Representation and select the camera Rigid Body.

  • The Subject and Role fields will both update to display the name of the Rigid Body; a C++ equivalent of these steps is sketched below.

The LiveLinkController label can only be applied to Rigid Body assets.
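The editor steps above can also be performed from code. The sketch below is a minimal example, assuming a CineCameraActor already placed in the level and the example subject name HandCam2; it requires the LiveLinkComponents module, and setter names may vary by engine version.

```cpp
// Minimal sketch: add a Live Link Controller component to a camera actor and
// point it at the Motive Rigid Body, evaluated with the Camera role.
// "HandCam2" is the example Rigid Body name used in this guide.
#include "CineCameraActor.h"
#include "LiveLinkComponentController.h"
#include "Roles/LiveLinkCameraRole.h"

void AttachLiveLinkController(ACineCameraActor* CameraActor)
{
    if (!CameraActor)
    {
        return;
    }

    // Create and register the Live Link Controller component on the camera.
    ULiveLinkComponentController* Controller =
        NewObject<ULiveLinkComponentController>(CameraActor, TEXT("LiveLinkController"));
    CameraActor->AddInstanceComponent(Controller);
    Controller->RegisterComponent();

    // Subject Representation: the streamed Rigid Body plus the role used to
    // evaluate it. Using the Camera role drives the camera, not just a transform.
    FLiveLinkSubjectRepresentation Representation;
    Representation.Subject = FName(TEXT("HandCam2"));
    Representation.Role = ULiveLinkCameraRole::StaticClass();
    Controller->SetSubjectRepresentation(Representation);
}
```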

Camera Calibration

As noted above, Unreal Engine includes properties that allow you to calibrate a video camera streaming into the engine, including the ability to use a lens encoder with the camera.

Lens Encoder

When using a Lens Encoder or a calibrated Lens in Unreal Engine, use the Lens File Picker settings to select the appropriate Lens File.
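As a rough illustration only, the sketch below assigns a calibrated lens file from C++. It assumes UE 5.x with the Live Link Camera and Camera Calibration plugins enabled; the LensFilePicker property and its fields here are assumptions about the Live Link Camera controller and should be verified against your engine version.

```cpp
// Hedged sketch: point a Live Link Camera controller at an explicit lens file
// instead of the project-wide default. Property names (LensFilePicker,
// bUseDefaultLensFile, LensFile) are assumptions; verify for your version.
#include "LiveLinkCameraController.h"
#include "LensFile.h"

void UseCalibratedLensFile(ULiveLinkCameraController* CameraController, ULensFile* CalibratedLensFile)
{
    if (!CameraController || !CalibratedLensFile)
    {
        return;
    }

    // Disable the default lens file so this explicit calibration is used for
    // distortion and nodal offset data.
    CameraController->LensFilePicker.bUseDefaultLensFile = false;
    CameraController->LensFilePicker.LensFile = CalibratedLensFile;
}
```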

Please see Camera Lens Calibration Overview in Unreal Engine for more information on calibrating cameras and working with lens files in Unreal Engine.
