Reconstruction and 2D Mode

An in-depth explanation of the reconstruction process and settings that affect how 3D tracking data is obtained in Motive.

Overview

Reconstruction is the process of deriving 3D points from the 2D coordinates obtained from captured camera images. When multiple synchronized images are captured, the 2D centroid locations of detected marker reflections are triangulated on each frame and processed through the solver pipeline for tracking. This involves trajectorizing the detected 3D markers within the calibrated capture volume and booting the tracking of defined assets.

  • For real-time tracking in Live mode, settings are configured under the Live Pipeline tab in the Application Settings panel. Click the Settings icon on the main toolbar to open the panel.

  • When post-processing recorded Takes in Edit mode, the solver settings are found under the corresponding Take properties.

The optimal configuration may vary depending on the capture application and environmental conditions. For most common applications, the default settings should work well.

In this page, we will focus on:

  • key system-wide Application Settings that directly impact the reconstruction outcome;

  • Camera Settings that apply to individual cameras;

  • Visual Aids related to reconstruction and tracking;

  • the Real-Time Solve process; and

  • Post-Processing Reconstruction.

Application Settings: Live Pipeline

When a camera system captures multiple synchronized 2D frames, the images are processed through two filters before they are reconstructed into 3D tracking: first through the camera hardware, then through a software filter. Both filters are important in determining which 2D reflections are identified as marker reflections and reconstructed into 3D data.

The Live Pipeline settings control tracking quality in Motive. Adjust these settings to optimize 3D data acquisition in both live reconstruction and post-processing reconstruction of captured data.

To open the panel, click the Settings icon on the main toolbar, then click Live Pipeline. The settings contain two tabs: Solver and Cameras.

Solver Settings

Motive processes marker rays based on the camera system calibration to reconstruct the respective markers. The solver settings determine how 2D data is trajectorized and solved into 3D data for tracking Rigid Bodies, Trained Markersets, and/or Skeletons. The solver combines marker ray tracking with pre-defined asset definitions to provide high-quality tracking.

The default solver settings work for most tracking applications. Users should not need to modify these settings.

Minimum Rays to Start / Minimum Rays to Continue

These settings establish the minimum number of tracked marker rays required for a 3D point to be reconstructed (to Start) or to continue being tracked (to Continue) in the Take. In other words, this is the minimum number of calibrated cameras that need to see the marker for it to be tracked.

Increasing the Minimum Rays value may prevent extraneous reconstructions. Decreasing it may prevent marker occlusions from occurring in areas with limited camera coverage.

In general, we recommend modifying these settings only for systems with either a high or very low camera count.
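
Below is a minimal, purely illustrative sketch (in Python, with invented names and default values) of how "start" versus "continue" ray requirements could gate a reconstructed point; it is not Motive's internal implementation.

```python
# Illustrative sketch only -- not Motive's internal implementation.
# Shows how "Minimum Rays to Start" / "Minimum Rays to Continue" style
# thresholds could gate whether a 3D point is created or kept.

MIN_RAYS_TO_START = 3     # hypothetical value: rays needed to create a new 3D point
MIN_RAYS_TO_CONTINUE = 2  # hypothetical value: rays needed to keep tracking it

def keep_point(already_tracked: bool, contributing_rays: int) -> bool:
    """Return True if the 3D point should be tracked on this frame."""
    required = MIN_RAYS_TO_CONTINUE if already_tracked else MIN_RAYS_TO_START
    return contributing_rays >= required

print(keep_point(already_tracked=False, contributing_rays=2))  # False: too few rays to start
print(keep_point(already_tracked=True, contributing_rays=2))   # True: enough rays to continue
```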

Additional Settings

There are other reconstruction settings on the Solver tab that affect the acquisition of 3D data. For a detailed description of each setting, please see the Settings: Live Pipeline page.

These values can be modified in a recorded Take and the 3D data reconstructed during post-processing. See the Post-Processing Reconstruction section for more information.

Cameras Tab: Camera Filters - Software

The 2D camera filter is applied by the camera each time it captures a frame. This filter examines the size and shape of each detected reflection (IR illumination) to determine which reflections are markers.

Camera filter settings apply to Live tracking only, as the filter is applied at the hardware level when the 2D frames are captured. Modifying these settings will not affect a recorded Take, as the 2D data has already been filtered and saved.

Minimum / Maximum Pixel Threshold

The Minimum and Maximum Pixel Threshold settings determine the lower and upper boundaries of the size filter. Only reflections with pixel counts within the range of these thresholds are recognized as marker reflections; reflections outside the range are filtered out.

For common applications, the default range should suffice. In a close-up capture application, marker reflections appear larger in the camera's view; in this case, you may need to increase the maximum threshold value to allow reflections with more thresholded pixels to be recognized as marker reflections.

Maximum Pixel Threshold is an advanced setting. Click the button in the upper right corner of the Cameras tab and select Show Advanced to access it.
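
The sketch below illustrates the size-filter idea with invented threshold values; it is only a conceptual example, not OptiTrack code.

```python
# Illustrative size filter: a reflection is kept only if its thresholded-pixel
# count falls within the configured minimum/maximum pixel thresholds.
# The bounds below are made up for the example.

MIN_PIXEL_THRESHOLD = 4      # hypothetical lower bound, in pixels
MAX_PIXEL_THRESHOLD = 2000   # hypothetical upper bound, in pixels

def passes_size_filter(pixel_count: int) -> bool:
    return MIN_PIXEL_THRESHOLD <= pixel_count <= MAX_PIXEL_THRESHOLD

reflections = {"small speck": 2, "marker": 120, "window glare": 5000}
for name, size in reflections.items():
    status = "marker candidate" if passes_size_filter(size) else "filtered out"
    print(f"{name}: {status}")
```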

Circularity

The camera looks for circles when determining if a given reflection is a marker, as markers are generally spheres attached to an object. When captured at an angle, a circular object may appear distorted and less round than it actually is.

The Circularity value establishes how round a reflection must be for the camera to recognize it as a marker. Only reflections with circularity values greater than the defined threshold are identified as marker reflections.

The valid range is between 0 and 1, with 0 being completely flat and 1 being perfectly round. The default value of 0.60 requires a reflection to be at least 60% circular to be identified as a marker.

The default value is sufficient for most capture applications. This setting may require adjustment when tracking assets with alternative markers (such as reflective tape) or whose shape and/or movement creates distortion in the capture.
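
To make the idea concrete, here is a small sketch using a common isoperimetric roundness score; Motive's actual circularity metric is not documented here, so treat the formula and values as illustrative assumptions.

```python
import math

# Roundness score: 4*pi*area / perimeter^2 equals 1.0 for a perfect circle
# and falls toward 0 for elongated streaks. This stands in for whatever
# metric the camera filter actually uses.

CIRCULARITY_THRESHOLD = 0.60  # default value described above

def circularity(area: float, perimeter: float) -> float:
    return 4.0 * math.pi * area / (perimeter ** 2)

def is_marker_candidate(area: float, perimeter: float) -> bool:
    return circularity(area, perimeter) >= CIRCULARITY_THRESHOLD

print(is_marker_candidate(area=314.0, perimeter=63.0))   # round blob -> True
print(is_marker_candidate(area=300.0, perimeter=140.0))  # elongated streak -> False
```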

Camera Settings

In general, the overall quality of 3D reconstructions is determined by the quality of the captured camera images.

Ensure the cameras are focused on the tracking volume and markers are clearly visible in each camera view. Adjust the F-stop on the camera if necessary, and check and optimize camera properties such as the Exposure and Threshold values.

Camera settings are configured under the Devices pane, or under the Properties pane when one or more cameras are selected. The following sections highlight the settings directly related to 3D reconstruction.

Enable Reconstruction

Tracking mode vs. Reference mode: Only cameras in a tracking mode (Object or Precision) contribute to reconstructions; cameras in a reference mode (MJPEG or Grayscale) do not. For more information, please see the Camera Video Types page.

There are three methods to switch between camera video types:

  • Click the icon under Mode for the desired camera in the Devices pane until the desired mode is selected.

  • Right-click the camera in the Cameras View of the viewport, select Video Type, then choose the desired mode from the list.

  • Select the camera and use the O, U, or I hotkeys to switch to Object, Grayscale, or MJPEG modes, respectively.

Object mode vs. Precision Mode

Object Mode and Precision Mode deliver slightly different data to the host PC:

  • In Object mode, cameras capture the 2D centroid location, size, and roundness of marker reflections and transmit that data to the host PC.

  • In Precision mode, cameras send the pixel data from the capture region to the host PC, where additional processing determines the centroid location, size, and roundness of the reflections (a conceptual sketch follows below).
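
The following is a minimal, hypothetical illustration of that host-side idea: deriving a centroid and pixel size from a set of thresholded pixel coordinates. It is not OptiTrack's actual algorithm.

```python
# Conceptual only: compute a centroid and pixel count from thresholded
# pixel coordinates, in the spirit of the host-side processing described
# for Precision mode.

def centroid_and_size(pixels):
    """pixels: list of (x, y) image coordinates above the brightness threshold."""
    n = len(pixels)
    cx = sum(x for x, _ in pixels) / n
    cy = sum(y for _, y in pixels) / n
    return (cx, cy), n

blob = [(100, 50), (101, 50), (100, 51), (101, 51), (102, 51)]
center, size = centroid_and_size(blob)
print(center, size)  # -> (100.8, 50.6) 5
```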

Threshold Setting

The Threshold value determines the minimum brightness level required for a pixel to be tracked in Motive when the camera is in a tracking mode. The Threshold setting is located in the camera properties.

Pixels with a brightness value that exceeds the configured threshold are referred to as thresholded pixels; only these are captured and processed in Motive. All other pixels are filtered out. Clusters of thresholded pixels are then filtered through the 2D object filter to determine whether they are possible marker reflections.

We do not recommend lowering the threshold below the default value of 200 as this can introduce noise and false reconstructions in the data.
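
As a simple illustration of the thresholding step (the image values below are invented; only the 200 default comes from the text above):

```python
import numpy as np

THRESHOLD = 200  # default recommended minimum brightness

# A tiny made-up 8-bit image patch.
frame = np.array([[ 12, 180, 210],
                  [205, 255,  30],
                  [ 90, 220, 199]], dtype=np.uint8)

thresholded = frame >= THRESHOLD   # mask of "thresholded pixels"
print(np.argwhere(thresholded))    # pixel coordinates passed on to the 2D object filter
```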

Visual Aids

The Viewport has an array of Visual Aids for both the 3D Perspective view and the Cameras view. This section focuses on the Visual Aids that display data relevant to reconstruction.

To select a Visual Aid in either view, click the Visual Aids button on the pane's toolbar.

Marker Rays

After the 2D camera filter has been applied, each 2D centroid captured by a camera forms a 3D vector ray, known as a Marker Ray in Motive. The marker ray connects the centroid to the 3D coordinates of the camera. Marker rays are critical to reconstruction and trajectorization.

Trajectorization is the process of using 2D data to calculate 3D marker trajectories in Motive. When the minimum required number of rays (as defined in the Minimum Rays settings) converge and intersect within the allowable maximum offset distance, the 3D marker is trajectorized. The maximum offset distance is defined by the 3D Marker Threshold setting on the Solver tab of the Live Pipeline settings.
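
For intuition, here is a self-contained sketch of intersecting two rays and checking the closest-approach residual against a maximum offset. The geometry is standard; the function names and the 3 mm threshold are assumptions for the example, not Motive's solver.

```python
import numpy as np

MAX_OFFSET_MM = 3.0  # hypothetical maximum allowed ray-to-ray offset

def triangulate(o1, d1, o2, d2):
    """Closest point between two rays (o + t*d); returns (midpoint, residual).

    Parallel rays are not handled in this sketch."""
    d1, d2 = d1 / np.linalg.norm(d1), d2 / np.linalg.norm(d2)
    w0 = o1 - o2
    a, b, c = d1 @ d1, d1 @ d2, d2 @ d2
    d, e = d1 @ w0, d2 @ w0
    denom = a * c - b * b
    t1 = (b * e - c * d) / denom
    t2 = (a * e - b * d) / denom
    p1, p2 = o1 + t1 * d1, o2 + t2 * d2
    return (p1 + p2) / 2.0, float(np.linalg.norm(p1 - p2))

# Two cameras 2 m apart (millimeters), each with a ray toward the same marker.
cam1, ray1 = np.array([0.0, 0.0, 0.0]), np.array([1.0, 1.0, 0.0])
cam2, ray2 = np.array([2000.0, 0.0, 0.0]), np.array([-1.0, 1.0, 0.0])

point, residual = triangulate(cam1, ray1, cam2, ray2)
print(point, residual, residual <= MAX_OFFSET_MM)  # rays meet near (1000, 1000, 0)
```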

Monitoring marker rays using the Visual Aids in the 3D Viewport is an efficient way to inspect reconstruction outcomes, as it shows which cameras contribute to the reconstruction of a selected marker.

There are two different types of marker rays in Motive: tracked rays and untracked rays.

Tracked Ray (Green)

Tracked rays are marker rays that contribute to 3D reconstructions within the volume.

There are three Visual options for tracked rays:

  • Show Selected: Only the rays that contribute to the reconstruction of the selected marker(s) are visible; all others are hidden. If nothing is selected, no rays are shown.

  • Show All: All tracked rays are displayed, regardless of the selection.

  • Hide All: No rays are visible.

Untracked Ray (Red)

An untracked ray does not contribute to the reconstruction of a 3D point. Untracked rays occur when reconstruction requirements, such as the minimum ray count or the maximum residual, are not met.

Untracked rays can occur from errant reflections in the volume or from areas with insufficient camera coverage.

Marker Size

Click the Visual Aids button in the Cameras View to select the Marker Size visual. This adds a label to each centroid that shows the size, in pixels, and indicates whether it falls inside or outside the boundaries of the size filter (too small or too large).

  • Markers that are within the minimum and maximum pixel thresholds are marked with a yellow crosshair at the center. The size label is shown in white.

  • Markers that are outside the boundaries of the size filter are shown with a small red X and the text Size Filter. The label is red.

Only markers that are close to the size boundaries but not within them display in red in the Cameras view. Markers whose size deviates significantly from the limits are filtered out of the Cameras view entirely.

Circularity

As noted above, the camera software filter also identifies marker reflections based on their shape, specifically their roundness. The filter assumes all marker reflections have circular shapes and filters out all non-circular reflections detected.

The allowable circularity value is defined under the Circularity setting on the Cameras tab of the Live Pipeline settings in the Application Settings panel.

Click the Visual Aids button in the Cameras View to select the Circularity visual.

  • Markers that exceed the Circularity threshold are marked with a yellow crosshair at the center. The Circularity label is shown in white.

  • Markers that are below the Circularity threshold are shown with a small red X and the text Circle Filter. The label is red.

Pixel Inspector

Technically a mouse tool rather than a visual aid, the Pixel Inspector displays the x, y coordinates and, when the camera is in a reference mode, the brightness value for individual pixels in the 2D camera view.

To enable it, click the Visual Aids button in the Cameras View to open the menu and select Pixel Inspector.

Drag the mouse to select a region in the 2D view for the selected camera, zooming in until the data is visible. Move the mouse over the region to display the values for the pixel directly below the cursor and the eight pixels surrounding it. Average values for each column and row are displayed at the top and bottom of the selected range.

If the brightness values display 0 for illuminated pixels, the camera is in a tracking mode. Change the video mode to Grayscale or MJPEG to display brightness values.
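
A rough sketch of what such an inspector reports for a cursor position, using an invented image and simple NumPy slicing:

```python
import numpy as np

# Invented grayscale frame; in practice this would be the camera's reference image.
frame = np.random.default_rng(0).integers(0, 256, size=(480, 640), dtype=np.uint8)

def inspect(image, x, y):
    """Return the 3x3 neighborhood around (x, y) plus its row and column averages."""
    patch = image[y - 1:y + 2, x - 1:x + 2]
    return patch, patch.mean(axis=1), patch.mean(axis=0)

patch, row_avg, col_avg = inspect(frame, x=320, y=240)
print(patch)    # pixel under the cursor is patch[1, 1]
print(row_avg)  # average of each row in the selection
print(col_avg)  # average of each column in the selection
```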

Real-time Solve

Motive performs real-time reconstruction of 3D coordinates from 2D data in:

  • Live mode (using live 2D data capture)

  • 2D Edit mode (using recorded 2D data)

When Motive is processing in real time, you can examine the marker rays and other visuals in the viewport, review and modify the Live Pipeline settings, and otherwise optimize the 3D data acquisition.

Live Mode

In Live mode, any changes to the Live Pipeline settings (on either the Solver or Cameras tab) are reflected immediately in the Live capture.

2D Edit Mode

When a capture is recorded in Motive, both 2D camera data and reconstructed 3D data are saved into the Take file. By default, the 3D data is loaded when the recorded Take file is opened.

Recorded 3D data contains the 3D coordinates that were live-reconstructed at the moment of capture and is independent of the 2D data once recorded. However, you can still view and edit the recorded 2D data to optimize the solver parameters and reconstruct a fresh set of 3D data from it.

2D Edit Mode is used in the post-processing of a captured Take. Playback in Edit 2D performs a live reconstruction of the 3D data, immediately reflecting changes made to settings or assets. These changes are not applied to the recording until the Take is reprocessed and saved.

Post-Processing Reconstruction

Open 2D Edit mode

Click the Edit button in the Control Deck and select EDIT 2D from the list.

Alternatively, click the button in the top right corner of the Viewport and select 2D Mode.

Update the Reconstruction Settings

Changes made to the Solver or Camera filter configurations in the Live Pipeline settings do not affect recorded data. Instead, these values are adjusted in a recorded Take from the Take properties.

Select the Take in the Data pane to display the Camera Filter values and Solver properties that were in effect when the recording was made. These values can be adjusted and the 3D data reconstructed as part of the post-processing workflow.

To see additional settings not shown here, click the button in the top right corner of the pane and select Show Advanced.

Applying changes to 3D data

Once the reconstruction/solver settings are optimized for the recorded data, it's time to perform the post-processing reconstruction pipeline on the Take to reconstruct a new set of 3D data.

This step overwrites the existing 3D data and discards all of the post-processing edits completed on that data, including edits to the marker labels and trajectories.

Additionally, recorded Skeleton marker labels, which were intact during the live capture, may be discarded, and the reconstructed markers may not be auto-labeled correctly again if the Skeletons are never in well-trackable poses during the captured Take. This is another reason to always start a capture with a good calibration pose (e.g., a T-pose).

Right-click the Take in the Data pane to open the context menu. The post-processing options are in the third section from the top.

There are three options to reconstruct 3D data:

  • Reconstruct: Creates a new 3D data set.

  • Reconstruct and Auto-Label: Creates a new 3D data set and auto-labels markers in the Take based on existing asset definitions. To learn more about the auto-labeling process, please see the Labeling page.

  • Reconstruct, Auto-Label and Solve: Creates a new 3D data set, auto-labels, and solves all assets in the Take. When an asset is solved, Motive stores the tracking data for the asset in the Take, then reads from that solved data to recreate and track the asset in the scene.

Post-processing reconstruction can be performed on the entire frame range in a Take or applied to a specified frame range by selecting the range in the Control Deck or in the Graph pane. When nothing is selected, reconstruction is applied to all frames.

Multiple Takes can be selected and processed together by holding the Shift key while clicking the Takes in the Data pane. When multiple Takes are selected, the reconstruction applies to the entire frame range of every Take in the selection.

