Motive API: Quick Start Guide

An overview of the Motive API.




Overview

SDK/API Support Disclaimer

OptiTrack provides developer tools to enable our customers across a broad set of applications to utilize their systems in the ways that best suit them. The Motive API, the NatNet SDK, and the Camera SDK are designed to enable experienced software developers to integrate data transfer and/or system operation with their preferred systems and pipelines. Sample projects are provided with each tool, and we strongly recommend referencing or using the samples as reliable starting points.

The following list specifies the range of support OptiTrack provides for the SDK and API tools:

  • Using the SDK/API tools requires background knowledge of software development. We do not provide support for basic project setup, compiling, and linking when using the SDK/API to create your own applications.

  • We ensure the SDK tools and their libraries work as intended. We do not provide support for custom-developed applications that users have programmed or modified using the SDK tools.

  • Ticketed support is provided for licensed Motive users using the Motive API and/or the NatNet SDK tools from the included libraries and sample source code only.

  • The Camera SDK is a free product. We do not provide free ticketed support for it.

  • For other questions, please check out the community forums. Very often, similar development issues are reported and solved there.

This guide provides detailed instructions for commonly used functions of the Motive API for developing custom applications. For a full list of functions, refer to the Motive API: Function Reference page. For a sample use case of the API functions, please check out the provided sample project.

In this guide, the following topics will be covered:

  • Library files and header files

  • Initialization and shutdown

  • Capture setup (Calibration)

  • Configuring camera settings

  • Updating captured frames

  • 3D marker tracking

  • Rigid body tracking

  • Data streaming

Environment Setup

Library Files

When developing a Motive API project, the linker needs to know where to find the required library files. Do this either by specifying their location in the project settings or by copying the files into the project folder.


Motive API libraries (.lib and .dll) are in the lib folder within the Motive install directory, located by default at C:\Program Files\OptiTrack\Motive\lib. This folder contains the 64-bit library files (MotiveAPI.dll and MotiveAPI.lib).

When using the API library, all of the required DLL files must be located in the executable directory. If necessary, copy and paste the MotiveAPI.dll file into the folder with the executable file.

Third-party Libraries

  • Additional third-party libraries are required by the Motive API, and most of the DLL files for these libraries can be found in the Motive install directory C:\Program Files\OptiTrack\Motive\. Copy all of the required DLL files from the Motive installation directory into the directory of the Motive API project to use them.

  • Lastly, copy the C:\Program Files\OptiTrack\Motive\plugins\platforms folder and its contents into the EXE folder since those libraries will also be used.
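The library, header, and DLL-copy steps above can be sketched as a build configuration. The fragment below is a hypothetical CMake example, not an official project file (the project and target names are made up, and the Motive install path is the default cited above; the provided samples ship as Visual Studio projects):

```cmake
# Hypothetical setup; adjust MOTIVE_DIR to your actual Motive install location.
cmake_minimum_required(VERSION 3.16)
project(MotiveApiSample CXX)

set(MOTIVE_DIR "C:/Program Files/OptiTrack/Motive")

add_executable(sample main.cpp)

# Header files (MotiveAPI.h) live in the inc folder.
target_include_directories(sample PRIVATE "${MOTIVE_DIR}/inc")

# Link against the import library in the lib folder.
target_link_libraries(sample PRIVATE "${MOTIVE_DIR}/lib/MotiveAPI.lib")

# Copy the runtime DLL next to the executable, as described above.
add_custom_command(TARGET sample POST_BUILD
    COMMAND ${CMAKE_COMMAND} -E copy_if_different
        "${MOTIVE_DIR}/lib/MotiveAPI.dll" $<TARGET_FILE_DIR:sample>)
```

The remaining third-party DLLs and the plugins\platforms folder still need to be copied into the executable directory, as described in the bullets above.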

Header Files

Function declarations and classes are contained in the header file MotiveAPI.h, located in the folder C:\Program Files\OptiTrack\Motive\inc\.

Always add an include directive for the MotiveAPI.h header file (#include "MotiveAPI.h") in all programs that are developed against the Motive API.

Motive Files

The Motive API, by default, loads the default calibration (MCAL) and application profile (MOTIVE) files from the ProgramData directory unless otherwise specified. Motive also loads these files at startup. They are located in the following folders:

  • Default System Calibration: C:\ProgramData\OptiTrack\Motive\System Calibration.mcal

  • Default Application Profile: C:\ProgramData\OptiTrack\MotiveProfile.motive

Both files can be exported from and imported into Motive as needed for the project.

Initialization and Shutdown

When using the API, connected devices and the Motive API library need to be properly initialized at the beginning of a program and closed down at the end.

Initialization

Initialize(); // Initializing all connected cameras


Update

Initialize(); // Initializing all connected cameras

Update();     // Update for newly arrived cameras

Shutdown

Shutdown(); // Closing down all of the connected cameras

Setup the Project

Motive Application Profile

The Motive application profile (MOTIVE) stores the following critical information:

  • All the trackable assets involved in a capture

  • Software configurations

LoadProfile("UserProfile.motive"); // Loading application profile, UserProfile.motive

Camera Calibration

Cameras must be calibrated to track in 3D space. Because camera calibration is a complex process, it is easiest to calibrate the camera system in Motive, export the camera calibration file (MCAL), and then load the exported file into custom applications developed against the API.

LoadCalibration("CameraCal.mcal"); // Loading MCAL file

Loading Calibration

When the calibration file is successfully loaded, you will be able to obtain 3D tracking data using the API functions.

  • Calibration Files: When using an exported calibration file, make sure it remains a valid calibration. The file will no longer be valid if any aspect of the system setup has changed since the calibration, including any quality degradation that can occur over time due to environmental factors. For this reason, we recommend re-calibrating the system routinely to guarantee the best tracking quality.

  • Tracking Bars: For tracking bars, camera calibration is not required for tracking 3D points.

Camera Settings

Connected cameras are accessible by index numbers, which are assigned in the order the cameras are initialized. Most API functions for controlling cameras require the camera's index value.

This section covers Motive API functions to check and configure camera frame rate, camera video type, camera exposure, pixel brightness threshold, and IR illumination intensity.

Fetching Camera Settings

Use the CameraProperty function to fetch the current value of a camera property.

CameraProperty( int cameraIndex, const std::wstring& propertyName );

Configuring Settings

Use the SetCameraProperty function to configure properties outlined below.

SetCameraProperty( int cameraIndex, const std::wstring& propertyName, const sPropertyValue& value );

CameraNodeCameraEnabled

A Boolean value to indicate whether the camera is enabled (true) or disabled (false).

CameraNodeReconstructionEnabled

A Boolean value to indicate whether the selected camera will contribute to the real-time reconstruction of 3D data. Set the value to true to enable or false to disable.

CameraNodeImagerPixelSize

Length and width (in pixels) of the camera imager.

CameraNodeCameraVideoMode

An integer value that sets the video mode for the selected camera.

Video Mode      Value
Segment         0
Grayscale       1
Object          2
Precision       4
MJPEG           6
Color Video     9

CameraNodeCameraExposure

An integer value that sets the exposure for the selected camera.

CameraNodeCameraThreshold

An integer value that sets the minimum brightness threshold for pixel detection for the selected camera. The valid threshold range is 0 to 255.

CameraNodeCameraLED

A Boolean value to indicate whether the camera's LED lights are enabled (true) or disabled (false).

CameraNodeCameraIRFilterEnabled

A Boolean value to indicate whether the camera's IR filter is enabled (true) or disabled (false).

CameraNodeCameraGain

An integer value that sets the imager gain for the selected camera.

CameraNodeCameraFrameRate

An integer value that sets the frame rate for the selected camera.

Applicable values vary based on camera models. Refer to the hardware specifications for the selected camera type to determine the frame rates at which it can record.

CameraNodeCameraMJPEGQuality

An integer value that sets the video quality level of MJPEG mode for the selected camera.

MJPEG Quality       Value
Minimum Quality     0
Low Quality         1
Standard Quality    2
High Quality        3

CameraNodeCameraMaximizePower

A Boolean value to indicate whether High Power mode is enabled (true) or disabled (false). This applies to the Slim 3U and Flex 3 cameras only.

CameraNodeBitrate

An integer value that sets the bitrate for the selected camera.

CameraNodePartition

An integer value that sets the partition for the selected camera.

CameraNodeFirmwareVersion

A string value (std::wstring) that reports the firmware version of the selected camera.

CameraNodeLogicVersion

A string value (std::wstring) that reports the logic version of the selected camera.

Other Settings

Updating the Frames

#include <conio.h>      // for _kbhit()
#include "MotiveAPI.h"

int main()
{
    Initialize();

    int frameCounter = 0; // Frame counter variable

    while (!_kbhit())
    {
        if (Update() == eRESULT_SUCCESS)
        {
            // Each time the Update function successfully updates the frame,
            // the frame counter is incremented and the new frame is processed.
            frameCounter++;

            ////// PROCESS NEW FRAME //////
        }
    }

    Shutdown(); // Release the cameras before exiting
    return 0;
}

Update vs. UpdateSingleFrame

At the most fundamental level, these two functions both update the incoming camera frames, but may act differently in certain situations. When a client application stalls momentarily, it can get behind on updating the frames and the unprocessed frames may accumulate. In this situation, these two functions behave differently.

In general, we recommend using the Update function. Only use UpdateSingleFrame in the case when you need to ensure the client application has access to every frame of tracking data and you are not able to call Update in a timely fashion.

Update()            // Process all outstanding frames of data.
UpdateSingleFrame() // Process one outstanding frame of data.
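As an analogy only (this is not Motive API code), the two behaviors can be modeled with a toy frame queue: Update jumps to the newest outstanding frame and discards the backlog, while UpdateSingleFrame consumes exactly one outstanding frame per call.

```cpp
#include <cassert>
#include <deque>

// Toy model of the frame backlog. The function names are illustrative.
// DrainAll mimics Update(): process all outstanding frames, i.e. end up
// at the most recent one. PopOne mimics UpdateSingleFrame(): process
// exactly one outstanding frame per call, never skipping any.
int DrainAll(std::deque<int>& frames)
{
    int latest = frames.back(); // the newest frame wins
    frames.clear();             // backlog is discarded
    return latest;
}

int PopOne(std::deque<int>& frames)
{
    int next = frames.front();  // oldest unprocessed frame
    frames.pop_front();         // the rest of the backlog remains
    return next;
}
```

With a backlog of frames {1, 2, 3}, DrainAll returns frame 3 and empties the queue, while PopOne returns frame 1 and leaves frames 2 and 3 pending, which is why UpdateSingleFrame can fall further behind if it is not called fast enough.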

3D Marker Tracking

Marker Index

In a given frame, each reconstructed marker is assigned a marker index number, which is used to point to a particular reconstruction within a frame. Marker index values may vary between different frames, but unique identifiers always remain the same.

Marker Position

To obtain the 3D position of a reconstructed marker, use the MarkerXYZ function.

int totalMarker = MarkerCount();
printf("Frame #%d: (Markers: %d)\n", frameCounter, totalMarker);

float x = 0.0f;
float y = 0.0f;
float z = 0.0f;

//== Use a loop to access every marker in the frame ==//
for (int i = 0; i < totalMarker; i++) {
    MarkerXYZ(i, x, y, z);
    printf("\tMarker #%d:\t(%.2f,\t%.2f,\t%.2f)\n\n", i, x, y, z);
}

Rigid Body Tracking

This section covers functions for tracking Rigid Bodies using the Motive API.

To track the 6 degrees of freedom (DoF) movement of a rigid object, attach a set of reflective markers to it and use the markers to create a trackable Rigid Body asset.

There are two methods for obtaining Rigid Body assets when using the Motive API:

  • Import existing Rigid Body data.

  • Create new Rigid Body assets directly through the API.

Once Rigid Body assets are defined, Rigid Body tracking functions can be used to obtain the 6 DoF tracking data.

Importing Rigid Body Assets

Let's go through importing RB assets into a client application using the API. In Motive, Rigid Body assets can be created from three or more reconstructed markers, and all of the created assets can be exported into an application profile (MOTIVE) file. Each Rigid Body asset saves the marker arrangement from when it was first created. As long as the marker locations remain the same, you can use saved asset definitions for tracking the respective objects.

Exporting all RB assets from Motive:

  • Exporting application profile: File → Save Profile

Exporting individual RB asset:

LoadProfile("UserProfile.motive");      // Loading application profile, UserProfile.motive
LoadRigidBodies("asset1.motive");       // Replaces RBs with RBs from "asset1.motive"
AddRigidBodies("asset1.motive");        // Adds RBs from file to already existing RBs
SaveRigidBodies("asset1.motive");       // Saves RBs from RB list to file

Creating New Rigid Body Assets

Rigid Body assets can be defined directly using the API. The CreateRigidBody function defines a new Rigid Body from given 3D coordinates. This function takes in an array of float values representing the x/y/z coordinates of multiple markers with respect to the Rigid Body pivot point. The float array for multiple markers should be listed as follows: {x1, y1, z1, x2, y2, z2, …, xN, yN, zN}. You can manually enter the coordinate values or use the MarkerXYZ function to input the 3D coordinates of tracked markers.

When using the MarkerXYZ function, keep in mind that marker locations must be expressed with respect to the Rigid Body pivot point. To set the pivot point at the center of the created Rigid Body, first compute the pivot point location, then subtract its coordinates from the 3D coordinates of the markers obtained by the MarkerXYZ function. This process is shown in the following example.

CreateRigidBody(const wchar_t* name, int id, int markerCount, float* markerList);

Example: Creating RB Assets

int markerCount = MarkerCount();
vector<float> markerListRelativeToGlobal;
markerListRelativeToGlobal.reserve(3 * markerCount);

// Add each marker's global coordinates using MarkerXYZ
float x = 0.0f;
float y = 0.0f;
float z = 0.0f;

for (int i = 0; i < markerCount; ++i)
{
    MarkerXYZ(i, x, y, z);
    markerListRelativeToGlobal.push_back(x);
    markerListRelativeToGlobal.push_back(y);
    markerListRelativeToGlobal.push_back(z);
}

// Then average the locations in x, y, and z to find the centroid
float sx = 0.0f, sy = 0.0f, sz = 0.0f;
for (int i = 0; i < markerCount; ++i)
{
    sx += markerListRelativeToGlobal[3 * i];
    sy += markerListRelativeToGlobal[3 * i + 1];
    sz += markerListRelativeToGlobal[3 * i + 2];
}

float ax = sx / markerCount;
float ay = sy / markerCount;
float az = sz / markerCount;

// Subtract the pivot point (centroid) location from each marker location
vector<float> markerListRelativeToPivotPoint;
markerListRelativeToPivotPoint.reserve(3 * markerCount);

for (int i = 0; i < markerCount; ++i)
{
    markerListRelativeToPivotPoint.push_back(markerListRelativeToGlobal[3 * i] - ax);
    markerListRelativeToPivotPoint.push_back(markerListRelativeToGlobal[3 * i + 1] - ay);
    markerListRelativeToPivotPoint.push_back(markerListRelativeToGlobal[3 * i + 2] - az);
}

CreateRigidBody(L"Rigid Body New", 1, markerCount, markerListRelativeToPivotPoint.data());
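The centroid-centering arithmetic used above can also be factored into a small, API-independent helper. The function name below is illustrative, not part of the Motive API; it operates on a flat {x1, y1, z1, x2, y2, z2, …} list like the one CreateRigidBody expects:

```cpp
#include <cassert>
#include <cmath>
#include <vector>

// Illustrative helper (not a Motive API call): given a flat marker list in
// global coordinates, compute the centroid and return the same markers
// expressed relative to that centroid, which places the Rigid Body pivot
// point at the center of the marker arrangement.
std::vector<float> CenterMarkersOnCentroid(const std::vector<float>& global)
{
    const int markerCount = static_cast<int>(global.size() / 3);
    if (markerCount == 0)
        return {};

    // Average the x, y, and z coordinates to find the centroid.
    float ax = 0.0f, ay = 0.0f, az = 0.0f;
    for (int i = 0; i < markerCount; ++i)
    {
        ax += global[3 * i];
        ay += global[3 * i + 1];
        az += global[3 * i + 2];
    }
    ax /= markerCount; ay /= markerCount; az /= markerCount;

    // Subtract the centroid from each marker location.
    std::vector<float> local;
    local.reserve(global.size());
    for (int i = 0; i < markerCount; ++i)
    {
        local.push_back(global[3 * i] - ax);
        local.push_back(global[3 * i + 1] - ay);
        local.push_back(global[3 * i + 2] - az);
    }
    return local;
}
```

For example, two markers at global positions (0, 0, 0) and (2, 0, 0) have their centroid at (1, 0, 0), so the pivot-relative list becomes (-1, 0, 0) and (1, 0, 0).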

Rigid Body 6 DoF Tracking Data

6 DoF Rigid Body tracking data can be obtained using the RigidBodyTransform function. Using this function, you can save the 3D position and orientation of a Rigid Body into declared variables. The saved position values indicate the location of the Rigid Body pivot point, represented with respect to the global coordinate axes. The orientation is saved in both Euler and quaternion representations.

RigidBodyTransform( int rbIndex,                                  //== Rigid Body index
                    float *x, float *y, float *z,                 //== Position
                    float *qx, float *qy, float *qz, float *qw,   //== Quaternion
                    float *yaw, float *pitch, float *roll );      //== Euler

Example: RB Tracking Data

//== Declared variables ==//
float x, y, z;
float qx, qy, qz, qw;
float yaw, pitch, roll;
int rbCount = RigidBodyCount();

for (int i = 0; i < rbCount; i++)
{
    //== Obtaining/saving the Rigid Body position and orientation ==//
    RigidBodyTransform( i, &x, &y, &z, &qx, &qy, &qz, &qw, &yaw, &pitch, &roll );

    if( IsRigidBodyTracked( i ) )
    {
        wchar_t name[ 256 ];
        RigidBodyName( i, name, 256 );
        wprintf( L"\n%s: Pos (%.3f, %.3f, %.3f) Orient (%.1f, %.1f, %.1f)\n",
                 name, x, y, z, yaw, pitch, roll );
    }
}
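To make the pose data concrete: the position and unit quaternion returned by RigidBodyTransform define a local-to-global transform. The helper below is standard quaternion math, not a Motive API call (the struct and function names are illustrative), showing how a point expressed in the Rigid Body's local frame maps into global coordinates:

```cpp
#include <cassert>
#include <cmath>

struct Vec3 { float x, y, z; };

// Rotate a local-frame point by the unit quaternion (qx, qy, qz, qw), then
// translate by the pivot position (x, y, z). Uses the standard expansion
// v' = v + 2 * q_vec x (q_vec x v + qw * v) of the sandwich product q v q*.
Vec3 LocalToGlobal(Vec3 p, float x, float y, float z,
                   float qx, float qy, float qz, float qw)
{
    // c1 = q_vec x v + qw * v
    float cx1 = qy * p.z - qz * p.y + qw * p.x;
    float cy1 = qz * p.x - qx * p.z + qw * p.y;
    float cz1 = qx * p.y - qy * p.x + qw * p.z;

    // c2 = q_vec x c1
    float cx2 = qy * cz1 - qz * cy1;
    float cy2 = qz * cx1 - qx * cz1;
    float cz2 = qx * cy1 - qy * cx1;

    // Rotated point plus the pivot translation.
    return { p.x + 2.0f * cx2 + x,
             p.y + 2.0f * cy2 + y,
             p.z + 2.0f * cz2 + z };
}
```

For instance, a 90-degree rotation about the vertical axis, q = (0, 0, sin 45°, cos 45°), carries the local point (1, 0, 0) to (0, 1, 0) before the pivot translation is added.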

Rigid Body Properties

RigidBodyProperty(int rbIndex, const std::wstring& propertyName);
SetRigidBodyProperty(int rbIndex, const std::wstring& propertyName, const sPropertyValue& value);
Use RigidBodyProperty to query and SetRigidBodyProperty to modify the following Rigid Body properties:

NodeName                 (String)
AssetName                (String)
GeometryYawPitchRoll     (eVector3f)
BoneMajorAxis            (Int)
DefaultBoneLength        (double)
DefaultBoneDiameter      (double)
JointName                (String)
ParentInfo               (String)
ChildInfo                (String)
JointVisible             (Bool)
JointType                (String)
DegreesOfFreedom         (Int)
RotationOrder            (Int)
RotationOffset           (eRotationf)
TranslationOffset        (eVector3f)
TipOffset                (eVector3f)
AssetVisible             (Bool)
Comment                  (String)
MinimumBootingLabels     (Int)
MinimumMarkerCount       (Int)
MinimumBootingActive     (Int)
Scale                    (double)
SyntheticLabelGraphScale (double)
ShowLabel                (Bool)
ShowIMUState             (Int)
DisplayTracked           (Bool)
Color                    (Int)
ShowBones                (Bool)
BoneColor                (Int)
ShowAxis                 (Bool)
DisplayPositionHistory   (Bool)
DisplayHistoryLength     (Int)
ShowDOF                  (Bool)
ShowMarkerSet            (Bool)
ShowTargetMarkerLines    (Bool)
ShowMarkerLines          (Bool)
Smoothing                (double)
PredictionTime           (double)
PositionDamping          (eVector3f)
RotationDamping          (double)
RotationDampingAxis      (Int)
ModelAlpha               (double)
GeometryType             (Int)
GeometryFile             (String)
GeometryScale            (eVector3f)
GeometryOffset           (eVector3f)
GeometryPitchYawRoll     (eVector3f)
Name                     (String)
UserData                 (Int)
ActiveTagID              (Int)
ActiveTagRfChannel       (Int)
TrackingAlgorithmLevel   (Int)
ShareMarkers             (Bool)
MarkerID                 (Int)
MarkerLocation           (eVector3f)
Data Streaming

Once the API is successfully initialized, two methods of data streaming are available: NatNet and VRPN.

Stream over NatNet

Once the data streaming is enabled, connect the NatNet client application to the server IP address to start receiving the data.

StreamNP(true);	// Enabling NatNet Streaming.

Stream over VRPN

StreamVRPN(true); // Enabling VRPN Streaming.

Data Streaming Settings

Data streaming settings cannot be configured directly through the Motive API. These properties must be set in Motive:

  • Export the Motive profile (MOTIVE file) that contains the desired configuration.

  • Load the exported profile through the API.

Note: You can define the include and library directories using the MOTIVEAPI_INC and MOTIVEAPI_LIB environment variables. Check the project properties (in Visual Studio) of the provided sample project for an example configuration.

The application profile can be imported using the LoadProfile function to obtain software settings and trackable asset definitions.

The calibration file can be imported using the LoadCalibration function to ensure reliable 3D tracking data is obtained.

To initialize all of the connected cameras, call the Initialize function. This function initializes the API library and gets the cameras ready to capture data, so always call it at the beginning of a program. If you attempt to use the API functions without initializing first, you will get an error.

Initialize loads the default Motive profile (MOTIVE) from the ProgramData directory during the initialization process. To load a Motive profile from a different directory, use the LoadProfile function.

The Update function is primarily used for updating captured frames, but it can also be called to update the list of connected devices. Call this function after initialization to make sure all newly connected devices are properly initialized at the beginning.

When exiting a program, call the Shutdown function to completely release and close all connected devices. Cameras may fail to shut down completely if this function is not called.

Software configurations, including application settings and data streaming settings.

When using the API, we recommend first configuring settings and defining the trackable assets in Motive, then exporting the profile (MOTIVE file) and loading it by calling the LoadProfile function. This allows you to adjust the settings for your needs in advance without having to configure individual settings through the API.

Once the calibration data is loaded, the 3D tracking functions can be used. For detailed instructions on camera calibration in Motive, please read through the camera calibration page.

In Motive, calibrate the camera system using the Calibration pane. Follow the Camera Calibration page for details.

After the system has been calibrated, export the calibration file (MCAL) from Motive.

Using the API, import the calibration into your custom application by calling the LoadCalibration function.

When processing all of the cameras, use the CameraCount function to obtain the total camera count and process each camera within a loop. To point to a specific camera, use the CameraID function to check and use the camera with its given index value.

Camera settings are also located in the Devices pane of Motive. For more information on each of these camera settings, refer to the Camera Properties page.

Corresponds to the Enabled setting in Motive's Camera Properties.

Corresponds to the Reconstruction setting in Motive's Camera Properties.

Corresponds to the Pixel Dimensions value in Motive's Camera Properties.

Corresponds to the Video Mode setting in Motive's Camera Properties.

Please see the Camera Video Types page for more information on video modes.

Corresponds to the Exposure setting in Motive's Camera Properties.

Corresponds to the Threshold setting in Motive's Camera Properties.

Corresponds to the LED setting in Motive's Camera Properties.

Corresponds to the IR Filter setting in Motive's Camera Properties.

Corresponds to the Gain setting in Motive's Camera Properties.

Corresponds to the Frame Rate setting in Motive's Camera Properties.

Corresponds to the MJPEG Quality setting in Motive's Camera Properties.

Corresponds to the Maximize Power setting in Motive's Camera Properties.

Corresponds to the Bitrate setting in Motive's Camera Properties.

Corresponds to the Bitrate setting in Motive's Camera Properties.

Corresponds to the Firmware Version property in Motive's Camera Properties.

Corresponds to the Logic Version property in Motive's Camera Properties.

There are other camera settings, such as imager gain, that can be configured using the Motive API. Please refer to the Motive API: Function Reference page for descriptions of other functions.

To process multiple consecutive frames, call the Update or UpdateSingleFrame function repeatedly within a loop. In the example below, the Update function is called within a while loop as the frameCounter variable is incremented:

The Update function disregards accumulated frames and services only the most recent frame data. The client application will not receive the previously missed frames.

The UpdateSingleFrame function ensures only one frame is processed each time it is called. If there are significant stalls in the program, using this function may result in accumulated processing latency.

After loading a valid camera calibration, you can use API functions to track retroreflective markers and get their 3D coordinates. Since marker data is obtained for each frame, always call the Update or the UpdateSingleFrame function each time newly captured frames are received.

You can use the MarkerCount function to obtain the total marker count and use this value within a loop to process all of the reconstructed markers.

Please read the Rigid Body Tracking page for detailed instructions on creating and working with Rigid Body assets in Motive.

Define new Rigid Bodies using the CreateRigidBody function.

Exporting a Rigid Body file (profile): under the Assets pane, right-click a Rigid Body asset and click Export Rigid Body.

When using the API, you can load exported assets by calling the LoadProfile function for application profiles and the LoadRigidBodies or AddRigidBodies function for Rigid Body files. When importing profiles, the LoadRigidBodies function will entirely replace the existing Rigid Bodies with the list of assets from the loaded profile. On the other hand, AddRigidBodies will add the loaded assets to the existing list while keeping the existing assets. Once Rigid Body assets are imported into the application, the API functions can be used to configure and access them.

In Motive, Rigid Body assets have Rigid Body properties assigned to each of them. Depending on how these properties are configured, the display and tracking behavior of the corresponding Rigid Bodies may vary.

For detailed information on individual Rigid Body settings, read through the Properties: Rigid Body page.

The StreamNP function enables/disables data streaming via the NatNet SDK. This client/server networking SDK is designed for sending and receiving OptiTrack data across networks, and can be used to stream tracking data from the API to client applications on various platforms.

The StreamNP function is equivalent to Broadcast Frame Data in the Data Streaming pane in Motive.

Mocap data can be live-streamed through the Virtual Reality Peripheral Network (VRPN) using the StreamVRPN function.

Please see the VRPN Sample page for information on working with the OptiTrack VRPN sample.

In Motive, configure the streaming server IP address and other data streaming settings. See the Data Streaming page for more information.
