Motion Analysis Integration

Motion Analysis (MA) is handled like any other tracking system in Viz Virtual Studio. There is no hard limit to the number of objects that can be tracked by Tracking Hub. This section provides information specific to MA systems.

The CORTEX system

The setup and calibration of the MA Raptor camera system is done by trained Motion Analysis Corporation engineers. The software that controls the cameras and the tracking is called Cortex. Cortex was originally developed for motion capture, not for camera tracking (camera tracking functionality was added later), so most of the camera settings are currently not immediately obvious. This is especially true for camera offset fine adjustment and CCD calibration.

CCD calibration

Cortex calibrates the relative position of the CCD to the target base with an image-based method. The camera observes a target that is moved to several positions, and from this data the software calculates the position and the field of view of the camera. Like all image-based methods, this calculation is not 100% accurate, and there is always a need to fine-tune the pan, tilt and roll angles.
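
As a minimal illustration of what such a calibration produces, the sketch below converts a focal length expressed in pixels (a typical output of image-based calibration) into a horizontal field of view. The numeric values are hypothetical and do not come from Cortex.

  import math

  def horizontal_fov_deg(focal_length_px: float, image_width_px: int) -> float:
      """Horizontal field of view (degrees) from a focal length expressed in pixels."""
      return math.degrees(2.0 * math.atan(image_width_px / (2.0 * focal_length_px)))

  # Hypothetical values: a 1920 px wide image and a calibrated focal length of 2800 px.
  print(f"FOV: {horizontal_fov_deg(2800.0, 1920):.2f} degrees")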

The fine-tuning of position and angles is always done in the Cortex system and never in the offset page of the tracking driver.

Sync and FPS (Frames per second)

Every part of a Virtual Studio must be in sync, and this applies to Cortex and the Raptor cameras as well. Check that the MA system is connected to the same sync source as the rest of the Virtual Studio, and that the signal timings do not drift relative to each other.

Timing

Depending on the timing family of the sync signal, the Cortex system must use the following settings:

  • 50 Hz format: Cortex runs at 150 FPS with three-frame reduction.

  • 59.94 Hz formats: Cortex runs at 120 FPS with two-frame reduction.

These settings guarantee the lowest latency of the Cortex system.
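
The arithmetic behind these settings is simply that the capture rate divided by the reduction factor must match the video rate of the sync family. The helper below is only a sketch of that arithmetic, not part of any Cortex or Vizrt API.

  def cortex_capture_settings(sync_family_hz: float) -> tuple[float, int]:
      """Nominal Cortex capture rate and frame reduction for a given sync family."""
      # 50 Hz family: 150 FPS with three-frame reduction -> 150 / 3 = 50 updates per second.
      # 59.94 Hz family: 120 FPS (nominal) with two-frame reduction -> one update per video frame.
      if abs(sync_family_hz - 50.0) < 0.01:
          return 150.0, 3
      if abs(sync_family_hz - 59.94) < 0.01:
          return 120.0, 2
      raise ValueError(f"unsupported sync family: {sync_family_hz} Hz")

  fps, reduction = cortex_capture_settings(50.0)
  print(f"{fps} FPS / {reduction} = {fps / reduction} updates per second")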

Network Connection

The Cortex system communicates with the Tracking Hub over a network. This communication is time critical: every delayed packet is useless for the Virtual Set and results in jitter.

IMPORTANT! A separate tracking network between the computer running Tracking Hub and the Cortex machine is mandatory.

Using a managed switch is acceptable when it is possible to define a separate subnet for the tracking connection. If such a switch is not available, a direct connection or a basic unmanaged switch is recommended.
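
One way to sanity-check the tracking network is to watch the arrival times of the tracking packets. The sketch below is a generic UDP listener that flags unusually late packets; the port number and the 50 Hz timing threshold are placeholders, and the actual Cortex/Tracking Hub protocol details are not described here.

  import socket
  import time

  # Hypothetical port; the actual port depends on your Cortex/Tracking Hub configuration.
  TRACKING_PORT = 1511

  sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
  sock.bind(("0.0.0.0", TRACKING_PORT))

  last_arrival = None
  for _ in range(500):
      sock.recv(2048)                      # payload is ignored; only the timing matters here
      now = time.monotonic()
      if last_arrival is not None:
          gap_ms = (now - last_arrival) * 1000.0
          if gap_ms > 25.0:                # flag gaps well above one 50 Hz frame (20 ms)
              print(f"late packet: {gap_ms:.1f} ms since previous")
      last_arrival = now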

Cortex precision limit

All optical tracking systems show jitter in position and angle. Jitter in the rotation angles is much more visible than position jitter. The Cortex precision limit for angles is below 1/100th of a degree. Even though this is quite accurate, the jitter is still visible when fully zoomed in on a telephoto lens. Extreme close-up shots are therefore not recommended with any optical tracking system. One option is to use a mechanical tracking head for those shots.
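
As a rough illustration of why this matters, the small-angle calculation below converts an angular jitter into an approximate on-screen offset. The field of view and image width values are example figures, not measurements.

  def jitter_in_pixels(jitter_deg: float, fov_deg: float, image_width_px: int) -> float:
      """Approximate on-screen offset caused by an angular jitter (small-angle approximation)."""
      return jitter_deg / fov_deg * image_width_px

  # 0.01 degree jitter on a 1920 px wide image:
  print(jitter_in_pixels(0.01, 60.0, 1920))  # wide shot: ~0.3 px, invisible
  print(jitter_in_pixels(0.01, 2.0, 1920))   # full telephoto zoom: ~9.6 px, clearly visible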

Zoom and focus encoder

The Motion Analysis system uses a zoom and focus encoder so that Viz can calculate the actual field of view of the lens. The choice of encoder is important for a good result.

The following encoders are recommended (external encoders should be used when the lens has no internal digital encoders):

  • MoSys

  • Shotoku

  • EncodaCam

  • Internal digital lens encoders
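
Conceptually, the encoder delivers a raw value that the tracking software maps to a field of view through a lens calibration table. The sketch below shows such a lookup with linear interpolation; the table values and encoder range are invented for illustration and do not represent any particular lens or lens file format.

  import bisect

  # Hypothetical calibration table: (raw zoom encoder value, horizontal FOV in degrees),
  # sorted from wide to tele.
  ZOOM_TABLE = [(0, 65.0), (10000, 40.0), (20000, 20.0), (30000, 8.0), (40000, 2.5)]

  def fov_from_encoder(raw: int) -> float:
      """Linearly interpolate the field of view for a raw zoom encoder reading."""
      keys = [k for k, _ in ZOOM_TABLE]
      raw = max(keys[0], min(raw, keys[-1]))          # clamp to the calibrated range
      i = bisect.bisect_right(keys, raw)
      if i >= len(keys):
          return ZOOM_TABLE[-1][1]
      (k0, f0), (k1, f1) = ZOOM_TABLE[i - 1], ZOOM_TABLE[i]
      t = (raw - k0) / (k1 - k0)
      return f0 + t * (f1 - f0)

  print(f"{fov_from_encoder(25000):.1f} degrees")   # halfway between 20 and 8 -> 14.0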

Motion Capturing

Please see Topology Panel.