Understand object tracking in Shogun Live

With Shogun 1.6 and later, you can specify that a selected rigid object is tracked using a custom object tracker.

Conventional object tracking requires at least three cameras with an unoccluded view of a marker to be able to reconstruct it in 3D. The object tracker is an alternative set of algorithms that are based on computing the 6DoF pose of the object directly, using 2D centroid data.

The advantage of this approach is that Shogun produces tracking that is robust to occlusion and can track the object into parts of the volume where it may not be possible to form enough reconstructions for the conventional approach (Full Body) to apply. Latency is also minimized as it is not necessary to wait for reconstruction, labeling and solving to complete before an object pose is obtained.


The reconstructions generated by the reconstructor are used by the object tracker to boot a solution but are not needed on every frame, allowing the object tracker to generate results even in situations where the Full Body solution drops frames.
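
Computing a rigid object's 6DoF pose directly from 2D centroid data is, conceptually, a Perspective-n-Point (PnP) problem. The sketch below is not Shogun's implementation; it only illustrates the idea for a single calibrated camera using OpenCV's solvePnP, with a made-up marker layout, pose and camera intrinsics (Shogun combines centroid data from many cameras and handles labeling and occlusion itself).

    # Illustrative only: recovering a 6DoF pose from 2D centroids as a PnP problem
    # with OpenCV. This is not Shogun's object tracker; the marker layout, pose and
    # camera intrinsics below are invented for the example.
    import numpy as np
    import cv2

    # Marker positions in the object's local frame (mm): a non-coplanar rigid layout.
    object_markers = np.array([
        [0.0,    0.0,   0.0],
        [120.0,  0.0,   0.0],
        [0.0,   90.0,   0.0],
        [0.0,    0.0,  70.0],
        [60.0,  45.0,  35.0],
        [100.0, 80.0,  20.0],
    ], dtype=np.float64)

    # Example pinhole intrinsics and zero lens distortion.
    camera_matrix = np.array([
        [1500.0,    0.0, 512.0],
        [   0.0, 1500.0, 384.0],
        [   0.0,    0.0,   1.0],
    ])
    dist_coeffs = np.zeros(5)

    # A ground-truth pose, used here only to synthesise plausible 2D centroids.
    rvec_true = np.array([0.10, -0.20, 0.05])
    tvec_true = np.array([50.0, -30.0, 2000.0])
    centroids, _ = cv2.projectPoints(object_markers, rvec_true, tvec_true,
                                     camera_matrix, dist_coeffs)

    # Solve for the object's rotation and translation from the 2D centroids alone.
    ok, rvec, tvec = cv2.solvePnP(object_markers, centroids, camera_matrix, dist_coeffs)
    if ok:
        rotation_matrix, _ = cv2.Rodrigues(rvec)
        print("Recovered rotation:\n", rotation_matrix)
        print("Recovered translation (mm):", tvec.ravel())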

To use the object tracker:

  1. Create the object in the usual way:
    1. In the 3D Scene, select the relevant markers.
    2. In the Tracking panel, with the Setup tab selected, go to the Prop field, enter a name for the object and click Create.
  2. On the Properties tab, ensure Track with ObjectTracker is selected.
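
Once the object is tracked, external applications typically consume its pose over the network. The following is a minimal sketch of one way to read the object's pose using the Vicon DataStream SDK Python bindings (vicon_dssdk); the module name, the default address localhost:801 and the object name 'MyProp' are assumptions that may differ on your system, so check your DataStream SDK documentation for the exact API.

    # A minimal sketch of reading a tracked object's pose with the Vicon DataStream SDK
    # Python bindings (vicon_dssdk). The address, port and object name are assumptions;
    # check the DataStream SDK documentation for the exact API in your version.
    from vicon_dssdk import ViconDataStream

    client = ViconDataStream.Client()
    client.Connect('localhost:801')      # assumed default Shogun Live DataStream address
    client.EnableSegmentData()

    object_name = 'MyProp'               # the name you gave the object in the Tracking panel

    while True:
        try:
            client.GetFrame()
        except ViconDataStream.DataStreamException:
            continue                     # no new frame available yet

        if object_name not in client.GetSubjectNames():
            continue                     # the object is not currently being tracked

        # A rigid object is exposed as a subject with a single (root) segment.
        segment = client.GetSegmentNames(object_name)[0]
        translation, occluded = client.GetSegmentGlobalTranslation(object_name, segment)
        rotation, _ = client.GetSegmentGlobalRotationQuaternion(object_name, segment)

        if not occluded:
            print(object_name, 'position (mm):', translation, 'rotation (quaternion):', rotation)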

Specify settings for optimal object tracking

If you can't see these parameters, at the top of the relevant panel, click Show Advanced.

System panel settings

Towards the top of the System tab, the following Advanced global parameters that relate to object tracking are displayed:

Setting | Description
Force Lowest Latency

Some models of Vicon cameras can be forced into a mode to provide the lowest possible latency at the cost of reducing grayscale and centroid throughput. Generally, select this setting only when using low-latency object tracking.
However, note that for some large systems with a high number of cameras, this may cause network problems that manifest as cameras appearing to drop out. For more information, see Use the Force Lowest Latency option.
Default: Cleared

Object Tracking Level
Set the camera and processing parameters to the required level of object tracking:
  • Standard: Basic level involving no camera changes.
  • Use Grayscale: Enables object pose jitter reduction using grayscale data.
    Note that this requires cameras to send only grayscale data, which for large camera counts may cause network issues.
  • Object Tracking Only: As for Use Grayscale, and enables lower quality centroid data to further reduce jitter.

Default: Standard
See also Handling jitter.

Use the Force Lowest Latency option

To enable you to achieve the lowest possible latency for object tracking, so that external devices driven by object pose (such as an LED wall virtual camera driven by a Vicon-tracked hero camera) are as close to real time as possible, you can control whether any attached Vicon cameras have their DSPs (Digital Signal Processors) enabled. You do this by selecting or clearing the Force Lowest Latency option. Selecting this option provides the lowest possible latency at the cost of reducing grayscale and centroid throughput. However, it is normally only useful if you are using low-latency object tracking (see Use the object tracker).

Note that this feature is not available for Vicon cameras that do not have a DSP.

Caution
  • If you are tracking only a single object or a few rigid objects, you may want to reduce the camera delivery latency and instead allow Shogun, rather than the camera, to fit grayscale blobs. In this case, you can select Force Lowest Latency.
  • However, we advise you not to select Force Lowest Latency if the system contains a significant number of Vantage cameras and/or a lot of markers, as this can overload the network with grayscale data.

To use the Force Lowest Latency option:

  1. Connect Shogun Live to a system containing Vicon optical cameras that have a DSP, such as Vantage and/or ViperX, with a number of markers scattered throughout the volume.

  2. On the System panel, ensure the Advanced options are displayed. If not, at the top right of the System panel, click Show Advanced.

  3. Select Force Lowest Latency.
    This ensures that any attached cameras with DSPs do not use them.

If you select Force Lowest Latency in a system with a large number of cameras and/or markers, the cameras may overload the network with grayscale data, which can result in HAL error messages in the Log, and cameras appearing to drop out or show as not transmitting data (i.e., in the System panel, their icons are gray or flicker to gray). To minimize this effect:

  • If possible, use 10 Gigabit networking, rather than 1 Gigabit networking.
  • On the System panel, check the Object Tracking Level. If possible, set it to Standard, rather than Use Grayscale or Object Tracking Only, which forces the cameras to send grayscale data.
  • For all cameras, check that their Grayscale Mode (found in the Optical Setup section of the System panel) is NOT set to Only.

Note that the Force Lowest Latency setting is not available if no cameras with DSPs are attached to the system.
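
If you want to check what this option is buying you in practice, one approach is to compare the total latency reported to a connected client with the option selected and cleared. The sketch below assumes the Vicon DataStream SDK Python bindings (vicon_dssdk), the default DataStream address localhost:801, and that your SDK version exposes GetLatencyTotal(); treat it as a rough comparison tool rather than a definitive measurement.

    # A rough way to compare end-to-end latency with Force Lowest Latency on and off:
    # sample the total latency reported to a DataStream client and print the mean.
    # Assumes the vicon_dssdk Python bindings and that GetLatencyTotal() is available
    # in your SDK version.
    from vicon_dssdk import ViconDataStream

    client = ViconDataStream.Client()
    client.Connect('localhost:801')      # assumed default DataStream address
    client.EnableSegmentData()

    samples = []
    while len(samples) < 200:            # collect a couple of hundred frames
        try:
            client.GetFrame()
        except ViconDataStream.DataStreamException:
            continue
        samples.append(client.GetLatencyTotal())   # reported total latency, in seconds

    print('Mean reported latency: {:.2f} ms'.format(1000.0 * sum(samples) / len(samples)))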

Handling jitter

Jitter is mainly caused by these issues:

  • The set of centroid-to-label correspondences (rays) contributing to the object changes from frame to frame; these flickering contributions interact with small inaccuracies in the camera calibration, causing the optimum pose to jump about.
  • Noise on the centroids causes the optimum pose to change.

If you experience high levels of jitter, or you know that it is caused by the first issue, start by increasing the Object Tracking Level (the lowest level is Standard; the highest is Object Tracking Only).

If it's caused by the second issue, try selecting Enable Motion Model.
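
For a sense of what a motion model contributes, the sketch below implements a toy "hold" filter: while successive pose measurements stay within small position and rotation thresholds, the last accepted pose is reported instead of the noisy measurement. This is only an illustration of the idea; it is not Shogun's motion model, and the thresholds are arbitrary.

    # Toy illustration of jitter suppression for a stationary object. Not Shogun's
    # motion model: it simply holds the last pose while new measurements stay within
    # small position/rotation thresholds.
    import numpy as np

    class StationaryHoldFilter:
        def __init__(self, pos_threshold_mm=0.5, rot_threshold_deg=0.1):
            self.pos_threshold_mm = pos_threshold_mm
            self.rot_threshold_deg = rot_threshold_deg
            self.held_position = None
            self.held_quaternion = None

        def update(self, position_mm, quaternion):
            position_mm = np.asarray(position_mm, dtype=float)
            quaternion = np.asarray(quaternion, dtype=float)

            if self.held_position is None:
                self.held_position = position_mm
                self.held_quaternion = quaternion
                return position_mm, quaternion

            pos_delta = np.linalg.norm(position_mm - self.held_position)
            # Angle between quaternions: 2 * acos(|dot|), converted to degrees.
            dot = np.clip(abs(np.dot(quaternion, self.held_quaternion)), 0.0, 1.0)
            rot_delta_deg = np.degrees(2.0 * np.arccos(dot))

            if pos_delta < self.pos_threshold_mm and rot_delta_deg < self.rot_threshold_deg:
                # Treat the change as noise: keep reporting the held pose.
                return self.held_position, self.held_quaternion

            # Real motion: accept the measurement and update the held pose.
            self.held_position = position_mm
            self.held_quaternion = quaternion
            return position_mm, quaternion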

Processing panel settings

On the Processing tab, in the Object Tracking section, the following Advanced parameters that relate to object tracking are displayed:

Setting | Description

Thread Count
Number of threads to use. If set to zero (the default setting), the thread count is calculated automatically. Default: 0

Reprojection Threshold
Maximum allowable distance (in pixels) between a centroid and the projection of a marker into the same camera. Applies only to markers that are tracked using the Object Tracker. If you need to increase the Environmental Drift Tolerance, also increase this value. (For an illustration of this check, see the sketch after this table.)

Entrance Threshold
Minimum proportion of markers that must be visible before an object is booted. If the proportion of visible markers is less than this value, the object is not booted. You can override this value for selected objects by using object presets. Default: 1.00. See also Create and apply object presets.

Minimum Object Marker Separation
Minimum allowable separation distance between objects (in mm) to enable the objects to be tracked separately, based on the smallest distance between a marker on each object. Default: 10

Enable Motion Model
When selected, ensures that stationary objects don't show pose jitter arising from image noise. Select this setting to eliminate jitter that occurs when a scene is viewed through a stationary tracked camera. This setting may cause issues with very slowly moving cameras. Default: Cleared. See also Handling jitter.
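
To illustrate how a reprojection threshold is applied, the sketch below projects a predicted 3D marker position into a camera using a simple pinhole model and accepts or rejects a detected centroid depending on whether the pixel distance is within the threshold. It is a conceptual example only, with made-up camera parameters; it is not Shogun's implementation.

    # Conceptual illustration of a reprojection-threshold check, not Shogun's implementation.
    import numpy as np

    def project_point(point_world_mm, rotation, translation_mm, camera_matrix):
        """Project a 3D world point (mm) into pixel coordinates with a pinhole camera."""
        point_camera = rotation @ point_world_mm + translation_mm
        uvw = camera_matrix @ point_camera
        return uvw[:2] / uvw[2]

    # Example camera: identity rotation, 2 m in front of the world origin, fx = fy = 1500 px.
    rotation = np.eye(3)
    translation_mm = np.array([0.0, 0.0, 2000.0])
    camera_matrix = np.array([
        [1500.0,    0.0, 512.0],
        [   0.0, 1500.0, 384.0],
        [   0.0,    0.0,   1.0],
    ])

    predicted_marker_mm = np.array([100.0, 50.0, 0.0])   # predicted marker position (world, mm)
    detected_centroid_px = np.array([588.0, 420.0])      # centroid reported by the camera

    projection_px = project_point(predicted_marker_mm, rotation, translation_mm, camera_matrix)
    distance_px = np.linalg.norm(projection_px - detected_centroid_px)

    reprojection_threshold_px = 2.0                      # the Reprojection Threshold setting
    verdict = 'accept' if distance_px <= reprojection_threshold_px else 'reject'
    print('Reprojection distance: {:.2f} px; {} this centroid'.format(distance_px, verdict))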

Create and apply object presets

Object presets enable you to specify the object tracking and smoothing (filtering) properties for selected objects. (Note that these properties override the overall object tracking properties that you can set in the Processing panel.)

By experimenting with different values for the filtering properties and applying them to an object using a saved object preset, you can evaluate filter performance to ensure the smoothest possible tracking for selected objects.
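
One practical way to compare presets is to record the reported position of a stationary object under each preset and look at the spread. The sketch below assumes the Vicon DataStream SDK Python bindings (vicon_dssdk), the default DataStream address localhost:801 and a placeholder object name 'MyProp'; it prints the per-axis standard deviation of the object's position as a simple jitter metric.

    # A sketch for quantifying residual jitter while comparing object presets: record a
    # stationary object's position and report the standard deviation per axis.
    # Assumes the vicon_dssdk Python bindings; 'MyProp' is a placeholder object name.
    import numpy as np
    from vicon_dssdk import ViconDataStream

    client = ViconDataStream.Client()
    client.Connect('localhost:801')
    client.EnableSegmentData()

    object_name = 'MyProp'
    positions = []

    while len(positions) < 500:          # roughly a few seconds of data
        try:
            client.GetFrame()
        except ViconDataStream.DataStreamException:
            continue
        if object_name not in client.GetSubjectNames():
            continue
        segment = client.GetSegmentNames(object_name)[0]
        translation, occluded = client.GetSegmentGlobalTranslation(object_name, segment)
        if not occluded:
            positions.append(translation)

    jitter_mm = np.std(np.array(positions), axis=0)
    print('Positional jitter (std dev, mm) per axis:', jitter_mm)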

To create an object preset:

  1. In the Tracking panel, ensure the object whose tracking and/or filtering properties you want to refine is selected.
  2. On the Properties tab below, ensure the Advanced properties are displayed and in the General section, on the right of the Object Preset field, click Manage.
  3. In the Object Presets dialog box, enter a name for the new preset, click Add and ensure the new preset is selected.
  4. Specify the required settings and close the dialog box.
    Object presets are saved in the Subjects.mcp file (found in C:\Users\Public\Documents\Vicon\ShogunLive1.8\LastRun\UserName), or in any exported tracking configuration.

To apply an object preset:

  1. In the Tracking panel, ensure that the object to which you want to apply the preset is selected.
  2. On the Properties tab below, ensure the Advanced properties are displayed and in the General section, from the Object Preset list, select the required preset.
    The tracking and/or filtering properties specified in the preset are applied to the object.

Tracking panel parameters

On the Tracking tab, with the required object (prop) selected, on the Properties tab, the following Advanced setting is displayed:

Setting | Description

Track With ObjectTracker
When selected, objects are tracked using the object tracker. Default: Cleared
