Set up character solving

Evoke provides human skeleton solving, enabling you to drive characters from clusters (Pulsars). The following topics explain the procedures that are related to character solving in the order you are likely to need them:


Prepare for character solving

Before you begin, ensure you have created any necessary Smart Object templates:

  • Smart Object templates are provided for Backstrap, Hat, OculusRift, HPReverb, HTCVive, and Pulsar, so you don't need to create these.
  • Note that you can merge Smart Objects, which is useful for the head in particular.
  • If you are using a backpack, you must create your own template (see Use Smart Object templates). When you do this, adjust the object origin offset (see Change a Smart Object's origin) to match the depth of the backpack, so Evoke knows where the participant's back is in relation to the markers that it tracks on the backpack.
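
    To illustrate why the origin offset should match the backpack depth, here is a minimal sketch in plain Python (this is not Evoke code or its API). The markers sit on the outer face of the backpack, so the tracked origin must be shifted towards the participant's back. The positions, axis convention, and offset value below are hypothetical.

      # Conceptual sketch only (not Evoke's API): shifting a tracked object's origin
      # from the backpack's marker plane to the participant's back.

      def back_position(marker_origin, towards_body, backpack_depth_mm):
          """Move the origin along the backpack's depth axis towards the body."""
          return tuple(p + d * backpack_depth_mm for p, d in zip(marker_origin, towards_body))

      # Hypothetical example: markers tracked 120 mm away from the participant's back,
      # with the backpack's depth axis pointing towards the body along -Y.
      origin = (0.0, 1500.0, 1000.0)      # tracked marker-plane origin (mm)
      towards_body = (0.0, -1.0, 0.0)     # unit vector from the markers towards the back
      print(back_position(origin, towards_body, 120.0))   # (0.0, 1380.0, 1000.0)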

Also ensure you have enabled Evoke to track Pulsars and props by creating the necessary Smart Objects and basic objects (see Create Smart Objects and Create basic objects).

Object tracking only
If only object tracking is required, you can auto-assign clusters without character solving or retargeting.
To use this option, ensure the Advanced options are displayed and in the Processing panel, under Characters From Clusters, select Disable Solving.

Create characters

Create characters in Evoke to represent each participant.

To create a character:

  1. In the Tracking pane, on the Setup tab, enter the name of the character that you want to create from clusters in the volume.

  2. At the top right of the pane below the Tracking tree, click Show Advanced, and in the Template list, ensure ClusterTemplate is selected.
  3. From the Retarget list, select a retarget file. (You can use the supplied ViconFemaleSample or ViconMaleSample.)
  4. In the Slots section below, click the first slot (Head) and select OculusRift, HPReverb, HTCVive, or Hat, depending on the accessories that you are using. (Note that the supplied templates for supported headsets include two device slots for the front and top of the head.)
  5. If you will be using two reference objects (typically Pulsar clusters for both the head and the spine), in the Spine slot, select the Backstrap template or your own template for the backpack you're using. You can leave the rest of the slots empty, as shown in the following examples of commonly used setups.

  6. At the right of the Character line, click Create.

Prepare the participants

  1. Attach the Pulsars to the Vicon accessories, making sure you use the correct mounting plates for each accessory, normally:
    • Flexible mounting plates for the foot straps and chest strap
    • Rigid mounting plates for the gloves and hat
  2. Make sure each participant is wearing the relevant Vicon accessories. At a minimum, these are:
    • Mocap hat (Pulsar attached with status light facing forward)
      or
      HMD clips attached to HMD (two Pulsars (front and top) attached - status light facing up)
    • Gloves (status light facing down)
    • Foot straps (status light facing forward)
    • Back strap, with Pulsar on back (status light facing up)
      or
      Backpack PC (status light facing up)

The following images show Pulsars correctly attached to a participant.

Assign objects and calibrate characters

After you have created the characters in Evoke, you can assign objects and calibrate the character for each participant in these ways:

  • Automated workflow: For each character, assign one or two reference objects (typically these are the Pulsar clusters for the head or spine) to the correct slots, designating the remaining objects as auto-assignable. The remaining objects are automatically assigned to the correct slots in a single step when you calibrate the character (see Assign clusters and calibrate characters (automated workflow)). For best results, use two reference objects, although this is not essential.

    This is normally the quickest and easiest way to assign objects and calibrate characters.

    Auto-assign for object tracking only: If only object tracking is required, you can auto-assign clusters without character solving or retargeting.

    To use this option, ensure the Advanced options are displayed and in the Processing panel, under Characters From Clusters, select Disable Solving.

  • Manual workflow: Manually assign each Smart Object (or basic object) to the correct slot, and finally, calibrate the character (see Assign clusters (manual workflow)).

Assign clusters and calibrate characters (automated workflow)

Object labeling
In the volume, physically label the Pulsars that are linked to the reference objects (usually the headset and backpack), to indicate where to place them (for example, Player1_Head). You can leave the auto-assignable objects in a general charging area and place them onto any character and any limb.

To quickly assign objects to a character and calibrate it, use the following automated workflow.

Prepare objects for auto-assignment

Prepare the objects for auto-assignment, as described in the following steps, which are required for first-time setup only.

Prepare the unassigned objects

In the 3D Scene, select the unassigned objects that you want to be auto-assigned and in the Tracking panel, on the Properties tab, select Auto Assign Enabled.

Tip
You may find it easiest to select the required unassigned clusters if you first arrange them in a group in the volume.

In the Tracking tree, when an object has Auto Assign Enabled selected, its icon displays a small triangle in the lower left corner, giving you a quick visual indication of the object's status.

Prepare the reference object(s)

Assign one or (preferably) two objects (these can be Smart Objects, composite Smart Objects or basic objects) to the appropriate slots. Typically the reference objects are the Pulsar clusters for the head and/or spine. To assign them to their slots:

  1. In the Tracking panel, select the reference object, display the Advanced options and on the Properties tab make sure that Auto Assign Enabled is cleared.

  2. Ensure the objects are positioned in the volume in a way that makes it easy to tell which one is which (you may want to place them on a person or mannequin).
  3. In the 3D Scene or in the Tracking tree, select the object and Ctrl+select its slot in the Tracking tree, then right-click either the object or the slot and select Assign Object.

  4. Ensure the reference object is displayed in the correct slot for the character.

  5. You can now auto-assign the remaining objects for the character, as described in Auto-assign objects and calibrate characters.

Auto-assign objects and calibrate characters

Ensure you have prepared both the unassigned objects and one or more reference object(s) (see Prepare objects for auto-assignment), and the participants (see Prepare the participants).

At the start of each experience, auto-assign objects and calibrate each character, as described in these steps:

  1. Get the participant(s) to stand in the capture volume in a neutral pose (known as an N-pose), which is a relaxed pose with the hands by the sides.

    The following image shows a character in a neutral pose (to show the pose clearly, the character has been calibrated).

  2. Ensure that the following objects are attached to each participant:
    • One or (preferably) two reference object(s). Typically these are the Pulsar clusters for the head or spine. For best results, use both, although this is not essential.
    • Auto-assignable objects on their other limbs (eg, hands and feet).

  3. In the Tracking panel, select one or more participants' characters, right-click and then click Calibrate (or, to calibrate all characters, press the shortcut Shift-C).

    The objects are 'fitted', based on their positions in relation to the available and unpopulated slots, and assigned correctly. If you're solving a character, it is displayed with a skeleton.
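
    As an illustration of the kind of proximity-based fitting described in the step above, the following sketch (plain Python, not Evoke's implementation) matches each auto-assignable cluster to the nearest unpopulated slot of the expected N-pose. The slot names, cluster names, and positions are hypothetical.

      # Conceptual sketch of proximity-based auto-assignment (not Evoke's algorithm).
      # Each unassigned cluster is paired with the closest unpopulated slot, which is
      # why the clusters can be placed on any limb of any participant beforehand.

      import math

      def auto_assign(unpopulated_slots, unassigned_clusters):
          """Greedily pair each cluster with the nearest remaining slot."""
          assignments = {}
          remaining = dict(unpopulated_slots)
          for cluster, cluster_pos in unassigned_clusters.items():
              slot = min(remaining, key=lambda s: math.dist(remaining[s], cluster_pos))
              assignments[slot] = cluster
              del remaining[slot]
          return assignments

      # Hypothetical N-pose slot positions and tracked cluster positions (mm).
      slots = {"LeftFoot": (-150, 0, 100), "RightFoot": (150, 0, 100),
               "LeftHand": (-350, 0, 900), "RightHand": (350, 0, 900)}
      clusters = {"Pulsar_07": (140, 10, 95), "Pulsar_03": (-160, 5, 110),
                  "Pulsar_12": (355, -5, 905), "Pulsar_09": (-340, 0, 895)}
      print(auto_assign(slots, clusters))
      # e.g. {'RightFoot': 'Pulsar_07', 'LeftFoot': 'Pulsar_03', 'RightHand': 'Pulsar_12', ...}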

Assign clusters (manual workflow)

  1. As for the automated workflow, set up the name, template and retarget file, and select the required templates for the slots (see Create characters).
  2. At the right of the Character line, click Create.

    The new character is added to the Tracking tree, with the populated slots as child nodes. Slots for which Smart Objects were selected are automatically named to match the character and slot (eg, Miss Black_Head in the following example). The yellow warning icons indicate that some information is missing.

    Tip: After you have created a character, you can create multiple new characters based on the same configuration without having to use the Advanced properties. Just enter a new character name and click Create.

    You then assign objects to the empty fields as described in the next step.

  3. Assign objects (these can be a single Smart Object, a composite Smart Object or a basic object) to the slots. To do this:

    1. Ensure the objects are positioned in the volume in a way that makes it easy to tell which one is which (you may want to place them on a person or mannequin).
    2. In the Tracking panel, select the relevant objects, display the Advanced options and on the Properties tab make sure that Auto Assign Enabled is cleared.
    3. In the 3D Scene, select an object and Ctrl+select its slot in the Tracking tree, then right-click either the object or the slot and select Assign Object.
    4. In the same way, assign the remaining objects to the appropriate slots.
    In the Tracking tree, the slots now all have the correctly assigned objects, which are also displayed in the 3D Scene.

  4. If you have set up characters or composite Smart Objects, but have not linked Pulsars to them (eg, if you created Smart Objects from the supplied templates for some or all of the slots in a new character), link the new Smart Objects to the correct devices. To do this:
    1. Select an object and Ctrl+select the required Smart Object in the Tracking tree.
    2. If either of the Smart Objects is a composite Smart Object, from the sub-menu, select which device slot is to be affected by the swap.
    3. Right-click either object and select Swap Cluster.
      For more information, see Swap clusters.

When you have finished assigning objects, you can calibrate the character (see Calibrate characters).

To display or hide character slots and assigned objects in the Tracking tree, click the + or - symbol next to the character icon.

You can clear slots that you have assigned manually at any time after creation:

  • To clear manual slot assignments, in the Tracking tree, right-click one or more character slots and then select Unassign object(s).

Calibrate characters

You must ensure each character is calibrated, but depending on the way in which you create your characters, the workflow is slightly different:

  • If you use the automated workflow for object assignment and character calibration, both object assignment and calibration occur when you click Calibrate. Calibration assigns the clusters for which Auto Assign Enabled was selected to empty character slots; scales the source skeleton; and accounts for differences between the 'reference' position and where the clusters were actually placed on the participant, eg, if the participant was wearing heels, the clusters slipped, or the backpack straps were loose, etc (see About cluster calibration). Calibration also starts retargeting (if required).
  • If you have used the manual workflow for object assignment and character calibration and have therefore manually assigned the clusters to the slots, calibration does not perform any further cluster assignment and just scales the source skeleton to the participant and accounts for differences between the reference position and actual cluster placement (see About cluster calibration). Calibration also starts retargeting (if required).
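
    As a rough illustration of the two effects described above (scaling the source skeleton and compensating for where the clusters actually sit), here is a minimal sketch in plain Python. The heights, positions, and simple uniform scale are assumptions for illustration only; this is not Evoke's actual calibration model.

      # Conceptual sketch of what calibration accounts for (not Evoke code):
      # 1) scale the source skeleton to the participant, and
      # 2) store the offset between each cluster's reference placement and where it
      #    actually sits (e.g. heels worn, slipped clusters, loose backpack straps).

      def skeleton_scale(reference_height_mm, measured_height_mm):
          """Uniform scale applied to the source skeleton for this participant."""
          return measured_height_mm / reference_height_mm

      def placement_offset(reference_pos, measured_pos):
          """Per-cluster offset recorded at calibration and applied while solving."""
          return tuple(m - r for m, r in zip(measured_pos, reference_pos))

      # Hypothetical numbers: a 1.62 m participant solved with a 1.75 m source skeleton,
      # and a spine cluster sitting 30 mm higher than its reference placement.
      print(round(skeleton_scale(1750.0, 1620.0), 3))                        # 0.926
      print(placement_offset((0.0, -120.0, 1200.0), (0.0, -118.0, 1230.0)))  # (0.0, 2.0, 30.0)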


To calibrate characters:

  1. Ensure each person to be calibrated is standing in the volume in a neutral pose (known as an N-pose), which is a relaxed pose with the hands by the sides.
  2. In the Tracking tab tree, select the character(s) that are to be calibrated and right-click.
  3. In the context menu, click Calibrate (or to calibrate all characters, press the shortcut Shift-C).

    (If the selected character is already calibrated, Recalibrate is displayed on the context menu. If you select this option, the existing calibration is overwritten.)

    Each character is calibrated and Evoke renders the character in the 3D Scene view. You can set the View Filters to show either the character source (solving) view or the retargeting view. The scale value that Evoke calculates for each character is used by the Unreal or Unity Plugin to render the character at the correct size for each participant.

About cluster calibration

Cluster calibration corrects for minor variability in Pulsar placement on the feet and spine, and refines the skeleton to give more lifelike character solving. It enables Characters From Clusters (CFC) to adapt to inconsistent placement of the clusters on the body, for specific objects and on certain common axes. Cluster calibration happens automatically when you calibrate characters.

The following images show an example of a misaligned foot cluster, without cluster calibration and with cluster calibration.

Note
Not all clusters are calibrated on all axes. This feature corrects only minor deviations in foot and spine orientation and position, on the axes where clusters are commonly misplaced.
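
The following sketch (plain Python, illustrative only) shows the idea of axis-limited correction: a foot cluster's yaw misalignment is removed up to a small limit, while the other axes are left as tracked. The choice of axis and the correction limit are assumptions, not Evoke's actual values.

  # Illustrative sketch of axis-limited cluster correction (not Evoke's algorithm).
  # Only commonly misplaced axes are corrected; everything else is left as tracked.

  def correct_foot_yaw(tracked_yaw_deg, expected_yaw_deg, max_correction_deg=15.0):
      """Remove a small yaw misalignment of a foot cluster, up to a fixed limit."""
      error = tracked_yaw_deg - expected_yaw_deg
      correction = max(-max_correction_deg, min(max_correction_deg, error))
      return tracked_yaw_deg - correction

  # A foot cluster strapped on 8 degrees toed-out relative to the expected pose is
  # corrected back; a 40-degree error is only partially absorbed.
  print(correct_foot_yaw(8.0, 0.0))    # 0.0
  print(correct_foot_yaw(40.0, 0.0))   # 25.0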

Change a character's properties (optional)

Select the skin and color

By default, to enable pre-visualization of solve quality, new characters are displayed in the Workspace with the Vicon source skin. To enable you to identify the character more easily, you can select a male or female skin variant as well as a skin color.

To change a character's skin and/or color:

  1. Ensure the character whose skin or color you want to change is selected.
  2. In the Tracking pane, on the Properties tab, click Skin or Color in the General section.
  3. Select the required skin or color.

    To view the character's skin, you must calibrate the character.

Select the retarget file and retarget skin

You can retarget the source skeleton to a suitable character skeleton for use in a game engine or visualization tool. You can also preview the character mesh in Evoke. Retargeting requires a retarget setup file (*.vsr) that has been created in the Vicon Retarget application (see Set up character retargeting).
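
For readers unfamiliar with retargeting, the following sketch (plain Python, not the *.vsr format or Vicon's retargeting solver) illustrates the basic idea: rotations solved on the source skeleton are mapped onto the target character's joints via a joint-name mapping. The joint names and the direct copy of rotations are simplifying assumptions.

  # Minimal illustration of retargeting (not Vicon's implementation or file format).
  # Rotations solved on the source skeleton are mapped onto the target character's
  # joints using a joint-name mapping.

  JOINT_MAP = {            # hypothetical source-to-target joint mapping
      "Hips": "pelvis",
      "Spine": "spine_01",
      "Head": "head",
      "LeftHand": "hand_l",
      "RightHand": "hand_r",
  }

  def retarget(source_rotations):
      """Copy each solved joint rotation onto the mapped target joint."""
      return {JOINT_MAP[j]: rot for j, rot in source_rotations.items() if j in JOINT_MAP}

  # Example frame: quaternions (w, x, y, z) solved for the source skeleton.
  frame = {"Hips": (1.0, 0.0, 0.0, 0.0), "Head": (0.97, 0.0, 0.26, 0.0)}
  print(retarget(frame))   # {'pelvis': (1.0, 0.0, 0.0, 0.0), 'head': (0.97, 0.0, 0.26, 0.0)}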

You can choose the retarget file for the character when you first create the character (see Step 3 of Create characters), but you can later change the retarget file and also select a retarget skin:

  1. In the Tracking tree, ensure the required character is selected.
  2. On the Properties tab, in the General section, from the Retarget menu, select the required file.

  3. From the Retarget Skin menu, you can also select a retarget skin for visualization in Evoke. For example, you might have one retarget file for your character, but a number of differently colored skins to represent each player.

    To display correctly, the retarget skin must have the same skeleton as the retarget setup.

Retarget files are found in this default location:

C:\Users\Public\Documents\Vicon\Retargets

Installed retarget subject files are located by default in:

C:\Program Files\Vicon\Evoke1.#\Configuration\Retargets

Clear a calibration and un-assign clusters

Clearing a calibration stops the character solve and returns the skeleton to the default scale, ready for the next participant.

It also returns a character's assignable objects to their unassigned state, ready for their next use.

To return a character's calibration to the default scale and un-assign clusters:

  • In the Tracking tree, right-click the character name and then select Clear Calibration.
