...

  • Smart Object templates are provided for Backstrap, Hat, OculusRift, HPReverb, HTCVive, and Pulsar, so you don't need to create these.
  • Note that you can merge Smart Objects, which is useful for the head in particular.
  • If you are using a backpack, you must create your own template (see Use Smart Object templates). When you do this, adjust the object origin offset (see Change a Smart Object's origin) to match the depth of the backpack, so Evoke knows where the participant's back is in relation to the markers that it tracks on the backpack.
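The origin offset described above is simply a translation from the tracked marker cluster to the participant's back. As a rough illustration of the geometry only (this is not Evoke code; the function name, axes, and values are all assumptions for the sketch):

```python
# Illustrative sketch of the origin-offset geometry, NOT Evoke API code.
# The tracked Smart Object origin sits on the outer face of the backpack;
# the participant's back is that origin pushed along the cluster's local
# "toward the body" axis by the depth of the backpack.
import numpy as np

def back_position(cluster_origin, toward_body, backpack_depth_m):
    """Estimate the participant's back from the tracked cluster origin.

    cluster_origin: (3,) world position of the Smart Object origin.
    toward_body: (3,) direction from the markers toward the participant.
    backpack_depth_m: thickness of the backpack, in metres (hypothetical unit).
    """
    direction = np.asarray(toward_body, dtype=float)
    direction = direction / np.linalg.norm(direction)  # normalise
    return np.asarray(cluster_origin, dtype=float) + backpack_depth_m * direction

# Example: markers 12 cm in front of the back, facing -Y toward the body.
print(back_position([0.0, 0.0, 1.2], [0.0, -1.0, 0.0], 0.12))
```

The point of the sketch is that a wrong offset shifts every solved back position by the full depth of the backpack, which is why the template's origin must be adjusted rather than left at the marker plane.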

Also ensure you have enabled Evoke to track Pulsars and props by creating the necessary Smart Objects and basic objects (see Create Smart Objects and Create basic objects).

Tip

Object tracking only: If only object tracking is required, you can auto-assign clusters without character solving or retargeting.
To use this option, ensure the Advanced options are displayed, and in the Processing panel, under Characters From Clusters, select Disable Solving.

...

  1. As with the automated workflow, set up the name, template and retarget file, and select the required templates for the slots (see Create characters).
  2. At the right of the Character line, click Create.
    The new character is added to the Tracking tree, with the populated slots as child nodes. Slots for which Smart Objects were selected are automatically named to match the character and slot (eg, Miss Black_Head in the following example). The yellow warning icons indicate that some information is missing.

    Tip: After you have created a character, you can create multiple new characters based on the same configuration without having to use the Advanced properties. Just enter a new character name and click Create.

    You then assign objects to the empty fields as described in the next step.


  3. Assign objects (these can be a single Smart Object, a composite Smart Object or a basic object) to the slots. To do this:
    1. Ensure the objects are positioned in the volume in a way that makes it easy to tell which one is which (you may want to place them on a person or mannequin).
    2. In the Tracking panel, select the relevant objects, display the Advanced options and on the Properties tab make sure that Auto Assign Enabled is cleared.
    3. In the 3D Scene, select an object and Ctrl+select its slot in the Tracking tree, then right-click either the object or the slot and select Assign Object.
    4. In the same way, assign the remaining objects to the appropriate slots.
    In the Tracking tree, the slots now all have the correctly assigned objects, which are also displayed in the 3D Scene.

  4. If you have set up characters or composite Smart Objects, but have not linked Pulsars to them (eg, if you created Smart Objects from the supplied templates for some or all of the slots in a new character), link the new Smart Objects to the correct devices. To do this:
    1. Select an object and Ctrl+select the required Smart Object in the Tracking tree.
    2. Right-click either object and select Swap Cluster.
    3. If either of the Smart Objects is a composite Smart Object, from the sub-menu, select which device slot is to be affected by the swap.
      For more information, see Swap clusters.

When you have finished assigning objects, you can calibrate the character (see Calibrate characters).

...

  1. Ensure each person to be calibrated is standing in the volume in a neutral pose (known as an N-pose), which is a relaxed pose with the hands by the sides.
  2. In the Tracking tree, select the character(s) that are to be calibrated and right-click.
  3. In the context menu, click Calibrate (or to calibrate all characters, press Shift+C).
    (If the selected character is already calibrated, Recalibrate is displayed on the context menu. If you select this option, the existing calibration is overwritten.)

    Each character is calibrated and Evoke renders the character in the 3D Scene view. You can set the View Filters to show either the character source (solving) view or the retargeting view. The scale value that Evoke calculates for each character is used by the Unreal or Unity Plugin to render the character at the correct size for each participant.
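The per-character scale mentioned above is a single uniform factor. As a hypothetical sketch of how a game-engine plugin might apply such a factor (this is not the actual Unreal or Unity Plugin code; the function and data below are assumptions for illustration):

```python
# Illustrative only: a uniform per-participant scale applied to a template
# skeleton, in the spirit of what an engine plugin does with the scale
# value computed at character calibration. Not Evoke/Unreal/Unity code.
def scale_joint_positions(joint_positions, character_scale):
    """Uniformly scale template-space joint positions.

    joint_positions: list of (x, y, z) tuples in template units.
    character_scale: uniform scale factor for this participant.
    """
    return [(x * character_scale, y * character_scale, z * character_scale)
            for (x, y, z) in joint_positions]

# Hypothetical example: a 1.7 m template scaled for a shorter participant.
template = [(0.0, 0.0, 0.0), (0.0, 0.0, 1.7)]  # root and head positions
scaled = scale_joint_positions(template, 0.95)
print(scaled)
```

Because the scale is uniform, proportions of the template skeleton are preserved; only the overall size changes to match each calibrated participant.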

...