Version: Reality 5.4 SP3

Context

In this tutorial, we'll start by launching our project without using a graph. Next, we'll create a node network by adding a UE5 node to the nodegraph canvas. We'll also spawn a Reality Camera and a Reality Tracked Billboard.

Then, we'll add AJAIn and AJAOut nodes to manage video inputs and outputs directly within the nodegraph. After that, we'll add camera tracking and talent tracking nodes, adjusting their properties to fit our R&D studio setup.
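Before we open the canvas, it may help to picture the topology we're about to build: video enters through AJAIn, passes through processing nodes, and leaves through AJAOut. The sketch below is purely conceptual and is not Reality's scripting API; the `Node` class, the pin names, and `connect()` are all invented here to illustrate the data flow.

```python
# Conceptual sketch of the data flow we'll build on the canvas.
# This is NOT Reality's API: Node, its pins, and connect() are
# invented purely to illustrate the topology.
from dataclasses import dataclass, field


@dataclass
class Node:
    name: str
    inputs: dict = field(default_factory=dict)  # pin name -> (source node, source pin)

    def connect(self, pin: str, source: "Node", source_pin: str = "Out") -> None:
        """Wire this node's input pin to another node's output pin."""
        self.inputs[pin] = (source, source_pin)


aja_in = Node("AJAIn")        # fill signal arriving from the AJA card
keyer = Node("RealityKeyer")  # pulls the key from the green screen
aja_out = Node("AJAOut")      # sends the composited program back out

keyer.connect("Video", aja_in)
aja_out.connect("Video", keyer)

for node in (keyer, aja_out):
    for pin, (src, src_pin) in node.inputs.items():
        print(f"{src.name}.{src_pin} -> {node.name}.{pin}")
```

In the actual nodegraph you create these connections by dragging between node pins; the sketch only mirrors that topology.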

In the following steps, we'll set up a keying scenario using Reality Keyer and Cyclorama nodes. We'll add a TrackedTalentFly node to the nodegraph canvas and perform various dynamic node operations to create a Rotation Offset and a Location Offset. Additionally, we'll use the Node Property Context Menu to expose different properties as inputs.
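One way to think about these offsets is as a small transform layered on top of the incoming tracking data. The sketch below makes assumptions about axis conventions (Z-up, yaw about the Z axis) and units, and is only meant to show the idea, not Reality's internals: the rotation offset turns the talent, and the location offset is rotated by the resulting heading so the fly stays relative to where the talent faces.

```python
# Illustration of layering a Rotation Offset and Location Offset on top of
# tracked talent data. Axis conventions (Z-up, yaw about Z) and units are
# assumptions for this sketch, not Reality's internals.
import math


def apply_offsets(pos, yaw_deg, loc_offset, rot_offset_deg):
    """Rotate the talent by rot_offset_deg, then move it by loc_offset
    expressed in the talent's (already rotated) local frame."""
    yaw = math.radians(yaw_deg + rot_offset_deg)
    dx, dy, dz = loc_offset
    # Rotate the local offset vector into world space around the Z axis.
    world_dx = dx * math.cos(yaw) - dy * math.sin(yaw)
    world_dy = dx * math.sin(yaw) + dy * math.cos(yaw)
    x, y, z = pos
    return (x + world_dx, y + world_dy, z + dz), yaw_deg + rot_offset_deg


# Talent tracked at (100, 50, 0) facing 90 degrees; fly 30 units "forward".
new_pos, new_yaw = apply_offsets((100.0, 50.0, 0.0), 90.0, (30.0, 0.0, 0.0), 0.0)
print(new_pos, new_yaw)  # forward at 90 degrees moves along +Y
```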

In the second phase, we'll create In and Out animations using the Actions module, which involves setting an Initial Keyframe.
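The Initial Keyframe matters because it pins the property's starting value; the action then interpolates from that keyframe toward the target over the animation's duration. As a rough illustration of the concept only, here is plain linear interpolation between keyframes (Reality's actual easing curves may differ):

```python
# Plain linear interpolation between (time, value) keyframes; the Actions
# module may apply different easing, so treat this as the concept only.
def sample(keyframes, t):
    """Return the property value at time t for (time, value) keyframes."""
    keyframes = sorted(keyframes)
    if t <= keyframes[0][0]:
        return keyframes[0][1]  # before the initial keyframe: hold its value
    for (t0, v0), (t1, v1) in zip(keyframes, keyframes[1:]):
        if t <= t1:
            alpha = (t - t0) / (t1 - t0)
            return v0 + alpha * (v1 - v0)
    return keyframes[-1][1]     # after the last keyframe: hold its value


# "In" animation: scale from 0 (initial keyframe) to 1 over half a second.
in_anim = [(0.0, 0.0), (0.5, 1.0)]
for t in (0.0, 0.25, 0.5):
    print(t, sample(in_anim, t))  # 0.0, 0.5, 1.0
```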


Before You Begin

Before starting, ensure the following:

  • You are receiving a signal from your AJA card and have selected the correct device details.
  • Your talent tracking and camera tracking data are healthy. You can verify this through your dedicated device/interface or with the corresponding nodes. In our example, we use the Xync node for camera tracking data and the FreeD node for talent tracking data generated by Traxis Talent Tracking; a sketch for sanity-checking FreeD packets follows this list.
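As mentioned in the list above, here is a minimal sketch for sanity-checking incoming FreeD data at the packet level. It follows the commonly published 29-byte FreeD D1 layout (signed 24-bit fields, 1/32768-degree angle units, 1/64 mm position units, and a 0x40-based checksum), but vendors sometimes vary these details, and the UDP port below is only an assumption; verify both against your tracker's documentation.

```python
# Sanity check for incoming FreeD (type D1) tracking packets, following the
# commonly published 29-byte layout. Scale factors and the UDP port below
# are assumptions; confirm them against your tracking vendor's documentation.
import socket
import struct


def s24(b: bytes) -> int:
    """Decode a signed 24-bit big-endian integer."""
    return struct.unpack(">i", (b"\xff" if b[0] & 0x80 else b"\x00") + b)[0]


def parse_freed_d1(pkt: bytes):
    if len(pkt) != 29 or pkt[0] != 0xD1:
        return None
    if (0x40 - sum(pkt[:28])) & 0xFF != pkt[28]:
        return None  # checksum failed -> unhealthy data
    return {
        "camera_id": pkt[1],
        "pan_deg": s24(pkt[2:5]) / 32768.0,
        "tilt_deg": s24(pkt[5:8]) / 32768.0,
        "roll_deg": s24(pkt[8:11]) / 32768.0,
        "x_mm": s24(pkt[11:14]) / 64.0,
        "y_mm": s24(pkt[14:17]) / 64.0,
        "z_mm": s24(pkt[17:20]) / 64.0,
    }


sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", 40000))  # assumed FreeD UDP port for this sketch
sock.settimeout(5.0)           # fail fast if nothing is arriving
try:
    data, _ = sock.recvfrom(64)
    print(parse_freed_d1(data) or "bad packet: check length/checksum")
except socket.timeout:
    print("no FreeD packets received: check network and tracker output")
```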
Essential

It is crucial to keep your broadcast camera fully zoomed out. Zooming in or out during the fly can produce unwanted results.