To create an augmented pipeline, we will use the ready-made templates: right-click on the nodegraph, select Import Template > Composite with Cyclorama and Keyer, and add an AJADEVICE node. The nodegraph will look like this:

The SDI Inputs and Outputs

After making sure that the AJA card(s) are properly installed, the driver is up to date, and the firmware has been checked, configure the values on the AJA DEVICE and AJA INPUT nodes.

Please make sure the correct DEVICE ID is selected on the AJA INPUT node.

 

Choose the INPUTMODE to match the input pin used on the physical AJA card; in this example, it is SingleLink (1) as an HD input:

Now go to the AJA OUTPUT node and verify that the PROGRAM pin of the MIXER node is connected to the VIDEO pin of the AJA OUTPUT node.

Choose the OUTPUTMODE to match the output pin used on the physical AJA card; in this example, it is SingleLink (5) as an HD output:

It is also very important to choose the right VIDEO FORMAT for your SDI output; otherwise, you will see either a distorted image or no image at all on your display monitor.

Lens and Tracking

The lens and tracking nodes provide the visualization and configuration of the physical lens on the camera and of the tracking system installed in the studio. By default, the lens- and tracking-related nodes are as below:

Choose the correct lens for your system. If your lens is not yet on this list, choose the closest one that applies and consult the Zero Density Support Team to help determine the correct lens.

By default, the USER TRACK node is included in the templates. To see the other tracking devices available in Reality, right-click on the nodegraph and go to Create > Tracking, then choose and modify the tracking parameters as needed. Note that Reality is compatible with any tracking device that uses the FreeD protocol, which can be activated through the FREED nodes.
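
As background, the sketch below shows what a FreeD position packet looks like, based on the commonly published Type D1 layout (29 bytes carrying pan, tilt, roll, position, zoom, and focus). This is an illustration only, not Reality's internal implementation, and the scaling constants are assumptions that can vary between tracking vendors, so confirm them with your tracking solution provider.

```python
# Illustrative only: a minimal parser for the commonly published FreeD "Type D1"
# message (29 bytes). Reality's FREED node handles this internally; the scaling
# constants below are assumptions to verify against your tracking vendor's docs.

def _signed24(b: bytes) -> int:
    """Decode a big-endian signed 24-bit integer."""
    value = int.from_bytes(b, "big")
    return value - (1 << 24) if value & 0x800000 else value

def parse_freed_d1(packet: bytes) -> dict:
    if len(packet) != 29 or packet[0] != 0xD1:
        raise ValueError("not a 29-byte FreeD D1 packet")
    # Checksum: 0x40 minus the sum of the first 28 bytes, modulo 256.
    if (0x40 - sum(packet[:28])) % 256 != packet[28]:
        raise ValueError("FreeD checksum mismatch")
    return {
        "camera_id": packet[1],
        "pan_deg":  _signed24(packet[2:5])  / 32768.0,   # angles in 1/32768 degree
        "tilt_deg": _signed24(packet[5:8])  / 32768.0,
        "roll_deg": _signed24(packet[8:11]) / 32768.0,
        "x_mm":     _signed24(packet[11:14]) / 64.0,     # positions in 1/64 mm
        "y_mm":     _signed24(packet[14:17]) / 64.0,
        "z_mm":     _signed24(packet[17:20]) / 64.0,
        "zoom_raw":  int.from_bytes(packet[20:23], "big"),
        "focus_raw": int.from_bytes(packet[23:26], "big"),
    }
```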

Once you add the right tracking node, go to the DEVICE properties and make sure that:

  • ENABLE DEVICE is set to TRUE.

  • UDPPORT or PORT NAME is correct. This property is configured to match the physical device's UDP port or the port name on the engine.

  • PROTOCOL is correctly set to D1 or A1; consult your tracking solution provider if you are unsure which one applies.

After finishing the configuration of the lens and tracking nodes, you should see data flowing from the tracking device. Please check that the incoming data is correct and fine-tuned.
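
If you want to confirm, independently of Reality, that the tracking device is actually sending packets to the configured UDP port, a minimal listener such as the sketch below can help. The port number is an assumed example; use your own UDPPORT value, and run it while the engine is not bound to that port (for example, from another machine on the same network).

```python
# A quick, engine-independent sanity check (illustrative only): listen on the
# UDP port configured in the tracking device and confirm that packets arrive.
import socket

PORT = 40000  # assumed example; replace with your tracking device's UDP port

sock = socket.socket(socket.AF_INET, socket.SOCK_DGRAM)
sock.bind(("0.0.0.0", PORT))
sock.settimeout(5.0)

try:
    for _ in range(10):
        data, addr = sock.recvfrom(2048)
        # For FreeD D1, packets are 29 bytes and start with 0xD1.
        print(f"{len(data)} bytes from {addr[0]}, first byte 0x{data[0]:02X}")
except socket.timeout:
    print("No packets received; check the device's target IP and UDP port.")
finally:
    sock.close()
```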

Projection

In Reality, the talent is composited with the graphics in the 3D scene rather than through layer-based compositing. This is achieved by projecting the talent onto a mesh called the Projection Cube. The PROJECTION node requires an undistorted image of the SDI input, so we add an UNDISTORT node before connecting the SDI input to the VIDEO pin of the PROJECTION node, as shown below:
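
As a side note on what undistortion means: the UNDISTORT node uses the lens calibration to remove the optical distortion of the physical lens so that the video lines up with the virtual camera. The sketch below is a conceptual illustration only, using OpenCV's Brown-Conrady model with placeholder intrinsics and coefficients; it is not how Reality obtains or applies its calibration.

```python
# Conceptual illustration only: undistorting a captured frame with OpenCV.
# The camera matrix and distortion coefficients are placeholder values,
# not real lens calibration data.
import cv2
import numpy as np

frame = cv2.imread("sdi_frame.png")  # a captured SDI frame (placeholder path)
h, w = frame.shape[:2]

camera_matrix = np.array([[1500.0, 0.0, w / 2.0],   # fx, 0, cx (placeholder intrinsics)
                          [0.0, 1500.0, h / 2.0],   # 0, fy, cy
                          [0.0, 0.0, 1.0]])
dist_coeffs = np.array([-0.25, 0.08, 0.0, 0.0, 0.0])  # k1, k2, p1, p2, k3 (placeholders)

undistorted = cv2.undistort(frame, camera_matrix, dist_coeffs)
cv2.imwrite("sdi_frame_undistorted.png", undistorted)
```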

Modify the GEOMETRY of the PROJECTION CUBE so that it fits the scene and the physical studio.

If you have not changed the TRANSFORM of the PROJECTION CUBE, you can enter the values below, which correspond to 50 meters to the right, left, near, and far from the zero point of the set. These values describe a 100 x 100 meter projection area (entered in centimeters), which will be large enough for most productions.
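
For reference, the arithmetic behind those numbers, assuming the GEOMETRY values are entered in centimeters:

```python
# Unit conversion behind the suggested PROJECTION CUBE geometry.
extent_m = 50                # 50 meters to the right, left, near, and far
extent_cm = extent_m * 100   # 5000 cm per direction
width_cm = 2 * extent_cm     # left + right = 10000 cm = 100 m
depth_cm = 2 * extent_cm     # near + far   = 10000 cm = 100 m
print(extent_cm, width_cm, depth_cm)  # -> 5000 10000 10000
```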

Final Compositing

Finally, the camera input, virtual objects, and post-processing parameters all feed into the COMPOSITE PASSES node for the final output.