Below you will find short answers to frequently asked questions, improving your everyday Reality knowledge and helping you use Reality in the best possible way. Some answers contain links that lead to in-depth documentation on the topic.

Can I use the Cleanplates of Cyclorama in another engine?

Answer

As Reality is an image-based keyer, the image taken from a specific camera with a specific lens (which may have a different Lens Center Shift value) will differ in each separate system. We recommend taking the clean plates separately for each engine to make sure you get the best result from each one.

I have different lighting setups in my studio. Which lighting is the best to take clean plates for Cyclorama?

Answer

As Reality uses image-based keying, it is important to take the captures used as clean plates under your general lighting. If you observe a drastic change in keying quality when you change the lighting setup in your physical studio, it is advised to test and compare the results and take separate captures for clean plates. If you are using only talent lighting in close shots, however, this may not be necessary. Please refer to the link here for more information on Reality Keyer.

How to use FlyCam?

Answer

This is a step-by-step guide to creating FlyCam virtual camera movements, including a sample action file.

  • Launch your studio, open Reality Setup from the Reality Control applications, then right-click and select Import Template > TrackedTalent Template.

Figure 1: Reality Setup application.

  • Prepare the Video I/O setup by adding AJA Card and AJA Out nodes and selecting the input and output ports accordingly.

  • If you have a camera tracking system, replace ZDTLNT_CameraTrack node with a relevant tracking node.

  • Define the cyclorama size and location, and add a capture under the Functions tab of the Cyclorama node.

  • In the ZDTLNT_FLY_Offset node modify the Transform values to define the FlyCam position.

Figure 2: Custom Actor node for Fly Offset and its properties.

  • Once you decide the position to fly to, open the Reality Action Builder application (press F6 if you launched the studio from Launcher, or go to C:\Program Files\Zero Density\Reality Control Applications\2.x\ZDRealityActionBuilder.exe if you ran the project directly from the Reality Editor), and open the sample file below.

FlyCam.raction

  • Under the "FLY" action, select "SetNodePropertyTask" and enter the FlyCam position, the flight duration and the interpolation. In our sample action file, the duration is set to 3 seconds and the interpolation is EaseInOut. Please note that the "BACK" action has a duration of 2 seconds and its Transform value is set to "0".

Figure 3: Reality Action Builder application.

Note: You may read more about how to create actions with Action Builder here.

Is there an easy way to search for a property of the nodes?

Answer

Use the FILTER area to search for a specific property on the node. Do not forget to clear the filter before searching for another property or viewing the full list of the node's properties.

What is the DilateRGBA node? How can I use it?

Answer

Basically, it applies a shrinking process to the RGBA channels separately. This node is used to overcome edge problems caused by spill and camera artifacts. The DILATE ALPHA node must be used.

Why can't we receive data from the tracking system?

Answer

There are two likely reasons why the tracking node has stopped receiving data:

  • Windows Firewall is turned ON. You need to turn OFF the Windows Firewall.

  • Check the destination IP address, source IP address and UDP ports on your tracking device or its control device, and verify that these values match the engine's network configuration as well as the UDP Port on the TRACKING node in your nodegraph.

We have finished the tracking device setup in our studio. How can we make sure the tracking is well calibrated?

Answer

Please go to your Tracking node and make sure that the camera's CCD is situated at the zero point of the physical set. After confirming that the camera sensor is at the zero point, check the tracking by moving the camera within a 2-meter range in both the negative and positive directions. The pan, tilt and roll values should also be verified. For more information on tracking and lens calibration, please click here.

I am connected to my device via TeamViewer. Why doesn't Reality Editor launch?

Answer

You may have downloaded and installed Reality without any issues over a TeamViewer session, but Reality Editor requires by design that a monitor is always connected and never turned off. You may also choose a KVM configuration.

Is there a way to group the nodes in the nodegraph?

Answer

Yes, you can group any number of nodes regardless of their category. Please refer to this tutorial for more information.

Is there a simple way to add a node to the rgraph?

Answer

Yes. Just press the Tab key on the keyboard and a search dialog will appear, where you can start typing a node name and add it to your node graph much faster. This feature also saves time navigating to a specific node across its various categories.

Can Reality Engine work with or without GPUDirect technology?

Answer

Absolutely yes.

One of the biggest benefits of Reality Engine is that it works both with and without the GPUDirect feature. Reality Engine has an additional option to use GPUDirect to accelerate communication with video I/O devices, but you also have the liberty to work without it. This flexibility lets users choose how they want to work with Reality Engine.

What is GPUDirect and how to enable it in Reality Engine?

Answer

GPUDirect is an optimized pipeline for frame-based video I/O devices that maximizes the performance capability of the GPUs. GPUDirect for Video is a technology that enables faster use of the parallel processing power of the GPU for image processing, by permitting industry-standard video I/O devices to communicate directly with NVIDIA professional Quadro GPUs at ultra-low latency. For enabling GPUDirect in Reality Engine, please refer to the link here.

Which GPU should be used for Ray Trace?

Answer

RTX 6000 GPUs include dedicated ray tracing acceleration hardware, use an advanced acceleration structure and implement an entirely new GPU rendering pipeline to enable real-time ray tracing in graphics applications.

More information is available under Certified GPUs. For more information about ray tracing on RTX GPUs, see: https://developer.nvidia.com/discover/ray-tracing

Does Reality Engine support hardware Fill and Key?

Answer

Yes. It is possible to get Fill and Key through the AJA card's physical ports with the introduction of pixel format properties. This allows Reality to send Fill and Key video channels over independent SDI ports, or to accept Fill and Key channels from other CG systems for use as DSK.

See Fill and Key.

Does Reality Engine support Real-time Data driven graphics?

Answer

Yes. Reality 2.8 comes with a UMG-based sample project featuring real-time data-driven graphics for live football scores. The project uses JSON to fetch real-time data updates. Learn more about Creating and Running a Data Driven template.

Is there a cook server installer?

Answer

Yes. The Cook Server installer is an optional component that comes with the Reality Editor installer. More information is available in the Cook Server installation guide.

Why can't I see the Level Sequencer option in the Create menu on the node graph?

Answer

Level sequencers created in other versions must be reopened in the current version of Reality Editor; click "Save" to compile them for the current version.

See Level Sequencer.

How can I mask AR reflections?

Answer

When using a second camera in AR pipelines to cast reflections of the AR graphic on the real-world floor, you may want to exclude some real areas from receiving reflections. Create a 3D model of the places where you do not want to see reflections, connect it as a separate actor for the projection, and connect it to the ShowOnly pin of the reflection camera as shown below.

Note: Use the Reflection pin of the composite passes instead of the MaskedReflection pin, which was previously used with a second reflection camera in AR pipelines.

What are the Network ports and protocols for ZD Suite?

Answer

By default, Reality Engine uses only two TCP ports for operation (6665 and 6666), plus 5561 for the Cook Server.

  1. Reality Processing Engine Port
    This port is defined as 6666 in the default installation. It is used by control applications such as Setup, Action Builder, etc. to communicate with the Reality Engine(s).

  2. Reality .NET Port
    This port is defined as 6665 in the default installation. It is used to communicate with the Reality Agent application, which is responsible for starting, stopping and monitoring engine statuses. The agent should be configured to run at start-up; it listens for the Launcher to send "start" and "stop" commands.

  3. As for the tracking data UDP port, it is usually configurable through the tracking data supplier.

  4. Aside from the above, if you use a PixelStreaming node in your rgraph, the ports defined in that node will also be used in your workflow.

And if you have an API implementation through ActionBuilder, you can use other TCP/UDP ports according to your needs.

How To Define A Default Rgraph For Reality Editor?

Answer

  1. Right-click the Reality Editor shortcut on your desktop, select Properties, and append the path to your rgraph to the Target field: -Rgraph=<path to your rgraph>

  2. Hitting the Play button in the Editor will then load the defined rgraph automatically.

How to Create Hybrid Studio Rgraph?

Answer

In this document you will find how to create a hybrid studio rgraph. Hybrid studio configuration is a 3D Mask topic: we define the mask type according to the color of the graphic. Generally, there are four different colors with which we can define mask areas.

  • Black: Only graphic

  • Cyan: Video output

  • Red: Keying area

  • Yellow: Spill suppression

In order to create Hybrid Studio please follow the instruction below.

  • Create a Cyclorama Keyer nodegraph. Go to the Cyclorama node and click AddProjection under the Functions tab once you are done measuring the dimensions of the cyclorama. If you have any doubt about this step, please click here.

  • If you connect the Mask pin of the Cyclorama to a Video Mixer channel, you will see the mask output as in Figure 1, which means this area will be keyed if a model or object enters the greenbox. The black area will be full graphic.

Figure 1

  • We have now defined the borders of our virtual environment; it is time to define the real area. To do that, create a new Cyclorama node and change the mask color to Cyan. Since we now have two different mask types in our configuration, we need to merge them with a Merge node. You will see the two different masks in the channel output when you merge them. Change the transform values of the video-mask cyclorama and bring it near the first cyclorama. You can follow various scenarios depending on your real studio configuration.

Figure 2

  • Most probably you will see green spill on the output. To get rid of it, duplicate the video-mask cyclorama and change the mask color to Yellow. Just like the Cyan mask, you need to connect its Mask pin to the Merge node.

Figure 3

  • The Merge node properties and mask connections must look like Figure 4.

Figure 4

The hybrid studio is almost ready, but we also need to make some adjustments to get ready for a demonstration or an on-air show. There might be problems at the edges of the masks. There are a couple of ways to adjust them, depending on how accurate the tracking and lens calibration are. You can change the Smoothness of the Cyclorama to a value like 0.001 and/or change CleanplateFOV; another way is to open the capture you previously took for the clean plate in MS Paint, extend the green area, and load it again under the Cyclorama node.

How to create animation list and play them from blueprints?

Answer

  • Create a blueprint actor in your project and add a skeletal mesh component to your blueprint

  • Select the Skeletal Mesh and set its Animation Mode to "Use Animation Asset"

  • Add a ZDActor component and set the name under which your blueprint will appear in Reality Setup (we'll call this Hero)

  • Add a Custom Event in the Event Graph (we'll call this event START)

  • Create a variable, select Anim Sequence Base object reference as the variable type, and make this variable an array. Make it public and compile your blueprint. After these steps, select your animations in the Default Value section of this variable (we'll call this variable Animations)

  • Compile and save your blueprint and return to the Content Browser of your project. Create an Enumeration under the Blueprints tab (we'll call this AnimList)

  • Open the created Enumeration and, by clicking the "New" button at the top right of the window, create a list naming all the animations you have. Keep in mind that these names will be visible to the user in Reality Setup. After you are finished, save and close the Enumeration. (Our animation names are Emote1, Emote2 and Emote3.)

  • Now open your blueprint again and create an Integer variable to select which index of your animations and Enumeration will be played. This variable does not have to be public. For better usability, adjust the Slider Range and Value Range of your Integer; if you have 3 animations as in this example, you can select 0 – 2. Compile your blueprint again.

  • Create a new variable, make it public, select the name of the Enumeration list you created under the Enum tab of Variable Type, and compile your blueprint. You should now see the list you created as a dropdown on your variable.

  • Create the Event Graph as shown below, then compile and save the blueprint. For a detailed step-by-step creation of this event graph, see the end of this document.

  • Create a new function and name it OnChanged_AnimationsList. Right-click on an empty space in the OnChanged_AnimationsList window, select "Call Function > Start", and connect the node to the ON CHANGED ANIMATIONS LIST node. Now compile, save and close the blueprint.

  • Drag this blueprint into the project; it should be visible in the World Outliner.

  • Press Play. You should be able to see the multiviewer screen of Reality. Now open the Reality Setup application. You should see the Hero node automatically created, and when you click on this node, you can select the animation to play under Default > Animations List as a dropdown list.

How to Create BP_Hero Event Graph

  1. Drag and drop AnimationsList Enum variable to the Event Graph Window and Select “Get AnimationsList”

  2. Click and drag the output pin of the Animations List reference and choose Math > Conversions > ToInt (Byte)

  3. Drag and Drop Index Variable to the Event Graph Window and Select “Set Index”

  4. Connect Exec output of the Start Event to the Set Index Node

  5. Connect Animations Lists Converted Integer Output to the Integer Input of the Set Index Node

  6. Drag and Drop Animations Anim Sequence Base Variable to the Event Graph Window and Select “Get Animations”

  7. Click and drag the output pin of the Animations reference and choose "Utilities > Array > Get (a copy)"

  8. Connect the Integer output pin of the Set Index node to the Get Animations node's Integer input pin

  9. Click and drag the Exec output pin of the Set Node and Choose “Components > Animation > Play Animation (SkeletalMesh)”

  10. Connect the output of Get Animations Node to “New Anim to Play” Input of “Play Animation” Node.

  11. Enable or disable Looping on “Play Animation” Node

How to make a basic teleportation with Reality Setup?

Answer

The easiest and most basic teleportation setup uses the Billboard node found under UE Actor Nodes. Follow the steps below to create a basic teleportation in your projects:

  • You need to key the talent you want to teleport separately and feed this keyed video to the Video pin of the Billboard node.

  • Connect the Tracking pin of the Billboard node to your real tracking.

  • Enable the "TalentFacing" property of your Billboard node so that the virtual billboard faces the camera no matter where your camera moves.

  • Position this billboard at the desired location in your project.

You have a few options for giving this billboard a teleportation effect:

  • You can enable and disable its "EnableRender" property for an instant appearance and disappearance of your talent.

  • You can change the scale of the billboard to give a stretching effect.

  • You can change the transform values of the Billboard node to give it a sliding effect.

  • You can even combine these methods to get the result that best suits your project.

You can see a basic rgraph example below.

How to map network R drive?

Answer

  • Find the R drive folder on your network using File Explorer (this folder needs to contain a reality folder, which in turn contains the assets, cooks, project, studios and config folders)

  • Right-click on the folder you want to map as the R drive and choose the "Map network drive…" option

  • Select the letter "R" for the drive and click "Finish"

  • When you click Finish, a new window should open showing the reality folder, and on the left side of the window you should see your new network drive. After following these steps, you can configure your config files and reach the R drive without any problems.

Figure: Inside of the reality folder

Figure: This PC

To get more detailed information about R drive configuration, please visit: R Drive Mapping

How to do Multi-Format video IO with Reality?

Answer

In the AJACARD node, you can open the hidden tab under the Device category to reveal the UseMultiFormat property. Enabling this property allows users to do multi-format input/output.

UseMultiFormat property on AjaCard Node

*This feature was introduced in Reality 2.9. Older versions might not support multi-format video I/O.

What are the supported HDR formats in Reality?

Answer

We are planning to implement HDR in Reality 2.9 with the ST2084 (PQ) and HLG standards. These are the most widely used standards in the industry today.


How to render background graphics only?

Answer

There are two ways to render the background separately in Reality Setup.

1st Way:

It is possible to use a second virtual camera to render background graphics. Connecting the Projection cube to the Hidden pin of the second virtual camera render node's output will hide the projected video in that render, so only background graphics will be rendered. This way, reflections will not be lost in the final composite output, but the second rendering will increase the load on the GPU. Below is a screenshot of this process: on Channel 1, the ZDCamera node is used to project video and make the final composite, and on Channel 2, the ZDCamera_0 node is used to render background graphics only.

Channel1: Final Composite Output

Channel2: Background Graphics

2nd Way:

If losing reflections and refractions is not important for the project, it is possible to use a Color node with 0 alpha instead of video on the Projection node. This method will not project keyed talents into the 3D world, so there will be no reflections or refractions of the keyed talents, but the GPU load will not increase. Below is a screenshot of this process: a Color node with 0 alpha is connected to the Video input pin of the Projection node. Now the camera render can be used both for compositing and for showing background graphics.

Channel1: Final Composite Output

Channel2: Background Graphics

How can I use Timecode Event Trigger node and its functionality?

Answer

You can use a TIMECODEEVENT TRIGGER node to trigger events inside your blueprints at exact timecodes and to queue them. The TIMECODEEVENT TRIGGER node is not intended to be used through the GUI; it is rather designed to be called through an API to access and change the node's values. Overriding the AJA timecode FSM is not available, as this timecode is a counter of frames since the beginning of the year 2000.

How to add a timecode for executing a later event?

Go to the Functions tab of the TIMECODEEVENT TRIGGER node, type a TriggerTime and click the EXECUTE button on AddTime, as in the attached screenshot. This adds the value to the queue shown in the Properties of the same node. The timecode trigger clears a trigger time as soon as it has fired: because that timecode is now in the past and will never occur again, it is deleted from the queue.

How to run your project using command prompt on Editor mode?

Answer

You can use the command line to run a Reality Engine project. Follow the steps below:

  1. Press the Windows key and type Command Prompt in the search box.

  2. Enter the command below. Modify it according to your installation and project folder structure.

"C:\Program Files\Zero Density\Reality Editor\2.8\Engine\Binaries\Win64\UE4Editor.exe" "R:\Reality\Projects\StarterProject\StarterProject.uproject" -game

  3. Press Enter.

Reality Engine will then launch in editor mode.

How to set integer variable via API?

Answer

// Sets an integer property on the named node in every root nodegraph.
public void SetPropertyInt(string PropertyName, string NodeName, object Value, float StartTime, float Duration, string Interpolation)
{
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var _IntegerProperty = new ZDProperty_Integer(PropertyName, "Default", "", (int)Value);
        aGraph.RequestSetNodeProperty(NodeName, _IntegerProperty, StartTime, Duration, Interpolation);
    }
}

How to set string variable via API?

Answer

// Sets a string property on the named node in every root nodegraph.
public void SetPropertyString(string PropertyName, string NodeName, object Value, float StartTime, float Duration, string Interpolation)
{
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var _StringProperty = new ZDProperty_String(PropertyName, "Default", "", (string)Value);
        aGraph.RequestSetNodeProperty(NodeName, _StringProperty, StartTime, Duration, Interpolation);
    }
}

How to set array via API?

Answer

// Sets a string-array property on the named node; this example assigns a hardcoded list.
public void SetStringArray(string PropertyName, string NodeName, object Value, float StartTime, float Duration, string Interpolation)
{
    ObservableCollection<string> array = new ObservableCollection<string>();
    array.Add("First1");
    array.Add("Second2");
    array.Add("Third3");

    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var properties = aGraph.Nodes[NodeName].Properties;
        foreach (var prop in properties)
        {
            if (prop.MemberName == PropertyName)
            {
                ((ZDProperty_String)prop).StringValues = array;
                aGraph.RequestSetNodeProperty(NodeName, prop, 0, 0, "Jump");
            }
        }
    }
}

How to set boolean variable via API?

Answer

// Sets a boolean property on the named node in every root nodegraph.
public void SetPropertyBoolean(string PropertyName, string NodeName, object Value, float StartTime, float Duration, string Interpolation)
{
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var _BooleanProperty = new ZDProperty_Boolean(PropertyName, "Default", "", (bool)Value);
        aGraph.RequestSetNodeProperty(NodeName, _BooleanProperty, StartTime, Duration, Interpolation);
    }
}

How to set float variable via API?

Answer

// Sets a float property on the named node in every root nodegraph.
public void SetPropertyFloat(string PropertyName, string NodeName, object Value, float StartTime, float Duration, string Interpolation)
{
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var _FloatProperty = new ZDProperty_Float(PropertyName, "Default", "", (float)Value);
        aGraph.RequestSetNodeProperty(NodeName, _FloatProperty, StartTime, Duration, Interpolation);
    }
}

How to get float variable via API?

Answer

// Reads a float property from the named node; returns 0.0F if the property is not found.
public float GetPropertyFloat(string PropertyName, string NodeName)
{
    float value = 0.0F;
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var properties = aGraph.Nodes[NodeName].Properties;
        foreach (var prop in properties)
        {
            if (prop.MemberName == PropertyName)
            {
                value = ((ZDCore.Model.ZDProperty_Float)prop).Value;
            }
        }
    }
    return value;
}

How to get string variable via API?

Answer

// Reads a string property from the named node; returns an empty string if the property is not found.
public string GetPropertyString(string PropertyName, string NodeName)
{
    string value = "";
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var properties = aGraph.Nodes[NodeName].Properties;
        foreach (var prop in properties)
        {
            if (prop.MemberName == PropertyName)
            {
                value = ((ZDCore.Model.ZDProperty_String)prop).Value.ToString();
            }
        }
    }
    return value;
}

How to get transform variable via API?

Answer

// Reads a transform property from the named node; returns a zeroed transform if not found.
public ZDTransform GetPropertyTransform(string PropertyName, string NodeName)
{
    ZDTransform value = new ZDTransform(new ZDVector(0.0F, 0.0F, 0.0F), new ZDRotator(0, 0, 0), new ZDVector(0, 0, 0));
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var properties = aGraph.Nodes[NodeName].Properties;
        foreach (var prop in properties)
        {
            if (prop.MemberName == PropertyName)
            {
                value = ((ZDCore.Model.ZDProperty_Transform)prop).Value;
            }
        }
    }
    return value;
}

How to get boolean variable via API?

Answer

// Reads a boolean property from the named node; returns false if the property is not found.
public bool GetPropertyBoolean(string PropertyName, string NodeName)
{
    bool value = false;
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var properties = aGraph.Nodes[NodeName].Properties;
        foreach (var prop in properties)
        {
            if (prop.MemberName == PropertyName)
            {
                value = ((ZDCore.Model.ZDProperty_Boolean)prop).Value;
            }
        }
    }
    return value;
}

How to get int variable via API?

Answer

// Reads an integer property from the named node; returns 0 if the property is not found.
public int GetPropertyInteger(string PropertyName, string NodeName)
{
    int value = 0;
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var properties = aGraph.Nodes[NodeName].Properties;
        foreach (var prop in properties)
        {
            if (prop.MemberName == PropertyName)
            {
                value = ((ZDCore.Model.ZDProperty_Integer)prop).Value;
            }
        }
    }
    return value;
}

How to set transform variable via API?

Answer

// Sets a transform property on the named node in every root nodegraph.
public void SetPropertyTransform(string PropertyName, string NodeName, object Value, float StartTime, float Duration, string Interpolation)
{
    foreach (ZDRootNodeGraph aGraph in World.NodeGraphs)
    {
        var _TransformProperty = new ZDProperty_Transform(PropertyName, "Default", "", (ZDTransform)Value);
        aGraph.RequestSetNodeProperty(NodeName, _TransformProperty, StartTime, Duration, Interpolation);
    }
}

How to prepare an action for changing the file path?

Answer

VIDEO.raction

  • You can download the "ActionforVideoWall.mp4" file from this document. In this video you will see, step by step, how to prepare an action that dynamically changes the file path of a video wall using the Media Input node.

  • You can also download the VIDEO.raction file and modify it.

How to use 3rd party plugin in Reality Editor?

Answer

  • Please make sure the plug-in and the Reality Editor version you are using are compatible (the same). You can follow these steps to find out your Reality Editor version.

  • Please open the Reality Editor

  • You will see "Reality Editor X.X.X based on Unreal Editor Y.Y.Y"

Figure 1: Learning the current version of Reality (Unreal) Editor

Figure 2: Choosing the correct version of plug-in

  • Please download and install the 3rd party plug-in(s) to "C:\Program Files\Zero Density\Reality Editor"

  • After the installation, you need to change the Build Id of the plug-in(s) manually:

    1. Please go to "C:\Program Files\Zero Density\Reality Editor\Engine\Plugins\Marketplace\Substance\Binaries\Win64"

    2. Right-click "UE4Editor.modules" and open it with Notepad

    3. Copy the "BuildId" value (a sketch of the file is shown below Figure 3)
Figure 3: Build ID

  • Please go to "C:\Program Files\Zero Density\Reality Editor\Engine\Plugins" and locate your new plug-in folder(s)

  • Please go to "C:\Program Files\Zero Density\Reality Editor\Engine\Plugins\XXXXX\Binaries\Win64"

  • Right-click "UE4Editor.modules" and open it with Notepad

  • Paste the copied "BuildId" value and save the file

  • If you have more than one plug-in, repeat these steps for each of them

Figure 4: Before the change of Build ID

Figure 5: After the change of Build ID

Figure 6: Enable the plug-in(s)