Run your Reality Engine output on a workstation and stream its rendered video frames to browsers and mobile devices over WebRTC.
For SDI video output signals, Reality Engine uses AJA video I/O cards. With the introduction of Pixel Streaming, it goes even further: the Reality Engine's output can now also be streamed to browsers and mobile devices.
With Pixel Streaming, you run output from Reality Engine on a desktop PC or workstation, along with a small stack of web services that are included with the Reality Engine. People connect using any modern Web browser on their platform of choice, whether desktop or mobile, and stream the rendered video frames from the Reality Engine application. There's no need for users to install or download anything. It's just like streaming a video from YouTube or Netflix, with the addition of any custom HTML5 UI that you create in your player Web page.
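In a player Web page, the incoming WebRTC media stream is ultimately attached to a standard HTML5 `<video>` element. The sketch below illustrates only that step; it is not the actual player code shipped with Reality Engine, and the element is typed structurally so the logic can also run outside a browser.

```typescript
// Minimal structural type so this logic runs outside a browser too.
// In a real player page, `video` is an HTMLVideoElement and `stream`
// is the MediaStream received over WebRTC.
interface VideoLike {
  srcObject: unknown;
  autoplay: boolean;
  play(): Promise<void> | void;
}

// Attach the received media stream to the video element and start playback.
function attachStream(video: VideoLike, stream: unknown): void {
  video.srcObject = stream; // hand the WebRTC stream to the <video> element
  video.autoplay = true;    // resume automatically if the stream restarts
  void video.play();        // kick off playback immediately
}
```

In an actual page, you would typically call `attachStream` from the peer connection's `ontrack` handler, passing the page's `<video>` element and the first received stream.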
Follow the steps below to stream the rendered output from your Reality Engine Project over your local network to browsers and mobile devices.
How Pixel Streaming Works
With Pixel Streaming, a Reality Engine workstation renders the final video output, using the resources available to that workstation — Video I/O, CPU, GPU, memory, and so on — to render every frame. It continuously encodes this rendered output into a media stream, which passes through a lightweight stack of Web services. Users can then view that broadcast stream in standard Web browsers running on other computers and mobile devices.
The diagram below shows how output from Reality Engine is streamed to multiple devices using a web browser.
Multiple outputs from Reality Engine are streamed to multiple devices using a web browser.
Pixel Streaming is not restricted to a single output from Reality Engine; multiple outputs can be streamed, depending on how the graph is configured. The PIXEL STREAM node has an INPUT pin that can receive output from any node, and multiple PIXEL STREAM nodes can be used to stream to multiple browsers. In the image below, two outputs are streamed to two independent browsers: one is the Program out and the other is the Multiviewer out.
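Purely as an illustration of the multi-stream setup (the stream names and port numbers below are hypothetical, not defaults taken from Reality Engine), each PIXEL STREAM node could be exposed on its own HTTP port, and a client picks the stream it wants by URL:

```typescript
// Hypothetical mapping from stream name to the HTTP port its
// PIXEL STREAM node was configured with in the .rgraph.
const streamPorts: Record<string, number> = {
  program: 80,     // Program out
  multiviewer: 81, // Multiviewer out
};

// Build the player-page URL for a named stream on a given engine host.
function streamUrl(host: string, stream: keyof typeof streamPorts): string {
  const port = streamPorts[stream];
  return `http://${host}:${port}/`;
}
```

For example, `streamUrl("192.168.1.10", "program")` returns `http://192.168.1.10:80/`, while the multiviewer stream would be reached on its own port.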
When you add a PIXEL STREAM node to the .rgraph and execute its "Start" function, it begins sending the media stream to the web browser through the configured HTTP port.
A client connects to the signaling server.
A direct connection is established between the client browser and the PIXEL STREAM node.
As soon as the connection between the client and the PIXEL STREAM node is established, the PIXEL STREAM node starts streaming media directly to the browser.
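The connection steps above can be sketched as a tiny state machine. The state and event names here are illustrative, chosen for this sketch, and are not part of the Reality Engine API:

```typescript
type StreamState = "disconnected" | "signaling" | "connected" | "streaming";
type StreamEvent = "clientConnects" | "peerConnectionEstablished" | "mediaFlowing";

// Advance the connection state for the events described in the steps above.
// Any out-of-order event leaves the state unchanged.
function next(state: StreamState, event: StreamEvent): StreamState {
  if (state === "disconnected" && event === "clientConnects") {
    return "signaling"; // step 1: client reaches the signaling server
  }
  if (state === "signaling" && event === "peerConnectionEstablished") {
    return "connected"; // step 2: direct browser <-> PIXEL STREAM node link
  }
  if (state === "connected" && event === "mediaFlowing") {
    return "streaming"; // step 3: media streams directly to the browser
  }
  return state;
}
```

Running the three events in order walks the state from `disconnected` to `streaming`; an event that arrives out of order has no effect.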
IP addresses - You'll need to know the IP address of the Reality Engine (the source of the streaming output). It's a good idea to get started with Pixel Streaming within a LAN or VPN, which means that you'll need the internal IP address of your computer. You can get this by running the ipconfig command from a command prompt or console window and finding the line that starts with IPv4 Address. If you're trying to connect from a computer or mobile device on a different network, you'll need your external IP address instead.
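If you'd rather script the lookup than read ipconfig output by eye, Node.js can report the same internal IPv4 addresses through its standard `os.networkInterfaces()` API. This is a convenience sketch, not something Reality Engine requires:

```typescript
import os from "node:os";

// Collect all IPv4 addresses on this machine, loopback included.
// Newer Node versions report `family` as the string "IPv4",
// older ones as the number 4, so accept both forms.
function ipv4Addresses(): string[] {
  const result: string[] = [];
  for (const infos of Object.values(os.networkInterfaces())) {
    for (const info of infos ?? []) {
      const family = String(info.family);
      if (family === "IPv4" || family === "4") {
        result.push(info.address);
      }
    }
  }
  return result;
}
```

On a typical workstation this returns the loopback address 127.0.0.1 plus the internal LAN address you would hand out to clients on the same network.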
Supported Client Browsers
Pixel Streaming playback works on any modern browser that supports the WebRTC protocol. For example, it has been tested and is known to work in recent versions of the following browsers without additional configuration:
Google Chrome (desktop and mobile)
Apple Safari (desktop and mobile)