UE4 Display Cluster in CAVE

Game Engine Evaluation and Implementation of Distributed Real-Time Rendering in Sensorimotor Laboratory CAVE

UE4 Display Cluster Summary

  • Game Engine based Distributed Real-time Rendering
  • Stereoscopy and 360° Video Playback
  • MoCap and Eye Tracking Support
  • Laboratory Management System Integration
Video: 3D Scene Stereoscopic Rendering in the CAVE

The story behind

Distributed Real-time Rendering in CAVE


The aim was to evaluate different game engines with respect to the functionality needed for applying them in research contexts at the Institute of Sport Science (ISPW).

For the selected game engine, a display cluster for distributed real-time rendering was set up. A game with 3D and 360° video scenes was also implemented.

To integrate the game engine into the recently developed experiment management system, the game engine was extended by a messaging interface.

  • Distributed Real-time Rendering
  • Support for Stereoscopy and 360° Video Playback
  • Support for Processing Data generated from MoCap and Eye Tracking Devices
  • Service Bus Integration by Messaging Interface


Fig.: 3D Rendering in the CAVE
Fig.: 360° Video Playback in the CAVE

Game Engine Evaluation

The applicability criteria were the possibility of distributed operation, including synchronisation of game state and interaction across multiple game engine instances, but also the fidelity of the graphics and sound rendering as well as the ease of the content-creation workflow.

To this end, the game engine Godot was compared with the world's most widely used game engines, Unity3D and Unreal Engine 4. UE4 won the race, not least because of its Blueprint visual scripting system and above all because of its cluster rendering plugin ‘nDisplay’, which offers a Blueprint API as well as VRPN and nVidia GPU technology support.

Fig.: UE4 Plugin nDisplay for Clustered Rendering
Fig.: UE4 nDisplay Network with Cluster Nodes of varying Specifications

Display Cluster Setup

System Management

The CAVE rendering cluster workstations were set up running Windows 10. For system administration, the Chocolatey package manager and the RESTful, WinRM-based Windows Admin Center were put in charge. For remote management, the open-source tool Multi-Remote Next Generation (mRemoteNG) was installed on the launcher workstation. The software supports a variety of protocols; we make use of RDP, VNC and SSH and have also included external tools such as PowerShell Session, the PuTTY SSH client and the FileZilla FTP client. To support the development of the messaging interface, the network analysis tool Wireshark was also installed.

Fig.: GUI of Multi-Remote Tool mRemoteNG
Parallel Rendering

Each of the seven rendering cluster workstations operates with two nVidia Quadro M4000 graphics cards – except for the front and back wall workstations, which have one GPU each. Each graphics card is dedicated to a single one of the twelve projectors. In stereo mode the left- and right-eye images, or viewports, are rendered separately. A rendering workstation with two graphics cards therefore manages four viewports. These viewports are rendered in parallel and managed using nVidia Mosaic technology.

Fig.: nVidia Mosaic Parallel Rendering Set Up on a Rendering Workstation
Fig.: Parallel Rendering in Stereo Mode as seen in the CAVE
Warp and Blending

Depending on the position and orientation of a projector relative to the wall or floor, the projected image is distorted. To correct the image geometry, the images are warped.

Since the projections overlap, certain regions are illuminated by more than one projector, which results in bright stripes in the overall image. This is compensated by adjusting the luminous intensity and black levels of the images and projectors – the blending.
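
The intensity compensation in an overlap region can be illustrated with a minimal sketch – a plain linear ramp with made-up pixel dimensions; the actual per-pixel correction comes from the Scalable Display calibration:

```python
def blend_weight(x, width, overlap):
    """Linear blend ramp for a horizontal projector overlap.

    x: pixel column in this projector's image, 0..width-1
    width: image width in pixels (made-up value in the example)
    overlap: width of the region shared with the right-hand neighbour

    Returns a weight in [0, 1]; the neighbouring projector applies the
    complementary ramp, so the summed brightness in the overlap stays
    constant instead of producing a bright stripe.
    """
    if x < width - overlap:
        return 1.0                           # lit by this projector only
    return (width - 1 - x) / (overlap - 1)   # fade out towards the edge

# Full intensity outside the overlap, fading to zero at the image edge:
blend_weight(0, 1920, 100)     # 1.0
blend_weight(1919, 1920, 100)  # 0.0
```

A real calibration additionally raises the black level outside the overlap, since projector black is never fully dark.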

To manage these complex corrections we make use of a calibration tool from Scalable Display Technologies. The correction information for each projector is then stored as EasyBlend files – a format which in turn can be used in the UE4 nDisplay configuration.

Fig.: Warped and Blended Left Wall Viewports
Cluster Synchronisation

Combining each of the nVidia Quadro graphics cards with an nVidia Sync card provides support for generator locking (genlock), synchronised framebuffer swap (framelock) and vertical image synchronisation with the corresponding projectors (vSync) to avoid tearing. We use an overall refresh rate of 60 Hz.

The system time of the cluster workstations is synchronised in the nanosecond range via the Precision Time Protocol (PTP). An Ubuntu Linux server running PTPd serves as the Stratum 2 time server; the cluster workstations run w32tm as Stratum 3 clients.
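
The clock correction behind PTP can be sketched from the textbook two-timestamp exchange – a simplification of what PTPd actually implements:

```python
def ptp_offset_delay(t1, t2, t3, t4):
    """Offset and mean path delay from one PTP Sync/Delay_Req exchange.

    t1: Sync sent by the master      t2: Sync received by the slave
    t3: Delay_Req sent by the slave  t4: Delay_Req received by the master
    Assumes a symmetric network path (the standard PTP assumption).
    """
    offset = ((t2 - t1) - (t4 - t3)) / 2   # slave clock minus master clock
    delay = ((t2 - t1) + (t4 - t3)) / 2    # one-way network delay
    return offset, delay

# Example: slave clock 5 µs ahead, 2 µs one-way delay (times in µs):
ptp_offset_delay(0, 7, 10, 7)  # (5.0, 2.0)
```

The slave then slews its clock by the computed offset; repeating the exchange keeps the clocks locked.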

Fig.: nVidia Display Synchronisation Settings on a Rendering Workstation
Fig.: w32tm Status Query of PTP Stratum 3

UE4 Plugins

Distributed Game Engine

With UE4 a project called Distributed Game Engine (DGE) was created, and the edited assets were packed into a plugin of the same name. Among other things it contains a model of the CAVE created with Blender, function libraries, a DGE messaging manager and all objects that can be accessed through the ESB messaging interface (see below).

Fig.: UE4 Blueprint Plugin ‘Distributed Game Engine’
Fig.: Content of UE4 Plugin ‘Distributed Game Engine’
Fig.: Model of the CAVE with a set of geometry sockets in UE4 Editor
3D Rendering

To demonstrate the 3D capabilities of UE4 in the render cluster, a 3D level was created showing static, rotating and moving objects using physically based materials, as well as animated avatars. The newly created sports equipment was packed into a plugin.

Fig.: UE4 Blueprint Plugin ‘Sports Equipment’
Fig.: Content of UE4 Plugin ‘Sports Equipment’
Fig.: 3D Level in UE4 Editor
360° Video Playback

To enable immersive video playback, a UE4 Blueprint plugin was developed. In Blender a sphere with inverted normals was created, which is used in UE4 as a screen for spherical 360° video playback. By default UE4 embeds the Windows Media Foundation (WMF) media player for file-based video playback. Visualisation of streamed video is also supported, e.g., using a VLC plugin.
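
What the spherical screen effectively does is sample an equirectangular video frame by view direction. A minimal sketch of that mapping (the axis convention here is an assumption for illustration – x forward, y right, z up):

```python
import math

def equirect_uv(x, y, z):
    """Map a unit view direction to equirectangular texture coordinates.

    Longitude becomes u and latitude becomes v, both in [0, 1] – the
    lookup a sphere with inverted normals performs implicitly through
    its UV layout when a 360° frame is applied as a texture.
    """
    u = 0.5 + math.atan2(y, x) / (2 * math.pi)   # longitude -> horizontal
    v = 0.5 - math.asin(z) / math.pi             # latitude -> vertical
    return u, v

# Looking straight ahead lands in the centre of the video frame:
equirect_uv(1.0, 0.0, 0.0)  # (0.5, 0.5)
```

Inverting the sphere's normals makes the inside faces visible, so the viewer standing at the centre sees the full frame wrapped around them.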

Fig.: UE4 Blueprint Plugin ‘Immersive Video’
Fig.: Content of UE4 Plugin ‘Immersive Video’
Fig.: Spherical 360° Video Playback Level in UE4 Editor
MoCap and Eye Tracking Support

By default, UE4 nDisplay is able to process motion capture data from, e.g., OptiTrack over VRPN. OptiTrack itself also provides a UE4 plugin for processing full-body motion capture data. We ultimately process MoCap position data streamed by the Laboratory Management System.
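
Feeding streamed position data into the engine involves a coordinate conversion. The sketch below shows one plausible mapping; the concrete axis assignment is an assumption and in practice depends on the tracking-volume calibration:

```python
def mocap_to_ue4(x, y, z):
    """Convert a tracker position (metres, right-handed, Y-up, as
    OptiTrack commonly streams it) to UE4 units (centimetres,
    left-handed, Z-up).

    The axis permutation below is one common choice, not the
    project's actual mapping.
    """
    return (z * 100.0, x * 100.0, y * 100.0)

# A marker at 2 m forward, 1.8 m up, 0.5 m to the side:
mocap_to_ue4(0.5, 1.8, 2.0)  # (200.0, 50.0, 180.0)
```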

Eye tracking data generated from, e.g., PupilLabs is likewise streamed by the Laboratory Management System and processed in UE4 using the ESB Messaging plugin (see Integration). We developed a UE4 Blueprint plugin which provides content for eye tracking data visualisation and processing, e.g., ray tracing and hit calculation. PupilLabs calibration shapes and a set of AprilTag textures are also included.
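
At its core, the hit calculation is a ray–plane intersection: the gaze ray is traced from the viewer until it meets a CAVE wall. A simplified stand-in for the line traces performed in UE4:

```python
def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Intersect a gaze ray with a CAVE wall modelled as a plane.

    origin/direction: the gaze ray (3-tuples, direction need not be unit)
    plane_point/plane_normal: any point on the wall and its normal
    Returns the hit point, or None if the ray is parallel to the wall
    or the wall lies behind the viewer.
    """
    denom = sum(d * n for d, n in zip(direction, plane_normal))
    if abs(denom) < 1e-9:
        return None                      # ray parallel to the wall
    diff = [p - o for p, o in zip(plane_point, origin)]
    t = sum(f * n for f, n in zip(diff, plane_normal)) / denom
    if t < 0:
        return None                      # wall is behind the viewer
    return tuple(o + t * d for o, d in zip(origin, direction))

# Gaze from the CAVE centre straight at a front wall 2 m away:
ray_plane_hit((0, 0, 0), (1, 0, 0), (2, 0, 0), (-1, 0, 0))  # (2.0, 0.0, 0.0)
```

In the engine this is done per gaze sample against the CAVE geometry, and the resulting hit points drive the visualisation.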

Fig.: UE4 Blueprint Plugin ‘TPF Eye Tracking’
Fig.: Content of UE4 Plugin ‘TPF Eye Tracking’
Fig.: Eye Tracking Data Visualisation in UE4 Editor

Integration

To integrate a game into the system landscape of the laboratories, another UE4 plugin was developed for interaction with an Enterprise Service Bus (ESB) or, respectively, an experiment management system. Different messaging interfaces – Transform, RenderProperty, TextRender and MediaPlayer – were declared as JSON schemas (see schemas). The Blueprints process JSON payloads received by a message consumer accessing an incoming channel. Sending payloads is also implemented, using a message provider accessing an outgoing link. Both make use of ZeroMQ sockets. Thanks to the object-oriented implementation it is possible to add further channels or links in the future, e.g. using RESTful technology.
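
The channel/link mechanics can be sketched with pyzmq. The payload fields, the endpoint name and the PUSH/PULL pattern are assumptions for illustration; the real interfaces are defined by the project's JSON schemas:

```python
import json
import zmq

# Hypothetical Transform payload; the actual field names come from the
# project's JSON schemas.
payload = {"object": "Avatar", "location": [120.0, 0.0, 90.0],
           "rotation": [0.0, 90.0, 0.0]}

ctx = zmq.Context()

# Incoming channel: the game's message consumer pulls payloads.
consumer = ctx.socket(zmq.PULL)
consumer.bind("inproc://esb")        # a real setup would use tcp://

# Outgoing link: the service-bus side provides payloads.
provider = ctx.socket(zmq.PUSH)
provider.connect("inproc://esb")

provider.send_string(json.dumps(payload))
received = json.loads(consumer.recv_string())

provider.close()
consumer.close()
ctx.term()
```

On the UE4 side the same roles are filled by the plugin's consumer and provider classes; swapping the transport or pattern only touches the channel/link objects, which is what makes additional RESTful channels feasible.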

Fig.: UE4 C++ and Blueprint Plugin ‘ESB Messaging’
Fig.: Content of UE4 Plugin ‘ESB Messaging’
Fig.: C++ Classes of UE4 Plugin ‘ESB Messaging’

The ZeroMQ asynchronous messaging library is brought to UE4 by a plugin providing a C++ function library that wraps the C++ bindings from zmqcpp.

Fig.: UE4 C++ Plugin ‘ZeroMQ’
Fig.: C++ Classes of UE4 Plugin ‘ZeroMQ’


Using Anaconda Navigator, a Python virtual environment was set up. A Jupyter Notebook app was then programmed using pyzmq and ipywidgets for interactive testing of the messaging interface.

Fig.: Jupyter Notebook App for Interactive Integration Test, Tab Interface Transform
Fig.: Integration Test Protocol for Interface Transform