Distributed Real-time Rendering in CAVE
The aim was to evaluate different game engines with respect to the functionality needed for research applications at the Institute of Sport Science (ISPW).
For the selected game engine, a display cluster for distributed real-time rendering was set up. A game with 3D and 360° video scenes was also implemented.
To integrate the game engine into the recently developed experiment management system, the game engine was extended with a messaging interface.
The applicability criteria were the possibility of distributed operation, including the synchronization of game state and interaction across multiple game engine instances, as well as the fidelity of the graphics and sound rendering and the ease of the content-generation workflow.
On this basis, the game engine Godot was compared with the world's most widely used game engines, Unity3D and Unreal Engine 4. UE4 won the race, not least because of its Blueprint visual scripting system and above all because of its cluster rendering plugin ‘nDisplay’, which offers a Blueprint API, VRPN support and support for nVidia's GPU synchronisation technology.
The CAVE rendering cluster workstations were set up with Windows 10. For system administration, the Chocolatey package manager and the RESTful, WinRM-based Windows Admin Center were put in charge. For remote management, the open-source tool Multi-Remote Next Generation (mRemoteNG) was installed on the launcher workstation. The software supports a variety of protocols; we make use of RDP, VNC and SSH, and have also included external tools such as PowerShell Session, the PuTTY SSH client and the FileZilla FTP client. To support the development of the messaging interface, the network analysis tool Wireshark was also installed.
Each of the seven rendering cluster workstations operates two nVidia Quadro M4000 graphics cards – except for the front and back wall workstations, which carry one GPU each. Each graphics card is dedicated to one of the twelve projectors. In stereo mode, left- and right-eye images, or viewports, are rendered separately; a rendering workstation with two graphics cards therefore manages four viewports. These viewports are rendered in parallel and managed using nVidia Mosaic technology.
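The GPU and viewport arithmetic above can be sketched as follows; the hostnames and the assignment of the single-GPU nodes to front and back wall are illustrative placeholders, not the real machine names:

```python
# Hypothetical hostnames standing in for the seven cluster workstations;
# the value is the number of Quadro M4000 cards per node.
CLUSTER = {
    "cave-left":   2,
    "cave-right":  2,
    "cave-floor1": 2,
    "cave-floor2": 2,
    "cave-floor3": 2,
    "cave-front":  1,  # front and back wall nodes carry one GPU each
    "cave-back":   1,
}

def viewports(gpus: int, stereo: bool = True) -> int:
    """Each GPU drives one projector; stereo rendering doubles the
    viewport count (separate left- and right-eye images)."""
    return gpus * (2 if stereo else 1)

total_projectors = sum(CLUSTER.values())                        # 12
total_viewports = sum(viewports(g) for g in CLUSTER.values())   # 24
```

With two cards a node indeed manages `viewports(2) == 4` viewports, matching the description above.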
Depending on the position and orientation of a projector relative to the wall or floor, the projected image is distorted. To correct this image geometry, the images are warped.
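The geometric correction can be illustrated with the simplest case, a projective warp by a 3×3 homography; this is a minimal sketch in plain Python, assuming a known matrix `H` (a real calibration, e.g. from the Scalable Display tool, would supply a measured per-projector mapping):

```python
def warp_point(H, x, y):
    """Apply a 3x3 homography H (row-major nested lists) to point (x, y)
    in homogeneous coordinates and dehomogenise the result."""
    xw = H[0][0] * x + H[0][1] * y + H[0][2]
    yw = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return xw / w, yw / w

# The identity homography leaves points untouched; a pure translation
# shifts them.
I = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
T = [[1, 0, 5], [0, 1, -2], [0, 0, 1]]
```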
Since the projections overlap, certain regions are illuminated by more than one projector, which results in bright stripes in the overall image. This is compensated by adjusting the luminous intensity and black levels of the images and projectors – the blending.
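The core idea of blending can be sketched as a one-dimensional intensity ramp across the overlap region (coordinates in pixels are illustrative; the real blend masks come from the calibration tool):

```python
def blend_weight(x, overlap_start, overlap_end):
    """Linear blend ramp for one projector: full intensity before the
    overlap region, a linear fall-off inside it, zero beyond it. The
    neighbouring projector ramps up symmetrically, so the summed
    intensity across the overlap stays constant."""
    if x <= overlap_start:
        return 1.0
    if x >= overlap_end:
        return 0.0
    return (overlap_end - x) / (overlap_end - overlap_start)
```

In the middle of a 20-pixel overlap each projector contributes half of the intensity, so the two weights sum to one and no bright stripe appears.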
To manage these complex corrections, we make use of a calibration tool from Scalable Display Technologies. The correction information for each projector is then stored as EasyBlend files, a format that can in turn be used in the UE4 nDisplay configuration.
The combination of each nVidia Quadro graphics card with an nVidia Sync card provides generator locking (genlock), synchronised framebuffer swaps (framelock) and vertical image synchronisation with the corresponding projectors (vsync) to avoid tearing. We use an overall refresh rate of 60 Hz.
The system time of the cluster workstations is synchronised in the nanosecond range through the Precision Time Protocol (PTP). We use an Ubuntu Linux server running PTPd as a Stratum 2 time server; the cluster workstations run w32tm as Stratum 3 clients.
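The offset a PTP client computes from the standard delay request-response exchange can be written out as a short worked example (the timestamps below are made-up numbers chosen so the arithmetic is easy to follow):

```python
def ptp_offset_and_delay(t1, t2, t3, t4):
    """PTP delay request-response mechanism, assuming a symmetric path:
    t1 = Sync sent by the master, t2 = Sync received by the slave,
    t3 = Delay_Req sent by the slave, t4 = Delay_Req received by the master.
    Returns (clock offset of the slave, one-way path delay)."""
    offset = ((t2 - t1) - (t4 - t3)) / 2
    delay  = ((t2 - t1) + (t4 - t3)) / 2
    return offset, delay

# Example: the slave clock runs 5 units ahead and the path delay is 3,
# so t2 = 100 + 3 + 5 = 108 and t4 = 200 + 3 - 5 = 198.
offset, delay = ptp_offset_and_delay(100, 108, 200, 198)
```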
With UE4, a project called Distributed Game Engine (DGE) was created, and the edited assets were packed into a plugin of the same name. The plugin contains, e.g., a model of the CAVE created with Blender, as well as function libraries, a DGE messaging manager and all objects that can be accessed through the ESB messaging interface (see below).
To demonstrate the 3D capabilities of UE4 in the render cluster, a 3D level was created showing static, rotating and moving objects with physically based materials, as well as animated avatars. Newly created sports equipment models were packed into a plugin.
To enable immersive video playback, a UE4 Blueprint plugin was developed. In Blender, a sphere with inverted normals was created, which is used in UE4 as the screen for spherical 360° video playback. By default, UE4 embeds the Windows Media Foundation (WMF) based media player for file-based video playback, but visualisation of streamed video is also supported, e.g., using a VLC plugin.
By default, UE4 nDisplay is able to process motion capture data from, e.g., OptiTrack over VRPN. OptiTrack itself also provides a UE4 plugin for processing full-body motion capture data. We ultimately process MoCap position data streamed by the Laboratory Management System.
Eye tracking data generated from, e.g., PupilLabs devices is likewise streamed by the Laboratory Management System and processed in UE4 using the ESB messaging plugin (see Integration). We developed a UE4 Blueprint plugin that provides content for eye tracking data visualisation and processing, e.g., ray tracing and hit calculation. PupilLabs calibration shapes and a set of AprilTag textures are also included.
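The hit calculation for a gaze ray boils down to a ray-plane intersection (the gaze ray against, e.g., a CAVE wall). A minimal sketch in plain Python, with made-up coordinates; the actual implementation lives in the Blueprint plugin:

```python
def ray_plane_hit(origin, direction, plane_point, plane_normal):
    """Intersect a ray (gaze origin + direction) with a plane given by a
    point on the plane and its normal. Returns the hit point, or None if
    the ray is parallel to the plane or points away from it."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    denom = dot(direction, plane_normal)
    if abs(denom) < 1e-9:           # ray parallel to the plane
        return None
    t = dot([p - o for p, o in zip(plane_point, origin)], plane_normal) / denom
    if t < 0:                       # plane lies behind the gaze origin
        return None
    return tuple(o + t * d for o, d in zip(origin, direction))

# A gaze ray straight ahead hitting a wall 2 m in front of the viewer:
hit = ray_plane_hit((0, 0, 0), (0, 0, 1), (0, 0, 2), (0, 0, 1))
```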
To integrate a game into the system landscape of the laboratories, another UE4 plugin was developed for interaction with an Enterprise Service Bus (ESB) or, respectively, the experiment management system. Different messaging interfaces – Transform, RenderProperty, TextRender and MediaPlayer – were declared as JSON schemas (see schemas). The Blueprints process JSON payloads received by a message consumer accessing an incoming channel; sending payloads is implemented using a message provider accessing an outgoing link. Both make use of ZeroMQ sockets. Thanks to the object-oriented implementation, additional channels or links can be realised in the future, e.g., using RESTful technology.
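A JSON payload for one of these interfaces might be built as below. The field names are illustrative assumptions; the authoritative definitions are the project's JSON schema files:

```python
import json

def transform_payload(obj, location, rotation, scale):
    """Build a Transform-style JSON message. The field names used here
    are hypothetical placeholders for the fields declared in the
    project's Transform JSON schema."""
    return json.dumps({
        "type": "Transform",
        "object": obj,
        "location": location,   # e.g. UE4 world coordinates
        "rotation": rotation,   # e.g. Euler angles in degrees
        "scale": scale,
    })

msg = transform_payload("Avatar01", [0, 100, 0], [0, 0, 90], [1, 1, 1])
```

Such a string is what the message provider would hand to a ZeroMQ socket, and what the message consumer parses on the incoming channel.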
The ZeroMQ asynchronous messaging library is brought to UE4 by a plugin that provides a C++ function library wrapping the C++ bindings from zmqcpp.
Using Anaconda Navigator, a Python virtual environment was set up. A Jupyter notebook app was then programmed using pyzmq and pyWidgets for interactive testing of the messaging interface.
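The core of such a pyzmq test can be sketched as follows; the `inproc` endpoint and the PUSH/PULL pattern are stand-ins so the example is self-contained, whereas the notebook talks to the actual sockets of the DGE messaging interface over the network:

```python
import zmq

# Stand-in for the game's incoming channel: a local PUSH/PULL pair.
ctx = zmq.Context.instance()
producer = ctx.socket(zmq.PUSH)
consumer = ctx.socket(zmq.PULL)
producer.bind("inproc://dge-test")      # hypothetical endpoint
consumer.connect("inproc://dge-test")

# Send a JSON payload as the notebook would, then read it back.
producer.send_string('{"type": "Transform", "object": "Avatar01"}')
received = consumer.recv_string()

producer.close()
consumer.close()
```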