Unreal nDisplay explanation

What is nDisplay?


nDisplay technology extends Unreal Engine by distributing the rendering of a camera view over an arbitrary number of machines and then displaying the rendered images on an arbitrary number of display mechanisms.




nDisplay does the following:

• Synchronizes actors across Unreal Engine instances (cameras, animations, particles, etc.)

• Acts as a listener for keys, axes, and positioning

• Acts as a VRPN server for tracking devices such as ART and Vicon

• Supports DirectX 11 and DirectX 12 graphics libraries in both mono and stereoscopic modes (frame-sequential quad buffer, side-by-side, top/bottom)

• Supports NVIDIA Quadro Sync synchronization for frame consistency, flexibility, and scalability

• Provides asymmetric frustum configuration for stereoscopic systems (a sketch of how such a frustum is derived follows this list)

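The asymmetric (off-axis) frustum support matters because in a CAVE or powerwall setup the viewer's eye is rarely centered on each physical screen. The following sketch is not nDisplay code; it is a small, self-contained C++ illustration (all names hypothetical) of how off-axis frustum bounds can be derived from a screen's extents and a tracked eye position, which is conceptually what a projection policy computes for each viewport.

    #include <cstdio>

    // A physical screen in its own local space: centered at the origin,
    // lying in the XY plane, with the viewer somewhere in front of it.
    struct Screen  { double HalfWidth; double HalfHeight; };
    struct Eye     { double X; double Y; double Z; };   // Z = distance from eye to the screen plane
    struct Frustum { double Left, Right, Bottom, Top, Near; };

    // Project the screen edges toward the eye to get frustum bounds at the
    // near plane. The frustum is only symmetric when the eye sits exactly
    // on the screen's central axis; any offset makes it asymmetric.
    Frustum OffAxisFrustum(const Screen& S, const Eye& E, double Near)
    {
        const double Scale = Near / E.Z;              // similar triangles: near plane vs. screen plane
        return {
            (-S.HalfWidth  - E.X) * Scale,            // left
            ( S.HalfWidth  - E.X) * Scale,            // right
            (-S.HalfHeight - E.Y) * Scale,            // bottom
            ( S.HalfHeight - E.Y) * Scale,            // top
            Near
        };
    }

    int main()
    {
        Screen Wall{ 1.5, 1.0 };                      // a 3 m x 2 m projection wall
        Eye    LeftEye{ -0.032, 0.1, 2.0 };           // tracked eye, slightly off-center
        Frustum F = OffAxisFrustum(Wall, LeftEye, 0.1);
        std::printf("l=%.4f r=%.4f b=%.4f t=%.4f\n", F.Left, F.Right, F.Bottom, F.Top);
        return 0;
    }

In a head-tracked stereoscopic system this calculation is redone every frame for each eye and each screen, which is why the frustums need to be configurable rather than fixed.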

nDisplay setup


The nDisplay toolset consists of a plugin and a set of configuration files and applications for Unreal Engine. It includes the following components:

  • nDisplay plugin - Used at runtime to provide the network interface between instances, optional actor replication, and the input management system; it also configures the rendering subsystem to match each node's place in the display topology

  • nDisplay configuration file - Describes the topology of the display system and acts as the central location for the project's cluster settings (a sample configuration is sketched after this list)

  • nDisplayLauncher and nDisplayListener - Applications for launching and controlling “n” instances of Unreal Engine across different computers on a network, each connected to one or many displays
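
The configuration file itself is plain text describing the cluster. The snippet below is only a sketch of what a minimal two-node setup can look like, based on the legacy text-based .cfg format; all identifiers are placeholders and the exact parameter names vary between engine versions, so treat it as illustrative and check the nDisplay documentation for the version you are running.

    [cluster_node] id=node_1 addr=192.168.0.101 window=wnd_1 master=true
    [cluster_node] id=node_2 addr=192.168.0.102 window=wnd_2
    [window]       id=wnd_1  viewports=vp_1 fullscreen=true
    [window]       id=wnd_2  viewports=vp_2 fullscreen=true
    [viewport]     id=vp_1   x=0 y=0 width=1920 height=1080 projection=proj_1
    [viewport]     id=vp_2   x=0 y=0 width=1920 height=1080 projection=proj_2
    [projection]   id=proj_1 type=simple screen=scr_1
    [projection]   id=proj_2 type=simple screen=scr_2
    [screen]       id=scr_1  loc="X=1.5,Y=-0.5,Z=1" rot="P=0,Y=0,R=0" size="X=1.0,Y=0.56"
    [screen]       id=scr_2  loc="X=1.5,Y=0.5,Z=1"  rot="P=0,Y=0,R=0" size="X=1.0,Y=0.56"
    [camera]       id=camera_static loc="X=0,Y=0,Z=1.7"

Each machine then runs an instance of the project pointed at this file, either through nDisplayListener/nDisplayLauncher or by passing command-line switches that name the configuration file and the node to run (in older releases these were arguments along the lines of -dc_cfg and -dc_node, but the exact switches also depend on the engine version).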

Feature synchronization over the network


Post-process VFX


The following is a list of post-process VFX that should be disabled or used with caution, as they pose a risk of tearing or other continuity issues:

  • Bloom

  • Lens flare

  • Automatic eye adaptation

  • Motion blur

  • Ambient occlusion

  • Anti-aliasing (although very subtle, some techniques might show a difference)

  • Screen-space reflections

  • Vignetting

  • Chromatic aberration

When deciding whether to include screen-space effects in a distributed display, weigh the importance of the effect to the user experience against the extra care required to make it work across nodes, and against the risk that it won't work properly and ends up detracting from the experience rather than enhancing it.
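
One practical way to rule these effects out project-wide, rather than relying on every level's post-process volumes, is to force the corresponding console variables off, for example from the [SystemSettings] section of DefaultEngine.ini. The values below are a hedged starting point, not a definitive list; cvar names and accepted ranges can change between engine versions, so verify each one in your engine's console before relying on it.

    [SystemSettings]
    ; bloom and lens flares
    r.BloomQuality=0
    r.LensFlareQuality=0
    ; automatic eye adaptation
    r.EyeAdaptationQuality=0
    ; motion blur
    r.MotionBlurQuality=0
    ; screen-space ambient occlusion and screen-space reflections
    r.AmbientOcclusionLevels=0
    r.SSR.Quality=0
    ; chromatic aberration (scene color fringe)
    r.SceneColorFringeQuality=0

Anti-aliasing and vignetting are usually a judgment call: anti-aliasing differences between nodes tend to be subtle, and vignetting can simply be turned down in the post-process settings if it produces visible seams.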

nDisplay limitations

Here are some known limitations of nDisplay:

  • nDisplay currently runs on Windows 7 and higher (DirectX 11 and DirectX 12).

  • The quad-buffer (active) stereo feature is supported only on Windows 8.1 and higher.

  • OpenGL support is deprecated.

  • nDisplay does not currently have Linux support.

  • Support for framelock and genlock is available only for professional-grade graphics cards such as NVIDIA Quadro.

  • Auto exposure, bloom, planar reflections, and some shader effects are not supported by nDisplay. You can still use them, but you will likely see visual artifacts or discontinuities in your displayed content.

  • 2D UMG Interface features (Hover, Click, and Viewport Alignment) are not supported. Alternative methods through OSC, REST, or other available remote control protocols can be used to control an nDisplay system.

Note that nDisplay scales the GPU side of rendering, which is the main bottleneck for real-time content, but it doesn’t scale CPU-based operations like gameplay and physics. All cluster PCs will still have to process all these CPU-based operations individually. In other words, because nDisplay is strictly a rendering distribution system, it won’t provide any acceleration from a CPU standpoint.

nDisplay workflow

The following diagrams show how nDisplay works with a network and display devices.

  • Network setup for a projective display

  • Network setup for an LED display

  • Cluster node relationships

Projection policy


A projection policy is an abstraction that specifies where to send a projection’s input data and how to compute the output. This means that each policy might have its own properties that it knows how to interpret and utilize. nDisplay supports several policies. The following are the most commonly used:

  • Simple - Standard policy used to render on regular displays.

  • EasyBlend - Integration of EasyBlend calibration data by Scalable SDK, enabling warp/blend/keystoning features. Required to display on non-planar and complex display surfaces, such as curved or dome-shaped surfaces driven by multiple projectors.

  • MPCDI - Integration of the MPCDI standard, used for complex projects relying on this industry protocol.
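
In the legacy text-based configuration, the policy is selected per projection entry, and each policy reads its own parameters. The lines below are only a rough sketch of what that selection looks like; the parameter names for EasyBlend and MPCDI (calibration file paths, buffer and region identifiers, and so on) depend on the engine version and should be taken from the nDisplay documentation rather than from this example.

    ; Simple policy: renders through a rectangular "screen" placed in the virtual scene
    [projection] id=proj_simple type=simple screen=scr_1

    ; EasyBlend policy: warp/blend driven by a Scalable SDK calibration file
    [projection] id=proj_easyblend type=easyblend file="<path to EasyBlend calibration data>"

    ; MPCDI policy: geometry and blending described by an MPCDI package
    [projection] id=proj_mpcdi type=mpcdi file="<path to MPCDI package>"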