Mo-Sys VP Pro XR is a pre-configured, multi-node system, combining Unreal Engine's native nDisplay with unique Mo-Sys features such as Cinematic XR Focus and XR compositing tools.
The system comprises:
To speed up the installation, it is recommended to take the following steps:
The system relies on the Unreal nDisplay plugin to render content on an LED wall. The setup is therefore divided into two parts that must be completed in the following order:
When shipped, the system is already pre-configured as follows:
The diagram below represents a setup with three Render Nodes (RN) corresponding to the LED processors.
All the computers in the system are involved in presenting a believable and consistent graphics output, whether it is a Render Node rendering to the LED or the XR Engine compositing the camera image with a set extension.
That means the Unreal Project has to be maintained in the same state on all machines.
It is common in In-Camera VFX to use an extra computer, called the Editor, which enables an operator or artist to modify the environment. Those changes also need to be applied to the XR Engine.
The easiest way to do that is to map physical drives from the Render Nodes and the XR Engine on the Editor. Any modifications made on the Editor can then be distributed to the other computers via the network.
It is highly recommended to define a common directory on all computers. If possible, we recommend using a separate partition and changing the drive letter to M. The usual path to the folder storing all available projects would then be: M:\Share.
See Project Distribution to learn how to use the mapped drives in project updating.
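As a rough illustration of the drive mapping described above, the sketch below builds the Windows `net use` commands that would map each machine's shared folder to a local drive letter on the Editor. The hostnames, share name, and starting drive letter are illustrative assumptions, not the system's actual configuration.

```python
# Sketch: build `net use` commands mapping each machine's share to a local
# drive letter on the Editor. Hostnames and share name are hypothetical.

def map_drive_commands(hosts, share="Share", first_letter="N"):
    """Return one `net use` command per host, assigning consecutive letters."""
    commands = []
    letter = ord(first_letter)
    for host in hosts:
        commands.append(f"net use {chr(letter)}: \\\\{host}\\{share} /persistent:yes")
        letter += 1
    return commands

if __name__ == "__main__":
    # Hypothetical render node and XR Engine hostnames:
    for cmd in map_drive_commands(["RN1", "RN2", "XR-ENGINE"]):
        print(cmd)
```

Running the commands (in an elevated prompt, once per machine) gives the Editor persistent mapped drives it can copy project updates to.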
The project should be stored in the same local directory on all PCs. Make sure the directory is available on all computers. If possible, we recommend using a separate partition and changing the drive letter to M. The usual path to the shared folder is then: M:\Share. You can change the drive letter in Disk Management.
Right-click the selected folder and click Properties.
There are two ways of connecting to other PCs' content:
Both are valid for the purpose of maintaining the nDisplay and XR projects in sync.
You can use the Mo-Sys File Sync tab to map all listed devices at once. Refer to Switchboard section in VP Pro Manual for the steps.
The mapped drive infrastructure can be used for a shared Derived Data Cache (DDC) - the files that Unreal Engine generates when compiling shaders. It is possible to assign a location on one of the computers (the Editor is recommended) and share that location on the local network.
3. Select the location for DDC in Editor Preferences - Project Local DDC Path. This will affect all the projects on that PC!
The PC that stores the DDC must be ON to take advantage of the setup.
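Because the DDC host must be on for the cache to work, a simple pre-flight check can save confusion before launching Unreal. The sketch below is an illustrative helper; the UNC path is a hypothetical share name, not a path defined by the system.

```python
# Sketch: verify the shared DDC location is reachable before starting Unreal.
# The UNC path is a hypothetical example; substitute your actual DDC share.
import os

def ddc_available(path):
    """Return True if the shared DDC folder can be reached (host is on)."""
    return os.path.isdir(path)

if __name__ == "__main__":
    ddc_path = r"\\EDITOR-PC\DDC"   # hypothetical share on the Editor PC
    if not ddc_available(ddc_path):
        print(f"Warning: shared DDC {ddc_path} is unreachable - "
              "is the host PC switched on?")
```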
nDisplay is often used as a shorter name for the in-camera LED VFX workflow. It is actually a plugin in Unreal Engine that enables rendering the correct perspective on LED screens.
Inner Frustum - the area of the LED screen that is visible through the camera's field of view. It is driven by tracking, so positional, rotational and lens changes are reflected inside it.
Outer Frustum - the area outside the inner frustum; it is usually static and is used as a source of realistic lighting.
LED processor - the hardware and software that combines multiple cabinets (sets of LED panels) into an array that displays a single image. Multiple LED processors may drive one LED wall. The image rendered on the render nodes is the input to the LED processors.
Primary Render Node (PRN) - the computer that renders the image shown on the LED volume, or part of it in bigger setups. The Primary Render Node is the computer that should receive tracking from the StarTracker.
Render Node (RN) - a workstation that launches the Render Node(s) to put the image on the LED screen/volume and allows for editing.
Switchboard - a Python application for controlling multiple render nodes; it relies on the companion application SwitchboardListener to communicate with them. Launch the render on the LED from here.
Multi-User - a UE plugin allowing live level editing while content is displayed.
Cluster - set of render nodes.
For the full procedure of setting up a StarTracker system, please refer to the StarTracker Manual.
Additionally, Mo-Sys has a collection of helpful videos outlining the StarTracker installation process.
Those videos can be found here: https://vimeo.com/showcase/8529819 (access password: St@rTr4cker).
This section of the manual discusses best practices of setting up the StarTracker system in an LED environment, as well as nDisplay-specific settings and setup options on the StarTracker system.
It is a crucial task to find the transform (position, rotation and scale) of the LED screen(s) in relation to the tracking coordinate system, as it is used for rendering the correct perspective on the screens. When working with set extension this is even more pronounced, as the transition between the graphics visible through the camera and those extended by the XR engine must be seamless.
The layout of the screen geometry will be used both on the XR Engine, to mask the LEDs, and on the rendering nodes, to define the nDisplay configuration.
When setting up a StarTracker system in the LED environment, it is important to maintain a known geometric relationship between the StarTracker coordinate system and the UE5/nDisplay coordinate system. This manual will outline some common Autoalign patterns below - those patterns will help preserve a solid geometric relationship between real world coordinates, the StarTracker coordinate system, as well as the UE5 coordinate system.
With regard to lens calibration, pure nDisplay / In-Camera VFX is one of the least challenging environments for tracking and lens calibration accuracy. Since the background plate gets distorted in camera, and the field of view can be easily modified in the nDisplay config, the main lens parameter to get right is the focal distance.
For all diagrams in this section, coordinates are represented as (X, Y, Z) in the StarTracker coordinate system.
Aligning to a flat plane of the LED wall
When working with a flat rectangular LED surface, one of the possible AutoAlign patterns is aligning to the LED wall plane itself. In this arrangement, the corners of the LED wall serve as markers / AutoAlign points. In the example below these are the top-left, bottom-left and bottom-right corners. Set the height to correspond to the studio floor.
Aligning on the floor, relative to flat LED wall
In some scenarios, such as when corners of the flat LED wall are obscured by the real stage floor, another possible AutoAlign pattern is to assign Auto-align points on the floor, but strictly relative (perpendicular and parallel) to the LED plane. This way there will be a known geometric translation offset between the StarTracker origin and the UE5 / nDisplay coordinate origin.
Aligning to a curved LED volume
Start by drawing a line between the bottom corners and mark its mid-point (0 point). Mark the other two Auto-align points perpendicular to the drawn line.
An example of a curved LED wall Auto-align
If there is a height offset (the LED is elevated from the floor), drop the points from the corners of the wall to the floor.
Data should be sent to three computers: 1. The Editor PC (8001). 2. An extra PC. 3. To PRN (8001) > Live Link Preset 2.
Tuning > XYHPTR
Set to the minimum possible value, e.g. 0.
Tuning > LED Wall
If set to a negative value, it will predict the camera pose [in fields]; UDP output has to be set to LED On.
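To put the prediction value in perspective, the sketch below converts a number of fields into milliseconds for a given frame rate, assuming interlaced-style timing with two fields per frame. This is a back-of-the-envelope helper, not part of the StarTracker configuration.

```python
# Sketch: convert a prediction expressed in fields to milliseconds.
# Assumes two fields per frame (interlaced-style timing).

def fields_to_ms(fields, frame_rate):
    field_duration_ms = 1000.0 / (frame_rate * 2)  # two fields per frame
    return fields * field_duration_ms

# e.g. at 25 fps a field lasts 20 ms, so a 3-field prediction spans 60 ms
```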
Both XR and Render nodes should adopt a specific level structure for easier synchronisation and maintenance of the project. It is recommended to use a separate, persistent level for Mo-Sys actors and the nDisplay configuration. This level can be attached to a master level or vice versa.
Key points to take into account:
The following section only describes the extra setup on top of a project already prepared to render to the LED wall (nDisplay - ICVFX). For more information on how to set it up from scratch, see Render Node Setup.
This is a replacement for the CineCameraActor used in the Unreal documentation. It adds two components: MoSysLiveLink and MoSysLensNDisplay. Create and specify the Live Link preset in Project Settings > Live Link > DefaultLiveLinkPreset.
Select the Live Link subject on MoSysLiveLink component.
MoSysLensNDisplay component brings the following lens parameters:
If the lens file is not available on the StarTracker, or the data is not needed for a shoot, you can set the specific parameter to manual. This gives control to the CineCameraComponent settings.
Finally, if editing and running nDisplay on a single PC, change the port (image below) on MoSys Live Link to free the port for the nDisplay-launched application (don't save the preset after the change).
For nDisplay cluster to work it is essential to send tracking to nDisplay master computer!
You can optionally designate another StarTracker IP endpoint to send data to the editor's port, so you can see the camera movement there.
All the rendering nodes as well as the XR Engine must be connected in a fast local network, preferably dedicated to them and the StarTracker.
All the devices should have a static IP address for easy maintenance. Prepare a list of the addresses, as it will be necessary for software setup. Install Tight VNC for remote access and set the project up in Perforce.
Daisy-chain the system by feeding an external sync source into the master Sync Card via a BNC cable (genlock) and distributing the signal down the chain using RJ45 cables. Ideally, each render node should have only one display connected to it (the LED processor). If more than one is necessary, configure them via Nvidia Mosaic.
Synchronisation of the LED screen output is key to a seamless shoot. It is crucial that all sections of the volume render at the same time, so there is no tearing between sections.
This is achieved by using an Nvidia Quadro Sync Card II. Every Render node has one of them connected to the GPU to guarantee synchronous rendering.
Refer to Frame Lock Configuration in Nvidia's User Guide for more information.
Set a specific configuration on the Nvidia drivers via an Nvidia driver configuration utility, such as ConfigureDriver.exe - this can be obtained here. Download and run ConfigureDriver.exe as administrator through the Windows command prompt. When it runs, type 11 and press Enter. This will enable the prePresentWait setting and improve performance without compromising sync.
In nDisplay configuration actor select Nvidia (2) in Cluster > Render Sync Policy > Type.
The Node > Window size should cover the entire desktop resolution. However, the viewport should only cover the resolution required by the LEDs.
In order for nDisplay to lock to the synchronisation from Nvidia sync it must run as a fullscreen foreground Windows application. No other application can sit on top of the nDisplay instance while it is running.
To manage this ensure that:
You can verify that it is running in the correct mode by enabling option 8 in configureDriver.exe (Enable the SwapGroupPresentIndicator for Direct x).
Once you have everything set up in the Nvidia control panel with regard to sync, resolutions, colour space, etc., it is useful to export the current EDID and then load it from a file. You can find the instructions here. Alternatively, you can set your EDID through your switcher, such as a Lightware matrix.
It is important to note that incorrectly configured EDIDs can halve the performance of nDisplay when using sync policy 2.
To avoid this, ensure you have an EDID which allows you to select a PC resolution at the frequency you wish to shoot at (e.g. 24 Hz) and is marked as native. Another solution is to create a custom resolution based on the standard 3840 x 2160 60 Hz PC resolution and then set it to the appropriate frequency (e.g. 24 Hz, 25 Hz, 30 Hz, etc.).
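The mode-selection logic above can be sketched as a small filter over an EDID mode list: match the resolution and shooting frequency, and prefer modes flagged as native. The mode entries below are illustrative, not read from a real EDID.

```python
# Sketch: pick an EDID mode matching the shooting frequency, preferring
# modes flagged as native. Mode entries here are illustrative examples.

def pick_mode(modes, width, height, hz):
    """Return the best matching mode dict, or None if nothing matches."""
    matches = [m for m in modes
               if m["w"] == width and m["h"] == height and m["hz"] == hz]
    matches.sort(key=lambda m: not m["native"])  # native modes first
    return matches[0] if matches else None

modes = [
    {"w": 3840, "h": 2160, "hz": 60, "native": True},
    {"w": 3840, "h": 2160, "hz": 24, "native": True},
    {"w": 3840, "h": 2160, "hz": 24, "native": False},
]
```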
Use a bouncing ball test to validate the synchronisation. Place the ball on the edge between two parts of the LED wall, so it is visible on two separate segments of the wall. The ball will bounce up and down. If the ball is consistent, the nDisplay setup is synced. If you see tearing of the ball, the synchronisation is failing.
The ball blueprint can be found under MoSysVPPro Content > nDisplay > Blueprints: BP_BouncingBall.uasset.
Additionally, the Switchboard shows Nvidia driver and sync status of the nDisplay nodes. For a successful sync, the PresentMode column should indicate that each node is in Hardware Composed: Independent Flip. If it states Composed: Flip then you will want to check nothing is overlapping nDisplay as a full screen application on the nodes.
Prepare the nDisplay level so the ball is positioned in between the nDisplay screens. After launching in Switchboard, the ball will bounce quickly up and down.
Start with the Primary Render Node and set up the Live Link connection with StarTracker.
Open the nDisplayLevel level in MoSysVPPro's Content > nDisplay > Maps.
We will be following the In-Camera VFX Template.
The setup requires two objects to render a correct perspective:
The camera has to be specified in the ICVFXCamera component in nDisplayConfig.
Default View Point is the origin of projection for all areas of the LED that are not visible in-camera. It is not tracked; its purpose is to give correct reflections on the talent (actors), so it should be placed in front of the LED, approximately where the actors will be.
It is recommended that the project is divided into sublevels, where at least one is for the content and one for the camera with an nDisplayConfig. That way, it is easy to transfer the setup to different content levels and projects. Remember: sublevels must always be loaded, otherwise they will not be visible after launching.
Depending on the number and shape of the LED volumes the virtual layout needs to be adjusted.
The top part of the nDisplay configuration actor defines the LED screen transform (where it is in space) and the bottom part describes the pixel mapping in screen space. In other words, the image should be rendered in relation to the top-left corner of the screen.
Edit nDisplayConfig to define the viewports.
At the top of the nDisplayConfig, bring the meshes representing the screen/volume to 1:1 scale. For rectangular screens it is easy to use nDisplayScreen or a plane mesh; however, more complex shapes such as a curve need to be modelled externally.
Refer to Step 2 - Create LED Panel Geometry in In-Camera VFX Quick Start guide for building a complex mesh.
Every viewport responsible for rendering to a part of the LED volume is represented as a separate mesh in Unreal. The 3D model (mesh) should be split into the respective parts.
Configure the resolution of the screen(s) at the bottom of nDisplayConfig.
Change the resolution of the viewport(s) to reflect the resolution of the screen or a segment of the volume driven by a Render Node. Usually one Node is responsible for one viewport.
Node (window) has to be big enough to fit viewport(s):
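The fit requirement above amounts to a rectangle-containment check: each viewport's offset plus size must stay within its node's window. The sketch below illustrates this with nDisplay-style (x, y, width, height) rectangles; the sample numbers are illustrative.

```python
# Sketch: check that a viewport rectangle fits inside its node's window.
# Rectangles are (x, y, width, height) with (0, 0) at the top-left.

def viewport_fits(window, viewport):
    wx, wy, ww, wh = window
    vx, vy, vw, vh = viewport
    return (vx >= wx and vy >= wy and
            vx + vw <= wx + ww and vy + vh <= wy + wh)

# A 3840x2160 node window can hold a 3840x1080 viewport at the top,
# but a 1920x1080 window cannot hold a 3840-wide viewport.
```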
Refer to the next page images and, optionally, to Step 3.
Every Render Node requires adding an extra cluster node and assigning the IP address to it on the Node component.
Select the sync policy. Nvidia (2) if using Quadro Sync Card II.
See Step 4 for launching Your Project with nDisplay in In-Camera VFX Quick Start to launch the project on the LED screens.
Use Switchboard plugin to launch:
After you launch from the nDisplay launcher, it should appear on the primary screen.
The Project must be copied to all of the Rendering Nodes to the same directory as the Primary Render Node. All the rendering nodes should store the same version of the project.
Various workflows can be used here:
It is recommended to use Mo-Sys File Sync panel in Switchboard. Refer to VP Pro Manual, Switchboard section (14.1) on how to install Mo-Sys Switchboard panels.
It is also possible to use version control software such as Perforce. The advantage is that the user can roll back to previous revisions.
Alternatively, especially for testing, a user can use a project that is located in the Windows shared folder, visible to all rendering nodes. That way, there is no need to distribute the project as it is loaded on start from the shared folder. This is not recommended for production, as the stability cannot be guaranteed and the start time is usually longer. This approach requires the user to assign a path to the network location in the Switchboard settings.
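The distribution workflows above all reduce to keeping file trees identical across machines. As a rough illustration only (Mo-Sys File Sync or Perforce are the supported routes), the sketch below performs a one-way sync from a source project folder to a mapped destination, copying files that are missing or newer; the paths are hypothetical.

```python
# Sketch: one-way sync of project files to a mapped render-node drive,
# copying only files that are missing or newer at the destination.
# Illustrative only - use Mo-Sys File Sync or Perforce in production.
import os
import shutil

def sync_project(src_root, dst_root):
    """Copy files from src_root to dst_root when missing or older there."""
    copied = []
    for dirpath, _dirs, files in os.walk(src_root):
        rel = os.path.relpath(dirpath, src_root)
        target_dir = os.path.join(dst_root, rel)
        os.makedirs(target_dir, exist_ok=True)
        for name in files:
            src = os.path.join(dirpath, name)
            dst = os.path.join(target_dir, name)
            if (not os.path.exists(dst) or
                    os.path.getmtime(src) > os.path.getmtime(dst)):
                shutil.copy2(src, dst)  # copy2 preserves timestamps
                copied.append(dst)
    return copied
```

A second run over an unchanged tree copies nothing, since `copy2` preserves modification times.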
Cinematic XR Focus provides seamless focus control on LED stages. The system takes advantage of the existing geometry defined for nDisplay and the camera pose given by the StarTracker, allowing focus pullers to work naturally regardless of whether an object is real or virtual. It is currently compatible with Preston and Teradek RT FIZ systems.
Depending on the FIZ system the required hardware is either 1. or 2.
Preston FIZ system with:
MDR3 needs a firmware update to version 1.130 or higher.
Teradek RT with:
Custom firmware for both MDR and CTRL units is required and provided (Teradek_RT_firmware_v1.2.15_MoSys).
Connect the serial cable to the Preston MDR 4-pin Lemo serial, or the 5-pin Lemo to Teradek MDR serial 1. Plug the other end into a USB port on the Primary Render Node.
Configure the serial port in Device Manager and take note of the COM port number.
Setting up the FIZ system is out of scope for this manual. Please refer to Preston's or Teradek's documentation for more information. It can be found here: https://prestoncinema.com/#!/downloads/firmware https://teradek.com/pages/downloads#teradekrt
After powering up, the system will try to find the end stops of the lens. This is part of the calibration.
Make sure to map focus to the lens reading distances. This can be done in both imperial and metric units.
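When comparing focus marks across the two unit systems, a quick conversion helps. The sketch below converts imperial lens readings (feet and inches) to metres; it is a generic arithmetic helper, not part of the FIZ tooling.

```python
# Sketch: convert imperial lens distance readings to metres for comparison
# with metric focus marks. 1 inch = 0.0254 m exactly.

def feet_inches_to_m(feet, inches=0):
    return (feet * 12 + inches) * 0.0254

# e.g. a 10 ft focus mark corresponds to 3.048 m
```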
Open the Unreal project on Primary Render Node.
Verify a collision profile is set up in Project Settings > Collision > Object Channels. There should be a custom Object Channel called LED. Create one if it is missing.
Add a new component on MoSysCameraNDisplay (used for ICVFX) called CinematicXRFocus.
Configure the component:
In nDisplay configuration blueprint, set the collision presets in collision section on the mesh(es) used with viewports as follows:
Collision Presets set to Custom and below Object Type is set to LED.
Finally, set Manual Focus to true on the MoSysLensNDisplay component to enable the hand control to set the focus distance.
A sample level nDisplayLensControl with the component already on the camera can be found in: MoSysVPPro Content > nDisplay > LensControl.
The lens map created on the hand controller needs to be exported to the PC and then loaded onto the MDR for the system to work correctly. The lens map must be the same on both the hand controller and the MDR at all times.
Teradek RT Manager must be installed on the Primary Render Node to manage the file exchange (https://teradek.com/pages/downloads#teradekrt).
The XR Engine uses the camera feed as input, composites it with AR objects, and virtually extends the set when the camera points away from the LED screen. This means the talent is only visible if the screen is behind them.
Currently we support both internal and external compositing using a mixer/keyer.
In Project Settings > Mo-Sys VP Pro:
- Define the image framerate and resolution.
- Uncheck Has Timecode Provider under Sync.
Setup one of the StarTracker's data output to send tracking data to XR Engine's IP address.
Adapt the existing Live Link preset to enable receiving tracking on XR engine for MoSysCamera(s).
Change the IP address to correspond to the machine that is supposed to receive data. The setting is in: Mo-Sys UDP source > Tracking > Network interface IP Address.
If XR PC has IP: 10.11.10.100, set it there and save as a separate Live Link preset. Select as the default Live Link Preset:
Apply Live Link subject only to MoSysCamera(s), as the nDisplay camera(s) should already have its own preset and subject assigned.
Be careful of synchronising the configuration folder as the Render Nodes must store a different default Live Link preset! The default preset is stored in DefaultGame.ini.
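When debugging the tracking path above, it can help to confirm that StarTracker datagrams are actually reaching the XR Engine's network interface before involving Live Link at all. The sketch below just binds a UDP socket and waits for one packet; the port number follows the StarTracker outputs mentioned earlier in this manual, and the payload is not decoded (the packet format is Mo-Sys proprietary).

```python
# Sketch: confirm tracking datagrams reach this machine. The payload is not
# decoded - this only verifies packets arrive on the expected UDP port.
import socket

def receive_one(port, timeout=5.0):
    """Block until one UDP datagram arrives on `port`; return its bytes."""
    with socket.socket(socket.AF_INET, socket.SOCK_DGRAM) as sock:
        sock.bind(("", port))
        sock.settimeout(timeout)
        data, addr = sock.recvfrom(2048)
        print(f"received {len(data)} bytes from {addr[0]}")
        return data
```

If this times out, check the StarTracker output IP address and any firewall rules before looking at the Unreal side.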
Set up the project to use pre-keyer mode (refer to VPPro Manual - 4.9 Pre Keyer Mode).
Use the pre-keyer matte to mask the LED wall. The mesh for the wall must be placed at the same position and rotation as in the nDisplayConfig. This gives the matte that can be used to composite the video with the CG fill.
Pre-keyer matte has a custom stencil of 247.
Any object that is supposed to be in front of the LED wall (augmented - AR) needs to have custom stencil set to 250.
Pre-keyer mode provides 2 video outputs. One for the CG perspective of the camera (fill) and the LED matte for composition.
Set Frame Buffer Pixel Format to Float RGBA or 8 bit RGBA in Project Settings if using a soft (feathered) matte. It will allow enough bit depth for the feathered part of the matte.
Open the panel from Window/Developer Tools. If it's not present, enable the Timed Data Monitor plugin.
Use Time Correction value (in seconds) to delay the tracking. Watch Timing Diagram and make sure the vertical bar is green and safely inside the buffer. If it's not, set the Buffer Size to a higher value.
Use the augmented cones as a reference to set the delays. Pan the camera in a series of short, sharp movements to see if the cones stay on the corners of the LED. Stop between the movements to distinguish whether the cones or the video moves first, then adjust the Time Correction. Time Correction for tracking is saved with the Live Link preset, whereas the video delay is set in the Blackmagic source file. If not saved there, the values will default to 0 after the project is reloaded or the video dropped.
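Conceptually, the Time Correction holds each incoming pose sample and releases it a fixed interval later, so tracking lines up with the slower video path. The sketch below is a simplified model of that idea, not Mo-Sys's implementation; the delay value is illustrative.

```python
# Sketch: a simplified model of tracking Time Correction - buffer pose
# samples and release each one `delay_s` seconds after its timestamp.
from collections import deque

class DelayBuffer:
    def __init__(self, delay_s):
        self.delay_s = delay_s
        self.samples = deque()  # (timestamp, pose) pairs, oldest first

    def push(self, timestamp, pose):
        self.samples.append((timestamp, pose))

    def pop_due(self, now):
        """Return all poses whose delayed release time has passed."""
        due = []
        while self.samples and self.samples[0][0] + self.delay_s <= now:
            due.append(self.samples.popleft()[1])
        return due
```

With a delay of 0.04 s (one frame at 25 fps), a pose timestamped at t is released at t + 0.04, matching a video path that is one frame behind.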
The standard workflow of colour adjustments using set extensions is as follows:
At this step we assume the first two steps are done. Use XR Camera Settings to adjust the set extensions.
Open the panel starting from the Mo-Sys icon.
XR Controller exposes some of the settings used for set extension. This includes:
If in Play Mode, click Refresh button to be able to change parameters dynamically.
You can use Mesh Builder to generate the representation of the LED wall, by looking through a camera with the crosshair at the corners of the screen. Two or three observations are used for calculations.
The tool was designed for rectangular screen or a combination of rectangular screens. However, spawned corners can facilitate placing a more complex mesh as well.
The procedure of finding the corners and generating a mesh is as follows:
The panel can be used as a quick colour correction tool. It captures the incoming video and tries to adjust the set extension to match. It works per camera; if using multiple cameras, select the one you wish to correct. Before starting, move your physical camera to a point where most of the image is filled with the LED. Make sure the MoSysCamera is tracked and that props or AR objects are not in the camera's field of view.
Follow the wizard:
2. Click Capture Camera.
3. Click Preview.
The preview should show the colour corrected image. Apply it to Mo-Sys Output with Apply Correction checkbox.
You can start again with Reset.
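To give a feel for what such a correction computes, the sketch below shows the simplest possible version: per-channel gains that match the CG set extension's average colour to the captured camera image. The real tool is certainly more sophisticated; this only illustrates the underlying idea, and the pixel values are made up.

```python
# Sketch: the simplest form of video/CG colour matching - per-channel gains
# computed from average colour, applied to CG pixels. Illustrative only.

def channel_gains(video_pixels, cg_pixels):
    """Per-channel gain = mean(video) / mean(cg), over (r, g, b) tuples."""
    gains = []
    for ch in range(3):
        video_mean = sum(p[ch] for p in video_pixels) / len(video_pixels)
        cg_mean = sum(p[ch] for p in cg_pixels) / len(cg_pixels)
        gains.append(video_mean / cg_mean if cg_mean else 1.0)
    return gains

def apply_gains(pixel, gains):
    """Scale one (r, g, b) pixel by the gains, clamped to 8-bit range."""
    return tuple(min(255.0, c * g) for c, g in zip(pixel, gains))
```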
Auto-Geometry Correction is a tool that allows for a seamless transition between the content in the set extensions and the content on the LED screen. It compares the CG-only image in the XR scene with the video image of the LED screen and then computes a homography correction that can be applied to the XR composite.
Steps on how to use it:
Once you have the save file, you can load it and use it with Read From File mode, skipping the live capture steps.
Other features of geometry match panel:
Visualizer - visualizes feature detection / matching for debug purposes. Can choose between Matches, Error and Overview.
Pause Capture - holds the current homography in Live Capture mode. Can be used to evaluate correction with homography calibration features disabled.
Enable camera switching - allows setting different save files for different cameras and switching between them.
Samples - allows to view currently stored samples, delete specific ones.
Detection threshold (Advanced Settings) - can be adjusted if not enough features are detected.
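For readers unfamiliar with the term, a homography is a 3x3 matrix that warps one image plane onto another, which is the kind of correction Auto-Geometry Correction applies to the XR composite. The sketch below shows how such a matrix maps a 2D point; the matrix values are illustrative, whereas the real one is solved from the detected feature matches.

```python
# Sketch: applying a 3x3 homography to a 2D point. The matrix values are
# illustrative; the real correction is solved from feature matches.

def apply_homography(H, point):
    x, y = point
    xh = H[0][0] * x + H[0][1] * y + H[0][2]
    yh = H[1][0] * x + H[1][1] * y + H[1][2]
    w  = H[2][0] * x + H[2][1] * y + H[2][2]
    return (xh / w, yh / w)  # divide out the projective scale

# The identity homography leaves every point unchanged.
IDENTITY = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
```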
XR Multi Cam is a tool to seamlessly switch between cameras in Mo-Sys VP Pro XR. It sends a command to the Primary Render Node to update the perspective on the LED volume, and a smooth transition can be achieved by adjusting the delays.
This can be seen in the sample level: L_MoSysXR_MultiCam_Example.umap.
Use an Editor PC and apply the settings from XR engine project setup. Don't synchronise DefaultGame.ini in Config folder!
MultiViewXR can provide an AI-assisted director preview that removes the incorrect background from the non-live cameras and replaces them with the correct perspective background.
Two additional PCs are required. The first PC runs VP Pro in MultiViewXR Preview mode to generate the quad-split CG backgrounds. Project Settings > Mo-Sys VP > Mode. DeckLink SDI card configuration must match that of Pre-Keyer Mode.
There is a sample scene in Plugins > MoSysVPPro > StarterContent > Maps > L_MoSysXR_MultiViewXRCG_Example. This loads a sample overlay on the preview, found in Plugins > MoSysVPPro > XR > MultiViewXR > MultiViewXROverlayWidget.
The fill and key outputs of this engine are fed - along with a quad split of the cameras - to the MultiViewXR processor as shown.
Go to Project Settings > Collision and verify it has a custom Object Channel: RefPlane.
Go to Project Settings > Collision and verify it has a custom Object Channel: LED.
Make sure the Viewport's mesh is a Static mesh and not the nDisplayScreen. The mesh should have a collision preset set to LED.
Refer to section 3. Share and map drives.
This could be due to non-standard (non-ASCII) symbols in the project path. Try a simple directory path using the English alphabet only.
To come back to 10 bit or 8 bit format, go to \
Close all the viewports and open the Mo-Sys Viewport again. Then open other viewports.
Currently, the Grass Valley switcher implementation is in Beta. It requires further testing but is working with simulators.