Pixel Streaming for Linux: The Road to 4.25

In this article we explain how to get started with the 4.25 version of Pixel Streaming for Linux and discuss the challenges we overcame in developing it.

It’s been over a year since Adam Rehn and I first managed to get Pixel Streaming working under Linux with Unreal Engine version 4.23 and the OpenGL rendering backend. Since then, Epic Games has deprecated OpenGL to better focus on SM5 features with DirectX 12 and Vulkan. This posed an interesting problem for Pixel Streaming for Linux, as the NVIDIA NVENC hardware video encoder used by Pixel Streaming does not currently support passing Vulkan device memory directly into the encoder. What it does support, however, is accepting CUarray and CUdevice pointers from the NVIDIA CUDA driver API. It also just so happens that CUDA supports importing external device memory via opaque file descriptors, which Vulkan is capable of emitting with the VK_KHR_external_memory extension. Armed with this knowledge, we have upgraded the Unreal Engine’s Pixel Streaming plugin and added some basic CUDA features to enable Pixel Streaming under Linux in 4.25.
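The Vulkan-to-CUDA bridge described above can be sketched as follows. This is a minimal illustration of the relevant Vulkan and CUDA driver API calls, not the plugin’s actual implementation: error handling, the surrounding allocation code, and NVENC registration are all omitted, and the helper function name is ours.

```cpp
// Sketch: exporting Vulkan device memory as an opaque file descriptor and
// importing it into CUDA as a CUarray that NVENC can consume.
// Assumes an initialised VkDevice and CUcontext on the same physical GPU.
#include <vulkan/vulkan.h>
#include <cuda.h>

CUarray ImportVulkanMemoryAsCudaArray(VkDevice device, VkDeviceMemory memory,
                                      size_t size, unsigned width, unsigned height)
{
    // 1. Ask Vulkan for an opaque file descriptor representing the allocation.
    //    The memory must have been allocated with VkExportMemoryAllocateInfoKHR
    //    specifying VK_EXTERNAL_MEMORY_HANDLE_TYPE_OPAQUE_FD_BIT.
    VkMemoryGetFdInfoKHR fdInfo = {};
    fdInfo.sType = VK_STRUCTURE_TYPE_MEMORY_GET_FD_INFO_KHR;
    fdInfo.memory = memory;
    fdInfo.handleType = VK_EXTERNAL_MEMORY_HANDLE_TYPE_OPAQUE_FD_BIT;
    int fd = -1;
    auto pfnGetMemoryFd = (PFN_vkGetMemoryFdKHR)
        vkGetDeviceProcAddr(device, "vkGetMemoryFdKHR");
    pfnGetMemoryFd(device, &fdInfo, &fd);

    // 2. Import the file descriptor into CUDA as external memory.
    CUDA_EXTERNAL_MEMORY_HANDLE_DESC memDesc = {};
    memDesc.type = CU_EXTERNAL_MEMORY_HANDLE_TYPE_OPAQUE_FD;
    memDesc.handle.fd = fd;
    memDesc.size = size;
    CUexternalMemory extMem;
    cuImportExternalMemory(&extMem, &memDesc);

    // 3. Map the external memory as a mipmapped array and grab level zero,
    //    yielding a CUarray that can be passed to the NVENC API.
    CUDA_EXTERNAL_MEMORY_MIPMAPPED_ARRAY_DESC arrayDesc = {};
    arrayDesc.arrayDesc.Width = width;
    arrayDesc.arrayDesc.Height = height;
    arrayDesc.arrayDesc.Depth = 0;
    arrayDesc.arrayDesc.Format = CU_AD_FORMAT_UNSIGNED_INT8;
    arrayDesc.arrayDesc.NumChannels = 4;
    arrayDesc.numLevels = 1;
    CUmipmappedArray mipArray;
    cuExternalMemoryGetMappedMipmappedArray(&mipArray, extMem, &arrayDesc);
    CUarray level0;
    cuMipmappedArrayGetLevel(&level0, mipArray, 0);
    return level0;
}
```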

The sections below explain how to get started with the 4.25 version of Pixel Streaming for Linux and discuss in more detail our quest to get Pixel Streaming for Linux working with the newer Engine release.


Enough with the exposition, just tell me how to do the thing!

Before jumping straight into building our fork of the Unreal Engine, you will need the following:

  • An NVIDIA GPU with up-to-date NVIDIA binary drivers (450.51 or higher)
  • NVIDIA CUDA Toolkit (10.1 or higher)
  • The 4.25-pixelstreaming branch from our custom fork of the Unreal Engine

I would also recommend having a read of Epic’s official Pixel Streaming documentation to familiarise yourself with the overall stack:

  • Getting Started with Pixel Streaming walks users through the basics of setting up the official reference implementation of Pixel Streaming with its Cirrus signalling server under Windows.
  • Pixel Streaming Demo demonstrates how to use custom HTML and JavaScript to add browser interactions with an Unreal project using the Pixel Streaming plugin under Windows.

Building the Engine

We have modified the Engine’s setup script to automatically download our updated binaries for WebRTC, so the process for building the Engine mostly follows the standard method under Linux. After installing the latest NVIDIA drivers and the CUDA Toolkit, run the following commands:

# Clone the source code for the 4.25 version of Pixel Streaming for Linux
git clone --depth 1 '' -b 4.25-pixelstreaming && cd UnrealEngine

# Download third-party dependencies, including the updated WebRTC binaries
./Setup.sh

# Generate project files for Visual Studio Code
./GenerateProjectFiles.sh -vscode -platform=Linux

# Build the Unreal Engine
make

# Create a symlink for (see below for notes on the appropriate location)
ln -s /path/to/your/ Engine/Binaries/Linux/

The last step is to create a symlink to the library in a path where the Editor executable can find it. The exact location of this file varies based on your Linux distribution; see the notes in the repository’s README for details on common locations. This requirement stems from a bug in how the Unreal Engine currently locates system libraries, and we have an open issue to fix it.

Once the initial build steps are complete, you are ready to work with the plugin. The Building and running section of the repository’s README provides step-by-step instructions on how to set up and run the Pixel Streaming Demo project provided by Epic.

The journey to 4.25 and Vulkan

The single biggest requirement that we set for ourselves going into this endeavour (other than simply getting it working) was to have as minimal an impact on existing Unreal Engine code modules as possible. The idea behind this requirement was that if people wanted to merge our changes into their own custom Engine fork then there would be as few conflicts as possible. This was challenging, as we needed to add CUDA to the Engine in order to bridge NVENC and Vulkan, whilst also gaining access to the underlying device memory pointers from the Vulkan RHI.

Adding CUDA to the Engine

One of the first problems we encountered is that the CUDA documentation states that having multiple CUDA contexts per device in a single process will result in a loss of performance. We have not performed any testing to verify this, but it’s generally safe to assume that NVIDIA knows what they are talking about when it comes to their own technology stack. To avoid imposing performance limitations on developers who want to use CUDA in tandem with Pixel Streaming in their Unreal projects, we have written a small plugin that allows the Unreal Engine to manage CUDA contexts within the current process. The CUDA plugin’s sole purpose is to initialise CUDA and then maintain a map of device UUIDs to contexts, ensuring that only a single context is used per device. This facilitates a workflow where developers retrieve the appropriate context for a given device from the plugin rather than attempting to create one directly, allowing their own CUDA code to co-exist with the Pixel Streaming plugin without incurring a multi-context performance penalty, even when using multiple GPUs.
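In code, the bookkeeping described above might look something like the following sketch using the CUDA driver API. The class and member names here are ours for illustration, not the plugin’s actual API, and error handling is omitted:

```cpp
// Sketch: guaranteeing a single CUDA context per physical GPU by keying
// contexts on the device UUID reported by the driver.
#include <cuda.h>
#include <algorithm>
#include <array>
#include <map>

class FCUDAContextRegistry
{
public:
    // Returns the one context associated with the given device, creating it
    // on first request so that all later callers share it.
    CUcontext GetContext(CUdevice Device)
    {
        // Retrieve the device's UUID and use it as the map key
        CUuuid Uuid;
        cuDeviceGetUuid(&Uuid, Device);
        std::array<char, 16> Key;
        std::copy(std::begin(Uuid.bytes), std::end(Uuid.bytes), Key.begin());

        auto Existing = Contexts.find(Key);
        if (Existing != Contexts.end())
        {
            return Existing->second;
        }

        // First request for this device: create and cache a context
        CUcontext Context;
        cuCtxCreate(&Context, 0, Device);
        Contexts[Key] = Context;
        return Context;
    }

private:
    // Device UUID -> context, ensuring one context per physical GPU
    std::map<std::array<char, 16>, CUcontext> Contexts;
};
```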

This plugin is not enabled by default and is instead loaded as a dependency of the Pixel Streaming plugin under Linux. The module loads in the pre-default stage so that other default-loaded plugins can bind to its PostCUDAInit delegate. It doesn’t actually initialise CUDA immediately upon loading, but instead binds InitCuda to FCoreDelegates::OnPostEngineInit, which fires once the RHI has been initialised. This ensures that the initial CUDA context is created on the same GPU device as the RHI, which for our use case is a requirement in order to get access to Vulkan device memory.
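The deferred-initialisation pattern described above can be sketched like this. The class layout and delegate member shown are assumptions based on the description, not the plugin’s exact code:

```cpp
// Sketch: a module that loads early but defers CUDA initialisation until
// the RHI has chosen a GPU device.
#include "CoreMinimal.h"
#include "Misc/CoreDelegates.h"
#include "Modules/ModuleInterface.h"
#include <cuda.h>

class FCUDAModule : public IModuleInterface
{
public:
	// Fires after CUDA has been initialised, so plugins that load in later
	// stages can safely bind their own setup code to it.
	FSimpleMulticastDelegate PostCUDAInit;

	virtual void StartupModule() override
	{
		// Don't touch CUDA yet: the RHI hasn't selected a GPU at this point.
		FCoreDelegates::OnPostEngineInit.AddRaw(this, &FCUDAModule::InitCuda);
	}

private:
	void InitCuda()
	{
		// The RHI is now initialised, so the context created from here can
		// live on the same device that the RHI is rendering with.
		cuInit(0);
		PostCUDAInit.Broadcast();
	}
};
```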

Getting it working with Vulkan

The next requirement is that we have to allocate our own textures using Vulkan. This serves two purposes:

  1. We need the VkDeviceMemory to generate the external memory opaque file descriptor that CUDA can wrap; and
  2. By allocating the memory ourselves and passing it to FVulkanDynamicRHI::RHICreateTexture2DFromResource, we can guarantee that Unreal won’t try to pack the texture in with other textures.
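The export-capable allocation required for step 1 can be sketched as a fragment like the one below. It assumes an existing VkDevice, queried memory requirements, and a chosen memoryTypeIndex, and is not the plugin’s actual code:

```cpp
// Sketch: allocating Vulkan device memory that can later be exported as an
// opaque file descriptor. Chaining VkExportMemoryAllocateInfoKHR into the
// allocation is what makes the memory exportable at all.
VkExportMemoryAllocateInfoKHR exportInfo = {};
exportInfo.sType = VK_STRUCTURE_TYPE_EXPORT_MEMORY_ALLOCATE_INFO_KHR;
exportInfo.handleTypes = VK_EXTERNAL_MEMORY_HANDLE_TYPE_OPAQUE_FD_BIT;

VkMemoryAllocateInfo allocInfo = {};
allocInfo.sType = VK_STRUCTURE_TYPE_MEMORY_ALLOCATE_INFO;
allocInfo.pNext = &exportInfo;              // chain the export request
allocInfo.allocationSize = requirements.size;
allocInfo.memoryTypeIndex = memoryTypeIndex;

VkDeviceMemory memory;
vkAllocateMemory(device, &allocInfo, nullptr, &memory);
// The resulting memory backs the texture handed to
// FVulkanDynamicRHI::RHICreateTexture2DFromResource, and its file descriptor
// can be retrieved later with vkGetMemoryFdKHR.
```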

This is all done inside the FCUDATexture class, which keeps a reference to the CUDA texture and forms part of the new CudaVideoEncoder submodule of the Pixel Streaming Engine plugin. This Linux-only submodule adds a CudaVideoEncoder class (NVENC configured with CUarray as input) to the list of H.264-capable encoders in the Engine’s AVEncoder module, mimicking the existing NvVideoEncoder class but replacing the parts of the code related to DirectX. Like NvVideoEncoder, the CudaVideoEncoder implementation copies each rendered framebuffer to a managed backbuffer (this time with memory wrapped in a CUarray) and passes it to NVENC to be encoded and then output to WebRTC.

This results in only one extra texture copy per frame, the same as in the reference Windows implementation. While it might theoretically be possible to wrap the framebuffer itself in a CUarray to remove this copy, the nanosecond-scale cost saving observed in our testing doesn’t warrant the added complexity of handling the case where NVENC fails to free up an active framebuffer in time for it to be drawn to.


Building WebRTC

One of the most unexpectedly frustrating challenges we had to overcome was actually building WebRTC for use with Unreal Engine 4.25 under Linux. Credit goes to my colleague Nicholas Pace for solving this tangled web of patches and configuration scripts.

In Unreal Engine 4.23, the Linux binaries for WebRTC that shipped with the Engine went largely unused and did not match the version of the Windows binaries. This meant that Adam needed to build them from source, which involved a bit of trial and error but was largely a straightforward process. This simplicity was a result of the fact that the WebRTC libraries were linked into a standalone executable called WebRTCProxy, which did not link against the Unreal Engine itself, meaning that it was not necessary to customise the WebRTC build process in order to ensure compatibility with the Engine. (See this page for details of why correctly building third-party libraries for compatibility with the Unreal Engine is so challenging.) However, the WebRTCProxy program was merged into the Pixel Streaming plugin itself in Unreal Engine 4.24, so the WebRTC binaries for newer versions of the Unreal Engine must be built correctly in order to ensure compatibility with the Engine’s other third-party dependencies.

In Unreal Engine 4.25, the Linux binaries for WebRTC that ship with the Engine do actually match the version of their Windows counterparts, and so initially we believed that it would not be necessary to build WebRTC from source at all this time around. Unfortunately, Epic’s build script for WebRTC had not been updated to account for a change in the default build behaviour of WebRTC under Linux in newer versions of the library, and we discovered that the Linux WebRTC binaries that ship with 4.25 are in fact thin archives that reference object files which are not included with them. This renders them useless because there isn’t actually any code to link against!

And so, in steps our fearless hero Nick to take up the quest of not only getting WebRTC to build, but to do so without clashing with other third-party libraries that the Engine depends on. The first major problem was to identify which version of WebRTC the Unreal Engine is actually using.

There are at least three different naming conventions used to refer to releases and revisions inside the WebRTC source repository, and since none of them are cleanly exposed in the web view of the repository, finding a particular revision requires manually trawling through changelog pages. Specifically, the module rules file for WebRTC in the Unreal Engine source tree (WebRTC.Build.cs) references revision “rev.24472”, which is in fact release M70. As Nick discovered, however, that release contains a bug in a DEPS file which means it won’t configure correctly. Attempting to patch the DEPS file causes the gn build system to complain, so it was necessary for Nick to test each subsequent commit until he found the earliest revision that contained the bugfix without being so new as to break compatibility with the existing code in the Pixel Streaming plugin.

Now that Nick had the correct version of the WebRTC source code, he had to determine the correct build flags. Because Google migrated the build system for WebRTC from gyp to gn between the versions of the library used by Unreal Engine 4.23 and 4.25, he was unable to simply reuse the build flags from Epic’s official build script in the 4.23 branch of the Engine source code (a problem compounded by the fact that the relevant build scripts were removed in later Engine branches). Since the flags of the two disparate build systems do not have a one-to-one relationship, it was necessary for Nick to manually determine the gn equivalents of the old gyp flags used in 4.23.

After further experimentation and testing, Nick determined that it was also necessary to specify the build flag is_component_build=true, as without it gn will bundle OpenSSL into libwebrtc.a. This behaviour is problematic since the Unreal Engine already links directly against OpenSSL, and the inclusion of two versions (even if they are identical) results in duplicate symbol errors when WebRTC is linked into the Pixel Streaming plugin. By setting is_component_build=true we avoid this, but Google has explicitly disabled that functionality when building WebRTC, so we have to apply a small patch to bypass the assertion that “Component builds are not supported in WebRTC”.
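As an illustration, the resulting gn workflow looks something like the following. Only is_component_build=true comes from the findings above; the other flags shown are common WebRTC build options included purely as examples, not the exact set we use:

```shell
# Generate ninja files for a component build of WebRTC. Flags other than
# is_component_build are illustrative examples only.
gn gen out/Release --args='
  is_debug=false
  is_component_build=true
  rtc_include_tests=false
'

# Compile the library with the generated ninja files
ninja -C out/Release
```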

The final challenge was to build WebRTC with the same external dependencies as the Unreal Engine to avoid symbol mismatches between the libc++ standard library implementation used to compile the Engine and the one that Google uses to build WebRTC. We achieved this by performing the build in a CentOS 7 container (since Unreal Engine 4.25’s bundled toolchain is based on CentOS 7) and copying the OpenSSL binaries downloaded by the Engine’s setup script into the container image. This solution does not work for later releases (M85+) since the latest version of gn requires a version of glibc which is newer than the one shipped with CentOS 7. For these newer releases of WebRTC, it is necessary to perform the build under a newer Linux distribution such as CentOS 8 and instruct the build process to use the system libraries from a CentOS 7 sysroot image by specifying the sysroot= build flag that was introduced in later versions of gn. This allows the build tools to run in the newer host environment whilst ensuring the resulting WebRTC binaries are still built against the older version of glibc in order to maximise compatibility. Expect to see this solution utilised in later releases of Pixel Streaming for Linux that incorporate a more up-to-date version of WebRTC.
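The container-based approach can be sketched roughly as follows. Image names, paths, and the sysroot location are illustrative assumptions, not our exact scripts:

```shell
# Build inside a CentOS 7 container so the resulting binaries link against
# the same glibc/libc++ versions as Unreal Engine 4.25's bundled toolchain.
docker run --rm -v "$PWD/webrtc:/webrtc" -w /webrtc centos:7 \
    bash -c 'gn gen out/Release && ninja -C out/Release'

# For WebRTC M85+, the build tools need a newer glibc, so run them on a newer
# distribution and point gn at a CentOS 7 sysroot instead, e.g.:
#   gn gen out/Release --args='... sysroot="/opt/centos7-sysroot"'
```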

The results of Nick’s Herculean endeavour can be found in the tw-build folder, where these learnings have been distilled down to a Dockerfile and some scripts.

When it all comes together

After overcoming all of the challenges described above, we now have a Pixel Streaming plugin that works with Unreal Engine 4.25 under Linux. Not only that, but it can be used seamlessly with existing projects such as Epic’s official Pixel Streaming Demo. We have plans to continue development of the plugin (discussed in the section below), but at this time we have not yet tested it with Unreal Engine 4.26. If you are interested in the development of Pixel Streaming for Linux and would like to help propel it forward, please check the repo’s issue tracker for our current plans, and feel free to reach out to us if you have more bespoke requirements that would be better served by engaging us in a consulting capacity.

Further development

Pixel Streaming for Linux is still under active development, with the following enhancements already planned:

  • Testing with Unreal Engine 4.26
  • Fixing how Unreal finds system dynamic libraries (issue link)
  • Support for video decoding via NVDEC (issue link)
  • AMD support via the AMF encoder (issue link)
  • Further upgrading WebRTC to take advantage of advancements in the API (issue link)

Further reading

  • Adam’s article Pixel Streaming in Linux containers discusses how to use Linux containers to package and run Pixel Streaming projects, which is the methodology we recommend for deploying this technology in the cloud.