Investigation into custom Unreal Engine streaming solutions

An investigation into the feasibility of developing plugins that stream Unreal Engine inputs and outputs via protocols other than WebRTC.

The contents of this report are Copyright © 2021 TensorWorks Pty Ltd and are licensed under a Creative Commons Attribution 4.0 International License.


1. Introduction

Since late 2018 1, the Unreal Engine has included a system called Pixel Streaming, whereby audio and visuals from a remote Unreal Engine instance are streamed to a WebRTC peer (typically, a web browser). Standard inputs such as keyboard, mouse, and touch can then be transmitted back from the WebRTC peer to facilitate interaction with the Pixel Streaming Unreal Engine instance. 2

The Unreal Engine’s Pixel Streaming system is built upon WebRTC, which is well supported by modern web browsers. 3 WebRTC is specifically designed as a holistic solution for bidirectional, low latency real-time communication over the public internet. There are, however, a number of streaming scenarios that are either unidirectional, non-realtime, or operate in a controlled networking environment that does not suffer from the packet loss or varying network conditions of the public internet. Examples of such scenarios include live broadcast, simulation software, legacy hardware, and streaming within a completely controlled private network. These scenarios do not necessarily require, nor support, the conglomerate of technologies that is WebRTC.

If these scenarios cannot support a WebRTC connection, the Unreal Engine Pixel Streaming system cannot be used directly. This is unfortunate, as streaming the Unreal Engine’s audio, visuals, and inputs in these use cases is, we argue, highly desirable. Fortunately, many of these streaming use cases, particularly blackbox hardware and simulation software, already support receiving streaming data via existing streaming protocols such as RTP or RTMP. This support for non-WebRTC streaming protocols presents two potential options:

  1. Implement these non-WebRTC streaming protocols inside the Unreal Engine Pixel Streaming system and stream directly from the Unreal Engine to a receiver using these protocols.

  2. Use the existing WebRTC-backed Pixel Streaming system to stream to an intermediate WebRTC gateway server that effectively forwards the WebRTC media streams via a supported streaming protocol such as RTP or RTMP.

The purpose of this report is to investigate each option. Specifically, we will identify candidate approaches for these options and then evaluate each approach in terms of its advantages, limitations, and general feasibility. We conclude this report with our recommendations regarding Pixel Streaming to these non-WebRTC receivers.

2. Background

In this section we present an overview of the relevant streaming protocols, various quality-of-experience techniques that generally accompany real-time streaming protocols, and a review of some WebRTC gateway servers and forwarders. This overview provides the relevant context to inform our investigation into the advantages, limitations, and feasibility of each potential solution in Section 3.

2.1 Streaming protocols

Streaming of real-time data, such as video, audio, and control input, is a task that has been attempted and improved upon many times in the past. As part of this report we have conducted an analysis of some of the previous works that we consider relevant as potential streaming protocols for our purpose. The overview table below provides a brief description and the key details about the relevant protocols.

Name Description Transport Typical use Year
RTP - Real-time transport protocol A protocol for the end-to-end transmission of real-time data. RTP includes sequence numbers/timing data to aid in out-of-order data reassembly. 4 UDP Video/Audio 2003
RTCP - RTP Control Protocol “Monitors the quality of service and… conveys information about the participants in an on-going [RTP] session”. 5 RTCP is the sister protocol to RTP; it is used to send QoS control messages for an RTP session. UDP Control messages for QoS 2003
RTSP - Real-time streaming protocol “RTSP provides an extensible framework to enable controlled, on-demand delivery of real-time data, such as audio and video…[RTSP] does not typically deliver the continuous streams itself…in other words, RTSP only acts as a ‘network remote control’ for multimedia servers”. 6 Typically the streams that RTSP controls will use RTP as their underlying protocol. RTSP assumes a client-server architecture. UDP/TCP Messages to create and control streams. 1998
RTMP - Real-time messaging protocol “RTMP provides a bidirectional message multiplex service over a reliable stream transport, such as TCP, intended to carry parallel streams of video, audio, and data messages, with associated timing information, between a pair of communicating peers”. 7 RTMP was acquired when Adobe purchased Macromedia in 2005. The specification was released publicly in 2009. 8 It is widely supported by media servers; however, browser/mobile support has been declining. 9 Latency is comparatively high; around 5 s is typical. 9 TCP Video/Audio 2009
SRT - Secure reliable transport “SRT is a video streaming transport protocol and technology stack (similar in concept to reliable UDP). SRT connects two endpoints for the purposes of delivering low latency video and other media streams across lossy networks”. 10 In addition to handling lossy networks, at a protocol level SRT also supports end-to-end encryption. SRT has mostly seen adoption in the live broadcast industry, thus outside of that sector protocol support may be of concern; however, it is open source. 11 UDP Video/Audio 2013
RIST - Reliable internet stream transport RIST is similar to SRT in that it is an open source 12 protocol that aims to solve reliable, low latency transmission of media in lossy networks (such as the internet). However, unlike SRT, RIST is built on top of RTP and RTSP to maximise interoperability due to the wide number of sources already supporting RTP and RTSP. 13 UDP Video/Audio 2018
WebRTC - Real-time communication for the Web WebRTC is a combination of many existing technologies, designed to achieve low-latency (specifically, sub-second) communication over the public internet. 14 It is resilient to poor network conditions and can adapt gracefully to packet loss. 14 Unlike all the other protocols discussed in this report, WebRTC itself is a holistic solution to real-time communication, handling: peer networking, session management, streaming of data, and adapting to changes in network conditions. Additionally, it is widely supported in all major web browsers. 14 TCP/UDP Video/Audio/Data 2011
Table 1: Brief overview of the relevant streaming protocols we will analyse in our investigation.

2.2 Quality-of-experience techniques

We argue RTP is likely the simplest protocol to implement of those listed in Table 1. However, unlike SRT, RIST, RTMP, and WebRTC, the RTP protocol has no in-built mechanism to handle packet loss or network jitter - these challenges are deferred to the application developer. Thus, it seems prudent to discuss how these challenges would be addressed if, say, a pure RTP streaming solution were implemented inside the Unreal Engine.

Fortunately for the developer, there exist a number of well-known and effective techniques to address packet loss and network jitter of streaming data. We call such approaches “quality-of-experience” (QoE) techniques. Some known QoE techniques include forward error correction (FEC), automatic repeat request (ARQ), jitter buffering, and packet loss concealment. We briefly introduce and explain each of these QoE techniques below. Additionally, we highlight that all proper WebRTC implementations will implicitly include FEC and jitter buffering QoE techniques. 15 Furthermore, Google’s WebRTC, the implementation that Unreal Engine’s Pixel Streaming system uses, also includes a number of custom packet loss concealment techniques. 16

Forward Error Correction (FEC)

Forward Error Correction (FEC) is the process of transmitting redundant information alongside streaming data so that when a packet is lost this redundant information can be used as an error correcting code to recover the lost information. 17 In practice, this requires keeping a buffer of received packets so that when missing packets are detected, future packets can be used to reconstruct the lost information without asking for a retransmission of old data. 18 The tradeoff is that these redundant error-correcting codes increase the total transmission size. However, the more redundancy that is added, the better FEC works in practice. 18 The amount of redundancy that is acceptable is largely defined by the bandwidth capabilities of the underlying network. As an indicator of the sort of loss recovery one could expect, a real-world study 18 demonstrated that with a 50% bandwidth overhead an RTP-based FEC implementation could recover streaming data in scenarios where packet loss was approximately 5–10%.

Automatic Repeat Request (ARQ)

Automatic Repeat Request (ARQ) is a technique where the receiver requests that the sender retransmit a lost packet. 19 ARQ requires the introduction of a buffer on the receiving end to determine whether packets are truly lost or simply out-of-order. 19 ARQ has two major disadvantages compared to FEC:

  1. ARQ requires bidirectional communication between sender and receiver. 19
  2. ARQ introduces additional latency waiting for retransmission to occur. 19

However, unlike FEC, ARQ does not require a great deal of additional transmission bandwidth. Additionally, ARQ is able to recover from large, contiguous, runs of lost packets, which is a scenario FEC would struggle with. 20 We highlight that Haivision, the creators of SRT, use ARQ in their streaming protocol because “it requires less bandwidth than other error correction methods… and introduces only a small degree of latency to the transmission”. 21
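A minimal sketch of the receiver side of a NACK-based ARQ scheme illustrates the buffering requirement mentioned above. The class name and the reordering-window heuristic are our own illustrative assumptions, not taken from any particular protocol.

```python
# Sketch of a NACK-based ARQ receiver: track sequence numbers, treat
# skipped numbers as provisionally missing, and only request
# retransmission once a gap is older than a small reordering window
# (so merely out-of-order packets are not retransmitted needlessly).

class ArqReceiver:
    def __init__(self, reorder_window: int = 3):
        self.reorder_window = reorder_window
        self.highest_seen = -1
        self.missing: set[int] = set()

    def on_packet(self, seq: int) -> list[int]:
        """Record an arriving packet; return sequence numbers to NACK."""
        self.missing.discard(seq)  # a retransmission may fill an old gap
        if seq > self.highest_seen:
            self.missing.update(range(self.highest_seen + 1, seq))
            self.highest_seen = seq
        return sorted(s for s in self.missing
                      if self.highest_seen - s >= self.reorder_window)
```

Both ARQ disadvantages are visible here: the NACK list must travel back to the sender (bidirectional communication), and a lost packet is only recovered after a round trip plus the reordering window (added latency).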

Jitter Buffering

Jitter buffering is simply the process of maintaining a buffer on the receiving end to store incoming packets. 22 It is used to reorder packets that arrive out-of-order and to provide a constant playback stream on the receiving end. 22 The disadvantage of jitter buffering is that latency increases directly with the size of the buffer. While fixed-size buffers are simpler to implement, it is also possible (and desirable) to dynamically resize the jitter buffer based on recent network conditions. Dynamic jitter buffers can use stream statistics from protocols such as RTCP to decide when to resize the buffer. We highlight that practical implementations of both FEC and ARQ contain jitter buffering.

Packet Loss Concealment

Packet loss concealment is a family of techniques whose purpose is to decide how playback should proceed in the presence of a missing packet. Unlike FEC, ARQ, and jitter buffering, which are general QoE techniques for any streaming data, packet loss concealment techniques are typically designed specifically to handle lost audio or video packets by exploiting the properties of audio and video data. For example, in some audio streams when packets are lost a technique called waveform substitution is used, whereby already-received audio packets are carefully reused and stitched into the received audio to extend the current voice sound until new data arrives. 23 Another example is Google Duo, which uses the WaveNetEQ generative model to interpolate parts of speech that are missing due to packet loss. 24

2.3 WebRTC forwarding technologies

As we mentioned in Section 1, the other option, besides implementing alternate streaming protocols inside the Unreal Engine, is to stream from the Unreal Engine’s existing WebRTC-based Pixel Streaming system to a WebRTC gateway/forwarding server. This gateway/forwarding server would translate the WebRTC stream to a protocol that the receiving end can understand, such as RTP, and then transmit the stream along to the receiver. Fortunately, due to the wide adoption of WebRTC, there are existing gateway/forwarding servers that may be suitable for this purpose. We present some of the relevant options below.


Janus

Janus is a general purpose WebRTC server written in C. It is designed such that the base Janus application has a small footprint and most extra functionality is achieved through plugins. 25 For the purposes of forwarding WebRTC media through other protocols, Janus provides a “NoSIP” plugin that acts as an “RTP bridge” that relays media from a WebRTC peer to a receiver using RTP/RTCP. 26 There are also the Janus plugins “Videoroom” 27 and “Janus-RTP-Forward-Plugin” 28 written by the community, which also appear to be viable options. We highlight that Janus is open source and licensed under GPL 3.0. While GPL 3.0 is incompatible with linking into the Unreal Engine, this is likely a non-issue because a Janus WebRTC forwarding server would sit completely outside the Unreal Engine.


Pion

Pion is a WebRTC implementation written in Go and, unlike Google’s WebRTC, is specifically designed to be fast to build and customise. 29 While Pion is not specifically a WebRTC gateway or server, it does contain an “RTP-Forwarder” example that illustrates how to use it as a WebRTC peer that forwards RTP packets elsewhere. 30 In terms of implementation, we recognise that Go is not trivially callable from Unreal Engine C++; however, this should not be a problem because Pion would be deployed outside the Unreal Engine and communicate with Pixel Streaming as a standard WebRTC peer.

Other options

We also investigated GStreamer WebRTC 31 and Kurento WebRTC server 32, both of which seemed suitable, but somewhat less documented than Janus or Pion regarding WebRTC to non-WebRTC stream forwarding. It also seems possible to implement a WebRTC forwarding server using Google’s WebRTC; however, we would not advise this course of action because, in our experience, Google’s WebRTC is complex, time consuming to build, and non-trivial to update.

3. Feasibility of various streaming solutions in Unreal Engine

In this section we evaluate the feasibility of various streaming solutions in terms of their implementation into the Unreal Engine, their general advantages and limitations, and we then briefly present our conjecture on how each solution would fare in an unreliable network.

Specifically, we will explore the feasibility of each protocol by discussing each of the following criteria:

  1. How to implement. We briefly describe the process of implementing a potential solution with the Unreal Engine’s Pixel Streaming system. Are there any Unreal Engine compatible APIs to utilise or study as a suitable reference implementation?

  2. Advantages. What are the advantages of implementing this solution over other options?

  3. Limitations. What are the limitations of implementing this solution?

  4. Unreliable network use case. How would we expect this solution to fare in a real network with packet loss, jitter, and changing network conditions (e.g. the public internet)?

UDP in the Unreal Engine

UDP streaming is likely the simplest option to implement inside the Unreal Engine; however, it has several limitations that, we argue, make it an untenable choice. Implementing video and audio streaming in the Unreal Engine using UDP does not require the implementation of any specific protocol or library, as the Unreal Engine already supports transmission of UDP packets. 33 Additionally, transmission of data using UDP lacks the packet overhead that other transmission protocols require. Pure UDP is, however, very limited: it has no mechanism for handling packet loss, out-of-order delivery, or jitter. Even in private, reliable networks, UDP will occasionally drop packets or deliver them out of order. 34 In an unreliable network, the loss of packets during UDP transmission of H.264 encoded video may well render the video stream unrecoverable. 35 It is our view that these limitations make pure UDP an untenable choice. There are several techniques that can be implemented on top of a UDP streaming solution, such as FEC and jitter buffering, that mitigate these limitations to some extent. However, we would argue that if mitigating techniques are being used to overcome the intrinsic properties of UDP, then protocols such as RTP already exist and, due to their wide adoption, should be preferred.

RTP in the Unreal Engine

RTP is a lightweight protocol that can be implemented in the Unreal Engine using the inbuilt UDP transmission APIs. If a reference implementation is required we recommend studying uvgRTP 36 37 and Live555 38. The major feature of RTP, compared to pure UDP, is that RTP transmits sequence information in its packets to aid in reconstruction when out-of-order delivery occurs. However, RTP itself has no inbuilt QoE techniques to actually address out-of-order delivery, which means that if QoE techniques such as jitter buffering or FEC are required, they must be implemented by the developer. In a reliable network without extra QoE techniques, we would expect RTP’s effectiveness to be much the same as UDP’s. With QoE techniques implemented, we expect RTP to operate effectively in a reliable network; moreover, jitter buffers could be kept relatively small, thus minimising latency. In an unreliable network, RTP would require QoE techniques to deliver a functional real-time streaming experience, and we highlight that the QoE techniques commonly paired with RTP, FEC and jitter buffers, introduce tradeoffs in bandwidth and latency, respectively.
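As an indication of how lightweight RTP is, the fixed 12-byte header defined by RFC 3550 can be packed in a few lines and sent over the engine’s raw UDP APIs. The payload type value below is an illustrative assumption (96 is a commonly used dynamic payload type for H.264), not something mandated by the protocol.

```python
import struct

# Sketch of packing the fixed 12-byte RTP header (RFC 3550): version,
# marker/payload type, 16-bit sequence number, 32-bit timestamp, and
# 32-bit SSRC, all in network byte order, followed by the media payload.

def rtp_packet(seq: int, timestamp: int, ssrc: int, payload: bytes,
               payload_type: int = 96, marker: bool = False) -> bytes:
    byte0 = 2 << 6                          # version 2; no padding,
                                            # no extension, CSRC count 0
    byte1 = (int(marker) << 7) | payload_type
    header = struct.pack("!BBHII", byte0, byte1,
                         seq & 0xFFFF, timestamp & 0xFFFFFFFF, ssrc)
    return header + payload
```

The sequence number field is what enables the reordering and loss detection discussed above; everything beyond that (jitter buffering, FEC) is left to the application, exactly as the paragraph notes.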

RTP + RTCP in the Unreal Engine

RTCP is designed to monitor real-time streams and calculate statistics, which are then used to dynamically modify QoE techniques or adjust the source stream. Should the implementation of RTCP in the Unreal Engine be pursued, we recommend studying the existing RTCP library in Live555. 38 39 We highlight that to make full use of RTCP it must also be implemented on the receiving end, which may not be possible in some blackbox systems. However, assuming RTCP is implemented on both ends and RTP is using jitter buffering, we expect RTCP would greatly improve the quality of experience in unreliable networks by allowing the jitter buffers to dynamically resize based on network conditions. Contrasted with a pure RTP jitter buffer that must assume a fixed size, we would expect the RTP + RTCP solution to have reduced latency as network conditions recover. Additionally, even if jitter buffering is not used, we would still expect RTP + RTCP to produce a better experience than pure RTP in unreliable networks because the source quality of the stream can be dynamically adjusted using information from RTCP.
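One concrete statistic that RTCP receiver reports carry is interarrival jitter, which RFC 3550 (section 6.4.1) defines as a running estimate of timing variation smoothed with a gain of 1/16. A sketch of that estimator, with both clocks expressed in the same units:

```python
# Sketch of the RFC 3550 interarrival jitter estimator carried in RTCP
# receiver reports. For each packet, the "transit" is arrival time minus
# RTP timestamp; the jitter estimate moves toward the absolute change in
# transit with a smoothing gain of 1/16.

class JitterEstimator:
    def __init__(self):
        self.jitter = 0.0
        self.prev_transit: int | None = None

    def on_packet(self, rtp_timestamp: int, arrival: int) -> float:
        """Both arguments share one clock unit (e.g. 90 kHz for video)."""
        transit = arrival - rtp_timestamp
        if self.prev_transit is not None:
            d = abs(transit - self.prev_transit)
            self.jitter += (d - self.jitter) / 16.0
        self.prev_transit = transit
        return self.jitter
```

A dynamic jitter buffer on the sending or receiving end could use this estimate (as reported back via RTCP) to decide when to grow or shrink its depth.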

RTSP in the Unreal Engine

RTSP is a protocol for controlling streaming media servers. 6 In practice, it is typically a thin protocol that uses RTP and RTCP for the underlying media streams and control messages. Therefore, we consider the practical advantages and disadvantages of implementing RTSP in the Unreal Engine to be much the same as an RTP + RTCP solution. Thus, we refer the reader to our above remarks regarding RTP + RTCP. Additionally, when seeking a reference implementation for RTSP we recommend studying Live555 38.

RTMP in the Unreal Engine

RTMP is a widely supported TCP-based protocol that was developed by Macromedia/Adobe 7. In terms of implementing RTMP into the Unreal Engine, audio and video streams from the Unreal Engine’s Pixel Streaming system would have to be chunked and transmitted using the Unreal Engine’s TCP API 33 in the RTMP packet format. We recommend studying iReader’s media server for an existing RTMP implementation. 40 The major disadvantage of RTMP is that it uses TCP as its underlying transport, which in an unreliable network will cause latency to bloat as redelivery attempts occur. 41

SRT in the Unreal Engine

SRT is a low latency UDP-based streaming protocol created by Haivision. 10 SRT is open source and Haivision provides a production ready SRT library for developers. 11 To implement SRT inside the Unreal Engine’s Pixel Streaming system we would recommend incorporating Haivision’s SRT library into its own plugin or modifying the existing Pixel Streaming plugin to include the SRT library. The advantage of SRT is that it is specifically designed for low latency streaming across unreliable networks and unlike other UDP-based protocols, such as RTP, SRT itself contains a mechanism to handle packet loss via ARQ. The main disadvantage we foresee with a solution that uses SRT is that SRT is relatively new and is unlikely to be supported by many devices on the receiving end.

RIST in the Unreal Engine

RIST is very similar to SRT: it, too, is a low latency UDP-based streaming protocol. 13 Unlike SRT, which was created by a private corporation, RIST was created by the Video Services Forum, a non-profit “comprised of service providers, users and manufacturers dedicated to interoperability, quality metrics and education for media networking technologies”. 42 To implement RIST we recommend incorporating librist 12, the official RIST library, into its own plugin or including it in the existing Pixel Streaming plugin. Much like SRT, the main disadvantage of implementing a solution that uses RIST is that it is relatively new and unlikely to be supported by many devices on the receiving end.

Unreal Engine → WebRTC bridge → non-WebRTC receiver

This solution involves leveraging the existing Pixel Streaming system in the Unreal Engine to stream to a custom WebRTC bridge that then forwards the received video and audio from the Unreal Engine to some arbitrary non-WebRTC receiver using a widely supported streaming protocol such as RTP, RTSP, or RTMP. Existing WebRTC restreaming servers include Janus 25 and the Wowza Streaming Engine 43; however, the Go-based WebRTC library Pion 29 is also an option if there are specific requirements beyond simply forwarding streams.

The advantage of this solution is that it requires no modification to the Unreal Engine or the Pixel Streaming plugin and moves the requirement to stream to other protocols outside of the Unreal Engine. Additionally, the deployment of a WebRTC bridge, particularly on its own hardware, is more scalable than streaming directly from the Unreal Engine while also eliminating the possibility of resource competition with the Unreal Engine Pixel Streaming application. Furthermore, because WebRTC is the underlying streaming technology, this approach inherits all of the low-latency, high-resiliency properties of WebRTC when configured properly.

Conversely, the disadvantage of this solution is that it requires another component, potentially running on its own hardware. Additionally, this approach may introduce some latency as the stream is forwarded through the WebRTC bridge. Furthermore, if the WebRTC bridge is not in the same local network as the receiver this approach will inherit the same qualities as whichever protocol is selected to stream from the WebRTC bridge to the non-WebRTC receiver.

4. Summary and recommendations

Based on Section 3, there are some solutions we can disregard from our recommendations as they do not satisfy our goal of ensuring compatibility with legacy and/or blackbox streaming clients. Specifically, SRT and RIST should be disregarded because they are likely to be incompatible with most legacy streaming receivers. Furthermore, it is not clear whether RTSP has any benefits over simply implementing RTP and RTCP in an Unreal Engine Pixel Streaming scenario; therefore, it too can be disregarded. Lastly, a pure UDP streaming solution may suffer from packet loss issues, even in a private network, so we disregard it as well.

Thus we consider RTP, RTP + RTCP, RTMP, or a WebRTC forwarding bridge to be the most feasible solutions to support streaming from the Unreal Engine’s Pixel Streaming system to a non-WebRTC receiver. However, of these four options, a WebRTC forwarding bridge is our recommended solution.

We argue that the RTP, RTP + RTCP, and RTMP solutions all share one major disadvantage compared to a WebRTC forwarding bridge: if stability in poor or varying networking conditions is required, these protocols do not provide a solution on their own. RTP and RTP + RTCP can be implemented with FEC, packet loss concealment, and jitter buffering to address these issues; however, this must be implemented, to some extent, on both ends, which is a non-trivial task and may be infeasible if the receiving end is a blackbox or unmodifiable legacy system. A WebRTC forwarding bridge, by contrast, gains all the low latency and poor-network resilience inherent to WebRTC. To utilise these benefits of WebRTC to maximum effect, we recommend installing the WebRTC forwarding bridge in the same local network as the non-WebRTC receiver. If the WebRTC forwarding bridge is not in the same network as the non-WebRTC receiver then this solution will be subject to the same disadvantages as the underlying forwarding protocol (e.g. RTP, RTMP).

Furthermore, even in a scenario where stability in poor or varying network conditions is of no concern, we still recommend the WebRTC forwarding bridge. The WebRTC forwarding bridge is the only solution that does not require making any modifications to the Unreal Engine. This mitigates several issues, including: the maintenance cost of updating the solutions as changes occur in the Unreal Engine source code, licensing conflicts with the Unreal Engine, and the complexity of integrating third party libraries inside the Unreal Engine. Additionally, it is the only solution that operates completely outside the Unreal Engine and is therefore far more scalable.

To demonstrate the viability of this solution we have provided a proof-of-concept using Pion WebRTC to forward RTP streams from the Unreal Engine’s Pixel Streaming system to a non-WebRTC receiver (FFplay in this case). This proof of concept can be found at:

Appendix A: Measuring latency

In any streaming solution measuring the overall and per-component latency is useful to debug, optimise, and configure the system. Based on our examination, it appears that there is no embedded mechanism within the streaming protocols we have analysed for this purpose. In this appendix we present a number of techniques of varying implementation complexity and accuracy for the reader’s consideration. We restrict our techniques to video streaming scenarios.

Embedded timecodes

This technique requires that timecodes be rendered into the source video. The video is then transmitted, and the received video image containing the rendered timecode can be visually inspected at each component and compared with the source video timecode. The difference in timecodes gives a rough estimate of the latency between the source and the tested component. The major problem with this technique is synchronising the manual inspection. While this technique is inaccurate, it is simple to implement.

Embedding metadata in the stream

This technique requires timestamps to be transmitted inside the video stream using the existing metadata fields supported by the underlying media. For example, H.264 supports a “user data unregistered SEI message” 44 that others have successfully used for this purpose. 45 46 We highlight that this technique requires components in the streaming system to have synchronised clocks. Once the timestamp is received it is compared with the system clock to determine the latency. The advantage of this method is that it is highly accurate. The disadvantage is that it artificially inflates the bandwidth, which may itself impact latency.
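As a simplified illustration, a “user data unregistered” SEI NAL unit carrying a sender timestamp might be assembled as follows. The 16-byte UUID identifying the message is an arbitrary placeholder of our own, and a conformant implementation must also insert emulation-prevention bytes into the RBSP, which we omit here for brevity.

```python
import struct
import uuid

# Simplified sketch of an H.264 "user data unregistered" SEI NAL unit
# (NAL type 6, SEI payload type 5): a 16-byte identifying UUID followed
# by an 8-byte sender timestamp, with the payload size 0xFF-escaped and
# the RBSP closed by trailing bits. Emulation prevention is omitted.

SENDER_UUID = uuid.UUID("3fd5b2a1-0000-4000-8000-000000000000").bytes

def timestamp_sei(timestamp_us: int) -> bytes:
    payload = SENDER_UUID + struct.pack("!Q", timestamp_us)
    sei = bytearray([0x06, 0x05])      # NAL type 6 (SEI), payload type 5
    size = len(payload)
    while size >= 255:                 # payload size, 0xFF-escaped
        sei.append(0xFF)
        size -= 255
    sei.append(size)
    sei += payload
    sei.append(0x80)                   # rbsp_trailing_bits
    return bytes(sei)
```

On the receiving end, the component would locate this NAL unit by its UUID, extract the 8-byte timestamp, and subtract it from its own (synchronised) clock to obtain the latency.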

Packet inspection in software

This technique does not require any modification to the streaming system; instead, it requires that packet inspection software be installed on each component in the streaming solution. When the latency test begins, the packet inspection software starts collecting all the relevant packets and logging the packet contents and the time of receipt to a file. Once the latency test is complete, all logged data across all components is processed offline such that a given packet has a corresponding timestamp at each component that received or transmitted it. Using these timestamps, overall and per-component latency can be calculated over time. One major disadvantage of this technique is that it is only possible when each packet can be uniquely identified, which is possible in protocols such as RTP where each packet is given a unique sequence number. In configurations where packets are encrypted or streams are interleaved together, this technique becomes challenging to implement. If this technique is pursued, we recommend using the packet inspection software Wireshark 47 to collect the relevant packets. In addition to packet collection, Wireshark also has an RTP stream analysis feature 48 49 that reports packet loss and jitter, both of which are useful complementary sources of information when analysing stream latency.
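The offline processing step can be sketched as follows, assuming a hypothetical log format in which each component’s capture has already been reduced to a mapping from RTP sequence number to receipt time on a synchronised clock:

```python
# Sketch of the offline merge: given per-component logs (ordered from
# source to final receiver) mapping RTP sequence number -> receipt time
# in seconds, compute the per-hop delays for every packet that was
# observed by all components.

def per_hop_latency(logs: list[dict[int, float]]) -> dict[int, list[float]]:
    """Return seq -> [delay hop 0->1, delay hop 1->2, ...]."""
    common = set(logs[0])
    for log in logs[1:]:
        common &= set(log)               # packets seen everywhere
    return {seq: [logs[i + 1][seq] - logs[i][seq]
                  for i in range(len(logs) - 1)]
            for seq in sorted(common)}
```

Summing each packet’s hop delays gives its end-to-end latency, and aggregating per hop over time shows which component is contributing the most delay.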

Further reading

  • The textbook, “RTP: Audio and Video for the Internet” is a detailed textbook about the RTP protocol and provides an in-depth discussion about all the considerations one would have to make to realistically use RTP for streaming audio and video on the public internet. For example, the book has chapters dedicated to packet loss concealment and how to add security on top of RTP. 50

  • There exists an RFC for a proposed standard regarding an “RTP Payload Format for H.264 Video”. According to the RFC, “the RTP payload format allows for packetization of one or more Network Abstraction Layer Units (NALUs), produced by a H.264 video encoder, in each RTP payload”. While this RFC remains a proposed standard (published in 2011), it is probably best to follow it, for future-proofing purposes, if streaming H.264 via RTP inside the Unreal Engine were to be attempted. 51

  • This article 52 discusses how one might implement QoE techniques such as jitter buffers on top of RTP.

  • Haivision’s SRT whitepaper describes, at a high level, how SRT is different from other real-time media streaming protocols and how it handles poor network conditions at the protocol level. 53

  • In this blog post 54 from Haivision, the creators of SRT, they provide some history on SRT and how they arrived at their solution based on their work with live broadcast transmission. They mention their previous approach of using MPEG transport streams on protected networks which, they claim, works perfectly when packet loss is low. They go on to mention that when packet loss became a problem on those protected networks they used Forward Error Correction (FEC), which solved their packet loss problems entirely. It was only when their customers wanted to move away from protected networks and broadcast between sites that they required a solution for low-latency, error-tolerant transmission over the public internet. 54


  1. Unreal Engine Documentation - Unreal Engine 4.21 Release Notes 

  2. Epic Games Blog - Pixel Streaming: delivering high-quality UE4 content to any device, anywhere 

  3. MDN - WebRTC API 

  4. IETF RFC - RTP Protocol 

  5. IETF RFC - RTCP Protocol 

  6. IETF RFC - RTSP Protocol

  7. Adobe - RTMP Protocol

  8. Pogrebnyak - What is RTMP and how it’s used in live-streaming. 

  9. Wowza - RTMP Streaming: The Real-Time Messaging Protocol Explained

  10. SRT Alliance - FAQ

  11. Github - Haivision’s Github Repository for SRT

  12. Gitlab - VideoLan’s Gitlab Repository for RIST

  13. Video Services Forum - Technical Recommendation TR-06-1

  14. O’Reilly Textbook - High Performance Browser Networking, Chapter 18, WebRTC

  15. Ebook - WebRTC For The Curious 

  16. WebRTC Org - Google’s WebRTC Architecture 

  17. Book - High Spectral Density Optical Communication Technologies, Chapter Forward Error Correction 

  18. Johan Westerlund’s Thesis - Forward Error Correction in Real-time Video Streaming Applications

  19. Vidovation - Error correction techniques for IP networks (Part 2 – ARQ)

  20. Stackoverflow - Forward Error Correction for streaming data 

  21. Haivision - Automatic Repeat reQuest (ARQ) 

  22. Dimitri Osler - RTP, RTCP and Jitter Buffer

  23. Book - Speech Coding, Chapter Packet Loss and Concealment 

  24. Google AI Blog - Improving Audio Quality in Duo with WaveNetEQ 

  25. Janus Documentation - Janus the general purpose WebRTC server

  26. Janus Documentation - NoSIP plugin documentation 

  27. Janus Documentation - VideoRoom plugin documentation 

  28. Github - Janus RTP Forwarding Plugin 

  29. Github - Pion WebRTC

  30. Github - Pion WebRTC RTP Forwarder Example 

  31. GStreamer WebRTC: A flexible solution to web-based media 

  32. Kurento Documentation - What’s Kurento? 

  33. Unreal Engine API Documentation - Runtime > Networking > Common

  34. StackOverflow - What are the chances of losing a UDP packet? 

  35. StackOverflow - Streaming a h.264 coded video over UDP 

  36. Altonen et al. - Open-source RTP library for high-speed 4K HEVC video streaming 

  37. Github - uvgRTP, an RTP library written in C++ with a focus on usability and efficiency 

  38. Live555 - The LIVE555 Media Server

  39. Github - Live555 Mirror 

  40. Github - IReader’s Media Server 

  41. Boris Rogier - Measuring network performance: links between latency, throughput and packet loss 

  42. RIST - Why RIST? 

  43. Wowza - Wowza Streaming Engine 

  44. RTP Payload Format for H.264 Video Stream Extensions - Stream Layout SEI Message 

  45. Stackoverflow - How to add metadata to transport stream? 

  46. Stackoverflow - Embedding Metadata to H.264 encoded file 

  47. Wireshark - The world’s foremost and widely-used network protocol analyzer 

  48. Wireshark - RTP Analysis 

  49. Wireshark - Decipher the RTP Stream for Packet Loss Analysis in Wireshark for Voice and Video Calls 

  50. O’Reilly Textbook - RTP: Audio and Video for the Internet 

  51. IETF Proposed Standard - RTP Payload Format for H.264 Video 

  52. Smartvox Knowledge Base - RTP, Jitter and audio quality in VoIP 

  53. Haivision Whitepaper - SRT Open Source Streaming for Low Latency Video 

  54. Marc Cymontkowski - Why We Created SRT and the Difference Between SRT and UDT