It’s Xmas! Or, depending on your time zone, it’s about to be Christmas. It’s the season to be jolly, and CoSMo would like to bring you a few gifts.
While we have been hard at work in the background to make sure #Webrtc gets better (some overview of that here), we have put a lot of effort in the last year into remote collaboration workflows for content production and post production.
Which is why we are releasing multiple products and services in time for Christmas, bringing Broadcast Quality to Real-Time streaming across devices.
I. Tech Emmy Awards go to … #Webrtc solutions?!?
While remote production and post-production are not new, they were historically done through file exchange and annotation, a process that is iterative and sequential, and thus not interactive.
Teamwork requires real-time connections to allow for collaborative post-production. That also is not new to the broadcast ecosystem. Sohonet, for example, offers its ClearView Pivot system. However, it requires dedicated appliances, networks and 100-200 Mbps internet connection speeds to guarantee the necessary quality to work well.
Interestingly, this year’s Emmy Awards had to be broadcast via an OTT solution because of COVID restrictions. Media engineers and producers were not able to collaborate remotely from offices with dedicated equipment and network connections, but rather from their homes with only consumer-grade hardware, routers, and public internet access. That’s where Evercast and Sohonet’s ClearView Flex came in.
The broadcast industry needed a solution that offers broadcast-quality features while still enabling consumer-grade workflows. That is the very anatomy of a solution deserving of a Tech Emmy Award, and it includes a WebRTC A-Team:
- CoSMo for the WebRTC clients, encoder/decoders and protocols
- Meetecho‘s Janus for the server and some extra platform work. (Interestingly enough, Sohonet ClearView also uses Meetecho‘s Janus server and the WebRTC stack.)
Both Evercast and Sohonet ClearView Flex use WebRTC encryption, which is not end-to-end in the use case of interest, and both were originally limited to 8-bit, 4:2:0 video and stereo sound.
II. OBS with WebRTC Inside!
The Evercast solution publicly leverages a modified version of OBS (v21) with WebRTC streaming support (libwebrtc m69), originally developed by CoSMo in collaboration with Evasyst, an e-gaming company which has since pivoted and rebranded itself as Kast.
As WebRTC was improving, the WebRTC A-Team was improving both OBS-studio-webrtc and Janus to stay at the cutting edge:
- better bandwidth management
- multi-codec rooms
- support for VP9 & AV1
- 5.1 audio pass-through for Janus, and the equivalent for OBS.
More recently, support for the first draft of the WHIP protocol was added, allowing the OBS-studio-webrtc client to connect to any platform that can ingest WebRTC.
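At its core, WHIP boils down to a single HTTP POST carrying the client’s SDP offer; the server replies with its SDP answer and a Location header used later to tear the session down with a DELETE. A minimal Python sketch of that request, with a hypothetical endpoint URL and token (the real OBS-studio-webrtc client does this in C++ on top of libwebrtc):

```python
import urllib.request

def build_whip_request(endpoint: str, sdp_offer: str, token: str) -> urllib.request.Request:
    """Build the single HTTP POST that WHIP uses to start a WebRTC ingest.

    The body is the client's SDP offer; the server answers with its own SDP
    in the response body and a Location header for later teardown (DELETE).
    """
    return urllib.request.Request(
        endpoint,
        data=sdp_offer.encode("utf-8"),
        headers={
            "Content-Type": "application/sdp",
            "Authorization": f"Bearer {token}",
        },
        method="POST",
    )

# Hypothetical endpoint and token, for illustration only:
req = build_whip_request("https://example.com/whip/endpoint", "v=0\r\n...", "SECRET")
```

The appeal of the design is exactly this simplicity: any encoder that can produce an SDP offer and issue one HTTP request can ingest into any WHIP-capable platform.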
The first Xmas gift we have for you is this: a new version of OBS-studio-webrtc, now available in beta, that includes the upgraded libwebrtc 84 with the latest WebRTC security patches and the latest version of OBS (v26). This release is the most complete and in-sync ever, with:
- NDI support
- SDI support through Black Magic Decklink Devices
- websocket remote control, and much more.
OBS-studio is great software, and a great project led by passionate people and supported by all the big platforms. But it has its limitations for post-production and broadcast use cases.
III. Beyond OBS … Advanced workflows with Millicast Encoder, Studio & Player
OBS-studio is perfect for the original use case of streaming one’s screen from a consumer PC to a social platform using RTMP. For that workflow it’s a great tool with little to nothing one can really complain about. Not to mention it’s FREE. The limitations only arise when trying to use OBS for things it was not originally designed for.
Our second Christmas gift is a solution specifically designed to overcome those limitations: native Millicast clients that simplify different parts of the production pipeline. The first desktop versions are available here.
Our Millicast clients are designed to improve the workflow in multiple places in the streaming pipeline where OBS-studio-webrtc was being used.
The Millicast Encoder is responsible for encoding the original source on a computer connected to a physical capture device (SDI, HDMI) or virtual device on the network (NDI).
In that configuration a GUI is not of great importance, but the capacity to remote-control the software (especially with COVID) was request number one.
Request number two from professional studios is the capacity to run multiple encoders in parallel. Studios have big workstations with multiple capture devices (i.e. Blackmagic DeckLink Capture Cards), each connected to a professional SDI input that needs to be individually encoded and streamed.
While these workstations have plenty of capacity, OBS was never designed to support this workflow. Trying to run multiple OBS instances in parallel is problematic, as the instances compete for system resources (CPU, memory).
A significant number of users also use OBS-studio-webrtc as an adaptor: they receive a stream through NDI, a browser source (see below), or other creative means, and either push it to a professional SDI display or masquerade it as an NDI source for other software running on the same LAN.
In the original OBS-studio, the embedded browser is old and subject to well-documented WebRTC security holes. Despite that, it is officially the preferred way to bring WebRTC streams into OBS.
Some projects have found alternative workflows specifically designed for this use case. If I had to cite only one example, OBS-Ninja does a great job at this niche use case; developed by a single developer and fully open source, it deserves credit and support.
While this is doable today with OBS-studio, we are getting quite far from OBS Studio’s original use case of encoding locally and sending that stream to a social platform, and there is still no efficient way to decode through the SDI/NDI/CEF libraries. It calls for a simplified player, with no encoding capacity, to simplify the system at the source.
That also touches on what is IOHO the biggest limitation of OBS-studio today: mobile support.
With the ‘player’ you want to support a wide range of devices, including:
- Mobile phones
- TV playback with Chromecast and Apple TV.
While Millicast has supported a Chromecast client for over two years, it did not have an equivalent to OBS on mobile.
Google Stadia is leading the way, showing how good (4K HDR 60fps Cyberpunk, anyone?) a WebRTC player can be, even on an iPhone or iPad.
This is our third Christmas gift: Millicast native clients for mobile devices, including:
- iOS: iPhone & iPad
If you want to participate in the Beta testing for iOS, iPad and AppleTV, which will last from Christmas to Chinese New Year (mid-February), contact us at hello [at] millicast.com. Beware, there are only 10,000 TestFlight seats for the testing, so first come, first served 😉
The original use case, which involves capturing multiple sources, mapping audio channels, mixing audio, compositing frames and so on is still important. These days people want to do it with several encoding pipelines in parallel, and on their iPad.
What is also extremely difficult with real-time media is debugging issues when there is a perceived degradation of quality, especially when the problem comes from the public internet connection on either the sending or the receiving side.
One of the most-used debugging tools for this in Chrome is the webrtc-internals page and its real-time statistics (Firefox and Safari have equivalent tools). Unfortunately, it only gives you part of the answer: you can see the uplink stats if you are sending, or the downlink stats if you are receiving, but you cannot see what’s happening inside the platform you’re streaming to.
In Millicast Studio we integrated full support for WebRTC statistics: you get the same level of information you would get inside a web browser, allowing for advanced debugging. On top of that, we implemented a loopback mode for senders to see themselves as their viewers would, test latency, check both their uplink and downlink quality, and more.
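The arithmetic behind those statistics is simple: the send bitrate, for example, is the delta of the cumulative byte counter between two outbound-rtp snapshots, divided by the elapsed time. A small Python sketch with hypothetical snapshot values (in a browser the same fields come from `RTCPeerConnection.getStats()`):

```python
def outbound_bitrate_kbps(prev: dict, curr: dict) -> float:
    """Estimate the send bitrate from two 'outbound-rtp' stats snapshots.

    Each snapshot mimics a WebRTC stats entry: a cumulative byte counter
    plus a timestamp in milliseconds, as reported by getStats().
    """
    delta_bytes = curr["bytesSent"] - prev["bytesSent"]
    delta_seconds = (curr["timestamp"] - prev["timestamp"]) / 1000.0
    return (delta_bytes * 8) / delta_seconds / 1000.0  # kilobits per second

# Hypothetical snapshots taken one second apart:
prev = {"bytesSent": 1_000_000, "timestamp": 0}
curr = {"bytesSent": 1_250_000, "timestamp": 1000}
print(outbound_bitrate_kbps(prev, curr))  # 2000.0 kbps
```

With a loopback mode, the same computation can be run on both the outbound-rtp and inbound-rtp entries of a single session, which is what lets a sender compare uplink and downlink quality side by side.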
The video below is an illustration of the basic features of Millicast Studio:
IV. Quality, Innovation and Differentiation
In terms of software quality, I would say that OBS is under-tested. Like anything else, if it is not tested, it is broken. The project has a very agile process and fixes bugs very quickly, but lacks a thorough test suite with coverage and the other usual tools that users of most open source projects have come to expect.
CoSMo, one of the parent companies of Millicast, has a strong focus and expertise in testing. We developed the KITE test suite with Google specifically for WebRTC interoperability, and wrote half of the WebRTC web-platform unit tests.
One of the things that makes KITE special is its capacity to test both web and native apps, on desktop and on mobile, which makes it the perfect tool to test OBS-studio-webrtc and the new Millicast native apps against web and player apps.
In the next version of OBS-studio-webrtc (m84/v26), the KITE integration tests will be made open source as part of the GitHub repository. To manage expectations: we are still getting reports of bugs due to regressions, but this is a step in the right direction. Progress, not perfection.
Innovation and differentiation.
It is unlikely that WebRTC will gain wide adoption for production and broadcast use cases if it remains based only on the version implemented in the browsers.
In terms of media quality, the current reference implementation supports only 8-bit images with 4:2:0 chroma sampling, and its H.264 and VP8 codecs support only the BT.601 color space. However, the ubiquity of browsers and the ease of development of web apps are going to make it the de facto standard for years to come.
Many, including Apple, Google, Cisco and CoSMo, are using WebRTC way beyond what is possible on the web:
- Better real-time codecs (H.265, AV1)
- Better bit depth (10-bit)
- 4:4:4 chroma sampling
- Dynamic color space (HDR)
- Scalable codecs
… the list is long even when discussing just media quality.
Even today, companies can achieve broadcast quality in real-time over the public internet without dedicated hardware!
This is an innovative concept and a decisive blow to most traditional streaming media servers that preach the usual quality-latency-bitrate triangle: a triangle that says you must sacrifice scale, latency, or quality, rather than providing a truly scalable real-time solution for professional broadcasters.
That is also true for End-To-End-Encryption which is being adopted in WebRTC through web based technology co-authored by Google and CoSMo (Apple added an early experimental implementation in the latest Safari Tech Preview 117).
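The core idea behind that end-to-end encryption work (insertable streams, and the SFrame format being standardized) is that each encoded frame is encrypted with a key the media server never sees, while a small portion is left readable so an SFU can still route and repacketize. The following is a toy Python illustration of that concept only, with a deliberately insecure XOR cipher standing in for the real cryptography:

```python
def e2e_encrypt_frame(frame: bytes, key: bytes, clear_header: int = 4) -> bytes:
    """Toy end-to-end frame encryption (NOT the real SFrame cipher).

    The first few bytes stay in the clear so an SFU can still parse enough
    of the frame to route it; the payload is encrypted with a key that the
    SFU never holds. XOR is symmetric, so the same call also decrypts.
    """
    header, payload = frame[:clear_header], frame[clear_header:]
    cipher = bytes(b ^ key[i % len(key)] for i, b in enumerate(payload))
    return header + cipher

frame = b"HDR0secret-payload"
enc = e2e_encrypt_frame(frame, b"k3y")   # SFU sees only the clear header
dec = e2e_encrypt_frame(enc, b"k3y")     # receiver recovers the frame
```

The point of the design is architectural, not cryptographic: by operating on encoded frames before the transport layer, end-to-end protection is added without the media server ever touching the content.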
Although we are huge proponents of open source projects, the reality is that the GPL license of OBS-Studio reduces the incentive for companies to embed their innovative IP in the code base. And unlike commodity technology like RTMP, embedding new technology into WebRTC represents real differentiation that will drive evolution.
Providing our own native apps allows Millicast to provide innovation to our customers early, and for our customers to develop apps based on our SDKs and their own IP while keeping their differentiation value intact.
V. Take Away for Christmas
Millicast and CoSMo are releasing today the latest version of OBS-studio-webrtc for desktop. It is fully synchronized with the latest version of OBS-studio, v26.
Millicast and CoSMo have released native clients to improve upon OBS-studio-webrtc in terms of testing coverage, size, and WebRTC support.
The desktop versions are already available, and the mobile versions (iOS, iPad, Apple TV) are being made available through TestFlight; contact hello [@] millicast.com to become part of the beta test.
While OBS-studio-webrtc will be supported in the future, its feature set will be kept on par with what is possible in browsers. The native apps and SDK, however, will have advanced features not possible in the browser today, which will bring WebRTC as close to broadcast quality as possible.
Those native apps will be made available as SDK + white label apps for all Millicast customers that want to leverage WebRTC to its maximum potential and add their own IP to their App without being subject to a GPL license.
Merry Christmas and Happy New Year to all.
Annex 1 – Evercast at IIT-RTC 2017
Speaker: Alex Cyrell
Company Name: Evercast, LLC
Bio: Co-founder of Evercast, LLC, pioneering the implementation of RTC in movie and TV production. As founder and President of OmniMount Systems, Inc., Alex Cyrell took OmniMount from a garage-level early start-up, through several growth plateaus, into an international enterprise currently exceeding nine figures in revenue. He designed, engineered and marketed an extensive line of products with myriad applications across a spectrum of industries. He established broad B2B and B2C distribution; international sales exceeded 60 countries. He has been awarded numerous patents and international awards in design engineering, media and advertising, including the prestigious Clio and Mobius awards. He personally wrote an Underwriters’ Laboratory (UL) standard. Executive Producer of ANOTHER ROAD HOME, released in over 40 movie theaters to critical acclaim from major media outlets, e.g. The New York Times, Washington Post, Christian Science Monitor, CNN and many others. Co-founder of Future Primitive Designs, Ltd., which developed breakthrough detail-replication methods and tooling processes that enabled miniaturization of over 400 miniature guitars, microphones and symphonic instruments; sales reached over 8000 stores and museums, and global sales exceeded 50 countries. Partner in Desert Dreams Productions, LLC: music recording, publishing and music video production, an evolution of Cyrell’s early career as a professional musician playing with music industry legends Michael Jackson, Tom Jones and Big Joe Turner, among others. Cyrell has consulted in several diverse industries, from green construction to health care and advanced medical diagnostics. He has been a ‘Technopolis’ mentor and guest lecturer at Arizona State University’s School of Mechanical Engineering. A member of the Society of Manufacturing Engineers (SME), the Industrial Design Society of America (IDSA) and the National Academy of Recording Arts and Sciences (NARAS). BA, Hofstra University. Graduate studies: Graduate Faculty, The New School.
Title: The Impact, Value and Future of Real Time Communication in Hollywood Movie & TV Production
Tags: WebRTC & Cloud Communications, AudioVideo
Synopsis: Tracing the evolution from celluloid film to digital production shows how the audience experience has been brought to a level unimaginable at the dawn of the moving images era. Ceaseless demand “for more and better” by increasingly sophisticated audiences makes movie making ever more complex and technologically demanding. The current Hollywood movie zeitgeist benefits now more than ever from RTC. It streamlines editorial workflow with new efficiencies and offers distinct creative advantages. But there are challenges. What has been acceptable for a conference call is inadequate for movie making and high-end TV production. Traditional mechanisms for iteration and collaboration are available but they are not “real time.” RTC streaming provides to the movie industry substantial cost reduction, enhanced productivity and creative momentum. Meeting the industry needs for real-time stream quality at higher resolution, higher frame rate and bit rate, and full-spectrum stereo audio requires a substantial effort. Added security, not only on the wire, but also watermarking, and full containers to protect the client side, are examples of the many security essentials. Specific future RTC improvements will further enhance RTC value for Hollywood and all of video content production.