From the broadcasting industry perspective, WebRTC is not “complete”. Unlike many protocols before it, it cannot be implemented once and forgotten: it lacks a standard signalling protocol to go along with it. From the web industry perspective, that omission is a wanted feature, and it did allow the protocol to be used for vastly different use cases than originally anticipated or designed for. That situation has been frozen in place for many years, as neither the web community (we have what we want) nor the broadcasting community (who needs real-time anyway, VOD will rule forever) seemed interested in jumping in and filling the gap. Until COVID-19, that is.
In the broadcasting industry, there is still one area where RTMP is ubiquitous: live streams, and most importantly media ingest for media platforms, social or otherwise (YouTube Live, Vimeo, Twitch, …). With the pandemic forcing people to stay at home and businesses to reinvent themselves around interactive feeds and remote experiences that mimic real life, the “live” segment has enjoyed a surge in interest. Really, what audiences want is the same “vibe” they would have in real life: that interaction in the moment with the rest of the audience, and with the performers. That requires better than live; that requires real-time streams.
WebRTC offers many technical advantages over older real-time protocols still in use, like RTMP and RTSP, especially when it comes to resilience to bad network conditions, adaptability, security by default, better codecs, and so on. It is also a web standard, which makes writing client applications for it much easier. Alas, the lack of a standard signalling protocol has historically made it too hard for software solutions to support, and completely impractical for hardware encoders.
Some protocols, like FTL, have tried to work around the problem. Others, like SRT, have tried to reimplement resilience and adaptability on top of a different protocol, ending up with something better than RTMP, but still not a standard. Some, like Amazon or FFmpeg, are stripping WebRTC down to raw RTP, which maintains the real-time and resilience features at the cost of encryption, interoperability, and other interesting features that WebRTC brings to the table.
Having dealt with the problem for many years, Millicast decided to focus on the biggest pain point, media ingest, with a solution that would hide all the WebRTC complexity otherwise necessary for that use case, and that would allow software like OBS, GStreamer or FFmpeg, as well as hardware encoders, to implement support for WebRTC once and for all, without compromise.
The result is WHIP, the WebRTC-HTTP Ingest Protocol. Practically, it means that using a WebRTC stack (webrtc.org’s, for example) and the open source WHIP library is all you need to support WebRTC on the sender side. The burden of WebRTC support is now on the server side. To date, the open source Medooze server and the Janus server provide implementations. Client side, Millicast’s OBS fork has a complete implementation for you to evaluate. A ticket has also been opened in the GStreamer project by one of its developers to add WHIP support.
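To make the simplicity concrete, here is a rough sketch of the WHIP exchange as specified in the IETF draft: the client POSTs its SDP offer with a `Content-Type: application/sdp` header, the server replies `201 Created` with the SDP answer in the body and a `Location` header naming the session resource, and the client later sends a DELETE to that resource to end the stream. The URLs and function names below are illustrative placeholders, not part of any real endpoint or library.

```python
# Sketch of the WHIP request/response shapes (per the IETF WHIP draft).
# URLs and SDP bodies are placeholders for illustration only.

def build_whip_post(endpoint_url: str, sdp_offer: str) -> dict:
    """The whole signalling handshake is a single HTTP POST of the SDP offer."""
    return {
        "method": "POST",
        "url": endpoint_url,
        "headers": {"Content-Type": "application/sdp"},
        "body": sdp_offer,
    }

def handle_whip_response(status: int, headers: dict, body: str) -> tuple:
    """A successful response is 201 Created: the body carries the SDP answer,
    and the Location header names the resource to DELETE when the session ends."""
    if status != 201:
        raise RuntimeError(f"WHIP ingest failed with status {status}")
    return body, headers["Location"]

def build_whip_delete(resource_url: str) -> dict:
    """Tearing the session down is a single DELETE on the returned resource."""
    return {"method": "DELETE", "url": resource_url, "headers": {}, "body": ""}
```

A real publisher would pair these request shapes with any ordinary HTTP client and a WebRTC stack that generates the offer and consumes the answer; there is no persistent signalling channel to maintain, which is precisely what makes the protocol tractable for hardware encoders.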
For those interested in implementing it and giving feedback, there will be a session at the free, online IETF Hackathon next month: