
Some background 📖 We experience live video in one of 2 ways:
📞 Video calling: 2-way real-time interactive calls with limited scale. Think Zoom.
🎥 Live streaming: 1-way broadcast to millions of users, usually with a delay of a few seconds. e.g. Twitch, IG Live. (2/n)
Both of these run on different underlying protocols: WebRTC (video calling) and HLS/DASH (live streaming). Developers building an experience that combines both video calling and live streaming elements have to use 2, sometimes 3, separate vendors/SDKs. (3/n)
The pandemic blurred the lines between live streaming and video calling. Everyone’s home is a studio now 🎙️ Boring 1-way live streams have been replaced by collaborative discussions: the audience comes on stage, streamers add co-hosts in real time, all at massive scale. (4/n)
But the world of SDKs hasn't kept up. Most SDKs ask developers to choose either streaming (limiting interactivity) or calling (limiting scale). Developers have to manually stitch together multiple systems to build a large-scale live interactive streaming experience. (5/n)
100ms brings video calling and streaming into a single SDK. Developers don’t have to maintain state between different SDKs, build their own real-time DBs to manage interactions and roles, or manage CDN/transcoding pipelines. With 100ms, all of this just works. (6/n)
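To make the "manage interactions and roles" point concrete, here is a minimal sketch of what a unified room-and-roles surface could look like. All names (`Room`, `bringOnStage`, the role strings) are illustrative assumptions for this thread, not the actual 100ms API: the idea is that moving someone from the broadcast audience to the interactive stage becomes a single role change instead of tearing down an HLS player and hand-negotiating a WebRTC connection.

```typescript
// Hypothetical unified-SDK surface (illustrative only, NOT the real 100ms API).
// One Room tracks everyone, whether they are on the real-time stage or
// watching the broadcast, so the app never syncs state across two SDKs.

type Role = "broadcaster" | "stage" | "viewer";

class Room {
  private roles = new Map<string, Role>();

  join(peerId: string, role: Role = "viewer"): void {
    this.roles.set(peerId, role);
  }

  // In a stitched-together stack, promoting a viewer would mean switching
  // them from an HLS player to a fresh WebRTC session. Here it is one call.
  bringOnStage(peerId: string): void {
    if (this.roles.get(peerId) !== "viewer") {
      throw new Error(`${peerId} is not a viewer`);
    }
    this.roles.set(peerId, "stage");
  }

  peersWithRole(role: Role): string[] {
    return [...this.roles.entries()]
      .filter(([, r]) => r === role)
      .map(([id]) => id);
  }
}

const room = new Room();
room.join("host", "broadcaster");
room.join("alice");          // joins as a broadcast viewer
room.bringOnStage("alice");  // promoted to real-time participant
console.log(room.peersWithRole("stage"));
```

A real SDK would also handle media negotiation behind the role change; the sketch only shows the shared state that developers otherwise rebuild themselves.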
What does this mean for businesses? For PMs: added interactivity 👯‍♀️ If your users are on 1-way streaming (YT, Instagram, or a custom HLS stack), you can now offer real-time interactive features (bringing people on stage, polls, chats) with a few API calls. (7/n)
For CXOs: cost savings 💰 If you're relying on a pure real-time calling stack (Zoom, custom WebRTC) for use cases with interactivity and >100 participants, you can reduce your cost with 100ms' Live Stream SDK. (8/n)
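The cost argument above can be sketched with back-of-the-envelope arithmetic. The per-minute rates below are hypothetical placeholders, not 100ms' or any vendor's actual pricing; the point is structural: only the handful of on-stage participants need the expensive real-time path, while the audience rides the cheaper broadcast path.

```typescript
// Illustrative cost model only. Rates are made up for the comparison,
// not real pricing from 100ms or anyone else.
const RTC_RATE = 0.004; // $/participant-minute, real-time path (hypothetical)
const HLS_RATE = 0.001; // $/viewer-minute, broadcast path (hypothetical)

// Everyone on a pure real-time calling stack.
function allRealtimeCost(participants: number, minutes: number): number {
  return participants * minutes * RTC_RATE;
}

// Split stack: a few on stage in real time, the rest watching over HLS.
function splitStackCost(onStage: number, viewers: number, minutes: number): number {
  return onStage * minutes * RTC_RATE + viewers * minutes * HLS_RATE;
}

// A 60-minute session with 1000 people: 5 on stage, 995 watching.
const pure = allRealtimeCost(1000, 60);   // ≈ $240
const split = splitStackCost(5, 995, 60); // ≈ $60.90
console.log({ pure, split });
```

Whatever the real rates, the gap grows with audience size, since broadcast delivery scales far more cheaply than per-participant real-time connections.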