This is part three of four in this week’s series on the state of web video. Read parts one and two. The next installment will be posted tomorrow.
So far, we’ve explored how to load and control video playback and how to tweak every pixel and audio sample to suit any creative whim. Now let’s look at some recently available and upcoming tools for flinging our creations far and wide. They are still rough, though, so using them will take careful preparation and, as always, lots of testing.
Webcams, Microphones and Peer-to-Peer Streaming
The Media Capture and Streams API introduces new options for streaming audio and video on the web. Each MediaStream is a collection of synchronized audio and video tracks, which can come from a webcam or microphone or from someone else’s web browser over a peer-to-peer connection. A stream can also be played back in several ways — in a regular HTML5 audio or video element, through a Web Audio API “node” or out over a peer connection. HTML5 Rocks has an article on real-time communication between browsers (“WebRTC”) with a decent primer on the MediaStream object, but it’s a couple of years old and a lot has changed, so be sure to also check the reference.
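To make that concrete, here is a minimal sketch of capturing a webcam and microphone and playing the stream back in a video element. It uses the promise-based getUserMedia; at the time of writing some browsers still require prefixed callback forms (navigator.webkitGetUserMedia and the like), so test carefully. The cameraConstraints helper and the “preview” element id are illustrative, not part of the API.

```javascript
// Hypothetical helper: build a constraints object asking for a
// particular camera resolution. The exact constraint syntax has
// shifted between spec drafts, so check current browser docs.
function cameraConstraints(width, height) {
  return {
    audio: true,
    video: { width: { ideal: width }, height: { ideal: height } }
  };
}

// Request the webcam/microphone and play the stream in a <video>
// element. Assumes an element <video id="preview"> exists on the page.
if (typeof navigator !== 'undefined' && navigator.mediaDevices) {
  navigator.mediaDevices.getUserMedia(cameraConstraints(640, 480))
    .then(function (stream) {
      var video = document.getElementById('preview');
      video.srcObject = stream; // older browsers used URL.createObjectURL(stream)
      video.play();
    })
    .catch(function (err) {
      console.error('Camera/microphone unavailable:', err);
    });
}
```

The feature-detection guard matters here: with support this uneven, you want to fail gracefully rather than throw on browsers that lack the API.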
For now, MediaStream support is limited to Firefox, Chrome and Opera. The browsers that support MediaStream all accept cameras and microphones as sources, and all of them can both send and receive these streams with WebRTC. The technology is mature enough that Google has started using it for Hangouts and Mozilla is using it for Firefox’s upcoming Loop feature. There is no shortage of tutorials and examples using webcams and WebRTC on their own or together with WebGL or the Web Audio API. The only specific bug I’ve noticed so far is that in Firefox, a stream’s video dimensions are not available as early as they should be. In the future, we will hopefully see more sources for media streams: there are proposals to stream video from video elements or from a canvas, which would be very powerful but have so far gained limited traction.
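The sending half of a WebRTC connection can be sketched roughly as below. This is only the local wiring: exchanging the offer/answer and ICE candidates (“signaling”) happens over a channel you provide yourself, such as a WebSocket, and is omitted here. The createSenderConnection helper and onIceCandidate callback are illustrative names; at the time of writing the constructor is often prefixed (webkitRTCPeerConnection, mozRTCPeerConnection).

```javascript
// Sketch: attach a local MediaStream's tracks to a peer connection
// so they can be sent to another browser. Signaling is assumed to
// happen elsewhere (e.g. over WebSockets).
function createSenderConnection(stream, onIceCandidate) {
  var pc = new RTCPeerConnection({
    iceServers: [{ urls: 'stun:stun.l.google.com:19302' }]
  });
  // Attach each audio/video track from the captured stream.
  stream.getTracks().forEach(function (track) {
    pc.addTrack(track, stream); // older browsers used pc.addStream(stream)
  });
  // Hand ICE candidates to the caller, who relays them via signaling.
  pc.onicecandidate = function (e) {
    if (e.candidate) onIceCandidate(e.candidate);
  };
  return pc;
}
```

From there, pc.createOffer() produces a session description to send to the remote peer, whose answer you pass back with pc.setRemoteDescription().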
Similar in name to MediaStreams (and easily confused with them), Media Source Extensions enable the long-awaited ability to do “adaptive streaming.” That means that if you’re watching a video over a slow network connection, the browser can automatically and seamlessly switch to a lower bit-rate stream instead of pausing to buffer. If the connection speeds up again, the video quality comes back up.
Even though Media Source Extensions are new, support is spreading rapidly. Chrome and Internet Explorer can already use them, and YouTube takes advantage of them wherever they’re available (including Firefox, though only for WebM files and not yet for MP4). Netflix uses them as well and just announced that it will bring adaptive HTML video to the next version of Safari.
Part four will go live tomorrow. I’ll discuss the different ways web video technologies fail and what we can do about it.
Get more documentary film news and features: Subscribe to POV’s documentary blog, like POV on Facebook or follow us on Twitter @povdocs.