In my recent demo, I experimented with mouse and keyboard controls for interactive virtual reality in a web browser. I received some positive and constructive feedback, and it appears to be a good start for VR interactions in a static scenario where the user is seated at a desk. I thought I’d see if we could push the technology a little further for a more intuitive and untethered experience.

This week’s demo allows you to view a scene in a web browser and use a separate handheld device both as a pointer for manipulating objects and as a way to move around the virtual space. As before, it’s built entirely on web technologies, but this demo takes full advantage of the web as a superior delivery platform. If you don’t have access to your own Oculus Rift, you can use a mobile browser to look around the scene.

View the demo » (more instructions below)
View the source code »

Viewing the Demo

Open the demo page in any modern browser. (If you have an Oculus Rift headset, you can use a VR build of Firefox or Chrome.) You can even use a phone or tablet, but again you’ll need Firefox or Chrome to try my remote control. The page will show a QR code, which you can scan with the mobile device you’ll be using as the controller. Once the control web page comes up and connects to the viewer page, the QR code should disappear and you’ll have control of the pointer. In the viewer, if you have an Oculus Rift or Google Cardboard, click the “VR” button to start the full-screen 3D view.

Once you have the display working and your headset is on, hold the control device as you would a TV remote control. Point it in the direction you’re looking and tap the screen with two fingers to center the pointer on your view. You should now be free to look around and point the control device wherever you like. When you point at one of the objects in the scene, tap the screen once to grab the object and rotate it around with your hand. Tap again to let it go.

You can also use the touch screen on your remote control to move around the scene. Touch the screen and drag in any direction to move in that direction. The demo is aware of both the direction you’re pointing the phone and the direction you’re dragging your finger, so you can use either one to orient your movement. If you move in any direction other than the one you’re facing, you’ll move more slowly than if you were moving straight ahead. (Fast movement backwards or to the side in virtual reality is a sure way to cause motion sickness.)
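The demo’s exact damping curve isn’t spelled out here, but the idea of scaling speed by how well your movement direction lines up with your look direction can be sketched like this (the function name and the specific curve are illustrative, not taken from the demo’s source):

```javascript
// Sketch: scale movement speed by how closely the move direction
// aligns with the look direction. The 0.25 floor and linear curve
// are illustrative assumptions, not the demo's actual values.

// Both direction arguments are 2D unit vectors on the ground plane: [x, z].
function movementSpeed(lookDir, moveDir, maxSpeed) {
  // Dot product of two unit vectors = cosine of the angle between them,
  // ranging from 1 (straight ahead) to -1 (straight backwards).
  const alignment = lookDir[0] * moveDir[0] + lookDir[1] * moveDir[1];

  // Map alignment from [-1, 1] to a speed factor in [0.25, 1]:
  // full speed straight ahead, quarter speed straight backwards.
  const factor = 0.25 + 0.75 * (alignment + 1) / 2;
  return maxSpeed * factor;
}
```

Any monotonic mapping works; the important property is that backward and sideways motion come out slower than forward motion.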

Mobile browsers provide a handy API for determining the orientation of the device, using the built-in gyroscope and compass. I’ve found that the API is reliable but the hardware is not. There is often some drift, requiring the double-touch mentioned above to re-center the pointer, and it can sometimes fluctuate wildly. There is also a problem with lag, since browsers will only report orientation every 50 milliseconds. But work is under way to report more often, specifically to support mobile virtual reality applications.
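Reading the orientation looks roughly like this. The `deviceorientation` event and its `alpha`/`beta`/`gamma` fields are standard; the exponential smoothing is my own suggested mitigation for the fluctuation, not necessarily what the demo does:

```javascript
// Sketch: read device orientation and smooth out jitter.
// The deviceorientation event is standard; the low-pass filter is an
// illustrative mitigation for noisy readings, not the demo's code.

let smoothedAlpha = null;

// Exponential low-pass filter: blend each new reading with the previous
// smoothed value. weight near 1 trusts new data more (less smoothing).
function smooth(prev, next, weight) {
  if (prev === null) return next;
  return prev + weight * (next - prev);
}

function handleOrientation(event) {
  // alpha: compass heading (0-360), beta: front/back tilt,
  // gamma: left/right tilt. Browsers of this era fire the event
  // at most every ~50ms.
  // (A production filter would also handle the 0/360 wraparound in alpha.)
  smoothedAlpha = smooth(smoothedAlpha, event.alpha, 0.3);
  return smoothedAlpha;
}

// In a browser:
// window.addEventListener('deviceorientation', handleOrientation);
```

Heavier smoothing (a smaller weight) reduces the wild fluctuations at the cost of extra perceived lag, so it is a trade-off on top of the 50ms reporting interval.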

The browsers are also very good at reporting touch events, even with multiple points of contact. But this can get thrown off when you rotate your phone. The browser is constantly trying to orient itself correctly regardless of whether you’re holding the device vertically or horizontally. Every time it re-orients, it breaks the touch events, and this messes up the demo. Fortunately, there is a way to lock the orientation, but it only works in full-screen mode.
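In today’s browsers the lock looks something like the sketch below, using the Fullscreen and Screen Orientation APIs (browsers at the time of the demo needed vendor-prefixed variants, which I’ve left out; the helper names are my own):

```javascript
// Sketch: lock the screen orientation once in full-screen mode, so
// rotating the phone no longer re-orients the page and breaks touch
// tracking. Helper names are illustrative.

function chooseLock(width, height) {
  // Lock to whichever way the user is already holding the device.
  return width > height ? 'landscape' : 'portrait';
}

async function lockToCurrentOrientation(element) {
  // Orientation lock is only allowed while in full-screen mode.
  await element.requestFullscreen();
  await screen.orientation.lock(
    chooseLock(window.innerWidth, window.innerHeight)
  );
}

// In a browser, call this from a user gesture, e.g.:
// button.addEventListener('click', () => lockToCurrentOrientation(document.body));
```

Both calls must come from a user gesture (like a tap), which is why demos of this kind typically gate the 3D view behind a button.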

Finally, we need a way to communicate the orientation and touch events from the mobile device to the main viewer. These events come in many times per second, and we want to send them as fast as possible to keep the pointer from feeling too sluggish. We can transfer data directly between the two devices using a WebRTC peer data connection. An older technology like WebSockets would require going through a server, which could add 100 to 200 milliseconds on top of the 50ms lag we’re already incurring waiting for the orientation data. WebRTC works in Firefox and Chrome, but unfortunately not Safari, so our demo won’t work with iOS.
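Setting up such a channel is brief. The sketch below omits signaling (the offer/answer exchange that the QR-code handshake kicks off) and uses illustrative names, but the `createDataChannel` options are the standard way to get an unordered, unreliable channel suited to high-frequency control messages:

```javascript
// Sketch: an unreliable, unordered WebRTC data channel for streaming
// orientation and touch events. Signaling between the two devices is
// omitted; channel and message names are illustrative.

function encodeControlMessage(type, payload) {
  // Keep messages small and self-describing; they go out many times per second.
  return JSON.stringify({ type, payload });
}

function createControlChannel(peerConnection) {
  // ordered: false plus maxRetransmits: 0 makes the channel behave like
  // UDP: a stale orientation reading is better dropped than redelivered late.
  return peerConnection.createDataChannel('controls', {
    ordered: false,
    maxRetransmits: 0
  });
}

// In a browser:
// const pc = new RTCPeerConnection();
// const channel = createControlChannel(pc);
// channel.send(encodeControlMessage('orientation', { alpha: 90, beta: 0, gamma: 0 }));
```

Dropping late packets instead of retransmitting them is the right choice here, since only the newest orientation reading matters to the pointer.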

It’s too early to tell which interaction methods will become popular for virtual reality. Whatever they turn out to be, I imagine specialized hardware will offer a more reliable and higher-fidelity experience than this. But this could be a very portable and convenient alternative for mobile VR, using a device you’re likely to have in your pocket anyway.

Let us know if you end up using the code, or if you can think of ways to improve the experience. Share a link. You can comment below, use the hashtag #povtech or email us at

Video by Shako Liu.



Brian Chirls is the Digital Technology Fellow at POV, developing digital tools for documentary filmmakers, journalists and other nonfiction media-makers. The position is a first for POV and is funded as part of a $250,000 grant from the John S. and James L. Knight Foundation. Follow him on Twitter @bchirls and GitHub.