When Facebook bought Oculus VR for 2 billion dollars, I read comments like "what do they want with game technology?". I, however, believe there are some non-game-related possibilities.
Imagine watching videos or courses on a big virtual IMAX-like screen. Or implementing a photosphere viewer, for example. You could have a real estate website where you can view each room in 360 degrees without ever needing to visit the property in question. With some imagination the possibilities are endless.
As part of a little experiment, and to get a grasp of VR, I built a virtual (class)room with a big video screen.
Naturally, being a web developer first, I wanted my environment to run in a browser. It may not be the optimal choice, but I wanted to see how far I could get with it. It was also an opportunity to improve my WebGL skills.
Modeling a room
While you could in theory build a room entirely in code, it is far easier to model something in a 3D package. Fortunately I have some experience modeling in 3D, and my preferred package (Blender) also has a BabylonJS exporter.
It certainly isn't the best 3D scene that I could create, but it was good enough to test some simple things. Simple scenes are also a bit easier to debug when something goes wrong.
Displaying video & controlling the lights
For some unknown reason it didn't work to model a plane in Blender, fetch a "pointer" to the object in code and change its texture to a video texture. The video plays, as I can hear the sound, but the texture ends up all garbled. So I ended up defining a plane in code, assigning a video texture to it and positioning it manually in the scene.
One tip I would like to share is how to control (play, pause, …) a video that is loaded as a video texture, by accessing its underlying DOM properties and methods.
// get internal video DOM object
var video = video_plane.material.diffuseTexture.video;

// pause
video.pause();

// play
video.play();
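Since the texture just wraps a regular DOM video element, the same technique extends to a small play/pause toggle based on the element's paused flag. The helper below is my own sketch, not part of BabylonJS; it works with any object exposing paused, play() and pause(), so the real DOM element (or a stub) can be passed in.

```javascript
// Hypothetical helper: toggle playback of the video behind a video texture.
// `video` is assumed to be the DOM element fetched as shown above.
function toggleVideo(video) {
  if (video.paused) {
    video.play();
    return "playing";
  }
  video.pause();
  return "paused";
}
```

Returning the new state makes it easy to update a UI label alongside the playback change.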
But what didn't work for the video texture did work for the lights. I was able to fetch a "pointer" to the light and control its intensity, resulting in some code that dims it while the video plays. By dimming the lights, the video becomes the main focal point of the scene.
// get first light (there is only one)
var light = global.scene.lights[0];

// change its intensity
light.intensity = 0.5;
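One way to tie the dimming to playback is to listen for the video element's own play and pause events and adjust the light from there. The glue code below is a hypothetical sketch; bindDimming and its parameters are my names, not BabylonJS API. `video` is the DOM element behind the video texture and `light` any object with an intensity property, such as the light fetched above.

```javascript
// Hypothetical glue code: dim the light while the video plays,
// restore it when the video is paused.
function bindDimming(video, light, dimmedIntensity, normalIntensity) {
  video.addEventListener("play", function () {
    light.intensity = dimmedIntensity;
  });
  video.addEventListener("pause", function () {
    light.intensity = normalIntensity;
  });
}
```

Because the handlers react to the element's events rather than your own play/pause calls, the light also dims correctly when the browser's native video controls are used.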
At this moment I'm still waiting for my DK2 devkit to ship, so I haven't been able to test it on a real device yet. But that is something I certainly plan to do once my unit arrives.
You can add Oculus Rift support by implementing the following piece of code.
// get original camera
var originCamera = global.scene.activeCamera;

// create oculus camera
global.scene.activeCamera = new BABYLON.OculusCamera("Oculus", originCamera.position, global.scene);

// set camera parameters
global.scene.activeCamera.minZ = originCamera.minZ;
global.scene.activeCamera.maxZ = originCamera.maxZ;
global.scene.activeCamera.gravity = originCamera.gravity;
global.scene.activeCamera.checkCollisions = false;
global.scene.activeCamera.applyGravity = false;
global.scene.activeCamera.attachControl(global.canvas);
global.scene.activeCamera.speed = originCamera.speed;
global.scene.activeCamera.rotation.copyFrom(originCamera.rotation);
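As a side note, that snippet copies several camera settings field by field; if you switch cameras in more than one place, the boilerplate could be folded into a small helper. This is a hypothetical refactor of my own, not BabylonJS API; both arguments are treated as plain camera-like objects.

```javascript
// Hypothetical helper: carry the relevant settings over from the old
// camera to the new one, disabling collisions and gravity as above.
function copyCameraSettings(from, to) {
  to.minZ = from.minZ;
  to.maxZ = from.maxZ;
  to.gravity = from.gravity;
  to.speed = from.speed;
  to.checkCollisions = false;
  to.applyGravity = false;
  return to;
}
```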
It's clear that when my unit arrives I will also need to switch to my other machine, which has a beefier graphics card. I love my MacBook Pro for developing stuff, but unfortunately its Intel HD graphics lack the power to deliver a good experience.
Another thing I would like to try in the future is using an (offline) commercial engine like the Unreal Engine, once it supports video textures. Running things in the browser is nice and a bit of a challenge, which I like, but for VR you want a very smooth experience.