When I wrote the blog post about building a virtual classroom, I wasn’t entirely happy with the 3D scene I had built. It felt a little flat and bland. Unfortunately, there are still some limitations when working with a 3D engine in a browser.

But then I read an article about the technique of “baking textures”, which makes it possible to use more realistic lighting without a big performance impact. So I tried implementing it in WebGL, with success!

Baking textures, not cookies!

I can’t describe the technique of texture baking any better than Pixar does on their website:

> “Texture baking is the process of taking information from the scene, such as color or lighting, and baking it into an object’s UV space. With baked textures, we can reuse expensive calculations by looking up a texture, avoiding the need to recompute the lighting and surface shading.”

In other words, instead of computing the shadows every frame, we pre-calculate them and draw (bake) them into a texture map. This speeds up rendering enormously, especially in a constrained environment like a web browser.
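As a toy illustration of that trade-off (plain Python, nothing like the real GPU path, and all names are my own), here is the idea of paying for the lighting calculation once and reducing per-frame shading to a lookup:

```python
def lambert(normal, light_dir):
    """Classic Lambertian diffuse term: max(0, N · L)."""
    dot = sum(n * l for n, l in zip(normal, light_dir))
    return max(0.0, dot)

def bake_lightmap(normals, light_dir):
    """The expensive pass, run once: evaluate lighting for every texel."""
    return [lambert(n, light_dir) for n in normals]

# Pretend each texel of a tiny 4-texel surface has its own normal.
normals = [(0, 0, 1), (0, 1, 0), (0.0, 0.707, 0.707), (1, 0, 0)]
light = (0.0, 0.0, 1.0)

lightmap = bake_lightmap(normals, light)  # done once, offline

def shade(texel_index):
    """The per-frame work: just a texture lookup, no lighting math."""
    return lightmap[texel_index]
```

However many frames are rendered, `lambert` runs only during the bake; `shade` is a constant-time array access, which is exactly why the technique scales so well in a browser.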

It also opens up the possibility of doing things that would be computationally impossible to pull off in real time, like caustics for example. The drawback is that, because of its static nature, you could argue it is less realistic in some cases.

Writing a Python add-on

I still use Blender as my preferred 3D package. Its Cycles render engine supports texture baking, but it is node based. Unfortunately the BabylonJS exporter doesn’t (at this moment) support materials that use nodes. So in order to leverage texture baking I need to manually create a new material for every baked texture, setting its UV mapping, specularity, light emission, and so on.


That becomes quite a tedious process after a while. And I wouldn’t be a typical developer if I didn’t find a solution for being lazy by writing some code ;-) Fortunately for me, Blender is completely scriptable through Python and its API.

The script works by iterating through all the objects in the scene, checking whether each object’s material has nodes, processing them, creating a new material that can be exported, assigning that material to the object, and saving the scene to an exportable blend file so the original file is preserved.

One important bit I need to mention is that it looks for Image Texture nodes that have no output connections, in other words the “baking target”.
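The steps above can be sketched roughly as follows. This is not the actual add-on: all function names are illustrative, I’m assuming the Blender 2.7x-era Python API (texture slots on non-node materials, which is what exporters of that generation could read), and the `bpy` module only exists inside Blender itself:

```python
try:
    import bpy  # only available when running inside Blender
except ImportError:
    bpy = None

def bake_target(material):
    """Find the Image Texture node with no output connections: the baking target."""
    for node in material.node_tree.nodes:
        if node.type == 'TEX_IMAGE' and not any(s.is_linked for s in node.outputs):
            return node
    return None

def convert_and_save(export_path):
    """Swap node materials for plain ones the exporter understands,
    then save to a separate .blend so the original file is preserved."""
    for obj in bpy.data.objects:
        for slot in obj.material_slots:
            mat = slot.material
            if mat is None or not mat.use_nodes:
                continue  # nothing to convert
            target = bake_target(mat)
            if target is None or target.image is None:
                continue  # no baked image to pick up
            # Plain (non-node) material carrying the baked image in a UV-mapped slot.
            baked = bpy.data.materials.new(mat.name + "_baked")
            tex = bpy.data.textures.new(mat.name + "_baked_tex", type='IMAGE')
            tex.image = target.image
            ts = baked.texture_slots.add()
            ts.texture = tex
            ts.texture_coords = 'UV'
            baked.specular_intensity = 0.0  # lighting already lives in the texture
            baked.emit = 1.0                # show the baked map at full strength
            slot.material = baked
    bpy.ops.wm.save_as_mainfile(filepath=export_path, copy=True)
```

The `copy=True` flag on the save keeps the currently open file (with its node materials) untouched, which matches the “preserve the original file” requirement.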

It is quite rough around the edges, but it does the job nicely and saves me a lot of time. I use BabylonJS as an example, but the same principle should apply to other engines and frameworks.

Demo & source code

A working demo (caustics, soft shadows, …) can be found here. I also built a small room demo to compare against my virtual classroom. The source code (including the Python add-on) can be found on my GitHub account.