Rendering to Texture

Rendering to texture, also known as off-screen rendering, is a technique where the output produced by the renderer is stored in a texture instead of being displayed on the screen. That texture can then be applied to other objects to achieve some interesting effects.

I recently improved Crimild to support this feature. As an example, take a look at the following pictures:

Rendering to textures

Rendering to textures

This example renders an iPhone device on whose screen you can see the ED-209 robot from the movie RoboCop.

The effect is produced by rendering the robot first, using the new render-to-texture feature to draw everything into an off-screen buffer. Then, two quads are drawn: one using a texture with a picture of an iPhone device, and the other using the contents of the off-screen buffer.

Although the images above may not be impressive, this is a major improvement, since it brings us a step closer to post-processing effects like anti-aliasing, jittering, and bloom.

For those interested in the details of how this feature is used, the RenderToTexture example project can be found in the examples directory.

As a bonus, I managed to implement rendering to texture on iPhone as well. This feature is used in exactly the same way as in other platforms:

Yeap, it works on iPhone too!

The OffscreenRendering example project is available in the examples/iPhone directory.

Crosshatch Effects

I came across an article about how to implement a crosshatch effect in WebGL to produce non-photorealistic results. Here's the link:

http://learningwebgl.com/blog/?p=2858

It turns out that it’s a simple technique: once lighting is calculated, a fragment shader draws the crosshatch lines as the shadows get darker. So I created a simple demo to try it out. The following are a couple of screenshots taken from it.

The model rendered using per-pixel lighting

Rendering with the crosshatch fragment shader

In the first image, the model is rendered using a fragment shader implementing per-pixel lighting. To achieve the results shown in the second picture, the fragment shader is modified based on the position of the fragment being drawn, doing a bit of geometric calculation to detect whether or not it lies on a line. If so, the fragment is rendered as black; otherwise, it’s white.

A closer look

The project is named Crosshatch Shaders and can be found in the examples/CrosshatchShaders directory.

Experimenting with Shaders

I took some time off from my current project to experiment a little bit with different shader effects. The following is a set of screenshots taken from the Dungeon demo using vertex and fragment programs that modify vertex and pixel colors in different ways.

These effects are possible due to the new rendering components, which allow us to customize the rendering pass for an entire scene, a branch, or just a simple node. They are still lacking support for off-screen drawing though, which is required in order to implement rendering to textures and post-processing effects.