Praise the Metal – Part 6: Post-processing

I knew from the start that Le Voyage needed some kind of distinctive feature in order to stand out. The gameplay was way too simple, so I focused on presentation instead. Since the game is based on early silent films, using some sort of post-processing effect to render noise and scratches was a must-have.

About Image Effects

In Crimild, image effects implement post-processing operations on a whole frame. Ranging from simple color replacement (as in sepia effects) to techniques like SSAO or Depth of Field, image effects are quite powerful things.

Le Voyage makes use of a simple image effect to achieve that old-film look.

(Screenshot: Le Voyage rendered with the old-film image effect)

There are four different techniques applied in the image above:

  1. Sepia tone: All colors are mapped to sepia values, which gives that brownish tint. No more blues, reds or grays, just different sepia levels.
  2. Film grain: Noise produced in film due to luminosity changes (or so I was told).
  3. Scratches: Due to degradation, old film displays scratches, burns and other kinds of artifacts over time.
  4. Vignetting: The corners of the screen look darker than the center, to simulate the dimming of light. This technique is usually employed to frame important objects in the center of the screen, as in close-ups.

All of these effects are applied in a single pass for best performance.
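To give a rough idea of how all four stages can live in a single fragment shader, here is a minimal sketch in Metal Shading Language. This is not the shader Le Voyage actually uses; the constants, the noise function and the time uniform are illustrative assumptions (the Nutty.ca article mentioned below covers the real technique in depth):

#include <metal_stdlib>
using namespace metal;

struct VertexOut {
    float4 position [[ position ]];
    float2 uv;
};

// Illustrative single-pass old-film shader: sepia + grain + scratches + vignette
fragment float4 oldFilmFragment( VertexOut in [[ stage_in ]],
                                 texture2d< float > source [[ texture( 0 ) ]],
                                 constant float &time [[ buffer( 0 ) ]] )
{
    constexpr sampler s( address::clamp_to_edge, filter::linear );
    float3 color = source.sample( s, in.uv ).rgb;

    // 1. Sepia tone: collapse to luminance, then tint it brownish
    float gray = dot( color, float3( 0.299, 0.587, 0.114 ) );
    color = gray * float3( 1.2, 1.0, 0.8 );

    // 2. Film grain: cheap pseudo-random noise that changes every frame
    float noise = fract( sin( dot( in.uv + time, float2( 12.9898, 78.233 ) ) ) * 43758.5453 );
    color += 0.08 * ( noise - 0.5 );

    // 3. Scratches: brighten a thin vertical line at a per-frame random position
    float scratchX = fract( sin( time * 2.0 ) * 43758.5453 );
    if ( abs( in.uv.x - scratchX ) < 0.001 ) {
        color += 0.4;
    }

    // 4. Vignetting: darken pixels as they move away from the center
    float d = distance( in.uv, float2( 0.5 ) );
    color *= mix( 1.0, 0.3, smoothstep( 0.4, 0.8, d ) );

    return float4( color, 1.0 );
}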

How does it work?

Crimild implements post-processing using a technique called ping-pong, where two buffers are switched back-and-forth, serving as both source and accumulation while processing image effects.

A scene is rendered into an off-screen framebuffer, which is designated as the source buffer. For each image effect, the source buffer is bound as a texture and used to fetch the pixel data that serves as the effect's input. The image effect is computed and rendered into a second buffer, known as the accumulation buffer. Then source and accumulation are swapped and, if there are more image effects to apply, the process starts again.

When there are no more image effects to be processed, the source buffer will contain the final image that will be displayed on the screen.

Confused? Then the following image may help you (spoiler alert: it won’t):


Ping-pong buffer. For each image effect, the source and destination buffers are swapped. Once all effects have been processed, the source buffer contains the final image
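In pseudo-code, the ping-pong loop boils down to something like this. It is just a sketch of the control flow; the class and method names below are made up for this post, not Crimild's actual API:

// NOTE: illustrative names only, not Crimild's real classes or methods
// sourceBuffer already contains the rendered scene at this point
for ( ImageEffect *effect in imageEffects ) {
    // read from source, write the processed result into accumulation
    [ effect applyWithSource: sourceBuffer output: accumBuffer ];

    // swap roles so the next effect reads what we just produced
    FrameBufferObject *temp = sourceBuffer;
    sourceBuffer = accumBuffer;
    accumBuffer = temp;
}

// no more effects to apply: sourceBuffer holds the final image for the screen
[ self presentToScreen: sourceBuffer ];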

Le Voyage only uses one image effect, which applies all four stages in a single pass, so no additional post-processing is required. If you want to know more about implementing the old-film effect, you can check out this great article by Nutty.ca, on which I based mine.

Powered by Metal

Theoretically speaking, there's no difference in how this effect is applied in Metal or OpenGL, as the same steps are required in both APIs. In practice, the Metal API is a bit simpler when it comes to handling framebuffers, so the code ends up more readable. And there's no need for all that explicit error checking that OpenGL requires.

If you recall from a previous post, where we talked about FBOs, I said that we need to define a texture for our color attachment. At the time, we did something like this:

renderPassDescriptor.colorAttachments[ 0 ].texture = getDrawable().texture;

That code sets the texture associated with the drawable as the first color attachment. Everything is then rendered onto the drawable itself, which in our case is the screen.

But in order to perform post-processing, we need an offscreen buffer. That means that we need an actual output texture instead:

// describe a screen-sized color texture to act as the render target
MTLTextureDescriptor *textureDescriptor = [MTLTextureDescriptor 
   texture2DDescriptorWithPixelFormat: MTLPixelFormatBGRA8Unorm
                                width: /* screen width */
                               height: /* screen height */
                            mipmapped: NO];
 
// create the texture and use it as the output of the first color attachment
id< MTLTexture > texture = [getDevice() newTextureWithDescriptor:textureDescriptor];

renderPassDescriptor.colorAttachments[ 0 ].texture = texture;

The code above is executed when generating the offscreen FBO. It creates a new texture object with the screen dimensions and sets it as the output for the color attachment. Notice that we don't pass any image data to the texture, since we're going to render into it.

Keep in mind we need two offscreen FBOs created this way, one that will act as source and the other as accumulation.
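In other words, the snippet above runs twice, ending up with something along these lines (the names are just illustrative):

// one output texture per offscreen buffer, both built from the same descriptor
id< MTLTexture > sourceTexture = [getDevice() newTextureWithDescriptor:textureDescriptor];
id< MTLTexture > accumTexture = [getDevice() newTextureWithDescriptor:textureDescriptor];

// each offscreen render pass renders into its own texture
sourceRenderPassDescriptor.colorAttachments[ 0 ].texture = sourceTexture;
accumRenderPassDescriptor.colorAttachments[ 0 ].texture = accumTexture;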

Once set up, binding this offscreen FBO makes our rendering code draw everything into the texture instead of onto the screen. That buffer becomes our source FBO.

Then we render the image effect using a full-screen quad primitive, sampling that texture as input and writing the final image into the accumulation FBO, which is then presented on the screen since there are no more image effects to process.
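Encoding that quad pass looks roughly like this. Again, this is a simplified sketch: the pipeline state, vertex buffer and render pass descriptor are assumed to have been created elsewhere:

// encode the post-processing pass into the accumulation FBO
id< MTLRenderCommandEncoder > encoder =
    [commandBuffer renderCommandEncoderWithDescriptor: accumRenderPassDescriptor];

[encoder setRenderPipelineState: oldFilmPipeline];

// the source FBO's color texture becomes the fragment shader's input
[encoder setFragmentTexture: sourceTexture atIndex: 0];

// draw a full-screen quad as a 4-vertex triangle strip
[encoder setVertexBuffer: quadVertexBuffer offset: 0 atIndex: 0];
[encoder drawPrimitives: MTLPrimitiveTypeTriangleStrip vertexStart: 0 vertexCount: 4];

[encoder endEncoding];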

Again, this is familiar territory if you already know how to do offscreen rendering in OpenGL. Except for a tiny little difference…

Performance

When I started writing this series of posts, I mentioned that one of the reasons for me to work with Metal was that the game’s performance was poor on the Apple TV. More specifically, I assumed that the post-processing step was consuming most of the processing resources for a single frame. And I wasn’t completely wrong about that.

Keep in mind that post-processing is always a very expensive technique, both in memory and processing requirements, regardless of the API. In fact, the game is almost unplayable on older devices like an iPod Touch or iPhone 4s just because of this effect (and those devices don't support Metal). Other image effects, like SSAO or DOF, are even more expensive, so you should try to keep the post-processing step as small as possible.

I'm getting ahead of myself, since optimizations will be the subject of the next post, but it turns out that Metal's performance boost over OpenGL not only lets you display more objects on screen, but also allows more complex post-processing effects to be applied in real time. Even without optimizing the image effect's shaders, I noticed a 50% increase in performance just by switching APIs. That was amazing!

Et Voilà

And so the rendering process is now complete. We started with a blank screen, then proceeded to draw objects into an offscreen buffer. Finally, we applied a post-processing effect to achieve the old-film look that gives the game its unique style. It was quite the trip, right?

In the next and (almost) final entry in this series we’re going to see some of the amazing tools that will allow us to profile applications based on Metal, as well as some of the most basic optimization techniques that lead to great performance improvements.

To be continued…
