Live Long And Render (IV)

Hello 2020!

As I mentioned before, development for Vulkan support is still happening, and in this post I’m going to talk about the biggest milestones I achieved in the very first month of this new year.


This is the classical spinning triangle example that has been part of Crimild for a very long time. Only this time it’s a bit more interesting:

While not visually impressive, this is the first big milestone for every graphics programmer, especially those working with Vulkan.

One of the biggest changes I made while working on this demo is a new way to work with vertex and index buffers. Why? The decision may not have much to do with Vulkan, to be honest, but the current way of specifying this type of data (basically, as an array of floats or ints) has several limitations, particularly when dealing with multiple vertex attributes (positions, colors, etc.) in the same data buffer. I’m going to write a separate post to explain this better. For now, just take a look at how vertices and indices are specified in the new approach:

// The layout of a single vertex
struct VertexP2C3 {
    Vector2f position;
    RGBColorf color;
};

// Create vertex buffer
auto vbo = crimild::alloc< VertexP2C3Buffer >(
    containers::Array< VertexP2C3 > {
        {
            .position = Vector2f( -0.5f, 0.5f ),
            .color = RGBColorf( 1.0f, 0.0f, 0.0f ),
        },
        {
            .position = Vector2f( -0.5f, -0.5f ),
            .color = RGBColorf( 0.0f, 1.0f, 0.0f ),
        },
        {
            .position = Vector2f( 0.5f, -0.5f ),
            .color = RGBColorf( 0.0f, 0.0f, 1.0f ),
        },
        {
            .position = Vector2f( 0.5f, 0.5f ),
            .color = RGBColorf( 1.0f, 1.0f, 1.0f ),
        },
    }
);

// Create index buffer
auto ibo = crimild::alloc< IndexUInt32Buffer >(
    containers::Array< crimild::UInt32 > {
        0, 1, 2,
        0, 2, 3,
    }
);

Even without knowing how it was done before, I think it’s fair to say the newer approach is pretty clear and straightforward. Again, I’ll write more about it later.


Similar to the previous example, this demo seems to be quite simple, yet it’s another great milestone.

Each quad is rendered with a checkerboard texture and colored vertices.

Working with textures requires us to handle multiple descriptor sets and layouts, since each object has its own set of transformations and textures. I think I came up with a nice approach that may allow us to create a shader library in the future (and also stream them directly from disk if needed).

OBJ Loader

Once vertex data and textures were working correctly, the next obvious step was to show something more interesting on screen:

The famous Stanford Bunny is loaded from an OBJ file. Please note that there’s no dynamic lighting in the scene. Instead, the texture file already has ambient occlusion baked into it.

The most difficult part of this example is actually hidden inside the engine. The OBJ loader needs to create a new pipeline based on what data is available (are there any textures? what about normals?). There’s also the option to pass a pipeline to the loader, so every object will use it instead of the default one.

Also, the OBJ loader now makes use of the new vertex/index buffer objects.


The final example I’m showing today is the most complex one so far:

The demo sets up multiple pipelines to render the same scene (the bunny) using different settings: textured, lines, dots and normals.

The challenge for this demo was being able to override some or all of the settings of whatever pipeline configuration the scene (or, in this case, the model) defines with new values, like overriding the viewport size.

Up next…

As you can see, I’ve been busy. Being able to load models with textures and set up different pipelines is the very basis for all the rest of the features.

There are still some unresolved design challenges I need to tackle, like how to handle render targets and offscreen rendering, but I’m hoping the solutions will emerge as I move forward with simpler demos.