Sorting particles the right way… I guess…

Close your eyes.

Actually, keep them open, because you have to keep reading.

Imagine a train moving at a constant speed. You’re traveling inside the train, in a seat near a window. There’s a computer resting on a table right in front of you. On its screen, a random frame from a secret project is being rendered, looking like this:

[Screenshot: the rendered frame, showing the crowd of soldiers]

The Problem

The above image is the result of using a mixture of offscreen render targets and particle systems to generate a lot of soldiers in real time, in a similar fashion to other impostor techniques out there. In this case, each particle represents a group of soldiers (in order to avoid clones as much as possible), and they are walking from the left to the right of the screen. Don’t pay attention to the bearded guy at the front.

Did you notice how the “density” of the soldiers seems to increase at around the middle of the screen? That’s the symptom of the problem. Particles are generated using a uniform random distribution, so there’s no reason for the empty spaces between them.

If we look at the debug version of the same frame, we see something like this:

[Screenshot: debug view of the same frame, with particles drawn as points, uniformly distributed]

As shown in the image above, the particles are indeed uniformly distributed. Then, where are the soldiers?

Here’s another clue: if I turn the camera just a little bit to the left, I get the following result:

[Screenshot: the same scene with the camera turned slightly to the left]

This seems to indicate that, although the soldier-particles do exist, they are just not being rendered in the right way. Actually, I’ve dealt with this kind of problem before, and it always seems to be related to object sorting and transparency.

Distance to the Camera

Before any particle is rendered on the screen, all of them must be sorted in the right order for transparency to work. OK, so particles are not being sorted and we just need to implement that, right? Alas, after checking the code, it turns out that the particle system does perform sorting over live particles, ordering them from back to front based on the camera position. And yet the problem remains.

It turns out I was making the wrong assumption here. Particles are being reordered, true, but the algorithm handles them as points instead of billboards (quads that always face the camera).
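
To make this concrete, here’s a minimal sketch of that kind of point-based sort. The types and names are my own for illustration, not the engine’s actual code:

#include <algorithm>
#include <vector>

// Hypothetical minimal types; the real engine uses its own math classes.
struct Vec3 { float x, y, z; };
struct Particle { Vec3 position; };

static float distanceSquared( const Vec3 &a, const Vec3 &b )
{
    float dx = a.x - b.x, dy = a.y - b.y, dz = a.z - b.z;
    return dx * dx + dy * dy + dz * dz;
}

// The flawed version: order particles back to front by their distance
// to the camera *position*, treating each one as a point.
void sortByCameraPosition( std::vector< Particle > &particles, const Vec3 &cameraPos )
{
    std::sort(
        particles.begin(),
        particles.end(),
        [ & ]( const Particle &a, const Particle &b ) {
            // farther particles are rendered first
            return distanceSquared( a.position, cameraPos ) > distanceSquared( b.position, cameraPos );
        } );
}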

Let’s look at the following diagram:

[Diagram: sorting by distance to the camera position vs. sorting by distance to the near plane]

The above diagram is quite self-explanatory, right? No? OK, let me try and explain it then.

In the first case, particles are sorted using the camera position (just as in the current implementation). There are three distances to the camera (d1, d2, d3). If we use the camera position as reference, the order in which the particles are rendered ends up being 3, 2, 1 (back-to-front, remember). But that result is incorrect.

Particle 2 (the one in the middle) is indeed closer than particle 3 to the camera position, but it should be rendered before particle 3 in order to prevent artifacts like the ones we saw before.

Near-plane distance

The second scenario is the right one. We have to sort particles based on their distance to the camera’s near plane, not to the camera’s position. That way, particles are correctly rendered as 2, 3, 1 (again, back-to-front).
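
In code, the change is small. Instead of comparing distances to the camera position, we compare each particle’s depth along the camera’s forward direction; since the near plane is perpendicular to that direction, sorting by this depth is equivalent to sorting by distance to the near plane. Here’s a sketch, reusing the hypothetical Vec3 and Particle types from the previous snippet:

// (includes and types as in the previous snippet)

static float dot( const Vec3 &a, const Vec3 &b )
{
    return a.x * b.x + a.y * b.y + a.z * b.z;
}

// Depth of a point along the camera's (normalized) forward direction.
// This equals the point's distance to the near plane, up to a constant offset.
static float depthAlongView( const Vec3 &p, const Vec3 &cameraPos, const Vec3 &cameraForward )
{
    Vec3 d = { p.x - cameraPos.x, p.y - cameraPos.y, p.z - cameraPos.z };
    return dot( d, cameraForward );
}

// The fixed version: back to front by distance to the near plane.
void sortByNearPlane( std::vector< Particle > &particles, const Vec3 &cameraPos, const Vec3 &cameraForward )
{
    std::sort(
        particles.begin(),
        particles.end(),
        [ & ]( const Particle &a, const Particle &b ) {
            return depthAlongView( a.position, cameraPos, cameraForward )
                 > depthAlongView( b.position, cameraPos, cameraForward );
        } );
}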

This change produces the correct result:

[Screenshot: the corrected frame, with all soldiers rendered]

All soldiers are successfully rendered and our army is complete.

Final comments

We should keep in mind that, while this method fixes this particular sorting problem, it may not work when particles are rotated or intersect each other. There are other approaches that attempt to solve those cases and, depending on what we’re targeting, they might end up being too expensive to implement.

That’s it for today. Time to get back to my cave now.

See you later

PS: If you’re still wondering what that thing about the train was, well, I guess I’ve been watching too much Genius lately…


Particle System Improvements

Here’s a little something that I’ve been doing on the side.

The fire effect has been created with a new Particle System node that includes tons of customization controls. For example, you can define the shape of the particle emitter using several mathematical shapes, like cylinders (the one in the video), cones, spheres, etc.
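
As an illustration, here’s roughly how a cylinder-shaped emitter could sample spawn positions. This is just a sketch with made-up names, not the actual node’s API:

#include <cmath>
#include <cstdlib>

struct Vec3 { float x, y, z; };

// Uniform random float in [0, 1] (good enough for illustration).
static float random01()
{
    return static_cast< float >( std::rand() ) / static_cast< float >( RAND_MAX );
}

// Sample a position inside a cylinder of the given radius and height,
// centered at the origin. The sqrt keeps samples uniformly distributed
// over the disk's area instead of clustering near the center.
Vec3 sampleCylinder( float radius, float height )
{
    const float TWO_PI = 6.2831853f;
    float angle = TWO_PI * random01();
    float r = radius * std::sqrt( random01() );
    return {
        r * std::cos( angle ),
        ( random01() - 0.5f ) * height,
        r * std::sin( angle ),
    };
}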

In addition, some particle properties, like size and color, can be interpolated between starting and ending values. The interpolation method is fixed for the moment, but it will be customizable in the near future to support different interpolation curves.
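
As a sketch of what that interpolation amounts to (again, illustrative code, not the node’s actual implementation), a plain linear blend driven by the particle’s normalized age looks like this:

struct RGBA { float r, g, b, a; };

// Linear interpolation; t is the particle's normalized age in [0, 1],
// i.e. elapsed time divided by particleLifetime.
static float lerp( float start, float end, float t )
{
    return start + ( end - start ) * t;
}

static RGBA lerpColor( const RGBA &start, const RGBA &end, float t )
{
    return {
        lerp( start.r, end.r, t ),
        lerp( start.g, end.g, t ),
        lerp( start.b, end.b, t ),
        lerp( start.a, end.a, t ),
    };
}

// With the configuration shown below, each particle shrinks from 0.5 to
// 0.15 and fades from yellow to translucent red over its 2 second lifetime:
//   size  = lerp( 0.5f, 0.15f, t );
//   color = lerpColor( { 1.0f, 0.9f, 0.1f, 1.0f }, { 1.0f, 0.0f, 0.0f, 0.5f }, t );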

At the moment, the particle system is implemented entirely on the CPU, meaning no point sprites or other optimization techniques are used, which leads to performance issues as the particle count gets bigger (the effect in the video has a particle count of 50).

The particle system in the video above is configured in a Lua script as follows:

{
   type = 'crimild::ParticleSystem',
   maxParticles = 50,
   particleLifetime = 2.0,
   particleSpeed = 0.75,
   -- size and color are interpolated from starting to ending values
   particleStartSize = 0.5,
   particleEndSize = 0.15,
   particleStartColor = { 1.0, 0.9, 0.1, 1.0 },
   particleEndColor = { 1.0, 0.0, 0.0, 0.5 },
   -- the emitter's shape: the cylinder shown in the video
   emitter = {
      type = 'cylinder',
      height = 0.1,
      radius = 0.2,
   },
   precomputeParticles = true,
   useWorldSpace = true,
   texture = 'assets/textures/fire.tga',
 },

Even though the new system is quite easy to configure, it still requires a lot of trial and error to get something nice on screen. Maybe it’s time to start thinking about a scene editor…