
3D Pixel Art Rendering

I have developed a 3D pixel art style in the Godot engine, inspired by t3ssel8r’s work in Unity. In this article, I explain how a number of visual effects come together to create this pleasing aesthetic.

Outlines

In t3ssel8r’s early videos circa 2020, he showed off some beautiful procedural outlines. They were pixel perfect and had edge highlights that appeared only on convex edges. The effect had a handcrafted but measured, geometric beauty to it, evoking a traditional pixel art feel.

I desperately wanted to learn how to make shaders like that. It took me a few years of messing around in various game engines and graphics libraries before I finally revisited the 3D pixel art style, and spent a good weekend outlining the details.

I wrote a standard edge detection shader using the screen depth and normals textures. To make the outlines a single pixel thick, I used a kernel consisting of just the up, down, left, and right texels. The depth texture is mainly used to generate the object outlines, while the normals texture helps with highlighting edges. Convex edges are isolated by taking the cross product of neighbouring normal texels and grading the result against a number of arbitrary heuristics.
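To give a feel for the structure, a heavily simplified sketch of such a pass in Godot’s shading language is below. The thresholds, colours, and especially the convexity heuristics are placeholders; the real shader grades the cross products much more carefully so that only convex edges get highlighted, whereas this sketch just flags any strong normal change.

```glsl
shader_type spatial;
render_mode unshaded;

// Simplified edge detection sketch, not the full outline shader.
uniform sampler2D depth_tex : hint_depth_texture, filter_nearest;
uniform sampler2D normal_tex : hint_normal_roughness_texture, filter_nearest;
uniform float depth_threshold = 0.1;   // placeholder
uniform float normal_threshold = 0.6;  // placeholder

float linear_depth(vec2 uv, mat4 inv_proj) {
	float d = texture(depth_tex, uv).x;
	vec4 view = inv_proj * vec4(uv * 2.0 - 1.0, d, 1.0);
	return -view.z / view.w;
}

void fragment() {
	vec2 px = 1.0 / VIEWPORT_SIZE;
	// Cross-shaped kernel: up, down, left, right only, for single-pixel lines.
	vec2 offsets[4] = {vec2(px.x, 0.0), vec2(-px.x, 0.0), vec2(0.0, px.y), vec2(0.0, -px.y)};

	float d0 = linear_depth(SCREEN_UV, INV_PROJECTION_MATRIX);
	vec3 n0 = texture(normal_tex, SCREEN_UV).xyz * 2.0 - 1.0;

	float depth_diff = 0.0;
	float normal_change = 0.0;
	for (int i = 0; i < 4; i++) {
		vec2 uv = SCREEN_UV + offsets[i];
		depth_diff += abs(linear_depth(uv, INV_PROJECTION_MATRIX) - d0);
		vec3 n = texture(normal_tex, uv).xyz * 2.0 - 1.0;
		normal_change += length(cross(n0, n));
	}

	bool outline = depth_diff > depth_threshold;                 // big depth step: object silhouette
	bool edge = !outline && normal_change > normal_threshold;    // strong normal change: candidate highlight
	ALBEDO = outline ? vec3(0.0) : (edge ? vec3(1.0) : vec3(0.5));
}
```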

I’m not sure it’s quite possible to create a perfect outline shader that works for all models and all camera angles at this low resolution. Low-poly models with clearly defined outer edges suit it best, but shader parameters still need to be tweaked to fit each scene. Were I to model more objects, I would test them in the scene regularly to see what the shader makes of them.

I have two main references for the outlines and edge highlights:

Camera

When moving a camera through a 3D scene at low resolution, it does not look like a 2D image being scrolled across the display. There are many temporal artefacts: pixels swim, creep, and jitter across the screen. This can also happen in 2D pixel art scenes if the positions or scales of sprites are not quite right.

Fixing this is a two-step process:

See this aarthificial video for an explanation of this technique in 2D, and see this video by me for the 3D adaptation.
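In broad strokes, the two steps are: render the 3D scene to a low-resolution viewport and snap the camera to that viewport’s texel grid, then shift the upscaled image by the leftover sub-texel amount so motion still reads as smooth. A minimal sketch of the second step, assuming the remainder is computed in script and pushed into a canvas_item shader on the node that displays the viewport (the uniform names are mine, not from the project):

```glsl
shader_type canvas_item;

// Sketch only: `subpixel_offset` is the camera position remainder after
// snapping to the low-res texel grid, updated from script each frame.
uniform vec2 subpixel_offset = vec2(0.0);       // in low-res texels, roughly [-0.5, 0.5]
uniform vec2 viewport_size = vec2(640.0, 360.0);

void fragment() {
	// Shift the low-resolution image by the sub-texel remainder so camera
	// motion stays smooth even though the 3D camera only moves in texel steps.
	COLOR = texture(TEXTURE, UV - subpixel_offset / viewport_size);
}
```

The displayed viewport usually needs to be a texel or two larger than the screen so the shifted edges still have pixels to show.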

Lighting

It’s toon lighting! There’s nothing particularly special about the lighting setup, though I do handle the directional light differently from other lights, smoothing out the attenuation to avoid some of the shadow flickering of a slow-moving sun. Noise can also be added to the normals to help with shadow popping on flat surfaces. Lighting is also integrated with the outlines and edges, so I have a bit more control over when lines are lighter or darker.
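A stripped-down sketch of the idea, with the shadow term softened, might look like this. The band edges and smoothing width are placeholder values, and this is far from the full scene shader:

```glsl
shader_type spatial;

// Rough two-band toon sketch, not the full shader from the scene.
uniform vec3 base_color : source_color = vec3(0.8, 0.85, 0.7);

void fragment() {
	ALBEDO = base_color;
}

void light() {
	float ndotl = dot(NORMAL, LIGHT);
	// Smooth the shadow term slightly so a slow-moving sun does not make
	// shadow edges flicker from frame to frame.
	float shadow = smoothstep(0.3, 0.7, ATTENUATION);
	float band = step(0.0, ndotl);
	DIFFUSE_LIGHT += mix(0.25, 1.0, band * shadow) * ALBEDO * LIGHT_COLOR / PI;
}
```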

The cloud shadows are implemented as a noise texture set up as a global shader uniform and incorporated into the lighting model via convenient shader includes. I have a get_cloud_attenuation(some_world_pos) function to abstract away the cloud projection.
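As an illustration, such an include might look roughly like the sketch below. The global uniform names are assumptions, and the projection is just a straight top-down scroll over world XZ:

```glsl
// cloud_shadows.gdshaderinc -- sketch of a cloud shadow include.
// The global uniform names here are assumptions, not the ones from the project.
global uniform sampler2D cloud_noise;
global uniform float cloud_scale;
global uniform vec2 cloud_velocity;

float get_cloud_attenuation(vec3 world_pos) {
	// Project the noise texture straight down onto the world XZ plane and scroll it.
	vec2 uv = world_pos.xz * cloud_scale + cloud_velocity * TIME;
	float clouds = texture(cloud_noise, uv).r;
	// Only the densest part of the noise darkens the lighting.
	return 1.0 - 0.5 * smoothstep(0.6, 0.9, clouds);
}
```

In a material it then becomes something like `DIFFUSE_LIGHT *= get_cloud_attenuation(world_pos)` inside light(), with the world position passed through from vertex() as a varying.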

The direct lighting is not that interesting on its own; it’s trying to be clean, flat, and minimal. However, things become deeper and more visually rich when the stylised volumetrics and screen-space particles are introduced. More on that below.

Here’s a nice toon shader to get started with:

Grass

The grass is a number of billboard quads with a grass texture, evenly lit to blend in with the terrain, creating pleasant boundaries and correctly layered tufts of grass.

For a good while, I was attempting to get access to the directional shadow map in Godot, so I could implement custom shadowing on the grass, lighting each quad evenly. I spent a lot of time learning how to compile Godot from source so I could add a custom shader function to sample the shadow map at arbitrary positions, inspired by this article. After that success, I realised that all I needed was to write to VERTEX in the fragment shader to affect the standard lighting model, setting it to the base of the mesh for every fragment; just a single line change to the Godot source. The custom shader function did come in handy later though, so well worth the effort.

Skip forward a few months, and as of this PR, no source changes should be necessary for this effect. A LIGHT_VERTEX variable has been introduced for this exact purpose.
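With that built-in, the whole trick fits in the material itself. A minimal sketch of the idea, assuming the quad’s origin sits at the base of each tuft (texture setup simplified):

```glsl
shader_type spatial;
render_mode cull_disabled;

uniform sampler2D grass_tex : source_color, filter_nearest;

varying vec3 base_view_pos;

void vertex() {
	// View-space position of the mesh origin (assumed to sit at the base of the tuft).
	base_view_pos = (MODELVIEW_MATRIX * vec4(0.0, 0.0, 0.0, 1.0)).xyz;
}

void fragment() {
	vec4 tex = texture(grass_tex, UV);
	ALBEDO = tex.rgb;
	ALPHA = tex.a;
	ALPHA_SCISSOR_THRESHOLD = 0.5;
	// Light every fragment as if it were at the base of the quad, so the whole
	// tuft receives one flat lighting/shadow value and blends into the terrain.
	LIGHT_VERTEX = base_view_pos;
}
```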

Water

Getting a basic water shader up and running is not too difficult; there are countless resources for water shaders. However, the particulars of making a pixel-perfect water shader with refraction and orthographic planar reflections were quite the challenge.
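For context, the “basic” part is little more than sampling the screen texture with slightly wobbled UVs. The sketch below is a generic starting point, not the shader from the scene, and the noise texture and strengths are placeholders:

```glsl
shader_type spatial;

// Bare-bones refraction sketch: distort the screen texture with scrolling noise.
uniform sampler2D screen_tex : hint_screen_texture, filter_nearest;
uniform sampler2D noise_tex : filter_linear, repeat_enable;
uniform float refraction_strength = 0.01;  // placeholder

void fragment() {
	vec2 wobble = texture(noise_tex, UV * 4.0 + TIME * 0.05).rg * 2.0 - 1.0;
	vec3 refracted = texture(screen_tex, SCREEN_UV + wobble * refraction_strength).rgb;
	ALBEDO = mix(refracted, vec3(0.1, 0.3, 0.4), 0.35);  // tint toward the water colour
}
```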

Before I even got to making it look nice, I had to deal with a glaring issue: all objects with the outline shader ended up in the transparent pass, so the outlines didn’t appear in the screen texture available to shaders. This meant that any screen-reading or post-processing effects, such as the water’s refraction, didn’t show outlines on objects.

Godot 4 has a depth pre-pass that renders opaque objects with a simplified shader to gather depth and normal information (and do some performance optimisation), stored in textures, available to read in standard shaders. However, this is an optional feature only available for the forward+ renderer (and compatibility renderer?), and by default, any shader that samples these textures is forced into the transparent pipeline in case the pre-pass was not run.

This was a big L for me, so once again I cracked open the Godot source code to get it working how I wanted. The blessing and curse of free open source software. I removed some shader language checks for depth/normal texture access, and then had to shift a couple of back-buffer copies around to ensure the textures were up to date with the current frame. Suddenly, outlines were now visible below water.

I say glaring issue, but it actually looks fine in this case; I just thought it an important problem to resolve for any future screen-reading effects.

One night I decided to stay up until 3 AM crafting a method for world-space, view-aligned, horizontal lines for the little animated waves. The texture below is the result of that journey. This was a time when I felt that I was truly doing technical art: boiling it all down to a single texture read, utilising all the color channels for aspects of direction, variance, and timing. I’d like to thank Material Maker for the excellent procedural tooling.

Perhaps there exist simpler ways to achieve the results below, maybe using a particle system. But, I’m happy that I set a goal, confident it was achievable with a single texture, and then created it.

One more technical part of this water shader is the reflections. I couldn’t get screen-space reflections looking good, so I decided to go with planar reflections. I started adapting a Unity planar reflection plugin to Godot, and it was going okay until I needed a custom projection for the reflecting camera. Godot 4.2 doesn’t support custom projections for Camera3D (PR pending), so once again I recompiled Godot, merging the PR myself. Then I just had to yoink the oblique projection matrix calculations from here (public domain), which I understand just enough to get working; see this academic article for the full story. After that, I render the camera to a viewport and slap the texture onto the water plane.

Resources:

Volumetric Lighting

Even the explanation of this technique confused me for the longest time; I just couldn’t quite wrap my head around it. In a comment on this YouTube video, t3ssel8r explained:

what I’m sampling is the light space depth map (maybe what you refer to as shadow map) based on world-space coordinates of a series of parallel planes aligned with the light direction to test whether each position is shadowed.

After breaking down each part of this densely technical statement, I understood how the slices of light layer on top of each other to create the volumetric-like, god-ray/atmospheric-scattering effect. I got familiar with the point-normal form of a plane, and implemented it as a number of instanced quads.
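To unpack that a little: a plane in point-normal form is the set of points x with n · (x - p0) = 0, for an anchor point p0 and normal n, which makes it easy to lay out a stack of parallel slices by stepping p0 along a fixed direction. For each fragment on a slice, its world-space position is transformed into the sun’s light space and its depth there is compared against the directional shadow map; if the map says something sits closer to the sun, that point is shadowed and contributes nothing, otherwise the slice adds a faint wash of light. Stack enough slices and the lit regions pile up into visible shafts.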

It wasn’t long after that the excellent Acerola made a video about shell texturing, which was essentially the technique I was using. He featured my little shell texturing use case at the end of his following video about graphics programming, which was great to see.

An interesting issue I encountered with the god rays was softening the intersections with the world. The planes would create clearly defined intersections with terrain and objects, so I implemented a depth fade to slowly de-emphasize their appearance the closer they got to objects. This worked well, but created another issue: it harshly outlined individual tufts of grass, as they also appear in the depth buffer. Without a fully custom render pipeline, I couldn’t find a good way to remove the grass from the depth texture without creating a whole host of sorting and visibility issues. So, I devised a plan to box-blur the depth texture to soften out small changes in depth, making the issue much less noticeable.
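The depth-fade part of that per-plane version might look roughly like the sketch below. The blur radius, fade distance, and colours are placeholders, and the actual light and shadow contribution, which is what carves the visible shafts, is left to the standard lighting pass and omitted here:

```glsl
shader_type spatial;
render_mode unshaded, blend_add, cull_disabled;

// Sketch of the soft intersection fade on each light plane.
uniform sampler2D depth_tex : hint_depth_texture, filter_nearest;
uniform float fade_distance = 0.5;  // placeholder

float scene_depth_blurred(vec2 uv, vec2 px, mat4 inv_proj) {
	// Small box blur over the depth texture so thin things like grass
	// blades do not carve sharp silhouettes into the light planes.
	float sum = 0.0;
	for (int x = -3; x <= 3; x++) {
		for (int y = -3; y <= 3; y++) {
			vec2 suv = uv + vec2(float(x), float(y)) * px;
			float d = texture(depth_tex, suv).x;
			vec4 view = inv_proj * vec4(suv * 2.0 - 1.0, d, 1.0);
			sum += -view.z / view.w;
		}
	}
	return sum / 49.0;
}

void fragment() {
	ALBEDO = vec3(1.0, 0.95, 0.8);  // warm light colour, placeholder
	ALPHA = 0.08;                   // per-plane strength, placeholder

	vec2 px = 1.0 / VIEWPORT_SIZE;
	float scene_d = scene_depth_blurred(SCREEN_UV, px, INV_PROJECTION_MATRIX);
	float plane_d = -VERTEX.z;  // this fragment's own view-space depth
	// Fade the plane out as it approaches the geometry behind it.
	ALPHA *= clamp((scene_d - plane_d) / fade_distance, 0.0, 1.0);
}
```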

This worked quite well, but created huge performance issues, as the depth blur was calculated for every plane instance individually. I needed a blur radius of ~7 pixels and at least 10 planes for the effect to work. Even at 640x360, this was just way too expensive. So, I devised a new plan: rewrite the god rays as a single post-processing effect. I would define the planes entirely in the shader and raymarch into the scene, testing depths and sampling shadows. This was only possible due to my previous work adding a sample_directional_shadow() shader function to Godot’s source when implementing the grass shading. I no longer needed actual geometry in the scene for Godot to calculate the attenuation for me; I could do it all from a single fullscreen quad, which means the depth blurring only happens a single time. Performance solved.
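The post-process version might be structured roughly like this. To be clear: sample_directional_shadow() is the custom function added to the engine source, not stock Godot; its signature here (world position in, 0 to 1 “how lit” out) is my shorthand, and the plane placement (normals along the sun direction, fixed spacing) and all values are placeholders rather than the exact setup:

```glsl
shader_type spatial;
render_mode unshaded, blend_add, depth_test_disabled;

// Post-process sketch, drawn on a full-screen quad.
uniform sampler2D depth_tex : hint_depth_texture, filter_nearest;
uniform vec3 sun_dir = vec3(-0.5, -0.7, -0.5);       // placeholder
uniform vec3 plane_origin = vec3(0.0, 12.0, 0.0);     // placeholder
uniform int plane_count = 10;
uniform float plane_spacing = 1.0;
uniform float strength = 0.04;

void vertex() {
	POSITION = vec4(VERTEX.xy, 1.0, 1.0);  // keep the quad glued to the screen
}

void fragment() {
	// Reconstruct the world position of the surface behind this pixel.
	float d = texture(depth_tex, SCREEN_UV).x;
	vec4 view = INV_PROJECTION_MATRIX * vec4(SCREEN_UV * 2.0 - 1.0, d, 1.0);
	view.xyz /= view.w;
	vec3 scene_world = (INV_VIEW_MATRIX * vec4(view.xyz, 1.0)).xyz;

	vec3 ray_origin = CAMERA_POSITION_WORLD;
	vec3 ray_dir = normalize(scene_world - ray_origin);
	float scene_dist = distance(ray_origin, scene_world);
	vec3 n = normalize(sun_dir);

	// March the view ray against each virtual light plane (point-normal form)
	// and count how many intersection points the sun can actually reach.
	float accum = 0.0;
	for (int i = 0; i < plane_count; i++) {
		vec3 p0 = plane_origin + float(i) * plane_spacing * n;
		float denom = dot(ray_dir, n);
		if (abs(denom) < 0.0001) { continue; }
		float t = dot(p0 - ray_origin, n) / denom;
		if (t < 0.0 || t > scene_dist) { continue; }  // behind the camera or past the scene
		accum += sample_directional_shadow(ray_origin + t * ray_dir);  // custom engine function
	}
	ALBEDO = vec3(1.0, 0.95, 0.8) * accum * strength;
}
```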

Don’t get me wrong, this is still an expensive effect (~1.7 ms on my 1060 laptop), but it makes such a drastic improvement to the overall look of the scene that I consider it justified.

Tree

I made a tree; it looks fine. It’s a bit expensive due to the number of billboard instances used for the leaves, but not many of them can fit on screen at once, so it’s okay for now.

The fun part about making the tree was implementing multithreaded Poisson-disk sampling as a Godot C++ module, based on the papers:

It generates a large number of uniform random points on a mesh, picking triangles proportional to their surface area to ensure uniformity. Then it does a “bunch of stuff” with sorting and hash tables to select the ~10% of those points that are at least a minimum radius from each other (the Poisson-disk part). Most of it is parallel across threads, so it does all that at almost interactive frame rates, but it could definitely be faster and more memory efficient. Here’s the code if you’re interested; it will make more sense if you read the referenced papers.
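For the curious, the first stage is the standard area-weighted trick: triangle i is chosen with probability A_i / (A_1 + ... + A_n), where A_i is its area, and a uniform point inside the chosen triangle with vertices a, b, c is p = a + u(b - a) + v(c - a), with u and v uniform in [0, 1] and folded back (u becomes 1 - u, v becomes 1 - v) whenever u + v > 1. The thinning stage, which rejects points closer than the minimum radius, is where the papers and the linked code earn their keep.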

I haven’t done much more with the trees; the shader is basically the same as the grass one, but it’s nice to have some foliage for when world editing begins.

Rain, Particles, and the Night

A lot of the work in 3D pixel art is getting 3D effects to look 2D (surprise). So it’s nice to put some actual 2D effects in there to really sell it. This is where Pixel Composer comes into play. It has allowed me to generate perfectly tiling and looping animated particle textures for things such as rain, water splashes, and dust particles.

It’s a nifty little program that’s constantly receiving new features, though it can be a bit unstable at times, so save often. It’s also made with GameMaker and is open source. I’ve tried a few other programs like this, but Pixel Composer is easily the most powerful.

I also spent a bit of effort getting night time to look acceptable: changing the color of the directional light, adding a vignette, and, most importantly, adding a film grain shader. Here it is all together.
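A film grain and vignette pass is small enough to sketch in full. This is a generic version with placeholder strengths, not the exact shader from the video:

```glsl
shader_type canvas_item;

// Sketch of the night-time post layer: vignette plus animated film grain.
uniform sampler2D screen_tex : hint_screen_texture, filter_nearest;
uniform float grain_strength = 0.06;     // placeholder
uniform float vignette_strength = 0.35;  // placeholder

float hash(vec2 p) {
	// Cheap per-pixel, per-frame pseudo-random value.
	return fract(sin(dot(p + fract(TIME), vec2(12.9898, 78.233))) * 43758.5453);
}

void fragment() {
	vec3 col = texture(screen_tex, SCREEN_UV).rgb;
	col += (hash(FRAGCOORD.xy) - 0.5) * grain_strength;  // film grain
	float dist = distance(SCREEN_UV, vec2(0.5));
	float vignette = smoothstep(0.4, 0.8, dist) * vignette_strength;
	COLOR = vec4(col * (1.0 - vignette), 1.0);
}
```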

What’s Next?

Well, I’ve put out a few YouTube videos and made several reddit posts, and people seem to really gel with the aesthetic. I literally have no gameplay at the moment, so that should probably come next. However, I don’t want to be just another t3sselimit8r, as some commenter put it, haha. I want to take it in another direction now, to create a legitimate derivative style, and perhaps a unique game of my own. I wanted to learn technical art, and this past year I think I’ve learned a lot. Explaining how parts of it work is interesting to me and to others; I’ve answered many questions on my r/godot posts, so please have a look at those if it’s of interest.

I did promise a video about pixel-perfect outlines, and that is still coming (still using Godot); I just need a good free weekend to record it. The project will be in the git repo when that’s done. I think I might be moving on from Godot, though. I want more control over the rendering pipeline and engine setup in general. Inspired by Karl Zylinski and Tsoding (Daily), I’ve been exploring the Odin programming language and raylib, a C library for videogames programming. I’ve even had a simple PR merged into raylib already, which feels great. I’m also looking into D3D12, to really wrap my head around modern graphics programming; I’m unsure if I’ll build my renderer with that or raylib. As easy and powerful as Godot is, I currently want something lo-fi, to focus on simpler solutions to technical, artistic, and game design problems.

In my odin-raylib codebase, I have a little game framework set up with DLL hot-reloading, the smooth 3D pixel art camera, and basic character movement. It will be fairly easy to port the Godot shaders over to raylib, so I’m not worried about visual parity at the moment. My current focus is terrain, so I can start building little maps and prototyping gameplay of some sort.

It feels good to start fresh and simple, wish me luck for the long haul.

Goodbye.
