Tightly packed fonts in OpenGL

Work is going well on the line rendering library. Taking a short break to explore some other areas.

LibGDX has a handy FreeType extension which allows you to render TTF files to a texture (bitmap font) on the fly. Typically each TTF file is packed into its own atlas; for example, you might end up with one 256×256 atlas for a 24 pt font, and another for a 16 pt font. If you plan on including many different font sizes and styles (bold, italic, etc.) in your game, which is especially useful for supporting different resolutions, this leads to a lot of wasted white space.

With a small tweak to the LibGDX code, it’s possible to use PixmapPacker to tightly pack your fonts into a single atlas, which also gives you more control over padding and texture size. See an example here, which packs 1,410 glyphs into a 512×256 image. It uses three font styles (Regular, Bold, Italic) and five sizes of each (12 pt – 16 pt), for a total of 15 fonts packed into a single texture atlas. The atlas could then be saved to a file before final release, including only the characters necessary for your game and target language(s), thus making for a very fast load time and nicely scalable, styled text.
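
To make the packing idea concrete without pulling in LibGDX, here is a minimal “shelf” packer in plain Java. It is only a sketch of the general approach (PixmapPacker’s actual strategy is more sophisticated), and the class and method names are my own:

```java
import java.util.ArrayList;
import java.util.List;

/** Minimal shelf packer: places rectangles (glyphs) left-to-right in rows. */
public class ShelfPacker {
    public final int atlasWidth, atlasHeight;
    private int x = 0, y = 0, rowHeight = 0;
    public final List<int[]> placed = new ArrayList<>(); // {x, y, w, h}

    public ShelfPacker(int atlasWidth, int atlasHeight) {
        this.atlasWidth = atlasWidth;
        this.atlasHeight = atlasHeight;
    }

    /** Tries to place a w×h glyph; returns {x, y, w, h} or null if the atlas is full. */
    public int[] pack(int w, int h) {
        if (x + w > atlasWidth) {             // current shelf full: start a new row
            x = 0;
            y += rowHeight;
            rowHeight = 0;
        }
        if (y + h > atlasHeight) return null; // no vertical space left
        int[] pos = { x, y, w, h };
        placed.add(pos);
        x += w;
        rowHeight = Math.max(rowHeight, h);   // shelf is as tall as its tallest glyph
        return pos;
    }
}
```

Because glyphs from every font size and style go through the same packer, the small glyphs fill the gaps left by the large ones, instead of each size wasting its own atlas.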

If I have time in the future I’d like to contribute a bit to the scene2d.ui module and provide a Label that allows for markup (color, font, etc.) to easily produce styled text:

Screen shot 2013-05-03 at 1.40.27 PM

Another cool way of storing multiple text sizes would be to use mipmapping, where each successive mipmap level is uploaded manually (via glTexImage2D). The glyphs would need to be packed in the same relative positions at every level. You could then use nearest sampling when picking mipmap levels, which would lead to ‘stepped’ scaling without artifacts, or GL_LINEAR for trilinear filtering between levels. I will have to experiment with this idea a bit further.

For larger sizes, there is of course distance field rendering, but that doesn’t play so nicely with small fonts.

The code changes are pretty minimal. Here is a clone of FreeTypeFontGenerator that shows what is necessary. Next week I may send a pull request to LibGDX to have this feature added officially.

https://gist.github.com/mattdesl/5512218

Anti-Aliased lines in OpenGL ES

Pixel-perfect line rendering is essential in many 2D games, but it isn’t always easy to achieve in OpenGL and OpenGL ES.

The built-in GL_LINES primitive is not always reliable with regard to line thickness, and may look different depending on the device. Smoothing with GL_LINES is also very unreliable. Alternatives such as FXAA are often not performant, as they require an extra render pass with an FBO. glLineStipple (for dashed lines) is deprecated, and not supported at all in OpenGL ES.

Currently I’m working on a 2D line rendering library for LibGDX, which allows for resizable, anti-aliased, and optionally dashed lines. The techniques should be fast enough for mobile (no extra draw passes; anti-aliasing is performed in the shader), and leave a lot of room for optimizations and further effects (e.g. gradients). Below is a sneak peek of the output from a path drawn with the mouse:

Image

Per-Pixel 2D Shadows on the GPU

Below is my implementation of per-pixel shadows on the GPU, using far fewer passes and state changes than other current techniques such as Catalin Zima’s soft shadows. This technique also has a lot of room for optimization, as only one channel is currently used. Further, the blur pass is performed only in the horizontal direction, and can be included in the same step as the light rendering. Special thanks to the user “Nego” on the LibGDX forums, who expanded on my previous attempts with some great optimizations. 
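
As a rough CPU-side illustration of the core idea (the real technique runs entirely on the GPU in fragment shaders): build a 1D map holding, for each angle around the light, the distance to the nearest occluder; a pixel is then lit only if it is closer to the light than that stored distance. The grid representation and all names below are invented for this sketch:

```java
public class Shadow1D {
    /**
     * Builds a 1D shadow map: for each of `resolution` angles around the light
     * at (lx, ly), the distance to the nearest occluder, found by ray marching
     * over a boolean occluder grid.
     */
    public static double[] buildShadowMap(boolean[][] occluders,
                                          double lx, double ly, int resolution) {
        int w = occluders.length, h = occluders[0].length;
        double maxDist = Math.hypot(w, h);
        double[] map = new double[resolution];
        for (int i = 0; i < resolution; i++) {
            double angle = 2 * Math.PI * i / resolution;
            double dx = Math.cos(angle), dy = Math.sin(angle);
            map[i] = maxDist;                        // default: nothing blocks this ray
            for (double t = 0; t < maxDist; t += 0.5) {
                int x = (int) (lx + dx * t), y = (int) (ly + dy * t);
                if (x < 0 || y < 0 || x >= w || y >= h) break;
                if (occluders[x][y]) { map[i] = t; break; }
            }
        }
        return map;
    }

    /** A point is lit if it is closer to the light than the nearest occluder at its angle. */
    public static boolean isLit(double[] map, double lx, double ly, double px, double py) {
        double angle = Math.atan2(py - ly, px - lx);
        if (angle < 0) angle += 2 * Math.PI;
        int i = (int) (angle / (2 * Math.PI) * map.length) % map.length;
        return Math.hypot(px - lx, py - ly) <= map[i];
    }
}
```

On the GPU the occluder lookup becomes a texture fetch, the 1D map becomes a one-pixel-tall render target, and the `isLit` comparison (plus the horizontal blur for soft edges) happens in the light-rendering shader.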

Per-Pixel 2D Shadows on the GPU (using LibGDX)

Image

Graphics by Kenney from OpenGameArt.org.

GLSL Spherical Textures

To create seemingly 3D planets in a 2D space, I’m using a simple “spherize” filter as described here by Paul Bourke.

Depending on the situation, this could prove a useful and powerful alternative to simple billboards or textured spheres. For a 2D space game, for example, it could allow us to create resolution-independent planet textures that are generated and animated in real time. It can also be used to quickly generate planet sprites from any input, e.g. files on the user’s device (see Planetary’s use of album artwork).
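
To sketch the mapping itself: the code below is one common formulation of a spherize lookup (treating each pixel as a point on the visible hemisphere and converting it to longitude/latitude), not necessarily Bourke’s exact formula. In the shader, offsetting the longitude by time rotates the planet:

```java
public class Spherize {
    /**
     * Maps a texture coordinate (u, v) in [0,1]^2 onto the visible hemisphere
     * of a unit sphere and back to texture space, producing the "spherized"
     * lookup coordinate. Returns null outside the sphere's disc.
     */
    public static double[] spherize(double u, double v) {
        double x = 2 * u - 1, y = 2 * v - 1;   // recentre to [-1, 1]
        double r2 = x * x + y * y;
        if (r2 >= 1) return null;              // outside the disc: transparent
        double z = Math.sqrt(1 - r2);          // height on the unit sphere
        double lon = Math.atan2(x, z);         // longitude across the visible half
        double lat = Math.asin(y);             // latitude
        // remap longitude/latitude back to [0, 1] texture coordinates
        return new double[] { 0.5 + lon / Math.PI, 0.5 + lat / Math.PI };
    }
}
```

The interesting property is that coordinates compress toward the disc’s edge, which is what sells the 3D illusion on a flat texture.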

The first implementation is an abstract pattern which changes over time:

Screen shot 2013-03-20 at 2.41.13 AM

If you’re running a WebGL-capable browser, check it out here:
http://glsl.heroku.com/e#7632.5

The next example uses WebGL-Noise and a simple fractal sum to create an Earth-like planet. Any variety of noise algorithms would work here.
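
The fractal sum itself is simple; the following plain-Java sketch uses a hash-based value noise rather than the simplex noise from WebGL-Noise, but the octave-summing idea is the same. The hash constants are arbitrary:

```java
public class FractalNoise {
    /** Deterministic pseudo-random value in [0, 1] for an integer lattice point. */
    static double hash(int x, int y) {
        int h = x * 374761393 + y * 668265263;   // arbitrary large primes
        h = (h ^ (h >>> 13)) * 1274126177;
        return ((h ^ (h >>> 16)) & 0x7fffffff) / (double) 0x7fffffff;
    }

    /** Smoothly interpolated value noise in [0, 1]. */
    static double valueNoise(double x, double y) {
        int xi = (int) Math.floor(x), yi = (int) Math.floor(y);
        double tx = x - xi, ty = y - yi;
        tx = tx * tx * (3 - 2 * tx);             // smoothstep fade
        ty = ty * ty * (3 - 2 * ty);
        double a = hash(xi, yi),     b = hash(xi + 1, yi);
        double c = hash(xi, yi + 1), d = hash(xi + 1, yi + 1);
        double top = a + (b - a) * tx, bot = c + (d - c) * tx;
        return top + (bot - top) * ty;
    }

    /** Fractal sum: octaves of noise at doubling frequency and halving amplitude. */
    public static double fractalSum(double x, double y, int octaves) {
        double sum = 0, amp = 0.5, freq = 1, norm = 0;
        for (int i = 0; i < octaves; i++) {
            sum += amp * valueNoise(x * freq, y * freq);
            norm += amp;
            amp *= 0.5;
            freq *= 2;
        }
        return sum / norm; // normalised back to [0, 1]
    }
}
```

The Earth look then comes from thresholding the result into water, land and snow colours before feeding it through the spherize mapping.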

Screen shot 2013-03-20 at 2.39.16 AM

Check it out in real-time:
http://glsl.heroku.com/e#7662.3

Faking Real-Time Blurs in OpenGL ES (Android, iOS)

“Lerp Blur” is a name I’ve given to a technique to simulate a variable blur in real-time. It’s suitable for Android, iOS and other fill-rate limited devices that can’t rely on multiple render passes and FBOs.

The basic idea of the “Lerp Blur” is to create, in software, a series of images of increasing blur strength, and then interpolate between two of them to simulate a real-time adjustable blur.

The trick, though, is to utilize mipmaps and Level of Detail (LOD) sampling to create the adjustable blur. This leads to a decent result without a huge impact on performance or texture memory. As a bonus, we don’t need to send any extra vertex attributes to the shader.

The GLSL fragment shader looks something like this:

#ifdef GL_ES
precision mediump float;
#endif

uniform sampler2D u_texture;

//bias to influence LOD picking; e.g. "blur strength"
uniform float bias;

varying vec4 vColor;
varying vec2 vTexCoord;

void main() {
	//sample from the texture using bias to influence LOD
	vec4 texColor = texture2D(u_texture, vTexCoord, bias);
	gl_FragColor = texColor * vColor;
}
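
On the software side, the series of increasingly blurred images becomes the mip chain. The sketch below only box-downsamples each level; in the real technique each level would additionally be blurred before upload, and each resulting level would then be uploaded with its own glTexImage2D call:

```java
import java.util.ArrayList;
import java.util.List;

public class MipChain {
    /**
     * Builds a mip chain from a square grayscale image whose side is a power
     * of two: each level is a 2×2 box-filtered downsample of the previous one.
     */
    public static List<float[][]> build(float[][] base) {
        List<float[][]> levels = new ArrayList<>();
        levels.add(base);
        float[][] cur = base;
        while (cur.length > 1) {
            int n = cur.length / 2;
            float[][] next = new float[n][n];
            for (int y = 0; y < n; y++)
                for (int x = 0; x < n; x++)
                    next[y][x] = (cur[2 * y][2 * x]     + cur[2 * y][2 * x + 1]
                                + cur[2 * y + 1][2 * x] + cur[2 * y + 1][2 * x + 1]) / 4f;
            // real Lerp Blur: additionally box-blur `next` here before upload
            levels.add(next);
            cur = next;
        }
        return levels;
    }
}
```

With GL_LINEAR_MIPMAP_LINEAR filtering, the `bias` uniform in the shader above then blends between two adjacent levels of this chain, giving the smoothly adjustable blur.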

The full implementation, including code samples in LibGDX, can be seen here:
Faking Real-Time Blurs in OpenGL ES

Real-Time Backgrounds in GLSL

The beauty of GLSL sandboxes is the real-time visual feedback while coding. I’ve been toying with these to learn a bit more about shader programming, and see whether they would be feasible for use in desktop and mobile games.

Creating a background effect (e.g. for a menu screen or screen saver) in GLSL gives you full control over the fragments. You can handle anti-aliasing, real-time noise displacement, animate hues and colors, and even add 3D effects. Best of all, with some minor changes, the result is resolution-independent, and can be scaled infinitely with no loss in quality. Depending on the GPU, using GLSL might lead to better performance and aesthetics than trying to replicate the effect with textures and multiple blend passes.

The first prototype is an underwater scene, using my previous “Angler Fish Game” doodle as a reference. Here is the resulting background:

Screen shot 2013-03-09 at 3.38.08 PM

If you have a WebGL-enabled browser, you can see the effect and code in real-time below:
http://glsl.heroku.com/e#6052.4

The next prototype is a cartoony space background, using my previous “God of the Universe” doodle as a reference. Here is the resulting background:

Screen shot 2013-03-09 at 3.39.43 PM

You can see the effect in real-time, along with the code, at the following link. The code is much simpler; I was hoping to target mobile with this shader.
http://glsl.heroku.com/e#6607.4

Perhaps in the next few years, as WebGL and consumer GPUs evolve, we will see a shift in understanding of what constitutes a “web image.” Already we have GIF (animated images) and SVG (scalable vector graphics). Maybe in the future we will have small snippets of OpenGL/GLSL, contained in a portable “image” format, and even allowing input such as mouse movement.

Game Art Sketches: Part 2

Some more doodling and brainstorming for mobile games.

First is a very quick sketch of an asteroids remake, where everything looks sketchy and squiggly. The challenge here is to create real-time, procedurally noise-displaced squiggly lines in OpenGL and GLSL.

Second is a game where you play an angler fish with a bright light. You can turn on your light to attract fish (and eat them), but it also attracts larger predators, at which point you need to turn off your light to hide.

In the next game you play as a “god” — choose a solar system and a planet capable of supporting life. Then you need to defend it (Fruit Ninja style) from oncoming attacks: asteroids, aliens, stray satellites, space junk, etc. Over time, life may evolve into an intelligent species. You can sometimes catch a powerup which helps boost evolution a little.

Lots of potential in the last one. You could even control multiple planets at once, and somehow have them interact. Sort of like Civilization, set in space and over the course of billions of years.

Game Art Sketches

I’ve always thought it would be cool to control a grim reaper, and wherever you walk, the grass/plants/trees “die” around you in real-time. It wasn’t until seeing Snowball Earth that I was inspired to actually try it out.

Just blocking in silhouettes here; it would be fun to add some wacky accessories for the player to wear.

I’ve also been sketching blueprints for the modeling process:

And lastly, a quick tree study:

Fruit-Ninja Swipes in OpenGL

For a small game project I am working on in LibGDX (an OpenGL-based framework in Java which targets Desktop, Android, iOS, and WebGL), I decided to implement a “finger swipe” effect much like in Fruit Ninja. For reference, here is the effect in Fruit Ninja which I was trying to replicate:

The result of my efforts led to the following effect, which works on iOS, Android, WebGL, and Desktop:

Because I am working in a pure OpenGL setting (i.e. no luxury of HTML5 Canvas), there are a number of graphical hurdles to overcome. What’s more, many Android phones and tablets have poor accuracy on diagonal finger tracking, which leads to wobbly lines. The problem is demonstrated here. So a typical “finger swipe” on many Android devices might look like this:

I was able to minimize this problem with the Radial Distance Path Simplification algorithm. Douglas-Peucker might provide better results, but it is less efficient. In the image below, the gray line shows the raw “zig-zag” input, while the red line shows the simplified path:
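
For reference, the Radial Distance algorithm is only a few lines; this is a plain-Java sketch of it (the point format and tolerance are my own choices):

```java
import java.util.ArrayList;
import java.util.List;

public class RadialDistance {
    /**
     * Radial Distance simplification: walks the path and drops every point
     * closer than `tolerance` to the last kept point. Endpoints are always kept.
     */
    public static List<float[]> simplify(List<float[]> points, float tolerance) {
        List<float[]> out = new ArrayList<>();
        if (points.isEmpty()) return out;
        float[] last = points.get(0);
        out.add(last);
        for (int i = 1; i < points.size() - 1; i++) {
            float[] p = points.get(i);
            if (Math.hypot(p[0] - last[0], p[1] - last[1]) >= tolerance) {
                out.add(p);
                last = p;
            }
        }
        if (points.size() > 1) out.add(points.get(points.size() - 1));
        return out;
    }
}
```

Because it is a single pass with one distance test per point, it is cheap enough to run on every touch event, which is why I chose it over Douglas-Peucker here.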

The path is then smoothed using 2 iterations of Chaikin’s smoothing algorithm, to remove some of the sharp edges on corners:
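
One iteration of Chaikin’s corner cutting replaces every segment with the points 25% and 75% along it; a plain-Java sketch (the swipe runs it twice, i.e. `smooth(smooth(path))`):

```java
import java.util.ArrayList;
import java.util.List;

public class Chaikin {
    /**
     * One iteration of Chaikin's corner-cutting: each segment (p, q) is replaced
     * by the points at 25% and 75% along it; the endpoints of an open path are
     * preserved so the swipe still starts and ends under the finger.
     */
    public static List<float[]> smooth(List<float[]> pts) {
        if (pts.size() < 3) return new ArrayList<>(pts); // nothing to cut
        List<float[]> out = new ArrayList<>();
        out.add(pts.get(0));
        for (int i = 0; i < pts.size() - 1; i++) {
            float[] p = pts.get(i), q = pts.get(i + 1);
            out.add(new float[]{0.75f * p[0] + 0.25f * q[0], 0.75f * p[1] + 0.25f * q[1]});
            out.add(new float[]{0.25f * p[0] + 0.75f * q[0], 0.25f * p[1] + 0.75f * q[1]});
        }
        out.add(pts.get(pts.size() - 1));
        return out;
    }
}
```

Each iteration roughly doubles the point count, which is why two iterations are a good trade-off between smoothness and vertex count on mobile.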

Using some basic vector math, we can extrude the path into a 2D mesh, rendered as a GL_TRIANGLE_STRIP for efficiency. This leads to a pretty nice “swipe” effect:
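
The extrusion step can be sketched as follows: for each path point, emit a pair of vertices offset along the perpendicular of the neighbouring segments, which is exactly the alternating top/bottom layout GL_TRIANGLE_STRIP consumes. This is a simplified version (no mitering, tapering, or feathered edges):

```java
import java.util.ArrayList;
import java.util.List;

public class PathExtrude {
    /**
     * Extrudes a polyline into triangle-strip vertices: for each point, two
     * vertices offset by half the thickness along the (averaged) perpendicular
     * of the neighbouring segments.
     */
    public static List<float[]> extrude(List<float[]> path, float thickness) {
        List<float[]> strip = new ArrayList<>();
        float half = thickness / 2f;
        for (int i = 0; i < path.size(); i++) {
            // direction from previous to next point (clamped at the ends)
            float[] a = path.get(Math.max(i - 1, 0));
            float[] b = path.get(Math.min(i + 1, path.size() - 1));
            float dx = b[0] - a[0], dy = b[1] - a[1];
            float len = (float) Math.hypot(dx, dy);
            float nx = 0, ny = 0;
            if (len > 1e-6f) { nx = -dy / len; ny = dx / len; } // unit perpendicular
            float[] p = path.get(i);
            strip.add(new float[]{p[0] + nx * half, p[1] + ny * half});
            strip.add(new float[]{p[0] - nx * half, p[1] - ny * half});
        }
        return strip;
    }
}
```

Varying `thickness` along the path (wide at the finger, tapering to zero at the tail) is what gives the swipe its characteristic comet shape.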

The problem with this, as we can see, is that OpenGL doesn’t produce reliable polygon smoothing (anti-aliasing). Unfortunately, full-screen solutions like FXAA are often too slow for mobile platforms. I ended up using some OpenGL texturing tricks to produce smooth edges, and the result is extremely efficient (only requires one extra triangle strip per “swipe”). As a bonus, my solution can be easily used in conjunction with a custom fragment shader to produce 3D lighting, glows, and animated colors.

I go into more detail on the algorithms, 2D mesh generation, and anti-aliasing tricks in the following article:
Creating a Fruit-Ninja Style Swipe in LibGDX