Applying ECS to a simple game

I’ve been going back to work on my Entity-Component-System framework. Aside from being a side project, I also plan to use it for my voxel platform game. I’ve already created a Minimum Viable Product with it: the Bomberman clone game I mentioned a few posts back. Animations are still very buggy and there is no AI implemented, but a barebones 2-4 player version is working.

Previously I initialized all the Components, Systems, and Entity templates in the framework code. While I did this to test out the game, it’s not good for portability, so I had to move all that initialization code out and update the framework so that it can accept new Components, Systems, and templates from outside.

Finally, I isolated the code into its own assembly, so it’s possible to just link it as a DLL. This also meant I had to remove any XNA/MonoGame specific classes, and all the rendering will have to be done from outside. In short, it’s really meant for game logic only, and your game will have to handle the rendering separately.

The framework itself is lightweight (and I hope it stays that way), and only consists of 5 files/classes: Component, EntityManager, EntitySystem, EntityTemplate, and SystemManager. The SystemManager handles all the high level control and flow of the EntitySystems, which you make custom systems from. EntityTemplate is a simple class used as a blueprint to add Components that define an Entity, and is deep-cloneable. EntityManager handles the creation of Entities from these templates, and also the organization of their components. Despite its name, there is no Entity class. I think I will rename this manager to “ComponentManager” in another revision.
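
To give a rough picture of how these pieces fit together, here’s a stripped-down sketch of the base classes. The member names and signatures are my own guesses for illustration, not the framework’s actual API:

public abstract class Component
{
    // Components are plain data; the entity is implied by the array slot they sit in
}

public abstract class EntitySystem
{
    protected EntityManager entityManager;

    public EntitySystem(EntityManager entityManager)
    {
        this.entityManager = entityManager;
    }

    // Called once per update to run this system's logic over the Component arrays
    public abstract void Process(TimeSpan frameStepTime, int totalEntities);
}

public class SystemManager
{
    private List<EntitySystem> systems = new List<EntitySystem>();

    public EntityManager Entities { get; private set; }

    public SystemManager(int maxEntities)
    {
        Entities = new EntityManager(maxEntities);
    }

    public void AddSystem(EntitySystem system)
    {
        systems.Add(system);
    }

    // Systems always run in the order they were added
    public void ProcessComponents(TimeSpan frameStepTime)
    {
        foreach (EntitySystem system in systems)
            system.Process(frameStepTime, Entities.TotalEntities);
    }
}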

The Bomberman game has the following components:

  • Bomb
  • Collision
  • InputContext
  • PlayerInfo
  • PowerUp
  • ScreenPosition
  • Spread
  • Sprite
  • TilePosition
  • TimedEffect

They are used by the following systems:

  • BombSystem
  • CollisionSystem
  • ExplosionSystem
  • InputSystem
  • MovementSystem
  • PlayerSystem
  • PowerUpSystem
  • TileSystem

Some of the systems are more generic than others. There are a couple of systems like the Bomb system or Power-up system that have very specific logic pertaining to the game, while others like the Input system are pretty abstract and can be adapted to other game types with little or no change. Some are Drawable systems so they have an extra Draw() function that is called in a separate part of the game loop.

The funny thing is that I was going to talk about using Messages in this update, but in the time since, I did away with them completely. Messages were a static list of Message objects in the base System class. They were mostly used for one-time player-triggered events (like setting a bomb) and every system had access to them, but I decided to just pass the InputContext component into the systems that depend on player input.

Setup and Gameplay

The game is started by initializing all the components and systems and then creating the entire set of Entities using a Level class. This class has only one job: to lay out the level. Specifically, it adds the components needed to make the tiles, sprites and players. My implementation of the game pre-allocates 9 Bomb entities (the maximum a player can have) for each player.

Each player can be custom-controlled, but that’s facing issues now that I’ve moved from invoking methods to instantiate new Entities to deep-cloning them. Deep-cloning works well as long as none of the components have reference types.

The only Component that has reference types is the InputContext component, as it needs to keep a Dictionary of the available control mappings. This breaks with deep-cloning, and thus with multiple players they all share the same control scheme. On top of that, it makes the component too bloated, especially with helper functions to initialize the mappings. So I am figuring out how to use value types only to represent an arbitrary group of control mappings.
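
One direction I’m considering (just a sketch of the idea, not something that’s in the framework yet) is replacing the Dictionary with a small array of structs, so that everything in the component is a value type:

public struct ControlMapping
{
    public int KeyCode;   // engine-agnostic key/button code
    public int Action;    // the value the Input System reports as "current action"
}

public class InputContext : Component
{
    // The array itself still has to be copied during the deep clone, but since the
    // elements are structs, a simple Array.Copy gives each player independent mappings.
    public ControlMapping[] Mappings = new ControlMapping[8];
}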

The game starts immediately after setup, and every InputContext that is tied in with a PlayerInfo controls a player. Movement around the level is handled with the Movement System, while placing and remote-detonating bombs is handled with the Bomb System.

The Input System detects key and button presses from an InputContext’s available control mappings, and changes two integers, “current action” and “current state”, based on them. It is up to other systems to determine what they should do with these values, if needed.
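
As a rough sketch of that flow (the field names and the way the mappings are stored are placeholders, and IsPressed stands in for whatever key/button polling the game supplies):

public override void Process(TimeSpan frameStepTime, int totalEntities)
{
    for (int i = 0; i < totalEntities; i++)
    {
        InputContext input = inputContext[i];
        if (input == null)
            continue;

        input.currentAction = 0;

        // Check every mapping this context knows about
        foreach (ControlMapping mapping in input.Mappings)
        {
            if (IsPressed(mapping.KeyCode))
                input.currentAction = mapping.Action;
        }

        // "current state" would be updated in a similar way (pressed, held, released)
    }
}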

The Tile System is responsible for keeping sprites aligned to a tile grid, or giving them their closest “tile coordinates”, which is important for knowing where a bomb should be placed, for example.
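
Conceptually it boils down to snapping a screen position to the grid, something like this (TileSize and the field names are assumptions for illustration):

public override void Process(TimeSpan frameStepTime, int totalEntities)
{
    const int TileSize = 32;

    for (int i = 0; i < totalEntities; i++)
    {
        if (screenPosition[i] == null || tilePosition[i] == null)
            continue;

        // The closest tile is just the screen position rounded to the grid
        tilePosition[i].X = (int)Math.Round(screenPosition[i].X / (float)TileSize);
        tilePosition[i].Y = (int)Math.Round(screenPosition[i].Y / (float)TileSize);
    }
}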

The Collision System is self-explanatory. It handles different collision types, varying by enum, to differentiate solid objects, destructible objects, or damaging objects, as well as the response (it pushes players off walls, for example). If a player walks into an explosion, the Collision System knows.

An Explosion System is used to propagate the explosions in pre-set directions. By default that’s all 4 cardinal directions, with a bomb’s Power attribute copied to a Spread component and reduced by one with each tile. It keeps creating more explosions until this attribute reaches 0 or it hits a wall.
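
The propagation step might look roughly like this (a sketch of the idea; IsSolid, CreateExplosion and the field names are placeholders, not the game’s actual code):

for (int i = 0; i < totalEntities; i++)
{
    Spread spread = this.spread[i];
    if (spread == null || spread.Power <= 0)
        continue;

    int nextX = tilePosition[i].X + spread.DirectionX;
    int nextY = tilePosition[i].Y + spread.DirectionY;

    // Stop at walls; otherwise spawn a new explosion one tile further with less power
    if (!IsSolid(nextX, nextY))
        CreateExplosion(nextX, nextY, spread.Power - 1, spread.DirectionX, spread.DirectionY);

    spread.Power = 0;   // this tile is done spreading
}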

The Power-up System compares power-up tile locations with the players’ own tile locations; if two identical locations are found, we know a player is over a power-up and it can be applied.

There used to be a system for drawing sprites, but I decided to remove it and have the rendering be done outside the ECS scope. This makes the code more portable and you can use your own renderer.

Now that the game is done (with minimal specs), I am ready to extend its use to produce one of the games I have wanted to make for a while: a top-down arena style shooter. This game will share similar components and systems for player movement, tile collision, and power-ups (which will simply be changed to items). I plan to make it in 2D at first, but eventually switch the renderer to 3D and also offer customizable maps.

ECS is officially the bomb!

Over the last week and a half, I have been working on my own ECS framework. This is a side project away from my main voxel game, but it is something I wanted to do in order to improve my productivity and make games more quickly. Inspired by Phil’s Bomberman tutorial, I have implemented my own Bomberman clone with my own made-from-scratch ECS framework (though some conventions and names were adapted).

My own framework has less code than Phil’s, but as long as the whole game still works on top of it, I prefer it that way. Right now, I don’t need two Entities to check if they are exactly the same, and I don’t need to serialize them for other purposes. And I certainly don’t need scripts at the moment. This framework has gone through two main iterations, which differed mostly in how Components are stored in the game.

As you may know, in an Entity-Component-System framework Components are just simple data containers which don’t have any game logic, but are mutable by the game logic code, which resides in the Systems. Entities in my framework do not exist at all as classes; they are just numbers implied in Components, as you’ll soon see.

Framework Structure

The ECS framework in its current state consists of your typical Component and System base classes, which you can build specific Systems and Components from. Here are the important data structures:

  • A Dictionary of Component arrays indexed by enum. Usually a small amount of different types.
  • Component arrays which are accessed by index, so constant time here.
  • Systems take references of important Component arrays. Iterating them takes linear time.
  • List of Systems which are always executed in the order that they were added.
  • Static array of Messages in Systems. For now, they are just being created by the Input System. Could possibly be made non-static.

The main point of setup and entity processing comes from the SystemManager class, which stores a list of all the different Systems, and calls their Draw and Process functions in the game loop.

This class also has an instance of an EntityManager class, which is passed to the System constructors. The EntityManager class is where all the Component arrays are stored (as the base Component class), and where the Systems get all the Components they need. Components are pooled at startup, setting each array to a fixed size X for the maximum number of entities (though in C# it’s straightforward to resize an array if needed).

The arrays themselves are in a Dictionary, using an Enum for the Component type as key. They are arrays of the base class, but they are added in as derived classes.

public Dictionary<ComponentType, Component[]> components { get; private set; }

// Add Tile Position components
components.Add(ComponentType.TilePosition, new Components.TilePosition[maxEntities]);

This makes it possible to cast them back into their derived classes, but fortunately we only have to do this on startup. Systems get the arrays of Components they need upon initialization, cast to the proper type.

// Inside a System that uses Collision and TilePosition components

Components.Collision[] collision;
Components.TilePosition[] tilePosition;

public CollisionSystem(EntityManager entityManager)
    : base(entityManager) 
{
    // Load important components
    collision = components[ComponentType.Collision] as Components.Collision[];     
    tilePosition = components[ComponentType.TilePosition] as Components.TilePosition[];       
}

No further casting is needed for entire arrays after this point. The only casting that is done while the game is running is for getting certain Components at a given index.

When an ECS is more like a CS

Not dealing with Computer Science, but with Components and Systems only. There are no entities in the framework, or at least not as objects. There is no Entity class; instead, entity IDs are stored in the components themselves and also referred to indirectly by the array indexes. The Components are accessed sequentially by the Systems, and you can be sure that any Components at the same location of their respective arrays together make up an entity.

The EntityManager also has an integer variable, TotalEntities, for the total number of entities active in the game. It tells each System how far into the Component arrays it should iterate. An entity is “removed” by replacing the removed entity’s components with the components of the last active entity in the array. TotalEntities is reduced by 1, and this is the new index marker to tell the EntityManager where it should add Components to make a new entity.
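
In code, the removal amounts to a swap-with-last over every Component array (a sketch using the Dictionary and TotalEntities described above):

// Remove an entity by overwriting its slot with the last active entity's components
public void RemoveEntity(int entityID)
{
    int last = TotalEntities - 1;

    foreach (ComponentType type in components.Keys)
        components[type][entityID] = components[type][last];

    // The freed slot at the end is where the next new entity will be created
    TotalEntities--;
}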

Since arrays are fixed size, the number of entities should not exceed the size provided in the pool. For simpler games you can usually test and find a suitable size quite easily. I want to improve this in the future by making the EntityManager resize the arrays to a much larger size if it should reach the limit (which should generally be avoided anyway to maintain good performance).

Component Organization

In the first iteration, the framework had arrays of each Component type, as concrete classes. Each derives from a base Component class, but the arrays are set up as the derived Component classes. So you had arrays of different classes named spriteComponents, screenPositionComponents, etc. This was inflexible, mainly because adding a new component type also meant adding code for it to do a type check in the function to “Add” an entity.

// Get proper EntityPrefab method
Type prefabsType = typeof(EntityPrefabs);
MethodInfo theMethod = prefabsType.GetMethod("Create" + templateName);

// Call method to create new template
newTemplate = (EntityTemplate)theMethod.Invoke(null, new object[] { nextEntity });

// Check every array for proper insertion
foreach (Component component in newTemplate.componentList)
{
    if (component is Components.Sprite)
        components.sprite[nextEntity] = (component as Components.Sprite);
    if (component is Components.Bomb)
        components.bomb[nextEntity] = (component as Components.Bomb);
    if (component is Components.Collision)
        components.collision[nextEntity] = (component as Components.Collision);

    // Etc...
}

This has been improved since, and now adding Components to an array doesn’t require manually going through every possible Component type.

// Check every array for insertion
foreach (Component component in newTemplate.componentList)
    components[component.type][nextEntity] = component;

Entity Prefabs

Every game using ECS benefits from having pre-assembled entities to use right off the bat. It’s a logical way to plan the rules of your game and what kinds of game objects it will have. I use a small class called EntityTemplate which stores a list of Components. A class called EntityPrefabs contains different methods (CreatePlayer, CreateSolidBlock, etc.) that return a new copy of a template, whose Components are then added to the pool.

You still have to invoke the EntityPrefabs methods through reflection, since the methods are dynamically chosen with the “templateName” String parameter. I would like to replace this with simply adding prefabs to a List of EntityTemplates, so you just select them from the list. In hindsight this should have been the more obvious approach, but I was following Phil_T’s approach to making entity prefabs.
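
A sketch of what that could look like (just the idea, not the current code; DeepClone() stands in for however the template copies itself):

// Hypothetical registry of prefabs that replaces the reflection-based lookup
Dictionary<string, EntityTemplate> prefabs = new Dictionary<string, EntityTemplate>();

prefabs.Add("Player", EntityPrefabs.CreatePlayer(0));          // the ID argument doesn't matter here;
prefabs.Add("SolidBlock", EntityPrefabs.CreateSolidBlock(0));  // the stored copy is only a blueprint

// Creating an entity is then a lookup plus a deep clone, no reflection needed
EntityTemplate newTemplate = prefabs[templateName].DeepClone();

foreach (Component component in newTemplate.componentList)
    components[component.type][nextEntity] = component;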

Getting into the Game

I will talk about this in the next post, since I’ve probably gone on long enough already! Then I’ll be able to go into more detail on how the game uses the framework. Between the first draft of this post and now, I have also made some more improvements to the ECS code and ironed out some game bugs too. The game is getting closer to being playable!

New Blog for SeedWorld

My SeedWorld voxel engine is getting its own blog, since I will be making a game out of it. It will be an open-ended RPG in the visual style of Cube World. I will be documenting the progress of this game and engine here: seedworldgame.wordpress.com. I’ll try to post updates there regularly as I add new features.

With that said, this blog won’t be completely abandoned. I’ll still be using it for programming articles and other smaller projects I may have. However, SeedWorld will be my top priority, so make sure to follow that one as well!

SeedWorld (my voxel world engine) first update

So here’s what I’ve been working on for the past two weeks. I have wanted to do a procedural voxel style world for some time and have now just started, only 4 years late on the voxel/cube graphics trend 😀

I am still facing a lot of technical issues, which is to be expected early in development, but I have still made a lot of progress. I finally have an octave noise function that I am very satisfied with for creating those believable rolling hills you see a lot in procedural landscapes. Here is a breakdown of the current technical specs of the voxel world generation.

  • Voxel data is discarded as soon as chunk meshes are made. Chunks store only vertex data at the minimum*
  • Far draw distance (I want to emphasize this on faster PCs)
  • World divided into 32x32x256 chunks, with an area roughly 2000×2000 in size for the visible portion
  • Multi-threaded support for voxel and mesh generation

Future specs include:

  • Materials determine in-game attributes, color gradients, and sounds for interaction feedback
  • Persistent storage for voxels only in the player’s immediate surroundings*
  • Different biomes which affect the visuals and interactivity of the materials

*This supports interactivity for making the world destructible, but only where it makes sense (near the player), and keeps the managed memory footprint low.

Voxel World – First Attempt

Eventually, I will want to make some sort of action/adventure type of game with the voxel/cube engine once it is fleshed out well enough. So far it has been a mostly smooth experience seeing what goes into the code. To get a head start I picked out some pre-existing code for 2D and 3D simplex noise. I quickly learned how readily the noise functions make a continuous texture without being repetitive, which is of huge importance when making a procedurally generated world.

I started working on this on Saturday the 6th, and made some decent progress by Sunday night, making a cube world with 3D Simplex noise and some mesh optimization to avoid rendering hidden cubes. The custom shader is very simple and low bandwidth, taking only a Vector3 for local position and a Byte4 for color. The mesh-building code also adds a shade of green to cubes that are visible from above, and the rest are brownish in color. This creates a somewhat convincing grass-and-dirt look. Finally I implemented some basic brute-force culling to avoid rendering invisible chunks. Quick and dirty procedural world!
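
For reference, a vertex format like that can be declared in XNA/MonoGame along these lines (my own sketch of such a declaration, not the engine’s actual struct):

public struct VertexPositionColorByte : IVertexType
{
    public Vector3 Position;   // local position within the chunk
    public Byte4 Color;        // packed color, expanded in the shader

    public static readonly VertexDeclaration VertexDeclaration = new VertexDeclaration(
        new VertexElement(0, VertexElementFormat.Vector3, VertexElementUsage.Position, 0),
        new VertexElement(12, VertexElementFormat.Byte4, VertexElementUsage.Color, 0));

    VertexDeclaration IVertexType.VertexDeclaration
    {
        get { return VertexDeclaration; }
    }
}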


I found some problems with this voxel-cube generator. I frequently ran into out-of-memory exceptions when I decided to go any higher than 192^3 cubes. I was splitting the world into 32^3 sized chunks, but it didn’t really help the memory problem. My guess is that the triple-nested loops used to calculate the noise values are wound too tight, and it might benefit from single-dimensional arrays. I was also storing the chunks in a resizable list, but it makes more sense to have a fixed-size array to keep track of them. Finally, while interesting to look at, the shapes produced by 3D noise weren’t very desirable, so I switched to 2D to make a more believable height map. From there, I will experiment with some 3D noise to get some interesting rock formations and caves going.

Improvements in Terrain Generation

When I was still tweaking different combinations of noise patterns, I could only come up with very large, smooth, round hills, or many little but very bumpy hills. No repetition, but very bland to look at.

I had the basic idea down: combine many layers of Simplex noise of different frequencies, offsetting the X and Y for each of them just a little. But I had a derp moment when I realized I should be reducing the amplitude (effectively, the height variation) as I increase the frequency for best results. JTippets’ article on world generation really helped here.
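
In other words, something along these lines (a minimal sketch; Noise2D stands in for whichever Simplex noise function is used):

// Sum several octaves of noise, doubling the frequency and halving the amplitude each time
float OctaveNoise(float x, float y, int octaves)
{
    float total = 0f;
    float frequency = 1f;
    float amplitude = 1f;
    float maxValue = 0f;   // running total used to normalize the result back to [0, 1]

    for (int i = 0; i < octaves; i++)
    {
        // A small offset per octave keeps the layers from lining up visibly
        total += Noise2D(x * frequency + i * 17.3f, y * frequency + i * 31.7f) * amplitude;

        maxValue += amplitude;
        amplitude *= 0.5f;   // less height variation...
        frequency *= 2f;     // ...as the detail gets finer
    }

    return total / maxValue;
}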

Here are some screenshots of various builds, in order of progression. First is “revision 2”, which follows the first build mentioned in my last journal entry:


Already in revision 2 I have added optimized mesh generation to remove hidden faces. The wireframe render shows this well.

Revision 3 shows the vast improvements in terrain generation that I mentioned previously. The draw distance is improved, and noise patterns create much more natural looking hills and valleys. Color is determined by height variation and whether or not the block is a “surface” block. The white patches you see are sides of steep hills that don’t have the top face visible.


Between revisions 3 and 4 I was trying out ways to speed up voxel generation, mostly with octrees. That didn’t work out as planned, for reasons I will state later in this post. So I went back to my previous way of adding voxels. The biggest feature update here is simple vertex ambient occlusion through extensive neighbor voxel lookups.


It is a subtle update, but it greatly improves the appearance of the landscape. I applied the AO method that was discussed on the 0FPS blog. The solution is actually simple to do, but the tedious part was combining the numerical ID lookups for all the neighbor voxels so that each side is lit correctly. I should really change those numbers into Enums for voxel locations so the code is less confusing.
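
The core of that method is tiny: for each vertex of a face you look at the two adjacent side voxels and the corner voxel, and map that to one of four occlusion levels (my own paraphrase of the 0FPS rule):

// Returns an occlusion level from 0 (most occluded) to 3 (fully open),
// given whether the two side neighbors and the corner neighbor are solid
int VertexAO(bool side1, bool side2, bool corner)
{
    if (side1 && side2)
        return 0;   // both sides solid: the corner doesn't matter

    return 3 - ((side1 ? 1 : 0) + (side2 ? 1 : 0) + (corner ? 1 : 0));
}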

Here is a screenshot showing just the AO effect.


It was around revision 4 that I also made a Git repo for the project, which has been uploaded to a private Bitbucket account.

Performance stats, you say? Unfortunately I am not yet counting the FPS in the engine, and I believe my use of a stopwatch to track chunk update times is wrong: when it reads 15 milliseconds (about 67 FPS) the program becomes incredibly slow, as if it were updating only twice per second, but at 10 milliseconds or less, the program runs silky smooth without any jerky movement.

What I can tell you, though, is that I am currently sticking to updating just one 32x32x256 chunk per frame in order to keep that smooth framerate. At 60 chunks per row, it’s still quick enough for the world generation to catch up to movement of up to around 25 blocks/second. This is throttled by a variable that tells the program how many “dirty” chunks per frame it should update. My processor is a Pentium G3258, a value CPU but still decent for many modern games (as long as they are not greatly dependent on multi-threading), especially since it is overclockable. I have mine overclocked to 4.2 GHz. If you have a CPU that can run 4 threads or has 4 or more cores, you should be able to update several chunks per frame very easily.
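
The throttling itself is nothing fancy; conceptually it’s just a budgeted pass over a queue of dirty chunks each frame (a sketch with made-up names):

int maxChunkUpdatesPerFrame = 1;   // the variable I tweak to trade smoothness for catch-up speed

void UpdateDirtyChunks()
{
    int updated = 0;

    // Rebuild at most maxChunkUpdatesPerFrame chunk meshes this frame
    while (dirtyChunks.Count > 0 && updated < maxChunkUpdatesPerFrame)
    {
        Chunk chunk = dirtyChunks.Dequeue();
        chunk.BuildMesh();
        updated++;
    }
}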

About using octrees: I did not perceive any performance gains from using them so far. I wanted to use octrees as a way to better find potentially visible voxels without the brute force option of going through all the voxels in the array. The good news: I got the octrees to technically work (and did some nice test renders), and I also learned how to do so using Z-curve ordering and Morton encoding, so at least I gained some interesting knowledge there. Bad news: reducing the number of voxel lookups with octrees did not let me update more chunks per frame, which was the ultimate goal. So I am putting aside the octree-related code for now; maybe it will come in handy later.

Persistent local voxel storage concept, and future updates

The persistent storage for local voxels is definitely something I want to implement and make a key feature of the engine. Keeping voxel data for the entire visible world is usually wasteful; it really only makes sense to keep what you can interact with immediately around you. After all, if you have a pickaxe, you are not going to reach a block that is 500 meters away. This data storage will update as you move around the world, storing at most 4 chunks’ worth of voxels.

This can be applied further with other objects that may interact with the world surface. Say you are a mage that can cast a destructive fireball. Upon impact, you want to recalculate the voxel data for the area around the fireball so it can leave a crater. Or an ice ball to freeze the surface. Obviously you want these calculations to be done very quickly, so it sounds like a good way to stress test the engine with lots of fireballs and who knows what else being thrown around.
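
A sketch of how I imagine that cache working (names and structure invented for illustration): keep a small window of voxel chunks keyed by chunk coordinates, loading and evicting as the player crosses chunk boundaries.

// Hypothetical sliding window of locally stored voxel data around the player
Dictionary<Point, byte[]> localVoxels = new Dictionary<Point, byte[]>();
const int LocalRadius = 1;   // in chunks; keeps only a handful of chunk columns resident

void UpdateLocalVoxels(Point playerChunk)
{
    // Drop chunk columns that fell outside the local window
    List<Point> toRemove = new List<Point>();
    foreach (Point coord in localVoxels.Keys)
        if (Math.Abs(coord.X - playerChunk.X) > LocalRadius ||
            Math.Abs(coord.Y - playerChunk.Y) > LocalRadius)
            toRemove.Add(coord);

    foreach (Point coord in toRemove)
        localVoxels.Remove(coord);

    // Generate (or load) voxel data for chunk columns that entered the window
    for (int x = -LocalRadius; x <= LocalRadius; x++)
        for (int y = -LocalRadius; y <= LocalRadius; y++)
        {
            Point coord = new Point(playerChunk.X + x, playerChunk.Y + y);
            if (!localVoxels.ContainsKey(coord))
                localVoxels[coord] = GenerateVoxels(coord);   // placeholder
        }
}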

Other features I want to add soon are the creation of pseudo-random rock formations and measuring slope steepness, which will help in generating other pseudo-random elements. I’ll probably add the voxel trees first, in order to add more to the landscape.

Returning to form?

It’s been over a year since I updated this blog, so I should probably tell you what I’ve been up to. While I was working on my main XNA project, the Meteor Engine, I was also unemployed most of the time, just getting by with a bit of freelance web development work here and there. Last April I got a full-time job and it’s been great most of the time, but the company ran into financial difficulties and made tough choices I wasn’t happy about. Now it’s back on the job hunt for me. With my much improved experience, hopefully my job hunt will go much quicker than last time.

So while I’m keeping an eye on my own budget, I decided to get back into the C#/XNA programming I was familiar with. This also means I am taking a new angle.

Previously I was focusing a lot on making a graphics engine. Going back to where I left off, I was going to be making a shooter game. The blog will shift focus to “I actually will be making some GAEMS!!!” this time around. So my goal is to start with some base code I have and make a finished game from it.

The graphics engine could be a tool I can now add to my arsenal for making a game and it’s very likely I’ll use it here. My journey into developing the game will begin with my next post.

One Game A Month: Here comes a new game project!

What? Another game already? That’s right, but this one will not be as big as my racing game project, which I expect to be ongoing for several months and likely at least a year. No, this game will be a short-term project, planned for only one month as part of the One Game A Month quest. I want to get in the habit of finishing games quicker. (Maybe then I could rename the blog Electronic Meteor Games! Imagine that.) I want a game I can make more quickly and easily, and one that leverages the coding experience I have gained so far. So it will re-use some of the code I’m working on right now, but refactored to fit the needs of the game.

The game will be a twin-stick top-down shooter. The idea may not be original, but carrying it out should be fairly easy. I do not have a name for it yet; I only know that its features will include multiple levels and upgradeable weapons, local multiplayer (not sure yet if I can finish online networking code in a month), and a cool lighting atmosphere for the graphics. So basically what one may expect from a top-down shooter. Characters and setting will be fairly abstract and basic. I don’t have much know-how for modeling human characters, so it will be robots blasting other robots.

Here are the main goals I intend to follow for the month-long project:

  • Simplistic but nice to look at graphics and setting
  • Multiple weapons and enemy types
  • Controller support (gotta really get a controller for that though 😛 )
  • Learn some more AI programming as I go along
  • Use what I learned from Meteor Engine for the graphics
  • A lighting mechanic to hide/show parts of the map (somewhat of a “fog of war” for a shooter)

I have been mostly inspired by some of the fast-paced games being put up on Steam Greenlight to do a top-down shooter. It’s a genre that is simple fun and engaging for many people, and I believe that a (stripped down) top-down shooter can be another good game for budding programmers, comparable to platform games. So for this month, I will be slowing down progress of the racing game to work on this one.

On the AI side, I have been reading this set of tutorials on creating a state machine. Many game programmers may be familiar with the game-state switching pattern for coding a complete game. These tutorials take it further by applying it in other ways, like setting up rooms for a dungeon crawler or computer-controlled AI characters that follow their own motives. The latter is the one I’m most interested in. I plan to implement the tutorial code for this game to give me a head start on the AI. It won’t be pretty, but the functionality is what counts here.
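
For reference, the kind of state machine those tutorials describe boils down to something like this (a generic sketch, not the tutorial’s actual code):

// A state is just enter/execute/exit behavior for its owner (an AI character, a room, etc.)
public interface IState<T>
{
    void Enter(T owner);
    void Execute(T owner);
    void Exit(T owner);
}

public class StateMachine<T>
{
    private T owner;
    private IState<T> currentState;

    public StateMachine(T owner)
    {
        this.owner = owner;
    }

    public void ChangeState(IState<T> newState)
    {
        if (currentState != null)
            currentState.Exit(owner);

        currentState = newState;
        currentState.Enter(owner);
    }

    // Called every frame; the current state decides what the owner does
    public void Update()
    {
        if (currentState != null)
            currentState.Execute(owner);
    }
}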

For graphics, I mentioned the Meteor Engine, but I will not be using it as-is. Rather, the game will have its own graphics code that will take several ideas from the engine. It will be a trimmed down, sort of “lite” version of the engine code, using mainly deferred rendering for graphics. The intent is to provide a setting with many moving lights, and most outdoor daytime scenes aren’t good for that. Features include mostly dark rooms, bullets that light up the room along the path they take, reflective surfaces on characters and level objects, and point light shadows. A lot of the visual inspiration comes from the Frank Engine demo, so expect the game to look more or less like that.

I will code this with XNA, as usual, but I will also try to get it portable to MonoGame. I have been researching this for a while, but attempts to port any of my code to other platforms haven’t gone well so far. MonoGame (in its current 3.0 version) on Mac seems to be a no-go with Snow Leopard, something to do with the Apple SDKs not being up-to-date with what MonoDevelop uses, so I would have to upgrade Xcode to 4.2, which requires a Lion upgrade. Not up to doing that right now. So it will likely be on Linux before Mac 😛 The cross-platform support is not part of the month-long deadline; it’s just something I would like to do to take my game further, like online multiplayer.

I would like to get started today with programming the game, if I want to finish it before the 30th. The plan for today is just to use a placeholder model for the character, draw everything with basic graphics, and make the character shoot in all directions. At that point it’s not very different logically from a scrolling shoot-em-up. So look forward to more posts related to my month-long game. It’s been a while since I actually released a game and I want this to be the most complete game I’ve released so far.

Terrain picking solved

Finally, I figured out how to do precise terrain picking, so now I can make a lot of progress! As I’m going to be using BEPU for the physics engine, I just let BEPU do the dirty work. Using it to create a terrain height field and doing ray intersection testing is pretty intuitive. Storing 4 million points is no problem for it, but I may look into its source code to see how it’s doing the intersection tests so efficiently.
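
I don’t know exactly what BEPU does internally, but the brute-force idea behind heightfield picking is easy to sketch: march along the ray and return the first point that dips below the terrain (purely illustrative, not BEPU’s algorithm; GetHeightAt is a placeholder for sampling the height field):

bool RayHeightfieldIntersect(Vector3 origin, Vector3 direction, float maxDistance,
                             float step, out Vector3 hit)
{
    Vector3 dir = Vector3.Normalize(direction);

    // Walk along the ray in fixed steps until it goes under the terrain surface
    for (float t = 0f; t < maxDistance; t += step)
    {
        Vector3 point = origin + dir * t;

        if (point.Y <= GetHeightAt(point.X, point.Z))
        {
            hit = point;   // a real implementation would refine this (e.g. by bisection)
            return true;
        }
    }

    hit = Vector3.Zero;
    return false;
}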

In the meantime, though, I can move on to creating the brush and mesh placement tools. Mesh placement should be easy, as I want most objects to be touching the ground. Placing meshes will also be treated as brushes, so you can fill the landscape with loads of objects quickly. For now I have this video as a proof of concept for testing.

Some ideas on the placement tools:
– Mesh brushes will likely be done with Poisson disk sampling as demonstrated here, so the spacing of objects looks natural and you don’t have to think much about how their placement looks (see the sketch after this list).
– Objects can still be changed individually if you wish. A single Selection tool will make it possible to change an object’s scaling and location.
– Rotation can be done on the basis of either having all objects orient towards the surface normal, or ignoring the Y component. Rocks, for example, are fine rotating in all directions, but trees and buildings should generally point up.
– A toggle switch for each object so you can snap its vertical position to the ground surface in one click.
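
Here’s a rough placeholder for the brush idea: a naive dart-throwing version of Poisson disk sampling that simply rejects candidates too close to existing points. The real thing (e.g. Bridson’s algorithm) is more efficient, but this shows the spacing rule:

// Scatter points inside a circular brush, keeping a minimum distance between them
List<Vector2> ScatterBrush(Vector2 center, float brushRadius, float minDistance, int attempts)
{
    List<Vector2> points = new List<Vector2>();
    Random random = new Random();

    for (int i = 0; i < attempts; i++)
    {
        // Pick a random candidate inside the brush circle
        double angle = random.NextDouble() * Math.PI * 2;
        double radius = Math.Sqrt(random.NextDouble()) * brushRadius;
        Vector2 candidate = center + new Vector2(
            (float)(Math.Cos(angle) * radius),
            (float)(Math.Sin(angle) * radius));

        // Reject it if it crowds an already accepted point
        bool tooClose = false;
        foreach (Vector2 p in points)
        {
            if (Vector2.Distance(p, candidate) < minDistance)
            {
                tooClose = true;
                break;
            }
        }

        if (!tooClose)
            points.Add(candidate);
    }

    return points;
}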

Physical interactions with the objects will come a bit later. I will at least need a wrapper class to associate mesh instances with BEPU Entities.

The current game plan

Bye bye geo-clipmaps, here comes geo-mipmapping! I’ve been busy converting my terrain rendering code to use it from now on. It’s not fully completed, but the basics work. I just need to fix the holes that appear at the borders between different detail meshes. The terrain system has advanced a lot compared to when I used geo-clipmaps. While it still only supports two textures, the cliff texture can be sampled twice at two different scales to remove any semblance of repetition. No fractals or noise patterns here; it’s all from pre-made texture images. I’m also messing around with a vignette effect, currently made part of the Depth of Field shader. The engine is also now running with a game screen framework built on top of the official XNA sample.


Now to move on to the game structure itself. I’m not too keen on drawing UML diagrams or other fancy charts, partly because I have never gotten really comfortable with a diagram editor. I’d rather sketch them on paper and take photos with my phone. But I do have a tree-like structure to organize my code in.

This planning comes easier now that I’ve figured out that the first thing I want to do with my terrain renderer is to put it in a game state that will later become the in-game map editor.

The puzzle game I made earlier was a good learning experience in understanding how a screen and menu system can be put together, where multiple states and screens run independently of each other. The game may not be 100% done, but its code is stable enough for me to port the screen system into this new game. Because this will be the most complex game I’ve attempted, I look forward to seeing how far I can take it. With a loading screen and transitions to game modes in place, it will at least finally start feeling more like something with a greater purpose than a tech demo.

The graphics engine is still a work in progress so I will work on it together with making the game. The game code will be organized in three different areas: Core, Game, and Screens.

Core

  • Graphics engine (my own)
  • Physics engine (BEPU or maybe Bullet)
  • File saving/loading
  • Input
  • Networking
  • Screen system (from my last game)
  • Menu interactions
  • Screen drawing/updating system

Game

  • Game logic, AI
  • Player interactions
  • Game objects
  • Editor
  • Editing tools and functions

Screens

  • Background(s)
  • Loading screen
  • Menus
  • Gameplay modes
  • HUDs and interfaces

Core contains all the systems that deal with the lower-level workings of the game. Sending data to the graphics engine, setting up and managing physics, input management, and the loading and saving of files all go here.

Game contains all the very game-specific logic that’s all about setting the rules, game modes, and specific interactions with game objects. They all tie into Core in some way, depending on what they are responsible for doing. A more specific area, Editor would include all the tools and functions used for the game’s map editor mode.

Screens can be seen sort of like game states and also like components that, when grouped together, can describe a game state or mode of behavior. They are loaded and ran from the screen system, and either specialize on displaying information related to the game, or tell the user what actions are available. Background screens, gameplay screens, HUDs, inventory screens, and menus would belong here.

As you may have noticed, the three groups tend to progress from low-level to high-level code. This was not really intended, but does give me a better idea of how to pass data around.

The graphics engine is already running in the screen system. When the program launches, it adds a Screen to a list, which loads the content to be rendered. Here is the game loading a terrain in real-time, with some interactions handled by an “Editor” screen.

(lolwut, YouTube detected this video to be shaky. It’s all rigid camera movements in here)

There are a few issues I have to take care of with the screens and graphics. Both the screen system and graphics engine are loaded as XNA game components, which means they draw and update automatically within the game, outside of the screen system’s control. Although the content loading code is in the Editor screen, I need the option to explicitly choose what order the graphics are drawn in, so that any graphics set up in a particular screen get drawn with that screen’s Draw call.

Triplanar normal mapping for terrain

First, before getting into terrain normal mapping, I added mouse picking for objects. I have some interactivity now!


This is taken from an XNA code sample, which I then modified to support instanced meshes. So now it’s able to pick the exact instances that the ray intersects, and it displays their mesh names. It doesn’t do anything other than that for now, but it’s the first step towards editing objects in the level editor.

Mapping the terrain

The new update fixes a problem that’s been bugging me for a few weeks: combining normal mapping with triplanar texturing. It was a tricky affair, as the normal maps get re-oriented along three planes, so you also have to shift the normals accordingly. After revising how I did my regular normal mapping for other objects, I was able to get correct triplanar normal mapping for the terrain. This goes for both forward and deferred rendering.

I have only two regular textures: the base texture for mostly flat areas, and a blend texture for cliffs in steep areas. My normal map is for the cliff texture, and no normal mapping is applied to the flat areas. You can also set a bump intensity, which increases the roughness of the terrain. Naturally, with great roughness comes great responsibility: fewer specular highlights. So you have to tune the specular and roughness to achieve a good balance. Most of the time, terrain doesn’t need specular lighting, but it’s needed for wet and icy areas.

Bump up the volume

Terrain normals, binormals, and tangents are all calculated on the CPU, which is the ideal way to go as it saves the overhead of doing it every frame. In the vertex shader, the normal, binormal and tangent are transformed to view space and stored in a 3×3 matrix.

output.TangentToWorld[0] = mul(normalize(mul(input.tangent, World)), View);
output.TangentToWorld[1] = mul(normalize(mul(input.binormal, World)), View);
output.TangentToWorld[2] = mul(normalize(mul(input.Normal, World)), View);

In the main pixel shader function, we must first compute the normal mapping output before it can be combined with the interpolated tangent, binormal, and normal.

PixelShaderOutput PixelTerrainGBuffer(VT_Output input)
{
    PixelShaderOutput output = (PixelShaderOutput)0;

    // Sample normal map color. 4 is the texture scale
    float3 normal = TriplanarNormalMapping(input, 4);

    // Get the normal into world space, then output it in [0, 1] space
    float3 normalFromMap = mul(normal, input.TangentToWorld);
    normalFromMap = normalize(normalFromMap);
    output.Normal.rgb = 0.5f * (normalFromMap + 1.0f);

    // ... Then output the other G-Buffer stuff

    return output;
}

The textures are expected to be in the [0, 1] range, and TriplanarNormalMapping converts them to [-1, 1] so they are properly transformed with the TBN matrix. After that we can map the normals right back to the [0, 1] range for the lighting pass. Remember that the G-Buffer uses an unsigned format, so if we don’t do this, all values below zero will be lost.

The following function computes triplanar normal mapping for terrains.

float3 TriplanarNormalMapping(VT_Output input, float scale = 1)
{
    float tighten = 0.3679f;

    float mXY = saturate(abs(input.Normal.z) - tighten);
    float mXZ = saturate(abs(input.Normal.y) - tighten);
    float mYZ = saturate(abs(input.Normal.x) - tighten);

    float total = mXY + mXZ + mYZ;
    mXY /= total;
    mXZ /= total;
    mYZ /= total;

    float3 cXY = tex2D(normalMapSampler, input.NewPosition.xy / scale);
    float3 cXZ = float3(0, 0, 1);
    float3 cYZ = tex2D(normalMapSampler, input.NewPosition.zy / scale);

    // Convert texture lookups to the [-1, 1] range
    cXY = 2.0f * cXY - 1.0f;
    cYZ = 2.0f * cYZ - 1.0f;

    float3 normal = cXY * mXY + cXZ * mXZ + cYZ * mYZ;
    normal.xy *= bumpIntensity;
    return normal;
}

Note that where I define the texture lookups, the XZ plane is just set to a normal pointing directly towards the viewer. The X and Y values are in the [-1, 1] range, and Z is by default 1 because it is not used for view-space coordinates. So don’t forget to flip normalized negative values! Then X and Y are multiplied by the bumpIntensity. The default roughness is 1, and a roughness of 0 will completely ignore the normal map for the final output.

A lot of my texture mapping code was adapted from Memoirs of a Texel. Take caution if you want to follow that guide: there is a glaring mistake in that code that I noticed only after seeing this GPU Gems example (see example 1-3). You need to clamp your weight values to between 0 and 1 before averaging them out; the blog article doesn’t do this in its code. Otherwise you will get many dark patches in your textures. I fixed this with the saturate() function shown in the example above. This goes for regular texture mapping as well as normal mapping.

Here are some screenshots with the normal mapping in place. The bump intensity is set to 1.8 for a greater effect.

Edit: I’ve used some better textures for testing now. I got some free texture samples at FilterForge.


Normal computation is the same for forward rendering as it is for deferred rendering. The normals as they contribute to lighting would still be in the [0, 1] range in view space.

Looking back, and more game details

It’s been about a year and a half since I started this blog, and back then my engine project was only 2 months old. This January was the first time my blog received more than 1000 visits in a month, and with February making it two months in a row, I hope it continues. My most popular posts used to be the code sample articles (I probably should update those), but now more of my recent posts have been catching on. I still plan to post some helpful code articles for variety, but they will be based more on what I’m currently working on.

Also, I have started my own GameDev journal to show my progress to the community, so I hope to continue there to promote my game. I used to think you needed to be a paid member to create journals, but I guess that’s not the case anymore. This blog will still be updated more liberally and frequently, while articles in the journal will be posted with a bit more consideration and preparation.

Outside of this blog, I have gotten more involved in the game development scene by going to local meetups, usually organized by IGDA members, hearing the war stories of experienced developers (indie and AAA), and learning ways to get your games to a finished, workable state and other ways to make your life easier. Meeting people there has been fun, and although I haven’t collaborated with anyone on a project, it’s not out of the question. I would also like to get into an in-person game jam, whenever I can schedule enough time for it.

Racing game details

Hey, you know that racing game that’s supposed to happen? Well, I still plan on making it happen, and my ideas are becoming a bit more solidified now. I always planned on making anything but a super-serious racing sim, because those take too long to get all the details right in the physics and the authenticity of the look and feel. Also, I won’t have permission to use real-life cars in the game. Previously, I said I wanted to make an off-road racer, but that might still lead to some high expectations for the landscapes and environment, and I would have to limit how the vehicles should handle. So for now, it’s a general arcadey racer that will just happen to have some trucks and ATVs in it.

Now on to some of the main features of the game.

Multiplayer

One of the things I want my racing game to have: MULTIPLAYER. That’s a big one. I want people to race other people, especially across different computers. So this also means including network play. I have never done a networked game before, so this part will be completely new to me. I don’t know how to set up a waiting area or find available players online. Should I use a client-server or a P2P architecture? So many things to figure out. Lidgren is a popular library for online gaming in XNA, which I will probably use anyway for reasons stated later. Work on this likely won’t start until about halfway through the development process.

Before that, and easier to test, is local multiplayer. My graphics engine will make split-screen support a breeze, with the way I can hook up different cameras to the same scene and draw it several times. Multiple controller support should be easy as well.

Customizing tracks

Another big one is, wait for it, track creation. Yeah, it sounds ambitious, but I figure if I am going to aim for making the track creation tools easy for me to program, they might as well be user-friendly and simple enough for players to use. See ModNation Racers and a bit of TrackMania for inspiration, but more so ModNation Racers, because their editor looks more fun and inviting to use. And I don’t mean to brag much, but holy cow, my graphics engine looks about as good as MNR’s… I’m basically halfway there 😉 Their splatting techniques do not use a million textures, and cliff textures automatically appear on steep sides.

“Why are you making a clone?” you may ask. Well, it’s obviously one of my favorite games and it gets me inspired. Other people have made clones for reasons other than learning, like CounterStrike clones or _gulp_ Minecraft clones. (Is it just me, or are Minecraft clones the new MMO clones?) However, one important point is that MNR is only sold on Sony’s platforms, and I do not plan on making something to directly compete with the other popular customizable racers. Besides, TrackMania and TrackVerse seem to co-exist well on the PC, and TrackVerse has put their own spin on the genre with a faster-paced WipEout feel. It would be great if a WipEout-like game could come out for XBLA or XBLIG, but I digress.

Other platforms

That brings me to another feature I plan for later down the road, which is cross-platform support. The ultimate goal would be, aside from Windows, to run it natively on Mac and Linux. This obviously means I cannot use XNA forever, and going to MonoGame for non-Windows builds looks like the most attractive option. For those platforms I will need to rebuild the engine library with the MonoGame references instead of the XNA ones. The sooner I can test the engine on Linux, the better. Lidgren also works with MonoGame, which will smooth out porting the networking code when I get to it.

The reason I’m sticking with XNA for Windows is because it uses DirectX, while MonoGame for Windows uses OpenGL, and driver compatibility is more of an issue with OpenGL on Windows. I had Linux Mint installed before, but removed it a few months ago. DarkGenesis’ blog looks to be a good resource for porting MonoGame applications, so I’m bookmarking that for now.

Gradually I will be integrating the screen system code that I wrote for the puzzle game I worked on a few months ago. It includes (almost) everything I need to create menus and screen states for the game. This code is big enough to be in its own library, and it may be easier if I just build it as such.

So these are some of the main features of the game. It’s gonna be a tough road ahead, but I think I can make a lot of progress in a few months. Out of these three, the first one I will begin work on is the track editor. It’s going to be the backbone of the game development process, so it makes sense to begin here first.