Today we have a sneak peek at the most common adversary in our game, the lowly zombie. This seems like a good time to give you an inside look into how the zombie was made and brought into the game.
Our zombie was created by Gary Lee, a talented character artist working at Flying Lab Software. You can check out his work here! The zombie has gone through a lot of adjustments since creation. This is his story.
Originally, our game was a series of 2d sprites animating in a 2d engine to fake the effect of 3d. This meant there was no limitation on texture size, polygon count, number of textures, etc. The engine only saw a 128×128 png flipbook and didn’t care how it got there. The initial life cycle of our zombie friend started in ZBrush, moved to Maya for animation, was rendered in stages to a tga sequence that was adjusted in Photoshop, and finally the sprite sequences were composited together in After Effects. There were many reasons for this life cycle (they call them pipelines, but doesn’t life cycle sound more fun? Maybe it’s a birth cycle, actually). I could go into the details of it all and why we did what we did, but that’s streets behind. Let’s go streets ahead.
We are now making our game in a shiny new 3d engine, Unity. However, we are making our game for the iPad. This isn’t too limiting… yet. Now we need 64 animating characters on the screen at the same time. There’s the problem. The original zombie may have worked in a game with fewer actors on the screen but he definitely won’t work in a game with 64 at the same time!
With the help of Dru’s typing skills, we determined what our vertex, texture, and bone limits are. Now we have to reduce our 7500-vertex zombie down to 250 verts. We have to drop normal maps. And, we have to drop down to 20 bones.
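To see why the reduction matters, here is a quick back-of-the-envelope check (the per-character limits are from above; the scene totals are just my own arithmetic):

```python
# Back-of-the-envelope budget check. Per-character limits (250 verts,
# 20 bones) and the 64-character count come from the post; the totals
# are derived, not official engine limits.
CHARACTERS_ON_SCREEN = 64
VERTS_PER_CHARACTER = 250    # down from 7500
BONES_PER_CHARACTER = 20     # down from the 2d rig's roughly double that

total_verts = CHARACTERS_ON_SCREEN * VERTS_PER_CHARACTER
total_bones = CHARACTERS_ON_SCREEN * BONES_PER_CHARACTER
unreduced_verts = CHARACTERS_ON_SCREEN * 7500

print(total_verts)       # verts animating per frame with the new budget
print(total_bones)       # bones animating per frame
print(unreduced_verts)   # verts per frame if we'd kept the original mesh
```

At 64 characters, every vertex you keep per character costs you 64 on screen, which is why the 7500-to-250 cut isn’t optional.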
This may be a bit technical now, but here come the pictures and the exciting new birth cycle of our friend, Ziggy the Zombie.
Here is our zombie straight from Gary with some lighting and posed up.
Here he is with a hat.
We start with the sculpt Gary made in ZBrush. As you can see, he is like a bajillion polygons. Gary took this bajillion polygon sculpt, made a low poly mesh, and then created a normal map for the low poly mesh from this high poly sculpt. I’m not sure how he did it or what tools he used but it doesn’t matter since we are moving on.
To reduce Ziggy for iPad, I bring him into this wonderful program, TopoGun. I love this program. I think it’s great and it was like 100 bucks for the license. Well worth it.
As a tip, detail was lost if I reduced too far at this stage. Since this is quite a drastic cut, I found that in some situations it was better to reduce to around 400 verts and bake out the textures. Then I would reduce again down to the final 250 verts inside Maya once the textures were made.
Now the new low poly mesh needs to have its UVs laid out. I was about to use Maya to do this like I usually do when my friend Ken said, “nah son, use Roadkill”. Ken was right. Roadkill is awesome. Check it out yourself. Bonus: it’s free!
With these amazing UVs, our high poly mesh, and our low poly mesh, it is time to bake out our maps. Here is where I learned of another tool, xNormal. I was used to baking out maps in Maya as well, and it would take a long time. The ambient occlusion bake on Ziggy didn’t even finish in Maya after several hours! I think I may have messed something up, but that doesn’t matter. xNormal fixed everything! Also: xNormal is free.
You enter a bunch of settings for where the high poly mesh is and where the low poly mesh is. You can then bake a normal map, base texture map, ambient occlusion map, etc. The longest bake time was for the ambient occlusion at 3 mins! Now that’s a time saver.
Another cool thing about xNormal: if I couldn’t find the original ZBrush sculpt or didn’t have it, I could bake using the old normal map as the source instead. That way I could make a normal map for the new super low poly mesh without even having the original sculpt. It worked well in some cases.
Since we can’t use normal maps on the iPad for our characters (Noah still gets to use them in environments), I would take the normal map and make it a greyscale image. Then I would up the contrast a bit and use it basically like an ambient occlusion layer. It works very well for our game, actually. At the distance the camera sits from our characters, the normal maps weren’t reading well anyway. This gave us more detail than the normal maps did!
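For the curious, the greyscale-and-multiply trick can be sketched in plain Python for a single texel. The post does this in Photoshop; the greyscale weights, contrast amount, and multiply blend below are my assumptions about the recipe, not the exact one used.

```python
# A minimal per-texel sketch of baking a normal map down into a
# greyscale detail layer and multiplying it over the base texture.
# Weights, contrast amount, and blend mode are assumptions.

def to_grey(rgb):
    """Flatten a normal-map texel to greyscale (Rec. 601 luma weights)."""
    r, g, b = rgb
    return 0.299 * r + 0.587 * g + 0.114 * b

def add_contrast(grey, amount=1.4, pivot=128):
    """Push values away from the midpoint so the surface detail pops."""
    return min(255.0, max(0.0, pivot + (grey - pivot) * amount))

def multiply(base_rgb, grey):
    """Multiply-blend the grey detail over the base, like an AO layer."""
    return tuple(round(c * grey / 255) for c in base_rgb)

# One texel: a flat "up" normal (128, 128, 255) over a zombie-green base.
normal_texel = (128, 128, 255)
base_texel = (90, 120, 70)

detail = add_contrast(to_grey(normal_texel))
shaded = multiply(base_texel, detail)
print(shaded)
```

In a real pipeline you would run this over every pixel (or just use a multiply layer in Photoshop, as the post describes); the point is that raised areas of the normal map stay bright and crevices darken the base texture.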
After all this, Ziggy goes back into Maya for a skeleton and animations! I had to cut half of his bones from the 2d version. Most of the detail in animation was lost due to the camera distance, anyway. I am sad that Ziggy lost his boob and stomach bones, though.
And, there it is; the birth cycle of characters into our game. He starts at 1 bajillion vertices and comes out with 250. As an added bonus, here is a side-by-side comparison of the zombies from the different engines.
That was a fun story, but no story is complete without a moral. Creating the new pipeline for our 3d engine required that I learn some new tools. TopoGun, Roadkill, and xNormal were all new to me during this process. I think it’s important when making art for video games to be open to new tools and to experiment. It may seem like more steps were added to the pipeline; however, more time was ultimately saved. I would guess at least a day or more has been cut from every character going into the game. You never know what tool out there might be able to help you out. I’m currently messing with 3dCoat. I think it has potential to help us out in the future, as well. We will see, though.
If anyone has any other tools they enjoy using or that have saved them time, drop a comment here or on the Facebook page!