A well installed microcode bug will be almost impossible to detect.
-- Ken Thompson
Introduction
There are things you take for granted, only to be met with blank stares when you talk about them as if they were a given. It's easy to forget that what is obvious to you may be something others have never seen or thought about, and vice versa. We're all colored by our own quirky path through the maze towards programming enlightenment. My own path has taught me the following about build configurations: Debug and Release are not enough.
Warning, Evil Wizards
When you start up the wizard in Visual Studio, it generates a project and solution for you with two configurations, Debug and Release. Both are reasonable for very small and quick experiments (I rarely bother to change them for those), but for anything else, don't use them. In fact, I'm going to sidetrack for a while and talk a little about the evils of Wizards. Not the Harry Potter ones, just the ones shipped with IDEs like Visual Studio. Wizards are great for getting up and running, but you literally do that quick and dirty. In fact, it's so dirty that you often don't want to look at the code or the setup that was used. You will also have no more understanding of the actual setup than the fact that it kind of works and you just pressed a button. For example, ever tried to program in MFC? Did you use the Wizard to do it? Did you understand any of the code that came out? If so, congrats. If you're like me, looking like a cartoon question mark, welcome to the club. Now, MFC is actually not that bad, it just has a learning curve shaped like a Heaviside step function. The best way to learn it is to toss the wizard code and start from scratch. Then it's actually not that bad. It kind of makes sense, if you know the historical constraints that ruled at the time (come on, message macros to save space for virtual tables? Today they make no sense). It's the same with the wizard-generated project settings: they make little sense for games programmers. Even the names are a little misleading.
Canning Debug and Release
The names Debug and Release imply that you do most of your major development in Debug and then ship your product in Release. Eeek. Let's look at the typical situations you will most probably encounter in the daily production of a game-like project:
- Design and implementation
- Debugging
- Profiling
- Testing
- Retail version
So I would propose the following five configurations:
- SlowDebug
- FastDebug
- Profile
- QA
- Retail
Let's go through and see what they each do:
SlowDebug
This configuration is basically an unoptimized version of the game with all the asserts, debug systems and log systems enabled. It's usually reserved for situations where you go WTF and need to step through large amounts of code and inspect all the variables. This happens often in other people's code, where you are unfamiliar with the concepts and need to understand what's going on. Great for learning new systems. For everything else it's pretty much useless, since a normal game simply doesn't run at playable speed with even the most basic compiler optimizations turned off (read: excessive traffic to memory).
FastDebug
Here we still have all the asserts, log systems and other helper systems enabled, but we've also enabled enough basic optimization to let the compiler do register allocation and not hit memory so much. Most programs should run at a decent, at least interactive, speed here, and we can deal with most problems in this configuration. It should still let us find and inspect our variables in the debugger easily enough (track the asm back, or rely on a reasonable allocation for the this pointer).
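To make that concrete, here's roughly what I mean by "basic optimizations" in flag terms. The flags themselves are real, but treating them as the FastDebug setting is my own assumption; tune per toolchain:

// Illustrative FastDebug compiler settings (assumptions, not prescriptions):
//   GCC/Clang: -Og -g    (optimize, but keep the code debuggable)
//   MSVC:      /O1 /Zi   (basic optimizations plus debug information)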
Profile
This is what we use to profile the build for speed. This configuration has everything turned off in terms of logging, asserts etc. in order to be as close to the final build as possible. A careless assert might incur an extra load from memory or an extra branch in a tight loop, and then all bets are off: you're profiling code that will be different in the final product. You might also, unfortunately, need to link against profiling versions of some libraries, or turn on compiler hooks for instrumenting the build; these will also throw off the timings slightly.
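For reference, these are the kinds of instrumentation hooks I mean. The flags are real; I list them as examples rather than recommendations:

// GCC/Clang: -finstrument-functions  (emits calls to __cyg_profile_func_enter/exit)
// MSVC:      /Gh and /GH             (emit calls to _penter and _pexit)
// Both add per-function overhead, which is why the Profile build should
// otherwise stay as close to Retail as possible.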
QA
This is a build in which most of the debug systems are turned off, but the asserts and a minimal set of logging are still enabled, along with some way to find out after the fact what happened (crash dumps, more verbose exception handlers etc.). This build is intended to be given to the QA teams during most of the development, so it needs some sort of reporting mechanism that lets the testers help the developers out at least a little with information. On most consoles, for example, it's pretty easy to write a runtime stack walker that dumps the callstack to a text file whenever you crash or hit an assert, which the testers can then send to the programmers. The QA build should be runnable through consumer means, so if you're doing a disc-based game, make it run from disc, through actual media or emulation. You should also consider turning on as many optimizations as you can in this configuration.
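To make the reporting idea concrete, here is a minimal sketch of such a crash reporter for Windows. The dbghelp library and the APIs used are real; the function name WriteCrashReport, the output file name and the details are my own assumptions, not a prescription:

// Minimal sketch of a QA-build crash reporter for Windows.
#include <windows.h>
#include <dbghelp.h>
#include <cstdio>
#pragma comment( lib, "dbghelp.lib" )

static LONG WINAPI WriteCrashReport( EXCEPTION_POINTERS* info )
{
    // Grab up to 64 return addresses from the crashing thread's stack.
    void* frames[64];
    USHORT count = CaptureStackBackTrace( 0, 64, frames, NULL );

    HANDLE process = GetCurrentProcess();
    SymInitialize( process, NULL, TRUE );

    FILE* out = fopen( "crash_report.txt", "w" );
    if( out )
    {
        fprintf( out, "Exception code: 0x%08lx\n",
                 info->ExceptionRecord->ExceptionCode );

        // dbghelp idiom: SYMBOL_INFO with room for the symbol name after it.
        char buffer[ sizeof(SYMBOL_INFO) + 256 ];
        SYMBOL_INFO* symbol = (SYMBOL_INFO*)buffer;
        symbol->SizeOfStruct = sizeof(SYMBOL_INFO);
        symbol->MaxNameLen = 255;

        for( USHORT i = 0; i < count; ++i )
        {
            if( SymFromAddr( process, (DWORD64)frames[i], NULL, symbol ) )
                fprintf( out, "%2u: %s\n", (unsigned)i, symbol->Name );
            else
                fprintf( out, "%2u: %p\n", (unsigned)i, frames[i] );
        }
        fclose( out );
    }
    return EXCEPTION_EXECUTE_HANDLER;
}

// Installed once at startup, in the QA configuration only:
//   SetUnhandledExceptionFilter( WriteCrashReport );

On a console you'd swap in the platform's own stack walking and symbol lookup, but the shape stays the same: catch, walk, write a text file the testers can attach to a bug report.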
Retail
This is the configuration that really matters. This is the one that goes on your master disc. Here all the debugging facilities are turned off. No asserts. No log messages. No helper systems. It helps if this version has the same optimizations as QA, since that dramatically lessens the number of bugs unique to this compiler setting, but sometimes it's not feasible. On one of the games I worked on, for example, we had Link Time Code Generation (LTCG for short) turned on in the retail configuration only. Of course we had lots and lots of crashes in this mode only. The problem was that the link alone took around 40 minutes, so there was simply a limited number of times I could recompile and link the game each day while searching for the bugs. The fact that we caught these late in the development cycle, when we were crunching, didn't help of course.
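To tie the five configurations together, here is a hedged sketch of what a central config header might look like. The FOOBAR_* configuration macros follow the command-line convention I describe in the next section; the ENABLE switch names are my own invention for illustration:

#if defined( FOOBAR_SLOWDEBUG ) || defined( FOOBAR_FASTDEBUG )
    #define FOOBAR_ENABLE_ASSERTS 1
    #define FOOBAR_ENABLE_LOGGING 1
#elif defined( FOOBAR_QA )
    #define FOOBAR_ENABLE_ASSERTS 1
    #define FOOBAR_ENABLE_LOGGING 1   // minimal set only
#elif defined( FOOBAR_PROFILE ) || defined( FOOBAR_RETAIL )
    #define FOOBAR_ENABLE_ASSERTS 0
    #define FOOBAR_ENABLE_LOGGING 0
#else
    #error "No configuration macro defined"
#endif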
More fun with macros
I usually put a macro on the command line that indicates the configuration I'm building. For example, if I'm building the retail version of the game and my engine is called Foobar, I might pass -DFOOBAR_RETAIL to the compiler to indicate that I'm compiling the retail version. Now, it might be really tempting to key everything to these configuration macros, but in general this is a really bad idea (TM). In practice I've found that one extra level of indirection is helpful in many cases. For example, say you're writing your awesome and nifty math library to kill all others. Yet you notice that it kind of runs slow in debug, and still does in release. You track it down to all the extra asserts you put in there to make sure that no one is doing the wrong thing. They also slow down the build by causing the compiler to not inline functions, insert extra branches and hit memory more. Ouch. The temptation to put the following in and knock off for lunch might be really high:
#ifndef _DEBUG
    #define FOOBAR_ASSERT( expr ) do { (void)sizeof(expr); } while(false)
#else
    ....
#endif
Resist! Removing the assert from the code is not the answer; in reality you are hurting yourself, since all your assertions about good parameters and state are now compiled out in most builds. Code like the following will just start to wreak havoc in your codebase:
switch( state )
{
    case Idle:
        ...
        break;
    default:
        // should not happen
        FOOBAR_ASSERT(false);
}
If I had a dollar for each time I hit this assert I would... well, I would have a reasonable amount of money to buy both you and me a very lavish dinner... and the flight to Paris. Anyhow, a more flexible way is to realize that some asserts are perhaps only needed when you are debugging or suspecting a particular problem. These we can wrap inside another macro, and then key that macro to the configuration as well, letting us turn all of them on and off at will. For our math library we might create a file called MathConfig.h and put the following in there:
#ifdef _DEBUG
    #define FOOBAR_MATH_CHECK_ALIGNMENT
#endif

#ifdef FOOBAR_MATH_CHECK_ALIGNMENT
    #define FOOBAR_MATH_ALIGN_ASSERT(expr) FOOBAR_ASSERT(expr)
#else
    #define FOOBAR_MATH_ALIGN_ASSERT(expr) do { (void)sizeof(expr); } while(false)
#endif
Now most of the asserts out there will survive, and if you suspect an alignment issue in the math library you can still enable just that check and recompile. Hopefully that doesn't bring your game down to a crawl, or alternatively it finds the bug quickly. Either way, you can make this change locally once you suspect something is going wrong, and still leave the normal day-to-day math library usable. This can be applied to almost any system. During development you usually have a large number of asserts in there to save the day, but at some point they become more and more unnecessary, as you move, for example, the whole data validation into the tools pipeline and just keep an initial assert/check when you load the data. The asserts might still be useful if anything goes wrong, but for the most part you can just compile them out.
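For illustration, this is how such an assert might be used inside the library. Vec4, Vec4Load and the 16-byte alignment requirement are hypothetical, just there to show the macro in context:

#include <cstdint>

struct Vec4 { float x, y, z, w; };

inline void Vec4Load( Vec4& dst, const float* src )
{
    // The library demands 16-byte aligned source pointers; this check only
    // bites in builds where FOOBAR_MATH_CHECK_ALIGNMENT is defined.
    FOOBAR_MATH_ALIGN_ASSERT( ( reinterpret_cast<std::uintptr_t>( src ) & 15 ) == 0 );
    dst.x = src[0]; dst.y = src[1]; dst.z = src[2]; dst.w = src[3];
}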
Compiler flags
Most compilers today support something called fast math, or non-C-compliant mode. This is actually a good thing, since the C standard and the IEEE 754 standard require all sorts of crazy things that usually result in very slow code. If you turn on the fast mode in all your configurations, you have less chance of getting bitten by it in release only, since even in debug the floating point operations are then done in the same non-standard way.
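As a toy example of the kind of difference in question (the flags /fp:fast for MSVC and -ffast-math for GCC/Clang are real; whether a given compiler reassociates this particular expression is up to the compiler):

#include <cstdio>

int main()
{
    float big = 1.0e8f;
    float small = 1.0f;
    // Under strict IEEE evaluation the 1.0f is lost when added to 1.0e8f
    // (the spacing between adjacent floats near 1e8 is 8), so the result
    // is 0.0f. A fast-math compiler may reassociate this to
    // (big - big) + small and produce 1.0f instead -- hence the advice to
    // keep the flag consistent across all configurations.
    float result = ( big + small ) - big;
    printf( "%f\n", result );
    return 0;
}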
Another flag that might stir controversy is exception handling. I've found that I use exceptions for some things, most notably unit testing, and want them turned on in the SlowDebug/FastDebug configurations where I run the unit tests. It makes the unit tests run fine, and the slight speed hit is acceptable in these configurations.
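A minimal sketch of the unit testing use I have in mind; TestFailure, FOOBAR_TEST_CHECK and RunTest are all names I've made up for illustration:

#include <stdexcept>
#include <cstdio>

struct TestFailure : std::runtime_error
{
    explicit TestFailure( const char* what ) : std::runtime_error( what ) {}
};

// A failed check throws; the harness catches it and moves on to the next test.
#define FOOBAR_TEST_CHECK( expr ) \
    do { if( !(expr) ) throw TestFailure( #expr ); } while( false )

template< typename TestFn >
void RunTest( const char* name, TestFn test )
{
    try
    {
        test();
        printf( "PASS %s\n", name );
    }
    catch( const TestFailure& failure )
    {
        printf( "FAIL %s: %s\n", name, failure.what() );
    }
}

// void TestMath() { FOOBAR_TEST_CHECK( 1 + 1 == 2 ); }
// RunTest( "math", TestMath );

This only needs exceptions enabled in the configurations that actually run the tests, which is exactly why keying the flag to SlowDebug/FastDebug works out.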
Other than that, I just kind of follow the guidelines I outlined above. I fiddle around with the settings and measure the output to answer questions like "does optimize-for-size or optimize-for-speed matter?" for the particular compiler I have. The best tip is to experiment and observe both the output assembly and the actual in-game performance; both can throw you curve balls for very reasonable-looking flags.
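For reference, the usual knobs for that particular experiment (the flags are real; which one wins is exactly what you have to measure on your own code):

// MSVC:      /O1 (favor small code)  vs  /O2 (favor fast code)
// GCC/Clang: -Os                     vs  -O2 or -O3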
In Closing
I've touched briefly upon configurations, why you should consider using more than just Debug and Release, and possible names and uses for them. The next question might be "How do I manage all these configurations without going crazy?". That's a topic for another article, but suffice it to say: "Not manually, or you'll wind up in a padded room".