Wednesday, June 27, 2012

What Time Means On a Modern CPU

So, I've decided to go all in on game programming. This should surprise exactly nobody. I sat down and did some quick math to see how much headroom I have to work with per frame.

My end game engine will be a 2d vector game engine, because I like to make things hard for myself. The target is 120 FPS, for no other reason than that's the highest refresh rate of any monitor or display on the market that I know of. I -think- you see it in the 3d sets; I don't know, I've essentially ignored the recent 3d revolution.

Anyway. 120 FPS means that every frame has 1/120, or .0083333 (etc.) seconds to do its work. Assuming a CPU running at 1 GHz, that works out to eight and a third million clock cycles, per frame, to do all the work that needs to get done.
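For the curious, here's that arithmetic spelled out as a tiny C snippet. The 1 GHz clock and 120 Hz target are just the assumptions from above, not anything measured:

    /* Back-of-the-envelope frame budget: cycles available per frame.
       Assumes a 1 GHz clock and a 120 Hz target, as discussed above. */
    #include <stdio.h>

    int main(void)
    {
        const double clock_hz   = 1.0e9;   /* assumed CPU clock: 1 GHz */
        const double target_fps = 120.0;   /* target refresh rate */

        double seconds_per_frame = 1.0 / target_fps;       /* ~0.008333 s */
        double cycles_per_frame  = clock_hz / target_fps;  /* ~8,333,333 cycles */

        printf("seconds per frame: %f\n", seconds_per_frame);
        printf("cycles per frame:  %.0f\n", cycles_per_frame);
        return 0;
    }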

I knew modern machines were fast (who doesn't?), but I still managed to be surprised when I actually saw it quantified. I had some vague idea in my head of how fast a CPU was, but now that I see it spelled out, man, that's amazing. We've come a long way from the 10 MHz 286 I first learned to bit-bash on.

Now, I'm well aware that a 'clock cycle' is not necessarily a terribly useful metric by itself. There are lots of questions that need to go with it. How many clocks does it take to do a fetch for a given data size? How many clocks does it take to do the math operations that are going to come up frequently in your code? And so on.
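If you want a rough answer to that last kind of question yourself, here's a minimal sketch of one way to get a ballpark number on x86, using the timestamp counter. This assumes a GCC/Clang-style compiler with the __rdtsc intrinsic; it's noisy (out-of-order execution, frequency scaling, interrupts), so real measurements need serializing instructions and lots of repetitions:

    /* Rough sketch: "how many clocks does a double multiply take?"
       Only a ballpark; the TSC counts reference cycles, and the loop
       also pays for the load/store that the volatile forces. */
    #include <stdio.h>
    #include <x86intrin.h>   /* __rdtsc() on GCC/Clang */

    int main(void)
    {
        volatile double acc = 1.0;    /* volatile so the loop isn't optimized away */
        const int iterations = 1000000;

        unsigned long long start = __rdtsc();
        for (int i = 0; i < iterations; i++)
            acc *= 1.0000001;          /* the operation under test */
        unsigned long long end = __rdtsc();

        printf("~%.2f reference cycles per iteration\n",
               (double)(end - start) / iterations);
        return 0;
    }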

Still. I guess it's not that astonishing, but it is interesting (to me) anyway.