Monday, August 13, 2012

My Learning Process

It's only taken me ten-plus years, but I've finally nailed down how I absolutely, positively make sure I've learned something.

If I can't teach it, and I'm not using it, then I have not truly learned it. I have often learned things well -enough- to be able to perform adequately. But unless I've made some attempt to explain what I've learned to someone else, I find that the lesson is not as deep, and not as wide, either.

I think it's because when I learn something just well enough to perform at a task, I allow myself to have gaps in my knowledge that aren't absolutely critical to the task at hand. But when I go to teach someone, I immediately have to fill these gaps in. And not just fill them in with bullshit, either. No, I go back and I double-check everything I thought I'd learned, and make sure that the knowledge I am passing on is correct. I can tolerate myself being wrong or not being fully knowledgeable, but I can't tolerate possibly corrupting someone else's knowledge pool.

Also, in teaching, I find myself thinking about the whole construct of my knowledge over again. Sometimes even at this late stage, I'll have a sudden a-ha! moment as I think over something I haven't thought about in a long while.

Anyway. At this point in my life, I am now realizing I have been doing an awful lot of doing, and often blindly, and not enough trying to teach. I am by no means a perfect teacher, or even a good one. But the act of teaching is healthy for me. And I like to think it's healthy for others. And I'm not even a teacher, really. Just a fellow student, who wants to share ideas. Where I have more knowledge, I want to give it away. Where I have less, I want to receive it.

Took me ten years to learn this thing, though. Wonder what single thing I'll finally, slowly, dull-headedly learn over the next ten?

Monday, July 23, 2012

Life on Mac Is (Still) Hard

Following up my previous post, which I should have followed up on immediately but didn't.

Installing these libraries on Mac OS X Lion continues to have some fun problems. I've already mentioned the problem with Vorbis; Flac, too, has problems with its configure script, which can be fixed with a command-line switch. To get the Flac developer libraries to install correctly, the following must be done.

./configure --disable-asm-optimizations

Thanks go to Stack Overflow for that. Last but not least, SDL_Mixer's install also has problems. Even with Ogg Vorbis installed and functioning ( you can test that with some of the example C files it comes with ), SDL_Mixer's configure script won't recognize them. In this case, I couldn't find any command-line switches to force it to work. I instead went in and simply bypassed the configure script's test for checking whether Vorbis was installed and working. I'm trying to learn more about shell scripting so I can see how exactly the configure script does its test, and then I'll go back into all -three- configure scripts, update them, and see if I can get the changes back to the guys who make this stuff.
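Collected in one place, the workarounds so far look roughly like this. A sketch only: the directory names are placeholders for wherever you unpacked the source tarballs, not anything from the actual downloads.

```shell
# Rough recap of the Lion build workarounds ( directory names are placeholders )

# libvorbis 1.3.3: force a 64-bit build ( see the previous post )
cd libvorbis-1.3.3
./configure --build=x86_64 && make && sudo make install

# Flac: skip the assembly optimizations that break its configure script
cd ../flac-src
./configure --disable-asm-optimizations && make && sudo make install

# SDL_Mixer: no switch I could find -- its configure script's Vorbis
# check has to be bypassed by hand-editing the script
```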

For right now, if you find yourself having trouble, post a message and I'll send you the modified configure script for SDL_Mixer. The reason I'm not just flat out posting a link to it right now is that even though it's fixed, it's not fixed -correctly-, and I don't like that.

Sunday, July 8, 2012

Life On Mac is Hard

Posting this for posterity, for Mac developers. If you're not a Mac developer who treats it like a glorified Unix machine ( instead of using Xcode for everything ), this might not make much sense. Move along. I'll post a recipe or something later.

This post applies to libogg 1.3.0, libvorbis 1.3.3, and was done on Mac OS X 10.7 running on a MacBook Pro. After you get done installing libogg 1.3.0, attempting to run libvorbis 1.3.3's configure script will result in the following error spew:


*** Could not run Ogg test program, checking why...
*** The test program failed to compile or link. See the file config.log for the
*** exact error that occured. This usually means Ogg was incorrectly installed
*** or that you have moved Ogg since it was installed.
configure: error: must have Ogg installed!

Which is, of course, complete nonsense, assuming you did install libogg 1.3.0 first, with proper permissions and everything. You can test that libogg 1.3.0 installed correctly using a C/C++ program if you like; I leave that as an exercise for the reader. The problem is that the configure script for libvorbis tries to build for i386 instead of x86_64, so you have to force it to target x86_64:

./configure --build=x86_64
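If you want a concrete version of that exercise, something like this would do. This is a sketch of mine, not from the original install docs: it assumes cc is on your path and that libogg's headers and library landed in the compiler's default search locations; the temp file path and the tiny test program are made up for illustration.

```shell
# Smoke test: can we compile, link, and run against the installed libogg?
cat > /tmp/oggtest.c <<'EOF'
#include <ogg/ogg.h>
int main(void) {
    ogg_stream_state os;
    /* ogg_stream_init returns 0 on success, nonzero on failure */
    return ogg_stream_init(&os, 1);
}
EOF
cc /tmp/oggtest.c -logg -o /tmp/oggtest && /tmp/oggtest && echo "libogg looks OK"
```

If the compile or link step fails here, config.log's "must have Ogg installed!" complaint is telling the truth; if this passes and configure still fails, the problem is in the configure script's own test.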

I freely admit I wasn't smart enough to figure out anything past 'libogg is trying to build for x86_64, while libvorbis is trying to build for i386; what's up with that?'. This forum post gave me the rest of the pieces necessary to make the whole thing work ( scroll to the bottom to find the relevant post ).

For my next trick, I might fix the configure scripts and see if I can't get the fix back to the Xiph guys, but... probably not. Still not smart enough. Working on that.

Wednesday, June 27, 2012

What Time Means On a Modern CPU

So, I've decided to go full in on game programming. This should surprise exactly nobody. I sat down and did some rapid math to see how much overhead I had.

My eventual game engine will be a 2d vector engine, because I like to make life hard for myself. The target frame rate is 120 FPS, for no other reason than that it's the current maximum refresh rate of any monitor or display on the market that I know of. I -think- you see it in the 3d sets; I don't know, I've essentially ignored the recent 3d revolution.

Anyway. 120 FPS means that every frame has 1/120, or .0083333 (etc.), seconds to do its work. Assuming a CPU running at 1 GHz, that means I have eight and a third million clock cycles, per frame, to do all the work that needs to get done.
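That arithmetic is easy to sanity-check from a terminal; awk here is just a convenient calculator, nothing more.

```shell
# Frame budget at 120 FPS on a hypothetical 1 GHz CPU:
# 1 GHz = 1e9 cycles per second, divided across 120 frames
awk 'BEGIN {
    fps = 120; hz = 1e9
    printf "seconds per frame: %.7f\n", 1 / fps    # 0.0083333
    printf "cycles per frame:  %.0f\n", hz / fps   # 8333333
}'
```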

I knew modern machines were fast, who doesn't, but I still managed to be surprised to actually see it quantified. I had some kind of vague, in-my-head idea of how fast a CPU was, but now that I see it, man, that's amazing. We've come a long way from the 10 MHz 286 I first learned to bit-bash on.

Now, I'm well aware that a 'clock cycle' is not necessarily a terribly useful metric by itself. There are lots of questions that need to go with it. How many clocks does it take to do a fetch for a given data size? How many clocks does it take to do certain math operations that are going to come up frequently in your code? So on, so on.

Still. I guess it's not that astonishing, but it is interesting (to me) anyway.