Sunday, January 25, 2009

OpenAL - sounds good!

For getting sound to work on the iPhone, I needed to acquaint myself with the OpenAL sound system. OpenAL is also particularly interesting because it is a 3D sound system. You set the position of a sound source, and OpenAL takes care of getting the volume right, as well as the perceived position the sound appears to come from when you hear it played on your surround set. It can also do cool things like the Doppler effect. While I do not need 3D sound and have only just started using OpenAL, it would certainly be cool to get some simple stereo sound effects in my game. This is something that SDL_mixer can not do.
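Just to give an idea of how that positioning works, here is a minimal sketch, assuming 'source' is a valid source id obtained from alGenSources() and a context is already current:

/* OpenAL computes gain and panning from the source and listener positions */
ALuint source;
alGenSources(1, &source);
alSource3f(source, AL_POSITION, 2.0f, 0.0f, -1.0f);   /* sound sits to the right, slightly ahead */
alSource3f(source, AL_VELOCITY, -1.0f, 0.0f, 0.0f);   /* give it a velocity for the Doppler effect */
alListener3f(AL_POSITION, 0.0f, 0.0f, 0.0f);          /* the listener is at the origin */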

OpenAL tutorials and ALUT
There are some nice tutorials on the web, and they all use ALUT. It turns out that ALUT had some (simple to resolve!) bugs in it, and some guys decided to mark it as deprecated. When C code gets marked as deprecated, gcc complains and, depending on your settings, refuses to compile it, so all those nice online tutorials no longer build out of the box. ALUT had two functions that were really convenient: alutInit() and alutLoadWAVFile(). Writing my own WAV loader took some time, but I managed, and initializing OpenAL properly takes a bit of a magic sequence; read below to see how that works.

Ubuntu Linux suxxors
First a sidestep from writing code: getting your system to work and be able to produce sound at all. My favorite development platform is still Linux. For some reason, Ubuntu decided to break sound in version 8.10 (Intrepid), and I've been hassling with it for days to get any audio output out of my programs. As it turns out, Ubuntu Intrepid mixes asound (ALSA), esound, and PulseAudio, with an apparent preference for PulseAudio, except that it's not properly integrated or set up or whatever; it just doesn't work in all cases. That's right, one program seems to work while others don't. The solution: I removed all PulseAudio packages from my system and installed their ALSA counterparts. (In fact, I compiled the mpd music player daemon by hand, because Ubuntu's version tried to use PulseAudio anyway.) After taking these drastic measures, my sound programs produced audio output, hurray!!

Initializing OpenAL
As said, we are going to have to do without ALUT. OpenAL needs to be initialized by creating a "context" and selecting the audio device; otherwise you will get nothing but errors. In some tutorials, I found that you may select "DirectSound3D" as a device, but that obviously only works on Windows. I have no clue what devices are available on Linux, MacOS X, or whatever other platform you may think of. Luckily, there is a way to get the default device.
I'll share The Code with you:
#include "al.h"
#include "alc.h"

int init_openal(void) {
ALCcontext *context;
ALCdevice *device;
const ALCchar *default_device;

default_device = alcGetString(NULL,
ALC_DEFAULT_DEVICE_SPECIFIER);

printf("using default device: %s\n", default_device);

if ((device = alcOpenDevice(default_device)) == NULL) {
fprintf(stderr, "failed to open sound device\n");
return -1;
}
context = alcCreateContext(device, NULL);
alcMakeCurrentContext(context);
alcProcessContext(context);

/* you may use alcGetError() at this point to see if everything is still OK */

atexit(atexit_openal);

/* allocate buffers and sources here using alGenBuffers() and alGenSources() */
/* ... */

alGetError(); /* clear any AL errors beforehand */
return 0;
}
First a remark about the includes. Including the files "al.h" and "alc.h" is much more portable than using <AL/al.h> and <AL/alc.h>, and it saves you from resorting to "#ifdef PLATFORM_SO_AND_SO" constructs.
Using the default device may not always be the best choice. There should be an option in the program to let the user decide which device to use.
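If the ALC_ENUMERATION_EXT extension is present, you can list the available devices and offer them to the user. A minimal sketch (this snippet assumes <stdio.h> and <string.h> are included):

/* alcGetString(NULL, ALC_DEVICE_SPECIFIER) returns all device names,
   each '\0'-terminated, with an extra '\0' at the very end of the list */
if (alcIsExtensionPresent(NULL, "ALC_ENUMERATION_EXT") == ALC_TRUE) {
    const ALCchar *devices = alcGetString(NULL, ALC_DEVICE_SPECIFIER);

    while (*devices != '\0') {
        printf("available device: %s\n", devices);
        devices += strlen(devices) + 1;    /* jump to the next name in the list */
    }
}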
Note the atexit() call; OpenAL will complain loudly when you exit the program without having cleaned up properly. It has a right to complain, too; you've allocated some valuable hardware resources, and it may well be that the operating system is not exactly babysitting this hardware, meaning that when you exit the program without releasing the allocated hardware buffers, they will remain marked 'in use' until the computer is rebooted (!).
A good atexit() handler frees the allocated resources and closes the sound device.
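Here is a minimal sketch of what such a handler could look like; the 'source' and 'buffer' names are placeholders for whatever you allocated in init_openal():

void atexit_openal(void) {
    ALCcontext *context;
    ALCdevice *device;

    /* placeholders: in real code these would be your own globals or arrays */
    extern ALuint source, buffer;

    alSourceStop(source);            /* first stop the sound */
    alDeleteSources(1, &source);     /* then delete the sources ... */
    alDeleteBuffers(1, &buffer);     /* ... and then the buffers */

    context = alcGetCurrentContext();
    device = alcGetContextsDevice(context);

    alcMakeContextCurrent(NULL);     /* detach and destroy the context */
    alcDestroyContext(context);
    alcCloseDevice(device);          /* finally, close the device */
}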
#include "al.h"
#include "alc.h"

int init_openal(void) {
ALCcontext *context;
ALCdevice *device;
const ALCchar *default_device;

default_device = alcGetString(NULL,
ALC_DEFAULT_DEVICE_SPECIFIER);

printf("using default device: %s\n", default_device);

if ((device = alcOpenDevice(default_device)) == NULL) {
fprintf(stderr, "failed to open sound device\n");
return -1;
}
context = alcCreateContext(device, NULL);
alcMakeCurrentContext(context);
alcProcessContext(context);

/* you may use alcGetError() at this point to see if everything is still OK */

atexit(atexit_openal);

/* allocate buffers and sources here using alGenBuffers() and alGenSources() */
/* ... */

alGetError(); /* clear any AL errors beforehand */
return 0;
}
Mind the order of the OpenAL calls in the atexit handler: always stop the sound first, delete the sound sources, and then delete the buffers. Then move on to the context, and finally close the device. If you don't do it in this order, the sound system will not be de-initialized properly.

Loading .WAV files
Loading WAV sound files is easy, although there is one big gotcha to it: it is a binary format, stored in little endian byte order. The Intel x86 line of processors is little endian, so no problem there. However, I like to write portable code, so this is a bit of fun. I have the pleasure of having access to a big endian machine, so I could test my WAVE loader code there, too. I came up with a very simple test to see if a machine is big or little endian:
#include <stdint.h>

int is_big_endian(void) {
    unsigned char test[4] = { 0x12, 0x34, 0x56, 0x78 };

    return *((uint32_t *)test) == 0x12345678;
}
The is_little_endian() function is left as an exercise to the reader.

Back to the WAVE loader: what I do is simply read the whole file into memory and then map a struct that represents the WAV header over it. Then I do the integer fixups (reading the values as little endian), and perform some basic checks on whether this really is a good WAV file. For a music player, you should do these checks before loading the music data, but for my game code, I was happy with simply loading the file all at once.

You can find good information about the structure of a WAV file header on the web. Note how a WAVE loader is really a RIFF loader, and in my case, also a PCM sound loader.
There is a gotcha when blindly pouring the WAV header information into a struct, which is alignment. The compiler may pad the struct and mess around with its size ... use #pragma pack to instruct the compiler to refrain from doing that:
#pragma pack(1)
typedef struct {
    uint32_t Chunk_ID;
    uint32_t ChunkSize;
    ...
    uint16_t AudioFormat;
    ...
} WAV_header_t;
#pragma pack()
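The integer fixups mentioned above can then be done in place, right after reading the file. A minimal sketch, using the is_big_endian() test from above; the function name and the header fields touched here are just examples, use whatever your struct defines:

static uint16_t swap16(uint16_t x) {
    return (uint16_t)((x >> 8) | (x << 8));
}

static uint32_t swap32(uint32_t x) {
    return ((x >> 24) & 0xffU) | ((x >> 8) & 0xff00U) |
           ((x << 8) & 0xff0000U) | (x << 24);
}

/* fix up the little endian header fields when running on a big endian host */
void fixup_wav_header(WAV_header_t *header) {
    if (!is_big_endian())
        return;                 /* nothing to do on little endian machines */

    header->ChunkSize = swap32(header->ChunkSize);
    header->AudioFormat = swap16(header->AudioFormat);
    /* ... and so on for every multi-byte field in the header ... */
}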
I've noticed that there are quite a few WAV files out there that have little mistakes in their headers; in particular, the final field data_size (or subchunk2_size), which represents the length of the music data, is often wrong. The correct value is the file size minus the header size, and with that knowledge you can now 'repair' the WAV file.
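A sketch of that repair, with hypothetical names (file_size being the number of bytes actually read from disk, and data_size one of the elided header fields):

/* trust the actual file size over what the header claims */
if (header->data_size != file_size - sizeof(WAV_header_t))
    header->data_size = (uint32_t)(file_size - sizeof(WAV_header_t));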
Now that we've succeeded in loading the .WAV file, feed this data into alBufferData() and we're ready to make some noise.
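A minimal sketch of that last step, assuming a 16-bit stereo sample and hypothetical variables for the loaded data, its size, and the sample rate taken from the header:

ALuint buffer, source;

alGenBuffers(1, &buffer);
alGenSources(1, &source);

/* the format is AL_FORMAT_MONO8/MONO16/STEREO8/STEREO16,
   depending on the channel count and bits per sample in the header */
alBufferData(buffer, AL_FORMAT_STEREO16, wav_data, wav_data_size, sample_rate);

alSourcei(source, AL_BUFFER, buffer);    /* attach the buffer to a source */
alSourcePlay(source);                    /* ... and make some noise */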

Closing remarks
In this blog entry I've shown how you can survive without ALUT. Although SDL_mixer happily handles .mp3s and many other formats, it doesn't give you the stereo and 3D sound that OpenAL does. For OpenAL, you will need to write your own sample loader. Painful as it seems, it is really just a good exercise in handling data formats.
In the devmaster OpenAL tutorials they show how to stream Ogg Vorbis files through OpenAL. Be sure to read all their tutorials, they're pretty straightforward once you've gotten through lesson 1. Also note how they rely on ALUT ... forget about ALUT. Use OpenAL, it sounds good.

Sunday, January 11, 2009

Porting an OpenGL demo to iPhone

You may read this blog entry as "a beginner's guide to OpenGL ES". Like many others, I wanted to write a cute little app for the iPhone. Also, like many others, I know nothing about MacOS X programming (the Cocoa API), so I quickly found myself lost in the woods. I do have experience with SDL and OpenGL, however, so I decided to port the infamous "WHYz!" demo to iPhone, using Apple's Xcode and iPhone simulator.

History of the WHYz! demo
The WHYz! demo is a story by itself. It is included here as background information and for the sake of story-telling. Skip this section if you are interested in the OpenGL ES code only.
Way back in them good old days, 1996, I wrote the first WHYz! demo on a 33 MHz 486 PC. It was meant as an electronic birthday card. During a dark night of coding, I rewrote the whole thing in 100% assembly code, hoping to squeeze out a few extra frames.
In 1997, I ported the C version to Linux using svgalib. I also played around with Qt around that time, but I don't think I ever made a Qt port of the WHYz! demo.
In early 2006, I was first trying out SDL and made an SDL implementation of the WHYz! demo.
In 2007, I made a completely new rewrite in SDL/GL. This version is different from the original, but demonstrates the same kind of wave effect.
Now the wavy WHYz! demo comes to the iPhone with OpenGL ES.

Taking the first hurdle: Objective-C
The iPhone's operating system closely resembles MacOS X. In MacOS X, applications are typically written in Objective-C, an object-oriented dialect of C. When you see Objective-C code for the first time in your life, your stomach will turn, your eyes won't know where to look, and you will feel dizzy. Remember this: Objective-C is much like C++, only with a different syntax. Readers of this blog may know I have already expressed my aversion to C++; well, Objective-C is even less pretty on the outside. On the other hand, the first impression is not everything. Objective-C seems to come with a very nice API that is going to cost time and dedication to master. At this very moment, life is too short to spend valuable time on this, so lucky for us, it is possible to mix standard C with Objective-C, just like it is possible to write mostly standard C in a "C++" source file and only benefit from the handy "// comment" syntax.
So, we will have Xcode generate the startup code for the main application for us, and call our good old standard C code from there. Cheating? No, just not rewriting working code in a dialect that I don't speak.

Taking the second hurdle: SDL for iPhone
As my existing code is all SDL/GL, I really need SDL for iPhone. How else would we set up the screen, handle input events from the operating system, play sound, and finally, render our OpenGL scene? As of yet, the answer is ... you don't. Maybe I didn't look hard enough, but I did not find an SDL library for iPhone. It appears to be in development. For this demo, I managed to do without SDL, but things will be much harder when trying to port a full-blown game.

For setting up the screen: Let Xcode generate an EAGLView.m for you. Insert your own init_gl() routine to set up the viewport and reset the projection and model view matrices (see the sketch after this list).
Input events: The demo doesn't need input, but I'm sure Apple has some neat CoreInput or whatever framework for this.
Play sound: Not used in the demo, but they say you should use OpenAL, which sounds nice. OpenAL is portable and doesn't look hard to use.
Render OpenGL scene: Insert your own draw_scene() code into the EAGLView drawView() method.
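Here is a minimal sketch of what such an init_gl() could look like for a flat, orthogonal 2D setup; the function signature and the coordinate range are my own choices, not something Xcode generates for you:

#include <OpenGLES/ES1/gl.h>

/* set up the viewport and a simple orthogonal projection;
   width and height are the dimensions of the EAGLView in pixels */
void init_gl(int width, int height) {
    glViewport(0, 0, width, height);

    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    glOrthof(-1.0f, 1.0f, -1.0f, 1.0f, -1.0f, 1.0f);   /* note: glOrthof, not glOrtho */

    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
}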

All done? NO, the best is yet to come. Hit the "Build and go" button in Xcode and you will get a ton of errors on OpenGL calls.

Hurdle #3: The limitations of OpenGL ES
OpenGL ES stands for "OpenGL for Embedded Systems". Mobile devices are always relatively slow, simple, low-power computers. In the beginning, there would not even be an FPU. If your mobile device has no FPU and its OpenGL ES implementation only supports fixed point, you are stuck with fixed-point calculations. Lucky for us, modern mobile devices like the iPhone are capable of doing floating point calculations in hardware.
The graphics chips in mobile devices are also really simple compared to their mean big brothers in gaming PCs. The OpenGL ES API has been slimmed down to mirror this.

Just to annoy developers, the great creators have altered a fundamental concept of OpenGL, only to force people into writing more efficient code. Shocking news: in OpenGL ES, there is no glBegin()/glEnd(). There is no glVertex(). There is no glTexCoord(). Argh!
There are only glDrawArrays() and glDrawElements().
(By the way, remember my OpenGL optimization lesson? Use glDrawArrays() to speed things up a little).

There are not even glFrustum() and glOrtho(); their replacements have been made more consistent with the rest of the API: call glFrustumf() and glOrthof().

There is no polygon mode, so no more cool debugging in wireframe (my absolute favorite polygon drawing mode). Well, you can do it, but not through glPolygonMode().

There are no quads; only GL_TRIANGLES, GL_TRIANGLE_STRIP, and GL_TRIANGLE_FAN remain.

There are no display lists; use a subroutine as a workaround.

You can not push/pop attributes. The programmer should know what the state is at all times, and should know what he's doing anyway. Just like in the real world.

There is no full set of color manipulation routines, only glColor4f().

Most of the integer and double variants of calls are gone; in practice you end up passing GLfloat everywhere.

Oh yeah, and you have to find the right include somewhere, too.

This leaves you with a rather big mess of perfectly good standard OpenGL code that just doesn't work in OpenGL ES. I guess you could write a set of macros that converts existing code to ES, but I'd rather stay away from that headache. Nothing remains but to go through the code and do the tedious work of converting each and every block of OpenGL code to ES.

Taking hurdles: Converting existing OpenGL code
Converting the existing code is mostly looking for glBegin()s and putting all glVertex()s into a vertex array. Finally, call glDrawArrays(). I will give a 2D example for the sake of simplicity:
/* old code, in some kind of orthogonal coordinate system */

#include "SDL_opengl.h"

glBegin(GL_TRIANGLE_STRIP);
glTexCoord2i(0, 0);
glVertex2i(-1, 1);
glTexCoord2i(0, 1);
glVertex2i(-1, -1);
glTexCoord2i(1, 0);
glVertex2i(1, 1);
glTexCoord2i(1, 1);
glVertex2i(1, -1);
glEnd();
New:
#include <OpenGLES/ES1/gl.h>

GLfloat vertex_arr[8] = {
    -1.0f,  1.0f,
    -1.0f, -1.0f,
     1.0f,  1.0f,
     1.0f, -1.0f
};
GLfloat tex_arr[8] = {
    0.0f, 0.0f,
    0.0f, 1.0f,
    1.0f, 0.0f,
    1.0f, 1.0f
};

glEnableClientState(GL_VERTEX_ARRAY);
glEnableClientState(GL_TEXTURE_COORD_ARRAY);

glVertexPointer(2, GL_FLOAT, 0, vertex_arr);
glTexCoordPointer(2, GL_FLOAT, 0, tex_arr);

glDrawArrays(GL_TRIANGLE_STRIP, 0, 4);
I was lucky that my code already used triangle strips ...

If you use display lists, convert those to subroutines. You will lose the performance benefit that glCallLists() has. My guess is that mobile devices lack the GL "compiler" and cache memory that are needed for display lists.
Simply delete calls that set the polygon mode. If you need to draw lines ... pass GL_LINES to glDrawArrays().
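As a small sketch, here is the outline of the quad from the example above, drawn with GL_LINE_LOOP (the closed-loop sibling of GL_LINES):

GLfloat outline[8] = {
    -1.0f,  1.0f,
     1.0f,  1.0f,
     1.0f, -1.0f,
    -1.0f, -1.0f
};

glEnableClientState(GL_VERTEX_ARRAY);
glVertexPointer(2, GL_FLOAT, 0, outline);
glDrawArrays(GL_LINE_LOOP, 0, 4);    /* four vertices, drawn as a closed outline */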
Finally, search your source for "glColor" and replace every call with glColor4f().
glColor3ub(0xff, 0x80, 0);
becomes:
glColor4f(1.0f, 0.5f, 0.0f, 1.0f);

The final hurdle: Reaching the finish line
When I finally got the code to compile, it still did not work. One of the problems was that the textures did not load. The thing is, on MacOS X, applications and data files are part of a bundle. If you want to load a texture, you have to access the bundle using the NSBundle class in the Cocoa library. Since my code is based on fopen(), I chose to cheat and not use the NSBundle class. At startup, I do a chdir(dirname(argv[0])) and find my texture files there. This is quite ugly, as it manipulates argv[0].
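A minimal sketch of that trick (keeping it exactly as ugly as described; note that dirname() may modify the string you pass it):

#include <libgen.h>     /* dirname() */
#include <unistd.h>     /* chdir() */

int main(int argc, char *argv[]) {
    /* cheat: change into the directory the executable lives in,
       so that plain fopen() finds the texture files in the bundle */
    chdir(dirname(argv[0]));

    /* ... rest of the startup code ... */
    return 0;
}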

Another gotcha was pretty obvious after I figured it out ... I had deleted the SDL resize_window() code from the source. This routine also (re)initialized OpenGL in the proper way ... so I was trying to start up without having set the right coordinate system (oops).
When this was fixed, I could now happily enjoy watching the WHYz! demo in the iPhone simulator. (Yeah, it's time to go buy a real iPhone ...)