pontifikas

Question About GL.TexImage2D and the PixelFormat argument

I'm taking my first steps with texturing. I saw in the example you provide that you call GL.TexImage2D with the PixelFormat argument set to Bgra:

...
BitmapData data = bitmap.LockBits(
    new System.Drawing.Rectangle(0, 0, bitmap.Width, bitmap.Height),
    ImageLockMode.ReadOnly,
    System.Drawing.Imaging.PixelFormat.Format32bppArgb);

GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba,
    data.Width, data.Height, 0,
    OpenTK.Graphics.OpenGL.PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);

bitmap.UnlockBits(data);
...

Why is this so? To my understanding, the Bitmap loaded by new Bitmap(filename) has a Format24bppRgb pixel format, which matches the Rgba value passed as the PixelInternalFormat.
I saw the same thing in a C# conversion of a NeHe texture tutorial.
Why is this happening? Is the bitmap created with the color bytes reversed?

Thanks in advance.


Comments

the Fiddler
Quote:

Why is this so? To my understanding, the Bitmap loaded by new Bitmap(filename) has a Format24bppRgb pixel format, which matches the Rgba value passed as the PixelInternalFormat.

Not really: the LockBits call above requests Format32bppArgb (GDI+ converts the data if the file was 24bpp), and that format matches Bgra rather than Rgba. As to why... no idea. Bgra seems to be the convention used for both Bitmaps and OpenGL textures.
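
If you want to see the layout for yourself, here is a minimal sketch (the 1x1 bitmap and the channel values are just for illustration) that locks a Format32bppArgb bitmap and dumps its raw bytes; on a little-endian machine they come out B, G, R, A:

using System;
using System.Drawing;
using System.Drawing.Imaging;
using System.Runtime.InteropServices;

class ByteOrderDemo
{
    static void Main()
    {
        // One pixel with distinct channel values: A=0x11, R=0x22, G=0x33, B=0x44.
        using (var bitmap = new Bitmap(1, 1, PixelFormat.Format32bppArgb))
        {
            bitmap.SetPixel(0, 0, Color.FromArgb(0x11, 0x22, 0x33, 0x44));

            BitmapData data = bitmap.LockBits(
                new Rectangle(0, 0, 1, 1),
                ImageLockMode.ReadOnly,
                PixelFormat.Format32bppArgb);

            byte[] raw = new byte[4];
            Marshal.Copy(data.Scan0, raw, 0, 4);
            bitmap.UnlockBits(data);

            // Prints "44-33-22-11" on little-endian x86, i.e. B, G, R, A in memory.
            Console.WriteLine(BitConverter.ToString(raw));
        }
    }
}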

The driver will convert from PixelFormat to PixelInternalFormat if the two do not match. In this case, PixelInternalFormat.Rgba is typically stored as Bgra internally, so no conversion takes place when you upload Bgra data.
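
For contrast, here is a hedged sketch of what you would have to do to upload with PixelFormat.Rgba instead: swizzle the red and blue bytes yourself before the TexImage2D call (SwapRedBlue is a hypothetical helper, not part of OpenTK):

// Hypothetical helper: swap the R and B bytes of 32-bit BGRA pixel data in place,
// turning it into RGBA so it could be uploaded with
// OpenTK.Graphics.OpenGL.PixelFormat.Rgba. With Bgra you skip this work entirely.
static void SwapRedBlue(byte[] pixels)
{
    for (int i = 0; i < pixels.Length; i += 4)
    {
        byte blue = pixels[i];      // bytes sit in memory as B, G, R, A
        pixels[i] = pixels[i + 2];  // B slot <- R
        pixels[i + 2] = blue;       // R slot <- B
    }
}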

Curiously, a recent thread on gamedev.net suggested that floating-point (64- or 128-bit) textures are stored as Rgba in video memory, something that could come in handy if you ever work with HDR textures.

bpunsky

It's actually not true that Bgra matches Argb. The reason you use Bgra when the image's format is Argb has to do with the way OpenGL reads bytes out of memory.

You have to invert it because your PC (and, hopefully, every computer your program will run on) is little endian. Format32bppArgb describes the channel order inside a 32-bit integer, and a little-endian machine stores that integer's bytes in reverse, so they sit in memory as B, G, R, A; OpenGL with PixelType.UnsignedByte reads those bytes in memory order, hence Bgra.

If you were planning on distributing to big-endian platforms, you'd have to add a check in your program for the machine's endianness and adjust your pixel format (or pixel type) accordingly, as sketched below.
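
A minimal sketch of such a check, assuming the same upload code as above; the packed-type alternative in the second half follows from the wiki page linked in the EDIT below, since packed pixel types are interpreted in host byte order:

// Sketch only: with PixelType.UnsignedByte, OpenGL reads plain bytes in memory
// order, so the right PixelFormat depends on how the host laid the bytes out.
bool littleEndian = BitConverter.IsLittleEndian;  // true on x86/x64

// Alternative that avoids the check: a packed pixel type reads whole 32-bit
// integers in host order, so "ARGB in a native int" (Format32bppArgb) uploads
// correctly on either endianness.
GL.TexImage2D(TextureTarget.Texture2D, 0, PixelInternalFormat.Rgba,
    data.Width, data.Height, 0,
    OpenTK.Graphics.OpenGL.PixelFormat.Bgra,
    PixelType.UnsignedInt8888Reversed,  // GL_UNSIGNED_INT_8_8_8_8_REV
    data.Scan0);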

EDIT:

I found my source for this info, which, helpfully, is the OpenGL wiki itself.

http://www.opengl.org/wiki/Pixel_Transfer#Endian_issues

It seems, from what the page has to say, that the reversal comes from packed pixel types, which are interpreted in the host's byte order rather than as a fixed byte sequence.