Spader's picture

FrameBuffer to bitmap problem.

Hello, I am experiencing a bit of a problem when trying to render off-screen and save that data as a bitmap. What I currently do is create a frame buffer object, bind a render buffer and a texture to it, then perform the rendering. Then I create the bitmap and read the buffer into it.

This works fine on most of the machines and video cards I have tried it on. It seems to work without issue on the following video cards:
GeForce 8400 GS (driver version 6.14.0011.9621)
Radeon XPRESS 200 Series (driver version 6.14.0010.6869)
Quadro FX 370 (driver version 6.14.0011.9166)
nVidia 7050 PV / nForce 630a (driver version 6.14.0011.9166)
Radeon XPRESS 1100 (driver version 6.14.0010.6641)
GeForce 7300 GS (driver version 6.14.0010.9371)
GeForce 8500 GT (driver version 6.14.0011.9621)
GeForce 8600 GT (driver version 6.14.0011.8250)

These are on-board chips and video cards tested across 5 different computers.

The one machine where the off-screen rendering seems to fail was tested with these video cards/on-board chips:
Radeon 3100 Graphics (driver version 6.14.0010.7050)
Radeon 2600 Pro (driver version 6.14.0010.7050)

What seems to happen is that when I call GL.ReadPixels, the data saved into the bitmap is corrupted.
I included two images that were created on the machine that fails. The images are of the same off-screen rendering, but each one was created on one of the different video cards. I did resize the images using Paint, so they are somewhat distorted, but the "corrupted" look is what I am trying to convey.

Because it seems to be a single machine, and the two video cards tested in it are different, it could just be something weird with those versions of video cards/chips, or something wrong with the machine itself. I have not had the chance to try the 2600 Pro in a different machine yet.

The other option, which is what I'm mainly asking about, is that I am doing something wrong when I create the frame buffer. Below, I will include the relevant code. I will only post snippets for now, unless more code is needed.

This is the code that is called to create the bitmap.
Renderer is an interface wrapper that exposes some generalized rendering functions. The class this code belongs to is OpenGL-specific; the other rendering code was written with more general goals in mind.

        public System.Drawing.Bitmap GetBitmap(int width, int height)
        {
            System.Drawing.Bitmap bitmap = null;
 
            if (DrawFunction != null && Renderer != null)
            {
                RenderStart(width, height);
                bitmap = new System.Drawing.Bitmap(width, height);
                System.Drawing.Imaging.BitmapData data = bitmap.LockBits(new System.Drawing.Rectangle(0, 0, width, height),   System.Drawing.Imaging.ImageLockMode.WriteOnly, System.Drawing.Imaging.PixelFormat.Format24bppRgb);
 
#if DEBUG
                {
                    var errorCode = Renderer.Error();
                    if (errorCode != OpenTK.Graphics.OpenGL.ErrorCode.NoError)
                    {
                        throw new Exception("OpenGLImager GetBitmap error: " + errorCode);
                    }
                }
#endif
 
                GL.ReadBuffer(ReadBufferMode.ColorAttachment0);
 
#if DEBUG
                {
                    var errorCode = Renderer.Error();
                    if (errorCode != OpenTK.Graphics.OpenGL.ErrorCode.NoError)
                    {
                        throw new Exception("OpenGLImager GetBitmap error: " + errorCode);
                    }
                }
#endif
 
                GL.ReadPixels(0, 0, width, height, PixelFormat.Bgr, PixelType.UnsignedByte, data.Scan0);
 
#if DEBUG
                {
                    var errorCode = Renderer.Error();
                    if (errorCode != OpenTK.Graphics.OpenGL.ErrorCode.NoError)
                    {
                        throw new Exception("OpenGLImager GetBitmap error: " + errorCode);
                    }
                }
#endif
 
                bitmap.UnlockBits(data);
                bitmap.RotateFlip(System.Drawing.RotateFlipType.RotateNoneFlipY);
                RenderEnd();
            }
 
            return bitmap;
        }
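For what it's worth, a slightly more defensive variant of the readback might help pin down driver differences: finish pending rendering first, and read BGRA into a 32-bit bitmap so the locked bitmap's stride always matches GL's default 4-byte pack alignment (with 24-bit rows the two paddings agree only because both happen to round rows up to 4 bytes). This is a sketch under those assumptions, not a drop-in fix:

```csharp
// Defensive readback sketch: 32-bit rows are always a multiple of
// 4 bytes, so BitmapData.Stride and GL's default PackAlignment of 4
// can never disagree, regardless of width.
var bitmap = new System.Drawing.Bitmap(width, height);
var data = bitmap.LockBits(
    new System.Drawing.Rectangle(0, 0, width, height),
    System.Drawing.Imaging.ImageLockMode.WriteOnly,
    System.Drawing.Imaging.PixelFormat.Format32bppArgb);

GL.Finish();                                     // ensure rendering into the FBO has completed
GL.ReadBuffer(ReadBufferMode.ColorAttachment0);  // read from the color attachment
GL.ReadPixels(0, 0, width, height,
    PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);

bitmap.UnlockBits(data);
bitmap.RotateFlip(System.Drawing.RotateFlipType.RotateNoneFlipY);
```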

This is the main function that sets up the off-screen rendering.
InitViewFunction and DrawFunction are just delegates that allow classes using this class to hook their own code into it.
The code called through them should not be the problem (I hope). Normally InitViewFunction is left null (for now), and DrawFunction renders whatever is to be turned into the bitmap.

        private void RenderStart(int width, int height)
        {
            if (Renderer != null && DrawFunction != null)
            {
                GL.MatrixMode(MatrixMode.Projection);
                GL.PushMatrix();
                GL.LoadIdentity();
                GL.MatrixMode(MatrixMode.Modelview);
                GL.PushMatrix();
                GL.LoadIdentity();
 
                if (InitViewFunction != null)
                {
                    InitViewFunction(Renderer, Parameter);
 
#if DEBUG
                    {
                        var errorCode = Renderer.Error();
                        if (errorCode != OpenTK.Graphics.OpenGL.ErrorCode.NoError)
                        {
                            throw new Exception("OpenGLImager InitViewFunction error: " + errorCode);
                        }
                    }
#endif
                }
 
                if (frameBuffer == null)
                {
                    frameBuffer = new FrameBuffer();
                }
 
                frameBuffer.Resize(width, height);
                frameBuffer.Bind();
 
#if DEBUG
                {
                    var errorCode = Renderer.Error();
                    if (errorCode != OpenTK.Graphics.OpenGL.ErrorCode.NoError)
                    {
                        throw new Exception("OpenGLImager FrameBuffer error: " + errorCode);
                    }
                }
#endif
 
                GL.PushAttrib(AttribMask.ViewportBit);
                GL.Viewport(0, 0, width, height);
                DrawFunction(Renderer, Parameter);
                GL.PopAttrib();
 
#if DEBUG
                {
                    var errorCode = Renderer.Error();
                    if (errorCode != OpenTK.Graphics.OpenGL.ErrorCode.NoError)
                    {
                        throw new Exception("OpenGLImager DrawFunction error: " + errorCode);
                    }
                }
#endif
            }
        }

This is the rendering clean-up function:

        private void RenderEnd()
        {
            if (Renderer != null && frameBuffer != null)
            {
                frameBuffer.UnBind();
 
                GL.DrawBuffer(DrawBufferMode.Back);
 
                GL.MatrixMode(MatrixMode.Modelview);
                GL.PopMatrix();
                GL.MatrixMode(MatrixMode.Projection);
                GL.PopMatrix();
                GL.MatrixMode(MatrixMode.Modelview);
            }
        }

This is the code that creates the frame buffer object (where I think the problem is).

        public FrameBuffer()
        {
            Init();
        }
 
        public void Init()
        {
            if (FrameBufferId != -1)
            {
                throw new Exception("FrameBuffer was already defined. Free it first.");
            }
 
            GL.PushAttrib(AttribMask.LightingBit | AttribMask.LineBit | AttribMask.TextureBit | AttribMask.EnableBit | AttribMask.CurrentBit);
            GL.Ext.GenFramebuffers(1, out fboId);
            Bind();
 
            GL.Ext.GenRenderbuffers(1, out depthBufferId);
            GL.Ext.BindRenderbuffer(RenderbufferTarget.RenderbufferExt, DepthBufferId);
            GL.Ext.RenderbufferStorage(RenderbufferTarget.RenderbufferExt, (RenderbufferStorage)All.DepthComponent, DepthWidth, DepthHeight);
            GL.Ext.BindRenderbuffer(RenderbufferTarget.RenderbufferExt, 0);
            GL.Ext.FramebufferRenderbuffer(FramebufferTarget.FramebufferExt, FramebufferAttachment.DepthAttachmentExt, RenderbufferTarget.RenderbufferExt, DepthBufferId);
            Texture = new OpenGLTexture();
            Texture.Resize(DepthWidth, DepthHeight);
            Texture.UnBind();
 
            GL.Ext.FramebufferTexture2D(FramebufferTarget.FramebufferExt, FramebufferAttachment.ColorAttachment0Ext, TextureTarget.Texture2D, Texture.TextureId, 0);    
 
            GL.PopAttrib();
            UnBind();
        }
 
        public void Resize(int width, int height)
        {
            GL.PushAttrib(AttribMask.EnableBit | AttribMask.TextureBit);
 
            if (FrameBufferId != -1 && DepthBufferId != -1)
            {
                DepthWidth = width;
                DepthHeight = height;
 
                GL.Ext.BindRenderbuffer(RenderbufferTarget.RenderbufferExt, DepthBufferId);
                GL.Ext.RenderbufferStorage(RenderbufferTarget.RenderbufferExt, (RenderbufferStorage)All.DepthComponent, DepthWidth, DepthHeight);
                GL.Ext.BindRenderbuffer(RenderbufferTarget.RenderbufferExt, 0);
 
                if (Texture == null)
                {
                    Texture = new OpenGLTexture();                    
                }
 
                Texture.Resize(width, height);
                Texture.UnBind();
            }
 
            GL.PopAttrib();
        }
 
        public void Bind()
        {
            IsBound = true;
            GL.Ext.BindFramebuffer(FramebufferTarget.FramebufferExt, FrameBufferId);
        }
 
        public void UnBind()
        {
            GL.Ext.BindFramebuffer(FramebufferTarget.FramebufferExt, 0);
            IsBound = false;
        }
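One thing the Init()/Resize() code above never does is verify framebuffer completeness. A driver can accept an FBO setup without raising a GL error and still treat it as incomplete or fall back to something odd, which could plausibly produce garbage on some hardware only. A hedged sketch of a check that could go just before the UnBind() call in Init(), using the same EXT entry points the class already uses:

```csharp
// Query completeness while the FBO is still bound; an incomplete FBO
// does not necessarily show up through GL.GetError.
var status = GL.Ext.CheckFramebufferStatus(FramebufferTarget.FramebufferExt);
if (status != FramebufferErrorCode.FramebufferCompleteExt)
{
    throw new Exception("FrameBuffer incomplete: " + status);
}
```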

This is the code that creates a wrapper for a texture.
TextureType is always TextureTarget.Texture2D for now.
textureUnit is always TextureUnit.Texture0 for now.
ChannelBitDepth is always PixelInternalFormat.Rgba8 for now.

        public OpenGLTexture()
        {
            Init();
        }
 
        public void Init()
        {
            if (TextureId != -1)
            {                
                throw new Exception("Texture was already defined. Free it first.");
            }
 
            GL.Enable(TextureType);
            GL.GenTextures(1, out texId);
            Bind();
            GL.TexParameter(TextureTarget, TextureParameterName.TextureWrapS, (int)TextureWrapMode.Clamp);
            GL.TexParameter(TextureTarget, TextureParameterName.TextureWrapT, (int)TextureWrapMode.Clamp);
            GL.TexParameter(TextureTarget, TextureParameterName.TextureMinFilter, (int)All.Nearest);
            GL.TexParameter(TextureTarget, TextureParameterName.TextureMagFilter, (int)All.Nearest);
            Reset();
        }
 
        public void Reset()
        {
            if (TextureId != -1)
            {
                PixelType pixelType = PixelType.Float;
                PixelFormat pixelFormat = PixelFormat.Rgba;
                switch (ChannelBitDepth)
                {
                    case PixelInternalFormat.Rgba8:
                        pixelType = PixelType.UnsignedByte;
                        break;
 
                    case PixelInternalFormat.Rgba12:
                    case PixelInternalFormat.Rgba16:
                    case (PixelInternalFormat)ArbTextureFloat.Rgba32fArb:
                        pixelType = PixelType.Float;
                        break;
 
                    case PixelInternalFormat.Rgb8:
                        pixelType = PixelType.UnsignedByte;
                        pixelFormat = PixelFormat.Rgb;
                        break;
 
                    case PixelInternalFormat.Rgb12:
                    case PixelInternalFormat.Rgb16:
                    case (PixelInternalFormat)ArbTextureFloat.Rgb32fArb:
                        pixelType = PixelType.Float;
                        pixelFormat = PixelFormat.Rgb;
                        break;
 
                    default:
                        break;
                }
 
                if (!IsBound)
                {
                    Bind();
                }
 
                GL.TexImage2D(TextureTarget, 0, ChannelBitDepth, Width, Height, 0, pixelFormat, pixelType, IntPtr.Zero);
            }
        }
 
        public void Bind()
        {
            IsBound = true;
            GL.Enable(TextureType);
            //GL.ActiveTexture(textureUnit);
            GL.BindTexture(TextureTarget, TextureId);
        }
 
        public void UnBind()
        {
            GL.Disable(TextureType);
            IsBound = false;
        }

There are no OpenGL errors from what I can tell. The places I check all return NoError for every machine and video card tested, even those that created the "corrupted" images.

I pieced the setup together from examples found in various sources, so the setup and creation functions might well be the problem.

**edit**

Forgot to mention that GL.ReadPixels works fine if I read from the screen and not from a frame buffer. That works on all the machines/video cards, which is why I believe the problem is related to the frame buffer.

So creating a bitmap of what is on screen works perfectly fine on all the tested hardware. It is just the creation of a bitmap from the frame buffer that fails on the hardware mentioned above.

I hope I provided enough relevant information. If extra information is needed, please ask and I will provide as much information as I can.
I would appreciate any feedback and help.


Comments

Spader's picture

*didn't notice the edit*

zahirtezcan's picture

The framebuffer color attachment is a 32-bit texture, but you are trying to read it as 24-bit samples. For the default (on-screen) case, I assume the initiated GraphicsMode has a 24-bit back buffer.

the Fiddler's picture

This could indeed be the issue. Try using System.Drawing.Imaging.PixelFormat.Format32bppRgba and PixelFormat.Bgra wherever possible.

You should also place a call to GL.Finish() before calling GL.ReadPixels. Some drivers may do that automatically but others will simply give incomplete results back (which is the correct approach as far as I am concerned).

Mincus's picture

I have working code for this in my Mandelbrot/Julia fractal renderer.

        private void saveImageToolStripMenuItem_Click(object sender, EventArgs e)
        {
            SaveFileDialog dialog = new SaveFileDialog(); // Create a save file dialog.
 
            dialog.Filter = "Bitmaps|*.bmp";
 
            if (dialog.ShowDialog() == DialogResult.OK)
            {   // If the user selected a filename and clicked ok.
                Bitmap bmp = new Bitmap(FboSize.Width, FboSize.Height); // Create a bitmap.
 
                System.Drawing.Imaging.BitmapData bmpdata = bmp.LockBits(new Rectangle(new Point(), FboSize), System.Drawing.Imaging.ImageLockMode.WriteOnly, System.Drawing.Imaging.PixelFormat.Format32bppArgb); // Lock the bitmap.
 
                GL.Ext.BindFramebuffer(FramebufferTarget.FramebufferExt, FrameBuffer); // Move to the FBO.
                GL.ReadBuffer((ReadBufferMode)FramebufferAttachment.ColorAttachment0Ext); // Set up where to read the pixels from.
                GL.ReadPixels(0, 0, FboSize.Width, FboSize.Height, PixelFormat.Bgra, PixelType.UnsignedByte, bmpdata.Scan0); // Read the pixels into the bitmap.
                GL.Ext.BindFramebuffer(FramebufferTarget.FramebufferExt, 0); // Move back to the default framebuffer.
                GL.ReadBuffer(ReadBufferMode.Back); // Set the read buffer to the back (I don't think this is necessary, but cleaning up is generally a good idea).
 
                bmp.UnlockBits(bmpdata); // Unlock the bitmap data.
 
                bmp.RotateFlip(RotateFlipType.RotateNoneFlipY); // Flip the bitmap as OpenGL stores it upside down.
 
                bmp.Save(dialog.FileName, System.Drawing.Imaging.ImageFormat.Bmp); // Save the bitmap using the filename we were given.
            }
        }

I suspect there are a few unnecessary state changes in there, and I use the Ext calls instead of the plain ones, but the code does work as intended, so I hope it helps.
I think the key issue is, as the Fiddler states, that you need to match the PixelFormats correctly. Since you are getting some data out, this is the most likely explanation.

If you take a look at the full project, there is the code for all the other framebuffer creation/rendering/etc as well, so if this snippet doesn't help there might be something else in there that does.

Spader's picture

To add a bit more information: I have tried the code on a newer ATI card in the 4000 series (I think it was a Radeon HD 4870) and it seemed to work fine. So the problem only seems to appear on R600 ATI cards; R520 and R700 do not show this behavior.

I can no longer run the code on the machine in question (with the 2600 Pro). When the code that creates the bitmap from the frame buffer is called, the screen goes black and an ATI VPU message box pops up saying the display driver is not responding. Sometimes I can recover from this; other times the screen cycles between going black and showing that message box, and I am forced to reboot.

I will uninstall and reinstall the drivers on that machine again, to see if I can get the code to at least not cause the machine to freak out this much.

To reiterate, this behavior only appears on this machine (with a 2600 Pro) or on any machine we have that uses the on-board Radeon 3100 Graphics. The other machines with the on-board Radeon 3100 Graphics do not seem to suffer from the black screen and VPU message box issue. And any other machine we have does not experience this problem at all; it will create the bitmap without apparent issue.

I have tried reading 32 bits instead of 24, but the resulting bitmap does not reflect what is on screen; I get a washed-out look. I will include a screen shot of the on-screen image and the image created through the code above. Both are rendered with the same code. In the frame buffer I am not rendering the gradient background or the filled-in image at all, only the red and blue lines. But the red and blue lines are rendered exactly the same way; the frame buffer version just appears to have alpha problems.

Changed the code to look like this (I could not see a Format32bppRgba in the pixel format enum):

System.Drawing.Imaging.BitmapData data = bitmap.LockBits(new System.Drawing.Rectangle(0, 0, width, height), System.Drawing.Imaging.ImageLockMode.WriteOnly, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
GL.ReadPixels(0, 0, width, height, PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);

I also included a call to GL.Finish(). This used to be in the code but was removed because it would always cause the Radeon 3100 Graphics/2600 Pro machine to render WPF images with artifacts after the call. Before the call to GL.Finish(), WPF would render any images without artifacts; after the call, every image on screen outside the OpenGL control would appear with weird pixelated artifacts. Not calling GL.Finish() seemed to stop that from happening and did not seem to affect the other machines/video cards.

**edit**

These images were not created on the machines with on-board 3100 Graphics/2600 Pro. They were created on my dev machine. Just wanted to clarify that. The on-board 3100 Graphics/2600 Pro still has issues (black screen freak out).

Attachments:
glave frame buffer.PNG (38.21 KB)
glave visible screen.PNG (38.55 KB)
Spader's picture

OK, so I took another look at which settings I was turning on and how I was defining my texture. I changed my texture to be created with the following:

GL.TexImage2D(TextureTarget, 0, ChannelBitDepth, Width, Height, 0, pixelFormat, pixelType, IntPtr.Zero);

where ChannelBitDepth is PixelInternalFormat.Rgba, pixelFormat is PixelFormat.Rgba, and pixelType is PixelType.UnsignedByte.

This is the code for when I read the pixels from the frame buffer.

                bitmap = new System.Drawing.Bitmap(width, height);
                System.Drawing.Imaging.BitmapData data = bitmap.LockBits(new System.Drawing.Rectangle(0, 0, width, height),   System.Drawing.Imaging.ImageLockMode.WriteOnly, System.Drawing.Imaging.PixelFormat.Format32bppArgb);
 
                GL.Finish();
                GL.ReadBuffer(ReadBufferMode.ColorAttachment0);
                GL.ReadPixels(0, 0, width, height, PixelFormat.Bgra, PixelType.UnsignedByte, data.Scan0);

The rest of the code stayed the same as I described in previous posts.

The main cause of the washed-out look in my other post (http://www.opentk.com/files/glave%20frame%20buffer.PNG) was rendering to the frame buffer with LineSmooth turned on. If I turn LineSmooth off, then render to the frame buffer and read the pixels from that buffer into a bitmap, it appears fine, with no washed-out look. I am not sure why LineSmooth causes this. I am rendering with VBOs; I set the line color and thickness, then draw the VBO with line strips. Unless this is a known issue with LineSmooth that I overlooked, I would like to understand why rendering that way with LineSmooth on makes the frame buffer image appear like that, while what appears on screen looks fine.

I do turn the bitmap into a PNG using a PngBitmapEncoder. The resulting images all appear fine on my machine now; there is no longer any washed-out look.

When I run this code on a machine with on-board 3100 Graphics, the resulting image appears to have artifacts. Like there is some kind of noise in the buffer.

Below are three images. The 8600 GT image was made on my dev machine; the other two were made on another machine with the specified graphics. This is not the same machine I talked about before (it has a newer motherboard), but it is the same 2600 Pro card.

This noise/corruption issue still only appears on R600-generation graphics. On screen all machines display fine; the problem only occurs when I render to a frame buffer and try to get an image from that frame buffer.
Any explanation for this behavior would be greatly appreciated.

Attachments:
Ati 3100 Graphics.PNG (103.97 KB)
Radeon HD 2600 PRO.PNG (492.87 KB)
nVidia 8600 GT.PNG (13.47 KB)
zahirtezcan's picture

Do you call GL.Clear after binding the framebuffer (e.g. in DrawFunction)? And if so, what is your ClearColor set to?

Spader's picture

I do call Clear. I use black with 0 alpha, so (0.0f, 0.0f, 0.0f, 0.0f). Could it have anything to do with the texture I generate containing garbage data, with the drivers on the working machines initializing the memory for you?

I am not at work, so I cannot test this until Monday; I am just wondering if it is a possibility. I do render with blending on, with the same blend function I use for normal rendering:

GL.BlendFunc(BlendingFactorSrc.SrcAlpha, BlendingFactorDest.OneMinusSrcAlpha);

Also, any feedback on the LineSmooth behavior would be greatly appreciated. If possible I would like the image created from off-screen rendering to be as close as possible to what is displayed on screen.
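On the LineSmooth question, one possible explanation (a guess, not verified against this hardware): blending with SrcAlpha/OneMinusSrcAlpha also writes the fractional source alpha of smoothed line edges into the FBO's alpha channel. On screen that alpha channel is simply ignored, but it becomes visible once the texture is read back into an ARGB bitmap. If that is the cause, blending alpha separately so the destination stays opaque might help:

```csharp
// Same color blending as before, but blend alpha with the standard
// "over" coverage equation so destination alpha accumulates toward 1
// instead of copying the fractional edge alpha into the FBO.
GL.BlendFuncSeparate(
    BlendingFactorSrc.SrcAlpha, BlendingFactorDest.OneMinusSrcAlpha, // RGB
    BlendingFactorSrc.One, BlendingFactorDest.OneMinusSrcAlpha);     // alpha
```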

c2woody's picture
Quote:

I do call Clear. I use black with 0 alpha, so (0.0f,0.0f,0f,0.0f).

Just a side remark: for clearing you should set alpha to 1.0f, and when testing use some value different from black (like pink to check if the correct buffer is addressed).

Spader's picture

I tried Clear with white (1.0f, 1.0f, 1.0f, 1.0f) and I get the same results. The image still looks the same as before on the on-board 3100 Graphics, and still looks fine on the 8600 GT card.

I also tried Clear with a deep pink (1.0f, 1.0f, 0.0f, 1.0f). This had unexpected results: the 8600 GT produced an image like I would have guessed, with the background set and all the lines looking correct, while the 3100 Graphics produced the weird results again; the image looks the same as before. I will post both images. The "noise" in the 3100 Graphics image appears to be the same every time; the colored pixels all seem to be in exactly the same spots each time I create an image.

Both machines are running the same code. I use the same installer I created, and I even reran it on the 3100 machine into a different directory to make sure the code is the same as on my dev machine. Just to clarify, I am not running this through a dev IDE; I am running an installed version of the code on both machines, so it should not be possible that the 3100 machine has old code. The date stamps on the files all correspond to the time I created the installer on both machines.

So it seems the 3100 Graphics/2600 Pro do not issue a Clear correctly, or the frame buffer is getting messed up somewhere else.

Any ideas why this would happen? These results seem totally unexpected to me, I would have thought both machines would have created images with the same background color.

Attachments:
nVidia 8600 GT (Pink Clear).PNG (13.53 KB)
Ati 3100 Graphics (Pink Clear).PNG (104.03 KB)