Texture transparency converting from Bitmap

ToastyJustice:

Hi guys,

For a good while I was working on my project using XNA. But since it's a hassle for other people to run on their machines without an installer to deploy all the required platform tools, I've been attempting to switch to OpenTK. However, I've hit a problem loading my textures: pixels with an alpha value of 0 are displayed as opaque black! I've tried googling and looking around for answers, but I've come up empty so far.

Here is the important part of the class I'm using to store the texture:

  public class TexturedResource : IEquatable<int>, IEquatable<TexturedResource>
  {  
    private Int32 id = -1;
    private Bitmap bitmap;
 
    public TexturedResource( Image image )
    {
      Initialize( image );
    }
 
    private void Initialize( Image image )
    {
      this.bitmap = (Bitmap)image;
    }
 
    public Int32 Id
    {
      get { return id; }
      set { id = value; }
    }
 
    private Int32 CreateTextureForImage( Bitmap bitmap )
    {
      Imaging.BitmapData TextureData = bitmap.LockBits(
          new Rectangle( 0, 0, bitmap.Width, bitmap.Height ),
          Imaging.ImageLockMode.ReadOnly,
          Imaging.PixelFormat.Format32bppArgb
          );
 
      Int32 target;
 
      GL.GenTextures( 1, out target );
      GL.BindTexture( TextureTarget.Texture2D, target );
 
      // the following code sets certain parameters for the texture
      GL.TexEnv(
        TextureEnvTarget.TextureEnv,
        TextureEnvParameter.TextureEnvMode,
        (float)TextureEnvMode.Replace
      );
      GL.TexParameter(
        TextureTarget.Texture2D,
        TextureParameterName.TextureMinFilter,
        (float)TextureMinFilter.Linear
      );
      GL.TexParameter(
        TextureTarget.Texture2D,
        TextureParameterName.TextureMagFilter,
        (float)TextureMagFilter.Linear
      );
 
      GL.TexImage2D(
        TextureTarget.Texture2D, 0,
        PixelInternalFormat.Three,
        bitmap.Width, bitmap.Height, 0,
        PixelFormat.Bgra,
        PixelType.UnsignedByte, TextureData.Scan0 );
 
      // free the bitmap data (we don't need it anymore because it has been passed to the OpenGL driver)
      bitmap.UnlockBits( TextureData );
 
      return target;
    }
    ...
}

Now I've made sure to use Bgra and that my bitmaps are in the 32bpp format. After loading the bitmap, I also set the transparency for the fuchsia pixels I want to make transparent, i.e. bitmap.MakeTransparent( System.Drawing.Color.Fuchsia );. I've checked the pixel values in the debugger before and after the TexImage2D call, and the proper pixels are set to TransparentBlack (0, 0, 0, 0). But when I display them they are just black, even with blending on and seemingly working. So I went back to the loading code, and after using GetTexImage, the value read back from the texture comes up as opaque Black (255, 0, 0, 0). Am I missing something here?


Comments

Mincus:

Blending for simple "billboard" sprites in OpenGL has always seemed overcomplicated to me, so I do it the following way:
- Create a mask for the image, with the part you want to render in white and the transparent part in black (if you have a known colour for transparency, this is very easy with the .NET imaging system).
- Upload the mask to OpenGL.
- Upload the texture to OpenGL.

When you come to render do the following:
- Enable blending
- Set the blend function to: GL.BlendFunc(BlendingFactorSrc.Zero, BlendingFactorDest.SrcColor);
- Render the mask
- Set the blend function to: GL.BlendFunc(BlendingFactorSrc.One, BlendingFactorDest.One);
- Render the sprite.

AFAIK, there's no other (simple) way to do it without delving into shaders, but someone else might have a better answer.
You can see this technique working on my latest project here, anyway.
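A minimal sketch of that two-pass approach, assuming an active OpenTK GL context and that maskId and spriteId (both hypothetical names) are texture handles for the mask and sprite uploaded earlier, with DrawQuad() standing in for the textured-quad drawing code:

      // Pass 1: multiply the framebuffer by the mask colour
      // (dest = dest * maskColour), so the masked area is cut out.
      GL.Enable( EnableCap.Texture2D );
      GL.Enable( EnableCap.Blend );
      GL.BlendFunc( BlendingFactorSrc.Zero, BlendingFactorDest.SrcColor );
      GL.BindTexture( TextureTarget.Texture2D, maskId );
      DrawQuad();  // hypothetical helper that emits the textured quad

      // Pass 2: add the sprite on top (dest = dest + spriteColour);
      // its black areas contribute nothing, so the background shows through.
      GL.BlendFunc( BlendingFactorSrc.One, BlendingFactorDest.One );
      GL.BindTexture( TextureTarget.Texture2D, spriteId );
      DrawQuad();

For this to composite cleanly, the sprite texture should be black wherever it is meant to be transparent, matching the mask.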

ToastyJustice:

Although that may indeed work (I could grab the pixels I don't want to display and generate the mask), there has to be something I'm missing in the actual loading/binding, because the textures come back with the incorrect alpha. Also, if in the future I have textures with partial alpha values, the problem would still remain. If desired, I could always post a link to my source code (which is open anyway) for a closer look.

the Fiddler:

I think, but I'm not 100% certain, that GDI+ uses premultiplied alpha. In that case, you need to use GL.BlendFunc(BlendingFactorSrc.One, BlendingFactorDest.OneMinusSrcAlpha).
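For illustration, premultiplying just scales each colour channel by the alpha; a sketch of the conversion for one straight-alpha BGRA pixel (this is not GDI+'s actual internals, only the arithmetic):

      // Convert one straight-alpha BGRA pixel to premultiplied form.
      static byte[] Premultiply( byte b, byte g, byte r, byte a )
      {
          return new byte[]
          {
              (byte)( b * a / 255 ),
              (byte)( g * a / 255 ),
              (byte)( r * a / 255 ),
              a
          };
      }
      // e.g. half-transparent red (B=0, G=0, R=255, A=128)
      // becomes (B=0, G=0, R=128, A=128).

With premultiplied data, the source colour already carries the alpha factor, which is why the blend function drops the SrcAlpha multiplier on the source term.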

See here and here for more information on premultiplied alpha and why it's useful.

ToastyJustice:

Aye, that's the blending mode I've got enabled. Here's my rendering code, actually:

      GL.Clear( ClearBufferMask.ColorBufferBit | ClearBufferMask.DepthBufferBit );
 
      // Other Rendering
 
      GL.Enable( EnableCap.Blend );
      GL.BlendFunc( BlendingFactorSrc.SrcAlpha, BlendingFactorDest.OneMinusSrcAlpha );
 
      GL.Color4( 1f, 1f, 1f, 1f );
      GL.Enable( EnableCap.Texture2D );
 
      Int32 width  = bitmap.Width,
            height = bitmap.Height;
 
      GL.BindTexture( TextureTarget.Texture2D, id );
      GL.Begin( BeginMode.Quads );
 
      GL.TexCoord2( 0.0, 0.0 );
      GL.Vertex2( position.X, position.Y );
 
      GL.TexCoord2( 1.0, 0.0 );
      GL.Vertex2( position.X + width, position.Y );
 
      GL.TexCoord2( 1.0, 1.0 );
      GL.Vertex2( position.X + width, position.Y + height );
 
      GL.TexCoord2( 0.0, 1.0 );
      GL.Vertex2( position.X, position.Y + height );
 
      GL.End();

To demonstrate my problem with the texture values being different after loading, I present to you the following addition to my previous code:

      GL.TexImage2D(
        TextureTarget.Texture2D, 0,
        PixelInternalFormat.Three,
        bitmap.Width, bitmap.Height, 0,
        PixelFormat.Alpha,
        PixelType.UnsignedByte, TextureData.Scan0 );
 
      // free the bitmap data (we don't need it anymore because it has been passed to the OpenGL driver)
      bitmap.UnlockBits( TextureData );
 
      Bitmap chkTexture = new Bitmap( 16, 16, Imaging.PixelFormat.Format32bppArgb );
      Imaging.BitmapData textureData = chkTexture.LockBits(
          new Rectangle( 0, 0, chkTexture.Width, chkTexture.Height ),
          Imaging.ImageLockMode.ReadOnly,
          Imaging.PixelFormat.Format32bppArgb
          );
      GL.GetTexImage( TextureTarget.Texture2D, 0, PixelFormat.Bgra, PixelType.UnsignedByte, textureData.Scan0 );
      chkTexture.UnlockBits( textureData );

This gives me the following results when inspecting the values:

> bitmap.GetPixel(0, 0).ToString();
"Color [A=0, R=0, G=0, B=0]"
> chkTexture.GetPixel(0, 0).ToString();
"Color [A=255, R=0, G=0, B=0]"
zahirtezcan:

PixelInternalFormat determines the memory layout of your texture on the graphics card. You have defined three components (RGB) for your texture, but you need an alpha component as well. Setting the internal format to Rgba (or Four) should do the trick.
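Applied to the loading code above, the upload call would look something like this:

      GL.TexImage2D(
        TextureTarget.Texture2D, 0,
        PixelInternalFormat.Rgba,  // internal format must include alpha,
                                   // or the driver discards it and reads back 255
        bitmap.Width, bitmap.Height, 0,
        PixelFormat.Bgra,
        PixelType.UnsignedByte, TextureData.Scan0 );

The Bgra source format was already correct; it was only the three-component internal format that threw the alpha channel away at upload time.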