
Weird blending issue?

In my game, I'm experimenting with blending, but I'm seeing some odd behavior. My blending function is set to:
GL.BlendFunc(BlendingFactorSrc.SrcAlpha, BlendingFactorDest.OneMinusSrcAlpha);

and I'm trying to draw a quad with a partially-transparent gradient like this:

GL.Begin(BeginMode.Quads); {
    GL.Color4(1, 1, 1, alpha); GL.Vertex2( 0.5f,  0.5f);
    GL.Color4(1, 1, 1, 0); GL.Vertex2(-0.5f,  0.5f);
    GL.Color4(1, 1, 1, alpha); GL.Vertex2(-0.5f, -0.5f);
    GL.Color4(1, 1, 1, 0); GL.Vertex2( 0.5f, -0.5f);
} GL.End();

However, it seems that my calls to GL.Color4(1, 1, 1, 0) aren't actually setting the color to white. I'm getting the same effect as if I had called GL.Color4(0, 0, 0, 0) instead. Here's a screencap of the above quad being drawn over a khaki background. You can see how the gradient fades towards RGBA == (0, 0, 0, 0) instead of RGBA == (1, 1, 1, 0) like I had intended. Any insight?

EDIT: Wow, I figured it out, and boy do I feel dumb! I didn't realize that there are both float and integer overloads of GL.Color4. My literals 1 and 0 are ints, so GL.Color4(1, 1, 1, 0) picked the integer overload, which maps the entire signed-integer range onto the color range; an integer 1 therefore comes out as a component of essentially 0.0, i.e. almost black. But since my alpha parameter is a float, GL.Color4(1, 1, 1, alpha) promoted the other arguments and called the float overload I intended. Feel free to close this thread.
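For anyone who hits the same thing, here's a sketch of the fix: adding the f suffix to every literal forces C# to resolve to the GL.Color4(float, float, float, float) overload, which treats each component as a value in [0, 1]. (Note this assumes legacy immediate-mode OpenGL with blending already enabled, as in the original snippet.)

```csharp
// Float literals force the float overload of GL.Color4,
// so 1f really means full-intensity white and 0f means fully transparent.
GL.Begin(BeginMode.Quads); {
    GL.Color4(1f, 1f, 1f, alpha); GL.Vertex2( 0.5f,  0.5f);
    GL.Color4(1f, 1f, 1f, 0f);    GL.Vertex2(-0.5f,  0.5f);
    GL.Color4(1f, 1f, 1f, alpha); GL.Vertex2(-0.5f, -0.5f);
    GL.Color4(1f, 1f, 1f, 0f);    GL.Vertex2( 0.5f, -0.5f);
} GL.End();
```

With SrcAlpha/OneMinusSrcAlpha blending, the transparent corners now contribute nothing at all, so the quad fades cleanly into the background instead of darkening it toward black.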
