I began to suspect this was the case when I sent a release build of my project to a friend and he told me it didn't render properly. I had a sneaking suspicion that the nVidia XP drivers wouldn't support the same extensions as the ATi ones...
My XP installation (which, it seems, is no longer available, but that's another story involving a dodgy HD) didn't throw any unhandled exceptions - it just rendered a grey textbox, which I'd been using as a floating debug window, over the entire game window. It does this even if I disable the command that draws this FBO as a texture, presumably because the debug window is the last item rendered to a framebuffer before I draw to the visible one. This is exactly what my friend saw on his XP/nVidia system.
VS2008, running in my XP VirtualBox on Ubuntu, throws a NullReferenceException when I try to call the GL.Ext.GenRenderBuffers(...) method. I've no idea why it handles the problem differently, but it's actually more helpful to be told the extension is unsupported (if that's what the exception means).
So, like I say, I suspect these new drivers are to blame.
I'm using framebuffers at the moment by closely following Inertia's very helpful tutorials in the Documentation section of this site. I notice, however, that a lot of similar methods seem to exist outside of GL.Ext. What's the difference between these, and how can I use framebuffers without calling the unsupported methods?
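From what I've read, the GL.Ext versions come from the old EXT_framebuffer_object extension, while the plain GL versions correspond to ARB_framebuffer_object, which was promoted to core in OpenGL 3.0. My rough plan is to check what the driver actually supports at startup before touching either API. Here's a sketch of the decision logic (in Python just to show the idea; the inputs would come from GL.GetString and the reported GL version, and the function name is my own):

```python
def pick_fbo_api(extensions: str, gl_version: tuple) -> str:
    """Decide which framebuffer API to use, given the driver's
    space-separated extension string and its (major, minor) GL version."""
    names = extensions.split()
    # GL 3.0+ or ARB_framebuffer_object -> the plain (non-Ext) methods.
    if gl_version >= (3, 0) or "GL_ARB_framebuffer_object" in names:
        return "core"
    # Older drivers may still expose the EXT variant -> GL.Ext methods.
    if "GL_EXT_framebuffer_object" in names:
        return "ext"
    # Neither: calling either set of methods would fail, so fall back
    # to rendering without FBOs (or report the card as unsupported).
    return "unsupported"
```

If the nVidia XP drivers report neither extension nor GL 3.0, that would explain both the grey-textbox behaviour and the NullReferenceException - the entry point was presumably never loaded.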