iliak:

Problems with values returned by GL.GetString()

Here's my code for getting hardware information:

// Note: Trace.WriteLine has no format-string overload — its two-argument
// form is WriteLine(message, category) — so the values are passed through
// string.Format instead.
Trace.WriteLine("Video information:");
Trace.WriteLine(string.Format("Graphics card vendor: {0}", GL.GetString(StringName.Vendor)));
Trace.WriteLine(string.Format("Renderer: {0}", GL.GetString(StringName.Renderer)));
Trace.WriteLine(string.Format("Version: {0}", GL.GetString(StringName.Version)));
Trace.WriteLine(string.Format("Shading Language Version: {0}", GL.GetString(StringName.ShadingLanguageVersion)));
string ext = GL.GetString(StringName.Extensions);
Trace.WriteLine(string.Format("Supported extensions ({0}): {1}", ext.Split(' ').Length, ext));

Here's the output:
Video information:
Graphics card vendor: NVIDIA Corporation
Renderer: GeForce 8800 GTS/PCI/SSE2
Version: 2.1.2
Shading Language Version: 1.20 NVIDIA via Cg compiler
Supported extensions (180): [..........]

Now, the output from GPU Caps Viewer :
===================================[ OpenGL GPU Capabilities ]
- OpenGL Version: 3.2.0
- GLSL (OpenGL Shading Language) Version: 1.50 NVIDIA via Cg compiler
- OpenGL Extensions: 197 extensions [.......]

Why the difference?


the Fiddler:

You need to request a 3.x context explicitly, by setting the "major" and "minor" parameters in the GameWindow, GLControl or GraphicsContext constructor.

This is in accordance with the OpenGL specs, which state that you get the highest compatible OpenGL version by default. For OpenGL 1.0-2.1, this is 2.1. For OpenGL 3.x, this is 3.2 (provided your drivers support it; 3.1 or 3.0 otherwise).
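For example, a minimal sketch of requesting a 3.2 context through the GameWindow constructor overload that takes the context version (the window size, title, and flag values here are arbitrary placeholders, not required settings):

```csharp
using System.Diagnostics;
using OpenTK;
using OpenTK.Graphics;
using OpenTK.Graphics.OpenGL;

class ContextVersionDemo
{
    static void Main()
    {
        // Pass the desired "major" and "minor" context version (3 and 2 here)
        // to the GameWindow constructor to request an OpenGL 3.2 context.
        using (var window = new GameWindow(
            800, 600,                           // width, height (placeholders)
            GraphicsMode.Default,
            "OpenGL 3.2 context",               // window title (placeholder)
            GameWindowFlags.Default,
            DisplayDevice.Default,
            3, 2,                               // major, minor: request 3.2
            GraphicsContextFlags.ForwardCompatible))
        {
            window.Load += (sender, e) =>
            {
                // With an explicit 3.2 request, this should now report 3.2.x
                // instead of the default 2.1 context.
                Trace.WriteLine("Version: " + GL.GetString(StringName.Version));
            };
            window.Run();
        }
    }
}
```

GLControl and GraphicsContext take the same major/minor parameters in their constructors, so the same approach applies there.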

Edit: this is now documented.

iliak:

OK, my bad for not reading the docs...

Thanks!