Goz3rr's picture

int or uint for IDs?

I'm wondering what the recommended data type is for storing the IDs returned, since GL.GenBuffers has a uint overload, but GL.CreateProgram and GL.CreateShader don't. Furthermore, would there be any difference between using

int id;
GL.GenBuffers(1, out id);

or

int id = GL.GenBuffer();

Comments

winterhell's picture

int works just fine, and is a more general type. Some functions may return a negative value to indicate an error, so that helps. 2 billion different texture IDs should be enough for most applications (assuming the uint version can actually represent all 4 billion).
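
For instance, GL.GetUniformLocation reports a uniform it cannot find by returning -1 (program below stands for any already-linked program handle), a sentinel that only fits a signed type:

int location = GL.GetUniformLocation(program, "lightPosition");
if (location == -1)
{
    // -1 means the uniform is not active in this program
}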

No, there is no practical difference between the two approaches.

the Fiddler's picture

As winterhell said, int vs uint makes no difference. Both can represent the whole 2^32 range of handles (it doesn't matter whether a handle appears negative or positive in C#; OpenGL only cares about the underlying 32-bit value.)
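
A quick sketch of what that means in C# (the handle value is made up and no GL call is involved): converting between int and uint keeps the same 32 bits, so a handle above int.MaxValue simply shows up as a negative int:

uint rawHandle = 3000000000u;              // made-up handle above int.MaxValue
int asInt = unchecked((int)rawHandle);     // negative number, same 32 bits
uint roundTrip = unchecked((uint)asInt);   // back to 3000000000
Console.WriteLine(asInt);                  // -1294967296
Console.WriteLine(roundTrip == rawHandle); // True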

OpenGL defines functions taking either uint or int handles. Since uint is not CLS-compliant (meaning it cannot be used in e.g. VB.Net), OpenTK generates additional int overloads for every such function. If the original function takes int, then only int is used. (This explains why some functions have both overloads and some do not. If I were designing the bindings now, I would simply drop the uint handle overloads - they bloat the API without offering anything in return.)
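
To make that concrete, GL.GenBuffers ends up with both forms (overload shapes recalled from the generated bindings, so check them against your OpenTK version):

uint uintId;
GL.GenBuffers(1, out uintId);  // mirrors the raw glGenBuffers(GLsizei, GLuint*)

int intId;
GL.GenBuffers(1, out intId);   // extra int overload generated for CLS compliance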

As for:

int id;
GL.GenBuffers(1, out id);

vs

int id = GL.GenBuffer();

the first is the original C prototype of the function. The second is an additional "convenience" overload added in OpenTK 1.1 to help functional languages such as F#, where values are immutable by default. For example, in OpenTK 1.1 the idiomatic assignment in F# becomes:

let id = GL.GenBuffer()

which is much nicer than OpenTK 1.0:

let mutable id = 0
GL.GenBuffers(1, &id)
// or
let id = ref 0
GL.GenBuffers(1, id)
Goz3rr's picture

I see. On a similar note, what would be the preferred way to get the OpenGL version?

I see a lot of your samples do this:

var version = GL.GetString(StringName.Version);
int major = Int32.Parse(version[0].ToString());
int minor = Int32.Parse(version[2].ToString());

And I've been doing this myself:

var version = new Version(GL.GetString(StringName.Version));

Until I encountered a setup where StringName.Version returned something along the lines of "4.4.0 NVIDIA 331.38" (this was on Windows with the stated NVIDIA driver), which broke the Version constructor, so I switched to

Version version;
try {
    version = new Version(GL.GetString(StringName.Version));
} catch (Exception) {
    version = new Version(GL.GetInteger(GetPName.MajorVersion), GL.GetInteger(GetPName.MinorVersion));
}

Is there a reason you're manually parsing instead of using GetPName.MajorVersion/MinorVersion?

the Fiddler's picture

According to the glGetString documentation:

Quote:

The GL_VERSION string begins with a version number. The version number uses one of these forms:

major_number.minor_number
major_number.minor_number.release_number

Vendor-specific information may follow the version number. Its format depends on the implementation, but a space always separates the version number and the vendor-specific information.

Which means that you cannot pass the version string directly to System.Version. You can either pass a substring or parse each field separately.
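
The "parse each field separately" route could look roughly like this (a sketch that leans on the documented guarantee that a space terminates the version number):

var versionString = GL.GetString(StringName.Version);  // e.g. "4.4.0 NVIDIA 331.38"
var numberPart = versionString.Split(' ')[0];           // "4.4.0"
var fields = numberPart.Split('.');
var version = new Version(Int32.Parse(fields[0]), Int32.Parse(fields[1]));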

Note that GL.GetString(StringName.Version) is only valid in the compatibility profile of OpenGL and will cause an error in the core profile. GL.GetInteger(GetPName.Major/MinorVersion) is only available in OpenGL 3.x/4.x and will cause an error on drivers that only support 2.1 or lower. You need to query both:

Version version;
try
{
    var version_string = GL.GetString(StringName.Version);
    if (GL.GetError() != ErrorCode.NoError)
        throw new NotSupportedException();
    version = new Version(version_string.Substring(0, 3));
}
catch
{
    version = new Version(GL.GetInteger(GetPName.MajorVersion), GL.GetInteger(GetPName.MinorVersion));
}

Also note that OpenGL ES uses a slightly different version string, so that also needs to be handled differently.
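
From memory, ES 2.0 and later return something like "OpenGL ES 3.0 <vendor info>" while ES 1.x uses an "OpenGL ES-CM"/"OpenGL ES-CL" prefix, so a sketch that just pulls out the first major.minor pair should cover both (assuming the same GL.GetString entry point in the ES bindings):

var match = System.Text.RegularExpressions.Regex.Match(
    GL.GetString(StringName.Version), @"(\d+)\.(\d+)");
var version = match.Success
    ? new Version(Int32.Parse(match.Groups[1].Value), Int32.Parse(match.Groups[2].Value))
    : new Version(0, 0);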