miujin's picture
Problem running program on Intel Graphics



I've written a program and it runs on all computers with a "normal" GPU. Now I am trying to run the program on a tablet with an Intel HD Graphics 4400. This GPU can handle OpenGL 4.0. My code is written against the OpenGL 3.3 core functions; I don't use any extensions.

The first problem was that the program didn't start, because the context was initialized with only OpenGL 1.1.
This problem was solved by changing the code for creating the GraphicsContext to

GraphicsMode gm = new GraphicsMode(new ColorFormat(8, 8, 8, 8), 32, 1, 0, 0, 2, false);
_panel = new GLControl(gm, 3, 3, GraphicsContextFlags.Default);

instead of the previous

_panel = new GLControl(GraphicsMode.Default, 3, 3, GraphicsContextFlags.Default);

I've created my own GraphicsMode because of some problems I've mentioned in an earlier thread.

Now OpenGL is initialized with version 3.3. My problem is that some shader code produces errors (where there are none).
For example, when I compile a fragment shader, the third parameter of GL.GetShader returns 0, so there was supposedly a problem compiling the shader, but with GL.GetShaderInfoLog I only get an empty string.
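For reference, the compile-status check described above typically looks like this (a sketch; fragmentSource is an illustrative variable holding the shader text):

```csharp
int shader = GL.CreateShader(ShaderType.FragmentShader);
GL.ShaderSource(shader, fragmentSource);
GL.CompileShader(shader);

int status;
GL.GetShader(shader, ShaderParameter.CompileStatus, out status);
if (status == 0)
{
    // 0 means the compile failed; the info log should explain why,
    // though some drivers leave it empty
    string log = GL.GetShaderInfoLog(shader);
    Console.WriteLine("Compile failed: " + log);
}
```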
Another example occurs when compiling a geometry shader: there I do get error messages from GetShaderInfoLog, but the reported errors are not real errors:

#version 330
layout(triangles) in;
layout(triangle_strip, max_vertices=3) out;
in int gl_PrimitiveIDIn;
uniform bool flatShading;
out vec3 vertex_normal;
void main()
  vec3 triangleNormal;
      vertex_normal = triangleNormal;
      vertex_normal = vertex[i].normal;

This code (not all lines are included, only the important ones) produces the following errors:

  1. 'gl_PrimitiveIDIn' : geometry shader input varying variable must be declared as an array
  2. 'flatShading' : undeclared identifier
  3. 'vertex_normal' : undeclared identifier
  4. 'assign' : cannot convert from '3-component vector of float' to 'float'

I don't understand why these errors occur, because the same code runs on all the other computers here, most of which have an Nvidia GPU.

Any idea what the problem is? The drivers are up to date.


the Fiddler's picture

First of all, make sure you are using OpenTK 1.1 stable, which contains improvements to the context construction logic that will avoid falling back to OpenGL 1.1 / software rendering when requesting unsupported GraphicsModes. For instance:

GraphicsMode gm = new GraphicsMode(new ColorFormat(8,8,8,8), 32, 1, 0, 0, 2, false);

You are requesting a 32bit depth buffer, which is not actually supported by (many/most) GPUs. OpenTK 1.1 will correctly fall back to a 24bit depth buffer in this case. Previous versions would fail to construct a hardware accelerated context and would fall back to software rendering.
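With OpenTK 1.1 you can also request a 24-bit depth buffer explicitly rather than relying on the fallback (a sketch; the other parameters mirror the snippet above):

```csharp
// 8-bit RGBA color, 24-bit depth, 1-bit stencil, no multisampling,
// no accumulation buffer, double-buffered, no stereo
GraphicsMode gm = new GraphicsMode(new ColorFormat(8, 8, 8, 8), 24, 1, 0, 0, 2, false);
```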

Second, I would suggest testing your shaders with the GLSL Reference Compiler. Nvidia's drivers are notorious for accepting invalid GLSL code, while Intel's drivers are notorious for failing to compile valid GLSL code. The reference compiler should indicate whose fault it is in this case.

If the reference compiler succeeds without an error, you should report this to Intel as a bug.
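The reference compiler ships as a command-line tool, glslangValidator; pointing it at the shader file prints any errors it finds (a sketch, assuming the geometry shader is saved as shader.geom):

```
glslangValidator shader.geom
```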

miujin's picture

Thank you for the info.

The problem was that I had declared the built-in variables of GLSL myself, and the reference compiler says that the gl_ prefix is reserved.

GetShaderInfoLog and GetProgramInfoLog now return an empty string, so everything compiles and links fine.
However, the function GL.UseProgram(shaderProgramID) throws an AccessViolationException with the message that protected memory was read or written.

Any idea what the problem is?

the Fiddler's picture

Ah, yes, "gl_" is reserved in OpenGL 3+, so you cannot use it in your own declarations. Nvidia should have rejected this code; this is what I meant by "Nvidia drivers are notorious for accepting invalid code".
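A corrected version of the geometry shader from earlier in the thread would drop the gl_PrimitiveIDIn redeclaration (it is predefined) and declare its own per-vertex inputs as arrays (a sketch; the interface block name and the face-normal computation are illustrative):

```glsl
#version 330
layout(triangles) in;
layout(triangle_strip, max_vertices = 3) out;

// gl_PrimitiveIDIn is built into GLSL 3.30 and must not be redeclared
in VertexData { vec3 normal; } vertex[];  // geometry-stage inputs are arrays

uniform bool flatShading;
out vec3 vertex_normal;

void main()
{
    // face normal from the triangle's edge vectors
    vec3 a = gl_in[1].gl_Position.xyz - gl_in[0].gl_Position.xyz;
    vec3 b = gl_in[2].gl_Position.xyz - gl_in[0].gl_Position.xyz;
    vec3 triangleNormal = normalize(cross(a, b));

    for (int i = 0; i < 3; i++)
    {
        vertex_normal = flatShading ? triangleNormal : vertex[i].normal;
        gl_Position = gl_in[i].gl_Position;
        EmitVertex();
    }
    EndPrimitive();
}
```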

The access violation in GL.UseProgram sounds very strange. I've never seen this before, can you please create a small test project that reproduces this issue and attach it to a bug report at https://github.com/opentk/opentk?

miujin's picture

I've tried to reproduce the error in a small test project, but there the code works.

I've found the line which produces the error (commenting it out, everything works well).
In my fragment shader I have a uniform samplerBuffer, and I read this buffer with the function texelFetch.
If I comment out the line with texelFetch, I can call GL.UseProgram without the exception.

the Fiddler's picture

Interesting. A crash here might indicate one of three things:

  • The samplerBuffer is not bound correctly
  • The samplerBuffer is bound to an unsupported texture format. E.g. for 3-component textures you need to check for the ARB_texture_buffer_object_rgb32 extension (not certain when/if this became core.)
  • The driver has a bug. Reporting this to Intel and/or updating the drivers might help in this case.
miujin's picture

Mhh, I think the TBO is bound correctly.
Here is the code for creating the TBO:

GL.GenBuffers(1, out vbo.A_ID);
GL.BindBuffer(BufferTarget.TextureBuffer, vbo.A_ID);
attributes = new float[(obj.data.Count / 3) * 4];
GL.BufferData(BufferTarget.TextureBuffer, (IntPtr)((obj.data.Count / 3) * 4 * sizeof(float)), attributes, BufferUsageHint.DynamicDraw);
GL.GenTextures(1, out vbo.A_TID);
GL.BindTexture(TextureTarget.TextureBuffer, vbo.A_TID);
GL.TexBuffer(TextureBufferTarget.TextureBuffer, SizedInternalFormat.Rgba32f, vbo.A_ID);
GL.BindBuffer(BufferTarget.TextureBuffer, 0);

And here is the code to bind it:

GL.BindTexture(TextureTarget.TextureBuffer, vbo.A_TID);
GL.Uniform1(st_InInfo, 2);

Do I have to bind the TBO first and then activate the shader program?

GL.GetString(StringName.Extensions) returns an empty string, but in the OpenGL Extension Viewer GL_ARB_texture_buffer_object_rgb32 is green, so it is supported.

the Fiddler's picture

Do I have to bind the TBO first and then activate the shader program?

I don't think so, but I cannot quote the spec on that. In my experience you need to have everything bound before issuing the draw call, not before GL.UseProgram. Might be worth a try.
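For completeness, a typical bind sequence before the draw call might look like this (a sketch based on the snippets above; vertexCount is illustrative; note the GL.ActiveTexture call, which selects the texture unit that the sampler uniform points at and which the binding snippet above omits):

```csharp
GL.UseProgram(shaderProgramID);

// the sampler uniform is set to unit 2, so bind the buffer texture there
GL.ActiveTexture(TextureUnit.Texture2);
GL.BindTexture(TextureTarget.TextureBuffer, vbo.A_TID);
GL.Uniform1(st_InInfo, 2);

GL.DrawArrays(PrimitiveType.Triangles, 0, vertexCount);
```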


GL.GetString(StringName.Extensions); returns an empty string

Oh right: according to the documentation, core contexts need to use the indexed form GL.GetString(StringNameIndexed.Extensions, i) instead of GL.GetString(StringName.Extensions) (thank you, Khronos! /s)
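Checking for the extension on a core context then looks like this (a sketch; StringNameIndexed and GetPName.NumExtensions are the names used by current OpenTK bindings and may differ in older versions):

```csharp
int count = GL.GetInteger(GetPName.NumExtensions);
bool supported = false;
for (int i = 0; i < count; i++)
{
    // query each extension string individually on core profiles
    if (GL.GetString(StringNameIndexed.Extensions, i) == "GL_ARB_texture_buffer_object_rgb32")
        supported = true;
}
```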

As a sanity check, could you test against a fresh debug version of OpenTK from github? This will immediately throw an exception if any OpenGL error occurs, which can be very helpful when debugging weird issues/crashes.

Another option would be to inspect what is going on via apitrace. It might give a hint that leads to the solution.