Cifram

Weird encoding issue with transform feedback

This issue (or one that looks very similar) was brought up in a couple of posts back in 2009, but none of the solutions mentioned there seem to work for me. I'm attempting to use transform feedback, but when I specify the output varyings with GL.TransformFeedbackVaryings(), I get a link error on my shader, along the lines of "error: Varying (named 4) specified but not present in the program object." Of course, my varying is not called "4". The code that builds the shader is spread over multiple files, but here are the key elements:

The top-level calls that build the shader:

			vertShader = graphics.CreateShader(ShaderType.VertexShader, "data/shaders/MarchingCubes-Pass1.vert");
			shaderPass1 = graphics.CreateShaderProgram(new[] { vertShader });
			GL.TransformFeedbackVaryings(shaderPass1, 1, new string[] { "gl_Position" }, TransformFeedbackMode.InterleavedAttribs);
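For reference, glTransformFeedbackVaryings only takes effect at the next glLinkProgram, so the varyings have to be registered before the program is linked. A minimal sketch of the intended sequence, using the helper methods shown further down (names as in the snippets above):

```csharp
// Sketch of the required ordering (per the OpenGL spec, the varyings recorded
// by glTransformFeedbackVaryings are consumed at the *next* link of the program).
// Assumes a current OpenTK GL context and using OpenTK.Graphics.OpenGL.
int vertShader = graphics.CreateShader(ShaderType.VertexShader, "data/shaders/MarchingCubes-Pass1.vert");
int shaderPass1 = graphics.CreateShaderProgram(new[] { vertShader });
GL.TransformFeedbackVaryings(shaderPass1, 1, new string[] { "gl_Position" }, TransformFeedbackMode.InterleavedAttribs);
graphics.LinkShaderProgram(shaderPass1); // link *after* specifying the varyings
```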

The implementation of the three functions called on "graphics":

		public int CreateShader(ShaderType type, string shaderFile) {
			string shaderSrc = File.ReadAllText(shaderFile);
			int shaderObj = GL.CreateShader(type);
			GL.ShaderSource(shaderObj, shaderSrc);
			GL.CompileShader(shaderObj);
			int success;
			GL.GetShader(shaderObj, ShaderParameter.CompileStatus, out success);
			if (success == 0) {
				Console.WriteLine("### Error compiling shader " + shaderFile + " ###");
				Console.WriteLine(GL.GetShaderInfoLog(shaderObj));
			}
			return shaderObj;
		}

		public int CreateShaderProgram(int[] shaders) {
			int program = GL.CreateProgram();
			foreach (int shader in shaders) {
				GL.AttachShader(program, shader);
			}
			return program;
		}

		public void LinkShaderProgram(int program) {
			GL.LinkProgram(program);
			int success;
			GL.GetProgram(program, ProgramParameter.LinkStatus, out success);
			if (success == 0) {
				Console.WriteLine("### Error linking shader program ###");
				Console.WriteLine(GL.GetProgramInfoLog(program));
			}
		}

And the shader itself (simple as can be, since for now I'm just trying to get TF working at all):

#version 120
void main() {
	gl_Position = gl_Vertex;
}

And the shader link error:

Link info
error: Varying (named 4) specified but not present in the program object.

If I run this repeatedly, the value it finds is sometimes "4", and sometimes "&". It seems to alternate from one to the other between runs. The inconsistency indicates, to me, that it might be some kind of memory issue, not just a simple encoding problem.

For reference, I'm running Win7 64-bit, a GeForce 8800 GT, nVidia drivers v8.17.12.9610, and OpenTK pulled from SVN head on August 1st.


the Fiddler

Use GL.GetShaderSource to read and print the shader right after you upload it. If there's a memory corruption issue, it will show itself there.

Additionally, try changing the encoding of the shader file to ASCII (you can do that in Notepad's "Save As" dialog). It's a wild shot, but I've seen it help before.
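Something along these lines should work for the readback (a sketch only; assumes a current OpenTK context and that shaderObj is the handle returned by GL.CreateShader):

```csharp
// Read the shader source back from the driver and print it, so it can be
// compared against what was uploaded. If memory corruption garbles the
// string on the way in, the mismatch should be visible here.
int srcLength;
GL.GetShader(shaderObj, ShaderParameter.ShaderSourceLength, out srcLength);
var source = new System.Text.StringBuilder(srcLength);
int written;
GL.GetShaderSource(shaderObj, srcLength, out written, source);
Console.WriteLine(source.ToString());
```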

Cifram

Changing the encoding didn't make a difference. I tried to output the shader source, but somehow it causes the DEBUGGER to crash! I'm running it in the debugger in Visual Studio 2010 Pro, and a dialog pops up saying vshost.exe crashed and asking if I want to close it or debug it. Of course, debugging it does little good because I don't have the source code for vshost.exe.

However, I wouldn't expect these to reveal anything. The trouble isn't that it's not finding the variable I'm specifying. The trouble is that it's looking for the wrong variable name. I specify the variable name "gl_Position" and it gives me an error message that it can't find a variable called "4" or "&", neither of which are even legal identifiers in GLSL. Further, my shaders work, when not using transform feedback. They compile, link and produce the expected output, so they must be getting transferred to the graphics card properly.

If there's an encoding problem or transmission problem, it's almost certainly in the string that was passed in to GL.TransformFeedbackVaryings.

amulware

I hope it is alright if I revive this thread, but I seem to be having the same problem.

Some varying names (and even several together) seem to work fine, but others, or some combinations of them, result in nonsensical names including numbers, punctuation and (I believe) non-ANSI characters.
I mostly get errors telling me that duplicate names are not allowed, but when I remove some of my (12 in total) varying names, I start getting the strange names.
(I assume the duplicate error is only a symptom of the translation/encoding/memory problem that seems to be going on.)

I tried everything in the posts from 2009 but nothing helps.

I am also very sure that the shader itself is correct.
In the cases where I get it to accept some varyings, it renders just as I would expect, except that the transform feedback buffer is filled with garbage.
(Also, all my other OpenGL stuff works just fine, including geometry shaders.)

In case it may be relevant, I am running on a GT755M, driver version 340.52 (pretty recent), Win8.1, .NET 4 and OpenTK 1.1.0.

I really hope someone has an idea about what can be done here, and if there is anything I should clarify, let me know.

Frassle

We marshal the string array for TransformFeedbackVaryings the same way as every other string-array method, so I don't think that's the problem.
Can you use the program query methods glGetProgramInterface and glGetProgramResource to print out all the information about the GL_TRANSFORM_FEEDBACK_VARYING interface, and, as Fiddler suggested, get the shader source back from the driver with glGetShaderSource?
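Roughly like this, though I'm going from memory on the OpenTK names (requires GL 4.3 / ARB_program_interface_query and an already-linked program):

```csharp
// Enumerate the transform feedback varyings the driver actually recorded.
// Sketch only; assumes a current OpenTK context, using OpenTK.Graphics.OpenGL,
// and a linked program handle in `program`.
int count, maxLen;
GL.GetProgramInterface(program, ProgramInterface.TransformFeedbackVarying,
    ProgramInterfaceParameter.ActiveResources, out count);
GL.GetProgramInterface(program, ProgramInterface.TransformFeedbackVarying,
    ProgramInterfaceParameter.MaxNameLength, out maxLen);
Console.WriteLine("{0} varyings, max name length {1}", count, maxLen);

for (int i = 0; i < count; i++) {
    int nameLen;
    var name = new System.Text.StringBuilder(maxLen);
    GL.GetProgramResourceName(program, ProgramInterface.TransformFeedbackVarying,
        i, maxLen, out nameLen, name);
    Console.WriteLine("varying {0}: '{1}'", i, name);
}
```

If the names printed here are already garbage, the corruption happened on the way into the driver; if they are correct, the problem is elsewhere.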

the Fiddler

Additionally, you can use apitrace to verify that the shader source is marshaled correctly.

A minimal test case that reproduces the issue would also help.

amulware

Thanks for the quick response, guys!

Here is what I got so far:
Using glGetProgramInterface on my failing program, I get only zeros (and I am sure I'm using it right, see also below).
I also checked glGetShaderSource, and it gives me back the source correctly.

I am not sure how to use glGetProgramResource unfortunately and couldn't really find any examples.
Could you give me a few lines of code for that? (if relevant after this post)

Now, next I tried making a minimal example to reproduce the issue without all the baggage of our program around it.
However, so far I have not succeeded, meaning that it works just fine here.
I used the exact same shader code, and as far as I can tell the same gl calls as in the other program.

glGetProgramInterface also returns the correct numbers (16 max name length, and 12 active resources).

So for now I am a bit perplexed.
Got no time to continue on this right now, but as soon as I can I will again make sure I use the exact same gl calls, and then take a look at apitrace.
Will post update in a few days.

Thanks again!

amulware

Well, so much for 'a few days'.

I tried looking into apitrace a bit, but did not get very far with it yet.

I did, however, figure out how to reliably make it work or break:

My laptop can switch between the integrated Intel GPU and the dedicated NVidia one per application.
As it turns out, it works on Intel (HD 4600), but not on NVidia (755M, with up to date drivers).

I also already tested it on two other machines at hand, both with NVidia GPUs (560 Ti and 760), and it does not work on either of them.

The entire particle effect that this is about also works 100% like I expect it to on the Intel chip, so I am pretty convinced that my code is fine.

Speaking of code, here is the code that compiles and links the shader program (includes source of original vertex shader):

And if you are interested in just quickly running it yourself, you can download a little executable I made here:
(<1MB, same code as above wrapped in a bit of makeshift UI)

I am now using that program to collect more data from our beta testers, to see if it really is a problem with NVidia, and will report back on the results.
(And if it turns out that NVidia is the problem, I will see about asking in the appropriate place as well.)

For reference, some example output, both Intel and NVidia:

Alright, that is it for now. I will try and take a look at apitrace again sometime soon, and maybe make a comparison between the different GPUs.

Let me know if you have any ideas what the problem could be, or how to go about fixing it.

Thanks (yet) again!

amulware

Alright, quick report:

So far it runs on 2/2 AMD and 4/4 Intel cards, and it does NOT run on 6/6 NVidia chips.

To me that sounds convincingly conclusive, so I will check with NVidia about what the problem might be.

Link to question on NVidia forums.

If I don't get an answer there (place looks a bit deserted) I'll try to get a hold of them another way.
Unfortunately I have no contacts at NVidia.

daidudu

Hi there,
I encountered this issue a few months ago and tried lots of suggestions from this website and Stack Overflow, but the problem was not solved. Today, after reading the specification of the transform feedback API provided by NVIDIA, I tried a new method, and the problem seems to be solved. I have already tested it on two computers with different NVIDIA video cards. Here is the code snippet:

//create vertex shader object, geometry shader object, fragment shader object, and attach them to the program object first
//get the feedback varying names
string[] varyings = GetFeedbackVaryingsFromGlslSrc();
if (!isNvidia) { //if the video card is not from NVIDIA
    GL.TransformFeedbackVaryings(programObject, varyings.Length, varyings, TransformFeedbackMode.InterleavedAttribs);
}
GL.LinkProgram(programObject);
//check link status here
if (isNvidia) { //if the video card is from NVIDIA: after linking, look up each
    //varying's location and hand the locations to the NV_transform_feedback entry points
    var locs = new int[varyings.Length];
    for (int i = 0; i < varyings.Length; i++) {
        locs[i] = GL.NV.GetVaryingLocation(programObject, varyings[i]);
    }
    GL.NV.TransformFeedbackVaryings(programObject, locs.Length, locs, NvTransformFeedback.InterleavedAttribsNv);
}
//check possible errors here

The main idea is: for an NVIDIA video card, we should link and use the program first, and only then deal with the transform feedback varyings.
Hope it helps :)