anathema's picture

Apparent - and massive - memory-leak with GraphicsMode objects

Hi all,

I'm seeing something odd when creating GraphicsMode objects :-( When the object goes out of scope and the GC gets it, it appears to leave a fair chunk of memory allocated to my process. The exact amount seems to vary from one graphics-card to another, but it usually appears to be in the 8-10MB range and it stays allocated until my app exits.

For example, I'm doing this during application startup to determine the available FSAA modes:

int samplecount = 0;
int samples;
List<int> _fsaaLevels = new List<int>();
while (true)
{
    samples = new GraphicsMode(32, 24, 0, samplecount).Samples;
    if (samples < samplecount)
        break;                      // driver returned fewer samples than requested - no higher modes available
    _fsaaLevels.Add(samples);
    samplecount = samples + 2;
}

Before running this, the process is using ~50MB, all of which I can account for. After completion on a graphics-card which supports 4 levels of FSAA it's using ~95MB. The additional 45MB is 'lost' permanently :-(

I've done some tracing and I can confirm that the memory is allocated when GraphicsMode's 'lazy init' function executes. The code has been tested on five different graphics-cards with OpenGL versions ranging from 1.3 to 3.2. The only one the leak doesn't seem to appear on is my Nvidia DT250 (and it's possible that this one is leaking, but is simply allocating far less RAM than the others so I haven't noticed it yet...).

- am I doing something dumb?
- is there a bug in the graphics-drivers?
- is GraphicsMode in need of a Dispose() method? :-)

Please help! :-)

I'm using an svn checkout taken about a week ago. The language is C# and the target .NET version is 2.0; the IDE is Visual Studio 2010, although the problem was also observed when compiling with SharpDevelop 3.2.

Tests have been carried out on:
- WindowsXP and a graphics-card from before the dawn of time with OpenGL 1.3
- WindowsXP and a reasonable ATI chipset with OpenGL 2.1
- Windows7 and a low-end ATI chipset with OpenGL 2.1
- Windows7 and a very low-end IBM chipset with OpenGL 2.1
- Windows7 and an Nvidia DT250 with OpenGL 3.2 :-)

All machines are running a 32-bit OS.

Thanks in advance for your help!



the Fiddler's picture

How are you measuring memory usage?

Is memory usage correlated with how many times this loop runs? For example:

for (int samples = 0; samples < 10; samples++)
    Console.WriteLine(new GraphicsMode(32, 24, 0, samples).Samples);
// vs
for (int samples = 0; samples < 20; samples++)
    Console.WriteLine(new GraphicsMode(32, 24, 0, samples).Samples);

Does the second loop leak (roughly) twice the memory of the first loop?

Internally, OpenTK constructs a temporary OpenGL context whenever you call new GraphicsMode (at least on modern graphics cards, see WinGraphicsMode.cs, line 154). This might indicate a memory leak when destroying this context, but I can't yet say whether it's driver-related or a bug in OpenTK.

anathema's picture

I'm stepping through the code while keeping an eye on process memory-usage in TaskManager. Each time the line 'samples = new GraphicsMode(32, 24, 0, samplecount).Samples;' executes, memory usage jumps by several MB.

On my workstation with the DT250, total usage spikes during this process and then returns to approximately the level it was at before entering the loop. On all other machines tried to date, the usage remains high after the loop exits. (tbh I'm more than slightly concerned at the sheer *amount* of memory being allocated, too!)

This isn't the only place I create GraphicsMode objects, of course, and again the memory usage jerks up each time one is created.

I'll run your code this evening when I get home and let you know what I see :-)

the Fiddler's picture

Thanks. The spike is expected, since OpenGL contexts are quite costly in terms of memory. On the other hand, the spike should be very short, as the context is destroyed immediately.

Let me see if I can reproduce this.

anathema's picture

Yes, I've noticed that :-) Where does all the memory go, exactly?

It has occurred to me that one way around this might be to generate a 'pool' of GraphicsMode objects up-front. Is it possible to use a single GraphicsMode to construct multiple instances of GLControl?

the Fiddler's picture

Yes, that would work. When you create a GraphicsMode, it queries the drivers for the mode that most closely matches your parameters. Once the GraphicsMode is constructed, it can be reused at will.
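For illustration, a minimal sketch of that caching approach (the field name _cachedMode and the chosen mode parameters are just placeholders for this example):

```csharp
using OpenTK.Graphics;

// Construct the GraphicsMode once, up front. The driver query (and the
// temporary OpenGL context created behind the scenes) happens only here.
static readonly GraphicsMode _cachedMode = new GraphicsMode(32, 24, 0, 4);

// Every GLControl created afterwards can reuse the same mode object:
//     var control = new GLControl(_cachedMode);
```

Since the mode is immutable once constructed, sharing one instance across all your controls is safe and avoids repeating the expensive driver query.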

anathema's picture

Thanks, that's what I thought :-) Given how many GLControls I need to create over time, having a pool of GraphicsModes will speed things up!

Incidentally - this is probably not related - I've noticed that GLControl.MakeCurrent() can take a fair bit of time to execute on some of my test machines. Presumably this is dependent on the hardware/drivers?

anathema's picture

Right, I've run your code suggestion now :-)

On my Nvidia DT250, everything behaves as expected - the memory allocated each time a GraphicsMode is created is collected again as soon as the object goes out of scope.

On my laptop, which has an ATI X1200, the first loop results in a total allocation of ~78MB and the second takes twice that. None of it is released until the app exits.

Interestingly enough, the Nvidia card only seems to allocate a couple of hundred kB per GraphicsMode - any idea why there's such a huge difference between the two cards?

anathema's picture

Further debugging turns up more data :-(

I'm getting the same leak when creating multiple GLControls from a single GraphicsMode, so it's presumably the underlying OpenGL context that's not being freed.

the Fiddler's picture

I haven't been able to reproduce the leak on Win7 64bit / Nvidia NVS135M / 197.16 drivers or with Microsoft's GDI renderer using the "Test GraphicsModes" sample. Unfortunately, my Ati desktop is lacking a monitor right now so I can't test there directly.

Are you actually calling Dispose() on those GLControls? The following code shouldn't be leaking memory (especially not MBs' worth):

GraphicsMode mode = GraphicsMode.Default;
for (int i = 0; i < 25; i++)
    using (GLControl control = new GLControl(mode))
        control.CreateControl(); // force creation of the control and its GL context; Dispose() runs when the using block exits
anathema's picture

In the cases where I need to replace a GLControl (due to the user changing fsaa level), I explicitly call the Dispose() method. If the control still exists when the window is closed, I do not - I assume the GC gets these?
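The replacement path looks roughly like this (a sketch only - oldControl, newControl and newFsaaLevel are placeholder names, and this runs inside the parent Form):

```csharp
// Explicitly release the old control's GL context before swapping in the new one.
Controls.Remove(oldControl);
oldControl.Dispose();

var newControl = new GLControl(new GraphicsMode(32, 24, 0, newFsaaLevel));
Controls.Add(newControl);
```

Controls left on the window when it closes are the ones I'm leaving to the GC, as described above.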

Something crossed my mind last night: it may be a coincidence, but the amount of memory allocated by my laptop for each GraphicsMode is approximately the same amount required for a 32-bit screen-buffer.

It has also occurred to me that what I might be seeing is the allocations taking place in graphics memory, since of course that's shared with the OS on the laptop - so the same 'leak' may in fact be happening on my Nvidia card and I simply haven't noticed it on a 1GB card :-)