IanRayns's picture

[solved] Point Rendering Problem

Hi all,

I'm trying to develop an application that displays a point cloud overlaid on an image which has been mapped to a sphere. Millions of points will be displayed over the image, very close to each other.

I have a strange problem with the point rendering itself: when rotating the camera, some points disappear. Also, the points seem to collapse into lines perpendicular to the camera rather than staying in their positions.

Does this sound like an accuracy problem? Or a rendering issue?

Oddly, I thought I'd fixed this issue one night on my laptop, which doesn't have a graphics card. When I then ran the program on the machine I've been developing on, which does have a graphics card, the same issue still existed. The fix on my laptop was to translate to each point and draw at (0, 0, 0) (very, very slow!), whereas before that I'd simply been drawing each point at its location (is there a better way of doing this?).

I'm currently using point sprites for the points, but the issue exists whether I use point sprites or normal points. The rendering code for the points is below:

    // Camera setup (in the main render method, before the points are drawn)
    GL.Rotate(m_camLongitude, 1.0, 0.0, 0.0);
    GL.Rotate(m_camLatitude, 0.0, 1.0, 0.0);
    GL.Rotate(m_imgHeading, 0.0, 1.0, 0.0);
    GL.Rotate(m_imgPitch, 1.0, 0.0, 0.0);
    GL.Rotate(-m_imgRoll, 0.0, 0.0, 1.0);
    GL.Translate(-m_camX + m_leverArmY, -m_camY - m_leverArmZ, -m_camZ + m_leverArmX);
    GL.GetDouble(GetPName.ModelviewMatrix, m_currentModelMatrix);
    GL.GetDouble(GetPName.ProjectionMatrix, m_currentProjectionModel);

    private void renderPoints()
    {
        GL.BindTexture(TextureTarget.Texture2D, m_spriteTextureID);
        GL.BlendFunc(BlendingFactorSrc.SrcAlpha, BlendingFactorDest.One);

        // Attenuate point size with distance from the camera
        float[] quadratic = { 1.0f, 0.0f, 0.01f };
        GL.PointParameter(PointParameterName.PointDistanceAttenuation, quadratic);

        // Query the max point size supported by the hardware and clamp ours to it
        float maxSize;
        GL.GetFloat(GetPName.PointSizeMax, out maxSize);
        float pointSize = 3.0f;
        if (pointSize > maxSize)
            pointSize = maxSize;
        GL.PointSize(pointSize);

        GL.PointParameter(PointParameterName.PointFadeThresholdSize, 60.0f);
        GL.PointParameter(PointParameterName.PointSizeMin, 1.0f);
        GL.PointParameter(PointParameterName.PointSizeMax, maxSize);
        GL.TexEnv(TextureEnvTarget.PointSprite, TextureEnvParameter.CoordReplace, (float)OpenTK.Graphics.All.True);

        // Render point sprites...
        GL.Begin(BeginMode.Points);
        for (int i = 0; i < m_numPoints; i += INCREMENT)
        {
            if (i == m_snappedPointIdx)
                GL.Color4(0.0, 0.0, 1.0, 1.0);   // snapped point: blue
            else
                GL.Color4(1.0, 0.0, 0.0, 1.0);   // all other points: red
            GL.Vertex3(m_pointStore[i].X, m_pointStore[i].Y, m_pointStore[i].Z);
        }
        GL.End();

        GL.BlendFunc(BlendingFactorSrc.SrcAlpha, BlendingFactorDest.OneMinusSrcAlpha);
    }

Any help or pointers with this problem will be greatly appreciated.

Many Thanks,



the Fiddler's picture

Can the issue be captured in screenshots? That would make it easier to understand what might be wrong.

Do points disappear randomly or is there something common to their behavior?

  • Ensure your zFar plane is far enough to contain the whole cloud.
  • Try enabling antialiasing. I don't know whether this behavior is expected or not, but I used to have an issue where points wouldn't fade out unless I enabled antialiasing.
IanRayns's picture

Typically, as soon as I posted the question I fixed it!

It seems that if I translate to the first point and subtract the coordinates of the first point from every other point, all points are rendered properly!

the Fiddler's picture

This sounds like a precision issue, then. Moving the origin to the first point keeps the coordinates small, which improves precision.
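A quick way to see the effect (a minimal sketch using numpy's float32, on the assumption that the GPU pipeline works at roughly single precision): the gap between representable values grows with magnitude, so coordinates hundreds of kilometres from the origin can only move in coarse steps, which makes nearby points collapse onto the same few representable positions.

```python
import numpy as np

# Spacing (ulp) of float32 at a large coordinate, e.g. a UTM-style easting.
far = np.float32(500000.0)
print(np.spacing(far))    # 0.03125: positions can only snap in ~3 cm steps

# After rebasing the cloud so the first point is the origin,
# nearby coordinates are small and the spacing is tiny.
near = np.float32(1.0)
print(np.spacing(near))   # ~1.19e-07
```

The same arithmetic explains why translating to each point and drawing at (0, 0, 0) also "fixed" it: the large magnitudes were folded into the modelview translation instead of the vertex coordinates.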

Just keep in mind that GL.Begin()-GL.End() is quite slow and will not scale to millions of points. Display lists and vertex buffer objects are at least an order of magnitude faster.

IanRayns's picture

Thanks for the tip, I'll look into changing this over.

Are vertex buffers okay to use when I'm going to be processing millions of points? From what I understand, don't they use memory on the graphics card rather than system memory to improve efficiency?

the Fiddler's picture

Indeed, vertex buffer objects are stored in video memory. Not only does this avoid slow bus transfers, it also frees the CPU to perform other tasks while the vertex buffer is being processed.

If your points are static, VBOs will scale perfectly with GPU performance (i.e. if your GPU can process millions of points per second, then VBOs will allow you to achieve that throughput). If you need to modify those points, you might have to implement some intelligent update scheme to maintain performance.
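As a sketch of the CPU-side preparation for such a static VBO (hypothetical point data; numpy assumed here for brevity, and the OpenTK calls that would consume the buffer are shown only as comments, since they need a live GL context): rebase in double precision first, then narrow to float32 for upload.

```python
import numpy as np

# Hypothetical point cloud in double precision, far from the origin.
points = np.random.uniform(499000.0, 501000.0, size=(1_000_000, 3))

# Rebase on the first point *in double precision*, then narrow to float32.
# The order matters: narrowing first would lose the fine detail.
origin = points[0].copy()
vertices = (points - origin).astype(np.float32)

# This flat byte buffer is what would be uploaded once, roughly:
#   GL.BindBuffer(BufferTarget.ArrayBuffer, vbo);
#   GL.BufferData(BufferTarget.ArrayBuffer, (IntPtr)data.Length, data,
#                 BufferUsageHint.StaticDraw);
# and each frame, after translating the modelview by `origin`, drawn with
#   GL.DrawArrays(BeginMode.Points, 0, numPoints);
data = vertices.tobytes()
print(len(data))    # 12 bytes per point (3 x float32)
```

The per-frame cost then becomes a single draw call instead of millions of GL.Vertex3 calls, and the rebased float32 coordinates keep the precision benefit from earlier in the thread.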