Norris

Very slow normal averaging. A better algorithm?

Hi!
First, sorry for my bad English.
I'm writing a Wavefront (.obj) model importer for a modeler-style application. I can already load the geometry and display it with textures and normals when they exist in the file. When a file has no normals, I compute one per face, which gives me flat shading.
All of these computations work fine and quickly. Now I want smooth shading, so I'm trying to average the normals. I understand how to do it, but it's very, very slow.
To compute normals per face or per vertex: after parsing my .obj file, I have an array of vertices (with positions, texture coordinates and default/empty normals) that is in the same order as the faces (and as the position indices within each face). So I know the first three vertices belong to the first polygon, the next three to the second face, and so on.
This means I have duplicated vertices, and I can compute a normal per face and assign it to each of the face's three vertices. Here is my pseudo code for that; "GroupVertices" is my array (a List(Of Vertex)) of vertices.

For i As Integer = 0 To GroupVertices.Count - 1 Step 3
    ' Get the three vertices of this face
    Dim v1 As Vector3 = GroupVertices(i).Position
    Dim v2 As Vector3 = GroupVertices(i + 1).Position
    Dim v3 As Vector3 = GroupVertices(i + 2).Position
    ' Compute the face normal once and assign it to all three vertices
    Dim faceNormal As Vector3 = ComputeFaceNormal(v1, v2, v3)
    GroupVertices(i).Normal = faceNormal
    GroupVertices(i + 1).Normal = faceNormal
    GroupVertices(i + 2).Normal = faceNormal
Next

I can put details for "ComputeFaceNormal" if you want.

And here is my pseudo code for averaging the normals:

For i As Integer = 0 To GroupVertices.Count - 1
    For j As Integer = 0 To GroupVertices.Count - 1
        If j <> i Then
            ' Duplicated vertex: same position, possibly different normals
            If GroupVertices(i).Position.Equals(GroupVertices(j).Position) Then
                Dim vnX As Single = (GroupVertices(i).Normal.X + GroupVertices(j).Normal.X) / 2
                Dim vnY As Single = (GroupVertices(i).Normal.Y + GroupVertices(j).Normal.Y) / 2
                Dim vnZ As Single = (GroupVertices(i).Normal.Z + GroupVertices(j).Normal.Z) / 2
                Dim vn As New Vector3(vnX, vnY, vnZ)
                vn.Normalize()
                GroupVertices(i).Normal = vn
                GroupVertices(j).Normal = vn
            End If
        End If
    Next
Next

I don't know whether it is possible to do this without visiting each vertex many times. For a simple model, it currently takes approximately 20 seconds with normal averaging, and less than a second without!

I have an idea, but I don't know if it would work: do the same computation, but not once "GroupVertices" is full. If I did it for each face at the moment I add its vertices to "GroupVertices", I think it would be faster, because the list is still small at that point.

Do you have a better idea?


Comments

Inertia

If per-face normals are all you want, SVN contains C# code to do this:

Taken from http://opentk.svn.sourceforge.net/viewvc/opentk/trunk/Source/Examples/Sh...

internal static void FindNormal(ref Vector3d A, ref Vector3d B, ref Vector3d C, out Vector3d result)
{
    Vector3d temp1, temp2;
    Vector3d.Subtract(ref A, ref B, out temp1);
    temp1.Normalize();
    Vector3d.Subtract(ref C, ref B, out temp2);
    temp2.Normalize();
    Vector3d.Cross(ref temp1, ref temp2, out result);
    result *= -1.0;
    result.Normalize();
}

Basically, the vectors temp1 and temp2 define a plane, and the resulting cross product is a vector perpendicular to that plane (don't forget about the winding of your triangle; it matters).

Norris

Thanks Inertia for your answer, but I already understand per-face normals, and they work. What I want is per-vertex normals. I know how to compute them, but for each vertex I need to know the neighboring faces.
This is the part that is very slow: I have to compare each vertex position with all the others. I use two nested For loops and it works, but since I have many vertices in my list, it takes a very long time.

nythrix

This is what I use. It should be pretty much self-explanatory.

Vector[] normals = new Vector[points.Count];
Dictionary<Point, Vector> normalsPerPoint = new Dictionary<Point, Vector>();
 
foreach (Triangle t in Triangles)
{
  if (t.IsDegenerate) continue;
 
  Vector n = t.Normal;
 
  if (!normalsPerPoint.ContainsKey(t.A)) normalsPerPoint.Add(t.A, n);
  else normalsPerPoint[t.A] = normalsPerPoint[t.A] + n;
 
  if (!normalsPerPoint.ContainsKey(t.B)) normalsPerPoint.Add(t.B, n);
  else normalsPerPoint[t.B] = normalsPerPoint[t.B] + n;
 
  if (!normalsPerPoint.ContainsKey(t.C)) normalsPerPoint.Add(t.C, n);
  else normalsPerPoint[t.C] = normalsPerPoint[t.C] + n;
}
 
for (Int32 i = 0; i < points.Count; i++)
  normals[i] = normalsPerPoint[points[i]].Normalize();

This should perform faster than what you have now. Your version is O(numVerts²), while this is roughly O(numTris + numVerts): the dictionary lookups are amortized constant time, since .NET's Dictionary is hash-based. The rest is obvious.
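Applied to the flat vertex list from the original post (where every three consecutive vertices form one triangle), the same idea could be sketched like this in C#. This is only a sketch: `GroupVertices`, `Vertex` and `ComputeFaceNormal` are the poster's names from above, and it assumes `Vector3` has value equality (so it can be a dictionary key), a `+` operator and `Normalize()`:

```
// Pass 1: accumulate, per distinct position, the sum of the
// normals of all faces that share that position.
var accumulated = new Dictionary<Vector3, Vector3>();

for (int i = 0; i < GroupVertices.Count; i += 3)
{
    Vector3 n = ComputeFaceNormal(GroupVertices[i].Position,
                                  GroupVertices[i + 1].Position,
                                  GroupVertices[i + 2].Position);
    for (int k = 0; k < 3; k++)
    {
        Vector3 p = GroupVertices[i + k].Position;
        if (accumulated.ContainsKey(p))
            accumulated[p] += n;   // another face shares this position
        else
            accumulated[p] = n;
    }
}

// Pass 2: every duplicated vertex with the same position gets
// the same normalized, averaged normal.
for (int i = 0; i < GroupVertices.Count; i++)
{
    Vector3 n = accumulated[GroupVertices[i].Position];
    n.Normalize();
    GroupVertices[i].Normal = n;
}
```

Note that summing before normalizing weights each face by the length of its cross product (i.e. by area), which usually looks fine; normalize the face normals before accumulating if you want an unweighted average. It also relies on exact floating-point equality of positions, the same comparison your nested loops use.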

I don't know/haven't checked whether any faster algorithms exist. Mine is home-baked.

Inertia

Sorry, I mistook your second code snippet for the ComputeFaceNormal function used in the first snippet, and that looked very wrong. No more comments until I find the time to take more than a superficial look at your code.