Calculating Normals in a Triangle Mesh

How do you compute vertex normals for a triangle mesh in OpenGL?

You have to specify one normal attribute for each vertex coordinate; a vertex coordinate and its attributes form a tuple.

Furthermore, you have to use the data type float rather than int when computing the normal vectors:

GLfloat normals[49 * 49 * 18];
int curr = 0;
for (int i = 0; i < 49 * 49 * 18; i += 9) {
    // U and V are the two edge vectors of the triangle (vertices at i, i+3, i+6)
    float Ux = vp[i+3] - vp[i];
    float Uy = vp[i+4] - vp[i+1];
    float Uz = vp[i+5] - vp[i+2];
    float Vx = vp[i+6] - vp[i];
    float Vy = vp[i+7] - vp[i+1];
    float Vz = vp[i+8] - vp[i+2];

    // Face normal = U x V (cross product)
    float nx = Uy * Vz - Uz * Vy;
    float ny = Uz * Vx - Ux * Vz;
    float nz = Ux * Vy - Uy * Vx;

    // Store the same normal for each of the triangle's three vertices
    for (int j = 0; j < 3; ++j) {
        normals[curr++] = nx;
        normals[curr++] = ny;
        normals[curr++] = nz;
    }
}
glBufferData(GL_ARRAY_BUFFER, 49 * 49 * 18 * sizeof(GLfloat), normals, GL_STATIC_DRAW);

I recommend inverting the normal vector of back faces for a double-sided lighting model:

vec3 normal = normalize(Normal);
vec3 viewDir = normalize(-fpos);
if (dot(normal, viewDir) < 0.0)
    normal *= -1.0;

Sample Image

Fragment shader:

#version 410

// Define INPUTS from the vertex shader
//uniform mat4 view_mat;
in vec3 Normal;
in vec3 fpos;

// These come from the VAO for texture coordinates.
in vec2 texture_coords;

// And from the uniform outputs for the textures setup in main.cpp.
uniform sampler2D texture00;
uniform sampler2D texture01;

out vec4 fragment_color; // RGBA color

const vec3 lightPos = vec3(0.0, 0.0, 5.0);
const vec3 diffColor = vec3(1.0, 0.5, 0.0);
const vec3 specColor = vec3(1.0, 1.0, 1.0);

void main() {
    vec3 normal = normalize(Normal);
    vec3 viewDir = normalize(-fpos);
    if (dot(normal, viewDir) < 0.0)
        normal *= -1.0;

    vec3 lightDir = normalize(lightPos - fpos);
    float lamb = max(dot(lightDir, normal), 0.0);
    float spec = 0.0;

    if (lamb > 0.0) {
        vec3 refDir = reflect(-lightDir, normal);

        float specAngle = max(dot(refDir, viewDir), 0.0);
        spec = pow(specAngle, 4.0);
    }

    fragment_color = vec4(lamb * diffColor + spec * specColor, 1.0);
}

Calculating and Applying normals to a triangle mesh

If I'm correct, you're still only computing a normal per triangle? That is correct, but after that you should compute the normal per vertex. This is simply the normalized sum of the normals of all triangles that the vertex is attached to.
Once that is done you can proceed with your immediate-mode drawing, specifying a normal per vertex, as in the sketch below.
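
A minimal sketch of the drawing step in legacy immediate-mode OpenGL, assuming vertexNormals has already been filled with the normalized per-vertex sums described above (the Vec3 type and the container names triangles, vertices, vertexNormals are placeholders):

// One glNormal3f call per vertex, using the precomputed per-vertex normal
glBegin(GL_TRIANGLES);
for (const Triangle &t : triangles) {          // hypothetical index-triple container
    for (int idx : {t.a, t.b, t.c}) {
        const Vec3 &n = vertexNormals[idx];    // normalized sum of adjacent face normals
        const Vec3 &p = vertices[idx];
        glNormal3f(n.x, n.y, n.z);
        glVertex3f(p.x, p.y, p.z);
    }
}
glEnd();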

Calculating per-face normal for a simple triangle

Problem 1: Normals don't have positions. A normal is the direction of the vector perpendicular to the plane spanned by any two edge vectors of your triangle.

Problem 2: All vertices of a triangle should have the same normal, since they lie in the same plane, the plane of your triangle.

Normal

As you can see, the direction of the normal vector is the same wherever you are on the triangle.
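
A minimal sketch of that idea, assuming a simple Vec3 type with hypothetical cross and normalize helpers:

// The face normal is the cross product of two edge vectors of the triangle
Vec3 e1 = p2 - p1;                          // edge from p1 to p2
Vec3 e2 = p3 - p1;                          // edge from p1 to p3
Vec3 faceNormal = normalize(cross(e1, e2));

// Flat shading: all three vertices of the triangle share this one normal
n1 = n2 = n3 = faceNormal;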

Calculating Vertex Normals of a mesh

but how the heck do you find the adjacent faces for each vertex?

Think of it the other way round: iterate over the faces and add each face's normal to the normals of its vertices. Once you have processed all faces, normalize each vertex normal to unit length. I described it in detail here:

Calculating normals in a triangle mesh

If you really want to find the faces for a given vertex, the naive approach is to perform a (linear) search for the vertex in the list of faces. A better approach is to maintain an adjacency list, as sketched below.
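
A minimal sketch of such an adjacency list, assuming the faces are stored as index triples (the names faces, vertexCount and adjacency are placeholders):

// faces[i] holds the three vertex indices of triangle i
std::vector<std::array<int, 3>> faces = ...

// adjacency[v] lists the indices of all faces that use vertex v
std::vector<std::vector<int>> adjacency(vertexCount);
for (int f = 0; f < static_cast<int>(faces.size()); ++f)
    for (int corner = 0; corner < 3; ++corner)
        adjacency[faces[f][corner]].push_back(f);

// The faces adjacent to vertex v are now adjacency[v], with no linear search needed.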

Triangle normal vs vertex normal

I implemented smooth vertex normals like this:

std::vector<Vec3d> points = ...
std::vector<Vec3i> facets = ...
std::vector<Vec3d> norms;   // per-vertex normals (could also be a class member)

// Count how many faces/triangles each vertex is shared by
std::vector<int> counters;
counters.resize(points.size());

// Compute normals
norms.clear();
norms.resize(points.size());
for (Vec3i f : facets) {
    int i0 = f.x();
    int i1 = f.y();
    int i2 = f.z();
    Vec3d pos0 = points.at(i0);
    Vec3d pos1 = points.at(i1);
    Vec3d pos2 = points.at(i2);
    Vec3d N = triangleNormal(pos0, pos1, pos2);

    // Must be normalized
    // https://stackoverflow.com/a/21930058/3405291
    N.normalize();

    // Accumulate the face normal on each of its three vertices
    norms[i0] += N;
    norms[i1] += N;
    norms[i2] += N;

    counters[i0]++;
    counters[i1]++;
    counters[i2]++;
}

// Average the accumulated normals
for (int i = 0; i < static_cast<int>(norms.size()); ++i) {
    if (counters[i] > 0)
        norms[i] /= counters[i];
    else
        norms[i].normalize();
}
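
The triangleNormal helper used above is not part of the snippet; a minimal sketch, assuming Vec3d supports operator- and a cross method (hypothetical API):

// Face normal of a triangle: cross product of two of its edge vectors
Vec3d triangleNormal(const Vec3d &p0, const Vec3d &p1, const Vec3d &p2) {
    Vec3d u = p1 - p0;
    Vec3d v = p2 - p0;
    return u.cross(v); // the caller normalizes, as done above with N.normalize()
}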

General method for calculating Smooth vertex normals with 100% smoothness

The correct method is to weight each "facet" normal by the angle it subtends at the shared vertex, i.e. the angle between the edges going from that vertex to its two neighbors in the face.

(Image: the per-face angle at the common point, shown with respect to each face)

Here's a rundown of an example implementation:

for (int f = 0; f < tricount; f++)
{
    // ...
    // p1, p2 and p3 are the points in the face (f)

    // calculate facet normal of the triangle using the cross product;
    // both components are "normalized" against a common point chosen as the base
    float3 n = (p2 - p1).Cross(p3 - p1); // p1 is the 'base' here

    // get the angle between the two other points for each point;
    // the starting point will be the 'base' and the two adjacent points will be normalized against it
    float a1 = (p2 - p1).Angle(p3 - p1); // p1 is the 'base' here
    float a2 = (p3 - p2).Angle(p1 - p2); // p2 is the 'base' here
    float a3 = (p1 - p3).Angle(p2 - p3); // p3 is the 'base' here

    // normalize the initial facet normals if you want to ignore surface area
    if (!area_weighting)
    {
        normalize(n);
    }

    // store the weighted normal in a structured array
    v1.wnormals.push_back(n * a1);
    v2.wnormals.push_back(n * a2);
    v3.wnormals.push_back(n * a3);
}
for (int v = 0; v < vertcount; v++)
{
    float3 N;

    // run through the normals in each vertex's array and sum them
    // vertex(v) here fetches the data of the vertex at index 'v'
    for (int n = 0; n < vertex(v).wnormals.size(); n++)
    {
        N += vertex(v).wnormals.at(n);
    }

    // normalize the final normal
    normalize(N); // N is the final vertex normal; store it back on the vertex here
}

Here's an example of a "naive" average of the normals (i.e. without angle weighting):

Sample Image

You can see that the facet components are all the same, but since some sides have two faces, their contribution to the average is doubled, which skews the result. Weighting only by surface area, but not by angle, produces similar results.

Here is the same model with angle weighting enabled:

Sample Image

Now the interpolated normals are all geometrically correct.

Calculating normals on terrain mesh

Thanks to ybungalobill I did the following to make it work:

  1. Created a normal map from the original heightmap (a symmetrical grid) using the following code:

Calculating normals from height map

// Calculating normals from height map
public void calcNormals() {
    Vec3 up = new Vec3(0, 1, 0);
    float sizeFactor = 1.0f / (8.0f * cellSize);
    normals = new Vec3[rows * cols];

    for (int row = 0; row < rows; row++) {
        for (int col = 0; col < cols; col++) {
            Vec3 normal = up;

            if (col > 0 && row > 0 && col < cols - 1 && row < rows - 1) {
                float nw = getValue(row - 1, col - 1);
                float n = getValue(row - 1, col);
                float ne = getValue(row - 1, col + 1);
                float e = getValue(row, col + 1);
                float se = getValue(row + 1, col + 1);
                float s = getValue(row + 1, col);
                float sw = getValue(row + 1, col - 1);
                float w = getValue(row, col - 1);

                // Sobel-like gradients of the height field in x and z
                float dydx = ((ne + 2 * e + se) - (nw + 2 * w + sw)) * sizeFactor;
                float dydz = ((sw + 2 * s + se) - (nw + 2 * n + ne)) * sizeFactor;

                normal = new Vec3(-dydx, 1.0f, -dydz).getUnitVector();
            }

            normals[row * cols + col] = normal;
        }
    }
}

Creating image from normals

public static BufferedImage getNormalMap(Terrain terrain) {
    Vec3[] normals = terrain.getNormals();
    float[] pixels = new float[normals.length * 3];

    for (int i = 0; i < normals.length; i++) {
        Vec3 normal = normals[i];
        float x = (1.0f + normal.x) * 0.5f;
        float y = (1.0f + normal.y) * 0.5f;
        float z = (1.0f + normal.z) * 0.5f;
        pixels[i * 3] = x * MAX;
        pixels[i * 3 + 1] = y * MAX;
        pixels[i * 3 + 2] = z * MAX;
    }

    BufferedImage img = new BufferedImage(cols, rows, BufferedImage.TYPE_INT_RGB);
    WritableRaster imgRaster = img.getRaster();
    imgRaster.setPixels(0, 0, cols, rows, pixels);
    return img;
}

  2. Applied the image in the fragment shader and calculated the texture coordinates using the vertex position passed from the vertex shader:

Part of fragment shader:

void main() {
    vec3 newNormal = texture(normalMap, vec2(worldPos0.x / maxX, worldPos0.z / maxZ)).xyz;
    newNormal = (2.0 * newNormal) - 1.0; // remap from [0, 1] back to [-1, 1]
    outputColor = calcColor(normalize(newNormal));
}

The result is the following:

Sample Image 1

Same view with point rendering:

Sample Image 2

In other words: few vertices, but visually high-detail terrain.


