## Friday, June 8, 2012

### Stage3D Normal Mapping

Hello. Today I want to show you what normal mapping is. In simple words, it's a mapping technique that adds mesostructure to your model - small bumpy details such as noise, cracks, and scars. Of course, the same thing can be done with plain triangles, but that would require a lot of them and slow the scene down.

What is a normal map? A normal map is just a picture that stores a normal for each fragment. Here's a bunch of normal maps. XYZ coordinates are encoded as RGB values - usually red is responsible for X, green for Y, and blue for Z. The direction of such a normal is just a matter of convention, but usually red in the 0-0.5 range means the normal points to the left (i.e. X is negative) and red in the 0.5-1 range means it points to the right. Similarly with green. Blue is more interesting: since we assume a normal always points out of the surface, never into it, the blue component varies from 0.5 to 1. That's why normal maps have a bluish tint.
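The decode step implied by this convention is a one-liner. A minimal Python sketch (the function name is mine):

```python
def decode_normal(r, g, b):
    """Remap color channels from [0, 1] to normal components in [-1, 1]."""
    return (2.0 * r - 1.0, 2.0 * g - 1.0, 2.0 * b - 1.0)

# The typical "flat" normal-map pixel (0.5, 0.5, 1.0) decodes to [0, 0, 1],
# i.e. a normal pointing straight out of the surface:
print(decode_normal(0.5, 0.5, 1.0))  # (0.0, 0.0, 1.0)
```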

So, how can we use it? It's easy: in the fragment shader we take the normal and the direction to the light. If they point in the same direction, we light the fragment up; the more their directions differ, the darker the fragment becomes. The most challenging part is understanding how to transform the normal into the appropriate space. Take a look at the picture:
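This diffuse term is just a clamped dot product. A small Python sketch of the math (names are mine; this is not Stage3D code):

```python
import math

def lambert(normal, to_light):
    """Clamped cosine between a unit surface normal and the direction to the light."""
    lx, ly, lz = to_light
    length = math.sqrt(lx * lx + ly * ly + lz * lz)
    lx, ly, lz = lx / length, ly / length, lz / length  # normalize light direction
    nx, ny, nz = normal
    # negative values mean the light is behind the surface - clamp them to 0
    return max(0.0, nx * lx + ny * ly + nz * lz)

print(lambert((0, 0, 1), (0, 0, 5)))   # light along the normal: fully lit, 1.0
print(lambert((0, 0, 1), (1, 0, 0)))   # light perpendicular: dark, 0.0
```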

Consider a simple model - a cube - and apply the same normal map to all of its faces. Now take the normal from the center of the front face - let's say its direction is [0, 0, -1], i.e. pointing straight out of the monitor. Next, take the normal from the center of the right face. Since the maps are the same, its value is also [0, 0, -1], but that's wrong - we know it should be [1, 0, 0]! We need a way to convert our normals, and tangent (or texture) space is our saviour. Recall that a texture has coordinates called [u, v], and we assign those coordinates to vertices as attributes. Knowing them helps us construct the space where the normal 'lives', so we can transform the light direction vector into that space and compare it with the normal. The normal itself is easy to get - we can take the cross product of two edges (be aware of handedness). Calculating tangents is a bit harder; I followed this resource. The normals and tangents are then passed as attributes, and in the vertex shader we calculate the third basis vector as their cross product - and our tangent space is ready! Here's the vertex shader:

```
//attributes
alias va0, pos;
alias va1, tangent;
alias va2, normal;
alias va3, uv;

//constants
alias vc0, LIGHT_WORLD_POSTION; // position of light in world coordinates
alias vc1, ZERO_VECTOR; // [0, 0, 0, 1]
alias vc2, VIEW_PROJECTION_MATRIX; // cameraInverseMatrix * perspectiveProjectionMatrix
alias vc6, MODEL_MATRIX; // model transform
alias vc10, MODEL_ROTATION_MATRIX; // only models rotation

//temps
alias vt0, worldPosition;
alias vt1, normalModel;
alias vt2, tangentModel;
alias vt3, binormalModel;
alias vt4, lightDistance;

//varyings
alias v0, uvOut;
alias v1, lightDistanceOut;

worldPosition = mul4x4(pos, MODEL_MATRIX); // vertex position in world space
lightDistance = LIGHT_WORLD_POSTION - worldPosition; // vertex-light distance in world space
op = mul4x4(worldPosition, VIEW_PROJECTION_MATRIX);

lightDistanceOut = ZERO_VECTOR;

// next we need to rotate the normals and tangents; that's what MODEL_ROTATION_MATRIX is for
tangentModel = mul4x4(tangent, MODEL_ROTATION_MATRIX); // vertex tangent in world space
normalModel = mul4x4(normal, MODEL_ROTATION_MATRIX); // vertex normal in world space
crs binormalModel.xyz, normalModel, tangentModel; // vertex binormal in world space

/* The most interesting part. We have three basis vectors - tangent, binormal and normal - that form tangent space. The matrix has those bases as rows:
|tx  ty  tz|
|bx  by  bz|
|nx  ny  nz|
To transform the light direction into tangent space, we need to multiply the vector by the inverse of this matrix. Since the tangent-space matrix is a pure rotation, its inverse is its transpose:
|tx  bx  nx|
|ty  by  ny|
|tz  bz  nz|
Since the matrix is row-laid-out, we use a row vector on the left, and the multiplication gives:

               |tx  bx  nx|
|dx  dy  dz| * |ty  by  ny| = |dx*tx + dy*ty + dz*tz,  dx*bx + dy*by + dz*bz,  dx*nx + dy*ny + dz*nz|
               |tz  bz  nz|

or simply |dot(d, t), dot(d, b), dot(d, n)|.
*/
dp3 lightDistanceOut.x, lightDistance, tangentModel;
dp3 lightDistanceOut.y, lightDistance.xyz, binormalModel.xyz;
dp3 lightDistanceOut.z, lightDistance, normalModel;

uvOut = uv;
```
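The transpose trick in the shader comment can be checked numerically. A small Python sketch (names are mine) that applies |dot(d, t), dot(d, b), dot(d, n)| using the cube's right face as an example:

```python
def to_tangent_space(v, tangent, binormal, normal):
    """Multiply a row vector by the transposed (i.e. inverted) TBN rotation.
    The result is simply (dot(v, t), dot(v, b), dot(v, n))."""
    dot = lambda a, b: sum(x * y for x, y in zip(a, b))
    return (dot(v, tangent), dot(v, binormal), dot(v, normal))

# Basis for the right face of the cube, in world space
# (binormal = cross(normal, tangent), matching the crs instruction above):
t, b, n = (0, 0, -1), (0, 1, 0), (1, 0, 0)

# A world-space direction along +X (straight out of the right face)
# becomes +Z in tangent space - exactly where the map's normals point:
print(to_tangent_space((1, 0, 0), t, b, n))  # (0, 0, 1)
```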

The fragment shader is much simpler:

```
//constants
alias fc0.x, ONE;
alias fc0.y, TWO;

//textures
alias fs0, diffuseTexture;
alias fs1, normalTexture;

//temps
alias ft0, color;
alias ft1, normal;
alias ft2, light;
alias ft3, ang;
alias ft4, temp;
alias ft5, normalColor;

//varyings
alias v0, uvIn;
alias v1, lightDistanceIn;

tex color, uvIn, diffuseTexture<2d, linear, miplinear, repeat>; // get fragment color from color texture
tex normalColor, uvIn, normalTexture<2d, linear, miplinear, repeat>; // get normal from map

// in the map all components are stored in the [0, 1] range; bring them back to [-1, 1]
normalColor *= TWO;
normalColor -= ONE;

// here I negate the Y component to flip the bump direction (in or out); it can be omitted
neg normalColor.y, normalColor.y;

// cosine of the angle between the normal and the light direction
nrm light.xyz, lightDistanceIn.xyz;
dp3 ang.x, normalColor.xyz, light.xyz;

// the dot product is in the [-1, 1] range, so clamp negative values to 0
sat ang.x, ang.x;

// finally, multiply the color from the texture by the lighting term
color.xyz *= ang.x;
oc = color;
```
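Putting the fragment stage together, here's a plain-Python mirror of the shader's math (a sketch under my own names, not AGAL):

```python
import math

def shade_fragment(diffuse_rgb, normal_rgb, light_dist):
    """Mirror of the fragment shader above: decode the normal, flip Y,
    normalize the light vector, take the clamped dot product,
    and scale the diffuse color by it."""
    # normalColor * 2 - 1: remap channels from [0, 1] to [-1, 1]
    n = [2.0 * c - 1.0 for c in normal_rgb]
    n[1] = -n[1]                                        # neg normalColor.y
    length = math.sqrt(sum(c * c for c in light_dist))  # nrm light.xyz
    l = [c / length for c in light_dist]
    ang = max(0.0, sum(a * b for a, b in zip(n, l)))    # dp3 + sat
    return [c * ang for c in diffuse_rgb]               # color.xyz *= ang.x

# A flat normal-map pixel lit head-on keeps the full diffuse color:
print(shade_fragment([0.8, 0.6, 0.4], [0.5, 0.5, 1.0], [0.0, 0.0, 2.0]))
```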

Time for the demos. Note that I used only normal mapping here - no other lighting terms such as ambient or specular.