Sunday, 19 October 2014

Texture Mapping

A set of mappings from Model Coordinates (Object Space) to Texel Values:
- Object space $(x,y,z) \rightarrow $ Parameter space $(u,v)$
- Parameter space $(u,v) \rightarrow $ Texture Image space
- Texture Image space $ \rightarrow $ Texel Values
- Transformation of Texel Values (optional)
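As a minimal sketch of this chain (not any particular API), the following Python maps a point to $(u,v)$ through a made-up planar parameterisation `uv_of_point`, indexes the texture image with a nearest-neighbour lookup, and applies an optional scaling of the texel value; all names here are illustrative.

```python
import numpy as np

# Object space (x, y, z) -> parameter space (u, v): a made-up planar
# projection stands in for the model's actual surface parameterisation.
def uv_of_point(x, y, z):
    return (x % 1.0, y % 1.0)

# Parameter space (u, v) -> texture image space -> texel value
def sample_texture(texture, u, v):
    h, w = texture.shape[:2]
    col = min(int(u * w), w - 1)   # (u, v) in [0, 1] -> texel column/row
    row = min(int(v * h), h - 1)
    return texture[row, col]       # nearest-neighbour lookup, for illustration

# A tiny 2x2 checker texture as the texture image
tex = np.array([[[  0,   0,   0], [255, 255, 255]],
                [[255, 255, 255], [  0,   0,   0]]], dtype=np.uint8)

u, v = uv_of_point(1.25, 0.75, 0.0)
texel = sample_texture(tex, u, v)
shaded = 0.8 * texel               # optional transformation of the texel value
```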

Resolving the mismatch between the number of pixels required to rasterize the geometry and the number of texels it maps to.
Issues:
- Magnification Aliasing:
  - Source of Jaggies
  - Occurs when the texture image is smaller than the rasterized area
  - Multiple screen pixels map to the same texture image coordinates
  - Solution: bilinearly interpolate the four texels surrounding the $(u,v)$ location obtained by inverse-mapping each screen pixel into texture image coordinates (see the sketch after this list)
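
A minimal Python sketch of that bilinear lookup, assuming $(u,v) \in [0,1]^2$ and a texture stored as a NumPy array; border handling is simply clamped here, whereas real samplers also offer wrap modes.

```python
import numpy as np

def bilinear_sample(texture, u, v):
    """Bilinearly interpolate the texture at continuous (u, v) in [0, 1]^2."""
    h, w = texture.shape[:2]
    # Inverse-mapped position in continuous texel units
    x = u * (w - 1)
    y = v * (h - 1)
    x0, y0 = int(np.floor(x)), int(np.floor(y))
    x1, y1 = min(x0 + 1, w - 1), min(y0 + 1, h - 1)
    fx, fy = x - x0, y - y0
    # Weighted average of the four surrounding texels
    top    = (1 - fx) * texture[y0, x0] + fx * texture[y0, x1]
    bottom = (1 - fx) * texture[y1, x0] + fx * texture[y1, x1]
    return (1 - fy) * top + fy * bottom

# Example: sampling halfway between four texels averages them
tex = np.array([[0.0, 1.0],
                [1.0, 0.0]])
print(bilinear_sample(tex, 0.5, 0.5))   # 0.5
```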

- Minification Aliasing:
  - Occurs when the texture image is larger than the rasterized area
  - Multiple texture image coordinates map to the same screen pixel
  - Computing the average of all these texels for every pixel at render time is inefficient
  - Solution: MIP mapping

MIP (multum in parvo) Mapping:
- Generate a precomputed image pyramid in which each lower-resolution image has half the width and height of its predecessor (see the sketch after this list)
- While texture mapping, choose the texture image resolution whose projected texel size most closely matches the pixel size
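
A sketch of the pyramid construction in Python, assuming a power-of-two texture and simple 2x2 box-filter averaging between levels; real implementations may use better filters.

```python
import numpy as np

def build_mip_pyramid(texture):
    """Precompute a MIP pyramid: each level halves the width and height
    of the previous one, averaging texels in 2x2 blocks."""
    levels = [texture.astype(np.float64)]
    while levels[-1].shape[0] > 1:
        prev = levels[-1]
        h, w = prev.shape[:2]
        # Average each 2x2 block into one texel of the next level
        next_level = 0.25 * (prev[0:h:2, 0:w:2] + prev[1:h:2, 0:w:2] +
                             prev[0:h:2, 1:w:2] + prev[1:h:2, 1:w:2])
        levels.append(next_level)
    return levels

# Example: a 256x256 texture yields levels of side 256, 128, ..., 1
pyramid = build_mip_pyramid(np.random.rand(256, 256))
print([lvl.shape[0] for lvl in pyramid])
```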

Choosing MIP map level:
Let $(x,y)$ be the screen pixel coordinates and $(u,v)$ the texel coordinates. Compute the rate of change of $u$ and $v$ along $x$ and $y$, and choose the resolution for which $\max\left(\left|\frac{\partial u}{\partial x}\right|,\left|\frac{\partial u}{\partial y}\right|,\left|\frac{\partial v}{\partial x}\right|,\left|\frac{\partial v}{\partial y}\right|\right)$ is closest to $1$.
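
A sketch of that rule, assuming the derivatives are measured in texels of the finest level per screen pixel: texels at level $L$ are $2^L$ times larger, so the scaled derivative is closest to $1$ when $L \approx \log_2$ of the largest derivative. The helper name, rounding, and clamping below are choices made for illustration.

```python
import math

def choose_mip_level(du_dx, du_dy, dv_dx, dv_dy, num_levels):
    """Pick the MIP level whose projected texel size best matches one
    screen pixel; level 0 is the finest."""
    # Largest rate of change of (u, v) across the pixel, in finest-level texels
    rho = max(abs(du_dx), abs(du_dy), abs(dv_dx), abs(dv_dy))
    # rho / 2**L is closest to 1 when L = log2(rho); clamp to available levels
    level = 0.0 if rho <= 1.0 else math.log2(rho)
    return min(max(int(round(level)), 0), num_levels - 1)

# Example: the texture shrinks by roughly 6 texels per pixel -> around level 3
print(choose_mip_level(6.0, 0.5, 0.3, 5.5, num_levels=9))
```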

Bump Mapping
- Store the height field in a texture
- Perturb the geometric normal by the slopes of the height field along the tangent and bitangent directions
- Use this perturbed normal for the lighting computation
Alternatively: store the perturbed normals themselves in the texture (normal mapping); the sketch below covers the height-field variant.
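
A minimal Python sketch of the height-field variant: the slopes are taken by finite differences of an arbitrary `height(u, v)` callable (an assumption here, standing in for the stored texture), and the normal is tilted against them along the tangent frame; the perturbation scale is application dependent.

```python
import numpy as np

def perturbed_normal(normal, tangent, bitangent, height, u, v, eps=1e-3):
    """Perturb the geometric normal by the slopes of a height field h(u, v)
    along the tangent and bitangent, then renormalise."""
    # Finite-difference slopes of the height field
    dh_du = (height(u + eps, v) - height(u - eps, v)) / (2 * eps)
    dh_dv = (height(u, v + eps) - height(u, v - eps)) / (2 * eps)
    # Tilt the normal against the slopes along the tangent frame
    n = normal - dh_du * tangent - dh_dv * bitangent
    return n / np.linalg.norm(n)

# Example with a sinusoidal height field on a flat patch
n = perturbed_normal(np.array([0.0, 0.0, 1.0]),                       # normal
                     np.array([1.0, 0.0, 0.0]),                       # tangent
                     np.array([0.0, 1.0, 0.0]),                       # bitangent
                     lambda u, v: 0.05 * np.sin(10 * u) * np.cos(10 * v),
                     0.3, 0.7)
print(n)
```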

Environment Mapping
- An environment map stores the view in every direction from a single point
- It assumes that the environment is infinitely distant from the object
- While rendering a point on the object
    - Compute the reflection direction
    - Look up the color stored in that direction in the environment map
    - Use this color to render the point
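
A minimal Python sketch of those three steps, with the environment map abstracted as any callable from a unit direction to a colour; the two-tone `sky` function below is purely illustrative.

```python
import numpy as np

def reflect(incident, normal):
    """Reflect the incoming view direction about the surface normal."""
    return incident - 2.0 * np.dot(incident, normal) * normal

def environment_color(incident, normal, env_lookup):
    """Render a mirror-like point by looking up the environment map
    in the reflected direction."""
    r = reflect(incident, normal)
    r = r / np.linalg.norm(r)
    return env_lookup(r)

# Example with a trivial "sky above, ground below" environment
sky = lambda d: np.array([0.5, 0.7, 1.0]) if d[2] > 0 else np.array([0.2, 0.2, 0.2])
color = environment_color(np.array([0.0, 0.0, -1.0]),   # view ray into the surface
                          np.array([0.0, 0.0, 1.0]),    # surface normal
                          sky)
print(color)
```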

Sphere Map
A sphere map is one technique for storing the precomputed view in every direction from a point: essentially a photograph of a reflective sphere placed in the environment.
Issues: Distortion towards the edge of the sphere
Solution: Use cube mapping instead of a sphere map (see the sketch below)
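
A sketch of the cube-map lookup: the face is chosen by the dominant axis of the reflection direction and the other two components are projected onto that face. The face orientation convention below is arbitrary, not that of any particular graphics API.

```python
import numpy as np

def cube_map_lookup(direction):
    """Return the cube face hit by the direction plus (u, v) in [0, 1]^2
    on that face."""
    d = np.asarray(direction, dtype=float)
    axis = int(np.argmax(np.abs(d)))                 # dominant axis: 0:x, 1:y, 2:z
    sign = '+' if d[axis] > 0 else '-'
    a, b = [i for i in range(3) if i != axis]        # the two in-face axes
    # Project the remaining components onto the face and remap to [0, 1]
    u = 0.5 * (d[a] / abs(d[axis]) + 1.0)
    v = 0.5 * (d[b] / abs(d[axis]) + 1.0)
    return sign + 'xyz'[axis], u, v

# A reflection pointing mostly along +z lands on the +z face
print(cube_map_lookup([0.2, -0.1, 0.9]))
```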

Other uses of texture mapping
- Can implement shading as texture mapping
- Can implement the Seeliger lighting model for more realistic skin color rendering (see the sketch below)
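
As an illustration of both points, one common trick is to bake a lighting term into a texture indexed by dot products so that shading reduces to a texture lookup. The sketch below assumes the Seeliger term has the form $\frac{N\cdot L}{N\cdot L + N\cdot V}$ and an arbitrary table size; treat it as illustrative rather than as the exact model referred to above.

```python
import numpy as np

def bake_seeliger_table(size=64):
    """Bake an assumed Seeliger term N.L / (N.L + N.V) into a 2D table
    indexed by (N.L, N.V), so per-pixel shading becomes a texture lookup."""
    ndl = np.linspace(0.0, 1.0, size)[:, None]   # rows: N.L
    ndv = np.linspace(0.0, 1.0, size)[None, :]   # cols: N.V
    return ndl / np.maximum(ndl + ndv, 1e-6)

def shade(n_dot_l, n_dot_v, table):
    """Look up the precomputed shading value (nearest entry)."""
    size = table.shape[0]
    i = int(np.clip(n_dot_l, 0.0, 1.0) * (size - 1))
    j = int(np.clip(n_dot_v, 0.0, 1.0) * (size - 1))
    return table[i, j]

table = bake_seeliger_table()
print(shade(0.8, 0.4, table))   # roughly 0.67
```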

