[TOC] Chapter 9: Textures


Textures play a crucial role in ray tracing as they add detail and realism to surfaces by providing color, patterns, and other surface properties. This section explores various aspects of textures, including sampling and antialiasing, texture coordinate generation, interfaces for textures, image textures, solid and procedural texturing, and the use of noise.


Sampling and Antialiasing


Sampling in the context of textures refers to the process of retrieving color or intensity information from a texture map. Antialiasing is a technique used to minimize artifacts, such as jagged edges or moiré patterns, which occur when a texture is sampled at too low a rate during rendering.

Sampling from Textures


When sampling from textures, especially at non-integer coordinates, bilinear interpolation is commonly used. Bilinear interpolation considers the four nearest texels (texture pixels) to compute the color at a given UV coordinate.

For a sample at texture coordinates \( (u, v) \), let \( (x_0, y_0) \) and \( (x_1, y_1) \) be the integer coordinates of the texels surrounding the sample point. The colors of the four surrounding texels are:
- \( C_{00} = T(x_0, y_0) \)
- \( C_{10} = T(x_1, y_0) \)
- \( C_{01} = T(x_0, y_1) \)
- \( C_{11} = T(x_1, y_1) \)

where \( T \) is the texture lookup function.

Taking \( u \) and \( v \) here as the fractional offsets of the sample point within the texel (both in \([0, 1]\)), the interpolated color \( C \) is:

\[
C = (1 - u) \cdot (1 - v) \cdot C_{00} + u \cdot (1 - v) \cdot C_{10} + (1 - u) \cdot v \cdot C_{01} + u \cdot v \cdot C_{11}
\]

Antialiasing Techniques


To implement antialiasing for textures, various techniques can be employed:
- Supersampling: Samples the texture several times per pixel (typically with jittered offsets) and averages the results; a sketch is shown below.
- Mipmapping: A precomputed sequence of progressively lower-resolution copies of the texture, which reduces aliasing when textures are viewed at a distance or at grazing angles.
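
Below is a minimal sketch of supersampling a texture lookup, assuming the bilinearInterpolation(texture, u, v) helper shown just after this and texel values that can be averaged numerically; the sample count and jitter pattern are illustrative choices, not a fixed recipe.

// Average several jittered texture lookups around (u, v).
// footprint approximates the size of the pixel's footprint in UV space.
function supersampleTexture(texture, u, v, footprint, samples = 4) {
    let sum = 0;
    for (let i = 0; i < samples; i++) {
        // Jitter the sample position within the pixel footprint, clamped to [0, 1]
        const su = Math.min(1, Math.max(0, u + (Math.random() - 0.5) * footprint));
        const sv = Math.min(1, Math.max(0, v + (Math.random() - 0.5) * footprint));
        sum += bilinearInterpolation(texture, su, sv);
    }
    return sum / samples; // Averaged (filtered) result
}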

Example of Bilinear Interpolation in JavaScript:

function bilinearInterpolation(texture, u, v) {
    const width  = texture.width;
    const height = texture.height;

    // Calculate the surrounding texel coordinates (clamped to the texture bounds)
    const x0 = Math.min(Math.floor(u * width), width - 1);
    const y0 = Math.min(Math.floor(v * height), height - 1);
    const x1 = Math.min(x0 + 1, width - 1);
    const y1 = Math.min(y0 + 1, height - 1);

    const C00 = texture.getPixel(x0, y0);
    const C10 = texture.getPixel(x1, y0);
    const C01 = texture.getPixel(x0, y1);
    const C11 = texture.getPixel(x1, y1);

    // Fractional offsets of the sample point within the texel
    const du = u * width - x0;
    const dv = v * height - y0;

    // Perform bilinear interpolation (assumes texel values support + and *)
    return (1 - du) * (1 - dv) * C00 +
           du * (1 - dv) * C10 +
           (1 - du) * dv * C01 +
           du * dv * C11;
}




Texture Coordinate Generation


Texture coordinates (UV coordinates) are crucial for mapping textures onto 3D models. They define how textures are applied to surfaces, determining which part of a texture corresponds to which part of a surface.

Generating Texture Coordinates


For different geometric shapes, texture coordinates can be generated in various ways:

Planar Mapping: Directly projects a texture onto a plane. For a rectangle in the XY plane, \( (u, v) \) can be directly mapped as:

\[
u = \frac{x - x_{\text{min}}}{x_{\text{max}} - x_{\text{min}}}, \quad v = \frac{y - y_{\text{min}}}{y_{\text{max}} - y_{\text{min}}}
\]

Cylindrical Mapping: For cylindrical shapes, the coordinates are generated based on the angle around the cylinder and height:

\[
u = \frac{\theta}{2\pi}, \quad v = \frac{z - z_{\text{min}}}{z_{\text{max}} - z_{\text{min}}}
\]

where \( \theta = \text{atan2}(y, x) \).
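
The planar and cylindrical formulas above translate almost directly into code. Here is a minimal sketch, assuming the bounding values (xMin, xMax, yMin, yMax, zMin, zMax) of the surface are known; these parameter names are illustrative. The cylindrical version wraps the angle into \([0, 2\pi)\) so that u stays in \([0, 1)\).

// Planar mapping for a point (x, y) on a rectangle in the XY plane
function getPlanarUV(x, y, xMin, xMax, yMin, yMax) {
    const u = (x - xMin) / (xMax - xMin);
    const v = (y - yMin) / (yMax - yMin);
    return { u, v };
}

// Cylindrical mapping: the angle around the axis drives u, the height drives v
function getCylindricalUV(x, y, z, zMin, zMax) {
    let theta = Math.atan2(y, x);         // Angle in [-pi, pi]
    if (theta < 0) theta += 2 * Math.PI;  // Wrap into [0, 2*pi)
    const u = theta / (2 * Math.PI);
    const v = (z - zMin) / (zMax - zMin);
    return { u, v };
}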

Spherical Mapping: For spherical shapes, the coordinates can be generated based on latitude and longitude:

\[
u = \frac{\theta + \pi}{2\pi}, \quad v = \frac{\phi + \frac{\pi}{2}}{\pi}
\]

where \( \theta \) and \( \phi \) are the azimuthal and polar angles respectively.

Example of Generating UV Coordinates for a Sphere in JavaScript:

function getSphereUV(x, y, z) {
    // Assumes a point on a unit sphere centered at the origin
    const u = 0.5 + Math.atan2(z, x) / (2 * Math.PI);
    const v = 0.5 - Math.asin(y) / Math.PI; // Invert y so v increases from bottom to top
    return { u, v };
}



Basic Texture Implementation


This section describes how different types of textures are implemented in a ray tracing system. Basic textures include solid colors, image textures, and procedural textures.

Texture Example


The texture class should define methods for retrieving colors based on UV coordinates and any other relevant properties.

class Texture {
    // Subclasses must return a color for the given UV coordinates
    getColor(u, v) {
        throw new Error("getColor() method must be implemented.");
    }
}


Solid Color Texture: A texture that returns a constant color regardless of UV coordinates.

class SolidColorTexture extends Texture {
    constructor(color) {
        super();
        this.color = color;
    }

    getColor(u, v) {
        return this.color; // Same color everywhere, regardless of UV
    }
}


Image Texture: A texture loaded from an image file, utilizing the bilinear interpolation method discussed earlier.

class ImageTexture extends Texture {
    constructor(image) {
        super();
        this.image = image; // Previously loaded image data
    }

    getColor(u, v) {
        // Convert UV to pixel coordinates
        const x = Math.floor(u * this.image.width);
        const y = Math.floor(v * this.image.height);
        return this.image.getPixel(x, y);
    }
}
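
Because every texture implements the same getColor(u, v) interface, a material can hold any of them interchangeably. A minimal usage sketch (the material object here is hypothetical, not part of a specific API):

// Hypothetical material that simply delegates to its texture
const material = { texture: new SolidColorTexture({ r: 1, g: 0, b: 0 }) };

// At a ray-surface intersection with interpolated UV coordinates (u, v):
const u = 0.25, v = 0.75;
const albedo = material.texture.getColor(u, v); // { r: 1, g: 0, b: 0 }

// Swapping in an ImageTexture requires no other changes:
// material.texture = new ImageTexture(loadedImage); // loadedImage is a placeholder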




Image Texture


Image textures are a widely used form of texture mapping where a bitmap image is mapped onto a surface. They can add rich details and complexity to a model without increasing the geometric complexity.

Loading and Using Image Textures


When using image textures, it is essential to load the image data and provide a way to access pixel colors.

class ImageTexture extends Texture {
    constructor(imageSrc) {
        super();
        this.image = new Image();
        this.image.src = imageSrc;
        this.image.onload = () => {
            // Draw the image into an offscreen canvas so its pixel data can be read
            this.canvas = document.createElement('canvas');
            this.canvas.width = this.image.width;
            this.canvas.height = this.image.height;
            const ctx = this.canvas.getContext('2d');
            ctx.drawImage(this.image, 0, 0);
            this.imageData = ctx.getImageData(0, 0, this.canvas.width, this.canvas.height);
        };
    }

    getColor(u, v) {
        if (!this.imageData) return { r: 0, g: 0, b: 0 }; // Default color if image not loaded yet

        const x = Math.floor(u * this.image.width);
        const y = Math.floor(v * this.image.height);
        const index = (y * this.canvas.width + x) * 4; // 4 bytes per pixel (RGBA)

        return {
            r: this.imageData.data[index] / 255,
            g: this.imageData.data[index + 1] / 255,
            b: this.imageData.data[index + 2] / 255,
        };
    }
}
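
Note that the image loads asynchronously, so getColor returns the fallback color until onload has fired; a renderer would normally wait for loading to complete before tracing rays. A quick usage sketch (the file path is a placeholder):

const brickTexture = new ImageTexture('bricks.png'); // 'bricks.png' is an illustrative path

// Sample after giving the browser a moment to load the image
setTimeout(() => {
    console.log(brickTexture.getColor(0.5, 0.5)); // Center of the image as { r, g, b } in [0, 1]
}, 1000);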


Solid and Procedural Texturing


Solid texturing refers to textures that are generated procedurally based on mathematical functions rather than being loaded from image files. This approach can yield complex patterns without requiring large image files.

Solid Texture Example


A simple example of a solid texture is one that generates a checkerboard pattern procedurally from the UV coordinates.

class CheckerboardTexture extends Texture {
    constructor(color1, color2, scale) {
        super();
        this.color1 = color1;
        this.color2 = color2;
        this.scale = scale; // Number of squares across the UV range
    }

    getColor(u, v) {
        const uScaled = Math.floor(u * this.scale);
        const vScaled = Math.floor(v * this.scale);
        const checker = (uScaled + vScaled) % 2 === 0;

        return checker ? this.color1 : this.color2;
    }
}
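
For instance, a black-and-white board with eight squares along each UV axis (the colors and scale are arbitrary choices):

const checker = new CheckerboardTexture(
    { r: 0, g: 0, b: 0 },  // color1: black
    { r: 1, g: 1, b: 1 },  // color2: white
    8                      // 8 squares across the UV range
);

console.log(checker.getColor(0.05, 0.05)); // { r: 0, g: 0, b: 0 } (first square)
console.log(checker.getColor(0.20, 0.05)); // { r: 1, g: 1, b: 1 } (adjacent square)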


Procedural Textures


Procedural textures can include patterns like noise, gradients, or more complex functions. For example, Perlin noise can be used to create realistic textures.

Example of Simple Noise Generation

Start by defining a simplex noise class.

class SimplexNoise {
    constructor() {
        this.permutation = [];
        this.grad3 = [
            [1, 1, 0], [-1, 1, 0], [1, -1, 0], [-1, -1, 0],
            [1, 0, 1], [-1, 0, 1], [1, 0, -1], [-1, 0, -1],
            [0, 1, 1], [0, -1, 1], [0, 1, -1], [0, -1, -1]
        ];
        this.initPermutation();
    }

    initPermutation() {
        const p = [];
        for (let i = 0; i < 256; i++) {
            p[i] = Math.floor(Math.random() * 256);
        }
        // Duplicate the permutation array to avoid index wrapping later
        for (let i = 0; i < 512; i++) {
            this.permutation[i] = p[i & 255];
        }
    }

    dot(g, x, y) {
        return g[0] * x + g[1] * y;
    }

    noise(xin, yin) {
        // Skewing and unskewing factors for 2D
        const F2 = 0.5 * (Math.sqrt(3) - 1);
        const G2 = (3 - Math.sqrt(3)) / 6;

        // Skew the input space to determine which simplex cell we're in.
        const s = (xin + yin) * F2;
        const i = Math.floor(xin + s);
        const j = Math.floor(yin + s);

        // Unskew the cell origin back to (x, y) space.
        const t = (i + j) * G2;
        const x0 = xin - (i - t); // The x and y distances from the cell origin.
        const y0 = yin - (j - t);

        // For the 2D case, the simplex shape is an equilateral triangle.
        // Determine which simplex we are in.
        const i1 = (x0 > y0) ? 1 : 0; // i1 = 1 if x0 > y0, 0 otherwise
        const j1 = 1 - i1;            // j1 = 1 if x0 <= y0, 0 otherwise

        // The x and y distances from the other two simplex corners.
        const x1 = x0 - i1 + G2;
        const y1 = y0 - j1 + G2;
        const x2 = x0 - 1 + 2 * G2;
        const y2 = y0 - 1 + 2 * G2;

        // Hash coordinates of the three simplex corners.
        const ii = i & 255;
        const jj = j & 255;
        const gi0 = this.permutation[ii + this.permutation[jj]] % 12;
        const gi1 = this.permutation[ii + i1 + this.permutation[jj + j1]] % 12;
        const gi2 = this.permutation[ii + 1 + this.permutation[jj + 1]] % 12;

        // Calculate the contribution from the three corners.
        const n0 = this.calculateContribution(gi0, x0, y0);
        const n1 = this.calculateContribution(gi1, x1, y1);
        const n2 = this.calculateContribution(gi2, x2, y2);

        // Add contributions together and scale the result to roughly [-1, 1].
        return (n0 + n1 + n2) * 70;
    }

    calculateContribution(gi, x, y) {
        const grad = this.grad3[gi];
        let t = 0.5 - x * x - y * y; // Falloff based on the squared distance to the corner
        if (t < 0) return 0;         // Outside the corner's radius of influence
        t *= t;
        return (t * t) * this.dot(grad, x, y); // (t^4) * dot(gradient, distance)
    }
}

// Example usage
const noise = new SimplexNoise();
const value = noise.noise(0.5, 0.5);
console.log(value);


Using the noise function to generate a texture

function generateNoiseTexture(width, height) {
    const noiseTexture = new Uint8ClampedArray(width * height * 4);
    const simplex = new SimplexNoise();

    for (let y = 0; y < height; y++) {
        for (let x = 0; x < width; x++) {
            const value = simplex.noise(x / 100, y / 100); // Scale coordinates to control noise frequency
            const colorValue = Math.floor((value + 1) * 127.5); // Map [-1, 1] to [0, 255]

            const index = (y * width + x) * 4;
            noiseTexture[index] = colorValue;      // Red
            noiseTexture[index + 1] = colorValue;  // Green
            noiseTexture[index + 2] = colorValue;  // Blue
            noiseTexture[index + 3] = 255;         // Alpha
        }
    }
    return noiseTexture;
}

// Example usage to create a canvas image
const width = 512;
const height = 512;
const noiseData = generateNoiseTexture(width, height);

const canvas = document.createElement('canvas');
canvas.width = width;
canvas.height = height;
const ctx = canvas.getContext('2d');
const imageData = ctx.createImageData(width, height);
imageData.data.set(noiseData);
ctx.putImageData(imageData, 0, 0);

document.body.appendChild(canvas);


Noise


Noise is often used in textures to create randomness or variation, contributing to the realism of rendered images. It can simulate natural phenomena, such as rough surfaces, clouds, and terrains.

Types of Noise


Perlin Noise: A gradient noise that produces smooth transitions, often used for textures.

Simplex Noise: An improved version of Perlin noise with better visual quality and performance in higher dimensions.
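
In practice, several octaves of a noise function are often summed, each at a higher frequency and lower amplitude, to produce richer, fractal-like variation. Here is a minimal sketch using the SimplexNoise class defined earlier; the octave count, lacunarity, and gain values are arbitrary choices.

// Sum several octaves of simplex noise for fractal-like detail.
// lacunarity controls how fast the frequency grows per octave,
// gain controls how fast the amplitude shrinks per octave.
function fractalNoise(simplex, x, y, octaves = 4, lacunarity = 2.0, gain = 0.5) {
    let sum = 0;
    let amplitude = 1;
    let frequency = 1;
    let maxValue = 0; // Used to normalize the result back to roughly [-1, 1]

    for (let i = 0; i < octaves; i++) {
        sum += amplitude * simplex.noise(x * frequency, y * frequency);
        maxValue += amplitude;
        amplitude *= gain;
        frequency *= lacunarity;
    }
    return sum / maxValue;
}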

Example of Using Noise in Texturing


You can combine noise with color manipulation to create more dynamic textures. For example, to create a cloud-like texture:

// Shared noise generator for the texture (uses the SimplexNoise class above)
const simplexNoise = new SimplexNoise();

class CloudTexture extends Texture {
    constructor(scale) {
        super();
        this.scale = scale;
    }

    getColor(u, v) {
        const noiseValue = simplexNoise.noise(u * this.scale, v * this.scale);
        const colorValue = Math.max(0, Math.min(1, noiseValue)); // Clamp to [0, 1]
        return { r: colorValue, g: colorValue, b: colorValue }; // White clouds
    }
}
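
A quick usage sketch (the scale value is arbitrary):

const clouds = new CloudTexture(4); // Roughly four noise "cells" across the UV range
const sample = clouds.getColor(0.3, 0.7);
console.log(sample); // Greyscale color with r === g === b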







