www.xbdev.net
xbdev - software development
 

Physically-Based
Rendering

Lights and Rays ...

 



Chapter 9: Textures


Textures play a crucial role in ray tracing as they add detail and realism to surfaces by providing color, patterns, and other surface properties. This section explores various aspects of textures, including sampling and antialiasing, texture coordinate generation, interfaces for textures, image textures, solid and procedural texturing, and the use of noise.


Sampling and Antialiasing


Sampling in the context of textures refers to the process of retrieving color or intensity information from a texture map. Antialiasing is a technique used to minimize artifacts, such as jagged edges or moiré patterns, which can occur due to insufficient sampling when rendering images.

Sampling from Textures


When sampling from textures, especially at non-integer coordinates, bilinear interpolation is commonly used. Bilinear interpolation considers the four nearest texels (texture pixels) to compute the color at a given UV coordinate.

For a lookup at continuous texel coordinates \( (u', v') = (u \cdot \text{width}, v \cdot \text{height}) \), let \( x_0 = \lfloor u' \rfloor \), \( y_0 = \lfloor v' \rfloor \), \( x_1 = x_0 + 1 \), and \( y_1 = y_0 + 1 \). The colors of the four surrounding texels are:
- \( C_{00} = T(x_0, y_0) \)
- \( C_{10} = T(x_1, y_0) \)
- \( C_{01} = T(x_0, y_1) \)
- \( C_{11} = T(x_1, y_1) \)

where \( T \) is the texture function.

With the fractional offsets \( s = u' - x_0 \) and \( t = v' - y_0 \), the color \( C \) at \( (u, v) \) can be computed as follows:

\[
C = (1 - s)(1 - t) \, C_{00} + s (1 - t) \, C_{10} + (1 - s) \, t \, C_{01} + s \, t \, C_{11}
\]

Antialiasing Techniques


To implement antialiasing for textures, various techniques can be employed:
- Supersampling: Involves sampling the texture at higher resolutions and averaging the results.
- Mipmap: A precomputed sequence of textures at different resolutions that allows for smoother transitions and reduces aliasing when textures are viewed at a distance.
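As a rough illustration of supersampling, the sketch below averages an n-by-n grid of sub-samples spread across one texel footprint. The texture object here is a hypothetical helper exposing width, height, and a getPixel(x, y) method that returns a scalar intensity:

```javascript
// Supersampling sketch: average an n-by-n grid of sub-samples around (u, v).
// texture.getPixel(x, y) is an assumed API returning a scalar intensity.
function supersample(texture, u, v, n) {
    let sum = 0;
    for (let i = 0; i < n; i++) {
        for (let j = 0; j < n; j++) {
            // Offset each sub-sample within the footprint of a single texel
            const su = u + ((i + 0.5) / n - 0.5) / texture.width;
            const sv = v + ((j + 0.5) / n - 0.5) / texture.height;
            // Clamp to the texture bounds before the nearest-neighbour lookup
            const x = Math.min(Math.max(Math.floor(su * texture.width), 0), texture.width - 1);
            const y = Math.min(Math.max(Math.floor(sv * texture.height), 0), texture.height - 1);
            sum += texture.getPixel(x, y);
        }
    }
    return sum / (n * n); // Mean of all sub-samples
}
```

Mipmapping trades this per-lookup cost for precomputed, progressively downsampled copies of the texture, choosing a level based on how much texture area one screen pixel covers.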

Example of Bilinear Interpolation in JavaScript:

function bilinearInterpolation(texture, u, v) {
    const width = texture.width;
    const height = texture.height;
    
    // Texel coordinates of the lower-left neighbour, clamped to the texture bounds
    const x0 = Math.min(Math.floor(u * width), width - 1);
    const y0 = Math.min(Math.floor(v * height), height - 1);
    const x1 = Math.min(x0 + 1, width - 1);
    const y1 = Math.min(y0 + 1, height - 1);
    
    // getPixel() is assumed to return a scalar intensity;
    // for RGB textures, interpolate each channel separately
    const C00 = texture.getPixel(x0, y0);
    const C10 = texture.getPixel(x1, y0);
    const C01 = texture.getPixel(x0, y1);
    const C11 = texture.getPixel(x1, y1);
    
    // Fractional offsets within the texel cell
    const du = u * width - x0;
    const dv = v * height - y0;

    // Blend the four neighbours by their area weights
    return (1 - du) * (1 - dv) * C00 +
           du * (1 - dv) * C10 +
           (1 - du) * dv * C01 +
           du * dv * C11;
}




Texture Coordinate Generation


Texture coordinates (UV coordinates) are crucial for mapping textures onto 3D models. They define how textures are applied to surfaces, determining which part of a texture corresponds to which part of a surface.

Generating Texture Coordinates


For different geometric shapes, texture coordinates can be generated in various ways:

Planar Mapping: Directly projects a texture onto a plane. For a rectangle in the XY plane, \( (u, v) \) can be directly mapped as:

\[
u = \frac{x - x_{\text{min}}}{x_{\text{max}} - x_{\text{min}}}, \quad v = \frac{y - y_{\text{min}}}{y_{\text{max}} - y_{\text{min}}}
\]
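The planar formula is a direct normalization; a minimal sketch, where the bounds object {xMin, xMax, yMin, yMax} is a hypothetical helper holding the rectangle's extents:

```javascript
// Planar mapping: normalize a point's x and y into [0, 1] over the rectangle.
// bounds = {xMin, xMax, yMin, yMax} is an assumed parameter for this sketch.
function getPlanarUV(x, y, bounds) {
    const u = (x - bounds.xMin) / (bounds.xMax - bounds.xMin);
    const v = (y - bounds.yMin) / (bounds.yMax - bounds.yMin);
    return { u, v };
}
```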

Cylindrical Mapping: For cylindrical shapes, the coordinates are generated from the angle around the axis and the height:

\[
u = \frac{\theta + \pi}{2\pi}, \quad v = \frac{z - z_{\text{min}}}{z_{\text{max}} - z_{\text{min}}}
\]

where \( \theta = \text{atan2}(y, x) \in [-\pi, \pi] \); adding \( \pi \) remaps \( u \) into \( [0, 1] \).
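A corresponding sketch for cylindrical mapping, assuming the cylinder's axis is z and remapping atan2's [-π, π] output into [0, 1]:

```javascript
// Cylindrical mapping: u from the angle around the z axis, v from the height.
// zMin and zMax are the cylinder's extents (assumed parameters for this sketch).
function getCylinderUV(x, y, z, zMin, zMax) {
    const theta = Math.atan2(y, x);              // Angle around the axis, in [-pi, pi]
    const u = (theta + Math.PI) / (2 * Math.PI); // Remap to [0, 1]
    const v = (z - zMin) / (zMax - zMin);        // Normalized height
    return { u, v };
}
```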

Spherical Mapping: For spherical shapes, the coordinates can be generated based on latitude and longitude:

\[
u = \frac{\theta + \pi}{2\pi}, \quad v = \frac{\phi + \frac{\pi}{2}}{\pi}
\]

where \( \theta \) and \( \phi \) are the azimuthal and polar angles respectively.

Example of Generating UV Coordinates for a Sphere in JavaScript:

function getSphereUV(x, y, z) {
    // Assumes (x, y, z) lies on a unit sphere centred at the origin, with y as the polar axis
    const u = 0.5 + Math.atan2(z, x) / (2 * Math.PI);
    const v = 0.5 - Math.asin(y) / Math.PI; // asin(y) in [-pi/2, pi/2]; flipped so v = 0 at the north pole
    return { u, v };
}



Basic Texture Implementation


This section describes how different types of textures can be implemented in a ray tracing system. Basic textures include image textures, solid colors, and procedural textures.

Texture Interface


The texture class should define methods for retrieving colors based on UV coordinates and any other relevant properties.

class Texture {
    getColor(u, v) {
        throw new Error("getColor() method must be implemented.");
    }
}


Solid Color Texture: A texture that returns a constant color regardless of UV coordinates.

class SolidColorTexture extends Texture {
    constructor(color) {
        super();
        this.color = color;
    }

    getColor(u, v) {
        return this.color;
    }
}


Image Texture: A texture loaded from an image file, utilizing the bilinear interpolation method discussed earlier.

class ImageTexture extends Texture {
    constructor(image) {
        super();
        this.image = image; // Pre-loaded image data
    }

    getColor(u, v) {
        // Convert UV to pixel coordinates, clamping so u = 1 or v = 1 stays in bounds
        const x = Math.min(Math.floor(u * this.image.width), this.image.width - 1);
        const y = Math.min(Math.floor(v * this.image.height), this.image.height - 1);
        return this.image.getPixel(x, y); // Nearest-neighbour lookup
    }
}




Image Texture


Image textures are a widely used form of texture mapping where a bitmap image is mapped onto a surface. They can add rich details and complexity to a model without increasing the geometric complexity.

Loading and Using Image Textures


When using image textures, it is essential to load the image data and provide a way to access pixel colors.

class ImageTexture extends Texture {
    constructor(imageSrc) {
        super();
        this.image = new Image();
        this.image.src = imageSrc;
        this.image.onload = () => {
            this.canvas = document.createElement('canvas');
            this.canvas.width = this.image.width;
            this.canvas.height = this.image.height;
            const ctx = this.canvas.getContext('2d');
            ctx.drawImage(this.image, 0, 0);
            this.imageData = ctx.getImageData(0, 0, this.canvas.width, this.canvas.height);
        };
    }

    getColor(u, v) {
        if (!this.imageData) return { r: 0, g: 0, b: 0 }; // Default color until the image has loaded

        // Clamp so u = 1 or v = 1 does not index past the last pixel
        const x = Math.min(Math.floor(u * this.image.width), this.image.width - 1);
        const y = Math.min(Math.floor(v * this.image.height), this.image.height - 1);
        const index = (y * this.canvas.width + x) * 4;

        return {
            r: this.imageData.data[index] / 255,
            g: this.imageData.data[index + 1] / 255,
            b: this.imageData.data[index + 2] / 255,
        };
    }
}


Solid and Procedural Texturing


Solid texturing evaluates a texture function directly over 3D space, and procedural texturing generates patterns from mathematical functions rather than loading them from image files. Both approaches can yield complex patterns without requiring large image files.

Solid Texture Example


A simple example of a procedural texture is one that generates a checkerboard pattern.

class CheckerboardTexture extends Texture {
    constructor(color1, color2, scale) {
        super();
        this.color1 = color1;
        this.color2 = color2;
        this.scale = scale;
    }

    getColor(u, v) {
        const uScaled = Math.floor(u * this.scale);
        const vScaled = Math.floor(v * this.scale);
        // A true modulo keeps the pattern consistent for negative coordinates
        const checker = ((uScaled + vScaled) % 2 + 2) % 2 === 0;

        return checker ? this.color1 : this.color2;
    }
}


Procedural Textures


Procedural textures can include patterns like noise, gradients, or more complex functions. For example, Perlin noise can be used to create realistic textures.
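As a simpler procedural example than noise, a gradient texture linearly blends two colors along one axis. A minimal sketch (shown standalone for brevity; in the texture hierarchy above it would extend Texture):

```javascript
// Gradient texture: linearly blends colorA into colorB along the u axis.
// colorA and colorB are {r, g, b} objects with channels in [0, 1].
class GradientTexture {
    constructor(colorA, colorB) {
        this.colorA = colorA;
        this.colorB = colorB;
    }

    getColor(u, v) {
        const mix = (a, b) => a + (b - a) * u; // Linear interpolation by u
        return {
            r: mix(this.colorA.r, this.colorB.r),
            g: mix(this.colorA.g, this.colorB.g),
            b: mix(this.colorA.b, this.colorB.b),
        };
    }
}
```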

Example of Simple Noise Generation

Start by defining a simplex noise class.

class SimplexNoise {
    constructor() {
        // 12 gradient directions for 2D/3D simplex noise
        this.grad3 = [
            [1, 1, 0], [-1, 1, 0], [1, -1, 0], [-1, -1, 0],
            [1, 0, 1], [-1, 0, 1], [1, 0, -1], [-1, 0, -1],
            [0, 1, 1], [0, -1, 1], [0, 1, -1], [0, -1, -1]
        ];
        this.permutation = [];
        this.initPermutation();
    }

    initPermutation() {
        // Fisher-Yates shuffle of 0..255, so every index appears exactly once
        const p = [];
        for (let i = 0; i < 256; i++) p[i] = i;
        for (let i = 255; i > 0; i--) {
            const j = Math.floor(Math.random() * (i + 1));
            [p[i], p[j]] = [p[j], p[i]];
        }
        // Duplicate the permutation array to avoid index wrapping
        for (let i = 0; i < 512; i++) {
            this.permutation[i] = p[i & 255];
        }
    }

    dot(g, x, y) {
        return g[0] * x + g[1] * y;
    }

    noise(xin, yin) {
        const F2 = 0.5 * (Math.sqrt(3) - 1); // 2D skew factor
        const G2 = (3 - Math.sqrt(3)) / 6;   // 2D unskew factor

        // Skew the input space to determine which simplex cell we're in
        const s = (xin + yin) * F2;
        const i = Math.floor(xin + s);
        const j = Math.floor(yin + s);

        // Unskew the cell origin back to (x, y) space
        const t = (i + j) * G2;
        const x0 = xin - (i - t); // The x and y distances from the cell origin
        const y0 = yin - (j - t);

        // For the 2D case, the simplex shape is an equilateral triangle.
        // Determine which of the cell's two triangles we are in.
        const i1 = (x0 > y0) ? 1 : 0; // Lower triangle: offset (1, 0)
        const j1 = 1 - i1;            // Upper triangle: offset (0, 1)

        // Offsets of the middle and last corners in (x, y) space
        const x1 = x0 - i1 + G2;
        const y1 = y0 - j1 + G2;
        const x2 = x0 - 1 + 2 * G2;
        const y2 = y0 - 1 + 2 * G2;

        // Hash coordinates of the three simplex corners
        const ii = i & 255;
        const jj = j & 255;
        const gi0 = this.permutation[ii + this.permutation[jj]] % 12;
        const gi1 = this.permutation[ii + i1 + this.permutation[jj + j1]] % 12;
        const gi2 = this.permutation[ii + 1 + this.permutation[jj + 1]] % 12;

        // Calculate the contribution from the three corners
        const n0 = this.calculateContribution(gi0, x0, y0);
        const n1 = this.calculateContribution(gi1, x1, y1);
        const n2 = this.calculateContribution(gi2, x2, y2);

        // Add contributions together, scaled to roughly [-1, 1]
        return (n0 + n1 + n2) * 70;
    }

    calculateContribution(gi, x, y) {
        let t = 0.5 - x * x - y * y; // Falloff based on squared distance to the corner
        if (t < 0) return 0;         // Outside the corner's radius of influence
        t *= t;
        return t * t * this.dot(this.grad3[gi], x, y); // (t^4) * dot(gradient, offset)
    }
}

// Example usage
const noise = new SimplexNoise();
const value = noise.noise(0.5, 0.5);
console.log(value);


Using the noise function to generate a texture

function generateNoiseTexture(width, height) {
    const noiseTexture = new Uint8ClampedArray(width * height * 4);
    const simplex = new SimplexNoise();
    
    for (let y = 0; y < height; y++) {
        for (let x = 0; x < width; x++) {
            const value = simplex.noise(x / 100, y / 100); // Scale coordinates to control noise frequency
            const colorValue = Math.floor((value + 1) * 127.5); // Normalize to [0, 255]

            const index = (y * width + x) * 4;
            noiseTexture[index] = colorValue;      // Red
            noiseTexture[index + 1] = colorValue;  // Green
            noiseTexture[index + 2] = colorValue;  // Blue
            noiseTexture[index + 3] = 255;          // Alpha
        }
    }
    return noiseTexture;
}

// Example usage to create a canvas image
const width = 512;
const height = 512;
const noiseData = generateNoiseTexture(width, height);

const canvas = document.createElement('canvas');
canvas.width = width;
canvas.height = height;
const ctx = canvas.getContext('2d');
const imageData = ctx.createImageData(width, height);
imageData.data.set(noiseData);
ctx.putImageData(imageData, 0, 0);

document.body.appendChild(canvas);


Noise


Noise is often used in textures to create randomness or variation, contributing to the realism of rendered images. It can simulate natural phenomena, such as rough surfaces, clouds, and terrains.

Types of Noise


Perlin Noise: A gradient noise that produces smooth transitions, often used for textures.

Simplex Noise: An improved version of Perlin noise with better visual quality and performance in higher dimensions.

Example of Using Noise in Texturing


You can combine noise with color manipulation to create more dynamic textures. For example, to create a cloud-like texture:

class CloudTexture extends Texture {
    constructor(scale) {
        super();
        this.scale = scale;
        this.noise = new SimplexNoise(); // Reuse the noise class defined above
    }

    getColor(u, v) {
        const noiseValue = this.noise.noise(u * this.scale, v * this.scale);
        // Remap [-1, 1] to [0, 1], clamping any slight overshoot
        const colorValue = Math.max(0, Math.min(1, (noiseValue + 1) * 0.5));
        return { r: colorValue, g: colorValue, b: colorValue }; // Greyscale clouds
    }
}
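A single octave of noise tends to look blobby; richer cloud shapes usually come from summing several octaves at increasing frequency and decreasing amplitude, known as fractional Brownian motion (fBm). A minimal sketch, written against any 2D noise function (for example, the noise method of the SimplexNoise class above):

```javascript
// Fractional Brownian motion: sum octaves of a 2D noise function.
// noiseFn(x, y) can be any noise returning values in [-1, 1].
function fbm(noiseFn, x, y, octaves = 4, lacunarity = 2, gain = 0.5) {
    let amplitude = 1;
    let frequency = 1;
    let sum = 0;
    let norm = 0;
    for (let o = 0; o < octaves; o++) {
        sum += amplitude * noiseFn(x * frequency, y * frequency);
        norm += amplitude;        // Track total amplitude for normalization
        amplitude *= gain;        // Each octave contributes less...
        frequency *= lacunarity;  // ...at a higher frequency
    }
    return sum / norm; // Normalized back to the noise function's range
}
```

The lacunarity and gain parameters control how quickly detail is added and attenuated; 2 and 0.5 are common defaults.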











Copyright (c) 2002-2026 xbdev.net - All rights reserved.
Designated articles, tutorials and software are the property of their respective owners.