
Physically-Based
Rendering

Lights and Rays ...

 



Chapter 8: Materials


Materials in computer graphics define how surfaces interact with light, influencing their appearance and visual characteristics. They are characterized by their reflective, refractive, and absorptive properties, which determine how light behaves when it hits the surface. Different types of materials, such as diffuse, specular, or transparent, allow for the simulation of a wide range of real-world surfaces in rendering applications.

Bidirectional Scattering Distribution Functions (BSDFs)


BSDF (Bidirectional Scattering Distribution Function) is a function that describes how light is scattered when it hits a surface. It encompasses both BRDFs (Bidirectional Reflectance Distribution Functions), which account for reflected light, and BTDFs (Bidirectional Transmission Distribution Functions), which account for transmitted light.

In ray tracing, BSDFs are essential for determining how light interacts with materials. The equation that defines a BSDF is:

\[
f(\omega_o, \omega_i) = \frac{dL_o(\omega_o)}{dE_i(\omega_i)}
\]

where:
\(\omega_o\) is the outgoing direction,
\(\omega_i\) is the incoming direction,
\(L_o(\omega_o)\) is the outgoing radiance,
\(E_i(\omega_i)\) is the incoming irradiance.

The BSDF integrates both reflection and transmission, allowing you to model transparent materials (e.g., glass) or reflective surfaces (e.g., metals).
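The division by \(\pi\) in a Lambertian BRDF (used in the code below) is what keeps the material energy-conserving: integrating \(f \cos\theta\) over the hemisphere returns exactly the albedo. A quick Monte Carlo sanity check (a sketch, not part of the original chapter; the function name is ours) can confirm this numerically:

```javascript
// Sanity check: a Lambertian BRDF f = albedo / PI, integrated against the
// cosine term over the hemisphere, should return the albedo (energy
// conservation). Monte Carlo estimate using uniform hemisphere sampling,
// where pdf = 1 / (2 * PI) and cosTheta is uniform in [0, 1].
function hemisphereIntegralOfLambert(albedo, numSamples) {
    let sum = 0;
    for (let i = 0; i < numSamples; i++) {
        const cosTheta = Math.random();        // uniform hemisphere: z = u1
        const f = albedo / Math.PI;            // Lambertian BRDF value
        sum += f * cosTheta / (1 / (2 * Math.PI)); // estimator: f * cos / pdf
    }
    return sum / numSamples;
}

console.log(hemisphereIntegralOfLambert(0.8, 100000)); // ≈ 0.8
```

If the BRDF were `albedo` instead of `albedo / Math.PI`, the integral would come out to roughly π times the albedo, i.e. the surface would emit more energy than it receives.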

Example: Implementing a BSDF in JavaScript


For a basic BSDF, we combine both reflection (BRDF) and transmission (BTDF) components.

class BSDF {
    constructor(reflectance, transmittance) {
        this.reflectance = reflectance;      // BRDF reflectance
        this.transmittance = transmittance;  // BTDF transmittance
    }

    evaluate(incoming, outgoing, normal) {
        const reflection = this.evaluateBRDF(incoming, outgoing, normal);
        const transmission = this.evaluateBTDF(incoming, outgoing, normal);
        return reflection + transmission;
    }

    // Evaluate reflection (BRDF component)
    evaluateBRDF(incoming, outgoing, normal) {
        const cosTheta = Math.max(0, dot(incoming, normal));
        return this.reflectance * cosTheta / Math.PI;  // Lambertian reflection
    }

    // Evaluate transmission (BTDF component)
    evaluateBTDF(incoming, outgoing, normal) {
        const cosTheta = Math.abs(dot(incoming, normal));
        return this.transmittance * cosTheta;  // Simple transmission
    }
}

// Utility: dot product of two 3D vectors
function dot(a, b) {
    return a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
}

// Usage
const bsdf = new BSDF(0.8, 0.2);   // Reflective and transmissive material
const incoming = [1, -1, 0];       // Incoming ray
const outgoing = [-1, 1, 0];       // Outgoing ray
const normal = [0, 1, 0];          // Surface normal
const bsdfValue = bsdf.evaluate(incoming, outgoing, normal);
console.log("BSDF Value:", bsdfValue);


This code evaluates the BSDF for a basic reflective and transmissive material using Lambertian reflection for diffuse surfaces.

Implementing Materials


Materials define how surfaces interact with light. You specify the shading model (e.g., diffuse, reflective, transmissive) and provide the necessary data (such as the BSDF) to determine the appearance of an object.

General Material Implementation in JavaScript


This implementation shows a basic material system. The `Material` class uses a BSDF to evaluate the interaction between incoming and outgoing light, which will be later used in the ray tracing loop.

class Material {
    constructor(bsdf) {
        this.bsdf = bsdf;
    }

    // Method to compute shading at a surface point
    shade(ray, intersection, scene) {
        const incoming = ray.direction;
        const normal = intersection.normal;
        const outgoing = this.computeOutgoing(incoming, normal);

        // Get the BSDF response for the given incoming and outgoing directions
        return this.bsdf.evaluate(incoming, outgoing, normal);
    }

    // Dummy implementation for outgoing direction
    computeOutgoing(incoming, normal) {
        // Reflect the ray for simplicity
        return reflect(incoming, normal);
    }
}

// Vector utilities
function scale(v, s) {
    return [v[0] * s, v[1] * s, v[2] * s];
}

function subtract(a, b) {
    return [a[0] - b[0], a[1] - b[1], a[2] - b[2]];
}

// Utility function to reflect a vector: r = d - 2(d.n)n
function reflect(direction, normal) {
    return subtract(direction, scale(normal, 2 * dot(direction, normal)));
}

// Usage
const material = new Material(new BSDF(0.8, 0.2));
const ray = { direction: [1, -1, 0] };       // Incoming ray
const intersection = { normal: [0, 1, 0] };  // Surface normal at intersection
const scene = {};                            // Dummy scene object
const shadedColor = material.shade(ray, intersection, scene);
console.log("Shaded Color:", shadedColor);




Material Types


Diffuse: A surface that reflects light equally in all directions (Lambertian).
Specular: A shiny, mirror-like surface where the angle of reflection equals the angle of incidence.
Glossy: A surface that reflects light predominantly in one direction but with some spread.
Transparent: A material that allows light to pass through, potentially refracting based on the refractive index.
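The transparent case is the only one of these that the `Material` class above does not cover: instead of reflecting the ray, it bends it according to Snell's law. A minimal refraction sketch (our own addition, not from the chapter; `eta` is the ratio of refractive indices n1/n2, and it returns `null` on total internal reflection) could look like this:

```javascript
// Small vector helpers (3D vectors as arrays)
const dot = (a, b) => a[0] * b[0] + a[1] * b[1] + a[2] * b[2];
const scale = (v, s) => [v[0] * s, v[1] * s, v[2] * s];
const subtract = (a, b) => [a[0] - b[0], a[1] - b[1], a[2] - b[2]];

// Refract a (normalized) direction through a surface normal using Snell's
// law: t = eta * d - (eta * (d.n) + sqrt(k)) * n, with
// k = 1 - eta^2 * (1 - (d.n)^2). If k < 0 the ray undergoes total internal
// reflection and no refracted direction exists.
function refract(direction, normal, eta) {
    const d = dot(direction, normal);
    const k = 1 - eta * eta * (1 - d * d);
    if (k < 0) return null;  // total internal reflection
    return subtract(scale(direction, eta), scale(normal, eta * d + Math.sqrt(k)));
}

// A ray hitting the surface head-on passes straight through
console.log(refract([0, -1, 0], [0, 1, 0], 1 / 1.5));
```

At normal incidence the refracted direction equals the incoming direction; at grazing angles from a dense medium (eta > 1) the `k < 0` branch fires, which a transparent material would handle by falling back to reflection.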

Bump Mapping


Bump mapping is a technique used to simulate small surface details (like wrinkles or grooves) without altering the actual geometry of the object. Instead of modifying the mesh, bump mapping alters the surface normal at a point, creating the illusion of a more complex surface when light interacts with it.

Bump mapping uses a height map (a grayscale texture) to perturb the surface normal during shading calculations. This change in the normal creates the appearance of depth without increasing the polygon count.

Bump Mapping Math


The new perturbed normal \( \mathbf{n}' \) is computed as:

\[
\mathbf{n}' = \mathbf{n} + \frac{\partial h}{\partial u} \mathbf{t} + \frac{\partial h}{\partial v} \mathbf{b}
\]

where:
\( h(u, v) \) is the height map,
\( \mathbf{t} \) and \( \mathbf{b} \) are the tangent and bitangent vectors of the surface,
\( \mathbf{n} \) is the original surface normal.

Bump Mapping Example in JavaScript


function perturbNormal(normal, uv, heightMap, tangent, bitangent) {
    // Finite-difference height deltas along u and v
    const deltaU = getHeight(heightMap, uv[0] + 0.01, uv[1]) - getHeight(heightMap, uv[0] - 0.01, uv[1]);
    const deltaV = getHeight(heightMap, uv[0], uv[1] + 0.01) - getHeight(heightMap, uv[0], uv[1] - 0.01);

    // Perturb the normal based on the height map deltas
    const newNormal = add(add(normal, scale(tangent, deltaU)), scale(bitangent, deltaV));
    return normalize(newNormal);
}

function getHeight(heightMap, u, v) {
    // This function would sample the height from the texture at UV coordinates (u, v)
    // For simplicity, we use a procedural pattern here (replace with actual texture sampling logic)
    return Math.sin(u * 10) * Math.sin(v * 10) * 0.1;  // Example height pattern
}

// Vector utilities
function add(a, b) {
    return [a[0] + b[0], a[1] + b[1], a[2] + b[2]];
}

function normalize(v) {
    const len = Math.sqrt(v[0] * v[0] + v[1] * v[1] + v[2] * v[2]);
    return [v[0] / len, v[1] / len, v[2] / len];
}

// Example usage:
const normal = [0, 1, 0];      // Original normal
const tangent = [1, 0, 0];     // Tangent vector
const bitangent = [0, 0, 1];   // Bitangent vector
const uv = [0.5, 0.5];         // UV coordinates for height map
const heightMap = {};          // Placeholder for height map texture

const perturbedNormal = perturbNormal(normal, uv, heightMap, tangent, bitangent);
console.log("Perturbed Normal:", perturbedNormal);


This code demonstrates the basic idea of bump mapping: perturbing the surface normal using a height map. The `getHeight` function simulates a height map texture, and the `perturbNormal` function calculates the new normal by adjusting the original normal based on the height deltas.
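In a real renderer, `getHeight` would sample an actual texture rather than a procedural pattern. A sketch of what that sampling could look like (our own assumption, not from the chapter: the height map is a hypothetical `{ width, height, data }` object with a flat, row-major array of grayscale values in [0, 1], sampled with bilinear filtering):

```javascript
// Bilinearly sample a grayscale height map at UV coordinates (u, v).
// heightMap is assumed to be { width, height, data } with data a flat
// row-major array of values in [0, 1].
function sampleHeight(heightMap, u, v) {
    // Wrap UVs into [0, 1) so the texture tiles
    u = u - Math.floor(u);
    v = v - Math.floor(v);
    const x = u * (heightMap.width - 1);
    const y = v * (heightMap.height - 1);
    const x0 = Math.floor(x), y0 = Math.floor(y);
    const x1 = Math.min(x0 + 1, heightMap.width - 1);
    const y1 = Math.min(y0 + 1, heightMap.height - 1);
    const fx = x - x0, fy = y - y0;
    const texel = (xi, yi) => heightMap.data[yi * heightMap.width + xi];
    // Blend the four surrounding texels
    const top = texel(x0, y0) * (1 - fx) + texel(x1, y0) * fx;
    const bottom = texel(x0, y1) * (1 - fx) + texel(x1, y1) * fx;
    return top * (1 - fy) + bottom * fy;
}
```

Bilinear filtering matters here because `perturbNormal` takes finite differences of the height: nearest-neighbour sampling would make the perturbed normal change in discrete steps, producing visible faceting in the bump effect.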

Applying Bump Mapping in Material Shading


To apply bump mapping in the material shading process, we modify the normal before using it in the lighting calculations:

class BumpMappedMaterial extends Material {
    constructor(bsdf, heightMap) {
        super(bsdf);
        this.heightMap = heightMap;
    }

    shade(ray, intersection, scene) {
        const normal = this.perturbNormal(intersection.normal, intersection.uv);
        const incoming = ray.direction;
        const outgoing = this.computeOutgoing(incoming, normal);

        // Get the BSDF response for the perturbed normal
        return this.bsdf.evaluate(incoming, outgoing, normal);
    }

    // Perturb the normal using the height map
    perturbNormal(normal, uv) {
        const tangent = [1, 0, 0];    // Example tangent
        const bitangent = [0, 0, 1];  // Example bitangent
        return perturbNormal(normal, uv, this.heightMap, tangent, bitangent);
    }
}

// Example usage
const bumpMappedMaterial = new BumpMappedMaterial(new BSDF(0.8, 0.2), {});
const ray = { direction: [1, -1, 0] };
const intersection = { normal: [0, 1, 0], uv: [0.5, 0.5] };
const shadedColor = bumpMappedMaterial.shade(ray, intersection, {});
console.log("Shaded Color with Bump Mapping:", shadedColor);











 
Copyright (c) 2002-2025 xbdev.net - All rights reserved.
Designated articles, tutorials and software are the property of their respective owners.