The GPURenderPipeline interface of the WebGPU API represents a pipeline that controls the vertex and fragment shader stages and can be used in a GPURenderPassEncoder or GPURenderBundleEncoder.
A GPURenderPipeline object instance can be created using the GPUDevice.createRenderPipeline() or GPUDevice.createRenderPipelineAsync() methods.
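Where pipeline compilation should not block other work, createRenderPipelineAsync() accepts the same descriptor and returns a Promise that resolves once the pipeline is ready to use. A minimal sketch, assuming a descriptor shaped like the pipelineDescriptor object constructed below:

// Resolves when the pipeline can be used without additional stalling
// (assumes pipelineDescriptor as defined in the example that follows)
const asyncRenderPipeline = await device.createRenderPipelineAsync(pipelineDescriptor);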
Our basic render demo provides a simple example of the construction of a valid render pipeline descriptor object, which is then used to create a GPURenderPipeline via a createRenderPipeline() call.
const vertexBuffers = [
  {
    attributes: [
      {
        shaderLocation: 0, // position
        offset: 0,
        format: "float32x4",
      },
      {
        shaderLocation: 1, // color
        offset: 16,
        format: "float32x4",
      },
    ],
    arrayStride: 32,
    stepMode: "vertex",
  },
];

const pipelineDescriptor = {
  vertex: {
    module: shaderModule,
    entryPoint: "vertex_main",
    buffers: vertexBuffers,
  },
  fragment: {
    module: shaderModule,
    entryPoint: "fragment_main",
    targets: [
      {
        format: navigator.gpu.getPreferredCanvasFormat(),
      },
    ],
  },
  primitive: {
    topology: "triangle-list",
  },
  layout: "auto",
};

const renderPipeline = device.createRenderPipeline(pipelineDescriptor);
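Once created, the pipeline is set on a GPURenderPassEncoder before issuing draw calls. The following is a minimal sketch of that step, assuming a configured canvas context and a GPUBuffer named vertexBuffer holding the vertex data (neither is shown in the snippet above):

const commandEncoder = device.createCommandEncoder();

const passEncoder = commandEncoder.beginRenderPass({
  colorAttachments: [
    {
      clearValue: { r: 0.0, g: 0.5, b: 1.0, a: 1.0 }, // hypothetical clear color
      loadOp: "clear",
      storeOp: "store",
      view: context.getCurrentTexture().createView(), // assumes a configured canvas context
    },
  ],
});

passEncoder.setPipeline(renderPipeline);
passEncoder.setVertexBuffer(0, vertexBuffer); // assumes a GPUBuffer containing the vertex data
passEncoder.draw(3);
passEncoder.end();

device.queue.submit([commandEncoder.finish()]);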