
GPUDevice: createRenderPipelineAsync() method

Experimental: This is an experimental technology
Check the Browser compatibility table carefully before using this in production.

The createRenderPipelineAsync() method of the GPUDevice interface returns a Promise that fulfills with a GPURenderPipeline once the pipeline can be used without any stalling. The resulting GPURenderPipeline controls the vertex and fragment shader stages and can be used in a GPURenderPassEncoder or GPURenderBundleEncoder.

Note: It is generally preferable to use this method over GPUDevice.createRenderPipeline() whenever possible, as it prevents blocking of GPU operation execution on pipeline compilation.
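As a minimal sketch of the difference (the `device` and `pipelineDescriptor` names are assumed to come from earlier WebGPU setup code, and the helper name is illustrative):

async function createPipelineWhenReady(device, descriptor) {
  // Synchronous alternative: device.createRenderPipeline(descriptor) returns
  // immediately, but a later queue submission that uses the pipeline may
  // stall while compilation finishes.
  //
  // Asynchronous variant: the promise fulfills only once the pipeline is
  // ready, so no GPU work is blocked on pipeline compilation.
  return device.createRenderPipelineAsync(descriptor);
}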

Syntax

createRenderPipelineAsync(descriptor)

Parameters

descriptor

See the descriptor definition for the GPUDevice.createRenderPipeline() method.

Return value

A Promise that fulfills with a GPURenderPipeline object instance when the created pipeline is ready to be used without additional delay.


If pipeline creation fails and the resulting pipeline becomes invalid, the returned promise rejects with a GPUPipelineError:

  • If this is due to an internal error, the GPUPipelineError will have a reason of "internal".
  • If this is due to a validation error, the GPUPipelineError will have a reason of "validation".
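A sketch of handling the rejection (the helper name is illustrative and not part of the API; a GPUPipelineError exposes reason and message as shown):

// Illustrative helper: logs why pipeline creation failed and returns null
// instead of letting the rejection propagate.
async function tryCreatePipeline(device, descriptor) {
  try {
    return await device.createRenderPipelineAsync(descriptor);
  } catch (e) {
    // e is a GPUPipelineError; e.reason is "validation" or "internal".
    console.error(`Pipeline creation failed (${e.reason}): ${e.message}`);
    return null;
  }
}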

A validation error can occur if any of the following are false:

  • For depthStencil objects:
    • format is a depth-or-stencil format.
    • If depthWriteEnabled is true or depthCompare is not "always", format has a depth component.
    • If stencilFront or stencilBack's properties are not at their default values, format has a stencil component.
  • For fragment objects:
    • targets.length is less than or equal to the GPUDevice's maxColorAttachments limit.
    • For each target, writeMask's numeric equivalent is less than 16.
    • If any of the used blend factor operations use the source alpha channel (for example "src-alpha-saturated"), the output has an alpha channel (that is, it must be a vec4).
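As a concrete illustration of the depthStencil rules above, here is a hypothetical depthStencil fragment (not a complete descriptor) that would fail validation: "stencil8" is a stencil-only format, so enabling depth writes or depth comparison against it is invalid.

// Hypothetical depthStencil object that would trigger a "validation"
// rejection: "stencil8" has no depth component, but depth use is enabled.
const invalidDepthStencil = {
  format: "stencil8",      // stencil-only format
  depthWriteEnabled: true, // requires a depth component; invalid here
  depthCompare: "less",    // anything other than "always" also requires depth
};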


Note: The WebGPU samples feature many more examples.

Basic example

The following example shows the construction of a valid render pipeline descriptor object, which is then used to create a GPURenderPipeline via a createRenderPipelineAsync() call.


async function init() {
  // ...

  const vertexBuffers = [
    {
      attributes: [
        {
          shaderLocation: 0, // position
          offset: 0,
          format: "float32x4",
        },
        {
          shaderLocation: 1, // color
          offset: 16,
          format: "float32x4",
        },
      ],
      arrayStride: 32,
      stepMode: "vertex",
    },
  ];

  const pipelineDescriptor = {
    vertex: {
      module: shaderModule,
      entryPoint: "vertex_main",
      buffers: vertexBuffers,
    },
    fragment: {
      module: shaderModule,
      entryPoint: "fragment_main",
      targets: [
        {
          format: navigator.gpu.getPreferredCanvasFormat(),
        },
      ],
    },
    primitive: {
      topology: "triangle-list",
    },
    layout: "auto",
  };

  const renderPipeline =
    await device.createRenderPipelineAsync(pipelineDescriptor);

  // ...
}


Browser compatibility

Desktop
  • Chrome 113 (currently supported on ChromeOS, macOS, and Windows only)
  • Edge 113 (currently supported on ChromeOS, macOS, and Windows only)
  • Firefox: preview (currently supported on Linux and Windows only)
  • Opera 99 (currently supported on ChromeOS, macOS, and Windows only)
  • Internet Explorer: no support
  • Safari: no support

Mobile
  • No support in WebView Android, Chrome Android, Firefox for Android, Opera Android, Safari on iOS, or Samsung Internet

© 2005–2023 MDN contributors.
Licensed under the Creative Commons Attribution-ShareAlike License v2.5 or later.