Shaderpunk Editor GUI

June 13, 2023

Shaderpunk—Real-Time Video Shaders in the Browser

Try it out here: shaderpunk.pillar.land, or take a peek at the source code

Over the past few years, I've been interested in how shader programming can be used outside of the traditional demo scene or 3D graphics pipelines. While tools like ShaderToy have popularized fragment shaders for visual experiments, there hasn't been a straightforward way to apply those same effects directly to video without opening up a heavy video editor or running an offline render.

That's why I built Shaderpunk: a browser-based tool that lets you quickly add GLSL visual effects to your videos, export the result, or simply use it as a minimalist shader playground. Everything runs locally in your browser, no installs required. This is super useful when you need to apply an effect to a short clip, or when you want to experiment with a shader without all the usual boilerplate.

Overview

I designed Shaderpunk around my own workflow: quickly upload a video, write or tweak a GLSL shader, and watch your effect render in real time. Updates happen live, and if something breaks, a built-in console shows shader errors as you go.

What makes Shaderpunk different from a traditional video editor is its lightweight, intuitive UI. Far too many shader-based tools require extensive setup, which becomes a growing pain if you want to test a lot of shaders or export many videos.

Here are a few examples of what Shaderpunk can do:

Chromatic Aberration Effect

"Night Vision" Effect

How It Works

Under the hood, Shaderpunk combines modern browser APIs with GLSL:

  • Solid.js renders the UI; it's a modern library with a focus on speed and simplicity.
  • WebGL2 drives the fragment shaders and processes video frames in real time.
  • WebCodecs handles video decoding and recording, ensuring everything stays local and efficient.

The WebCodecs API is great for encoding video, but it's only supported in recent releases of Chromium-based browsers and Firefox. As of this writing, Safari doesn't support it in a stable release, though an implementation is available in a Technology Preview. You can verify its support via caniuse.com.
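Because support varies, it's worth feature-detecting WebCodecs at runtime before enabling export. A minimal check (a sketch, not Shaderpunk's actual code) might look like:

```typescript
// Detect WebCodecs support by checking for the encoder/decoder globals.
// (An assumed approach; Shaderpunk's actual detection may differ.)
export function supportsWebCodecs(): boolean {
  return (
    typeof (globalThis as any).VideoEncoder === "function" &&
    typeof (globalThis as any).VideoDecoder === "function"
  );
}
```

In the editor, a check like this can gate the export button and show a "browser not supported" hint instead of failing mid-recording.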

At the core of Shaderpunk is a loop that takes each frame of the video, uploads it as a texture to the GPU, and draws it through your shader program. Here's the function that does that:

```typescript
export function drawVideoFrame(
  gl: WebGL2RenderingContext,
  program: WebGLProgram,
  video: HTMLVideoElement,
): void {
  const level = 0;
  const internalFormat = gl.RGBA;
  const srcFormat = gl.RGBA;
  const srcType = gl.UNSIGNED_BYTE;

  // shader uniforms
  updateRuntimeUniform(gl, program, [
    {
      name: "uTime",
      value: [video.currentTime],
    },
    {
      name: "uResolution",
      value: [video.videoWidth, video.videoHeight],
    },
  ]);

  // upload current video frame as a texture
  gl.texImage2D(
    gl.TEXTURE_2D,
    level,
    internalFormat,
    srcFormat,
    srcType,
    video,
  );

  // draw a rectangle covering the viewport
  gl.drawArrays(gl.TRIANGLE_STRIP, 0, 4);
}
```

Instead of a timeline-based scene, Shaderpunk treats each frame as a rectangle made from two triangles that fully cover the viewport. The fragment shader is then applied to each pixel in the rectangle, and the result is drawn directly to the canvas. Here is the code that creates the rectangle:

```typescript
export async function createVertexBuffer(
  gl: WebGL2RenderingContext,
  program: WebGLProgram,
  name: string,
): Promise<{ error: string | null }> {
  // Two triangles combined into a strip to create a rectangle
  const vertexPositions = new Float32Array([
    -1.0, 1.0,
    1.0, 1.0,
    -1.0, -1.0,
    1.0, -1.0,
  ]);

  const vertexPositionsLoc = gl.getAttribLocation(program, name);
  if (vertexPositionsLoc < 0) {
    return { error: `Couldn't find attribute [${name}] location.` };
  }

  const vertexPositionBuffer = gl.createBuffer();
  gl.bindBuffer(gl.ARRAY_BUFFER, vertexPositionBuffer);
  gl.bufferData(gl.ARRAY_BUFFER, vertexPositions, gl.STATIC_DRAW);

  gl.vertexAttribPointer(
    vertexPositionsLoc,
    2, // (x, y)
    gl.FLOAT, // floating point numbers
    false, // normalize
    0, // stride
    0, // offset
  );
  gl.enableVertexAttribArray(vertexPositionsLoc);

  return { error: null };
}
```

This is a common technique in graphics and game development for applying post-processing effects like chromatic aberration or vignettes to a video buffer. It lets the GPU efficiently apply the fragment shader in real time, without needing to process the entire video beforehand.
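As an example of this kind of post-processing, a vignette comes down to a few lines of fragment shader. This sketch assumes the same `uFrame`/`aTextureCoord`/`oColor` interface used by Shaderpunk's shaders:

```glsl
#version 300 es
precision highp float;

in highp vec2 aTextureCoord;
uniform sampler2D uFrame;
out highp vec4 oColor;

void main() {
  highp vec4 texelColor = texture(uFrame, aTextureCoord);

  // darken pixels based on their distance from the center of the frame
  highp float dist = distance(aTextureCoord, vec2(0.5));
  highp float vignette = 1.0 - smoothstep(0.3, 0.8, dist);

  oColor = vec4(texelColor.rgb * vignette, texelColor.a);
}
```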

"Night Vision" Effect

Of course, before we can render the frame, the vertex and fragment shaders need to be compiled and linked into the WebGL program. Shaderpunk handles this through a setup function that ensures errors are caught before drawing the video frame:

```typescript
export async function setupProgram(
  gl: WebGL2RenderingContext,
  vertexShaderSource: string,
  fragmentShaderSource: string,
): Promise<{ program: WebGLProgram | null; error: string | null }> {
  // VERTEX SHADER
  const { shader: vertexShader, error: vertexShaderError } = await createShader(
    gl,
    gl.VERTEX_SHADER,
    vertexShaderSource,
  );
  if (vertexShaderError) {
    return { program: null, error: vertexShaderError };
  }
  logger.info("Compiled vertex shader.");

  // FRAGMENT SHADER
  const { shader: fragmentShader, error: fragmentShaderError } =
    await createShader(gl, gl.FRAGMENT_SHADER, fragmentShaderSource);
  if (fragmentShaderError) {
    return { program: null, error: fragmentShaderError };
  }
  logger.info("Compiled fragment shader.");

  // PROGRAM
  const { program, error: programError } = await createProgram(
    gl,
    vertexShader,
    fragmentShader,
  );
  if (programError) {
    return { program: null, error: programError };
  }
  logger.info("Created program.");

  return { program, error: null };
}
```

Shader Example

Shaderpunk supports GLSL fragment shaders, so if you've written shaders before, you'll feel right at home. A very simple example is a color inversion shader, which flips the colors of your video:

```glsl
#version 300 es
precision highp float;

// in from pipeline
in highp vec2 aTextureCoord; // <x, y> (normalized by default)

// uniforms
uniform sampler2D uFrame; // current video frame
uniform float uTime; // current time in seconds
uniform vec2 uResolution; // <width, height>

// out
out highp vec4 oColor;

void main() {
  highp vec4 texelColor = texture(uFrame, aTextureCoord);

  // invert colors
  texelColor = vec4(1.0 - texelColor.rgb, texelColor.a);

  oColor = texelColor;
}
```