Rendering SVG Paths in WebGL

The following is a guest post by Matt DesLauriers. Matt combined a bunch of techniques and open source modules to build an incredible demo where simple recognizable icons explode into vector triangles, only to reform into a completely new icon. Here he talks about some of the tools and ideas involved in making it happen, and beyond that, other approaches and related concepts. This is some pretty advanced stuff, and I hope it turns on some mental lightbulbs for people interested in this kind of mathematical, algorithmic web animation. Take it away, Matt.

SVG is a great way to deliver resolution-independent icons, fonts, logos, and various other imagery. The format is widely supported, very flexible, and highly compact. For example, the new Google logo can be represented in as little as 146 bytes with SVG and a few tricks.

At the heart of the format is the <path> element, which provides a succinct means of describing a complex set of path operations, like a glyph in a font. Simply put, it describes a series of moveTo, lineTo, curveTo, and similar commands. For example, here's the Google "G" in SVG:

<path d="M173 102a51 51 0 1 1-13-30m20 37h-53" stroke="#4a87ee"/>

See the Pen QjMrXV by Matt DesLauriers (@mattdesl) on CodePen.

However, in WebGL, rendering SVG paths is more challenging. The WebGL API, and by extension ThreeJS, is primarily built for rendering many triangles. It's left up to the developer to implement tasks like complex shape rendering and text layout.

Here I'll explore some of the approaches in handling the SVG <path> element in WebGL, and briefly discuss the tools involved in my latest demo for the svg-mesh-3d module. If you have a WebGL enabled browser, you can see the demo here.


The code samples here will use ThreeJS to wrap the WebGL API, and browserify to bring together small npm modules of code. For more details on this approach, check out the Modules for Frontend JavaScript guide.

Development Tools

During the development of the ThreeJS demo, I used budo as the dev server, and babelify to transpile ES6. This leads to an easy and fast development cycle, even with large bundles. brfs is used to inline shader files and our list of static SVG files to render. Instead of using Gulp or Grunt, the entire development and production process is composed of two npm run tasks in a package.json file:

"scripts": {
  "start": "budo demo/:bundle.js --live -- -t babelify -t brfs | garnish",
  "build": "browserify demo/index.js -t babelify -t brfs | uglifyjs -cm > bundle.js"
}

The gh-pages deploy is then automated with a single shell script.

For more details on npm run, see "How to Use npm as a Build Tool" by Keith Cirkel.

Approximation & Triangulation

For my svg-mesh-3d demo, I only needed to render a simple silhouette of the <path> data, with a solid color. Entypo icons are a good fit for this, and triangulation works well for these kinds of SVGs.

The approach I took is to approximate the curves in a SVG path, triangulate its contours, and send the triangles to WebGL as a static geometry. This is an expensive step. It's not something you would do every frame, but after the triangles are sitting on the GPU, you can use a vertex shader to animate them freely.
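Curve approximation itself can be sketched in a few lines. Below is a minimal example (the flattenCubic helper is illustrative, not part of svg-mesh-3d) that samples a cubic Bézier at uniform intervals; real modules use adaptive subdivision for better quality, but the idea is the same:

```javascript
// Flatten a cubic Bézier into a polyline by uniform sampling.
// p0..p3 are [x, y] control points; returns steps + 1 points.
function flattenCubic (p0, p1, p2, p3, steps) {
  var points = [];
  for (var i = 0; i <= steps; i++) {
    var t = i / steps;
    var u = 1 - t;
    // cubic Bézier basis: u³p0 + 3u²t·p1 + 3ut²·p2 + t³p3
    points.push([
      u * u * u * p0[0] + 3 * u * u * t * p1[0] + 3 * u * t * t * p2[0] + t * t * t * p3[0],
      u * u * u * p0[1] + 3 * u * u * t * p1[1] + 3 * u * t * t * p2[1] + t * t * t * p3[1]
    ]);
  }
  return points;
}

// a single "C" (curveTo) command from a path, starting at [0, 0]
var contour = flattenCubic([0, 0], [0, 1], [1, 1], [1, 0], 16);
```

Each curve in the path contributes a run of line segments like this; the resulting closed contours are what get fed to the triangulator.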

Most of this work has already been done through various modules on npm. The "glue" that brings them together is under 200 lines of code.

The final ThreeJS demo uses over 70 modules throughout its bundle, but heavily leans on a handful of small, single-purpose modules under the hood.

In the end, the user experience becomes as simple as requiring the svg-mesh-3d module and manipulating the data it returns. The module is not specific to ThreeJS, and could be used with any render engine, such as Babylon.js, stackgl, Pixi.js, and even vanilla Canvas2D. Here is how it might be used in ThreeJS:

// our utility functions
var createGeometry = require('three-simplicial-complex')(THREE);
var svgMesh3d = require('svg-mesh-3d');

// our SVG <path> data
var svgPath = 'M305.214,374.779c2.463,0,3.45,0.493...';

// triangulate to generic mesh data
var meshData = svgMesh3d(svgPath);

// convert the mesh data to THREE.Geometry
var geometry = createGeometry(meshData);

// wrap it in a mesh and material
var material = new THREE.MeshBasicMaterial({
  side: THREE.DoubleSide,
  wireframe: true
});

var mesh = new THREE.Mesh(geometry, material);

// add to scene
scene.add(mesh);

Vertex Animation

When working with WebGL, it's best to avoid uploading data to the GPU too often. This means we should try to build static geometry once, and then animate the geometry over many frames through vertex and fragment shaders.

To make our mesh "explode" into tiny pieces, we can change a uniform in a shader, which acts a bit like a global variable in GLSL. The vertex shader will be run on each vertex in our 3D geometry, allowing us to explode outward from the world origin at [ x=0, y=0, z=0 ].

To do this, we need a custom vertex attribute in our shader. For each triangle, its three vertices will use the same Vector3 to describe a direction. This direction is a random point on a sphere.
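Picking a uniformly distributed random point on the unit sphere can be done with the classic uniform-z method; the helper below is a sketch, not code from the demo:

```javascript
// Generate one random unit vector, uniformly distributed on the
// sphere: pick z uniformly in [-1, 1] and an azimuth angle, then
// solve for the radius of the circle at that z.
function randomSphereDirection () {
  var theta = Math.random() * 2 * Math.PI; // azimuth
  var z = Math.random() * 2 - 1;           // uniform height
  var r = Math.sqrt(1 - z * z);
  return [r * Math.cos(theta), r * Math.sin(theta), z];
}

var dir = randomSphereDirection();
var length = Math.sqrt(dir[0] * dir[0] + dir[1] * dir[1] + dir[2] * dir[2]);
```

Each triangle gets one such vector, repeated for all three of its vertices, so the triangle flies out as a rigid piece rather than stretching.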

The bare-bones vertex shader below just transforms each vertex by its direction, scaled by the animation factor. The position is in model-space, in the range -1.0 to 1.0.

attribute vec3 direction;
uniform float animation;

void main() {
  // transform model-space position by explosion amount
  vec3 tPos = position + direction * animation;

  // final position
  gl_Position = projectionMatrix * modelViewMatrix * vec4(tPos, 1.0);
}
When the uniform is at 0.0, the triangles will be at their initial location, and the logo will be a solid fill:

When the uniform is set to 1.0, the triangles will be pushed out along the direction vector, away from world center, and form an "explosion" of sorts:

This looks good, but there are two more features that add some polish. The first is to scale each triangle toward its centroid. This is used to animate the triangles in and out independently of the explosion. For this, we need to submit another custom vertex attribute, like we did with direction.
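In plain JavaScript, the centroid scaling looks roughly like this (a sketch of the math only; the demo does this in GLSL, with the centroid passed in as a per-vertex attribute):

```javascript
// Scale a triangle toward its centroid by a factor:
// 0 collapses it to a point, 1 leaves it at full size.
function scaleToCentroid (triangle, scale) {
  var cx = (triangle[0][0] + triangle[1][0] + triangle[2][0]) / 3;
  var cy = (triangle[0][1] + triangle[1][1] + triangle[2][1]) / 3;
  return triangle.map(function (p) {
    return [cx + (p[0] - cx) * scale, cy + (p[1] - cy) * scale];
  });
}

// scale of 0 collapses every vertex onto the centroid (1, 1)
var collapsed = scaleToCentroid([[0, 0], [3, 0], [0, 3]], 0);
```

Animating that scale from 0 to 1 per triangle is what lets the pieces "grow in" and "shrink out" independently of the explosion.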

The second is to add a bit of spin and chaos to the explosion by transforming the vector by a rotation matrix. The angle for the rotation matrix is determined by the animation factor, as well as the sign() of the triangle centroid's x position. This means that some triangles will rotate in the opposite direction.
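Sketched in JavaScript rather than GLSL, the spin amounts to a rotation whose angle flips with the sign of the centroid's x position (function and parameter names here are illustrative, not from the demo):

```javascript
// Rotate a vertex around the Z axis; the direction of spin flips
// with the sign of the triangle centroid's x position, so triangles
// on opposite sides of the mesh rotate in opposite directions.
function spin (position, centroidX, animation, maxAngle) {
  var angle = animation * maxAngle * Math.sign(centroidX);
  var c = Math.cos(angle);
  var s = Math.sin(angle);
  // standard 2D rotation matrix applied to x/y; z is unchanged
  return [
    position[0] * c - position[1] * s,
    position[0] * s + position[1] * c,
    position[2]
  ];
}

// a quarter-turn at full animation strength
var rotated = spin([1, 0, 0], 1, 1, Math.PI / 2);
```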

With the final vertex shader in place, our application can now animate thousands of triangles at a silky-smooth 60 FPS. Below is a screenshot with wireframe rendering and the randomization parameter set to 1500.

Other Approaches

I chose cdt2d since it is numerically robust, can handle arbitrary input with holes, and is engine-agnostic. Other triangulators, such as earcut, tess2, and ThreeJS's own triangulator would have also been good options.

Aside from triangulation, it is worth mentioning some other approaches to SVG <path> rendering in WebGL. Curve approximation and triangulation have their downsides: building the geometry is slow, doing it correctly is difficult, it increases the size of the final code bundle, and it shows "stepping" or jagged edges when you zoom in.


Rasterization

A simple and effective approach is to rasterize the SVG data to an HTMLImageElement, and then upload that to a WebGL texture. This not only supports the <path> element, but the full range of SVG features covered by the browser, such as filters, patterns, text, and even HTML/CSS in Firefox and Chrome.

For example, see svg-to-image for how this can be done with Blob and URL.createObjectURL. In ThreeJS and browserify, the code might look like this:

// our rasterizing function
var svgToImage = require('svg-to-image');

// create a box with a dummy texture
var texture = new THREE.Texture();
var geo = new THREE.BoxGeometry(1, 1, 1);
var mat = new THREE.MeshBasicMaterial({
  map: texture
});
// add it to the scene
scene.add(new THREE.Mesh(geo, mat));

// convert SVG data into an Image
svgToImage(getSvgData(), {
  crossOrigin: 'Anonymous'
}, function (err, image) {
  if (err) {
    // there was a problem rendering the SVG data
    throw new Error('could not rasterize SVG: ' + err.message);
  }

  // no error; update the WebGL texture
  texture.image = image;
  texture.needsUpdate = true;
});

function getSvgData () {
  // make sure that "width" and "height" are specified!
  return '<svg xmlns="http://www.w3.org/2000/svg" viewBox="0 0 100 100" width="1024px" height="1024px">.....</svg>';
}

However, since we are rasterizing the graphic into pixels, it is no longer scalable and will produce aliasing when we zoom in. Often, you will need large images to produce smooth edges, which will put a strain on texture memory.
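To put rough numbers on that strain: an uncompressed RGBA texture costs width × height × 4 bytes, so even a modest rasterization adds up quickly (and mipmaps add roughly another third on top):

```javascript
// Approximate GPU memory cost of an uncompressed RGBA8 texture,
// before mipmaps: 4 bytes (one per channel) per pixel.
function textureBytes (width, height) {
  return width * height * 4;
}

// a 1024×1024 rasterized SVG costs 4 MB of texture memory
var megabytes = textureBytes(1024, 1024) / (1024 * 1024);
```

A 4096×4096 texture, which you might need for crisp edges on a high-DPI display, already costs 64 MB by the same arithmetic.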

This does not produce triangles, so to mimic the "explosion" we could use simple Delaunay triangulation, which is much faster and easier than constrained Delaunay triangulation.

Stencil Buffer

An old trick is to use the Stencil Buffer to render complex polygons with holes. This is much faster than the triangulation step involved in my demo, but has a major drawback: it lacks MSAA (anti-aliasing) in most browsers. Some engines might use this in conjunction with other techniques, such as FXAA (anti-aliasing in a post-process), or adding an outline to the shape using prefiltered lines.
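To see why the trick works, here is the same even-odd parity rule applied in software as a classic point-in-polygon test (a sketch; the GPU version instead draws a triangle fan from an arbitrary point and toggles a stencil bit per covered fragment, so pixels covered an odd number of times end up inside):

```javascript
// Even-odd rule: cast a ray from the point and count edge
// crossings; an odd count means the point is inside the polygon.
function insideEvenOdd (point, polygon) {
  var inside = false;
  for (var i = 0, j = polygon.length - 1; i < polygon.length; j = i++) {
    var a = polygon[i];
    var b = polygon[j];
    // does edge a→b straddle the point's y, crossing to its right?
    var crosses = (a[1] > point[1]) !== (b[1] > point[1]) &&
      point[0] < (b[0] - a[0]) * (point[1] - a[1]) / (b[1] - a[1]) + a[0];
    if (crosses) inside = !inside;
  }
  return inside;
}

var square = [[0, 0], [10, 0], [10, 10], [0, 10]];
```

The same parity logic handles holes for free, which is why the stencil approach copes with arbitrarily complex paths without any triangulation.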

Pixi.js uses this for complex shape rendering, see a demo here.

A complex vector illustration rendered with the stencil buffer

You can read more about it here:

Loop-Blinn Curve Rendering

A more advanced approach for hardware path rendering is described in Resolution Independent Curve Rendering by Loop and Blinn. For a more approachable introduction, check out "Curvy Blues" by Michael Dominic.

This technique produces the best curve and path rendering. It is fast, infinitely scalable, and can leverage the GPU for anti-aliasing and special effects. However, it is much more complex to implement, and there are many edge cases that can appear in arbitrary SVG paths, such as overlapping or self-intersecting curves. Furthermore, it is patented, so it is not a great choice for an open source project.
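For a taste of the core idea, the quadratic case of the paper boils down to an implicit function evaluated per fragment: assign (u, v) coordinates (0, 0), (0.5, 0), and (1, 1) to the curve's three control points, and a fragment lies on the filled side of the curve where u² − v is negative (the sign convention depends on curve orientation). A software sketch of that test, normally a one-liner in a fragment shader:

```javascript
// Loop-Blinn implicit test for a quadratic Bézier: the curve is
// the zero set of f(u, v) = u² − v in the interpolated (u, v)
// coordinates; one side of it is "inside".
function insideQuadratic (u, v) {
  return u * u - v < 0;
}

// midpoint of the curve triangle interior vs. a point near the hull
var onFilledSide = insideQuadratic(0.5, 0.5);
var onEmptySide = insideQuadratic(0.9, 0.1);
```

Because (u, v) is interpolated by the rasterizer and the test runs per fragment, the curve stays perfectly smooth at any zoom level.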

Further Reading

This post only scratches the surface of scalable vector graphics in WebGL. The SVG format goes far beyond filled paths, and includes features like gradients, strokes, text, and filters. Each of these introduces new complexities in the scope of WebGL, and rendering the format entirely on the GPU would be a tremendous mountain of work.

For more reading on some of these topics, check out the following: