WebGL Game Development Tutorials

WebGL Pipeline - An In-depth Tutorial

Now that the WebGL 3D canvas is initialized, we could start calling WebGL functions right away. But it really helps to first understand the underlying process that takes 3D primitives from construction all the way to their rasterization on the screen via the frame buffer.

Frame buffers are memory locations allocated for rendering graphics on the screen. But before the final image even reaches the frame buffer, the 3D geometry supplied to the WebGL rendering engine undergoes a series of steps. Taken together, these steps are referred to as the WebGL pipeline. It's simply the order in which vertex and texture data is processed.

On this image: usually, 3D object vertex data is provided to WebGL together with a vertex index array. The index array contains only integer indices pointing to vertices, rather than duplicating the XYZ, RGB, UV and normal vector values that are stored once in the 3D model's vertex array.

Because in most 3D models many triangles connect at the same XYZ position, indexing a vertex that shares that location saves memory.

Think of a 3D cone, for example. Every side triangle of the cone shares a single vertex at the tip. You can refer to that vertex by a single index, even if the cone consists of thousands of polygons that would otherwise have to store that location individually.

By the way, you can see the shader source code for drawing your first triangle right here on this tutorial website. Always remember that to create even the simplest shapes, you must describe your model geometry using both vertex and index arrays. This is what you will pass to the gl.drawArrays or gl.drawElements functions later on (the latter drawing from the index array).
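To make the vertex/index split concrete, here is a minimal sketch of both arrays for a quad built from two triangles. The array contents and the commented-out draw call are illustrative; they assume buffers have already been created and bound elsewhere in your setup code.

```javascript
// A unit quad made of two triangles. Without indexing we would need
// 6 vertices (3 per triangle); with indexing, the two shared corners
// are stored only once and referenced twice.
const vertices = new Float32Array([
  -0.5, -0.5, 0.0,  // 0: bottom-left
   0.5, -0.5, 0.0,  // 1: bottom-right
   0.5,  0.5, 0.0,  // 2: top-right
  -0.5,  0.5, 0.0,  // 3: top-left
]);

// Each group of three indices describes one triangle.
const indices = new Uint16Array([
  0, 1, 2,  // first triangle
  0, 2, 3,  // second triangle (reuses vertices 0 and 2)
]);

// The draw call below assumes the arrays were already uploaded to
// buffers bound to gl.ARRAY_BUFFER and gl.ELEMENT_ARRAY_BUFFER:
// gl.drawElements(gl.TRIANGLES, indices.length, gl.UNSIGNED_SHORT, 0);
```

Note that 6 indexed vertices only cost 4 entries in the vertex array; on a large model the savings are far greater.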

Just like OpenGL, WebGL adopts a similar graphics pipeline. To pick up some principles, let's take a closer look at how it is visually represented by this simplified WebGL pipeline diagram:


I used a triangle here because it is the most basic primitive shape that demonstrates the whole WebGL pipeline visually. Of course in a real-world scenario, your model would consist of thousands of polygons.

A single vertex, for example, would only be rendered as a single dot on the screen.

In the first step we prepare our 3D vertex set to represent a triangle. This set doesn't have to be limited to triangle shapes, but it usually contains an X,Y,Z representation of at least one vertex with an optional RGB value (not shown in this example for simplicity).

This vertex data is usually passed on to something called a shader. There are different types of shaders; the two most common are vertex and fragment shaders. If you're coming from an OpenGL background, looking at this simplified pipeline diagram you will quickly notice that WebGL does not support geometry shaders. They are simply missing from the WebGL pipeline by design. But that's not so bad, because everything geometry shaders can do can be accomplished in some other way. Not a big loss.

The vertex coordinates are calculated in the vertex shader, and the color of each pixel is interpolated across the triangle surface in the fragment shader, based on the information received from the vertex shader.

The vertex shader is always executed first, and only then are the results passed on to the fragment shader. This order is important; the shaders are not executed simultaneously.

You can think of them as chained together to produce the final result on the HTML canvas: the rasterized image that is rendered to the frame buffer.
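The vertex-then-fragment chain can be seen in a minimal WebGL 1 (GLSL ES 1.00) shader pair, sketched here as JavaScript strings the way they are usually embedded in a page. The attribute and varying names (aPosition, aColor, vColor) are my own illustrative choices, not fixed API names.

```javascript
// Minimal shader pair: the varying vColor is written once per vertex
// in the vertex shader, then read (interpolated) once per fragment in
// the fragment shader -- the two stages are chained, never simultaneous.
const vertexShaderSource = `
  attribute vec3 aPosition;
  attribute vec3 aColor;
  varying vec3 vColor;
  void main() {
    vColor = aColor;                      // hand the color down the chain
    gl_Position = vec4(aPosition, 1.0);   // clip-space position output
  }
`;

const fragmentShaderSource = `
  precision mediump float;
  varying vec3 vColor;                    // arrives interpolated per fragment
  void main() {
    gl_FragColor = vec4(vColor, 1.0);     // final fragment color
  }
`;
```

Both sources would later be compiled with gl.createShader / gl.shaderSource / gl.compileShader and linked into one program.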

In WebGL the Browser Takes Care of Double Buffering

To create real-time animation, the frame buffer is "flipped" (copied) onto the canvas, much like traditional "notepad animation" where one sheet of paper quickly replaces the one lying underneath it. The image then instantly appears on the screen.

This is done to bridge the gap between memory writes and the screen's refresh rate. If you told the GPU driver to write directly to the visible video memory, you would see a noticeable "tearing" effect. But writing first to an off-screen buffer, and waiting until that process has completed, eliminates that side effect.

In animated computer graphics in general, this process is referred to as double buffering or off-screen buffering. In OpenGL it had to be done by the programmer, or with the help of a library already written by someone else. This is because desktop applications are responsible for creating their own window and manually controlling all of its aspects.

In desktop OpenGL applications you have to "flip" the buffer manually, issuing a command as the very last call after your frame has been completely rendered to the off-screen buffer. Even then, the operation takes time to complete, because the data still has to pass through fragment processing, which usually tends to be the slowest stage in the entire pipeline.

This is why desktop OpenGL programs call a SwapBuffers function (provided by the windowing system, e.g. WGL or GLX) which flips the two surfaces after waiting to ensure rendering to the surface has finished first.

However, WebGL hands this control over to the browser. Buffer swapping is not something you have to worry about when dealing with WebGL. It's done automatically by the browser.

Drawing Basic Primitives

What kind of geometry can we draw using the WebGL pipeline? Dots, lines and triangles, for the most part, are what you will be rendering on the screen. The actual pixel data is calculated in the shader process and finally sent to the screen.

WebGL lets us choose how the rendering mechanism should treat our vertex sets. They can either be rendered as filled in (textured) triangles, as lines connected between the vertices or as single vertices which would just display the 3D model as dots on the screen.
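How many primitives a given vertex stream produces depends on that choice of mode. The helper below is a sketch of the counting rule for the three basic modes, mirroring what gl.drawArrays(mode, 0, vertexCount) would rasterize; the function itself is illustrative, not part of the WebGL API.

```javascript
// Number of primitives a vertex stream yields under each basic
// WebGL draw mode.
function primitiveCount(mode, vertexCount) {
  switch (mode) {
    case "POINTS":    return vertexCount;                  // one dot per vertex
    case "LINES":     return Math.floor(vertexCount / 2);  // independent segments
    case "TRIANGLES": return Math.floor(vertexCount / 3);  // independent triangles
    default: throw new Error("unsupported mode: " + mode);
  }
}

console.log(primitiveCount("TRIANGLES", 6)); // two triangles, e.g. a quad
```

The same 6 vertices therefore give 6 dots, 3 line segments, or 2 triangles depending on the mode passed to the draw call.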

More on that a bit later in the tutorial. For now, just note that the vertex shader only understands 3D vertex and color coordinates and isn't concerned with actually drawing anything on the screen. The fragment shader takes care of the actual pixels (referred to as fragments) to be drawn on the screen.

Both OpenGL and WebGL refer to pixels as fragments because they are much more than regular 2D pixels: they are part of an entire process dealing with sets of data. The actual name, however, is a little mysterious.

The term fragment is thought to refer to a manageable part of the rendering process. In Tom Duff's "Compositing 3-D Rendered Images" (1985), the author uses "fragment" to refer to partial segments of a pixel. Karl Sims, in "Particle animation and rendering using data parallel computation" (1990), discusses "dicing" particles into "pixel-sized fragments" for parallel processing by "fragment processors". This is really all I was able to find out about fragments. Sometimes they are thought of, interchangeably, as the pixels tossed around by the shaders loaded into the GPU. The discussion of why pixels are called pixels is outside the scope of this WebGL tutorial :-)

We have just discussed the WebGL pipeline in its most basic form. In reality, though, it looks a lot closer to the diagram below.


Here the new additions are varyings and uniforms. These are two variable qualifiers specific to shader programming. You've already heard of ints and floats, supported by pretty much every language you can think of, but these new keywords are unique to GPU programming. They are provided by the GLSL language.

Varyings are variables declared in both the vertex and fragment shaders. They carry per-vertex values that determine different aspects of your shader's output: light intensity, or distance, for example. Unlike regular constants, their values change throughout the lifecycle of your shader program, which is usually written in the GLSL language.

In OpenGL and WebGL we're often dealing with vertex and other information packed into data buffers. These data sets allocate memory space for blocks of data that do not change, which is used for performance optimization.

In fact, UBOs (Uniform Buffer Objects, available in OpenGL and in WebGL 2) are memory buffers allocated for sending data to the GPU from the host application (your WebGL program).

The whole purpose of uniform buffers is that they can be shared across multiple shaders (for example, the vertex and fragment shaders) without having to be passed to the GPU multiple times, once for each shader. This is also why uniforms are read-only data sets.

In contrast, varyings are usually defined in the first shader (the vertex shader) and passed on to the next shader (the fragment shader) for processing, and their value can be, and often is, changed along the way.

To make use of uniforms, they must first be bound to the shader's input mechanism, which is accomplished using WebGL functions we'll take a look at when we get to the source code.
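As a preview, here is a small helper sketching that binding step: look up the uniform's location in a linked program, then upload a value to it. The gl context, the program object and the uniform names are assumed to come from your own setup code; only gl.getUniformLocation and gl.uniformMatrix4fv are real WebGL calls.

```javascript
// Bind one mat4 uniform by name and upload a matrix to it.
function setUniformMatrix4(gl, program, name, matrix) {
  const location = gl.getUniformLocation(program, name); // null if not found
  if (location === null) {
    throw new Error("uniform not found (or optimized out): " + name);
  }
  // 'false' = do not transpose; WebGL requires column-major data as-is.
  gl.uniformMatrix4fv(location, false, matrix);
}

// Usage, once per draw call, e.g.:
// setUniformMatrix4(gl, program, "Model", modelMatrix);
```

Note that a uniform the shader declares but never uses may be optimized out by the compiler, which is why the null check is worth keeping.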

Varyings do not need to be bound to anything. They are defined within the shaders themselves.

The vertex and fragment shaders both have two virtual places for input and output of the data. The vertices are literally passed into the vertex shader through an inlet and come out on the other end through an outlet into the inlet of fragment shader.

This is a fair mental picture of how the data flow between shaders should be thought of. For this reason, when writing WebGL shaders (we're coming to that shortly in the following sections of the book) you will often see the in and out keywords.

Let's take a look at an actual GLSL program describing a simple vertex shader. We'll go in much more detail later. Note that we're still missing the fragment shader code that vertex shaders are paired with but eventually we'll get there. This is just an example to show what GLSL looks like.

And Then There Were Attributes

So we have the varying and uniform keywords. But in our next shader example I'll throw in an additional type of variable called attribute. I just didn't want to overwhelm you too soon with yet another variable type.

An attribute is for use in the vertex shader only. It's just another type of variable.

An attribute is a variable that accompanies read-only per-vertex data, for example colors or texture coordinates.

In comparison, varying variables can be altered by the vertex shader, but not by the fragment shader. The idea is simply to pass information down the pipeline.

But let's take a look at these variables from another angle to get some perspective (no pun intended.)

What Exactly Is Varying?

You have to understand the principle of varying variables in order to gain a deeper understanding of WebGL shader programming.

When vertex data is sent to the shader, it is processed one vertex at a time. The shader's calculation, however, ends up applied to all pixels covered by the rendered primitive.

The data is interpolated across vertices, as in the diagram below: for example between vertices B-A and B-C. The pixel location is calculated internally on the GPU; we're not concerned with that. But the shader receives information about each such pixel, one at a time.


And that's what a varying is: a value interpolated across the primitive, arriving one fragment at a time. The GLSL shader algorithm you write will "close in" on that fragment, but the rendering will apply to the entire primitive.
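The interpolation itself can be sketched in plain JavaScript. This is only a conceptual illustration of the math the GPU performs in hardware: given a varying's value at each of a triangle's three vertices and barycentric weights describing where a fragment sits inside the triangle, the fragment receives a weighted sum. The function and value names are mine, not WebGL's.

```javascript
// Interpolate a scalar varying (say, a grayscale intensity) across a
// triangle. wA + wB + wC = 1 describes the fragment's position relative
// to vertices A, B and C.
function interpolateVarying(vA, vB, vC, wA, wB, wC) {
  return vA * wA + vB * wB + vC * wC;
}

// A fragment at the exact center of the triangle gets the average of
// the three vertex values:
const center = interpolateVarying(0.0, 0.6, 0.9, 1 / 3, 1 / 3, 1 / 3);
console.log(center); // ~0.5
```

A fragment lying exactly on vertex B (weights 0, 1, 0) would simply receive B's value unchanged; everywhere in between, the value blends smoothly.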

Uniforms, Attributes and Varying

Looks like we've come full circle. We've taken a look at different types of variables used in shaders. Let's draw a quick outline and wrap up our discussion by adding a little more depth:


Uniforms (per primitive): constant during the entire draw call. Like a const; they do not vary.

Attributes (per vertex): typically positions, colors, normals, UVs... May or may not change between draw calls.

Varyings (per pixel): vary from pixel to pixel; always changing during per-fragment operations in the shader.

A uniform can be a texture map, for example. It does not change during the entire draw call.

Because attribute variables contain vertex data, they are usually associated with an array. That's because vertices are defined by at least 3 coordinates, one per axis. But keep in mind that the most common 3D matrices pack into 4x4 space. To comply with that standard, each vertex is often represented by an array of 4 values, where the 4th coordinate is set to 1.0 or carries some other opportunistic value you want to pass to the shader.
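A tiny sketch of that expansion: a hypothetical helper that pads an XYZ position out to 4 components, with w defaulting to 1.0 so that 4x4 matrix multiplication (including translation) applies to it cleanly.

```javascript
// Expand an [x, y, z] vertex to [x, y, z, w] homogeneous form.
// For positions, w = 1.0 is the conventional choice.
function toVec4(xyz, w = 1.0) {
  return [xyz[0], xyz[1], xyz[2], w];
}

console.log(toVec4([0.5, -0.5, 0.0])); // [ 0.5, -0.5, 0, 1 ]
```

This is also what the shader-side expression vec4(position, 1.0) does, just on the GPU instead of in JavaScript.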

From OpenGL to WebGL

I can't stress enough how many programmers come to WebGL from OpenGL programming on desktop computers. In OpenGL we also use the GLSL language to write shaders. But there are differences between versions of GLSL, which can slightly differ in syntax or miss features. As an OpenGL programmer you may be familiar with the following shader format:

layout (location = 0) in vec3 position;
layout (location = 1) in vec3 rgb_in;

out vec3 rgb;

attribute vec3 VertexNormal; // Not actually used in this shader example
attribute vec2 TextureCoord; // Provided as example of "attribute" variable

uniform mat4 Model;
uniform mat4 View;
uniform mat4 Projection;

varying vec2 vTextCoord;
varying vec3 vLightPos;

void main() {
    gl_Position = Projection * View * Model * vec4(position, 1.0);
    rgb = rgb_in;
}

This is an example of a simple vertex shader written in GLSL (Shading Language) demonstrating what the theory covered in this chapter looks like in source code. Here you see that shaders have in and out keywords to support common data flow between shaders and your application.

Note, however, that in WebGL we're using a slightly different version of GLSL. Notice the areas highlighted in yellow, in other words layout (location = 0) and layout (location = 1), if you are reading this book on a Kindle device in black and white. If you've seen these instructions in OpenGL, chances are you won't see them in WebGL. I am only using them here for reference. The rest of the WebGL shader code remains largely the same, depending on which version of GLSL is being used. This may not always hold true as shader standards improve and continue to change. In this book, I am using only tested WebGL shaders that actually work at least in the Chrome browser, though I've also tested the source code in other browsers as well.

The location parameters tell us which slot the buffers were packed into before being sent to this shader. The type tells us about the size of the data: for example, vec3 stands for a vector of 3 floating-point values, which is enough to represent exactly one vertex coordinate set (x, y and z). These values are passed directly from your JavaScript program, as will be shown later when we get to the examples that draw basic primitives.

Also notice that we take in a variable called rgb_in, and its "out" counterpart is reassigned to rgb. You can assign your own names here. For clarity, I added "_in" for data that is coming into the shader and use just "rgb" for the data that is going out. The naming comes purely from personal choice, and I recommend using variable names that make the most sense to your own programming style.

Uniforms are like constant variables. In this case they are the Model, View and Projection matrices (mat4 represents a 4x4 matrix) passed into the shader from our program.

These are multi-dimensional data sets that carry information about your camera and the placement of the model's vertices in the world. They are shown in this basic example because they are the absolute minimum requirement for drawing anything in 3D space, even if it's just points/dots.
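As an illustration of what one of these mat4 uniforms actually contains, here is a sketch of a perspective projection matrix built in plain JavaScript, following the classic OpenGL gluPerspective convention. The function name and parameters are my own; in practice you would likely use a library such as glMatrix instead of hand-rolling this.

```javascript
// Build a column-major 4x4 perspective projection matrix, the kind
// of data the Projection uniform receives.
function perspective(fovyRadians, aspect, near, far) {
  const f = 1.0 / Math.tan(fovyRadians / 2); // focal length from field of view
  const nf = 1.0 / (near - far);
  // Column-major order, as expected by gl.uniformMatrix4fv:
  return new Float32Array([
    f / aspect, 0, 0,                    0,
    0,          f, 0,                    0,
    0,          0, (far + near) * nf,   -1,
    0,          0, 2 * far * near * nf,  0,
  ]);
}

const proj = perspective(Math.PI / 2, 1.0, 0.1, 100.0); // 90-degree view
// proj could now be uploaded via gl.uniformMatrix4fv(location, false, proj);
```

The View and Model matrices are built the same way (as 16-element Float32Arrays) from the camera and object transforms, then multiplied together in the vertex shader as shown above.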

Sometimes we also need to pass a light source position, which is not shown in this example, but the idea is the same. A light source is usually an X,Y,Z center of the light, plus an optional vector indicating the direction the light is pointing in (unless it's a global light source).

Within the main() function of the shader is where you write your shader logic. It's a lot like a C program with additional keywords (vec2, vec3, mat3, mat4, const, attribute, uniform, etc.). I stripped this example down to its basic form; the various GLSL versions (of which there are quite a few) vary in minor syntax details, and I skipped core version differences here. At this time we're not concerned with that, because I don't want to overcomplicate the book.

We've determined that vertices from our program are passed to the vertex shader, and from there they are passed on to the fragment shader. Together, the vertex and fragment shader pair creates a representation of your rendered primitive in 3D space.

We'll continue our discussion of shaders, learn to write our own, and even load them from a web address on a local web hosting server (or localhost) throughout the WebGL tutorials on this site.

WebGL Book: A WebGL Tutorial Reference Book

If the tutorials on this site are not enough, or you simply like reading from a physical book or a digital device (Kindle, iPad, tablets, etc.), check out the WebGL Book, written by the author of the tutorials on this site.

This book is a WebGL tutorial and a reference that guides the reader through the process of setting up and initializing WebGL, drawing 3D primitives and creating 3D computer games.

© 2016-2019 Copyright WebGL Tutorials (webgltutorials.org)
