Finally, it’s time to get stuck into WebGL properly. The concepts here are very similar to OpenGL. I’m not going to rigorously explain OpenGL, but I’ll highlight anything interesting that falls out of using WebGL specifically.
I’m just going to cover some very basic 2D rendering, with a view to drawing sprites on screen as I would in a game.
First Triangle
The first port of call is just getting a triangle on the screen. Since WebGL 2 is based on OpenGL ES 3.0, the initial steps are loading the shader I’ll use to draw the triangle, and writing the vertex data to a buffer for use during drawing.
I touched on this briefly in the post about window management, but getting the WebGL context is fairly straightforward:
<canvas id="main-canvas" width=600 height=400></canvas>
let gl;
function load() {
    let canvas = document.getElementById('main-canvas');
    gl = canvas.getContext('webgl2');
    // ...
}
There are two versions of the webgl context, webgl and webgl2. WebGL 2 is more fully featured and widely supported, so I see no particular reason to worry too much about the older one. The one thing to note is that on MDN the WebGL2RenderingContext docs show the additional methods available on top of the WebGL 1 API, so it’s worth keeping the docs for WebGLRenderingContext open as well.
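If supporting browsers without WebGL 2 mattered, the fallback is just a second getContext call. A minimal sketch of that kind of defensive handling (my own addition, not something the snippet above does):
let canvas = document.getElementById('main-canvas');
gl = canvas.getContext('webgl2');
if (!gl) {
    // WebGL 2 unavailable; fall back to the older, smaller API.
    gl = canvas.getContext('webgl');
}
if (!gl) {
    throw new Error('WebGL is not supported in this browser');
}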
With the context created I can load in the shader:
<script id="vertex-shader" type="x-shader/x-vertex">#version 300 es
layout(location = 0) in vec2 position;
void main() {
gl_Position = vec4(position, 0.0, 1.0);
}
</script>
<script id="fragment-shader" type="x-shader/x-fragment">#version 300 es
out highp vec4 outputColor;
void main() {
outputColor = vec4(0.0, 1.0, 0.0, 1.0);
}
</script>
let glProgram;
function compileShader(srcElementName, type) {
    let shaderSrc = document.getElementById(srcElementName).text;
    let shader = gl.createShader(type);
    gl.shaderSource(shader, shaderSrc);
    gl.compileShader(shader);
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
        throw gl.getShaderInfoLog(shader);
    }
    return shader;
}
function load() {
    // ...
    let vertexShader = compileShader('vertex-shader', gl.VERTEX_SHADER);
    let fragmentShader = compileShader('fragment-shader', gl.FRAGMENT_SHADER);
    glProgram = gl.createProgram();
    gl.attachShader(glProgram, vertexShader);
    gl.attachShader(glProgram, fragmentShader);
    gl.linkProgram(glProgram);
    // Mirror the compile check: surface link errors rather than failing silently.
    if (!gl.getProgramParameter(glProgram, gl.LINK_STATUS)) {
        throw gl.getProgramInfoLog(glProgram);
    }
    // ...
}
The shader version is 300 es. The spec for this version of GLSL can be found here.
I embed the shaders in script tags, and grab the contents of the elements when compiling. I find this cleaner than having a multiline string embedded in the javascript. In a larger game I would fetch these assets. Also note that the version declaration is on the same line as the opening script tag: it needs to be on the first line of the shader source, and if it were placed on the line below, the first line would be empty and the compiler would reject the #version directive.
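As a sketch of what fetching shader sources might look like, compileShader could be adapted to pull the source from a URL instead of a script tag. fetchShader and the shader path here are hypothetical names:
async function fetchShader(url, type) {
    // The fetched file must still have #version 300 es as its first line.
    let response = await fetch(url);
    let shaderSrc = await response.text();
    let shader = gl.createShader(type);
    gl.shaderSource(shader, shaderSrc);
    gl.compileShader(shader);
    if (!gl.getShaderParameter(shader, gl.COMPILE_STATUS)) {
        throw gl.getShaderInfoLog(shader);
    }
    return shader;
}
// e.g. let vertexShader = await fetchShader('sprite.vert', gl.VERTEX_SHADER);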
With the shader compiled, I place the vertex data into a VBO:
let glVertBuffer;
const triangleVerts = [
    0.0, 0.5, // Top
    0.5, -0.5, // Bottom Right
    -0.5, -0.5 // Bottom Left
];
function load() {
    // ...
    glVertBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, glVertBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(triangleVerts), gl.STATIC_DRAW);
    gl.bindBuffer(gl.ARRAY_BUFFER, null);
}
The main thing of note here is how vertex data is defined and fed into the buffer. The triangleVerts value is a javascript array of javascript numbers, which are defined as 64-bit floats. In order to ingest the data I need to convert it to a flat array of 32-bit floats. Float32Array is one of the various typed arrays available, and provides a buffer analogous to a Rust Vec<f32>. When given an iterable as a constructor argument, the array is initialized from the contents of the iterable: it has the same length, and each value is converted from the corresponding element.
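A quick illustration of the conversion:
// Four javascript numbers become a flat buffer of four 32-bit floats.
let floats = new Float32Array([0.0, 0.5, 0.5, -0.5]);
console.log(floats.length); // 4 elements
console.log(floats.byteLength); // 16 bytes - 4 bytes per float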
And finally I draw the triangle:
function draw() {
    gl.clearColor(1.0, 0.0, 1.0, 1.0);
    gl.clear(gl.COLOR_BUFFER_BIT);
    gl.useProgram(glProgram);
    gl.bindBuffer(gl.ARRAY_BUFFER, glVertBuffer);
    gl.enableVertexAttribArray(0);
    gl.vertexAttribPointer(0, 2, gl.FLOAT, false, 0, 0);
    // The count is the number of vertices, and each vertex is two floats.
    gl.drawArrays(gl.TRIANGLES, 0, triangleVerts.length / 2);
}
There’s not much here particular to WebGL. What’s somewhat of interest is what’s not here: there’s no explicit “present” or “swap buffer” call to display the result, as the browser composites the canvas once the frame callback returns. There is a finish call which will block until all commands have been executed, but it’s not needed.
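For completeness, this is roughly the frame loop I’d drive draw with:
function frame() {
    draw();
    // Schedule the next frame; the browser presents the canvas
    // automatically after each callback returns.
    requestAnimationFrame(frame);
}
requestAnimationFrame(frame);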
Textured Rectangle
Next up I load an image and draw a textured rectangle. The main addition is the loading of the image into a texture. There are two main ways I saw to get the image loaded: using the Image element, and using fetch with ImageBitmap.
Image Tag
First up is using Image to download an image. This seems to be the way most guides I’ve found download images for use as texture data.
let texture;
function loadTexture() {
    let image = new Image();
    image.src = 'sample-sprite.png';
    image.addEventListener('load', () => {
        texture = gl.createTexture();
        gl.bindTexture(gl.TEXTURE_2D, texture);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MAG_FILTER, gl.LINEAR);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_S, gl.CLAMP_TO_EDGE);
        gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_WRAP_T, gl.CLAMP_TO_EDGE);
        gl.pixelStorei(gl.UNPACK_FLIP_Y_WEBGL, true);
        gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
        gl.bindTexture(gl.TEXTURE_2D, null);
        // ... Post texture load action ...
    });
}
Downloading an image with the Image element is quite straightforward. In the load callback I create the WebGL texture, set various parameters on it, and load the image data into the texture.
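One wrinkle is that the load event makes the control flow callback-based. If I wanted this to compose with async code, the Image could be wrapped in a promise. loadImage here is a hypothetical helper of my own, not something from the guides mentioned above:
function loadImage(url) {
    return new Promise((resolve, reject) => {
        let image = new Image();
        image.addEventListener('load', () => resolve(image));
        image.addEventListener('error', reject);
        image.src = url;
    });
}
// e.g. let image = await loadImage('sample-sprite.png');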
WebGL assumes that texImage2D is passed the image data bottom row first, with the 0,0 point being the bottom left of the image, but the data in an Image is stored top row first. The pixelStorei call instructs the API to flip the image as it is ingested so that it matches this expectation.
The WebGL2 spec lists the valid combinations of internalFormat and format arguments.
Fetch
Much like with audio, I can also download the image file with fetch, decode the file, and use it directly:
async function loadTexture() {
    let response = await fetch('sample-sprite.png');
    let blob = await response.blob();
    let image = await createImageBitmap(blob, { imageOrientation: 'flipY' });
    texture = gl.createTexture();
    gl.bindTexture(gl.TEXTURE_2D, texture);
    // ... The exact same texture parameters as before ...
    gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, gl.RGBA, gl.UNSIGNED_BYTE, image);
    gl.bindTexture(gl.TEXTURE_2D, null);
    // ... Post texture load action ...
}
The UNPACK_FLIP_Y_WEBGL flag doesn’t work with ImageBitmap. So instead of having WebGL flip the image during texImage2D, I flip it by passing the imageOrientation: 'flipY' option to the image decoder.
But broadly, this approach works similarly to using Image. The image decoding function, createImageBitmap, would also be handy if I had an image file embedded in some larger buffer. If I end up bundling assets together into single resources, this is how I’d decode the image data.
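A sketch of that bundle case, assuming the byte offset and length of the PNG within the bundle’s ArrayBuffer are already known (all names here are hypothetical):
async function decodeEmbeddedImage(bundle, offset, length) {
    // Wrap the relevant slice of the bundle in a Blob so it can be decoded.
    let blob = new Blob([bundle.slice(offset, offset + length)], { type: 'image/png' });
    return await createImageBitmap(blob, { imageOrientation: 'flipY' });
}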
Using the Texture
To use the texture I need to update some of the rendering code to include UV coordinates. Almost none of this is particular to WebGL so I’m going to skim over it quickly:
<script id="vertex-shader" type="x-shader/x-vertex">#version 300 es
layout(location = 0) in vec2 inPosition;
layout(location = 1) in vec2 inTexCoord;
out vec2 vTexCoord;
void main() {
gl_Position = vec4(inPosition, 0.0, 1.0);
vTexCoord = inTexCoord;
}
</script>
<script id="fragment-shader" type="x-shader/x-fragment">#version 300 es
in highp vec2 vTexCoord;
layout(location = 0) out highp vec4 outColor;
uniform sampler2D uTexSampler;
void main() {
outColor = texture(uTexSampler, vTexCoord);
}
</script>
In order to use the texture I have to update the shaders to take in texture coordinates and expose a sampler uniform.
The vertex data is extended to define a rectangle and to include UV coordinates, which are added to a new buffer the same way the positions were:
const spriteVerts = [
    -0.4, 0.4, // Top Left
    0.4, 0.4, // Top Right
    0.4, -0.4, // Bottom Right
    -0.4, -0.4 // Bottom Left
];
const spriteUVs = [
    0, 1, // Top Left
    1, 1, // Top Right
    1, 0, // Bottom Right
    0, 0 // Bottom Left
];
function load() {
    // ...
    glVertBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, glVertBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(spriteVerts), gl.STATIC_DRAW);
    gl.bindBuffer(gl.ARRAY_BUFFER, null);
    glUVBuffer = gl.createBuffer();
    gl.bindBuffer(gl.ARRAY_BUFFER, glUVBuffer);
    gl.bufferData(gl.ARRAY_BUFFER, new Float32Array(spriteUVs), gl.STATIC_DRAW);
    gl.bindBuffer(gl.ARRAY_BUFFER, null);
    loadTexture();
}
Again, the UV coordinates are converted into a form which can be ingested by bufferData using Float32Array.
And finally drawing is extended to enable the UV buffer, and assign the texture to the sampler:
function draw() {
    // ...
    gl.enableVertexAttribArray(1);
    gl.bindBuffer(gl.ARRAY_BUFFER, glUVBuffer);
    gl.vertexAttribPointer(1, 2, gl.FLOAT, false, 0, 0);
    let samplerLocation = gl.getUniformLocation(glProgram, 'uTexSampler');
    gl.activeTexture(gl.TEXTURE0);
    gl.bindTexture(gl.TEXTURE_2D, texture);
    gl.uniform1i(samplerLocation, 0);
    gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);
}
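As an aside, the positions and UVs don’t have to live in two buffers. The stride and offset arguments of vertexAttribPointer, both 0 above, allow both attributes to be read from a single interleaved buffer. A sketch of that alternative layout:
// x, y, u, v per vertex: the stride is 16 bytes (four 32-bit floats),
// and the UVs start 8 bytes into each vertex.
const interleavedVerts = new Float32Array([
    -0.4, 0.4, 0, 1, // Top Left
    0.4, 0.4, 1, 1, // Top Right
    0.4, -0.4, 1, 0, // Bottom Right
    -0.4, -0.4, 0, 0 // Bottom Left
]);
let interleavedBuffer = gl.createBuffer();
gl.bindBuffer(gl.ARRAY_BUFFER, interleavedBuffer);
gl.bufferData(gl.ARRAY_BUFFER, interleavedVerts, gl.STATIC_DRAW);
gl.enableVertexAttribArray(0);
gl.vertexAttribPointer(0, 2, gl.FLOAT, false, 16, 0);
gl.enableVertexAttribArray(1);
gl.vertexAttribPointer(1, 2, gl.FLOAT, false, 16, 8);
This trades a second bindBuffer call per draw for slightly fiddlier offsets.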
And with that the texture appears on the canvas.
Ortho Projection
The above example stretches and distorts the texture. It’d be nice to set up the camera so that the sprite draws at its native resolution. I want to use an orthographic projection so that world space maps 1:1 to pixels. I can then reposition the vertices to create a rectangle with the same dimensions as the sprite, and the sprite should be rendered at its native resolution.
First I define a standard orthographic projection:
function ortho() {
    let right = gl.drawingBufferWidth;
    let left = 0;
    let top = 0;
    let bottom = gl.drawingBufferHeight;
    let far = 1;
    let near = 0;
    // Written in row-major order; it gets transposed to column-major
    // when uploaded with uniformMatrix4fv.
    return new Float32Array([
        2 / (right - left), 0, 0, -((right + left) / (right - left)),
        0, 2 / (top - bottom), 0, -((top + bottom) / (top - bottom)),
        0, 0, -2 / (far - near), -((far + near) / (far - near)),
        0, 0, 0, 1
    ]);
}
This makes a coordinate system where the top left of the screen is 0,0 and the bottom right is resolutionX,resolutionY.
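As a sanity check of the mapping, multiplying the screen-space corners through the row-major matrix should land on the corners of clip space. applyOrtho here is a hypothetical helper just for this check:
function applyOrtho(m, x, y) {
    // Row-major multiply of (x, y, 0, 1); only x and y are interesting here.
    return [
        m[0] * x + m[1] * y + m[3],
        m[4] * x + m[5] * y + m[7]
    ];
}
// For a 600x400 drawing buffer:
// applyOrtho(m, 0, 0) -> [-1, 1], the top left of clip space
// applyOrtho(m, 600, 400) -> [1, -1], the bottom right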
The vertex shader gains a uniform to allow the projection matrix to be passed in:
<script id="vertex-shader" type="x-shader/x-vertex">#version 300 es
layout(location = 0) in vec2 inPosition;
layout(location = 1) in vec2 inTexCoord;
out vec2 vTexCoord;
uniform mat4 uProjection;
void main() {
gl_Position = uProjection * vec4(inPosition, 0, 1.0);
vTexCoord = inTexCoord;
}
</script>
The rendering code is extended to pass the projection matrix in:
function draw() {
    // ...
    let projectionLocation = gl.getUniformLocation(glProgram, 'uProjection');
    // The transpose argument is true because ortho() returns a row-major
    // matrix; passing true here is only valid in WebGL 2.
    gl.uniformMatrix4fv(projectionLocation, true, ortho());
    gl.drawArrays(gl.TRIANGLE_FAN, 0, 4);
}
And I change the position of the vertices as planned:
const spriteVerts = [
    0, 0, // Top Left
    64, 0, // Top Right
    64, 64, // Bottom Right
    0, 64 // Bottom Left
];
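Rather than hard-coding the 64x64 size, the rectangle could be built from the dimensions of the loaded image. A sketch, with spriteQuad being a hypothetical helper:
function spriteQuad(x, y, width, height) {
    return new Float32Array([
        x, y, // Top Left
        x + width, y, // Top Right
        x + width, y + height, // Bottom Right
        x, y + height // Bottom Left
    ]);
}
// e.g. once the texture has loaded:
// gl.bufferData(gl.ARRAY_BUFFER, spriteQuad(0, 0, image.width, image.height), gl.STATIC_DRAW);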
With those changes made, the sprite now appears at its native resolution in the top left of the canvas.
Wrap Up
There is a lot more to WebGL. Most of it maps neatly from OpenGL ES, so I don’t plan on continuing to build on this topic as part of this series. I imagine I’ll write up any interesting rendering projects I work on, but those will be viewed through the lens of rendering experiments more than web gamedev investigations.