Three.js
Over the past few weeks I’ve been exploring the world of 3D graphics on the web. As part of that exploration I’ve travelled the lands of WebGL and WebGPU and discovered three.js (amongst many other treasures). What follows is a bunch of notes and experiments with three.js, many of which will be reminiscent of the great three.js docs.
Getting Started
To get started, open your favorite tinkering editor (CodePen, StackBlitz, JSFiddle, etc.) and import the three.js library (if you don’t have a favorite you can use this starter in StackBlitz). Or you can just create an empty HTML file and add a script tag:
<script src="https://cdnjs.cloudflare.com/ajax/libs/three.js/0.145.0/three.min.js"></script>
To be able to display anything with three.js, we need at least three elements: a scene, a camera and a renderer, so that we can render the scene with the camera.
// 1. Create scene
const scene = new THREE.Scene();
// 2. Create a camera of type PerspectiveCamera
const camera = new THREE.PerspectiveCamera(
/* FOV - Field of view */ 75,
/* Aspect Ratio */ window.innerWidth / window.innerHeight,
/* Near clipping plane */ 0.1,
/* Far clipping plane */ 1000 );
// 3. Create a WebGL renderer
const renderer = new THREE.WebGLRenderer();
// We set the size at which we want to render the 3D visualization
// in our app. For performance-intensive apps we can select smaller
// sizes or tell the renderer to use a lower resolution.
renderer.setSize( window.innerWidth, window.innerHeight );
// 4. Add renderer dom element to the DOM
// this is a <canvas> element we'll use to render our graphics
// Since we're using WebGL, the canvas will use a 'webgl' rendering context
document.body.appendChild( renderer.domElement );
The camera we have created is a PerspectiveCamera, a camera designed to mimic the way the human eye sees. It is the most common projection used to render a 3D scene. The parameters used to create a perspective camera are the field of view, the aspect ratio, and the near and far clipping planes, which together define the camera’s viewing frustum.
A camera view frustum
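Note that the aspect ratio comes straight from the window size. The examples in these notes don’t strictly need it, but as a small sketch (reusing the camera and renderer created above), this is how one could keep them in sync when the window is resized:
// Keep the camera frustum and the canvas in sync with the window size
window.addEventListener('resize', () => {
  camera.aspect = window.innerWidth / window.innerHeight;
  // The projection matrix needs to be recomputed after changing camera parameters
  camera.updateProjectionMatrix();
  renderer.setSize(window.innerWidth, window.innerHeight);
});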
Drawing a Cube
We can create a cube by following these steps:
- Define a geometry for the cube: using BoxGeometry, which contains all our cube’s vertices and faces
- Define a material to color the cube: for example, MeshBasicMaterial
- Create a Mesh that applies the material to the geometry
// 1. Define Geometry
// The BoxGeometry lets us define rectangular cuboids of a given (width, height, depth)
const geometry = new THREE.BoxGeometry( 1, 1, 1 );
// 2. Define the material to color the cube
// The MeshBasicMaterial lets us draw geometries in a simple shaded (flat or wireframe) way
// that is not affected by light
const material = new THREE.MeshBasicMaterial( { color: 0xff00ff } );
// 3. Create a mesh to apply material to geometry
const cube = new THREE.Mesh( geometry, material );
Once the cube has been created we can add it to our scene:
// When we call scene.add() the new element will be added at position (0,0,0)
scene.add( cube );
// To avoid having our camera inside the object we move it back a bit along the z axis
camera.position.z = 5;
But we can’t see it yet because we aren’t rendering our scene. In order to render our scene we need to create a render loop:
function render() {
  requestAnimationFrame(render);
  renderer.render(scene, camera);
}
render();
This takes advantage of the requestAnimationFrame API to render the cube every time the screen is refreshed, normally at 60 fps.
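As an aside, the WebGLRenderer also provides setAnimationLoop, which handles requestAnimationFrame for us (and is the route to take if you ever target WebXR). Just as a sketch, a roughly equivalent loop would be:
// Equivalent render loop using the renderer's built-in animation loop
renderer.setAnimationLoop(() => {
  renderer.render(scene, camera);
});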
Animating a Cube
We can update our render loop function to animate the cube by changing its rotation coordinates:
function render() {
  requestAnimationFrame(render);
  // By updating the rotation coordinates on every frame we make
  // the cube rotate. Refer to https://threejs.org/docs/?q=Mesh#api/en/core/Object3D
  // for the most common APIs to interact with objects.
  cube.rotation.x += 0.01;
  cube.rotation.y += 0.01;
  renderer.render(scene, camera);
}
And thus we have a rotating cube:
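One thing to note: the loop above rotates the cube a fixed amount per frame, so it spins faster on displays with higher refresh rates. A small variation using THREE.Clock (just a sketch, not something the demo above needs) makes the speed independent of the frame rate:
const clock = new THREE.Clock();

function render() {
  requestAnimationFrame(render);
  // getDelta() returns the seconds elapsed since the previous frame
  const delta = clock.getDelta();
  // Rotate at roughly 0.6 radians per second regardless of frame rate
  cube.rotation.x += 0.6 * delta;
  cube.rotation.y += 0.6 * delta;
  renderer.render(scene, camera);
}
render();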
Drawing Lines
We can also draw lines following a similar process, combining the right geometry and material like so:
// 1. Define Geometry
// The BufferGeometry lets us define a mesh, line or point geometry in a lot more detail
const geometry = new THREE.BufferGeometry().setFromPoints([
new THREE.Vector3( - 10, 0, 0 ),
new THREE.Vector3( 0, 10, 0 ),
new THREE.Vector3( 10, 0, 0 ),
]);
// 2. Define the material to color the lines
// The LineBasicMaterial lets us draw wireframe-style geometries in a way that
// isn't affected by light
const material = new THREE.LineBasicMaterial( { color: 0x0000ff } );
// 3. Create a line to apply material to geometry
// A Line is a 3D object used to render continuous lines in three.js
const line = new THREE.Line( geometry, material );
In the example above we used a BufferGeometry which is a representation of mesh, line, or point geometry that includes vertex positions, face indices, normals, colors, UVs, and custom attributes within buffers, reducing the cost of passing all this data to the GPU.
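As with the cube, the line only shows up once it is added to the scene and rendered. Assuming the same scene, camera and renderer from the first example, that could look like this:
// Add the line to the scene at position (0,0,0)
scene.add( line );
// Pull the camera back so the whole line fits in view
camera.position.z = 20;
renderer.render( scene, camera );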
Adding some Lighting
So far we’ve only worked with unlit shapes. We can include lighting in our 3D composition by using different three.js materials. The basic materials we’ve used so far aren’t affected by light, but there are a number of materials that do support lighting, like the MeshLambertMaterial. Let’s light a sphere:
// 1. Define Geometry
// With fewer segments we get a less perfect sphere but it is easier to see
// how it rotates
const geometry = new THREE.SphereGeometry(
/* radius */ 15,
/* widthSegments */ 16,
/* heightSegments */ 16 );
// 2. Define material
const material = new THREE.MeshLambertMaterial( { color: 0x00ffee } );
// 3. Create a Mesh to apply the material to the geometry
const sphere = new THREE.Mesh( geometry, material );
scene.add( sphere );
// Make sure we can see the sphere and aren't inside of it
camera.position.z = 50
At this point, were we to render the sphere as it is, we wouldn’t see a thing. There’s no light in our scene, so there’s no light to reflect, and thus our sphere sits in complete darkness in the vastness of the void. So we add a light source:
// Create a point light: A light that gets emitted from a single point in all directions.
const pointLight = new THREE.PointLight(0xffffff)
// Set its position
pointLight.position.x = 50
pointLight.position.y = 50
pointLight.position.z = 100
// Add to the scene
scene.add(pointLight)
And now we can see Neptune rotating away:
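The rotation code isn’t shown above; a render loop along the lines of the earlier one does the trick, something like this sketch (not necessarily the exact code behind the demo):
function render() {
  requestAnimationFrame(render);
  // Spin the sphere slowly around its vertical axis
  sphere.rotation.y += 0.005;
  renderer.render(scene, camera);
}
render();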
You might notice there’s a wireframe that represents the segments that compose the sphere. I just added that as an additional sphere with a MeshLambertMaterial constructed with the wireframe option enabled: {wireframe: true}.
const wireframe = new THREE.Mesh(
  new THREE.SphereGeometry(15, 16, 16),
  new THREE.MeshLambertMaterial({
    // a little darker so it's visible
    color: 0x00eeaa,
    // this renders the wireframe
    wireframe: true,
  })
)
scene.add(wireframe)
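If the solid sphere and the wireframe are rotated independently they can drift apart. One way to keep them in lockstep (an assumption on my part, not necessarily how the demo above is built) is to parent both meshes to a THREE.Group and rotate the group instead:
const neptune = new THREE.Group();
// add() re-parents the meshes, so from now on they rotate with the group
neptune.add(sphere);
neptune.add(wireframe);
scene.add(neptune);
// ...and in the render loop: neptune.rotation.y += 0.005;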
Resources
- threejs.org
- three.js docs
- three.js examples
- Articles and Tutorials
- threejs.org useful links contains a bunch of additional resources to learn three.js
- Aerotwist tutorials on WebGL and three.js
Written by Jaime González García, dad, husband, software engineer, UX designer, amateur pixel artist, tinkerer and master of the arcane arts. You can also find him on Twitter jabbering about random stuff.