
threejs-modern-app

Boilerplate and utils for a fullscreen three.js app

demo

It is inspired by mattdesl's threejs-app, but it has been rewritten and simplified using ES6 syntax rather than Node, making it easier to read and well commented, so it can be easily customized to fit your needs.

Features

  • All the three.js boilerplate code is tucked away in a single file. The exported WebGLApp is easily configurable from the outside: for example you can enable postprocessing, orbit controls, a gui, FPS stats, GPU detection, and use the save screenshot or record mp4 functionality. It also has built-in support for cannon-es. [Read more]
  • A scalable three.js component structure where each component is a class which extends THREE.Group, so you can add any object to it. The class also has update, resize, and pointer hooks. [Read more]
  • An asset manager which handles the preloading of .gltf models, images, audios, videos and can be easily extended to support other files. It also automatically uploads a texture to the GPU, loads cube env maps or parses equirectangular projection images. [Read more]
  • A global window.DEBUG flag which is true when the url contains ?debug as a query parameter, so you can enable debug mode both locally and in production. [Read more]
  • glslify to import shaders from node_modules. [Read more]
  • GPU tiering info using detect-gpu [Read more]
  • Modern and customizable development tools such as ⚡️esbuild⚡️, eslint, and prettier.
  • Beautiful console output:

console screenshots

Usage

Once you have installed the dependencies by running yarn, these are the available commands:

  • yarn start starts a server locally
  • yarn start --https starts an HTTPS server locally
  • yarn build builds the project for production, ready to be deployed from the build/ folder

All the build tools logic is in the package.json and esbuild.js.

WebGLApp

import WebGLApp from './utils/WebGLApp'

const webgl = new WebGLApp({ ...options })

The WebGLApp class contains all the code needed for three.js to run a scene. Since this code is always the same, it makes sense to hide it away in a standalone file and not think about it.

You can see an example configuration here:

// setup the WebGLRenderer
const webgl = new WebGLApp({
  canvas,
  // set the scene background color
  background: '#111',
  backgroundAlpha: 1,
  // enable postprocessing
  postprocessing: true,
  // show the fps counter from stats.js
  showFps: window.DEBUG,
  // enable OrbitControls
  orbitControls: window.DEBUG,
  // show the GUI
  gui: window.DEBUG,
  // enable cannon-es
  // world: new CANNON.World(),
})

You can pass the class the options you would pass to the THREE.WebGLRenderer, and also some more options:

Option Default Description
background '#111' The background of the scene.
backgroundAlpha 1 The transparency of the background.
maxPixelRatio 2 You can clamp the pixelRatio. Often the pixelRatio is clamped for performance reasons.
width window.innerWidth The canvas width.
height window.innerHeight The canvas height.
orthographic false Use an OrthographicCamera instead of the default PerspectiveCamera.
cameraPosition new Vector3(0, 0, 4) Set the initial camera position. The camera will always look at [0, 0, 0].
fov 45 The field of view of the PerspectiveCamera. It is ignored if the option orthographic is true.
frustumSize 3 Defines the size of the OrthographicCamera frustum. It is ignored if the option orthographic is false.
near 0.01 The camera near plane.
far 100 The camera far plane.
postprocessing false Enable the postprocessing library. The composer gets exposed as webgl.composer.
xr false Enable three.js WebXR mode. The update function will now have an xrframe object passed as a third parameter.
showFps false Show the stats.js fps counter.
orbitControls undefined Set this to true to enable OrbitControls. You can also pass an object of OrbitControls properties to set.
gui undefined Whether or not to initialize the gui using lil-gui. Exposed as webgl.gui.
guiClosed false Set this to true to initialize the gui panel closed.
world undefined Accepts an instance of the cannon-es world (new CANNON.World()). Exposed as webgl.world.
showWorldWireframes false Set this to true to show the wireframes of every body in the world. Uses cannon-es-debugger.

The webgl instance will contain all the three.js elements such as webgl.scene, webgl.renderer, webgl.camera or webgl.canvas. It also exposes some useful properties and methods:

webgl.isDragging

Whether or not the user is currently dragging. It is true between the onPointerDown and onPointerUp events.
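
For example, inside a component (where this.webgl is stored in the constructor, as shown later), you could pause an idle rotation while the user is dragging. This is just a sketch:

update(dt, time) {
  // keep idle-rotating only while the user isn't dragging
  if (!this.webgl.isDragging) {
    this.rotation.y += dt * 0.2
  }
}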

webgl.cursor

Set this property to change the cursor style of the canvas. For example you can use it to display the pointer cursor on some objects:

onPointerMove(event, { x, y }) {
  // raycast and get the intersecting mesh
  const intersectingMesh = getIntersectingMesh([x, y], this, this.webgl)

  if (intersectingMesh) {
    this.webgl.cursor = 'pointer'
  } else {
    this.webgl.cursor = null
  }
}

webgl.saveScreenshot({ ...options })

Save a screenshot of the application as a png.

Option Default Description
width window.innerWidth The width of the screenshot.
height window.innerHeight The height of the screenshot.
fileName 'Screenshot' The name the .png file will have.
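
For example, a minimal call could look like this sketch (the resolution and file name are just placeholders):

webgl.saveScreenshot({ width: 1920, height: 1080, fileName: 'my-screenshot' })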

webgl.startRecording({ ...options })

Start the recording of a video using mp4-wasm.

Option Default Description
width window.innerWidth The width of the video.
height window.innerHeight The height of the video.
fileName 'Recording' The name the .mp4 file will have.
...others Other options that you can pass to the createWebCodecsEncoder() method of mp4-wasm.

webgl.stopRecording()

Stop the recording and download the video. It returns a promise that is resolved once the processing is done.
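
For example, a recording session could look like this sketch (the resolution and file name are just placeholders):

webgl.startRecording({ width: 1920, height: 1080, fileName: 'my-recording' })

// ...later, once you have captured what you need
await webgl.stopRecording() // resolves when the mp4 has been processed and downloaded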

webgl.onUpdate((dt, time) => {})

Subscribe to the update requestAnimationFrame without having to create a component. If needed you can later unsubscribe the function with webgl.offUpdate(function).

Parameter Description
dt The seconds elapsed since the last frame; in a 60fps application it's 0.016s (aka 16ms).
time The time in seconds elapsed since the animation loop started.
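
For example, this sketch rotates a mesh every frame without creating a component (mesh is assumed to be an object you created elsewhere):

const rotateMesh = (dt) => {
  mesh.rotation.y += dt * 0.5
}

webgl.onUpdate(rotateMesh)

// later, if needed
webgl.offUpdate(rotateMesh)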

webgl.onPointerDown((event, { x, y }) => {})

Subscribe a function to the pointerdown event on the canvas without having to create a component. If needed you can later unsubscribe the function with webgl.offPointerDown(function).

Parameter Description
event The native event.
position An object containing the x and the y position from the top left of the canvas.

webgl.onPointerMove((event, { x, y }) => {})

Subscribe a function to the pointermove event on the canvas without having to create a component. If needed you can later unsubscribe the function with webgl.offPointerMove(function).

Parameter Description
event The native event.
position An object containing the x and the y position from the top left of the canvas. If the user is dragging, the object contains also the dragX and dragY distances from the drag start point.

webgl.onPointerUp((event, { x, y }) => {})

Subscribe a function to the pointerup event on the canvas without having to create a component. If needed you can later unsubscribe the function with webgl.offPointerUp(function).

Parameter Description
event The native event.
position An object containing the x and the y position from the top left of the canvas. The object contains also the dragX and dragY distances from the drag start point.
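
As a sketch, this is how you could subscribe and later unsubscribe one of these handlers (the logging is just a placeholder):

const handlePointerMove = (event, { x, y, dragX, dragY }) => {
  if (webgl.isDragging) {
    console.log(`dragged ${dragX}px horizontally and ${dragY}px vertically`)
  }
}

webgl.onPointerMove(handlePointerMove)

// later, if needed
webgl.offPointerMove(handlePointerMove)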

webgl.gui.wireUniforms('My Material', material.uniforms)

Automatically adds the exposed uniforms of a material to the gui, as a folder. Internally it adds the uniforms using gui.addSmart().

Argument Description
folderName The name of the folder the gui will make.
uniforms The uniforms of the material. They are exposed by default in a ShaderMaterial or RawShaderMaterial, or if you use the addUniforms function.

webgl.gui.addSmart(object, 'property')

Like gui.add() but tries to be smarter about it. For example, if the number is between 0 and 1, it sets the slider min to 0 and max to 1. Otherwise the slider uses an exponential mapping for easier iteration.

Argument Description
object The object the gui will modify. For example it can be any material.
'property' The name of the property in the object. The gui will modify this property.
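
For example, assuming material is a MeshStandardMaterial you created elsewhere, this sketch adds two controls:

// roughness goes from 0 to 1, so the slider gets min 0 and max 1
webgl.gui.addSmart(material, 'roughness')

// other numbers use the exponential mapping described above
webgl.gui.addSmart(material, 'envMapIntensity')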

Component structure

Rather than writing all of your three.js app in one file instruction after instruction, you can split your app into "three.js components". This makes it easier to manage the app as it grows. Here is a basic component:

https://github.com/marcofugaro/threejs-modern-app/blob/master/src/scene/Box.js

A three.js component is a class which extends THREE.Group (an alias for THREE.Object3D) and subsequently inherits its properties and methods, such as this.add(someMesh) or this.position or this.rotation. Here is a full list.
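
Condensed, a minimal component could look like this sketch (the class name and the placeholder mesh are just examples, the hooks are described below):

import { Group, Mesh, BoxGeometry, MeshStandardMaterial } from 'three'

export default class Spinner extends Group {
  constructor(webgl, options = {}) {
    super()
    this.webgl = webgl
    this.options = options

    // any object added here becomes part of the component
    this.mesh = new Mesh(new BoxGeometry(), new MeshStandardMaterial())
    this.add(this.mesh)
  }

  update(dt, time) {
    // called every frame once the component has been added to the scene
    this.mesh.rotation.y += dt
  }
}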

After having instantiated the class, you can add it directly to the scene.

// attach it to the scene so you can access it in other components
webgl.scene.birds = new Birds(webgl, { count: 1000 })
webgl.scene.add(webgl.scene.birds)

And in the component, you can use the options like this:

export default class Birds extends Group {
  constructor(webgl, options = {}) {
    super(options)
    // these can be used also in other methods
    this.webgl = webgl
    this.options = options

    // destructure and default values like you do in React
    const { count = 10 } = this.options

    // ...
  }
}

The class supports some hooks, which get called once the element is in the scene:

update(dt, time) {}

Called each frame of the animation loop of the application. Gets called by the main requestAnimationFrame.

Parameter Description
dt The seconds elapsed since the last frame; in a 60fps application it's 0.016s (aka 16ms).
time The time in seconds elapsed since the animation loop started.

resize({ width, height, pixelRatio }) {}

Called each time the window has been resized.

Parameter Description
width The window width.
height The window height.
pixelRatio The application pixelRatio; it's usually window.devicePixelRatio, clamped to webgl.maxPixelRatio.
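
For example, a component using a ShaderMaterial could keep a resolution uniform in sync, like in this sketch (the uniform name is just an assumption):

resize({ width, height, pixelRatio }) {
  this.material.uniforms.resolution.value.set(width * pixelRatio, height * pixelRatio)
}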

onPointerDown(event, { x, y }) {}

Called on the pointerdown event on the canvas.

Parameter Description
event The native event.
position An object containing the x and the y position from the top left of the canvas.

onPointerMove(event, { x, y }) {}

Called on the pointermove event on the canvas.

Parameter Description
event The native event.
position An object containing the x and the y position from the top left of the canvas. If the user is dragging, the object contains also the dragX and dragY distances from the drag start point.

onPointerUp(event, { x, y }) {}

Called on the pointerup event on the canvas.

Parameter Description
event The native event.
position An object containing the x and the y position from the top left of the canvas. The object contains also the dragX and dragY distances from the drag start point.

Functional Components

If you don't need any of the previous methods, you can use functional components, which are just plain functions with the objective of making the code easier to navigate.

import { DirectionalLight, AmbientLight } from 'three'

export function addLights(webgl) {
  const directionalLight = new DirectionalLight(0xffffff, 0.6)
  directionalLight.position.set(0, 10, 10)
  webgl.scene.add(directionalLight)

  const ambientLight = new AmbientLight(0xffffff, 0.5)
  webgl.scene.add(ambientLight)
}

// ...

addLights(webgl)

Asset Manager

The Asset Manager handles the preloading of all the assets needed to run the scene.

You can use it like this:

// preload the suzanne model
const suzanneKey = assets.queue({
  url: 'assets/suzanne.gltf',
  type: 'gltf',
})
// preload the materials
const albedoKey = assets.queue({
  url: 'assets/spotty-metal/albedo.jpg',
  type: 'texture',
})
const metalnessKey = assets.queue({
  url: 'assets/spotty-metal/metalness.jpg',
  type: 'texture',
  linear: true, // don't use gamma correction
})
const roughnessKey = assets.queue({
  url: 'assets/spotty-metal/roughness.jpg',
  type: 'texture',
  linear: true, // don't use gamma correction
})
const normalKey = assets.queue({
  url: 'assets/spotty-metal/normal.jpg',
  type: 'texture',
  linear: true, // don't use gamma correction
})
// preload the environment map
const hdrKey = assets.queue({
  url: 'assets/ouside-afternoon-blurred-hdr.jpg',
  type: 'env-map',
})

await assets.load({ renderer: webgl.renderer })

const suzanneGltf = assets.get(suzanneKey)

In detail, first you queue the asset you want to preload in the component where you will use it:

import assets from '../utils/AssetManager'

const key = assets.queue({
  url: 'assets/model.gltf',
  type: 'gltf',
})

Then you import the component in the index.js so that the code gets executed:

import Component from './scene/Component'

And then you start the queued assets loading promise, again in the index.js:

assets.load({ renderer: webgl.renderer }).then(() => {
  // assets loaded! we can show the canvas
})

After that, you init the component and use the asset in the component like this:

const modelGltf = assets.get(key)

These are all the exposed methods:

assets.queue({ url, type, ...others })

Queue an asset to be downloaded later with assets.load().

Option Default Description
url The url of the asset relative to the public/ folder. Can be an array if type: 'env-map' and you're loading a cube texture.
type autodetected The type of the asset, can be either gltf, image, svg, texture, env-map, json, audio or video. If omitted it will be discerned from the asset extension.
pmrem false Only if you set type: 'env-map', you can pass pmrem: true to use the PMREMGenerator and prefilter for irradiance. This is often used when applying an envMap to an object rather than a scene background.
gamma false Only if you set type: 'texture' or type: 'env-map'. By default, the encoding of the texture is set to linear color space. You can pass gamma: true to enable gamma correction for the texture, useful when loading color data such as albedo maps in a gamma corrected workflow.
...others Other options that can be assigned to a Texture when the type is either env-map or texture.

Returns a key that later you can use with assets.get().
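
For example, any extra option is assigned to the resulting Texture. In this sketch the url is just a placeholder, and RepeatWrapping comes from three:

const patternKey = assets.queue({
  url: 'assets/pattern.jpg',
  type: 'texture',
  wrapS: RepeatWrapping,
  wrapT: RepeatWrapping,
})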

assets.queueStandardMaterial(maps, options)

Utility to queue multiple maps belonging to the same PBR material. They can later be passed directly to the MeshStandardMaterial.

For example, here is how you load a brick PBR texture:

const bricksKeys = assets.queueStandardMaterial(
  {
    map: `assets/bricks/albedo.jpg`,
    roughnessMap: `assets/bricks/roughness.jpg`,
    metalnessMap: `assets/bricks/metallic.jpg`,
    normalMap: `assets/bricks/normal.jpg`,
    displacementMap: `assets/bricks/height.jpg`,
    aoMap: `assets/bricks/ambientocclusion.jpg`,
  },
  {
    repeat: new Vector2().setScalar(0.5),
    wrapS: RepeatWrapping,
    wrapT: RepeatWrapping,
  }
)

As you can see, you can pass as a second argument any property you want to apply to all textures.

If you're using gamma, the textures with color data will be automatically gamma encoded.

Option Default Description
maps An object containing urls for any map from the MeshStandardMaterial or MeshPhysicalMaterial.
options Options you can assign to all textures, such as wrapping or repeating. Any other property of the Texture can be set.

Returns a keys object that later you can use with assets.getStandardMaterial().

assets.load({ renderer })

Load all the assets previously queued.

Option Default Description
renderer The WebGLRenderer of your application, exposed as webgl.renderer.

assets.loadSingle({ url, type, renderer, ...others })

Load a single asset without having to pass through the queue. Useful if you want to lazy-load some assets after the application has started, usually assets that are not needed immediately.

Option Default Description
renderer The WebGLRenderer of your application, exposed as webgl.renderer.
url The url of the asset relative to the public/ folder. Can be an array if type: 'env-map' and you're loading a cube texture.
type autodetected The type of the asset, can be either gltf, image, svg, texture, env-map, json, audio or video. If omitted it will be discerned from the asset extension.
pmrem false Only if you set type: 'env-map', you can pass pmrem: true to use the PMREMGenerator and prefilter for irradiance. This is often used when applying an envMap to an object rather than a scene background.
gamma false Only if you set type: 'texture' or type: 'env-map'. By default, the encoding of the texture is set to linear color space. You can pass gamma: true to enable gamma correction for the texture, useful when loading color data such as albedo maps in a gamma corrected workflow.
...others Other options that can be assigned to a Texture when the type is either env-map or texture.

Returns a key that later you can use with assets.get().
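
For example, a lazy-load could look like this sketch (the url is just a placeholder, and the await is harmless even if the returned key is not a promise):

// inside an async function, or with top-level await
const envMapKey = await assets.loadSingle({
  renderer: webgl.renderer,
  url: 'assets/studio-env-map.jpg',
  type: 'env-map',
})

const envMap = assets.get(envMapKey)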

assets.addProgressListener((progress) => {})

Pass a function that gets called each time an asset finishes downloading. The argument progress goes from 0 to 1, with 1 meaning every queued asset has been downloaded.
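
For example, you could drive a loading bar like in this sketch (loadingBarElement is just a placeholder DOM element):

assets.addProgressListener((progress) => {
  loadingBarElement.style.transform = `scaleX(${progress})`
})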

assets.get(key)

Retrieve an asset previously loaded with assets.queue() or assets.loadSingle().

Option Default Description
key The key returned from assets.queue() or assets.loadSingle(). It corresponds to the url of the asset.

assets.getStandardMaterial(keys)

Retrieve an asset previously queued with assets.queueStandardMaterial().

It returns an object of the loaded textures that can be fed directly into MeshStandardMaterial like this:

const textures = assets.getStandardMaterial(keys)
const material = new MeshStandardMaterial({ ...textures })

Option Default Description
keys The keys object returned from assets.queueStandardMaterial().

Debug mode

Often you want to show the fps count or debug helpers such as the SpotLightHelper only when you're developing or debugging.

A really manageable way is to have a global window.DEBUG constant which is true only if you append ?debug to your url, for example http://localhost:8080/?debug or even in production like https://example.com/?debug.

This is done here in just one line:

window.DEBUG = window.location.search.includes('debug')

You could also add more global constants by just using more query-string parameters, like this ?debug&fps.
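
For example, this sketch adds a debug-only helper and a second flag (spotLight is assumed to exist elsewhere, the SpotLightHelper comes from three, and the flag name is just an example):

if (window.DEBUG) {
  webgl.scene.add(new SpotLightHelper(spotLight))
}

// a second flag, enabled with e.g. https://example.com/?debug&fps
window.FPS = window.location.search.includes('fps')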

glslify

glslify lets you import shader code directly from node_modules.

For example, if you run a string you're using in three's onBeforeCompile through glslify, you can import glsl-noise like this:

import glsl from 'glslify'

// ...

shader.vertexShader = glsl`
  uniform float time;
  uniform float speed;
  uniform float frequency;
  uniform float amplitude;

  #pragma glslify: noise = require('glsl-noise/simplex/3d')

  // the function which defines the displacement
  float displace(vec3 point) {
    return noise(vec3(point.xy * frequency, time * speed)) * amplitude;
  }

  // ...
`

💡 BONUS TIP: you can have glsl syntax highlighting for inline glsl strings in VSCode with the extension glsl-literal.

glslify is applied also to files with the .frag, .vert or .glsl extensions. They are imported as plain strings:

// pass.vert
varying vec2 vUv;

void main() {
  vUv = uv;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(position, 1.0);
}

// index.js
import passVertexShader from '../shaders/pass.vert'

// ...

const material = new ShaderMaterial({
  // it's a string
  vertexShader: passVertexShader,

  // ...
})

For a list of shaders you can import via glslify, check out the stack.gl packages list.

GPU Info

Sometimes it might be useful to enable expensive application configuration only on higher-end devices.

This can be done by detecting the user's GPU and checking which tier it belongs to based on its benchmark score.

This is done thanks to detect-gpu; more detailed info about these mechanics is in its README.

For example, here is how to enable shadows only on high-tier devices:

if (webgl.gpu.tier > 1) {
  webgl.renderer.shadowMap.enabled = true

  // soft shadows
  webgl.renderer.shadowMap.type = PCFSoftShadowMap
}

Here is what the exposed webgl.gpu object contains:

Key Example Value Description
tier 1 The tier the GPU belongs to. It is incremental, so the higher the better. It goes from 0 to 3. Most GPUs belong to Tier 2.
isMobile false Whether it is a mobile/tablet GPU, or a desktop GPU.
name 'intel iris graphics 6100' The string name of the GPU.
fps 21 The specific rank value of the GPU.

⚠️ WARNING: webgl.gpu is set asynchronously since the benchmark data needs to be fetched. You might want to wait for the exposed promise webgl.loadGPUTier.
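
For example, this sketch waits for the detection before reading the tier:

// inside an async function, or with top-level await
await webgl.loadGPUTier
if (webgl.gpu.tier > 1) {
  // safe to read webgl.gpu and enable the expensive features here
}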

More info on this approach can also be found in this great talk by luruke.