I have a per-instance scale attribute that I apply to the instances of an InstancedBufferGeometry in Three.js. The vertex shader looks like this:
attribute float scale;
uniform vec3 uMeshPosition;

void main() {
  vec3 pos = position;
  pos.x *= (uMeshPosition.x - pos.x) * scale + uMeshPosition.x;
  pos.z *= (uMeshPosition.z - pos.z) * scale + uMeshPosition.z;
  pos.y *= (uMeshPosition.y - pos.y) * scale + uMeshPosition.y;
  gl_Position = projectionMatrix * modelViewMatrix * vec4(pos, 1.0);
}
https://i.sstatic.net/s7TOT.png
Note that Z is the height (up) axis in this case. I want each scaled cube to keep its original center, as indicated by the wireframe cube above.
https://i.sstatic.net/7tSx7.png
Is it possible to apply the scaling on all three axes entirely in the shader, without computing and applying the scale on the CPU?
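Conceptually, what I am after for every vertex is a scale about the instance's own center. Expressed on the CPU with a Vector3 purely to illustrate the intent (center and scale here are placeholders, not my actual code):

// Illustration only, using three's Vector3: scale a point about a given center
const scaleAboutCenter = (point, center, scale) => point
  .clone()
  .sub(center) // move into the center's local space
  .multiplyScalar(scale) // apply the scale
  .add(center) // move back

// e.g. a corner of a cube centered at (0, 0.4, 0), scaled by 2,
// moves away from (0, 0.4, 0) while the cube stays centered on it
const corner = scaleAboutCenter(new Vector3(0.5, 0.9, 0.5), new Vector3(0, 0.4, 0), 2)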
Updates:
This is how I create the geometry. It usually holds multiple cubes, but in this case there is just one:
const createInstancedGeometry = (instanceCount, sizeX = 1, sizeY = 1, sizeZ = 1) => {
  const geometry = new InstancedBufferGeometry()
  geometry.maxInstancedCount = instanceCount
  // Base cube shared by every instance, shifted up 0.4 on its local Y axis
  const shape = new BoxBufferGeometry(0.1 * sizeX, 0.1 * sizeY, 0.1 * sizeZ)
  shape.translate(0, 0.4, 0)
  // Copy the box's vertex data into the instanced geometry
  const data = shape.attributes
  geometry.addAttribute('position', new BufferAttribute(new Float32Array(data.position.array), 3))
  geometry.addAttribute('uv', new BufferAttribute(new Float32Array(data.uv.array), 2))
  geometry.addAttribute('normal', new BufferAttribute(new Float32Array(data.normal.array), 3))
  geometry.setIndex(new BufferAttribute(new Uint16Array(shape.index.array), 1))
  shape.dispose()
  // Adds the per-instance attributes (scale, progress, randoms)
  createInstancedAtrributes(geometry, instanceCount)
  return geometry
}
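createInstancedAtrributes is a small helper of mine and not the interesting part; roughly, it sets up the per-instance attributes that update() below reads (a simplified sketch, not my exact implementation):

const createInstancedAtrributes = (geometry, instanceCount) => {
  // One float per instance for each attribute used in update()
  const scales = new Float32Array(instanceCount)
  const progress = new Float32Array(instanceCount)
  const randoms = new Float32Array(instanceCount)
  for (let i = 0; i < instanceCount; i += 1) {
    scales[i] = 1
    progress[i] = Math.random()
    randoms[i] = Math.random()
  }
  geometry.addAttribute('scale', new InstancedBufferAttribute(scales, 1))
  geometry.addAttribute('progress', new InstancedBufferAttribute(progress, 1))
  geometry.addAttribute('randoms', new InstancedBufferAttribute(randoms, 1))
}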
This is how I set up the shader material (I am not actually using the colors yet):
const createShader = () => {
  const uniforms = {
    // uMap: { type: 't', value: null },
    uColor1: { type: 'c', value: new Color(0x961800) }, // red
    uColor2: { type: 'c', value: new Color(0x4b5828) }, // yellow
    uMeshPosition: { type: 'vec3', value: new Vector3(0, 0, 0) },
  }
  const shader = new ShaderMaterial({
    uniforms,
    vertexShader,
    fragmentShader,
    blending: AdditiveBlending,
    transparent: true,
    depthWrite: false,
  })
  return shader
}
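The fragmentShader is not relevant to the scaling problem; for the purpose of this question you can think of it as nothing more than a flat color (a simplified stand-in, not my actual shader):

const fragmentShader = `
  uniform vec3 uColor1;

  void main() {
    gl_FragColor = vec4(uColor1, 1.0);
  }
`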
The constructor of my ParticleFire class looks like this:
constructor({ sizeX = 1, sizeY = 1, sizeZ = 1 } = {}) {
  const instanceCount = 1
  const geometry = createInstancedGeometry(instanceCount, sizeX, sizeY, sizeZ)
  const material = createShader()
  const mesh = new Mesh(geometry, material)
  mesh.frustumCulled = false
  this.geometry = geometry
  this.material = material
  this.mesh = mesh
  mesh.up = new Vector3(0, 0, 1)
  mesh.position.set(2, 2, 1)
  mesh.rotateX(Math.PI / 2)
  this.instanceCount = instanceCount
  // White wireframe that marks the original, unscaled cube
  const lineGeo = new EdgesGeometry(geometry) // or WireframeGeometry
  const mat = new LineBasicMaterial({ color: 0xffffff, linewidth: 2 })
  const wireframe = new LineSegments(lineGeo, mat)
  this.mesh.add(wireframe)
}
And the update function looks like this:
update() {
  const { instanceCount } = this
  const { scale, progress, randoms } = this.geometry.attributes
  const { uMeshPosition } = this.material.uniforms
  uMeshPosition.value = this.mesh.position
  for (let i = 0; i < instanceCount; i += 1) {
    let value = progress.array[i]
    value += 0.025
    if (value > 1) {
      // Instance has finished a cycle: restart it with a new random scale
      value -= 1
      scale.setX(i, randomValueBetween(0.3, 2, 3))
      // randoms.setX(i, randomValueBetween(0, 1, 3))
    }
    // progress.setX(i, value)
  }
  scale.needsUpdate = true
  // randoms.needsUpdate = true
  // progress.needsUpdate = true
}
To add the object to the scene, I do the following:
const pFire = new ParticleFire()
scene.add(pFire.mesh)
To update it within a render loop:
pFire.update({ deltaTime })
renderer.render(scene, cameraController.camera)
requestAnimationFrame(animate)
cameraController.camera refers to a basic camera controller whose camera is added to the scene as a child of a 'character' that moves around the scene:
configuredCamera = new PerspectiveCamera(
  75, window.innerWidth / window.innerHeight, 0.1, 5000,
)
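For completeness, the camera is parented to the character roughly like this (names here are illustrative, not my exact controller code):

// Illustrative: the camera is attached to a 'character' Object3D so it moves with it
const character = new Object3D()
character.add(configuredCamera)
scene.add(character)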