I am still relatively new to the world of three.js, WebGL, and 3D graphics in general. My current project involves using three.js to visualize GPS track points in a 3D environment. The datasets I am working with can be quite large, containing hundreds of thousands of points, so optimizing performance is crucial.
To display these points, I utilize a `Points` object that is populated with a `BufferGeometry` containing all the necessary data. These points are added in chronological order, following the track path.
The visual representation of each point is achieved through a `PointsMaterial` with a 2D texture (sprite) depicting the point as a circle, while areas outside the circle remain transparent. Since the color of these points is configurable at runtime, the sprite texture is drawn dynamically on a canvas.
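The body of `drawPointSprite` is roughly the following (a minimal sketch, assuming a browser canvas; the optional `canvas` parameter is my own addition for illustration):

```js
// Draw a filled circle of the given colour on a square canvas,
// leaving the corners transparent (they become the transparent
// parts of the point sprite).
function drawPointSprite(radius, color, canvas = document.createElement('canvas')) {
  canvas.width = radius * 2;
  canvas.height = radius * 2;
  const ctx = canvas.getContext('2d');
  ctx.beginPath();
  ctx.arc(radius, radius, radius, 0, Math.PI * 2);
  ctx.fillStyle = color;
  ctx.fill();
  return canvas;
}
```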
However, a challenge arises when viewing the points along the direction of the track, where overlapping points cause artifacts. This occurs because the transparent parts of closer points are rendered above farther points:
https://i.sstatic.net/apR0c.png
Interestingly, when viewing the track from the opposite direction, the points happen to be drawn back to front and the artifacts disappear:
https://i.sstatic.net/DEtQ1.png
I have experimented with two potential solutions to address this problem:
- Using `alphaTest` with a value between 0 and 1 somewhat mitigates the issue. However, since some points have partial transparency based on customer requirements, there is a risk of clipping valid portions of the points. Additionally, it creates jagged edges where points overlap.
https://i.sstatic.net/TllVb.png
- Setting `depthWrite: false` for the points material results in visually appealing renderings. Unfortunately, newer points always obstruct older ones regardless of camera orientation, leading to incorrect visuals.
https://i.sstatic.net/P2gdK.png
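Conceptually, what I seem to need is a per-frame reordering of the point positions by distance to the camera, along these lines (a plain-JS sketch of the idea only; the function name and the `cameraPosition` shape are hypothetical, not part of my current code):

```js
// Reorder a flat [x0, y0, z0, x1, y1, z1, ...] position array
// back-to-front relative to the camera, so transparent sprites
// blend correctly when drawn in array order.
function sortPositionsBackToFront(positions, cameraPosition) {
  const count = positions.length / 3;
  const indices = Array.from({ length: count }, (_, i) => i);
  const dist2 = i => {
    const dx = positions[i * 3] - cameraPosition.x;
    const dy = positions[i * 3 + 1] - cameraPosition.y;
    const dz = positions[i * 3 + 2] - cameraPosition.z;
    return dx * dx + dy * dy + dz * dz;
  };
  indices.sort((a, b) => dist2(b) - dist2(a)); // farthest first
  const sorted = new Float32Array(positions.length);
  indices.forEach((src, dst) => {
    sorted[dst * 3] = positions[src * 3];
    sorted[dst * 3 + 1] = positions[src * 3 + 1];
    sorted[dst * 3 + 2] = positions[src * 3 + 2];
  });
  return sorted;
}
```

With hundreds of thousands of points, though, sorting every frame on the CPU sounds expensive, which is why I am asking what the idiomatic approach is.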
What would be an effective solution to render the points in correct depth order, starting with the farthest and ending with the closest?
Below are key snippets of the code implementation:
Building geometry using coordinates fetched from an XHR request:
```js
import { BufferGeometry, Float32BufferAttribute, Points } from 'three';

const particlesGeometry = new BufferGeometry();
const vertices = [];
for (let i = 0; i < timeline3d.points.length; i++) {
  const coordinates = timeline3d.points[i].sceneCoordinates;
  vertices.push(coordinates[0], coordinates[1], coordinates[2]);
}
// Note: in newer three.js releases addAttribute has been renamed to setAttribute.
particlesGeometry.addAttribute('position', new Float32BufferAttribute(vertices, 3));
return new Points(particlesGeometry, buildTimelinePointsMaterial(timeline3d));
```
Defining the material:
```js
import { CanvasTexture, PointsMaterial } from 'three';

function buildTimelinePointsMaterial(timeline) {
  const pointTexture = new CanvasTexture(drawPointSprite(POINT_SPRITE_RADIUS, timeline.color));
  return new PointsMaterial({
    size: POINT_SIZE,
    sizeAttenuation: true,
    map: pointTexture,
    transparent: true,
    alphaTest: 0.4
  });
}
```