I'm building a web application with Three.js that renders a scene as ASCII art, much like the example shown here. However, I'm running into frame-rate problems.
To improve performance, I've experimented with several algorithms for converting the rendered scene into an ASCII string. Some were slower than the example's and some faster, but none was efficient enough to render large scenes, even with the WebGL renderer.
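For reference, the CPU-side conversion I've been testing boils down to something like this (a simplified sketch; `pixels` stands in for grayscale values read back from the renderer, and the palette string is just my sample):

```javascript
// Map one frame of grayscale pixel data to an ASCII string.
// `pixels` is a flat, row-major array of luminance values in [0, 255],
// `width` is the number of character columns per row.
const PALETTE = " .:-=+*#%@"; // darkest -> brightest (sample palette)

function frameToAscii(pixels, width) {
  const lines = [];
  for (let row = 0; row < pixels.length / width; row++) {
    let line = "";
    for (let col = 0; col < width; col++) {
      const lum = pixels[row * width + col];
      // Scale luminance into a palette index, clamped to the last glyph.
      const idx = Math.min(
        PALETTE.length - 1,
        Math.floor((lum / 256) * PALETTE.length)
      );
      line += PALETTE[idx];
    }
    lines.push(line);
  }
  return lines.join("\n");
}
```

Even with typed arrays and a single pass like this, the per-frame string building on the CPU seems to be the bottleneck.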
Given that, I'm considering moving the conversion to the GPU with a shader. However, I'm not sure how to set up a Three.js shader to produce string output. Ideally, I'd also be able to supply a custom ASCII character set as a palette, even though GLSL has no string type.
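To make the question concrete, this is the kind of fragment shader I imagine, where the palette is baked into a one-row glyph-atlas texture and a glyph column is chosen by luminance. It's untested, the uniform names are my own, and I'm not sure this is the right approach at all:

```glsl
uniform sampler2D uScene;    // the rendered scene (a render target)
uniform sampler2D uGlyphs;   // 1-row atlas of palette glyphs, dark -> bright
uniform float uPaletteSize;  // number of glyphs in the atlas
uniform vec2 uGrid;          // character cells across / down the screen
varying vec2 vUv;

void main() {
  // Which character cell are we in, and where inside it?
  vec2 cell = floor(vUv * uGrid) / uGrid; // cell origin in scene UVs
  vec2 inner = fract(vUv * uGrid);        // 0..1 position inside the cell

  // Sample the scene at the cell centre and take its luminance.
  vec3 c = texture2D(uScene, cell + 0.5 / uGrid).rgb;
  float lum = dot(c, vec3(0.299, 0.587, 0.114));

  // Pick a glyph column by luminance and sample the atlas.
  float idx = floor(lum * (uPaletteSize - 1.0));
  vec2 atlasUv = vec2((idx + inner.x) / uPaletteSize, inner.y);
  gl_FragColor = texture2D(uGlyphs, atlasUv);
}
```

This would sidestep strings entirely by drawing the glyphs, but I don't know if that counts as "producing an ASCII string", or whether there's a standard way to get actual text back out.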
I'd appreciate any guidance on this! :)