In my current project, I am exploring the use of the latest Node.js streams API to create a stream that buffers a specific amount of data. This buffer should be automatically flushed when the stream is piped to another stream or when it emits `readable` events. The challenge lies in ensuring that the buffer is flushed every time the stream is piped to a new destination, even if it has already been flushed to a previous destination.
Here's an example scenario:
- The `BufferStream` class is built using `stream.Transform` and maintains a 512KB ring buffer internally (see the sketch after this list).
- `ReadableStreamA` is connected as a source to an instance of `BufferStream`.
- `BufferStream` continuously writes incoming data from `ReadableStreamA` to its ring buffer (overwriting old data).
- The buffered data in `BufferStream` is then piped to `WritableStreamB`.
- `WritableStreamB` receives the entire 512KB buffer and continues to receive ongoing data from `ReadableStreamA` through `BufferStream`.
- The same buffer in `BufferStream` is also piped to `WritableStreamC`.
- `WritableStreamC` receives a separate copy of the 512KB buffer, which may differ from what `WritableStreamB` received due to additional data being written to `BufferStream`.
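Here is a minimal sketch of the ring-buffer bookkeeping, assuming Buffer chunks. `BufferStream` is the class named above, but the chunk-list window and the `snapshot()` helper are my own illustration, not part of the streams API:

```js
const { Transform } = require('stream');

const RING_SIZE = 512 * 1024; // 512KB window

class BufferStream extends Transform {
  constructor(options) {
    super(options);
    this._ring = [];      // most recent chunks, oldest first
    this._ringLength = 0; // total bytes currently retained
  }

  _transform(chunk, encoding, callback) {
    // Retain every chunk, dropping the oldest ones once the window
    // exceeds RING_SIZE (approximate: eviction is chunk-granular).
    this._ring.push(chunk);
    this._ringLength += chunk.length;
    while (this._ringLength > RING_SIZE && this._ring.length > 1) {
      this._ringLength -= this._ring.shift().length;
    }
    callback(null, chunk); // pass the data through unchanged
  }

  // Copy of the current window as a single Buffer.
  snapshot() {
    return Buffer.concat(this._ring, this._ringLength);
  }
}
```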
My question is: can this functionality be achieved with the streams API? So far, my only idea involves creating an object with a custom method that generates a new `PassThrough` stream for each destination, instead of simply piping to and from a single stream.
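That idea might look like the hypothetical `tee()` below, built on the `snapshot()` helper from the sketch above; the function name and structure are my own, not an existing API:

```js
const { PassThrough } = require('stream');

// Each call creates a fresh PassThrough that first receives a copy of
// the current 512KB window and then all subsequent live data.
function tee(bufferStream) {
  const out = new PassThrough();
  out.write(bufferStream.snapshot());     // replay the buffered window first
  bufferStream.pipe(out, { end: false }); // then forward ongoing data
  return out;
}

// readableStreamA.pipe(bufferStream);
// tee(bufferStream).pipe(writableStreamB);
// tee(bufferStream).pipe(writableStreamC);
```

One caveat this sketch glosses over: chunks already queued on the Transform's readable side when `tee()` is called would arrive twice (once in the snapshot, once via the pipe), so a real implementation has to reconcile the two.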
As a side note, I have previously accomplished similar tasks with the older "flowing" API by monitoring new subscribers to `data` events: whenever a new listener was attached with `.on('data')`, I would call it directly with a copy of the ring buffer's contents.
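Roughly, that trick relied on the built-in `newListener` event, which fires just before a listener is attached. This reconstruction assumes a plain `EventEmitter` standing in for the old stream, with `window` as a stand-in for the ring buffer:

```js
const { EventEmitter } = require('events');

const source = new EventEmitter();
let window = Buffer.alloc(0); // stand-in for the ring buffer contents

source.on('newListener', (event, listener) => {
  if (event === 'data') {
    // Prime the new subscriber with its own copy of the buffered window,
    // deferred so it runs after the listener is actually attached.
    process.nextTick(() => listener(Buffer.from(window)));
  }
});
```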