I'm working with a large MongoDB database that contains millions of documents, and I need to fetch them all in one go without crashing or running into cursor errors. My goal is to send this data over HTTP using Express in Node.js. The collection itself holds thousands of documents, each with a field containing thousands of embedded sub-documents, and its current size is about 500 MB. Do you have any recommendations for handling this big-data scenario? Should I consider a limit/skip based solution? If so, could you provide an example code snippet?
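For reference, this is roughly what I imagine a limit/skip approach would look like, with the client pulling one batch per request. It's just an untested sketch: the /paged route, the page query parameter and PAGE_SIZE are placeholders I made up for illustration.

// Rough limit/skip sketch (untested) using the MongoDB Node.js driver.
// PAGE_SIZE and the route/query names are arbitrary placeholders.
const PAGE_SIZE = 1000;

app.get("/api/:collection/paged", async (req, res) => {
  const page = parseInt(req.query.page, 10) || 0;

  const docs = await db
    .collection(req.params.collection)
    .find({})
    .sort({ _id: 1 })        // stable order so consecutive pages don't overlap
    .skip(page * PAGE_SIZE)  // jump over the batches already served
    .limit(PAGE_SIZE)        // return only one batch per request
    .toArray();

  res.json(docs);
});

I sorted by _id only so that pages stay consistent between requests; I'm not sure whether skip would stay fast enough once the offset gets large on a collection this size.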
I've already tried document streaming, which seemed more reliable, but I still ran into the same cursor error ("Cursor not found"):
app.get("/api/:collection", (req, res) => {
const filter = JSON.parse(req.query["filter"] || "{}");
const projection = JSON.parse(req.query["projection"] || "{}");
const sort = JSON.parse(req.query["sort"] || "{}");
db.collection(req.params.collection).find(filter)
.project(projection).sort(sort)
.stream({ transform: JSON.stringify })
.addCursorFlag("noCursorTimeout", true)
.pipe(res);
});