Edit: I have filed a bug report for this issue.
I am uploading a directory of large files, including CT scan images, to my server. The upload itself works, but I am running into memory problems.
document.getElementById("folderInput").addEventListener("change", doThing);

function doThing(event) {
  // Collect the selected files into a real array
  var filesArray = Array.from(event.target.files);
  readmultifiles(filesArray).then(function(results) {
    console.log("Results read: " + results.length);
  });
}
function readmultifiles(files) {
  const results = [];
  // Chain the reads so the files are processed sequentially
  return files.reduce(function(p, file) {
    return p.then(function() {
      return readFile(file).then(function(data) {
        // store the result in the results array
        results.push(data);
      });
    });
  }, Promise.resolve()).then(function() {
    // ensure final resolved value is the results array
    console.log("Returning results");
    return results;
  });
}
function readFile(file) {
  const reader = new FileReader();
  return new Promise(function(resolve, reject) {
    reader.onload = function(e) {
      resolve(e.target.result);
    };
    reader.onerror = reader.onabort = reject;
    reader.readAsArrayBuffer(file);
  });
}
See the JSFiddle of the proposed solution, which is based on the answer to this question.
In this example nothing is done with the data, yet memory usage still grows.
Memory consumption prior to uploading:
Memory consumption post-upload:
The uploaded folder is 342 MB, which explains the increase in memory usage; however, that memory should be released afterwards, shouldn't it?
If you have any suggestions on how to overcome this issue or if there is an alternative API that could be utilized instead of FileReader, please share your insights.
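One alternative worth trying, sketched below under the assumption that the data can be processed in pieces rather than as whole files: read each file in fixed-size chunks via `Blob.slice()` and `Blob.arrayBuffer()` instead of `FileReader.readAsArrayBuffer()`, so only one chunk's buffer is alive at a time. The chunk size and the `onChunk` callback here are placeholders, not part of the original code.

```javascript
// Sketch: chunked reading with Blob.slice()/Blob.arrayBuffer() instead of
// reading each whole file into a single ArrayBuffer. Only one chunk's buffer
// exists per iteration, so the engine can reclaim it once onChunk returns.
// DEFAULT_CHUNK_SIZE is an arbitrary choice for illustration.
const DEFAULT_CHUNK_SIZE = 8 * 1024 * 1024; // 8 MiB per read

async function processInChunks(blob, onChunk, chunkSize = DEFAULT_CHUNK_SIZE) {
  for (let offset = 0; offset < blob.size; offset += chunkSize) {
    // slice() creates a cheap view; arrayBuffer() materialises just this piece
    const buffer = await blob.slice(offset, offset + chunkSize).arrayBuffer();
    onChunk(buffer, offset);
    // `buffer` goes out of scope here and becomes collectable
  }
}
```

With the change handler above, `processInChunks(file, handleChunk)` could replace the `readFile(file)` call for each selected file, avoiding having all 342 MB resident at once.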
Edit: I suspect this is a Chrome/V8 bug. When testing with Firefox, the memory is freed correctly. It may be related to this reported bug.