My goal is to send synchronous calls to a page that handles the SQL insertion of the words I am posting. However, because there are many chunks and the SQL inserts must run in order, I want each AJAX call to be processed only after the previous one completes.
for (var chunk = 1; chunk <= totalchunks; chunk++) {
    $.ajax({
        type: "POST",
        dataType: "json",
        url: "updateHandle.php",
        data: { words: arr.slice(1000 * (chunk - 1), 1000 * chunk), push: getpush },
        success: function() {
            console.log('Items added');
        },
        error: function() {
            console.log('Errors happened');
        }
    });
}
Unfortunately, setting async: false does not seem to help: every AJAX call ends up in the error callback instead of the success callback. Are there any alternative solutions that I might have missed?
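One common alternative (a sketch, not taken from the question) is to serialize the requests with a promise chain so that each call starts only after the previous one resolves. In the sketch below, sendChunk() is a hypothetical stand-in for the real $.ajax POST to updateHandle.php; it just resolves after a short random delay so the pattern is runnable on its own.

```javascript
// Promise-chain sketch: each request starts only after the previous one
// finishes. sendChunk() is a hypothetical stand-in for the $.ajax POST to
// updateHandle.php, simulated with setTimeout so this runs standalone.
const order = [];

function sendChunk(chunk) {
  return new Promise(resolve => {
    setTimeout(() => {      // simulate network latency
      order.push(chunk);    // record which chunk finished, and in what order
      resolve();
    }, Math.random() * 10);
  });
}

function sendAll(totalChunks) {
  let chain = Promise.resolve();
  for (let chunk = 1; chunk <= totalChunks; chunk++) {
    // Returning the next request from .then() makes the chain wait for it.
    chain = chain.then(() => sendChunk(chunk));
  }
  return chain;
}

sendAll(5).then(() => console.log(order)); // → [ 1, 2, 3, 4, 5 ]
```

With jQuery, $.ajax already returns a promise-like (thenable) object, so in the real code sendChunk could simply return the $.ajax({...}) call itself.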
I considered implementing a busy-waiting while-loop with locks, but my attempts with setTimeout() have not produced the desired results (likely an error on my end).
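Instead of a timer-based busy-wait, the same serialization can be done with plain callbacks: the completion handler for chunk N launches the request for chunk N+1. A minimal runnable sketch, again with a hypothetical sendChunk() standing in for the real $.ajax call:

```javascript
// Callback-recursion sketch: chunk N's completion callback starts chunk
// N+1, so no busy-waiting is needed. sendChunk() is a hypothetical
// stand-in for the $.ajax POST; it calls back after a short delay.
const completed = [];

function sendChunk(chunk, onDone) {
  setTimeout(() => {        // simulate the asynchronous request
    completed.push(chunk);
    onDone();
  }, Math.random() * 10);
}

function sendFrom(chunk, totalChunks, onAllDone) {
  if (chunk > totalChunks) return onAllDone();
  // In the real code, this recursion would live in the success: handler.
  sendChunk(chunk, () => sendFrom(chunk + 1, totalChunks, onAllDone));
}

sendFrom(1, 4, () => console.log(completed)); // → [ 1, 2, 3, 4 ]
```

Because the chunk count is just a parameter here, this adapts automatically when the number of chunks varies between calls.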
EDIT: The chunks are large enough that multiple AJAX calls are required, hence the need for serialization. The number of chunks may also vary between calls, so the solution needs to adapt to that.