I am working on a Tableau extension that captures user decisions and saves them as rows in datatables using JavaScript. The modified rows are then sent to AWS Lambda as JSON. In Lambda, we generate an UPDATE statement for each row and execute the statements one by one against a Redshift database in a for loop.
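A simplified sketch of what the Lambda handler currently does (table and column names are placeholders, and the statement building is shown without the actual Redshift connection):

```python
import json

def handler(event, context):
    # Each element in "rows" carries the row's primary key plus the changed columns.
    rows = json.loads(event["body"])["rows"]

    statements = []
    for row in rows:
        # Build one UPDATE per modified row; "decisions" and the column
        # names are placeholders for the real schema.
        assignments = ", ".join(
            f"{col} = '{val}'" for col, val in row["changes"].items()
        )
        statements.append(
            f"UPDATE decisions SET {assignments} WHERE id = {row['id']};"
        )

    # In the real function, each statement is executed against Redshift
    # here, one at a time — this per-statement round trip is the slow part.
    return statements
```

Each invocation therefore issues as many round trips to Redshift as there are modified rows.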
However, this process is proving to be time-consuming. Can anyone recommend a more efficient approach?
- Is there a better way to send the data to AWS Lambda, for example compressing (gzipping) the payload before transmission?
- Is there a way to bulk-update the database instead of executing one UPDATE per row in a for loop?
Note: Each modified row may contain different values.
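For context, the payload sent to Lambda looks roughly like this (field names are illustrative); note that different rows can carry different changed columns and values:

```json
{
  "rows": [
    { "id": 17, "changes": { "status": "approved" } },
    { "id": 42, "changes": { "owner": "alice", "priority": "high" } }
  ]
}
```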