Our team needs a service that regenerates up-to-date JSON every 15 minutes for our AJAX endpoints to consume. The JSON will feed metric analysis, charts, and similar features.
The 15-minute interval exists because generating the JSON is slow: the underlying queries take 45-60 seconds to complete, and that duration is expected to grow as the application does.
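To make the setup concrete, here is a minimal sketch of the kind of refresh loop I have in mind; `build_payloads()` and `store_payloads()` are placeholders for our real query and storage code, not anything that exists yet:

```python
# Sketch of the 15-minute refresh loop. build_payloads() and store_payloads()
# are placeholders for the real query and storage code.
import logging
import time

REFRESH_INTERVAL = 15 * 60  # seconds

def build_payloads() -> dict:
    # Placeholder for the 45-60 seconds of SQL work that produces the 30-40 objects.
    return {"example_metric_daily": {"values": []}}

def store_payloads(payloads: dict) -> None:
    # Placeholder: where to put the serialized JSON is exactly the open question.
    pass

def main() -> None:
    while True:
        started = time.monotonic()
        try:
            store_payloads(build_payloads())
        except Exception:
            logging.exception("Refresh failed; the previous snapshot stays in place")
        # Sleep out whatever remains of the 15-minute window.
        elapsed = time.monotonic() - started
        time.sleep(max(0.0, REFRESH_INTERVAL - elapsed))

if __name__ == "__main__":
    main()
```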
EDIT: All of our data comes from a SQL database, an already large relational setup. Delivering this data to web users quickly is essential, so querying the database in real time is not feasible; the response times are too slow.
EDIT 2: The JSON needs to be completely refreshed every 15 minutes and be immediately available. It is acceptable for the background service to take a while to run; what matters is that the JSON is regenerated on schedule.
The challenge is how best to store this growing set of JSON objects - around 30-40 need to be created and serialized for transmission. We also have predefined time periods for JSON generation: daily, weekly, monthly, quarterly, semiannually, annually, and biennially.
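To give a sense of the scale, something like the sketch below is what I picture: one serialized object per metric per period. The metric names here are placeholders purely for illustration; only the period list is real.

```python
# Sketch of keying the 30-40 payloads by (metric, period).
from itertools import product

PERIODS = ["daily", "weekly", "monthly", "quarterly",
           "semiannual", "annual", "biennial"]

# Placeholder metric names for illustration only.
METRICS = ["signups", "revenue", "active_users", "error_rate", "page_views"]

def payload_key(metric: str, period: str) -> str:
    """Stable key/filename for one serialized JSON object, e.g. 'revenue_quarterly'."""
    return f"{metric}_{period}"

if __name__ == "__main__":
    keys = [payload_key(m, p) for m, p in product(METRICS, PERIODS)]
    print(len(keys), "payloads, e.g.:", keys[:3])
```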
I've considered a flat database table, although I'm hesitant to have a table with numerous columns and only one row. Another option is generating a "data.json" file on the server and pulling data from it, but are there drawbacks or limitations to that approach? Can it be cached without defeating its purpose?
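For the "data.json" option, my main worry is a reader hitting a half-written file while the refresh is running. A minimal sketch of the write I assume we would need, using a temp file plus an atomic rename so readers only ever see the old or the new snapshot (the paths here are placeholders):

```python
# Write the new snapshot to a temp file in the same directory, then atomically
# replace the old file. os.replace is atomic on POSIX, so readers never see a
# partially written data.json.
import json
import os
import tempfile

def write_json_atomically(payload: dict, path: str = "/var/www/data/data.json") -> None:
    directory = os.path.dirname(path) or "."
    # Temp file must live on the same filesystem for the rename to be atomic.
    with tempfile.NamedTemporaryFile("w", dir=directory, delete=False) as tmp:
        json.dump(payload, tmp)
        tmp.flush()
        os.fsync(tmp.fileno())
        tmp_name = tmp.name
    os.replace(tmp_name, path)

if __name__ == "__main__":
    # Local path just for trying it out.
    write_json_atomically({"generated_at": None, "metrics": {}}, path="data.json")
```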
We're open to alternative storage methods - what would be the best way to handle this scenario?