Let me present the issue at hand:
I am currently managing a web application that serves as a frequently updating notification system. It runs on several local computers, each of which is a display-only machine with no input devices attached. The web app refreshes every few seconds to show new data.
The problem arises when the connection to the server is lost, leaving a 'page not found' error on the screen. In that situation, our only options are rebooting every computer running the application, plugging a physical keyboard into each machine to refresh the browser, or remoting into each machine to refresh manually. All of these are inefficient and frustrating.
Unfortunately, I am unable to make alterations to the existing application or the server environment.
What I need, then, is a way to test the call to the application and, if it errors or times out, keep retrying until the connection is re-established. Ideally, I envision a client-side wrapper page, written in JavaScript and served locally from each machine (so no additional server is needed), that sends requests directly to the application (which renders basic HTML). If the request retrieves the expected content, the page displays it; otherwise, it keeps retrying until the correct page data is obtained.
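To make the idea concrete, here is a minimal sketch of the retry loop I have in mind. The `APP_URL`, timeout, and delay values are placeholders, and the fetch/display part is only indicated in comments since it depends on the real page:

```javascript
// Retry an async operation until it succeeds, waiting delayMs between attempts.
async function retryUntilSuccess(operation, delayMs) {
  for (;;) {
    try {
      return await operation();
    } catch (err) {
      // Connection refused, timeout, non-200 status, etc. — wait and retry.
      await new Promise((resolve) => setTimeout(resolve, delayMs));
    }
  }
}

// On the kiosk page this would wrap a fetch of the real app, e.g.:
//
// const html = await retryUntilSuccess(async () => {
//   const res = await fetch(APP_URL, { signal: AbortSignal.timeout(5000) });
//   if (!res.ok) throw new Error(`HTTP ${res.status}`);
//   return res.text();
// }, 3000);
// document.getElementById("content").innerHTML = html;
```

The helper itself is plain JavaScript, so the same loop could also wrap a simple reload of an iframe instead of a fetch, if that turns out to be easier.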
Is my proposed approach feasible? What would be the most effective way to execute this plan?