We are developing a robust web application that needs to handle a significant amount of traffic. I have been running load tests on an HP ProLiant DL380 server with two 3.6GHz Xeon CPUs and 16GB of RAM, using Apache JMeter and Pylot; both tools give similar results.
In one test scenario, I set the load-testing tool to hit my index page continuously with a single thread. The index page is approximately 60KB and includes 10 Ajax calls, a lot of JavaScript and jQuery code, the necessary CSS, and more. Unfortunately, the results were not promising.
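As a sanity check outside JMeter and Pylot, the same single-thread measurement can be reproduced with a small standalone client. This is only a sketch; the URL and request count are assumptions, and the page body is simply drained, not rendered:

```java
// Minimal single-threaded load loop (a rough cross-check, not a replacement for JMeter/Pylot).
import java.io.InputStream;
import java.net.HttpURLConnection;
import java.net.URL;

public class SingleThreadLoadTest {
    public static void main(String[] args) throws Exception {
        String target = "http://localhost:8080/index.jsp"; // hypothetical URL, adjust to the real deployment
        int requests = 100;                                  // assumed sample size

        long start = System.nanoTime();
        for (int i = 0; i < requests; i++) {
            HttpURLConnection conn = (HttpURLConnection) new URL(target).openConnection();
            try (InputStream in = conn.getInputStream()) {
                byte[] buffer = new byte[8192];
                while (in.read(buffer) != -1) {
                    // drain the body so the full ~60KB page is actually downloaded
                }
            }
            conn.disconnect();
        }
        double seconds = (System.nanoTime() - start) / 1_000_000_000.0;

        System.out.printf("Throughput: %.3f req/sec%n", requests / seconds);
        System.out.printf("Avg response time: %.3f secs%n", seconds / requests);
    }
}
```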
Here are the results for the full index.jsp page:
- Throughput (req/sec): 3.567
- Response Time (secs): 0.278
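Worth noting: with a single thread there is never more than one request in flight, so throughput is capped at roughly 1 / response time, which is consistent with the figures above:

```java
public class ThroughputCheck {
    public static void main(String[] args) {
        // With one thread, only one request is in flight at a time,
        // so throughput cannot exceed 1 / response time.
        double responseTime = 0.278; // secs, from the full index.jsp run
        System.out.printf("Single-thread ceiling: %.3f req/sec%n", 1.0 / responseTime); // ~3.60 vs measured 3.567
    }
}
```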
I then removed all Ajax calls, charts, and CSS, but kept the JavaScript code:
- Throughput (req/sec): 6.082
- Response Time (secs): 0.161
Despite these changes, performance was still low. Next, I created a static HTML index page of the same content size, with no server-side or client-side computation:
- Throughput (req/sec): 20.787
- Response Time (secs): 0.046
That was a significant improvement. I then added some of the JavaScript back to the index.html page:
- Throughput (req/sec): 9.617
- Response Time (secs): 0.103
The bottleneck seems to lie in the JavaScript code. Since JavaScript runs client-side, I need to determine how many requests per second the server can handle with JavaScript excluded from the test. Should load-testing tools be processing JavaScript code at all? (It appears they may be.)
Another crucial question is whether the current throughput is acceptable given the hardware, content size, and configuration described above. My target is 500 req/sec. Is adding more hardware the only solution?
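For perspective, a rough calculation (Little's Law: concurrency = throughput * response time) shows what that target implies, using the full-page response time measured above. This is only a back-of-the-envelope sketch, not a capacity prediction:

```java
public class TargetConcurrency {
    public static void main(String[] args) {
        // Little's Law: concurrent requests in flight = throughput * response time.
        double targetThroughput = 500.0; // req/sec, the stated goal
        double responseTime = 0.278;     // secs per request, measured for the full index.jsp page
        double requiredConcurrency = targetThroughput * responseTime;
        // Roughly 139 requests would have to be in flight at once, and each would still
        // need to finish in 0.278s under that load, which a single-thread test cannot show.
        // A multi-threaded test plan is needed to measure that.
        System.out.printf("Concurrent requests needed: %.0f%n", requiredConcurrency);
    }
}
```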
For reference, the web application is built with Java, Struts2, JSP, Hibernate, and MySQL. It is distributed across multiple servers behind HAProxy, but these tests were run against a single server.
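To help separate server-side processing time from download time and client-side JavaScript cost, a simple servlet filter can log how long the server itself spends on each request. This is only a sketch assuming a standard servlet container; the class name is made up, and the filter would still need to be mapped in web.xml:

```java
// Logs server-side processing time per request, so server cost can be compared
// against the end-to-end numbers reported by the load-testing tools.
import java.io.IOException;
import javax.servlet.Filter;
import javax.servlet.FilterChain;
import javax.servlet.FilterConfig;
import javax.servlet.ServletException;
import javax.servlet.ServletRequest;
import javax.servlet.ServletResponse;
import javax.servlet.http.HttpServletRequest;

public class RequestTimingFilter implements Filter {
    @Override
    public void init(FilterConfig filterConfig) { }

    @Override
    public void doFilter(ServletRequest req, ServletResponse res, FilterChain chain)
            throws IOException, ServletException {
        long start = System.nanoTime();
        try {
            chain.doFilter(req, res); // run the Struts2 / JSP / Hibernate work
        } finally {
            long millis = (System.nanoTime() - start) / 1_000_000;
            String uri = ((HttpServletRequest) req).getRequestURI();
            System.out.println(uri + " took " + millis + " ms server-side");
        }
    }

    @Override
    public void destroy() { }
}
```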