Serializing Web Application Requests

Mr. Wilson tells us how he improved web response time and kept users happy using the Generic Network Queueing System (GNQS).
Future Directions

Queuing requests with GNQS opens another interesting option that we may pursue in the future as our processing demands increase. Instead of migrating again to an even more powerful machine, or taking on the complexity of an array of web servers, we could retain the existing web server as a front end. Without any changes to the CGI scripts on the web server, GNQS could be reconfigured to distribute queued jobs across as many additional machines as necessary to meet our response-time requirements. Since GNQS can also do load balancing, expansion can be done easily, efficiently, and dynamically, with no server downtime. The number of queue servers would be completely transparent to the web server.
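The transparency claim rests on the fact that the CGI side only ever submits to a named queue; whether that queue routes to one host or twenty is decided entirely inside GNQS. A minimal sketch of the submission command a CGI wrapper might build (the queue name "webq" is a hypothetical example, not the article's actual configuration):

```python
def qsub_command(job_script, queue="webq", outfile=None):
    """Build the GNQS submission command a CGI wrapper would run.

    The web-server side never changes when back-end machines are
    added: only the GNQS queue configuration does.  -q selects the
    queue; -o redirects the job's standard output to a file.
    """
    cmd = ["qsub", "-q", queue]
    if outfile:
        cmd += ["-o", outfile]
    cmd.append(job_script)
    return cmd
```

For example, `qsub_command("analyze.sh", outfile="/tmp/r.out")` yields the argument list `["qsub", "-q", "webq", "-o", "/tmp/r.out", "analyze.sh"]`, which the wrapper would hand to the shell unchanged no matter how many queue servers sit behind "webq".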

Evaluation

There are a number of ways to handle web applications that require significant back-end processing time. Optimizing application servers requires different techniques than optimizing servers for high hit rates: for application servers, the limiting resource may be CPU, memory, or disk I/O rather than network bandwidth. Response times for such requests are expected to be relatively long, so informing waiting users of the status of their jobs is important. Queuing requests with GNQS and referring the user to a results page has proven to be an effective, easily implemented, and robust technique.
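The queue-and-refer pattern can be sketched in a few lines: the CGI handler assigns a ticket, queues the job, and immediately hands the user a results URL; the results page then reports status by checking for the job's output file. This is an illustrative sketch under assumed names (the spool directory, URL scheme, and queue name are inventions for the example, not the article's actual implementation; the qsub call is shown as a comment rather than executed):

```python
import os
import uuid

SPOOL = "/var/spool/webjobs"   # hypothetical directory where queued jobs write results

def submit_request(job_script, queue="webq"):
    """Queue the job and return a ticket the user can poll with.

    In a live setup the commented line below would hand the job to GNQS,
    e.g. via subprocess.run(["qsub", "-q", queue, "-o", outfile, job_script]).
    """
    ticket = uuid.uuid4().hex[:12]
    outfile = os.path.join(SPOOL, ticket + ".out")
    # subprocess.run(["qsub", "-q", queue, "-o", outfile, job_script], check=True)
    return ticket

def results_url(ticket):
    """URL the user is redirected to immediately after submission."""
    return "/cgi-bin/results?ticket=" + ticket

def job_status(ticket):
    """The results page shows 'pending' until the queued job writes its output."""
    outfile = os.path.join(SPOOL, ticket + ".out")
    return "done" if os.path.exists(outfile) else "pending"
```

The key property is that the slow work happens entirely outside the web server's request cycle: the CGI process returns in milliseconds, and the user can reload the results page at will to watch the job's status change.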

Acknowledgements

Colin C. Wilson has been programming and administering UNIX systems since 1985. He has been happily playing with Linux for the past four years while employed at the University of Washington, developing DNA analysis software and keeping the systems up at the Human Genome Center. When he's not busy recovering from the latest disaster, he can be reached at colin@u.washington.edu.
