Linux and Web Servers
One of the interesting things about the moment in which the Linux community finds itself is that while more and more people are learning about Linux, Linux itself is changing and expanding, leaving some newbies feeling they will never "master Linux". Fortunately for them, "mastering Linux" is a goal probably not worth pursuing. It is like mastering cooking: in practice, a person only becomes more and more proficient. Even upon reaching a level others might regard as mastery, the true practitioner knows there is always more to learn, more to create and more to discover.
This week on Wall Street News Hour, I will be talking to the Linux-curious about Linux and web servers. While many on the "end-user" side of things tend to think of web servers as something only geeks need concern themselves with, the quality of the web server at a given ISP or within a given network has a major impact on the quality of the Internet experience the end user ultimately enjoys. Just as a poorly performing web server can turn an end user off the Internet completely (and I believe it can), a top-quality web server, run by a talented sys admin, can convince an end user of both the seamlessness and the exceptional productivity of computing and the Internet.
One of the easiest ways to understand what a web server is is simply to break up the term. A "server" is a software application that provides information to a "client". In the case of a web server, the server application delivers information to a client such as a web browser (Netscape, Internet Explorer, Opera, etc.). One of the most popular web servers (also known as an "HTTP server", for reasons that may be obvious at this point) is Apache, which powers much of the Internet most of us use every day. Interestingly, virtually every platform offers its own server software: Windows NT, for example, is a popular OS choice for servers among those preferring Microsoft products, and Solaris, an operating system developed by Sun Microsystems, is another very popular choice for machines serving content on the Internet.
So "web serving"--providing HTML/web pages to browsers--is just one function of server software; file and print "serving" are others. The computer that handles the printer is a print server. A hub computer that handles files and applications over a network is a file server. And so on.
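The client/server exchange described above can be sketched in a few lines of code. The toy server below, written with Python's standard-library `http.server` module, plays the role Apache plays on the real Internet: it waits for an HTTP request from a client (a browser) and answers with headers and an HTML body. The page content and the `HelloHandler`/`serve_once` names are hypothetical, purely for illustration; a production server adds logging, security, concurrency tuning and much more.

```python
# A toy HTTP server: the "server" half of the client/server pair.
# A browser (the client) sends an HTTP request; this answers with
# a status line, headers and an HTML body.
import threading
import urllib.request
from http.server import BaseHTTPRequestHandler, ThreadingHTTPServer

class HelloHandler(BaseHTTPRequestHandler):
    def do_GET(self):
        # Hypothetical page content, just to show the exchange.
        body = b"<html><body><h1>Hello from a tiny web server</h1></body></html>"
        self.send_response(200)                      # HTTP status line
        self.send_header("Content-Type", "text/html")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)                       # the HTML payload

    def log_message(self, *args):
        pass  # silence per-request logging for this demo

def serve_once():
    # Port 0 lets the OS pick any free port; serve in a background thread.
    server = ThreadingHTTPServer(("127.0.0.1", 0), HelloHandler)
    threading.Thread(target=server.serve_forever, daemon=True).start()
    return server

if __name__ == "__main__":
    srv = serve_once()
    # Act as the client: fetch the page the way a browser would.
    with urllib.request.urlopen(f"http://127.0.0.1:{srv.server_port}/") as resp:
        print(resp.status, resp.read().decode())
    srv.shutdown()
```

The point is not the code itself but the shape of the conversation: one request from the client, one structured response from the server, over HTTP.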
Generally, most of the qualities that make for good software on the client side (reliability and availability, robustness, and clean, relatively bug-free performance) also make for good software on the server side. In fact, client and server versions of the same operating system are often very much alike (this is especially true of Linux systems), except for certain features that help server software handle its many special responsibilities. Perhaps foremost among these responsibilities is security. Because servers in general and web servers in particular access a wide array of data for end users, it is paramount that users not only be able to access their servers when they choose to (meaning servers should not be easily crackable, nor so bug-ridden as to be prone to crashing or other system failure), but also feel confident that the information they are accessing is not disclosed to others using the same software. If you've ever been talking on the telephone and suddenly heard another conversation coming through your line at the same time, you can imagine how important communication security can be.
Another important responsibility of web servers also has to do with their relationship to end users: the ability to centrally manage a variety of remote clients. Although today such a point may seem beyond obvious, a server's ability to handle a sizable number of clients with varying types (and quality) of connection is one of the important criteria in determining its suitability for a given task.
For many end users, hearing that Linux was only okay on the desktop but a real killer as a server did not likely inspire a great deal of enthusiasm toward the open-source operating system. This is a bit unfortunate, but completely understandable. The average end user doesn't care (or, more accurately, doesn't think he or she cares) about what kind of web server his or her ISP may be using. An end user has a client-side machine--desktop, laptop or some other connected device--and all the end user tends to be interested in is being able to get on-line when his or her browser calls for HTML pages. This, for an increasing number of people, is where Linux comes in.
One of the things that helped put Linux on the server map was its reliability. Many of the then-current web servers were powerful systems clearly capable of delivering content, but their inherent "bugginess" was a source of constant consternation for many sys admins. This "bugginess"--manifested in repeated crashing and other system failure--became so common that many began to assume crashing and system failure were simply part of the web server experience. A similar bleak (but false) resignation was setting in at the same time among many desktop end users. The emergence of Linux, an operating system that was almost notoriously crash-free and rarely needed rebooting, was, for these sys admins, a godsend (or a Torvaldsend, depending on your perspective). Like Apache, the popular server used on UNIX and Windows NT systems, Linux is open source, which means programmers, sys admins and others with the expertise can "open up the hood" on the operating system and fix or add to specific parts of its fundamental source code.
Linux's security is another area in which the open-source operating system tends to thrive. While widespread adoption of Linux will increase the overall number of attacks on Linux systems, access to its source code means that patches, fixes and other repairs can be deployed much more rapidly. It also means that many more security holes and back doors will be detected before crackers can exploit them.