It would seem that unless you are removed from the current world (perhaps you are busy studying the galaxy or wondering whether that really is water on Mars), you have heard something about going green. So, as a commuter who takes mass transit because it is easier and cheaper, imagine my surprise when one of our subway stations was bedecked in vinyl advertising touting that if you moved to this company's platform, you could go green and reduce your energy consumption by more than 50%. It should be noted that this company, earlier in the year, claimed you could get back close to 70% of your network bandwidth by switching to its VoIP platform, so I will take its numbers with a grain of salt (and a shot of tequila). Still, the issue of going green in the data center caught my eye, not because it was a new trend, but because it was a trend. Going green seems to be the current buzzword, both in and out of the IT industry. However, as with virtualization, security or Y2K, you take one part myth, one part science and one part art, shake until confused, and pour over the ice of shrinking IT budgets. What remains is the confusion of management as they glaze over with each sip of the vendor's concoction and assign you the task of implementing the current trend.
OK, so maybe I am being dramatic, but when you think about it, in years without a major release from Microsoft, IT always has focused on something, usually pushed by the hardware vendors trying to move product, and the something this year seems to be going green.
The myth part of this follows along with Moore's law. You remember Moore, he of the "...number of transistors that can be inexpensively placed on an integrated circuit is increasing exponentially, doubling approximately every two years." Late last year, as I was preparing to move my data center, I had to count up the power consumption of my systems so that I could make sure there was enough juice to make them go. You would be amazed how fussy these systems can be about having enough power. In the process of computing watts consumed and BTUs generated, a rather startling fact made itself known (OK, perhaps not so startling if you are paying attention). The 1U pizza boxes with the quad cores that seemed to radiate enough heat to warm your lunch (which they did quite nicely) generated, ounce for ounce, less heat and used less power than the 6U bar fridges that had half the computing power and took up six times as much space. Of course, this does make sense. Every year, the systems improve in capacity and processing power, so why not in power consumption and heat generated? This is where the myth part comes into play: if you just keep current with your equipment, you are going green and do not even have to work hard to achieve it.
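For anyone tallying up a data-center move the same way, the arithmetic is simple: sum the nameplate (or measured) wattage per rack, and convert watts to cooling load using the standard factor of roughly 3.412 BTU/hr per watt. Here is a minimal sketch; the server names and wattage figures are made up for illustration, not taken from my inventory:

```python
# One watt of power drawn is about 3.412 BTU/hr of heat to remove.
WATTS_TO_BTU_PER_HOUR = 3.412

# Hypothetical inventory: (description, unit count, watts per unit).
servers = [
    ("1U pizza box, quad core", 10, 350),
    ("6U bar fridge", 4, 600),
]

def power_and_heat(inventory):
    """Return (total watts drawn, total BTU/hr of cooling load)."""
    total_watts = sum(count * watts for _desc, count, watts in inventory)
    return total_watts, total_watts * WATTS_TO_BTU_PER_HOUR

watts, btu = power_and_heat(servers)
print(f"Total draw: {watts} W, cooling load: {btu:.0f} BTU/hr")
```

Nameplate ratings overstate real draw, so a measured figure from a clamp meter or metered PDU gives a tighter answer, but the nameplate sum is the safe upper bound the electricians will want.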
But that only gets you so far. Then the science kicks in. One of the more scientific improvements is not so much in the IT systems as in better building maintenance and management. While most of us think of a data center as a huge empty room kept at a temperature just above freezing, where you could store meat and where most who work there need parkas and gloves to function, the modern data center is no longer a giant freezer. Cooling in the new data center has gone from whole-room to rack-based, where air is forced around and through the racks and up and down through the plenum rather than cooling all the empty spaces in the room. This is the next step in going green. There are other aspects to this: efficient power management in lighting and other electronic systems; improved power cabling, making sure that power goes where it is needed and not where it is not; and environmental changes in building design, materials and structures. These all help keep costs down, and as more building material comes from recycled sources, greenness increases as well.
The art, of course, comes in the melding of all the various components that go into a data center. Budget costs will always drive the components that can be procured and there are always trade-offs. There are never enough dollars for everything we want, and never enough time to install all the little things that will help maximize our dollars spent, despite the current demands of management.
And after all, at the end of the week, after months of planning, a new trend will be reported, maybe right here in these very pages, and the cycle starts all over again. Happy Greening.