Going Green

It would seem that unless you are actively avoiding the current world (perhaps you are busy studying the galaxy, or wondering whether that really is water on Mars), you have heard something about going green. So, as a commuter who takes mass transit because it is easier and cheaper, imagine my surprise when one of our subway stations was bedecked in vinyl advertising touting that if you moved to this company's platform, you could go green and reduce your energy consumption by more than 50%. It should be noted that earlier in the year this same company claimed you could get back close to 70% of your network bandwidth by switching to their VoIP platform, so I will take their numbers with a grain of salt (and a shot of tequila). Still, the issue of going green in the data center is something that caught my eye, not because it was a new trend, but because it was a trend at all. Going green is the current buzzword, both in and out of the IT industry. However, like virtualization, security or Y2K, you take one part myth, one part science, one part art, shake until confused and pour over the ice of shrinking IT budgets, and you are left with the confusion of management as they glaze over with each sip of the vendor's concoction and assign you the task of implementing the current trend.

OK, so maybe I am being dramatic, but when you think about it, in years without a major release from Microsoft, IT has always focused on something, usually pushed by the hardware vendors trying to move product, and the something this year seems to be going green.

The myth part of this follows along with Moore's law. You remember Moore, he of the "...number of transistors that can be inexpensively placed on an integrated circuit is increasing exponentially, doubling approximately every two years." Late last year, as I was preparing to move my data center, I had to count up the power consumption of my systems so that I could make sure there was enough juice to make them go. You would be amazed how fussy these systems can be about having enough power. In the process of computing watts consumed and BTUs generated, a rather startling fact made itself known (OK, perhaps not so startling if you are paying attention). The 1U pizza boxes, with quad cores that seemed to radiate enough heat to warm your lunch (which they did quite nicely), nonetheless, ounce for ounce, generated less heat and used less power than the 6U bar fridges that had half the computing power and took up six times as much space. Of course, this does make sense. Every year, the systems improve in capacity and processing power, so why not in power consumption and BTUs generated? This is where the myth part comes into play. If you just keep current with your equipment, you are going green and do not even have to work hard to achieve it.
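The arithmetic behind that data-center move is simple enough to sketch. Every watt a server draws ends up as heat, and the standard conversion is roughly 3.412 BTU/hr per watt. The wattage figures below are hypothetical placeholders, not measurements from my racks, but the shape of the calculation is the same:

```python
# Back-of-envelope sketch of the move arithmetic: total watts drawn,
# and the cooling load those watts impose. The conversion factor
# (1 W ~= 3.412 BTU/hr) is standard; the server wattages are made up.

WATTS_TO_BTU_PER_HR = 3.412

def heat_load_btu_per_hr(watts):
    """BTUs per hour of heat generated by a load drawing `watts` watts."""
    return watts * WATTS_TO_BTU_PER_HR

# Hypothetical inventory: modern 1U quad-core boxes vs. older 6U units.
servers = {
    "1U quad-core":  {"count": 10, "watts_each": 450},
    "6U bar fridge": {"count": 10, "watts_each": 700},
}

for name, s in servers.items():
    total_w = s["count"] * s["watts_each"]
    print(f"{name}: {total_w} W drawn, "
          f"{heat_load_btu_per_hr(total_w):.0f} BTU/hr to cool")
```

Run against your own inventory, the same two numbers tell you both how much juice the new room needs and how much cooling it must supply.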

But that only gets you so far. Then the science kicks in. One of the more scientific improvements is not so much in the IT systems as in better building maintenance and management. While most of us think of a data center as a huge empty room kept at a temperature just above freezing, where you could store meat and most who work there need parkas and gloves to function, the modern data center is no longer a giant freezer. Cooling in the new data center has gone from whole-room to rack-based, where air is forced around and through the racks and up and down through the plenum rather than cooling all the empty space in the room. This is the next step in going green. There are other aspects as well: efficient power management in lighting and other electrical systems; improved power cabling, making sure power goes only where it is needed; and environmental changes in building design, materials and structures. These all help keep costs down, and as more building material comes from recycled sources, costs drop further and increased greenness is achieved.

The art, of course, comes in the melding of all the various components that go into a data center. Budgets will always drive which components can be procured, and there are always trade-offs. There are never enough dollars for everything we want, and never enough time to install all the little things that would help maximize the dollars spent, despite the current demands of management.

And after all, at the end of the week, after months of planning, a new trend will be reported, maybe right here in these very pages, and the cycle starts all over again. Happy Greening.


David Lane, KG4GIY is a member of Linux Journal's Editorial Advisory Panel and the Control Op for Linux Journal's Virtual Ham Shack



Uncharted regions of green power in the data centre

Nefarious Wheel

It's true that the data centre is evolving to a more energy-efficient standard. But there are still quite a lot of watts being pushed through the plenum in the form of waste heat. Air conditioning (the thermal bit, rather) is a heat pump: you generate heat, then you use aircon to pump that heat outside. One useful "green" improvement might be to chuck in a regenerator and capture some of that waste heat. I'd guess a nice, efficient Stirling-cycle cogeneration engine (they generate work from fairly minor temperature differentials) might be the way to go. Pump the resulting electricity back into the grid. They'll work when it's either too hot or too cold outside (for that middle bit, open a window ;)
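There is a hard physical ceiling on that idea worth keeping in mind: no heat engine, Stirling or otherwise, can beat the Carnot limit set by the temperature differential it works across. The temperatures below are illustrative assumptions (hot-aisle exhaust against outside air), not figures from the comment, and real engines fall well short of even this ideal:

```python
# Carnot (ideal) efficiency for a heat engine running between the
# hot-aisle exhaust and the outside air. Temperatures are illustrative
# assumptions; real Stirling engines recover less than this ceiling.

def carnot_efficiency(t_hot_c, t_cold_c):
    """Ideal efficiency between two reservoirs given in degrees Celsius."""
    t_hot_k = t_hot_c + 273.15   # convert to kelvin
    t_cold_k = t_cold_c + 273.15
    return 1.0 - t_cold_k / t_hot_k

# ~40 C exhaust air against ~20 C outside air:
eta = carnot_efficiency(40, 20)
print(f"Carnot ceiling: {eta:.1%}")  # only a few percent recoverable
```

With differentials that small, the recoverable fraction is modest, which is why waste-heat recovery in data centres tends to favor direct reuse of the heat (warming offices or water) over converting it back to electricity.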

I think you mentioned the

FredR

I think you mentioned the exact word: efficiency. What engineer type is not obsessed with efficiency? Even Moore's law basically said "things are going to get twice as efficient every two years". I believe the "green" moniker stuck because the average, everyday Joe Q. Citizen, who is not an engineer, may be a little intimidated by the word. What else does "green" mean besides: let's try to do more work with a machine while consuming fewer resources? And isn't that the ultimate goal of business in general?

-- FLR or flrichar is a superfan of Linux Journal, and goofs around in the LJ IRC Channel

Going Green vs GOING GREEN

David Lane

There are, of course, two different issues at play. The first is using the term "going green" as a marketing ploy to sell more product; the second is going green as a way to really cut energy consumption, not necessarily related to cutting costs. For businesses in general, yes, as they say in "Head Office", the company is out to maximize its profits and therefore the return to the investor, so anything that cuts costs is a good thing, regardless of the current moniker the marketeers decide to hang on it. It is not about "efficiency" per se but about effective use of limited resources. (Sure, I could be splitting hairs here.)

The other form, though, the "tree hugger" version (or whatever pejorative you want to hang on it) of going green, is about reducing energy consumption. Period. That form of going green is all about the complete reversal of the entire energy-based service infrastructure we have today. Of course, like vegetarians who wear leather, or those who are pro-life but support the death penalty, most are talking out of both sides of their mouth. But that is hardly a technical discussion.

David Lane, KG4GIY is a member of Linux Journal's Editorial Advisory Panel and the Control Op for Linux Journal's Virtual Ham Shack
