Linux for Suits - Bridging the Gap
If you're one of the growing number of Linux advocates working in corporate IT, chances are you're involved in a struggle. Your side, the Linux/open-source side, is a grassroots movement, usually driven from the bottom up, not from the top down. Meanwhile, IT management remains a top-down culture that prefers “solutions” in the form of mandated strategy, not ad hoc problem solving.
To understand what we're up against, it helps to visit the top-down side where it lives. That's what I did on June 27–30, 2004, in Boston, at the 32nd Annual Conference and Annual General Meeting of the Membership of the Information Systems Audit and Control Association, the ISACA.
For all of its 32 years, ISACA has been obsessed with governance, control, accountability, strategy, planning, risk reduction, business continuity and so on. Here's a sample of the educational sessions:
“'Who Has Access to What': Controlling User Access and Segregation of Duties with Identity Management”
“Enterprise Risk Management and Basel II: The Mission Critical Role of IT”
“Incident Response Forensics Investigation”
“Use of Encryption to Enforce Regulatory Compliance”
“Successful Software Asset Management”
Not surprisingly, open source wasn't on the agenda—until this year, when I was invited to come and teach a class on “The Realities of Open Source”.
Before and after I gave my talk, I sat in on a variety of other sessions. In every case I carried the only laptop in the room, other than the ones used by speakers to give Microsoft PowerPoint presentations. When I pulled it out to take notes, or to work on my own talk, others looked at me like I was carrying a loaded weapon. Was I about to commit an act of wireless hacking, perhaps?
That would have been cool, of course, but not with this crowd. Open source, to them, smacks of the unknown. The very notion of “open” sounds like a security breach waiting to happen—a risk best “managed” by avoiding it in the first place.
So giving the talk was a fun challenge. I had addressed newbies before, but never newbies who also happened to work in IT Governance. I felt like I was explaining free enterprise to the Politburo in the old USSR. The cultural distance verged on the absolute.
I opened with the two questions I already knew were on everybody's mind:
How do you design an audit to account for open source?
For that matter, how do you govern, control and assure it?
I answered by saying “We're not in Kansas anymore” and proceeded to explain how many of the civilized graces we take for granted, from e-mail and Web servers to Amazon and eBay, owed credit to open-source developers, values and effects—all of which were as wild and uncontrollable as the next storm or earthquake.
That said, I went on to explain that open source is also about resourceful individuals and groups doing what needs to be done, that it is an example of what happens when the demand side supplies itself, and that it can be understood as the DIY sector of the Information Systems construction and maintenance business.
I also said none of this was especially visible when you looked down at it from the parapets of Fort Business, demanding that it comply with approved procedures before Management dropped the bridge across the moat.
I showed the sizes of various open-source conversations on the Web, which equaled or even exceeded those concerning Microsoft and other familiar names. I showed how much Linux and open-source code probably was already running in attendees' companies, whether they knew it or not.
And, I talked about common interests—for example, the use value (rather than the sale value) of software, and also low cost, availability, quality, integrity, efficiency, effectiveness, continuity, robust infrastructure, leveraged physical assets, problem solving and other virtues copied from the slides of other speakers.
In the Q&A we talked about SCO FUD and other predictable issues. One attendee demanded an explanation for why “they” (meaning open-source advocates) had “such terrible manners”. Later, on a boat tour of Boston Harbor, she came up to me, waved a cigarette in my face and insisted that open-source people “have a good case, but they simply must change their behavior”. But she couldn't give me an example of what she meant.
After my talk, several attendees told me they were, in fact, open-source users and advocates inside their organizations (one was Monsanto), but that the prospects for open source in large, conservative IT organizations were poor in the short term, even if they were good in the long term.
The main issue was auditability. One attendee told me that open-source stuff is, indeed, auditable (for example, the source code itself can be audited), but that this fact was still far from obvious.
It was clear, however, that for most attendees, open source simply wasn't on their radar. “We don't have any, and we don't plan to have any”, was the basic response. Another sign: the room was only half-full. The session after mine was on “Wireless Hacking Exposed”. It was standing-room only.
In my July 9, 2004, SuitWatch newsletter, I wrote about the cultural divide I witnessed at the conference and its near-absolute contrast with Supernova, the conference where I had spoken a couple of days before ISACA. The theme of Supernova was decentralization. If anything, ISACA's theme was the polar opposite.
Immediately an e-mail came from Glen Campbell, an IT consultant and veteran of both cultures. “The largest driving factor in most Fortune 2000 IT shops is risk mitigation”, he said. “Likewise, the whole concept of DIY-IT is a complete anachronism to most IT shops. They did DIY-IT in the '60s, '70s and '80s, and discovered that, not only is it less expensive to outsource their internal systems (to SAP, or Siebel, or EDS or IBM), it also—ta-da!—reduces their risk in that there's another large corporation out there they can sue if something goes wrong.”
Customers requiring “six nines” of reliability have been getting it for years from those same providers, or from HP, Sun or Novell. “That's why Sun can charge 10–50× for a server that runs Apache (which, by the way, is one of the very few open-source packages that large companies have come to trust)”, Campbell adds.
The regulatory environment has also elevated corporate IT paranoia.
Last year I attended a Harvard Business conference in New York where a procession of expensive consultants from Deloitte and other firms described the complex new reporting and accounting procedures required by the recently passed Sarbanes-Oxley Act, which enforces, post-Enron, a much higher degree of corporate—and personal—accountability. Whatever good the law intended, it also clearly promised to (borrowing Scott McNealy's immortal words) “darken the skies with consultants” and to expand corporate bureaucracies—IT Governance, for example. It was equally clear at ISACA that Sarbanes-Oxley was a godsend for members. Suddenly they found themselves standing exactly where the new law forced top corporate brass to turn for help.
So, the question becomes, where does open source fit in a Sarbanes-Oxley context? Here's Glen Campbell again:
I have a number of friends here in Silicon Valley who are on the “high-tech” side of the world. They simply cannot understand why, when you encounter an IT problem, a company doesn't grab Perl or Python and write a script that solves it. The IT guys, on the other hand, look at this with horror; how in the world could this hold up to a Sarbanes-Oxley investigation, which requires the CEO to sign off personally on the sources and validity of the corporate data? How in the world is the CEO going to understand a Perl script? The CEO will happily pay $100,000 to avoid going to jail because of this.
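For readers on the “high-tech” side, the kind of throwaway fix Campbell has in mind really is only a few lines. Here's a minimal, hypothetical sketch in Python (the function name and file layout are my own inventions, not anything from Campbell's shops): walk a directory tree, find the .log files and report the lines matching a pattern.

```python
#!/usr/bin/env python3
"""Hypothetical DIY-IT fix: grep all .log files under a directory tree."""
import os
import re
import sys

def scan_logs(root, pattern):
    """Return a list of (path, line_number, line) for lines matching pattern."""
    regex = re.compile(pattern)
    hits = []
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            if not name.endswith(".log"):
                continue  # only interested in log files
            path = os.path.join(dirpath, name)
            try:
                with open(path, errors="replace") as f:
                    for lineno, line in enumerate(f, 1):
                        if regex.search(line):
                            hits.append((path, lineno, line.rstrip("\n")))
            except OSError:
                pass  # unreadable file; skip it
    return hits

if __name__ == "__main__" and len(sys.argv) == 3:
    for path, lineno, line in scan_logs(sys.argv[1], sys.argv[2]):
        print(f"{path}:{lineno}: {line}")
```

It is exactly this ease, of course, that horrifies the governance side: nothing about such a script is documented, standardized or auditable in the way a Sarbanes-Oxley investigation expects.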
And his isn't the only bad news. Another SuitWatch response included this:
...my organization's overlords, some of whom were probably at ISACA and most assuredly none of whom were at Supernova, have issued a blanket edict forbidding the use of Wi-Fi anywhere inside the entire (organization) and forbidding the use of Wi-Fi with any (organization) equipment anywhere at any time. Needless to say, the (organization) CIO is firmly anti-open source and appears to be a vassal of Microsoft.
Yet Linux continues to grow in the enterprise. So does the whole LAMP suite. I'm writing this column in London, at the Identity Management Summit, where I just finished talking with Conn Crawford, who runs IT for the City of Sunderland, on the northeast coast of England, and with Nick Somper, Business Development Manager of Open Systems Management, Ltd. Both had Linux stories to tell. Crawford said his city just bought a Red Hat server to issue certificates for the city's PKI system. Somper's company does “adaptive systems management” for distributed and mixed environments that include UNIX, Linux and Windows systems. He sees Linux as a large and growing presence in IT.
Crawford and Somper both live in the market space where Linux has grown past the grassroots stage. It's part of the IT ecosystem in that space. Now the challenge is to make Linux more than safe—in perception as well as reality—for the large IT shops where risk aversion is a way of life. But before we do that, we need to understand the differences involved.
Adam Hertz, CTO of Ofoto, has a shop where DIY-IT and open source are ways of life. The company's whole business runs on systems developed almost entirely internally, on Linux and other open-source components. It also grows at an extremely fast rate, adding up to two terabytes of storage every day. His whole business wouldn't be possible without open source, Linux and DIY-IT. From that perspective, here's what he sees in the relatively staid environments where most ISACA members live:
Two themes: BigCoIT is all about standardization and isolationism.
Standardization, so the story goes, reduces risks and costs. It certainly reduces complexity, but it can take a huge toll on flexibility and responsiveness.
Standardization often involves using one multipurpose tool or platform to accomplish a lot of different purposes. This often involves customization, which is done by in-house experts or professional services firms—for example, Siebel, Lotus Notes and so on.
DIY shops tend to be cynical about, or even downright frightened of, systems like that, because they're so inflexible and unhackable.
Another form of standardization is what people are allowed to have on their desktop PCs. In a lot of big shops, everyone has the same disk image, with all applications pre-installed. There's a huge suspicion of anything that comes from the outside world, especially open source. It's regarded as flaky, virus-laden, unscalable and so on. This produces isolationism, which means major barriers to just trying something.
In more open environments, there's a permeable membrane between the corporate IT environment and the Net. People tend to get new tools from the Net, usually open source, and just give 'em a spin. Culturally, this keeps the organization open to innovation and new approaches. It builds bonds between the employees and the development community at large.
Standardized, isolationist shops miss out on all of this. They maintain control, but inevitably they fall behind.
If we want to sell Linux upward into risk-averse enterprises, it would help to follow the guidance, as well as the example, of J.P. Rangaswami, CIO of Dresdner Kleinwort Wasserstein, the billion-dollar-plus investment bank. When I asked J.P. to help make sense of the challenge here, he said:
Too many shops are currently executing uncertainty management rather than risk management. And because they don't understand the risk, they do the right thing and sit on their hands. Once you understand a risk you are able to price it and make a rational decision. So in my opinion, shops differ in their ability to price risk, and the default position is risk aversion, which may be far more expensive even in the short run than an active risk appetite. Witness the colossal sums spent on Y2000 just because people could not understand or price the risk.
DIY-IT is, when based on open source, defensible as a low-risk position. More people inspect the code. There are more people using the code. There is more flexing of the code in differing circumstances. The issue is not of risk but of comprehension.
Last December Risk Waters named J.P. its CIO of the Year.
Resources for this article: /article/7691.
Doc Searls is Senior Editor of Linux Journal