Insecurity Threat

Responses to claims that Linux and open source are threats to national security.

On April 9, EE Times reported on a speech given by Dan O'Dowd, CEO of Green Hills, an embedded operating system company. Here are a few of the pull quotes:

The open source process violates every principle of security. It welcomes everyone to contribute to Linux. Now that foreign intelligence agencies and terrorists know that Linux is going to control our most advanced defense systems, they can use fake identities to contribute subversive software that will soon be incorporated into our most advanced defense systems.

If Linux is compromised, our defenses could be disabled, spied upon or commandeered.

Every day new code is added to Linux in Russia, China and elsewhere throughout the world. Every day that code is incorporated into our command, control, communications and weapons systems. This must stop.

Linux in the defense environment is the classic Trojan horse scenario--a gift of 'free' software is being brought inside our critical defenses. If we proceed with plans to allow Linux to run these defense systems without demanding proof that it contains no subversive or dangerous code waiting to emerge after we bring it inside, then we invite the fate of Troy.

Before most Linux developers were born, Ken Thompson had already proven that 'many eyes' looking at the source code can't prevent subversion.

We asked Eric S. Raymond, open-source advocate and UNIX historian, to respond. He wrote:

Well, for starters this guy is abusing the Thompson example in a couple of ways. Technically, the C compiler was not open source at the time Thompson put in his back door--you needed an AT&T source license to see it legally. But the more fundamental point is this: nothing prevents someone at a closed-source shop from doing exactly the same thing.

O'Dowd is making the unstated assumption that somehow closed-source development prevents the placement of code unexpected by users in a way open-source development doesn't. The widespread prevalence of easter eggs in closed-source code, like the Mac dogcow or the flight simulator embedded in Excel '97, shows this is nonsense. Managers are essentially helpless to prevent this sort of thing.

If I were an enemy spy, I would much rather bribe a closed-source programmer to slip a deadly easter egg into DOD software than send a patch to an open-source project--my risk of detection would be far less that way.
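For readers who don't know the reference: in his 1984 Turing Award lecture, "Reflections on Trusting Trust", Ken Thompson described a compiler that silently plants a back door whenever it compiles the login program, and re-plants its own trojan logic whenever it compiles a clean copy of the compiler itself. Here is a minimal sketch of the idea in Python--our illustration, with hypothetical names, not Thompson's actual code:

    # A toy model of the "trusting trust" attack: the trojaned compiler
    # recognizes two targets. It back-doors the login program, and it
    # re-infects any clean copy of itself, so the malice never has to
    # appear in the source code anyone reviews.

    def trojaned_compile(source: str) -> str:
        """Pretend compilation: return the code that actually ships."""
        if "def check_password" in source:
            # Target 1: the login program gains a hidden master password.
            return source.replace(
                "return password == stored",
                "return password == stored or password == 'backdoor'",
            )
        if "def trojaned_compile" in source:
            # Target 2: recompiling the compiler would re-insert this
            # logic (the self-replication step is elided in this sketch).
            return source
        return source  # everything else compiles honestly

    clean_login = """
    def check_password(password, stored):
        return password == stored
    """

    shipped_login = trojaned_compile(clean_login)
    print(shipped_login)  # the shipped code accepts 'backdoor', though
                          # the login source everyone reviewed is clean

The point, as Raymond says, is that the subversion lives below the level of the source code--which is why it indicts closed and open development equally.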

In another rebuttal on a forum at LinuxDevices.com, "Concerned Citizen" replied, "Let's not forget that the terrorists that Mr. O'Dowd refers to used proprietary software for attacks on the USA. They have Windows machines and Flight Simulator, you might recall." He concludes:

Linux will be "hardened" for use in military systems. It will take time, and it will cost money. The results will be every bit as good--no, better--than what we see today. Such work is underway, and SE Linux by the NSA is but a start. But the hardening will take less time, eventually be more secure, and cost far less than the hodge-podge of incompatible proprietary systems, some from vendors who have gone out of business or pay lip-service to value and utility, or who simply don't care to truly improve their code thus endangering us and our troops. Your argument is tired and worn out, Mr. O'Dowd. Time to give it a rest.

The last word, of course, belongs to the market. For that we have this, from Business Week:

Dana Myers is known as the penguin lady at chip giant Texas Instruments (TXN). Since 2002, Myers has overseen development work on compact versions of the open-source Linux operating system used to run chips and circuit boards in mobile phones and other electronics products. And during the past year, Myers has boosted the size of her development team by 75%, to nearly 100 people. "All over the world, customers are asking us for Linux," says Myers.

Doc Searls is Senior Editor of Linux Journal.

______________________


Comments


Re: Insecurity Threat

Posted by Anonymous

The solution is incredibly simple: Look inside the horse! And that cannot be done with closed source. The reason the Trojan Horse worked is that someone trusted a gift from their enemies (guess they didn't want to "look a gift horse in the mouth" - sorry!). If they had only looked inside...

The Lice of DC

Posted by Anonymous

I live in DC. I imagine it mimics ancient Babel: just as the Tower's construction failed when everyone began speaking different languages, government here fails our nation daily.

However, to address _this_ particular issue, let's take a moronic bureaucratic justification and reverse it to favor common sense: "we must fully research chemical, nuclear, and biological weapons, not to develop offensive weapons, but to understand them and develop the best possible defense." EXACTLY the same thing can be said in defense of open source and against nearly anything closed, secret, or unknown.

I remain worried that the HORDE of .gov lemmings inside the beltway will, under duress of cash and Gates and politics and every other evil, keep the USA saddled to inferior microsoft shlockware.

THANK GOD other nations now give us intellectual competition via Linux.

Regards,
A Concerned US Citizen

Closed Source Insecurity Threat

Posted by NZheretic

On September 28, 1999, an Internet Caucus Panel Discussion was held to discuss the issues surrounding the Clipper chip and export restrictions on encryption in general. Congressman Curt Weldon raised a couple of interesting questions about the briefing he had received from John Hamre, the US Deputy Secretary of Defense.

But the point is that when John Hamre briefed me, and gave me the three key points of this change, there are a lot of unanswered questions. He assured me that in discussions that he had had with people like Bill Gates and Gerstner from IBM that there would be, kind of a, I don't know whether it's a, unstated ability to get access to systems if we needed it. Now, I want to know if that is part of the policy, or is that just something that we are being assured of, that needs to be spoke. Because, if there is some kind of a tacit understanding, I would like to know what it is.

Backdoors can be inserted into systems, and vulnerabilities can be deliberately left open. Because it is easy enough to compare binary code and disassemble the differences, the same binary code has to be used globally, or the backdoor will be quickly discovered. That means any backdoor the NSA uses to get access to a foreign power's computers will also be inside the computers in your own country, left open for anyone to exploit.

This kind of security policy is an oxymoron. The only way to secure your country's information infrastructure is to have a policy of removing any such vulnerabilities and backdoors as soon as possible after discovery--closing the Window of Exposure.

Unless you can have access to all the source code and have the right to recompile and compare the binaries, you cannot verify that the software you are using is free of backdoors.

If you do not have the resources to examine every line of source code, then your best bet is to use source code that is fully open to peer inspection.

In my opinion, an open source license opens up the code to true peers in the industry--people who work with the source code to build solutions. When flaws are discovered, it is these peers who closely examine the patches and the vulnerable source code.

Otherwise, who do you trust--the vendor? Remember Ed Curry!
On October 26, 1998, Ed Curry, a former Microsoft contractor, presented documents to the Defense Department that he said proved Microsoft Corp. conducted a campaign to mislead the government about the security certification status of Microsoft Windows NT.
You don't need to modify source code to insert a backdoor; "infection" can take place anywhere along the build-to-delivery chain.

In June 2002, Microsoft shipped copies of the Korean-language version of Visual Studio .NET infected with the Nimda worm.

There is a saying that goes back to the end of the Cold War: "Trust, but verify." In the same way, you must have access to the source code and the ability to rebuild the toolchain from scratch to compare the resulting binaries.
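A minimal sketch of that last verification step, in Python and with hypothetical file paths, might look like this:

    # Rebuild the software from the published source, then compare your
    # binary against the one the vendor shipped. Paths are hypothetical.

    import hashlib

    def sha256_of(path: str) -> str:
        """Return the SHA-256 digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with open(path, "rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    vendor_hash = sha256_of("vendor-shipped/login")   # binary as delivered
    rebuilt_hash = sha256_of("our-rebuild/login")     # built from audited source

    if vendor_hash == rebuilt_hash:
        print("Binaries match: the shipped code is what the source says.")
    else:
        print("MISMATCH: shipped binary differs from the audited source.")

In practice the two digests match only if the build is fully reproducible--same toolchain, flags and timestamps--which is exactly why being able to rebuild the whole toolchain from scratch matters.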


Re: Insecurity Threat

Posted by Anonymous

If we proceed with plans to allow Linux to run these defense systems without demanding proof that it contains no subversive or dangerous code waiting to emerge after we bring it inside, then we invite the fate of Troy.

This part irks me.

There is no need to demand proof; all the proof you could ever want is already being offered up for free, voluntarily. It's called source code. It is the reason this is called open source software.

The only thing the military--or any other organization, for that matter--needs to do to ensure they are getting what they think they are getting is build their distro from source, whether that means going to a source distro like Gentoo or just downloading the SRPM packages of Red Hat, Mandrake, SuSE or whatever distro they happen to favor, and building from those. If one small group of trusted IT people downloads, inspects and builds a trusted build, there is no way anything like that could happen.

C'mon, it's called open source for a reason. Use the source; that's what it's there for.
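As a minimal sketch of that workflow--assuming a hypothetical SHA256SUMS file published by the distribution and a directory of downloaded source packages--the verification step before building might look like this in Python:

    # Before building anything, check every downloaded source package
    # against the distribution's published checksum list. The file names
    # and the checksum-list format here are hypothetical.

    import hashlib
    from pathlib import Path

    def sha256_of(path: Path) -> str:
        """Return the SHA-256 digest of a file, read in chunks."""
        digest = hashlib.sha256()
        with path.open("rb") as f:
            for chunk in iter(lambda: f.read(65536), b""):
                digest.update(chunk)
        return digest.hexdigest()

    # Expected format, one entry per line: "<sha256-hex>  <filename>"
    expected = {}
    for line in Path("SHA256SUMS").read_text().splitlines():
        if not line.strip():
            continue
        checksum, name = line.split(None, 1)
        expected[name.strip()] = checksum

    for srpm in sorted(Path("srpms").glob("*.src.rpm")):
        status = "ok" if expected.get(srpm.name) == sha256_of(srpm) else "TAMPERED?"
        print(f"{srpm.name}: {status}")

Checksums only push the trust question back one level, of course; in practice the checksum file's own signature would be verified as well.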

Re: Insecurity Threat

Posted by tripolitan

No kidding. If I were a hacker-terrorist, I would definitely feel better if the DOD's computers/servers were all Windows technology (oxymoron?). And why not? All one has to do is look at the number of documented security holes/bugs and compare that with the speed at which they get fixed. Then compare that with open source software. Nowadays, it seems that if you don't like something, you call it a security threat!

Re: Insecurity Threat

Posted by Anonymous

While his words are over the top and flame bait, the point behind the flames is actually valid.

The requirements for specific military software are that each and every part has to come from a known source and be audited at each stage. These are also very small operating systems, not full-fledged ones with bells and whistles. If that audit process counts out Linux, that's fine by me. It also counts out just about every other OS out there, for the same reason.

If an Apple, Sun, Microsoft, HP or other provider made the same statements about the OS they primarily ship, I'd be concerned. That said, the uses for such an OS are fairly limited. Linux surely can be used in the military. After all, if Windows was good enough for some tasks (and it might not be!), Linux surely would be even more secure and trustworthy.

Posted by Anonymous

Mr. O'Dowd, do not proceed.

Re: Insecurity Threat

Posted by Anonymous

The DOD trusts no one, and that's fine with me; it's their job. So relying on a small group of trusted people to check software is just not realistic. People, like software, can always be compromised--it's just a matter of finding the right button to push. Open source is better than closed source because the source is out there. HOWEVER, hacks do happen and holes are left open. With the money the DOD saves by using open source software, they could place bounties/prizes on finding and exploiting issues in open source software:
Find a low-priority bug, $10; medium, $100; high, $1,000; critical, $10,000.
Everyone loves money, and I would much rather have 100s if not 1000s of security experts trying to hack my systems for the right reason--MONEY--rather than malicious intent. Plus, the DOD would have a tight, regularly audited code base and some peace of mind (ha!).
Cheers, debian_dummy