Novell made the right decision even if for the wrong reasons
Novell has decided not to use proprietary Linux modules such as the NVidia accelerated driver. My first reaction was that Novell was being needlessly idiotic. Then I read this article on OSWeekly.com, by Matt Hartley. It calls out the leading Linux distributions for failing to band together to pressure hardware vendors to pre-install Linux. I've been saying basically the same thing for the past few years, so I heartily agree with this article. It was then that it occurred to me that Novell may have made the right decision, even if for the wrong reason.
Before I continue, I have to point out a delicious irony in the first page of the above-mentioned article. It reads, "I have just learned that Novell is working to front a full scale assault on Microsoft's Vista as they ready their own Linux distribution, SuSE 10.1 and that super-cool xgl feature based on OpenGL." I can't help but wonder if either Novell or the author is aware of the fact that Microsoft owns the patent on OpenGL. SGI used to own the patent, but Microsoft snagged it in one of the infamous "bail-out" deals Microsoft loves to make with struggling competitors. No doubt Microsoft coveted the patent for OpenGL because it was the only credible competition for DirectX. Should OpenGL-based games ever pose a threat to its PC and Xbox game market strategy, Microsoft now has the ability to cause trouble for the competition. Microsoft has hinted at waging patent wars against Linux, and this may be one that it has in mind. But that's a topic for yet another column.
Obviously, the Novell decision affects more than just NVidia, but NVidia practically owns the graphics card market, and for good reason. The video cards that use the NVidia chipsets are top notch. Virtually every computer I own has an NVidia card, and every computer I have owned in the past several years had an NVidia card. Why? They (with a few exceptions) work great, and they work even better on Linux when you use the NVidia proprietary driver. The driver is so popular that many Linux distributions install it by default when they detect that you're using an NVidia-based card. When distributions do not install it by default, it's generally a breeze to install it yourself without having to download the driver from the NVidia web site.
As I said, I'm an NVidia guy, and I'm not familiar with ATI cards or drivers. So I'll spend most of my time talking about NVidia. But the sentiments I am about to express apply to all proprietary drivers, even drivers for devices other than video cards.
Given this quote from Novell's vice president of Linux product management, "[this decision] gives the responsibility for drivers to the vendors, which is where it belongs," I am inclined to assume Novell made this decision to offload the burden of maintaining the proprietary drivers onto the vendors, not to pressure the vendors to reconsider how they license their drivers. That's too bad.
Come together, right now
So to echo the spirit of Matt Hartley's article, I am calling out all Linux distributors to follow Novell's lead, but for the right reasons. It's time Linux distributors, large and small, commercial and free, band together and pressure vendors, especially NVidia, to relicense their drivers under the GPL.
I'm perfectly aware of the fact that vendors like NVidia and ATI have proprietary driver algorithms they want to protect. But as far as I can see, NVidia has already made public all of the source code that comprises the kernel portion of the NVidia driver. That's probably because this code isn't doing anything interesting with the card. As far as I can tell, the NVidia driver doesn't even load a proprietary binary, such as firmware, in order to work. Most of the interesting stuff must take place in the Xorg modules and GL drivers, none of which would taint the kernel no matter how proprietary they may be. I'm not sure whether the NVidia license conflicts with the Xorg license, but that would be a whole 'nother issue.
The kernel issue must revolve around this header in the NVidia driver source (emphasis mine):
* Copyright 1999-2001 by NVIDIA Corporation. All rights reserved. All
* information contained herein is proprietary and confidential to NVIDIA
* Corporation. Any use, reproduction, or disclosure without the written
* permission of NVIDIA Corporation is prohibited.
Well, all of NVidia's information contained therein may be proprietary, but it isn't very confidential in any practical sense. You can browse through the source code all you want. So why not simply change the license to the GPL? What harm would that do to NVidia? None that I can see.
The only possible problem I can imagine is if the NVidia driver loads one or more of its proprietary libraries when you run Xorg such that these libraries become part of the kernel at runtime. I don't see any evidence of this, but it's conceivable, since the driver module seems to use up 4 MB, and I don't see how the source code alone compiles into a module that size. Maybe I'm missing something in the code that allocates that huge chunk of memory for reasons other than loading something at Xorg runtime.
But let's assume, for the sake of argument, that this is the case. The NVidia driver is loading proprietary binary code at runtime, and does so in such a way that this proprietary code becomes part of the Linux kernel. Obviously, in this case, it would not solve the problem to relicense the source code as GPL.
It would solve the problem, however, to make that binary code firmware that is licensed such that it can be included in all distributions. The Linux kernel already accommodates drivers with proprietary firmware, so this approach shouldn't taint the kernel. Assuming NVidia isn't keeping its source code proprietary for other reasons, such as company politics, the need to turn some portion of its libraries into firmware could be the sticking point for NVidia.
Here's why. The only way to switch from loading binary code into the kernel to making this same code firmware would be to modify the NVidia cards. Unless NVidia provided a way for the cards to work with both the firmware approach and existing drivers, this kind of design change would break backward compatibility and force NVidia to rewrite all of its drivers for other operating systems.
I hope NVidia will excuse me if I have little or no sympathy for the company if it has to deal with this problem in order to make its drivers GPL. The same goes for ATI, if ATI faces a similar problem. You can find NVidia chipsets everywhere these days, including embedded systems and motherboards. NVidia must be making a feces-load of cash. Surely it can afford to make this sort of change. It might jack up the price of its video cards, but it could also make these cards run faster.
The beneficial tradeoff would be that the kernel developers would carry the burden of maintaining the source code for the drivers. NVidia would no longer need to modify its source code every time a change in the kernel breaks its current driver code.
Put the pressure on
NVidia has a weak spot for Linux because NVidia's video cards are favored by those companies that use Linux and NVidia for high-performance rendering of graphics for animation, among other things. However, I don't see how anyone could get every Linux user on the planet to ditch their NVidia graphics cards, and I don't know that it would make a big enough dent in NVidia's bottom line even if they did. Maybe I'm wrong, but I think there's a better approach.
Linux distributors should band together to exercise some of their muscle to pressure NVidia, ATI, and others to GPL their drivers. They should make it as difficult as possible to install the proprietary drivers, even if they have to insert boot scripts to delete the modules and the proprietary libraries. They should, as a unified group, threaten to make it extremely painful for all but the most savvy Linux users to install and use the proprietary drivers. Better still, pick one or more alternative video chipsets with GPL drivers (I hear there are good GPL drivers for Intel video chipsets), and rub those in the faces of NVidia and ATI.
Threaten to promote these alternatives in advertising. Threaten to make a deal with Intel (or whoever) to help them sponsor a campaign to "trade in your NVidia/ATI card for $XX off on the alternative". Even better, go after both NVidia and ATI and play them off each other. Make it clear to NVidia and ATI that if one of them goes GPL and the other doesn't, that could make all the difference in video card brand loyalty. It takes only one victory to start a campaign to rip out your NVidia or ATI card in favor of the other because those drivers are finally GPL. Indeed, I might find it hard to give up the NVidia performance I enjoy for an inferior card, but ATI is real competition. I would be among the first to rip out all my NVidia cards and replace them with ATI cards if ATI went GPL.
I'm sure there are other tactics that escape me at the moment that would increase the pressure on NVidia, ATI, and others. Some combination could work.
The key here is that the Linux distribution leaders are dropping the ball on several fronts. As Matt Hartley aptly pointed out, they're dropping the ball on getting Linux pre-installed on hardware. When it comes to hardware preloads, they seem to be unaware that if they cooperate more and compete a little less, they'll create a rising tide that lifts all boats. The same applies when it comes to making NVidia and ATI drivers GPL. All Linux distributions would be much more attractive to a wider audience if they all just worked with the favorite video cards of existing and potential customers.
The question is, are the executives and leaders of distributions listening? Are you willing to put in the effort? A lot of you give constant lip service to the importance of free software. How about backing that up with some aggressive pressure on those companies that fail to provide GPL drivers?