Wackypedia: the Wikipedia fork
The fork occupies an ambivalent place in the world of open source. On the one hand, it is widely perceived as the worst thing that can happen to a project, pitting hacker against hacker, and dissipating coding effort that could be more usefully applied in a united way. On the other, it is the ultimate test and guarantee of openness: if code cannot be forked, it is not truly open. Perhaps most importantly, it is the threat of the fork, hanging over projects like a digital sword of Damocles, that keeps them close to their constituencies, as free software's short history has shown time and again.
The closest the Linux kernel has come to forking was during the famous “Linus does not scale” incident, which began on 28 September 1998 with the innocent question:
Am I the only one for whom 2.1.123 fbcon.c doesn't compile?
and culminated with Linus losing it:
Go away, people. Or at least don't Cc me any more. I'm not interested, I'm taking a vacation, and I don't want to hear about it any more. In short, get the hell out of my mailbox.
At which point senior coders like Alan Cox and David Miller feared that a fork might be necessary if it proved impossible to revise the existing patch submission system that was centred around Linus. Fortunately, that split never happened, not least because its threat was enough to concentrate people's minds on finding a mutually satisfactory solution – in this case, first, to change the way patch submissions were made, and secondly, to use Larry McVoy's BitKeeper for overall source management, later replaced by Linus' own Git.
Ultimately, then, the spectre of a fork proved salutary for kernel development: it forced the main players to confront the growing problems, and find a solution, rather than just carry on and hope for the best.
Given this dynamic, I wonder whether it might be time to start thinking about forking Wikipedia – purely for its own good, you understand.
Some might say that such drastic action is hardly necessary, because Wikipedia is not faced by any looming crisis; on the contrary – it is going from strength to strength, with the English-language version alone storming past the two-million article mark. But there has always been an unresolved tension at the heart of Wikipedia's mission, one that goes right back to its origins.
Before Jimmy Wales's Wikipedia, there was Jimmy Wales's Nupedia, inspired by Dmoz, a volunteer effort to create a free version of Yahoo's hierarchical listings. Dmoz began in 1998 under the name Gnuhoo (a nod to GNU/Linux) before turning into Newhoo; it was then acquired by Netscape and released as open content. As Wikipedia's co-founder Larry Sanger explained:
Originally [Wikipedia] was the Nupedia Wiki - our idea was to use it as an article incubator for Nupedia. Articles could begin life on this wiki, be developed collaboratively and, when they got to a certain stage of development, be put into the Nupedia system.
That is, Wikipedia was originally a kind of rough-and-ready working area, where articles were knocked together and then polished for inclusion in the definitive Nupedia. The free-for-all nature of Wikipedia was one reason why Sanger later decided to set up his own Citizendium project, what he called a "progressive or gradual fork" of Wikipedia. That idea was later dropped, and Citizendium is now essentially a standalone effort to produce an online encyclopedia where the emphasis is on using the knowledge of experts to guide the creation of articles, rather than weighting contributions from anyone equally, as with Wikipedia.
Ironically, though, Wikipedia has been moving steadily towards Citizendium's philosophy. The original claim of being "an encyclopedia anyone can edit" has become less all-inclusive – not so much in terms of who may write, but rather as far as what they can write about. This involves the issue of “notability”:
Within Wikipedia, notability is an inclusion criterion based on encyclopedic suitability of a topic. The topic of an article should be notable, or "worthy of notice". This concept is distinct from "fame", "importance", or "popularity", although these may positively correlate with notability. A subject is presumed to be sufficiently notable if it meets the general notability guideline below, or if it meets an accepted subject specific standard listed in the table to the right.
That is, Wikipedia is no longer interested in accepting entries about any old thing, but requires them to be “notable” in the above sense. Now, that's all very well if you aspire to be a dead serious kind of encyclopedia along the lines of the Encyclopaedia Britannica, but given that Wikipedia offered the hope of something more than just an online version of that dead tree monument to dead knowledge, I'm not so sure this attempt at increasing the respectability of Wikipedia is right.
For a start, it means that I can't just go to Wikipedia and find entries on any old subject: if my query is about something insufficiently “notable”, then it's back to Googling for an answer. But it seems to me that part of the promise and power of Wikipedia is that it could be the ultimate repository of knowledge – all knowledge, not just the serious, notable bits.
Of course, Wikipedia is perfectly entitled to take any direction it wants to, but given that I, at least, want something more, I wonder whether there might be others out there who would also prefer a more inclusive, less picky Wikipedia, as it was in the beginning: one that routinely has entries on anything and everything. After all, the great thing about such online repositories is that nobody is forced to read stuff they don't care about. You could even offer users “notability” filters, so that you only saw the level of trivia you could stomach.
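In practice, such a notability filter could be nothing more than a per-reader threshold applied to a score attached to each article. A minimal sketch in Python, where the `Article` type, the 0-10 scoring scale, and the sample entries are all invented for illustration:

```python
from dataclasses import dataclass

@dataclass
class Article:
    title: str
    notability: int  # hypothetical score: 0 = pure trivia, 10 = unquestionably encyclopedic

def visible(articles, threshold):
    """Return only the articles at or above the reader's chosen tolerance level."""
    return [a for a in articles if a.notability >= threshold]

catalogue = [
    Article("Photosynthesis", 10),
    Article("List of fictional badgers", 3),
    Article("My local chip shop", 1),
]

# A reader with threshold 0 sees everything; one with threshold 8 sees
# only the "serious" encyclopedia.
print([a.title for a in visible(catalogue, 3)])
# → ['Photosynthesis', 'List of fictional badgers']
```

The point of the sketch is that nothing need be deleted to keep the grown-up encyclopedia presentable: the trivia simply stays below whatever waterline each reader sets.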
It would be easy for Wikipedia to accomplish this – it might even regard this rough and ready Wackypedia as the development branch for the “real”, grown-up Wikipedia – just as Wikipedia was originally meant to be for Nupedia. If it doesn't want to do that, fine – but maybe somebody else does: anyone for a fork?
Glyn Moody writes about openness at opendotdotdot.