Why We Need an Open Source Second Life
Unless you have been living under a rock for the last six months, you will have noticed that the virtual world Second Life is much in the news. According to its home page, there are currently around 1,700,000 residents, who are spending $600,000 – that's real, not virtual, money – in the world each day. These figures are a little deceptive – there are typically only 10,000 to 15,000 residents online at any one time, and the money flow is not a rigorous measurement of economic activity – but there is no doubt that Second Life is growing very rapidly; moreover, we are beginning to see it enter the mainstream in a way that has close parallels with the arrival of the Web ten years ago.
Companies are beginning to set up shop in Second Life, including big names like Adidas, American Apparel, Dell, Nissan, Penguin Books, Reebok, Sun Microsystems, Toyota, Reuters and Wired. Often they choose to create their virtual buildings on self-contained islands, which are essentially three-dimensional analogues of the early corporate Web sites: that is, vaguely pretty to look at, but not very functional.
One of the pioneers in Second Life is IBM, which also played an important part in helping to make the Web (and open source) respectable for businesses. Here's what Irving Wladawsky-Berger, vice president of technology strategy and innovation at IBM, and the man who oversaw the company's GNU/Linux policy in the early days, says about IBM's interest in Second Life:
I think that what we are seeing is the evolution of the Internet and World Wide Web in incredibly important new directions. Foremost among them is a much more people-centric Web.
We see this people-centric evolution of the Web in social networks and Web 2.0 - capabilities that enable people to find each other, form communities, share information, and collaborate on a variety of endeavors. Now we are bringing to this new people-centric spirit the highly visual, interactive applications in Virtual Worlds. This new breed of applications is being rethought around the people who design them, maintain them and use them, instead of asking those people to come down to the level of the computers.
We can now bring these exciting capabilities, already in wide use in science, engineering, defense and consumer applications, into the worlds of business, education, health care and government. This was the step that led to IBM’s e-business strategy ten years ago. Could we be at the onset of v-business? Based on my initial experiences in Second Life, we are all in for an incredible ride.
His boss, Sam Palmisano, has backed up those words with actions. A couple of weeks ago, he entered Second Life himself to give a major speech about IBM's future path, and announced a $100 million fund to create 10 new businesses within the company, including:
3D Internet: Partnering with others to take the best of virtual worlds and gaming environments to build a seamless, standards-based 3D Internet -- the next platform for global commerce and day-to-day business operations.
So, things look bright for Second Life and the other virtual worlds that are being developed. There's just one problem: they are all closed source. This means that free software is falling behind in one of the most innovative areas in computing today.
Linden Lab, the company behind Second Life, is very open-source friendly. Its computing infrastructure is based on thousands of servers running GNU/Linux, Apache, Squid and MySQL. Alongside the usual Windows and Macintosh clients for Second Life, there is already one for GNU/Linux (if still a little rough around the edges).
And Linden Lab hopes to go even further by open sourcing Second Life's software. Here's what Philip Rosedale, Second Life's creator and CEO of Linden Lab, told me during an extensive interview recently, when I asked about his current thinking on opening up the code:
Without speaking to specific timing or plans - and we've thought and are thinking lots and lots where there might be exceptions to this - but it seems like the best way to allow SL to become reliable and scalable and grow. And we've got a lot of smart people here thinking about that.
Further proof of Linden Lab's goodwill towards the free software world can be found in its tacit approval of an open source project to reverse-engineer the Second Life protocols. Called libsecondlife, it has already done valuable work, although this has been overshadowed somewhat by the recent brouhaha over the CopyBot program, which drew on libsecondlife's code. CopyBot allowed some or even all of an object in Second Life to be copied. This is obviously a problem for a virtual economy that depends on selling digital objects. And yet, despite many cries to the contrary, the sky is not falling, as I've explained elsewhere.
Beyond the blip that was CopyBot, there are deeper problems that need to be addressed in creating an open source version of Second Life, notably where security is concerned. Most of them have to do with how open source clients would interact with Linden Lab's servers, and how it might be possible to allow users to run their own Second Life servers – effectively creating separate virtual worlds based on the same protocols.
As well as libsecondlife, there are a couple of other open source virtual world projects of note. For example, Croquet employs an ambitious approach that goes beyond Second Life in many ways; however, it is still at an early stage. The same can be said about Uni-Verse, a European consortium that includes the foundation behind the popular 3D tool Blender.
These are all useful initiatives, and there will doubtless be others. But if open source is to give the lie to Jim Allchin's famous jibe in the first Halloween Document that it is always "chasing tail-lights", the free software community must become more involved with the existing virtual world projects, and invest much more time and effort in new ones.
Developing expertise with the underlying technologies is particularly important because it is quite possible that the next stage in the Web's evolution will incorporate elements from three-dimensional virtual worlds. Philip Rosedale explained why he thinks that is likely:
People always believe that the idea of simulating a three-dimensional world will make the experience of people in it different because it's three dimensional, and that's certainly true. However, there's a second thing about the 3D web that makes it different than the 2D web, and is really important, which is that there are other people there with you when you're experiencing it.
Look at MySpace. When you go to a MySpace page, you can listen to their music. What is the listening experience like? Well, it's still just you sitting in front of your computer listening alone to that music. But in SL, if you're listening to somebody's music, whether live or pre-recorded, there's a very good chance that there's someone next to you listening to the same music, and so you're able to turn to them and say: What do you think? Or you're able to turn to them and say: Have you been here before, and, if so, do you know where the lawnmower section is?
That, I think, is what makes the potential of the 3D Web different perhaps even more so than the spatial difference between 3D content, and 2D content. And I think that alone makes it very likely that there will be a kind of a 3D Web, that has this shared experience property. That's what everyone will look back on and say: Wow, that is what made it different.
Glyn Moody writes about open source at opendotdotdot.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
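The log-searching example above can be sketched as a single command line; the directory, file pattern and search string here are illustrative, so adjust them for your own system:

```shell
# Build a small demo tree so the command has something to work on,
# then combine two classic tools: find locates every .log file under
# the directory, and grep -l prints just the files containing "ERROR".
mkdir -p /tmp/demo/logs
printf 'ok\nERROR: disk full\n' > /tmp/demo/logs/app.log
printf 'all good\n' > /tmp/demo/logs/quiet.log
find /tmp/demo -name '*.log' -type f -exec grep -l 'ERROR' {} +
```

The `-exec … {} +` form hands find's results to grep in batches, which is exactly the "string these tools together" approach described above.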
Cron has traditionally been considered another such tool, this one for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job scheduling infrastructure. The second part presents an actual planning and implementation framework.
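For context, cron jobs are defined by crontab entries like this sketch (the script path is hypothetical):

```shell
# m  h  dom mon dow  command
# Run a nightly maintenance script at 2:30 am every day:
30 2 * * * /usr/local/bin/nightly-backup.sh
```

Limitations of this model are what prompt the "beyond cron" question: a plain crontab has no job dependencies, no retry logic, and no central view of schedules across machines.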
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantages of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just cold, hard, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide.