Game Developers Conference 2000
A video game convention, such a thing exists; and to such a place I went, and back from such a place I am to yield a report to such readers as subscribe to a computer magazine about a kernel and its operating system. Yes, I have been to the Game Developers Conference 2000 in San Jose, and I have such news as has not been reported in this column for a long time. In fact, since I don't usually cover commercial software, such news has never been reported here. Alas, variety is more than just the spice of life—it's also a reproductive advantage, so if we want to reproduce computer games, we'll have to learn how to crack them. Actually, that's one of the many things that have changed since I last visited commercial games. It seems the sheer size of the software is the best copy-restriction scheme of the day. No longer have we any clever challenges to break (and let's just say it was more fun cracking a game to look at its code and make it do funny things than it was to be able to copy it. As a kid, I used to rewrite dungeon adventure games in skater lingo). Many things have changed, and it is with a heart not quite as heavy as a heavy heart, yet not quite as light as a light heart, that I report the findings of my undercover operation.
First of all, video game development is a serious industry. It's like the movie industry, except it isn't suing everyone on the Net. There's a struggle for the teenage mind and dollar, and the gaming racket is not so much about creativity as it once was; it's no longer about a single developer sitting in his bedroom churning out the latest big hit. Producing a video game is probably more difficult than producing a film, and there's a whole infrastructure built up to support the gaming industry, which brings me to my next point.
Video game development requires infrastructure, not just the infrastructure within the computers themselves that lets game developers have easy and uniform access to the video display, sound card and I/O devices (an infrastructure which we need to develop), but an actual support system within the industry to supply graphics engines, 3-D rendering tools and development kits. You may have noticed that modern games tend to be 3-D remakes of the Quake variety. Well, in order to produce the graphics for these games, developers have to use special graphics packages to develop 3-D models. Then, they quite often buy someone else's graphics engine to use. Some people at the convention were even trying to sell audio samples.
Another thing to know about the gaming industry is that it doesn't pay like a job at Microsoft. The top few games take up nearly the whole market, and everyone else is a tad out of luck. Now, creative people like Sid Meier and Richard Garriott who consistently make enormous hits aren't starving in the streets, but they aren't as wealthy as Bill, Paul and Steve, either. Why is gaming industry pay important? Because they're recruiting.
Yes, we coders are in demand, at least right now, and the video game industry has to be one of the most interesting areas to work in. At the moment, I don't do the “proprietary thing”, so it's mostly off limits to me unless I change my mind someday. But, if you want a fun place to work where you can exercise creativity, be aware the video gaming industry is hiring (pre-IPO with stock options, even).
Computer games can seem stupid, especially the new rash of boring 3-D killers, but they're vital to the success of computers. It's true that they attract future hackers, but their real value is that they drive the hardware industry. Of course, Microsoft's bloated OS also forces people to keep buying bigger and faster machines, but the main driver of progress (or so I choose to believe) is video games. Graphics cards, sound cards, processors—the more people buy games, the more these technologies are driven; in the end, we win.
What's new on the proprietary front? Well, if you haven't played modern games in a while, you might be impressed by the state of current production quality. However, if you have, you might be bored with all the bloody 3-D shooters. I didn't see any games that demonstrated truly new ideas, but at least the developers are getting better at implementing the old ones.
For GNU/Linux, I can say that Loki is our best hope right now. With seven releases so far and twenty scheduled this year, don't go back to dual-booting; there are more games on the way, and they're going to sound better, too. Loki is spearheading the OpenAL project, which is a cross-platform 3-D audio library, essentially the OpenGL of audio. It's LGPL (the Lesser GPL, as it is now called), meaning that even though it is guaranteed to be free, you can use it to make things that aren't. It's a pragmatic approach, if we want it to be accepted in the hyper-commercial industry. Check out http://www.openal.org/ if you're interested in this project.
One thing holding back video game development on Linux is the lack of internal infrastructure. The multi-tasking, multi-user nature of our OS presents a slew of trouble when one user wants to play a game that needs access to the frame buffer, audio device and I/O ports, as well as some assurance of soft real-time performance. How do we avoid resource conflicts? How do we give programmers a consistent way to talk more-or-less directly to the frame buffer or the digital signal processor? And, how do we solve the latency problems? Fortunately, these questions are being forcibly answered by the imposition of demanding games. Once the infrastructure is there, we can expect easier development of both free and proprietary wares for GNU/Linux.
Didier Malenfant brought another ray of hope. Already quite a successful game programmer (with a legacy from the Amiga demo scene), he gave a presentation for developers interested in developing games that are easily portable to Linux. As long as developers start with several platforms in mind, they won't encounter much trouble porting later on. I spent much of my time at the show trying to explain the details to numerous intrigued developers, so I have a feeling we've planted the memes pretty successfully. In fact, the ideal of many developers is to be able to work under Linux and then cross compile to other platforms. Jon Taylor of Crack.com considered Linux the ultimate development platform, a sentiment I'm sure nearly all of us share.
Hardware news? Well, as you know, the Amiga is coming back again, this time hopefully for real. It is rumored to be on a Linux base, but then we've been living on rumors long enough to build a whole new operating system. Microsoft announced their XBox, and several Microsofties got quite haughty with me when I offered them a free Linux Journal, so I perceive less than good karma in the air surrounding that outfit.
As for the future of video gaming, some people expected computer gaming to be completely replaced by consoles, others thought the two would coexist without trouble, and still others thought that consoles and computers would offer different kinds of games. Personally, I can't understand the value of a gaming box if you can't hack it, so I hope they don't get too popular.
Surprise, surprise—more things work on Linux than we know. Of course, I've been asked by certain firms to keep silent about what exactly it is that works, but let's just say the industry has noticed our little OS project, and a lot of software is either being ported or has been ported and is now sitting around going stale. This is one more reason I suspect that, if we get our audio and video acts together, the gaming industry on Linux will be ready to take off.
Anything special to watch out for? Well, I saw some demos from Aegis Simulation Technologies (creators of the graphically impressive BFRIS), one of which will come out soon and looked neat, and another which is being delayed and looks fantastic. The projects are still under wraps, but keep on the lookout: if you're into gaming and don't mind the proprietary thing, you probably won't want to miss these. Now, enough with the proprietary bit and on to free software, since that's what we're all about, no matter how often we may forget.
Bertrand Meyer, in his criticism of the ethics of free software, said that commercial software gave a “proof of concept” that free software then copied. So my question would be, what can free game developers learn from the commercial video gaming industry? Well, we need an infrastructure, both externally in terms of ultra-powerful libraries, rendering tools and development kits, and internally by resolving once and for all how we can reconcile real-time multimedia with a multi-tasking, multi-user OS. I imagine there are a million more-informed opinions than mine floating around, but I shall meander nonetheless! Nothing beats gcc for a development tool, and the OpenGL/MesaGL libraries seem quite easy to use. As for rendering tools, hopefully Blue Moon, RenderMan, GNU's Panorama and PMR will keep progressing. It seems as though virtual reality and artificial intelligence, those trendy business ventures of the '80s which left craters all over Silicon Valley, are finally coming back, so for VR we'll really depend on 3-D engines and rendering tools. As for AI, I wonder if NPCs (non-player characters) can be written in LISP, and does this mean I can use it again? Video game development is so interesting; it would be nice if we were able to develop for free, share our libraries, and not have to reinvent the wheel with massive production teams every time around. I wonder if we will see the day when a single programmer, or a handful of close friends (Cathedral alert!), working only with software released under the GPL, could spit out a truly creative masterpiece on a par with the best of the commercial offerings. One would hope.
Practical Task Scheduling Deployment
July 20, 2016 12:00 pm CDT
One of the best things about the UNIX environment (aside from being stable and efficient) is the vast array of software tools available to help you do your job. Traditionally, a UNIX tool does only one thing, but does that one thing very well. For example, grep is very easy to use and can search vast amounts of data quickly. The find tool can find a particular file or files based on all kinds of criteria. It's pretty easy to string these tools together to build even more powerful tools, such as a tool that finds all of the .log files in the /home directory and searches each one for a particular entry. This erector-set mentality allows UNIX system administrators to seem to always have the right tool for the job.
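The find-plus-grep combination described above can be sketched in a few lines. This demo builds its own small sample tree so it runs anywhere; the directory layout and the pattern "error" are illustrative stand-ins, not from the text.

```shell
# Demonstration: build a tiny sample tree, then chain find and grep.
tmp=$(mktemp -d)
mkdir -p "$tmp/home/alice"
echo "disk error on sda1" > "$tmp/home/alice/kern.log"
echo "all quiet"          > "$tmp/home/alice/boot.log"

# Find every .log file under the tree and print the names of the files
# that contain the pattern. "-exec ... {} +" batches filenames into as
# few grep invocations as possible; "grep -l" prints matching file names
# rather than matching lines.
find "$tmp/home" -name '*.log' -type f -exec grep -l 'error' {} +

rm -r "$tmp"   # clean up the sample tree
```

On a real system the same shape works directly against the live filesystem, e.g. `find /home -name '*.log' -exec grep -l 'pattern' {} +` (run as a user with permission to read those directories).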
Cron has traditionally been considered another such tool for job scheduling, but is it enough? This webinar considers that very question. The first part builds on a previous Geek Guide, Beyond Cron, and briefly describes how to know when it might be time to consider upgrading your job-scheduling infrastructure. The second part presents an actual planning and implementation framework.
Join Linux Journal's Mike Diehl and Pat Cameron of Help Systems.
Free to Linux Journal readers. Register Now!
With all the industry talk about the benefits of Linux on Power and all the performance advantages offered by its open architecture, you may be considering a move in that direction. If you are thinking about analytics, big data and cloud computing, you would be right to evaluate Power. The idea of using commodity x86 hardware and replacing it every three years is an outdated cost model. It doesn't consider the total cost of ownership, and it doesn't consider the advantage of real processing power, high availability and multithreading like a demon.
This ebook takes a look at some of the practical applications of the Linux on Power platform and ways you might bring all the performance power of this open architecture to bear for your organization. There are no smoke and mirrors here—just hard, cold, empirical evidence provided by independent sources. I also consider some innovative ways Linux on Power will be used in the future. Get the Guide