Interview with Matthias Ettrich
Matthias Ettrich is the creator of the KDE desktop environment. Below, Aleksey Dolya interviews Matthias about the process of creating KDE and what he's up to these days.
Aleksey Dolya: Matthias, could you tell us a little bit about yourself?
Matthias Ettrich: Sure. I was born in 1972 in southern Germany. My first computer was a Commodore C64 that I bought with my brothers when I was about 12. I simply got hooked; that was exactly what I wanted to do. Later, I forgot about computers--there are more interesting things in life for a teenager--until I started studying computer science at the University of Tuebingen. The MS-DOS stuff at that time was frustrating; I couldn't really see any progress over what we already had on the C64. I discovered GNU/Linux a year later, and with it the fun came back to computing [for me].
Privately, I'm living happily with my fiancée and two black cats in a small flat in Oslo. There's a fitness center right across the street, which adds to my constant guilty conscience about not working out enough. We have an acoustic piano in the apartment. After turning 30, my taste for classical music came back. At present I'm practicing my way through Beethoven's piano sonatas, which will likely keep me busy for another decade. Apart from that, we try to take advantage of Norway; in wintertime this means skiing. While Norwegian downhill areas are small compared to the Alps and are essentially sun-free, the fine powder snow easily compensates for it.
AD: Could you tell us about the creation of KDE?
ME: Around 1995 I became a real GNU and Linux fan. I was so convinced by the system that I couldn't understand how fellow students, no matter what they studied, could bother with Windows 3.1--no multitasking, 16-bit address space, permanent crashes, horrible icons, ugly fonts. There really was nothing good one could say about it, but it was popular. The application used most was Word 2.0, despite the steep price, the instability and the poor end results. Students were manually splitting up their documents into smaller chunks, because their word processor couldn't handle more than 10 pages safely at once. We [in the GNU and Linux world] had TeX and LaTeX, which were easy to use, produced great results, never crashed and were free. Why would anybody not want to use them?
Back then, Microsoft was far away from victory, so I was unwilling to believe in technology lock-up theories. Rather, I blamed existing software and its lack of end user focus. This was when I started to write a graphical front-end to LaTeX, which later became the LyX document processor. Although the project really didn't stop the success of Word, it turned into a successful free software project with an active development community, and it taught me how free software projects work internally.
On the downside, I learned that a word processor alone wasn't enough to make people consider GNU/Linux a serious alternative. Windows 95 was about to replace most 3.1 installations, and with it more powerful GUI applications became standard. Even small details like the file dialog were light years ahead of anything possible on X11 at that time. Drag and drop was everywhere, and network protocols such as HTTP were built into the system and thus available to any application through component technology. And, worst of all, it actually no longer looked so bad. In fact, it looked quite similar to Motif, just with a slightly lighter gray--a fantastic default background color--and a more sophisticated button shadow.
In the following months I played around with different alternatives to improve the situation. I tried most toolkits I could get my hands on and did smaller hacks to different window managers. Luckily, I was studying computer science at that time, which in Germany translated to having lots of spare time and a fast internet connection.
Eventually I discovered the Qt GUI toolkit, which was the first toolkit I tried that didn't look vastly inferior to what was happening on Windows. The reason was that Qt also ran on Windows, so it had to emulate the powerful Win32 controls. I got so excited by its possibilities that I posted a project invitation to various newsgroups. Martin Konold, whom I knew through the LyX project, set up some services, FTP space and mailing lists, and then we waited. The response was immediate and tremendous, a clear indication that many people in the community were thinking about exactly the same issues and looking for ways out. The timing was just about right. A few days later, we had more than 40 seriously interested software developers on the mailing list; then, we agreed on the most important items and started coding.
At that point in time, the choice of the Qt toolkit was simply a suggestion of mine. It was heavily disputed. Many were not happy with the license it had back then. Another technical disadvantage was its hardcoded look and feel, either Motif or Windows. But this was compensated for by the intuitive API, and writing new software was what we wanted foremost.
We felt we could deal with the other issues later, and time has proven us right. When I joined Trolltech two years later, my first assignment, one that I gave myself, was to introduce stylability into Qt. My second personal assignment was to improve relations with the KDE community, which meant pushing towards license changes--first open source and later GNU GPL.
Those were strange times. While we in KDE were heavily defending Qt's licensing in public as something not so important for the time being, some of us were working on a free Qt clone while others were trying to influence Trolltech. Ironically, most hackers at Trolltech had a background in the Free Software community. They could easily picture themselves attacking Qt on mailing lists and coding GTK+ instead, had things gone slightly differently for them personally.
Good intentions were always there, but so was the fear of cutting off the revenue stream required to ensure further development of Qt. All Trolltech wanted to do was develop a GUI toolkit for a living; the challenge was to learn that this was possible with the GNU GPL. It had not been tried with a software library before, and nobody knew how the market would react or whether it would understand the complex licensing terms.
AD: What tools did you use to create KDE?
ME: We used everything that was available: news, mail, Web, IRC, GCC, flex, bison, bash, automake, autoconf, Perl, you name it. And of course, the Qt GUI toolkit, which together with our C++ bias was the most disputed choice.
All the tools had one thing in common: they were free software or at least freely available in source code. While freedom of tools doesn't guarantee the success of a software project, it's a necessary precondition. Imagine if KDE developers had had to buy the OS, the compiler, the revision control system client, the editor and so on. How many contributors would we have gotten, if any?
AD: How do you find the current popularity of KDE?
ME: Beyond all expectations, really. We see more and more contributors, more languages, more applications. And we see KDE code move onto other platforms. Recently Apple came up with their new browser for the Macintosh, which they built on top of KHTML and KJS, two of the more important pieces of KDE technology. Thanks to KDE's use of the GNU (L)GPL and Apple's cooperation, we will not only see improvements in Konqueror soon, but we have also vastly increased the development and maintenance team of our HTML rendering component. The fewer people that browse with MSIE, the better for the Web.
AD: Could you compare KDE and GNOME? At first glance, they seem to have different interfaces and themes, but the functionality is the same.
ME: Both projects are similar in that they have two, often distinct, target groups: the end users and the application developers. We try to make end users enjoy our look and feel, and we empower application developers to write powerful and compliant applications. KDE is about getting more and better software for GNU/Linux; that is, ultimately the APIs and back-end systems are more important than its surface. The comparison of the development frameworks I would rather leave to others; a lot of it is a matter of taste. Generally speaking, both have similar goals, and thanks to common and mutual inspiration we often made similar choices. This is underscored by the fact that today there is cooperation between both teams on almost every level.
AD: GNOME is an industry standard. Why did that happen?
ME: I assume you mainly refer to Sun's introduction of GNOME as the user interface for Solaris. I don't know the exact reason why they are doing it, but I'm glad the commercial UNIX vendors finally seem to understand, back up and support free software on the desktop. I only wish that had come 11 years earlier, when they still had a chance against PCs. Today it means little to KDE whether the few Solaris administrators that also have a Solaris console on their desk use an xterm or the GNOME panel to launch Oracle's Java-based configuration utilities. What matters to us--and to GNOME--is that more and more developers write applications against our APIs.
AD: Do you know Miguel De Icaza (the creator of GNOME)? What kind of relationship do you have with him?
ME: We met a few times. I admire his energy and the risk he took in creating Ximian. A business built around free software on the desktop is a big challenge, and with Mono and Evolution they have two very interesting projects in the making. I truly wish them success and the quantum of luck that every business needs.
AD: What does the word KDE mean? Who is its author?
ME: It means K Desktop Environment. I picked the K not only because it is the letter before L, for Linux; I also liked the pun on CDE. The letter K is pronounced the same as C in many languages. Originally we thought about giving the K a meaning of its own, but we gave up that idea before the first line of code was written.
AD: Where do you work now?
ME: I work as director of Trolltech AS in Norway's capital, Oslo. My responsibility is to lead the development of the Qt toolkit for UNIX, MS-Windows and the Macintosh. Thanks to the wonders of lean management and a good HR department, I'm still able to participate heavily in the development work myself.
AD: What is your favorite Linux distribution? Why?
ME: Hard to answer; every distribution has its own specific target audience. Personally I do like SuSE for its comfort and its completeness and simply because both my fiancée and I are used to it. But we do run an equal share of Debian machines at work. Given that we have sysadmins to handle the tricky stuff, I'm rather distribution agnostic, at least as long as Emacs and GCC are available.
AD: What are your future plans?
ME: Professionally I'm focusing on the next generation of Qt. Qt today is established technology that has been developed for more than ten years, so we feel it's about time to revise some of its architecture. The wide range of devices it is used on--[everything] from powerful desktop workstations to small embedded devices--leads to new challenges. Interestingly enough, both small embedded applications and the big desktop applications that constantly become more and more complex have one thing in common: they would benefit from a more flexible, smaller and at the same time even faster toolkit. And we believe we can [deliver] exactly this.
In the shorter run, my team will come out with, no surprise, more free software. A piece we are particularly proud of is QSA, Qt Script for Applications. It's a new script binding technology for Qt that makes it easy to turn existing applications into scriptable ones. We'll ship it together with an ECMAScript-compliant scripting engine and a graphical script IDE. Let's see whether the free software community picks it up and whether it finds its way into KDE.
Privately I want to set aside more spare time to do KDE hacking myself, but whether this will become a reality only time can tell.
Aleksey Dolya is a Russian C/C++ programmer interested in network security and software protection.