A year ago, not many people knew about Google. Now, as the search engine both by and for the Linux community, it's known to everybody. It doesn't hurt that Google now powers Yahoo's Web searches as well as its own.
While Google remains a fine search engine, the company's decision to patent its search methods hasn't sat well with those in the Linux community, including many in the Free and Open Source Software communities, who don't cotton to software patents.
Well, there are other search engines with Linux and UNIX credentials. One is Fast Search and Transfer ASA (FAST, http://www.alltheweb.com/), a Norwegian company with offices in the US and a partnership with Dell. The two companies jointly and publicly intend to build the world's largest and deepest search engine.
Early last year, Lycos made a substantial investment in FAST and now co-brands FAST's four basic search engines: FAST Web Search, FAST FTP Search, FAST MP3 Search and FAST MultiMedia Search (all of which can be found at www.alltheweb.com and www.lycos.com/—they use the same engines).
FAST's engines run on FreeBSD and are reportedly developed on a mix of FreeBSD and Linux machines. In fact, FAST's first engine, FTPsearch, was developed under the Free Software Foundation's GPL. You can still download the GPL version of that software at ftp://ftpsearch.ntnu.no/pub/ftpsearch/. Search results are presented by Apache and PHP.
We also understand that some of FAST's people have been involved in PHP's development for a long time, and many of FAST's R&D people in Norway come from one UNIX-oriented computer club at the university in Trondheim. It's called “Programvareverkstedet”, or PVV (http://www.pvv.org/).
The products FAST sells, including the search engine itself, are closed source. That's also the case for every other search engine at this point (or at least every one we know of—correct us if we're wrong).
For more about FAST's technologies, click the “Technology” tab on the company's home page.
In another significant search engine development, Yahoo began in November to charge businesses to hurry their listings into Yahoo's “Business to Business” and “Shopping and Services” areas within the “Business and Economy” category. For $199, Yahoo's Business Express program fast-tracks submissions for review and possible inclusion in Yahoo's listings in either of those two areas. According to the FAQ (docs.yahoo.com/info/suggest/faq.html), “...any site submitted to these areas will be reviewed and either added or denied within seven business days. If your site is denied, you will be told why and will have a chance to appeal the decision.”
Meanwhile, the Open Directory Project (http://www.dmoz.org/) continues to grow at an explosive rate. A cursory set of searches shows the two services are highly competitive. The question now is, how do they scale?
There's not much you can do to help Yahoo other than work for the company or pay for a listing. But there's a lot you can do to help the Open Directory Project—mainly as an editor. Just navigate down to a topic that obsesses you and sign up to become an editor through the link on that page.
When Woody Guthrie was singing hillbilly songs on a little Los Angeles radio station in the late 1930s, he used to mail out a small mimeographed songbook to listeners who wanted the words to his songs. On the bottom of one page appeared the following: “This song is Copyrighted in U.S., under Seal of Copyright # 154085, for a period of 28 years, and anybody caught singin it without our permission, will be mighty good friends of ourn, cause we don't give a dern. Publish it. Write it. Sing it. Swing to it. Yodel it. We wrote it, that's all we wanted to do.” —Pete Seeger, June 1967
In the February 1995 issue of Linux Journal, Belinda Frazier reported on Comdex 1995 and its Linux presence:
...there usually isn't much about UNIX at Comdex. This year however, I was very pleased to find Linux represented at two booths at the show. Both Yggdrasil Computing, Inc. and Morse Telecommunication had Linux in their companies' banner.
In comparison, at the Fall Comdex, the number of exhibitors in the Linux Business Expo section was in the neighborhood of 500.
Doc Searls is Senior Editor of Linux Journal.