Court Gets A Torrent-full About Linux
BitTorrent is one of the most contentious technologies available. At least, that is, to the Old Order, those lovely suit-clad corporate types bent on holding technology forever in the days of the — manual — typewriter. The technology, and the suits' dreams of a world free of it, are on trial in Australia, where Linux made an appearance today — at the defense table.
The matter at hand is a lawsuit brought by the Australian Federation Against Copyright Theft — yes, that's AFACT — against iiNet, an Australian internet service provider, over the ISP's role in allowing its service to be used for illegal BitTorrent downloading.
The basis for the case, according to trial coverage, is an earlier case — concerning photocopies of print books — which held that the University of New South Wales was liable for copyright infringement essentially because they controlled the copiers. AFACT hopes to use the same argument against iiNet, holding it liable for what goes through because it owns the tube.
Linux made its appearance in defense of iiNet today, during iiNet CEO Michael Malone's third day of cross-examination. Obtaining Linux had already been listed as one of many legal reasons for using BitTorrent, though AFACT's barrister sought to deflate its importance, describing it as "likely to be downloaded, if at all, once by the person who uses it." According to reports, Malone did not comment on the amount of traffic such downloads represent, but did point out that Linux is updated on a regular basis, presenting new opportunities for downloads.
Reportedly, Tony Bannon, AFACT's counsel, also suggested that iiNet itself was responsible for its customers' use of BitTorrent in the first place. Malone pointed out that iiNet's customer service staff — whom Bannon apparently insinuated were pushing BitTorrent on customers — do not handle third-party application support. He went on to defend iiNet's service representatives, rebuffing the implication that because they are "relatively young" they would be "the sort of people you would expect to be familiar with the processes for downloading via BitTorrent." "I don't expect," he told the court, "that every young person in Australia is downloading illegally using BitTorrent."
Experts have suggested that, though the trial is scheduled to end in the near future, the result may be some months in coming. David Brennan — an Associate Professor of Law at Melbourne University and a consultant for an Australian copyright management group — is quoted as saying that "the period in which the court reserves to prepare its judgment will be measured in months rather than weeks." The matter is unlikely to end with the Federal Court — it will more than likely land before the High Court of Australia for a final decision. (The High Court is roughly equivalent to the Supreme Courts of the UK and US.)
Justin Ryan is a Contributing Editor for Linux Journal.