Hey, I'm a potential new user of Linux. Got a question.
As a potential new user, I've been looking around and researching Linux for quite a while - well, actually for the last 2 days or so. I've found a listing of possible distributions through a website called linux.org.
To explain my needs, I am a World of Warcraft player and an amateur artist. My computer has an AMD Athlon(tm) 64 3000+ processor at 2.01 GHz, 2.50 GB of RAM, and an ASUS K8 motherboard. I run Windows XP Corporate Edition (cracked) on a Maxtor 152 GB hard drive, with an additional IBM Deskstar 19.1 GB hard drive, plus a WD 232 GB external hard drive. I have around 109 GB of free space on the Maxtor drive that holds my WinXP install.
What I need to know is: could you recommend a Linux OS for my machine, how would I partition my Maxtor hard drive, and how much space would it actually take?
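From what I've read so far, I'm guessing the Maxtor drive might end up partitioned something like this - the device names and sizes below are just my guess, carved out of the 109 GB of free space:

    /dev/hda1   ~120 GB   NTFS   existing Windows XP (shrunk a little)
    /dev/hda2    ~15 GB   ext3   /      (Linux root)
    /dev/hda3     ~2 GB   swap
    /dev/hda4    ~15 GB   ext3   /home

Is that roughly the right idea, or am I way off?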