Hack and / - A Little Spring Cleaning
No matter how big your hard drives are, at some point you're going to look at your storage and wonder where all the space went. Your /home directory is probably a good example. If you are like me, you don't always clean up after yourself or organize immediately after you download a file. Sure, I have directories for organizing my ISOs, my documents and my videos, but more often than not, my home directory becomes the digital equivalent of a junk drawer with a few tarballs here, an old distribution ISO there and PDF specs for hardware I no longer own. Although some of these files don't really take up space on the disk—it's more a matter of clutter—when I'm running out of storage, I'd like to find the files that take up the most space and decide how to deal with them quickly. This month, I introduce some of my favorite commands for locating space-wasting files on my system and follow up with common ways to clear some space.
First, let's start with file clutter in your main home directory. Although all major GUI file managers these days make it easy to sort a directory by size, because I'm focusing on command-line tips, let's cover how to find the largest files in the current directory via the old standby, ls. If you type:
$ ls -lSh
you'll get a list of all the files in your current directory sorted by size. Of course, if you have a lot of files in the directory, the files you most want to see are probably somewhere along the top of the list, so I typically like to type:
$ ls -lSh | head
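If you find yourself typing that pipeline often, it can be wrapped in a small shell function. This is my own sketch, not something from the column; the `topfiles` name and its arguments are made up for illustration:

```shell
# Hypothetical helper: show the N largest files in a directory.
# ls -lS sorts by size, descending; head trims the listing.
topfiles() {
    dir="${1:-.}"   # directory to inspect (default: current)
    n="${2:-10}"    # how many entries to show (default: ten)
    # +1 skips past the "total" line that ls -l prints first
    ls -lSh "$dir" | head -n "$((n + 1))"
}
```

With that loaded, `topfiles ~/Downloads 5` would print the five largest files in your Downloads directory.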
to see only the top ten largest files. Now, this is pretty basic, but it's worth reviewing, as you'll use these commands over and over again to track down space-wasting files. Depending on how you structure your home directory, you probably won't find all the large files together. It's more likely that they are scattered into different subdirectories, so you then need to scan through your directory structure recursively, tally up the disk space used in each directory, and sort the output. Luckily, you don't have to resort to ls for this; du does the job quite nicely. For instance, one common use for du that I see referenced a lot is the following:
$ du -sh *
This scans through all the subdirectories you list as arguments (in this case, all the subdirectories within my current directory) and then lists them one by one with human-readable file sizes (the -h option converts the file sizes into megabytes, gigabytes and so forth, so it's easier to read). Here's some example output from that command:
456K    bin
28K     Default-Compiz
16K     hl4070cdwcups-1.0.0-7.i386.deb
344K    hl4070cdwlpr-1.0.0-7.i386.deb
27M     images
60K     LexmarkC750.ppd
850M    mail
Although you certainly could work with this information, it would be much easier if it were sorted. To do that, replace the -h argument with -k, and then pipe the output to sort:
$ du -sk * | sort -n
16      hl4070cdwcups-1.0.0-7.i386.deb
28      Default-Compiz
60      LexmarkC750.ppd
344     hl4070cdwlpr-1.0.0-7.i386.deb
456     bin
10224   writing
26948   images
869588  mail
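As a side note, on systems with GNU coreutils (which includes most Linux distributions), you can skip the -k/-n dance entirely: GNU sort understands human-readable size suffixes via its -h flag, so the friendlier du -sh output can be sorted directly:

```shell
# GNU sort -h orders human-readable sizes (K, M, G) numerically,
# so du's -h output stays readable and still sorts smallest-to-largest.
du -sh * | sort -h
```

This is a GNU extension, so it won't work with a strictly POSIX sort.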
This works better, because now I can see that my local e-mail cache is taking up the bulk of the storage; however, next I would need to change to the mail directory and run the command again, over and over, until I narrow it down to the subdirectory that has the large files. That's why I normally skip the above commands and go straight for what I affectionately call the duck command:
$ du -ck | sort -n
...
87704   ./.mozilla
87704   ./.mozilla/firefox
119236  ./mail/example.net/sent-mail-2004
119236  ./mail/example.net/sent-mail-2004/cur
869852  ./mail
869852  ./mail/example.net
1064100 .
1064100 total
Dropping the -s option tells du to report a running tally for every subdirectory down the tree, not just the first level of directories, and the -c option appends a grand total at the end. Because each parent directory is listed along with its children, nested directories that hold the bulk of the data show up with nearly identical sizes. This makes it easy to drill down to the actual directory that consumes the most space, which in this example seems to be ./mail/example.net/sent-mail-2004/cur. If I wanted to clean up files there, I could cd to that directory and then run the ls commands I used above to see which files used the most space.
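As an alternative to cd-ing into the heavy directory and re-running ls, GNU find can hunt down large individual files recursively in one shot. This isn't part of the column's toolbox, and -printf is a GNU findutils extension, so it won't work with a stock BSD find:

```shell
# Print each regular file's size in 1K blocks plus its path,
# then sort numerically so the largest files land at the bottom.
find . -type f -printf '%k\t%p\n' | sort -n | tail -n 10
```

The output reads like the du listing above, but per file instead of per directory, which saves a round of drilling down.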
Kyle Rankin is VP of engineering operations at Final, Inc., the author of a number of books, including DevOps Troubleshooting and The Official Ubuntu Server Book, and a columnist for Linux Journal. Follow him @kylerankin.