Work the Shell - Compact Code and Cron Contraptions
This month, I thought I'd take another sidetrack. (You knew that entrepreneurs all have ADD, right?) So, it should be no surprise that to me, the fastest way from point A to point B is, um, what were we talking about?
Reader Peter Anderson sent in a code snippet that offers up a considerably shorter way to convert a really big byte count into kilobytes, megabytes and gigabytes than the one I shared in my December 2006 column.
His question: “Why so much extra code?”
His snippet of code to do this takes advantage of the built-in math capabilities of the Bash shell:
value=$1

(( kilo=value/1024 ))
(( mega=kilo/1024 ))
(( giga=mega/1024 ))

echo $value bytes = $kilo Kb, $mega Mb and $giga Gb
Peter, you're right. This is a succinct way of solving this problem, and it's clear that a shell function to convert, say, bytes into megabytes easily can be produced as a one-liner. Thanks!
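To make that idea concrete, here's one way such one-liner functions might look. The names tokb, tomb and togb are my own invention, not Peter's; this is just a sketch using the same built-in integer arithmetic:

```shell
#!/bin/sh
# Hypothetical one-liner helpers built on Peter's arithmetic:
# convert a byte count to KB, MB and GB with shell integer math.
tokb() { echo $(( $1 / 1024 )); }
tomb() { echo $(( $1 / 1024 / 1024 )); }
togb() { echo $(( $1 / 1024 / 1024 / 1024 )); }

tokb 5368709120
tomb 5368709120
togb 5368709120   # 5368709120 is exactly 5 GB, so this prints 5
```

Note that this is integer division, so anything below the unit boundary is silently truncated rather than rounded.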
As I've said in the past, I don't always write the most concise code in the world, but my goal with this column is to write maintainable code and to get that prototype out the door and be ready to go to the next thing as fast as possible. That practice isn't always compatible with the quest for elegance and perfection in the coding world, to say the least!
On an admin mailing list, I bumped into an interesting question that makes for a perfect second part to this column—a simple script that's really just a one-line invocation, but because it involves the cron facility, becomes worth our time.
The question: “I need to run a cron job that looks in a certain directory at the top of every hour and deletes any file that is more than one hour old.”
Generally, this is a job for the powerful find command, and on first glance, it can be solved simply by using an hourly cron invocation of the correct find command.
For neophyte admins, however, there are two huge steps involved that can be overwhelming: figuring out how to add a new cron job and figuring out the correct predicates for find to accomplish what they seek.
Let's start with find. A good place to learn more about find, of course, is the man page (man find), wherein you'll see there are three timestamps that find can examine. ctime is the last changed time, mtime is the last modified time and atime is the last accessed time. None of them, however, is creation time, so if a file was created 90 minutes ago but touched or changed eight minutes ago, all three will report eight minutes, not 90. That's probably not a huge problem, but it's worth realizing as a very typical compromise required to get this admin script working properly.
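You can see all three timestamps on any file with stat. This sketch assumes GNU stat (as on Linux); the BSD/macOS stat command uses different flags:

```shell
#!/bin/sh
# Create a scratch file, then show its three timestamps with GNU stat:
# %x = last accessed (atime), %y = last modified (mtime),
# %z = last changed (ctime).
touch demofile
stat -c 'accessed: %x' demofile
stat -c 'modified: %y' demofile
stat -c 'changed:  %z' demofile
rm -f demofile
```

On a freshly created file all three will match; edit the file or change its permissions and watch them diverge.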
For the sake of simplicity, I'll actually change this example to deleting files that haven't been changed in the last 60 minutes, not worrying about how much earlier they might have been created. For this task, I need ctime.
find has a baffling syntax of +x, x and -x for specifying time values, which read as “more than x”, “exactly x” and “less than x”, respectively. If we use the sequence -ctime -60, we'll get exactly the opposite of what we want: files that have been changed in the last 60 minutes.
Or is that what we are specifying? Without a unit indicated, the default time unit is really days, so -60 is actually files that have been changed in the last 60 days—not what we want!
To specify minutes, we want to use -cmin rather than -ctime (I told you find was confusing). Here's how that might look:
find . -cmin +60
The above also matches directories, however, so another predicate we'll want to add is one that constrains the results to files only:

find . -cmin +60 -type f

(-type d would match only directories, and so forth).
But, that's not exactly right either, because we probably want to ensure that we only ever go one level deep instead of spending a lot of time traversing a complex file tree. This is done with the little-used -maxdepth option, which is described as “True if the depth of the current file into the tree is less than or equal to n.” GNU find wants -maxdepth to appear before the tests, so let's put this all together:

find . -maxdepth 1 -type f -cmin +60
See how that all fits together?
Now, the last part of this requirement is actually to delete the matching file or files, and I have to admit that this gives me some cause for anxiety, because if you make even the slightest mistake with the find command, you can end up deleting tons of files you didn't want removed—not good. So, rather than just use -delete, I suggest you use -print, and for a day or so, let it run and have cron automatically e-mail the resulting report to you.
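Putting the cron half together, here's what such a crontab entry might look like. The directory /tmp/scratch is a made-up example; edit your crontab with crontab -e and adapt the path. Because this version ends with -print instead of -delete, cron simply mails you the list of files it would have removed:

```shell
# Hypothetical crontab entry: at the top of every hour, report (but
# don't yet delete) files under /tmp/scratch unchanged for 60+ minutes.
# cron e-mails any output to the crontab's owner automatically.
0 * * * * find /tmp/scratch -maxdepth 1 -type f -cmin +60 -print
```

Once a day or two of reports convinces you the match is right, swapping -print for -delete makes it live.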
Dave Taylor has been hacking shell scripts for over thirty years. Really. He's the author of the popular "Wicked Cool Shell Scripts" and can be found on Twitter as @DaveTaylor and more generally at www.DaveTaylorOnline.com.