"Dogs" of the Linux Shell

Could the command-line tools you've forgotten or never knew save you time and some frustration?
Conclusion

The commands in this article were presented in support of a hypothesis: the lion's share of a given system's feature set goes unnoticed. My goal here was to raise awareness of several lesser-used and rarely showcased utilities that offer real value. If you ever think, "there must be an easier way to accomplish 'X'", while writing a script or struggling at the command-line prompt, perhaps there is. Do some digging. One of the better sources for such digging is the O'Reilly book Linux in a Nutshell--a well-organized quick reference. I would also encourage you to examine the installed man and info pages--not all command-line options were covered here.

Louis Iacona has been designing and developing applications on UNIX/Linux since 1982. Most recently, his efforts have been directed at applying leading-edge design/development techniques to the enterprise needs of Fortune 2000 companies. He is currently a Senior Consulting Engineer at OmniE Labs, Inc. (www.omnie.com).

email: lji@omnie.com

______________________

Comments

Re: recall things in depth

I was just involved in a project doing a data conversion.

Old Windows software can print reports/data to a file as file.prn, but the output is messy: each record is split across multiple lines, with a single blank line as the record separator.

--------------------------

head and tail are just good for capturing it.

while [ "$startLine" -lt "$totalLine" ]; do
    # parse, using wc -c to check for empty lines
    # use cat -A and sed to strip the trailing ^M (return) char
    # use >> and > to join 3/4 lines into one record
done

Then port it to a MySQL database.

Many thanks to the article's author.
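For the record, the cleanup described above can be sketched in a few lines. This is only an illustrative sketch: the sample data, the file name file.prn, and the '|' delimiter are invented, and the real .prn layout may differ.

```shell
# Sample of the messy output: one record split across several lines,
# records separated by blank lines, DOS line endings (^M).
printf 'John\r\nSmith\r\n\r\nJane\r\nDoe\r\n' > file.prn

# Strip the trailing carriage returns, then use awk's "paragraph mode"
# (RS = "") to join each blank-line-separated block into one
# '|'-delimited record, ready for loading into a database.
tr -d '\r' < file.prn |
awk 'BEGIN { RS = ""; FS = "\n"; OFS = "|" } { $1 = $1; print }' > records.txt

cat records.txt   # John|Smith, then Jane|Doe
```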

Re: recall things in depth

You're quite welcome. Glad it helped!

--- Louie Iacona

Re: [...]Linux Shell, Cygwin anyone?

Just a reminder to those Linux/UNIX enthusiasts who have to suffer the Microsoft command line at work... Check out Cygwin for the coolest shell (and X) stuff that runs on Windows.

Rgds,
Derek

Re: One-level Deep Directory Listing

Here's a super simple command line thingy that I use all the time to see the contents of the current directory and one level down:

daemonbox [1]: ls -AF `ls -A`

I've aliased it to "l1" for convenience.

note - this is on NetBSD-1.6: YMMV in Linux

One-level Deep Directory Usage

What I also find quite useful these days is:

du -h --max-depth=1

which shows me how much disk space is being used by the sub-directories of the current folder (or whatever argument is added), and I've aliased it to 'd1'.

I've also used this as `d1|grep M` which will show me all the results that are 1 MB or greater (or contain "M" in their name :-)), for quick answers. And to sort `ls -l` by date, I've sometimes used `ls -l|sort -k6,7`.

Re: One-level Deep Directory Usage

Try grep "M" to get the files that are actually one meg or larger. You may want to try "[MG]" so that files over 1 gig also show; if you grep only for M, a file over 1000 megs (listed with a G suffix) won't display.
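To see why matching the size column matters, here is a tiny invented sample of du -h style output (the sizes and names are made up):

```shell
# Match a digit followed by M or G: 512K is excluded, but both
# megabyte- and gigabyte-sized entries show up.
printf '512K\tsmall\n2.0M\tmedium\n1.1G\thuge\n' | grep '[0-9][MG]'
# -> 2.0M  medium
# -> 1.1G  huge
```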

Re: One-level Deep Directory Usage

Ever tried "du -s *"?

OK, that lists files too, but it's quicker to write! Yay!

Re: One-level Deep Directory Listing

in zsh you can simply do this

ls *(/)

Re: "Dogs" of the Linux Shell

Now this is a GREAT article!!! I really would like to see more articles like this one.

I've been using Linux for 3+ years now and I LOVE it. I cut my teeth on DOS batch files, using DATE, FC, and TIME to do a LOT of what was done here. It was VERY hard; I ended up creating temporary files all over the place that had to be subsequently cleaned up. Unices, on the other hand, make it SO easy. I really do enjoy seeing all the CLI tools that are out there and knowing that people are using them. To me, using tools like these is what makes us UNIX people, no matter how experienced or inexperienced (me) we may be. Using the system to its potential is what it's there for. Try doing some of these tasks and more (combine them...) in Windows with what is provided with the OS.

DrScriptt...

drscriptt@riverviewtech.net

Fool

Fool

Re: Excellent article.

Found your site from Linux Today.

My linux tips page:

http://wolfrdr.tripod.com/linuxtips.html

Re: use seq, not fold, for iteration

The iteration example is less than convincing. Try iterating over 10 elements. Oops. Try 1000. Huh? ...

for i in $(echo 12345|fold -w1); do print $i; done

should be

for i in `seq 5`; do print $i;done

seq(1) allows you to define start, stop, step and more.
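For example, a minimal sketch of seq's start/step/stop arguments:

```shell
# seq START STEP STOP - here: start at 2, step by 3, stop at 11
for i in $(seq 2 3 11); do echo "$i"; done   # prints 2 5 8 11
```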

Re: use seq, not fold, for iteration

Thank You very much!!
I was looking for this exact feature for my script.

Re: use seq, not fold, for iteration

Hi - the examples were not designed to convey a message of, "this is absolutely the BEST way to accomplish the given task". (Although that might be true for some examples ;-) ) The examples are mainly intended to show basic functionality - what a tool generally does - the output given a certain input.

Regards,

Louie Iacona

Re: use seq, not fold, for iteration

Plus I think it is always much more fun doing it the hard way.

I remember when we used to have competitions to see how many different ways one could cat a file without using cat..

GREAT article!

Re: line numbering

If you don't need anything complicated, cat -n somefile > somefile.numbered can do the trick with numbering lines.

Re: line numbering

Hi - yes, that would work - however, nl provides format options that 'cat -n' does not. nl or pr are generally used to number lined text, since they're 'option rich' around that kind of formatting. Good observation though - I should have included that in the column ...

--- Louie Iacona
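To illustrate the kind of formatting nl offers beyond 'cat -n', here is a small sketch (the sample text is invented, and the exact output shape assumes GNU nl):

```shell
# -ba numbers every line (including blanks), -w3 sets the number
# width, -nrz zero-pads it, and -s': ' sets the separator string.
printf 'alpha\n\nbeta\n' | nl -ba -w3 -nrz -s': '
# -> 001: alpha
# -> 002:
# -> 003: beta
```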

Re: line numbering

nl isn't installed in FreeBSD by default. Command-line tools should be available everywhere. Of course you can download/compile/install it yourself, but that's a lot of work; you might as well just write the awk/Perl script at that point.

Re:

What is the Unix equivalent of Windows' "dir /s"? "dir /s" is like 'ls', but it looks recursively in all subdirectories too. I know 'find' can do something like this, but its man page is practically unreadable.. <:-

Re:

`ls -R` ;)

regards, elybis

Re: dir /s

If you want to display just the directories/subdirectories in the current directory, as you would with the DOS/Windows command "dir /AD", you might try:
ls -alp | grep '^d'
find -type d -maxdepth 1
ls -d */

Re:dir /s

If you know the filename, try locate - you might be surprised by the output ;-)

Re:dir /s

or even if you're close to the file name

Re: Other tricks: DU and DF

Heh, I misread your question initially. Even though you said Windows, I saw "dir /s" and thought of VMS, where that provides subtotals.

$ du -s *

works as a basic equivalent of that. (Yeah, I know I'm off topic and not answering the real initial question.)

Another favorite of mine is

$ df -k

which shows mounted disks and how much space each has, how much is used, and LIES ABOUT HOW MUCH IS FREE. It's intentionally off by five percent. Note this seems to be true in every un*x I've used, not just linux flavors.

Re: Other tricks: DU and DF

> $ df -k

> which shows mounted disks and how much space it has, how much is used, and LIES ABOUT HOW MUCH IS FREE. It's intentionally off by five percent. Note this seems to be true in every un*x I've used, not just linux flavors.

That is because Unices reserve 5% on each partition, which can only be used by root. This means that if a user fills a partition, the system does not stop working, and root can still operate normally to correct it.

Re: Other tricks: DU and DF

The difference you are noticing is disk space reserved for root. I think 5% is the default amount reserved for root when you create a filesystem on most Unix boxes. The amount of free disk space reported by 'df' is the remaining disk space available to non-root users.
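A toy calculation of that gap (all figures invented) shows how the reserve skews df's "available" number:

```shell
size_mb=102400    # total filesystem size (example figure)
used_mb=30000     # space already in use (example figure)
reserve_mb=$(( size_mb * 5 / 100 ))          # the 5% root-only reserve
echo $(( size_mb - used_mb - reserve_mb ))   # left for non-root users: 67280
```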

Re:

'find'

Re:

Simple way to use find:

find dirname -ls

(where dirname is the directory to list -- use . if you want the current directory.) The output format will look like ls -ali but it will list all files and directories recursively.

You can also do:

ls -alR

But the format kind of sucks.

zsh: ls **/*.txt

If you want to search only one directory deep, try

ls -hal */*.txt

and, here is the good part, IF you are using the zsh shell (free and comes with all Linux distributions) you can use

ls -hal **/*.txt

to search recursively directly in the shell! (Since this is shell-expanded, it works with ALL commands, but with more than a couple of thousand files the expansion gets too large and you have to use 'find'.)

Re:dir /s equiv

ls -la * is pretty close

Re:dir /s equiv

The closest replacement (if you are using GNU find) is:

$ find . -name 'pattern' -ls

i.e., pattern would be something like '*.txt'

It provides output that looks like ls's "long format".

I suppose without GNU find you could

$ find . -name 'pattern' -exec ls -l "{}" \;

but that would be _slow_

find is very useful when a file pattern expands to a string larger than the command line allows, because with find the pattern is quoted and so is not expanded by the shell.

ex: to delete a very large directory of files. ...

$ find . -name '*' -type 'f' -maxdepth 1 -exec rm "{}" \;

instead of rm *.

Garick

Re:dir /s equiv

There is also

find . -name '*.txt' -print

if you only want to list the names and not size, date, etc. I believe this may be more portable than the '-ls' option.

Re:

ls -lR

That recurses through subdirectories.

Add some ls tweaks to make things more interesting. For instance, to sort directory listings from largest file size to smallest:

ls -lRS

To sort directory listings from most recently altered to "oldest":

ls -lRt

on and on and on...

Re:

try "tree", "du", or "find ." (the dot means current directory).

easy find options are: -type f (regular files only) -type d (directories only).

for example

find . -type f |xargs grep 'nvidia'

will show you all the files under the current directory containing the string nvidia. (xargs works kinda like the backquotes ("`").)

have fun!
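A minimal illustration of how xargs batches stdin items onto a command line (the sample words are invented):

```shell
# xargs reads items from standard input and appends them as
# arguments to the given command, batching many items per invocation.
printf 'one\ntwo\nthree\n' | xargs echo words:
# -> words: one two three
```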

Re: find

find . -type f -name '*nvidia*'

would be a better example of how to use find. It would find all files whose _name_ contains nvidia.

xargs deserves a section and explanation of its own.

Re:

'ls -R' perhaps?

Re:

thanks, that was too easy.. <:-)

Re: dir /s

try "ls -R" or "ls -Rl"

Re:

Is `ls -R` what you're looking for?

Re: recursive dir for UNIX/Linux

Depending on what you want your output to look like, try

ls -R /

It will display the contents of / (root) in a:

Dirname1:

file1 file2

Dirname2:

file3

type of format.

The find command is easier to use than the man page would lead you to believe. Try:

find / -type f -print

This produces a flatter, more linear list. Depending on what you're doing, one will be more suitable than the other. These commands are pretty much the only game in town for this sort of thing.

Oh, on the clarity of the man page, try typing:

info find

at your shell prompt. It's more verbose, but clearer - I think.

--- Louie Iacona

Re: ``dir /s

One Anonymous asked:

"What is the Unix equivalent of Windows' 'dir /s'?"

Try "find $DIR -name $FILE_NAME"

where $DIR is the name of the top directory you want to look in (typing just "." works fine), & $FILE_NAME is the name of the file you are looking for.

Enclose $FILE_NAME in quotes if you are using wildcards.

But take the time to read the man page & learn how find works. It is a truly useful command.

Geoff

Re:

I don't know what dir /s does.

ls -R lists all files in the directory and all subdirectories.

Re:

Here's something I use now and again:

find / -type f -exec grep -icH 'regex' '{}' \; | sed -e '/0$/ d' | sed 's/\(.*:\)\([0-9]*\)/\2\1/' | sort -n > results.txt

What this does is search every regular file on your system, grep each for a regex, pipe the output through sed a couple of times to remove results with zero hits and to put the number of hits at the front, sort by number, then put the results in a file.

Useful when trying to find out how a particular distribution sets stuff for programs; be warned though, it can take a while to complete :-) but that shouldn't be a problem if you need a coffee!

Re: Cool, but...

You might try the --recursive option to GNU grep. ;-)

Re: Faster Modification (I think)

By looking at your command string it seems that an instance of grep is run for every single file on your system. If this could be avoided then the scan could be completed much quicker.

I think this should work faster:

find / -type f -print0 | xargs --null grep -icH 'regex' | sed -e '/0$/ d; s/\(.*\):\([0-9]*\)/\2 \1/' | sort -n

Or the two command version (Better for low memory machines because of the sort command):

find / -type f -print0 | xargs --null grep -icH 'regex' > results_prev

cat results_prev | sed -e '/0$/ d; s/\(.*\):\([0-9]*\)/\2 \1/' | sort -n > results

It should work faster because xargs will run the grep command with batches of input files. I also combined the sed expression, removed the ':' at the end of each line, and added a space between the number of times regex appears in the file and the name of the file. Note that the -print0 in the find command, and the --null in xargs is to avoid problems with files that contain spaces.

Later,

Jason B.

j bowman mydotmanager.com
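A small sketch of the -print0/--null pairing described above (the temp directory and file name are invented; --null assumes GNU xargs):

```shell
# -print0 emits file names separated by NUL bytes, and xargs --null
# splits on NUL, so a name containing spaces survives the trip intact.
dir=$(mktemp -d)
printf 'hit\n' > "$dir/a file.txt"
find "$dir" -type f -print0 | xargs --null grep -icH 'hit'
# prints the file's path followed by :1 (one matching line)
```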

Re: Faster Modification (I think)

"By looking at your command string it seems that an instance of grep is run for every single file on your system. If this could be avoided then the scan could be completed much quicker. "

Absolutely :-) Most of the time I limit the search to /etc when trying to find which obscure configuration file holds the parameters for xyz. The / was more a proof of concept.

I'll try it with xargs and -print0. Thanks :-)

Euan.

Re:

Duh, all my backslashes have been stripped out :(

Basically, put a backslash before every ( and ) in the second sed and before the 2 and the 1 in the second sed.

Bah!

Linux Sort files

I want to sort files by created/modified time in ascending order.

How?

Use ls -altr

Use ls -altr