grep: Searching for Words
Within Linux (or any other UNIX), many people make use of filters, small programs (black boxes) that read input from standard input (stdin), do something with this input, and return the result to standard output (stdout).
Linux has many filters. Some examples are:
wc: print the number of lines, words and bytes in a file
tr: translate or delete characters
grep: print lines matching a pattern
sort: sort lines in a file
cut: cut selected fields from a file
The easiest way to learn these filters is to use them. This may seem daunting at first, since you may not know all the capabilities of these filters. I will describe the functions of grep so that you can benefit from its power.
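As a tiny warm-up, here is each of the filters above in action. The file name and contents are invented purely for illustration:

```shell
# Create a small sample file (invented data, for demonstration only)
printf 'pear\napple\nbanana\n' > fruit.txt

wc -l < fruit.txt              # count lines: 3
tr 'a-z' 'A-Z' < fruit.txt     # translate lower case to upper case
grep an fruit.txt              # lines matching "an": banana
sort fruit.txt                 # lines in alphabetical order
cut -c1-3 fruit.txt            # first three characters of each line
```

Each command reads either standard input (via `<`) or the named file and writes its result to standard output, which is what makes them easy to chain together.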
I will be using this article (article.txt) as the input file for all the examples.
The syntax of the grep command is as follows:
grep [ -[[AB] ]num ] [ -[CEFGVBchilnsvwx] ] [ [ -e ] pattern | -f file ] [ files... ]
I use GNU grep Version 2; if you're using another version, you may have slightly different options. I will touch on only those options I use most. To learn more about the grep command, see the man page. Variants of the grep command are egrep and fgrep. grep includes flags to simulate these commands: -E for egrep and -F for fgrep.
The simplest form of the command is:
grep flip article.txt
This will search for the word “flip” in the file article.txt and will display all lines containing the word “flip”.
grep also accepts regular expressions as its pattern. To search for “flip” in all files in the current directory, let the shell's * wildcard expand to every file name:
grep flip *
All lines in all files which contain the word “flip” will be displayed, preceded by the file name. Thus, the first line of the output will look like this:
article.txt:grep flip article.txt
The line begins with the name of the file containing the word “flip”, followed by a colon, then the matching line itself.
Sometimes you may want to define the search for special characters or a word combination. To do this, put the expression between quotes so that the whole expression/pattern will be treated as one. The command would then look like this:
grep -e "is the" article.txt
I put the -e option (take the next argument as the pattern) in this example just for demonstration purposes. It is not strictly necessary here, since grep treats its first non-option argument as the pattern anyway; -e becomes genuinely useful when the pattern itself begins with a dash.
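A sketch of a case where -e really matters: when the pattern starts with a dash, grep would otherwise parse it as an option. The file name and contents below are invented:

```shell
# Sample file containing a line that starts with a dash
printf -- '-v is the verbose flag\nplain line\n' > opts.txt

# Without -e, grep would treat "-v" as its own invert-match option;
# -e marks the next argument as the pattern
grep -e '-v' opts.txt    # prints: -v is the verbose flag
```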
To see the line numbers in which the pattern is found, use the -n option. The output will look like that shown above, with the file name replaced by the line number before the colon.
Another option which provides us with a number is the -c option. This option outputs the number of lines on which the pattern occurs, rather than printing the lines themselves. This article contains the word “flip” on 10 lines:
> grep -c flip article.txt
10
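The output formats of -n and -c can be sketched with a small invented file:

```shell
# Invented demo file: "flip" appears on lines 1 and 3
printf 'flip one\ntwo\nflip three\n' > demo.txt

grep -n flip demo.txt   # each match prefixed by its line number:
                        #   1:flip one
                        #   3:flip three
grep -c flip demo.txt   # number of matching lines: 2
```

Note that -c counts matching lines, so a line containing “flip” twice still counts once.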
You may now be able to think of many ways in which you might use grep. For any command you use often, speed is important. Normally, grep can do its job quickly. However, if the search is being done over many large files, the results will be slower to return. In this case, you can speed up the process by using either fgrep or egrep. fgrep is used only for finding strings, and egrep is used for complicated regular expressions.
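The practical difference between the variants can be sketched with -F and -E (the flag spellings of fgrep and egrep) on an invented file:

```shell
# Invented file: one line with a literal dot, one without
printf 'a.b\naxb\n' > dots.txt

# Fixed strings (fgrep / grep -F): "." is just a dot
grep -F 'a.b' dots.txt      # prints only: a.b

# Basic regex (plain grep): "." matches any character
grep 'a.b' dots.txt         # prints both: a.b and axb

# Extended regex (egrep / grep -E): e.g. alternation with |
grep -E 'axb|ayb' dots.txt  # prints: axb
```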
File names, words, sentences and numbers can all be found quickly using grep. In addition, using the grep command together with other filters can be very powerful and prove to be of great value. For example, you could search a statistics file and sort the output by piping it through the sort and cut commands (see man pages):
grep ... | sort ... | grep ... | cut ... > result
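As one concrete instance of such a pipeline (the statistics file and its layout are invented for illustration): pick out one user's lines, sort them, and keep only the count column:

```shell
# Invented statistics file: name, month, count
printf 'alice jan 42\nbob feb 17\nalice feb 30\n' > stats.txt

# Select alice's lines, sort them, cut out the third field
grep alice stats.txt | sort | cut -d' ' -f3 > result
cat result    # 30, then 42 (the "feb" line sorts before "jan")
```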
This has been a quick introduction to get you started and rouse your curiosity to learn more about grep and other filters.