
Testing and Branching

The case and select constructs are technically not loops, since they do not iterate the execution of a code block. Like loops, however, they direct program flow according to conditions at the top or bottom of the block.

Controlling program flow in a code block

case (in) / esac

The case construct is the shell equivalent of switch in C/C++. It permits branching to one of a number of code blocks, depending on condition tests. It serves as a kind of shorthand for multiple if/then/else statements and is an appropriate tool for creating menus.

case "$variable" in

"$condition1" )
  command...
  ;;

"$condition2" )
  command...
  ;;

esac

  • Quoting the variables is not mandatory, since word splitting does not take place.

  • Each test line ends with a right paren ).

  • Each condition block ends with a double semicolon ;;.

  • The entire case block terminates with an esac (case spelled backwards).

Example 10-24. Using case

 #!/bin/bash
 # Testing ranges of characters.
 echo; echo "Hit a key, then hit return."
 read Keypress
 case "$Keypress" in
   [[:lower:]]   ) echo "Lowercase letter";;
   [[:upper:]]   ) echo "Uppercase letter";;
   [0-9]         ) echo "Digit";;
   *             ) echo "Punctuation, whitespace, or other";;
 esac      #  Allows ranges of characters in [square brackets],
           #+ or POSIX character classes in [[double square brackets]].
 #  In the first version of this example,
 #+ the tests for lowercase and uppercase characters were
 #+ [a-z] and [A-Z].
 #  This no longer works in certain locales and/or Linux distros.
 #  POSIX is more portable.
 #  Thanks to Frank Wang for pointing this out.
 #  Exercise:
 #  --------
 #  As the script stands, it accepts a single keystroke, then terminates.
 #  Change the script so it accepts repeated input,
 #+ reports on each keystroke, and terminates only when "X" is hit.
 #  Hint: enclose everything in a "while" loop.
 exit 0

Example 10-25. Creating menus using case

 #!/bin/bash
 # Crude address database
 clear # Clear the screen.
 echo "          Contact List"
 echo "          ------- ----"
 echo "Choose one of the following persons:" 
 echo "[E]vans, Roland"
 echo "[J]ones, Mildred"
 echo "[S]mith, Julie"
 echo "[Z]ane, Morris"
 read person
 case "$person" in
 # Note variable is quoted.
   "E" | "e" )
   # Accept upper or lowercase input.
   echo "Roland Evans"
   echo "4321 Floppy Dr."
   echo "Hardscrabble, CO 80753"
   echo "(303) 734-9874"
   echo "(303) 734-9892 fax"
   echo ""
   echo "Business partner & old friend"
   ;;
 # Note double semicolon to terminate each option.
   "J" | "j" )
   echo "Mildred Jones"
   echo "249 E. 7th St., Apt. 19"
   echo "New York, NY 10009"
   echo "(212) 533-2814"
   echo "(212) 533-9972 fax"
   echo ""
   echo "Ex-girlfriend"
   echo "Birthday: Feb. 11"
   ;;
 # Add info for Smith & Zane later.
           * )
    # Default option.	  
    # Empty input (hitting RETURN) fits here, too.
    echo "Not yet in database."
    ;;
 esac
 #  Exercise:
 #  --------
 #  Change the script so it accepts multiple inputs,
 #+ instead of terminating after displaying just one address.
 exit 0

An exceptionally clever use of case involves testing for command-line parameters.
#! /bin/bash
 case "$1" in
 "") echo "Usage: ${0##*/} <filename>"; exit $E_PARAM;;  # No command-line parameters,
                                                         # or first parameter empty.
 # Note that ${0##*/} is ${var##pattern} param substitution.
 # Net result is the script's basename ($0 with the leading path stripped).
 -*) FILENAME=./$1;;   #  If filename passed as argument ($1) starts with a dash,
                       #+ replace it with ./$1
                       #+ so further commands don't interpret it as an option.
 * ) FILENAME=$1;;     # Otherwise, $1.
 esac

Here is a more straightforward example of command-line parameter handling:
#! /bin/bash
 while [ $# -gt 0 ]; do    # Until you run out of parameters . . .
   case "$1" in
     -d|--debug)
               # "-d" or "--debug" parameter?
               DEBUG=1
               ;;
     -c|--conf)
               CONFFILE="$2"
               shift
               if [ ! -f $CONFFILE ]; then
                 echo "Error: Supplied file doesn't exist!"
                 exit $E_CONFFILE     # File not found error.
               fi
               ;;
   esac
   shift       # Check next set of parameters.
 done
 #  From Stefano Falsetto's "Log2Rot" script,
 #+ part of his "rottlog" package.
 #  Used with permission.

Example 10-26. Using command substitution to generate the case variable

 #!/bin/bash
 # Using command substitution to generate a "case" variable.
 case $( arch ) in   # "arch" returns machine architecture.
                     # Equivalent to 'uname -m' ...
 i386 ) echo "80386-based machine";;
 i486 ) echo "80486-based machine";;
 i586 ) echo "Pentium-based machine";;
 i686 ) echo "Pentium2+-based machine";;
 *    ) echo "Other type of machine";;
 esac
 exit 0

A case construct can filter strings for globbing patterns.

Example 10-27. Simple string matching

 #!/bin/bash
 # simple string matching

 match_string ()
 {
   MATCH=0
   NOMATCH=90
   PARAMS=2     # Function requires 2 arguments.
   BAD_PARAMS=91

   [ $# -eq $PARAMS ] || return $BAD_PARAMS

   case "$1" in
   "$2") return $MATCH;;
   *   ) return $NOMATCH;;
   esac
 }

 a=one
 b=two
 d=two

 match_string $a     # wrong number of parameters
 echo $?             # 91
 match_string $a $b  # no match
 echo $?             # 90
 match_string $b $d  # match
 echo $?             # 0

 exit 0

Example 10-28. Checking for alphabetic input

 #!/bin/bash
 # Using a "case" structure to filter a string.

 SUCCESS=0
 FAILURE=1

 isalpha ()  # Tests whether *first character* of input string is alphabetic.
 {
 if [ -z "$1" ]                # No argument passed?
 then
   return $FAILURE
 fi

 case "$1" in
 [a-zA-Z]*) return $SUCCESS;;  # Begins with a letter?
 *        ) return $FAILURE;;
 esac
 }             # Compare this with "isalpha ()" function in C.

 isalpha2 ()   # Tests whether *entire string* is alphabetic.
 {
   [ $# -eq 1 ] || return $FAILURE

   case $1 in
   *[!a-zA-Z]*|"") return $FAILURE;;
                *) return $SUCCESS;;
   esac
 }

 isdigit ()    # Tests whether *entire string* is numerical.
 {             # In other words, tests for integer variable.
   [ $# -eq 1 ] || return $FAILURE

   case $1 in
   *[!0-9]*|"") return $FAILURE;;
             *) return $SUCCESS;;
   esac
 }

 check_var ()  # Front-end to isalpha ().
 {
 if isalpha "$@"
 then
   echo "\"$*\" begins with an alpha character."
   if isalpha2 "$@"
   then        # No point in testing if first char is non-alpha.
     echo "\"$*\" contains only alpha characters."
   else
     echo "\"$*\" contains at least one non-alpha character."
   fi
 else
   echo "\"$*\" begins with a non-alpha character."
               # Also "non-alpha" if no argument passed.
 fi
 echo
 }

 digit_check ()  # Front-end to isdigit ().
 {
 if isdigit "$@"
 then
   echo "\"$*\" contains only digits [0 - 9]."
 else
   echo "\"$*\" has at least one non-digit character."
 fi
 echo
 }

 a=23skidoo
 b=H3llo
 c=-What?
 d=What?
 e=`echo $b`   # Command substitution.
 f=AbcDef
 g=27234
 h=27a34
 i=27.34

 check_var $a
 check_var $b
 check_var $c
 check_var $d
 check_var $e
 check_var $f
 check_var     # No argument passed, so what happens?

 digit_check $g
 digit_check $h
 digit_check $i

 exit 0        # Script improved by S.C.

 # Exercise:
 # --------
 #  Write an 'isfloat ()' function that tests for floating point numbers.
 #  Hint: The function duplicates 'isdigit ()',
 #+ but adds a test for a mandatory decimal point.

The select construct, adopted from the Korn Shell, is yet another tool for building menus.

select variable [in list]

This prompts the user to enter one of the choices presented in the variable list. Note that select uses the PS3 prompt (#? ) by default, but that this may be changed.

Example 10-29. Creating menus using select

 #!/bin/bash

 PS3='Choose your favorite vegetable: ' # Sets the prompt string.

 select vegetable in "beans" "carrots" "potatoes" "onions" "rutabagas"
 do
   echo "Your favorite veggie is $vegetable."
   echo "Yuck!"
   break  # What happens if there is no 'break' here?
 done
 exit 0

If in list is omitted, then select uses the list of command line arguments ($@) passed to the script or to the function in which the select construct is embedded.

Compare this to the behavior of a

for variable [in list]

construct with the in list omitted.

Example 10-30. Creating menus using select in a function

 #!/bin/bash

 PS3='Choose your favorite vegetable: '

 choice_of ()
 {
 select vegetable
 # [in list] omitted, so 'select' uses arguments passed to function.
 do
   echo "Your favorite veggie is $vegetable."
   echo "Yuck!"
   break
 done
 }
 choice_of beans rice carrots radishes tomatoes spinach
 #         $1    $2   $3      $4       $5       $6
 #         passed to choice_of() function
 exit 0

Internal Commands and Builtins

A builtin is a command contained within the Bash tool set, literally built in. This is either for performance reasons -- builtins execute faster than external commands, which usually require forking off a separate process -- or because a particular builtin needs direct access to the shell internals.

A builtin may be a synonym to a system command of the same name, but Bash reimplements it internally. For example, the Bash echo command is not the same as /bin/echo, although their behavior is almost identical.
 echo "This line uses the \"echo\" builtin."
 /bin/echo "This line uses the /bin/echo system command."
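The fork/exec cost mentioned above is easy to observe. Here is a rough sketch (the timings are machine-dependent, and it assumes /bin/echo exists, as on typical Linux systems):

```shell
#!/bin/bash
# Compare the builtin 'echo' with the external /bin/echo.
# 'type -a' lists both; the builtin shadows the file version.
type -a echo

# The builtin needs no fork/exec per call,
#+ so it is far faster in a loop than the external command.
N=200
time for ((i = 0; i < N; i++)); do echo hi; done       > /dev/null
time for ((i = 0; i < N; i++)); do /bin/echo hi; done  > /dev/null
```

On a typical machine the second loop takes one or two orders of magnitude longer, entirely due to process-creation overhead.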

A keyword is a reserved word, token or operator. Keywords have a special meaning to the shell, and indeed are the building blocks of the shell's syntax. As examples, "for", "while", "do", and "!" are keywords. Similar to a builtin, a keyword is hard-coded into Bash, but unlike a builtin, a keyword is not by itself a command, but part of a larger command structure. [1]



echo

prints (to stdout) an expression or variable (see Example 4-1).
echo Hello
 echo $a

An echo requires the -e option to print escaped characters. See Example 5-2.

Normally, each echo command prints a terminal newline, but the -n option suppresses this.


An echo can be used to feed a sequence of commands down a pipe.

if echo "$VAR" | grep -q txt   # if [[ $VAR = *txt* ]]
then
   echo "$VAR contains the substring sequence \"txt\""
fi


An echo, in combination with command substitution can set a variable.

a=`echo "HELLO" | tr A-Z a-z`

See also Example 12-19, Example 12-3, Example 12-41, and Example 12-42.

Be aware that echo `command` deletes any linefeeds that the output of command generates.

The $IFS (internal field separator) variable normally contains \n (linefeed) as one of its set of whitespace characters. Bash therefore splits the output of command at linefeeds into arguments to echo. Then echo outputs these arguments, separated by spaces.

bash$ ls -l /usr/share/apps/kjezz/sounds
 total 40
 -rw-r--r--    1 root     root          716 Nov  7  2000
 -rw-r--r--    1 root     root          362 Nov  7  2000

 bash$ echo `ls -l /usr/share/apps/kjezz/sounds`
 total 40 -rw-r--r-- 1 root root 716 Nov 7 2000 -rw-r--r-- 1 root root 362 Nov 7 2000

So, how can we embed a linefeed within an echoed character string?
# Embedding a linefeed?
 echo "Why doesn't this string \n split on two lines?"
 # Doesn't split.
 # Let's try something else.
 echo $"A line of text containing
 a linefeed."
 # Prints as two distinct lines (embedded linefeed).
 # But, is the "$" variable prefix really necessary?
 echo "This string splits
 on two lines."
 # No, the "$" is not needed.
 echo "---------------"
 echo -n $"Another line of text containing
 a linefeed."
 # Prints as two distinct lines (embedded linefeed).
 # Even the -n option fails to suppress the linefeed here.
 echo "---------------"
 # However, the following doesn't work as expected.
 # Why not? Hint: Assignment to a variable.
 string1=$"Yet another line of text containing
 a linefeed (maybe)."
 echo $string1
 # Yet another line of text containing a linefeed (maybe).
 #                                    ^
 # Linefeed becomes a space.
 # Thanks, Steve Parker, for pointing this out.


This command is a shell builtin, and not the same as /bin/echo, although its behavior is similar.

bash$ type -a echo
 echo is a shell builtin
  echo is /bin/echo


The printf, formatted print, command is an enhanced echo. It is a limited variant of the C language printf() library function, and its syntax is somewhat different.

printf format-string... parameter...

This is the Bash builtin version of the /bin/printf or /usr/bin/printf command. See the printf manpage (of the system command) for in-depth coverage.


Older versions of Bash may not support printf.

Example 11-2. printf in action

 #!/bin/bash
 # printf demo

 PI=3.14159265358979
 DecimalConstant=31373
 Message1="Greetings,"
 Message2="Earthling."
 printf "Pi to 2 decimal places = %1.2f" $PI
 printf "Pi to 9 decimal places = %1.9f" $PI  # It even rounds off correctly.
 printf "\n"                                  # Prints a line feed,
                                              # Equivalent to 'echo' . . .
 printf "Constant = \t%d\n" $DecimalConstant  # Inserts tab (\t).
 printf "%s %s \n" $Message1 $Message2
 # ==========================================#
 # Simulation of C function, sprintf().
 # Loading a variable with a formatted string.
 Pi12=$(printf "%1.12f" $PI)
 echo "Pi to 12 decimal places = $Pi12"
 Msg=`printf "%s %s \n" $Message1 $Message2`
 echo $Msg; echo $Msg
 #  As it happens, the 'sprintf' function can now be accessed
 #+ as a loadable module to Bash,
 #+ but this is not portable.
 exit 0

Formatting error messages is a useful application of printf.

   E_BADDIR=65

   var=nonexistent_directory

   error()
   {
     printf "$@" >&2
     # Formats positional params passed, and sends them to stderr.
     echo
     exit $E_BADDIR
   }

   cd $var || error $"Can't cd to %s." "$var"
 # Thanks, S.C.


read

"Reads" the value of a variable from stdin, that is, interactively fetches input from the keyboard. The -a option lets read get array variables (see Example 26-6).

Example 11-3. Variable assignment, using read

 #!/bin/bash
 # "Reading" variables.
 echo -n "Enter the value of variable 'var1': "
 # The -n option to echo suppresses newline.
 read var1
 # Note no '$' in front of var1, since it is being set.
 echo "var1 = $var1"
 # A single 'read' statement can set multiple variables.
 echo -n "Enter the values of variables 'var2' and 'var3' (separated by a space or tab): "
 read var2 var3
 echo "var2 = $var2      var3 = $var3"
 # If you input only one value, the other variable(s) will remain unset (null).
 exit 0

A read without an associated variable assigns its input to the dedicated variable $REPLY.

Example 11-4. What happens when read has no variable

 #!/bin/bash
 # What happens when 'read' has no variable?

 # -------------------------- #
 echo -n "Enter a value: "
 read var
 echo "\"var\" = "$var""
 # Everything as expected here.
 # -------------------------- #
 # ------------------------------------------------------------------- #
 echo -n "Enter another value: "
 read           #  No variable supplied for 'read', therefore...
                #+ Input to 'read' assigned to default variable, $REPLY.
 var="$REPLY"
 echo "\"var\" = "$var""
 # This is equivalent to the first code block.
 # ------------------------------------------------------------------- #
 exit 0

Normally, inputting a \ suppresses a newline during input to a read. The -r option causes an inputted \ to be interpreted literally.

Example 11-5. Multi-line input to read

 #!/bin/bash
 # Multi-line input to 'read'.

 echo "Enter a string terminated by a \\, then press <ENTER>."
 echo "Then, enter a second string, and again press <ENTER>."
 read var1     # The "\" suppresses the newline, when reading $var1.
               #     first line \
               #     second line
 echo "var1 = $var1"
 #     var1 = first line second line
 #  For each line terminated by a "\"
 #+ you get a prompt on the next line to continue feeding characters into var1.
 echo; echo
 echo "Enter another string terminated by a \\ , then press <ENTER>."
 read -r var2  # The -r option causes the "\" to be read literally.
               #     first line \
 echo "var2 = $var2"
 #     var2 = first line \
 # Data entry terminates with the first <ENTER>.
 exit 0

The read command has some interesting options that permit echoing a prompt and even reading keystrokes without hitting ENTER.

# Read a keypress without hitting ENTER.
 read -s -n1 -p "Hit a key " keypress
 echo; echo "Keypress was "\"$keypress\""."
 # -s option means do not echo input.
 # -n N option means accept only N characters of input.
 # -p option means echo the following prompt before reading input.
 # Using these options is tricky, since they need to be in the correct order.

The -n option to read also allows detection of the arrow keys and certain other unusual keys.

Example 11-6. Detecting the arrow keys

 #!/bin/bash
 # Detects the arrow keys, and a few more.
 # Thank you, Sandro Magi, for showing me how.

 # --------------------------------------------
 # Character codes generated by the keypresses.
 arrowup='\[A'
 arrowdown='\[B'
 arrowrt='\[C'
 arrowleft='\[D'
 insert='\[2'
 delete='\[3'
 # --------------------------------------------

 SUCCESS=0
 OTHER=65

 echo -n "Press a key...  "
 # May need to also press ENTER if a key not listed above pressed.
 read -n3 key                      # Read 3 characters.

 echo -n "$key" | grep "$arrowup"  # Check if character code detected.
 if [ "$?" -eq $SUCCESS ]
 then
   echo "Up-arrow key pressed."
   exit $SUCCESS
 fi

 echo -n "$key" | grep "$arrowdown"
 if [ "$?" -eq $SUCCESS ]
 then
   echo "Down-arrow key pressed."
   exit $SUCCESS
 fi

 echo -n "$key" | grep "$arrowrt"
 if [ "$?" -eq $SUCCESS ]
 then
   echo "Right-arrow key pressed."
   exit $SUCCESS
 fi

 echo -n "$key" | grep "$arrowleft"
 if [ "$?" -eq $SUCCESS ]
 then
   echo "Left-arrow key pressed."
   exit $SUCCESS
 fi

 echo -n "$key" | grep "$insert"
 if [ "$?" -eq $SUCCESS ]
 then
   echo "\"Insert\" key pressed."
   exit $SUCCESS
 fi

 echo -n "$key" | grep "$delete"
 if [ "$?" -eq $SUCCESS ]
 then
   echo "\"Delete\" key pressed."
   exit $SUCCESS
 fi

 echo " Some other key pressed."
 exit $OTHER
 #  Exercises:
 #  ---------
 #  1) Simplify this script by rewriting the multiple "if" tests
 #+    as a 'case' construct.
 #  2) Add detection of the "Home," "End," "PgUp," and "PgDn" keys.


The -n option to read will not detect the ENTER (newline) key.

The -t option to read permits timed input (see Example 9-4).
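A sketch of how -t might be used (the 3-second timeout, the function name, and the fallback value are illustrative; read -t appeared in Bash 2.04):

```shell
#!/bin/bash
# 'read -t SECONDS' returns nonzero if the timeout expires,
#+ so the script can fall back to a default instead of hanging.

get_name ()
{
  if read -t 3 -p "Name (3 seconds to answer): " name
  then
    echo "Hello, $name."
  else                        # Timed out (or hit EOF) -- use a default.
    name="anonymous"
    echo "No answer -- defaulting to $name."
  fi
}

get_name <<< "Mendel"     # Input arrives in time: greets Mendel.
get_name < /dev/null      # Immediate EOF behaves like a timeout here.
```

Note that read's exit status is the only signal that the timeout fired, so the result must be checked immediately.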

The read command may also "read" its variable value from a file redirected to stdin. If the file contains more than one line, only the first line is assigned to the variable. If read has more than one parameter, then each of these variables gets assigned a successive whitespace-delineated string.

Example 11-7. Using read with file redirection

 #!/bin/bash

 read var1 <data-file
 echo "var1 = $var1"
 # var1 set to the entire first line of the input file "data-file"
 read var2 var3 <data-file
 echo "var2 = $var2   var3 = $var3"
 # Note non-intuitive behavior of "read" here.
 # 1) Rewinds back to the beginning of input file.
 # 2) Each variable is now set to a corresponding string,
 #    separated by whitespace, rather than to an entire line of text.
 # 3) The final variable gets the remainder of the line.
 # 4) If there are more variables to be set than whitespace-terminated strings
 #    on the first line of the file, then the excess variables remain empty.
 echo "------------------------------------------------"
 # How to resolve the above problem with a loop:
 while read line
 do
   echo "$line"
 done <data-file
 # Thanks, Heiner Steven for pointing this out.
 echo "------------------------------------------------"
 # Use $IFS (Internal Field Separator variable) to split a line of input to
 # "read", if you do not want the default to be whitespace.
 echo "List of all users:"
 OIFS=$IFS; IFS=:       # /etc/passwd uses ":" for field separator.
 while read name passwd uid gid fullname ignore
 do
   echo "$name ($fullname)"
 done </etc/passwd   # I/O redirection.
 IFS=$OIFS              # Restore original $IFS.
 # This code snippet also by Heiner Steven.
 #  Setting the $IFS variable within the loop itself
 #+ eliminates the need for storing the original $IFS
 #+ in a temporary variable.
 #  Thanks, Dim Segebart, for pointing this out.
 echo "------------------------------------------------"
 echo "List of all users:"
 while IFS=: read name passwd uid gid fullname ignore
 do
   echo "$name ($fullname)"
 done </etc/passwd   # I/O redirection.
 echo "\$IFS still $IFS"
 exit 0


Piping output to a read, using echo to set variables, will fail.

Yet, piping the output of cat seems to work.

cat file1 file2 |
 while read line
 do
   echo $line
 done

However, as Bjon Eriksson shows:

Example 11-8. Problems reading from a pipe

 #!/bin/bash
 # This example contributed by Bjon Eriksson.

 last="(null)"
 cat $0 |
 while read line
 do
     echo "{$line}"
     last=$line
 done

 printf "\nAll done, last:$last\n"
 exit 0  # End of code.
         # (Partial) output of script follows.
         # The 'echo' supplies extra brackets.
 {cat $0 |}
 {while read line}
 {echo "{$line}"}
 {printf "nAll done, last:$lastn"}
 All done, last:(null)
 The variable (last) is set within the subshell but unset outside.
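The usual workaround is to keep the loop out of the subshell by redirecting into it instead of piping (a sketch; process substitution is a Bash feature, and the alternative `shopt -s lastpipe` needs Bash 4.2+):

```shell
#!/bin/bash
# Feed the 'while read' loop through a redirection instead of a pipe,
#+ so the loop runs in the current shell and its variables survive.

last="(null)"
while read line
do
    last=$line
done < <(printf '%s\n' one two three)   # Process substitution, not a pipe.

echo "All done, last:$last"             # All done, last:three
```

Because nothing here runs in a subshell, the value assigned on the final iteration is still visible after done.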

The gendiff script, usually found in /usr/bin on many Linux distros, pipes the output of find to a while read construct.
find $1 \( -name "*$2" -o -name ".*$2" \) -print |
 while read f; do
 . . .



The familiar cd change directory command finds use in scripts where execution of a command requires being in a specified directory.

(cd /source/directory && tar cf - . ) | (cd /dest/directory && tar xpvf -)
[from the previously cited example by Alan Cox]

The -P (physical) option to cd causes it to ignore symbolic links.

cd - changes to $OLDPWD, the previous working directory.
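Both options can be seen with a throwaway symlink (a sketch; the scratch directory comes from mktemp and the names are arbitrary):

```shell
#!/bin/bash
# Default 'cd' keeps the symlink in $PWD; 'cd -P' resolves it.

workdir=$(mktemp -d)          # Scratch area for the demonstration.
mkdir "$workdir/real"
ln -s real "$workdir/link"

cd "$workdir/link"    && logical=$(pwd)   # .../link  (logical path kept)
cd -P "$workdir/link" && physical=$(pwd)  # .../real  (symlink resolved)
echo "logical  = $logical"
echo "physical = $physical"

cd /tmp
cd - > /dev/null              # Returns to $OLDPWD (the 'real' directory).

cd /
rm -rf "$workdir"             # Clean up.
```

The `cd -` at the end silently jumps back to the previous working directory; without the redirection it would also print that directory's name.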


The cd command does not function as expected when presented with two forward slashes.
bash$ cd //
 bash$ pwd
 //
The output should, of course, be /. This is a problem both from the command line and in a script.


pwd

Print Working Directory. This gives the user's (or script's) current directory (see Example 11-9). The effect is identical to reading the value of the builtin variable $PWD.

pushd, popd, dirs

This command set is a mechanism for bookmarking working directories, a means of moving back and forth through directories in an orderly manner. A pushdown stack is used to keep track of directory names. Options allow various manipulations of the directory stack.

pushd dir-name pushes the path dir-name onto the directory stack and simultaneously changes the current working directory to dir-name

popd removes (pops) the top directory path name off the directory stack and simultaneously changes the current working directory to that directory popped from the stack.

dirs lists the contents of the directory stack (compare this with the $DIRSTACK variable). A successful pushd or popd will automatically invoke dirs.

Scripts that require various changes to the current working directory without hard-coding the directory name changes can make good use of these commands. Note that the implicit $DIRSTACK array variable, accessible from within a script, holds the contents of the directory stack.

Example 11-9. Changing the current working directory

 #!/bin/bash

 dir1=/usr/local
 dir2=/var/spool

 pushd $dir1
 # Will do an automatic 'dirs' (list directory stack to stdout).
 echo "Now in directory `pwd`." # Uses back-quoted 'pwd'.
 # Now, do some stuff in directory 'dir1'.
 pushd $dir2
 echo "Now in directory `pwd`."
 # Now, do some stuff in directory 'dir2'.
 echo "The top entry in the DIRSTACK array is $DIRSTACK."
 popd
 echo "Now back in directory `pwd`."

 # Now, do some more stuff in directory 'dir1'.
 popd
 echo "Now back in original working directory `pwd`."
 exit 0
 # What happens if you don't 'popd' -- then exit the script?
 # Which directory do you end up in? Why?



The let command carries out arithmetic operations on variables. In many cases, it functions as a less complex version of expr.

Example 11-10. Letting "let" do arithmetic.

 #!/bin/bash

 let a=11            # Same as 'a=11'
 let a=a+5           # Equivalent to  let "a = a + 5"
                     # (Double quotes and spaces make it more readable.)
 echo "11 + 5 = $a"  # 16
 let "a <<= 3"       # Equivalent to  let "a = a << 3"
 echo "\"\$a\" (=16) left-shifted 3 places = $a"
                     # 128
 let "a /= 4"        # Equivalent to  let "a = a / 4"
 echo "128 / 4 = $a" # 32
 let "a -= 5"        # Equivalent to  let "a = a - 5"
 echo "32 - 5 = $a"  # 27
 let "a *=  10"      # Equivalent to  let "a = a * 10"
 echo "27 * 10 = $a" # 270
 let "a %= 8"        # Equivalent to  let "a = a % 8"
 echo "270 modulo 8 = $a  (270 / 8 = 33, remainder $a)"
                     # 6
 exit 0

eval arg1 [arg2] ... [argN]

Combines the arguments in an expression or list of expressions and evaluates them. Any variables contained within the expression are expanded. The result translates into a command. This can be useful for code generation from the command line or within a script.

bash$ process=xterm
 bash$ show_process="eval ps ax | grep $process"
 bash$ $show_process
 1867 tty1     S      0:02 xterm
  2779 tty1     S      0:00 xterm
  2886 pts/1    S      0:00 grep xterm

Example 11-11. Showing the effect of eval

 #!/bin/bash
 # Showing the effect of 'eval'.

 y=`eval ls -l`  #  Similar to y=`ls -l`
 echo $y         #+ but linefeeds removed because "echoed" variable is unquoted.
 echo "$y"       #  Linefeeds preserved when variable is quoted.
 echo; echo
 y=`eval df`     #  Similar to y=`df`
 echo $y         #+ but linefeeds removed.
 #  When LF's not preserved, it may make it easier to parse output,
 #+ using utilities such as "awk".
 echo "==========================================================="
 # Now, showing how to "expand" a variable using "eval" . . .
 for i in 1 2 3 4 5; do
   eval value=$i
   #  value=$i has same effect. The "eval" is not necessary here.
   #  A variable lacking a meta-meaning evaluates to itself --
   #+ it can't expand to anything other than its literal self.
   echo $value
 done

 echo "---"

 for i in ls df; do
   value=eval $i
   #  value=$i has an entirely different effect here.
   #  The "eval" evaluates the commands "ls" and "df" . . .
   #  The terms "ls" and "df" have a meta-meaning,
   #+ since they are interpreted as commands,
   #+ rather than just character strings.
   echo $value
 done

 exit 0

Example 11-12. Forcing a log-off

 #!/bin/bash
 # Killing ppp to force a log-off.
 # Script should be run as root user.
 killppp="eval kill -9 `ps ax | awk '/ppp/ { print $1 }'`"
 #                     -------- process ID of ppp -------  
 $killppp                  # This variable is now a command.
 # The following operations must be done as root user.
 chmod 666 /dev/ttyS3      # Restore read+write permissions, or else what?
 #  Since doing a SIGKILL on ppp changed the permissions on the serial port,
 #+ we restore permissions to previous state.
 rm /var/lock/LCK..ttyS3   # Remove the serial port lock file. Why?
 exit 0
 # Exercises:
 # ---------
 # 1) Have script check whether root user is invoking it.
 # 2) Do a check on whether the process to be killed
 #+   is actually running before attempting to kill it.   
 # 3) Write an alternate version of this script based on 'fuser':
 #+      if [ fuser -s /dev/modem ]; then . . .

Example 11-13. A version of "rot13"

 #!/bin/bash
 # A version of "rot13" using 'eval'.
 # Compare to the earlier "rot13" example.

 setvar_rot_13()              # "rot13" scrambling
 {
   local varname=$1 varvalue=$2
   eval $varname='$(echo "$varvalue" | tr a-z n-za-m)'
 }
 setvar_rot_13 var "foobar"   # Run "foobar" through rot13.
 echo $var                    # sbbone
 setvar_rot_13 var "$var"     # Run "sbbone" through rot13.
                              # Back to original variable.
 echo $var                    # foobar
 # This example by Stephane Chazelas.
 # Modified by document author.
 exit 0

Rory Winston contributed the following instance of how useful eval can be.

Example 11-14. Using eval to force variable substitution in a Perl script

In the Perl script:
         my $WEBROOT = <WEBROOT_PATH>;
 To force variable substitution try:
         $export WEBROOT_PATH=/usr/local/webroot
         $sed 's/<WEBROOT_PATH>/$WEBROOT_PATH/' < > out
 But this just gives:
         my $WEBROOT = $WEBROOT_PATH;
         $export WEBROOT_PATH=/usr/local/webroot
         $eval sed 's%\<WEBROOT_PATH\>%$WEBROOT_PATH%' < > out
 #        ====
 That works fine, and gives the expected substitution:
         my $WEBROOT = /usr/local/webroot;
 ### Correction applied to original example by Paulo Marcel Coelho Aragao.


The eval command can be risky, and normally should be avoided when there exists a reasonable alternative. An eval $COMMANDS executes the contents of COMMANDS, which may contain such unpleasant surprises as rm -rf *. Running an eval on unfamiliar code written by persons unknown is living dangerously.
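When the goal is merely to build a command at runtime, an array is often that reasonable alternative (a sketch; the scratch files are throwaways):

```shell
#!/bin/bash
# Build the command as an array: "${cmd[@]}" expands to one argument
#+ per element, and the data is never re-parsed by the shell, so a
#+ space or ';' in it stays data instead of becoming code.

dir=$(mktemp -d)
touch "$dir/a file.txt" "$dir/b.txt"    # Filename containing a space.

cmd=(ls -1 "$dir")                      # Command and arguments, one each.
listing=$("${cmd[@]}")                  # Runs:  ls -1 "$dir"
echo "$listing"

rm -rf "$dir"
```

Had the command been assembled into a string and passed through eval, the embedded space would have split "a file.txt" into two arguments, and hostile input could have injected arbitrary commands.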


The set command changes the value of internal script variables. One use for this is to toggle option flags which help determine the behavior of the script. Another application for it is to reset the positional parameters that a script sees as the result of a command (set `command`). The script can then parse the fields of the command output.

Example 11-15. Using set with positional parameters

 #!/bin/bash
 # script "set-test"
 # Invoke this script with three command line parameters,
 # for example, "./set-test one two three".
 echo "Positional parameters before  set \`uname -a\` :"
 echo "Command-line argument #1 = $1"
 echo "Command-line argument #2 = $2"
 echo "Command-line argument #3 = $3"
 set `uname -a` # Sets the positional parameters to the output
                # of the command `uname -a`
 echo $_        # unknown
 # Flags set in script.
 echo "Positional parameters after  set \`uname -a\` :"
 # $1, $2, $3, etc. reinitialized to result of `uname -a`
 echo "Field #1 of 'uname -a' = $1"
 echo "Field #2 of 'uname -a' = $2"
 echo "Field #3 of 'uname -a' = $3"
 echo ---
 echo $_        # ---
 exit 0

Invoking set without any options or arguments simply lists all the environmental and other variables that have been initialized.
bash$ set

Using set with the -- option explicitly assigns the contents of a variable to the positional parameters. When no variable follows the --, it unsets the positional parameters.

Example 11-16. Reassigning the positional parameters

 #!/bin/bash

 variable="one two three four five"

 set -- $variable
 # Sets positional parameters to the contents of "$variable".

 first_param=$1
 second_param=$2
 shift; shift        # Shift past first two positional params.
 remaining_params="$*"
 echo "first parameter = $first_param"             # one
 echo "second parameter = $second_param"           # two
 echo "remaining parameters = $remaining_params"   # three four five
 echo; echo
 # Again.
 set -- $variable
 first_param=$1
 second_param=$2
 echo "first parameter = $first_param"             # one
 echo "second parameter = $second_param"           # two
 # ======================================================
 set --
 # Unsets positional parameters if no variable specified.
 echo "first parameter = $first_param"             # (null value)
 echo "second parameter = $second_param"           # (null value)
 exit 0

See also Example 10-2 and Example 12-50.


The unset command deletes a shell variable, effectively setting it to null. Note that this command does not affect positional parameters.

bash$ unset PATH
 bash$ echo $PATH
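That the positional parameters are immune to unset can be checked directly (a short sketch):

```shell
#!/bin/bash
# 'unset' works on variable names; "1" is not a valid identifier,
#+ so positional parameters can only be cleared with 'shift' or 'set --'.

set -- alpha beta gamma
unset 1 2>/dev/null || echo "'unset 1' fails -- not a valid identifier."
echo "\$1 is still: $1"      # alpha
shift                        # This, by contrast, discards the old \$1.
echo "\$1 is now:   $1"      # beta
```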

Example 11-17. "Unsetting" a variable

 #!/bin/bash
 # Unsetting a variable.
 variable=hello                       # Initialized.
 echo "variable = $variable"
 unset variable                       # Unset.
                                      # Same effect as:  variable=
 echo "(unset) variable = $variable"  # $variable is null.
 exit 0

The export command makes variables available to all child processes of the running script or shell. Unfortunately, there is no way to export variables back to the parent process, that is, to the process that called or invoked the script or shell. One important use of the export command is in startup files, to initialize and make accessible environmental variables to subsequent user processes.

Example 11-18. Using export to pass a variable to an embedded awk script

 #  Yet another version of the "column totaler" script
 #+ that adds up a specified column (of numbers) in the target file.
 #  This uses the environment to pass a script variable to 'awk' . . .
 #+ and places the awk script in a variable.
 ARGS=2
 E_WRONGARGS=65
 if [ $# -ne "$ARGS" ] # Check for proper no. of command line args.
 then
    echo "Usage: `basename $0` filename column-number"
    exit $E_WRONGARGS
 fi
 filename=$1
 column_number=$2
 #===== Same as original script, up to this point =====#
 export column_number
 # Export column number to environment, so it's available for retrieval.
 # -----------------------------------------------
 awkscript='{ total += $ENVIRON["column_number"] }
 END { print total }'
 # Yes, a variable can hold an awk script.
 # -----------------------------------------------
 # Now, run the awk script.
 awk "$awkscript" "$filename"
 # Thanks, Stephane Chazelas.
 exit 0


It is possible to initialize and export variables in the same operation, as in export var1=xxx.

However, as Greg Keraunen points out, in certain situations this may have a different effect than setting a variable, then exporting it.

bash$ export var=(a b); echo ${var[0]}
 (a b)
 bash$ var=(a b); export var; echo ${var[0]}
 a

declare, typeset

The declare and typeset commands specify and/or restrict properties of variables.


readonly

Same as declare -r, sets a variable as read-only, or, in effect, as a constant. Attempts to change the variable fail with an error message. This is the shell analog of the C language const type qualifier.


getopts

This powerful tool parses command-line arguments passed to the script. This is the Bash analog of the getopt external command and the getopt library function familiar to C programmers. It permits passing and concatenating multiple options [2] and associated arguments to a script (for example scriptname -abc -e /usr/local).

The getopts construct uses two implicit variables. $OPTIND is the argument pointer (OPTion INDex) and $OPTARG (OPTion ARGument) the (optional) argument attached to an option. A colon following the option name in the declaration tags that option as having an associated argument.

A getopts construct usually comes packaged in a while loop, which processes the options and arguments one at a time, then increments the implicit $OPTIND variable to step to the next.


  1. The arguments passed from the command line to the script must be preceded by a minus (-) or a plus (+). It is the prefixed - or + that lets getopts recognize command-line arguments as options. In fact, getopts will not process arguments without the prefixed - or +, and will terminate option processing at the first argument encountered lacking them.

  2. The getopts template differs slightly from the standard while loop, in that it lacks condition brackets.

  3. The getopts construct replaces the deprecated getopt external command.
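A minimal sketch of the first point above (option letters and arguments chosen arbitrarily): getopts stops at the first bare word it encounters.

```shell
#!/bin/bash
# getopts quits at the first argument lacking a - or + prefix.
while getopts ":ab" opt
do
  echo "option: $opt"
done
shift $((OPTIND - 1))
echo "leftover: $*"

# Invoked as:   ./scriptname -a foo -b
# Only -a is processed; "foo -b" is left over, and -b is never seen.
```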

while getopts ":abcde:fg" Option
 # Initial declaration.
 # a, b, c, d, e, f, and g are the options (flags) expected.
 # The : after option 'e' shows it will have an argument passed with it.
 do
   case $Option in
     a ) # Do something with variable 'a'.
         ;;
     b ) # Do something with variable 'b'.
         ;;
     e ) # Do something with 'e', and also with $OPTARG,
         # which is the associated argument passed with option 'e'.
         ;;
     g ) # Do something with variable 'g'.
         ;;
   esac
 done
 shift $(($OPTIND - 1))
 # Move argument pointer to next.
 # All this is not nearly as complicated as it looks <grin>.

Example 11-19. Using getopts to read the options/arguments passed to a script

 # Exercising getopts and OPTIND
 # Script modified 10/09/03 at the suggestion of Bill Gradwohl.
 # Here we observe how 'getopts' processes command line arguments to script.
 # The arguments are parsed as "options" (flags) and associated arguments.
 # Try invoking this script with
 # 'scriptname -mn'
 # 'scriptname -oq qOption' (qOption can be some arbitrary string.)
 # 'scriptname -qXXX -r'
 # 'scriptname -qr'    - Unexpected result, takes "r" as the argument to option "q"
 # 'scriptname -q -r'  - Unexpected result, same as above
 # 'scriptname -mnop -mnop'  - Unexpected result
 # (OPTIND is unreliable at stating where an option came from).
 #  If an option expects an argument ("flag:"), then it will grab
 #+ whatever is next on the command line.
 NO_ARGS=0
 E_OPTERROR=65
 if [ $# -eq "$NO_ARGS" ]  # Script invoked with no command-line args?
 then
   echo "Usage: `basename $0` options (-mnopqrs)"
   exit $E_OPTERROR        # Exit and explain usage, if no argument(s) given.
 fi
 # Usage: scriptname -options
 # Note: dash (-) necessary
 while getopts ":mnopq:rs" Option
 do
   case $Option in
     m     ) echo "Scenario #1: option -m-   [OPTIND=${OPTIND}]";;
     n | o ) echo "Scenario #2: option -$Option-   [OPTIND=${OPTIND}]";;
     p     ) echo "Scenario #3: option -p-   [OPTIND=${OPTIND}]";;
     q     ) echo "Scenario #4: option -q-\
  with argument \"$OPTARG\"   [OPTIND=${OPTIND}]";;
     #  Note that option 'q' must have an associated argument,
     #+ otherwise it falls through to the default.
     r | s ) echo "Scenario #5: option -$Option-";;
     *     ) echo "Unimplemented option chosen.";;   # DEFAULT
   esac
 done
 shift $(($OPTIND - 1))
 #  Decrements the argument pointer so it points to next argument.
 #  $1 now references the first non-option item supplied on the command line
 #+ if one exists.
 exit 0
 #   As Bill Gradwohl states,
 #  "The getopts mechanism allows one to specify:  scriptname -mnop -mnop
 #+  but there is no reliable way to differentiate what came from where
 #+  by using OPTIND."

Script Behavior

source, . (dot command)

This command, when invoked from the command line, executes a script. Within a script, a source file-name loads the file file-name. Sourcing a file (dot-command) imports code into the script, appending to the script (same effect as the #include directive in a C program). The net result is the same as if the "sourced" lines of code were physically present in the body of the script. This is useful in situations when multiple scripts use a common data file or function library.

Example 11-20. "Including" a data file

 . data-file    # Load a data file.
 # Same effect as "source data-file", but more portable.
 #  The file "data-file" must be present in current working directory,
 #+ since it is referred to by its 'basename'.
 # Now, reference some data from that file.
 echo "variable1 (from data-file) = $variable1"
 echo "variable3 (from data-file) = $variable3"
 let "sum = $variable2 + $variable4"
 echo "Sum of variable2 + variable4 (from data-file) = $sum"
 echo "message1 (from data-file) is \"$message1\""
 # Note:                            escaped quotes
 print_message This is the message-print function in the data-file.
 exit 0

File data-file for Example 11-20, above. Must be present in same directory.

# This is a data file loaded by a script.
 # Files of this type may contain variables, functions, etc.
 # It may be loaded with a 'source' or '.' command by a shell script.
 # Let's initialize some variables.
 message1="Hello, how are you?"
 message2="Enough for now. Goodbye."
 print_message ()
 {   # Echoes any message passed to it.
   if [ -z "$1" ]
   then
     return 1
     # Error, if argument missing.
   fi
   until [ -z "$1" ]
   do
     # Step through arguments passed to function.
     echo -n "$1"
     # Echo args one at a time, suppressing line feeds.
     echo -n " "
     # Insert spaces between words.
     shift
     # Next one.
   done
   echo
   return 0
 }

It is even possible for a script to source itself, though this does not seem to have any practical applications.

Example 11-21. A (useless) script that sources itself

 # a script sourcing itself "recursively."
 # From "Stupid Script Tricks," Volume II.
 MAXPASSCNT=100    # Maximum number of execution passes.
 echo -n  "$pass_count  "
 #  At first execution pass, this just echoes two blank spaces,
 #+ since $pass_count still uninitialized.
 let "pass_count += 1"
 #  Assumes the uninitialized variable $pass_count
 #+ can be incremented the first time around.
 #  This works with Bash and pdksh, but
 #+ it relies on non-portable (and possibly dangerous) behavior.
 #  Better would be to initialize $pass_count to 0 before incrementing.
 while [ "$pass_count" -le $MAXPASSCNT ]
 do
   . $0   # Script "sources" itself, rather than calling itself.
          # ./$0 (which would be true recursion) doesn't work here. Why?
 done
 #  What occurs here is not actually recursion,
 #+ since the script effectively "expands" itself, i.e.,
 #+ generates a new section of code
 #+ with each pass through the 'while' loop,
 #  with each 'source' in line 20.
 #  Of course, the script interprets each newly 'sourced' "#!" line
 #+ as a comment, and not as the start of a new script.
 exit 0   # The net effect is counting from 1 to 100.
          # Very impressive.
 # Exercise:
 # --------
 # Write a script that uses this trick to actually do something useful.

exit

Unconditionally terminates a script. The exit command may optionally take an integer argument, which is returned to the shell as the exit status of the script. It is good practice to end all but the simplest scripts with an exit 0, indicating a successful run.


If a script terminates with an exit lacking an argument, the exit status of the script is the exit status of the last command executed in the script, not counting the exit. This is equivalent to an exit $?.


exec

This shell builtin replaces the current process with a specified command. Normally, when the shell encounters a command, it forks off a child process to actually execute the command. Using the exec builtin, the shell does not fork, and the command exec'ed replaces the shell. When used in a script, therefore, it forces an exit from the script when the exec'ed command terminates. [3]

Example 11-22. Effects of exec

 exec echo "Exiting \"$0\"."   # Exit from script here.
 # ----------------------------------
 # The following lines never execute.
 echo "This echo will never echo."
 exit 99                       #  This script will not exit here.
                               #  Check exit value after script terminates
                               #+ with an 'echo $?'.
                               #  It will *not* be 99.

Example 11-23. A script that exec's itself

 echo "This line appears ONCE in the script, yet it keeps echoing."
 echo "The PID of this instance of the script is still $$."
 #     Demonstrates that a subshell is not forked off.
 echo "==================== Hit Ctl-C to exit ===================="
 sleep 1
 exec $0   #  Spawns another instance of this same script
           #+ that replaces the previous one.
 echo "This line will never echo!"  # Why not?
 exit 0

An exec also serves to reassign file descriptors. For example, exec <zzz-file replaces stdin with the file zzz-file.
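A sketch of descriptor reassignment (the file name zzz-file and spare descriptor 6 are arbitrary choices):

```shell
#!/bin/bash
# Redirect stdin to a file with 'exec', then restore it.
echo "first line of data" > zzz-file

exec 6<&0          # Save a copy of stdin in file descriptor 6.
exec < zzz-file    # stdin now comes from zzz-file.

read line          # Reads from zzz-file, not from the keyboard.
echo "Read: $line"

exec 0<&6 6<&-     # Restore stdin and close the spare descriptor.
rm zzz-file
```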


The -exec option to find is not the same as the exec shell builtin.


shopt

This command permits changing shell options on the fly (see Example 24-1 and Example 24-2). It often appears in the Bash startup files, but also has its uses in scripts. Needs version 2 or later of Bash.
shopt -s cdspell
 # Allows minor misspelling of directory names with 'cd'
 cd /hpme  # Oops! Mistyped '/home'.
 pwd       # /home
           # The shell corrected the misspelling.



true

A command that returns a successful (zero) exit status, but does nothing else.

# Endless loop
 while true   # alias for ":"
 do
    operation-1
    operation-2
    # ...
    # Need a way to break out of loop or script will hang.
 done


false

A command that returns an unsuccessful exit status, but does nothing else.

# Null loop
 while false
 do
    # The following code will not execute.
    operation-1
    operation-2
    # ...
    # Nothing happens!
 done

type [cmd]

Similar to the which external command, type cmd gives the full path name to "cmd". Unlike which, type is a Bash builtin. The useful -a option to type identifies keywords and builtins, and also locates system commands with identical names.

bash$ type '['
 [ is a shell builtin
 bash$ type -a '['
 [ is a shell builtin
  [ is /usr/bin/[

hash [cmds]

Record the path name of specified commands -- in the shell hash table [4] -- so the shell or script will not need to search the $PATH on subsequent calls to those commands. When hash is called with no arguments, it simply lists the commands that have been hashed. The -r option resets the hash table.
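For instance (the exact listing format may vary between Bash versions):

```shell
#!/bin/bash
hash ls grep      # Look up and record the full paths of 'ls' and 'grep'.
hash              # List the hash table: hit counts and full paths.
hash -r           # Empty the table; $PATH will be searched again.
```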


bind

The bind builtin displays or modifies readline [5] key bindings.


help

Gets a short usage summary of a shell builtin. This is the counterpart to whatis, but for builtins.

bash$ help exit
 exit: exit [n]
     Exit the shell with a status of N.  If N is omitted, the exit status
     is that of the last command executed.



[1] An exception to this is the time command, listed in the official Bash documentation as a keyword.


[2] An option is an argument that acts as a flag, switching script behaviors on or off. The argument associated with a particular option indicates the behavior that the option (flag) switches on or off.


[3] Unless the exec is used to reassign file descriptors.


[4] Hashing is a method of creating lookup keys for data stored in a table. The data items themselves are "scrambled" to create keys, using one of a number of simple mathematical algorithms.

An advantage of hashing is that it is fast. A disadvantage is that "collisions" -- where a single key maps to more than one data item -- are possible.

For examples of hashing see Example A-21 and Example A-22.


[5] The readline library is what Bash uses for reading input in an interactive shell.

Basic Commands

The first commands a novice learns


ls

The basic file "list" command. It is all too easy to underestimate the power of this humble command. For example, using the -R, recursive option, ls provides a tree-like listing of a directory structure. Other useful options are -S, sort listing by file size, -t, sort by file modification time, and -i, show file inodes (see Example 12-4).

Example 12-1. Using ls to create a table of contents for burning a CDR disk

 # Script to automate burning a CDR.
 SPEED=2          # May use higher speed if your hardware supports it.
 IMAGEFILE=cdimage.iso
 CONTENTSFILE=contents
 DEVICE=cdrom
 # DEVICE="0,0"     For older versions of cdrecord
 DEFAULTDIR=/opt  # This is the directory containing the data to be burned.
                  # Make sure it exists.
                  # Exercise: Add a test for this.
 # Uses Joerg Schilling's "cdrecord" package.
 #  If this script invoked as an ordinary user, may need to suid cdrecord
 #+ chmod u+s /usr/bin/cdrecord, as root.
 #  Of course, this creates a security hole, though a relatively minor one.
 if [ -z "$1" ]
 then
   IMAGE_DIRECTORY=$DEFAULTDIR
   # Default directory, if not specified on command line.
 else
   IMAGE_DIRECTORY=$1
 fi
 # Create a "table of contents" file.
 ls -lRF $IMAGE_DIRECTORY > $IMAGE_DIRECTORY/$CONTENTSFILE
 # The "l" option gives a "long" file listing.
 # The "R" option makes the listing recursive.
 # The "F" option marks the file types (directories get a trailing /).
 echo "Creating table of contents."
 # Create an image file preparatory to burning it onto the CDR.
 mkisofs -r -o $IMAGEFILE $IMAGE_DIRECTORY
 echo "Creating ISO9660 file system image ($IMAGEFILE)."
 # Burn the CDR.
 echo "Burning the disk."
 echo "Please be patient, this will take a while."
 cdrecord -v -isosize speed=$SPEED dev=$DEVICE $IMAGEFILE
 exit $?
cat, tac

cat, an acronym for concatenate, lists a file to stdout. When combined with redirection (> or >>), it is commonly used to concatenate files.
# Uses of 'cat'
 cat filename                          # Lists the file.
 cat file.1 file.2 file.3 > file.123   # Combines three files into one.
The -n option to cat inserts consecutive numbers before all lines of the target file(s). The -b option numbers only the non-blank lines. The -v option echoes nonprintable characters, using ^ notation. The -s option squeezes multiple consecutive blank lines into a single blank line.
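For example (the file name is arbitrary):

```shell
#!/bin/bash
# Demonstrating cat's -s and -b options on a throwaway file.
printf 'alpha\n\n\n\nbeta\n' > scratch.txt

cat -s scratch.txt   # The run of blank lines is squeezed down to one.
cat -b scratch.txt   # Only "alpha" and "beta" receive line numbers.

rm scratch.txt
```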

See also Example 12-25 and Example 12-21.


In a pipe, it may be more efficient to redirect the stdin to a file, rather than to cat the file.

cat filename | tr a-z A-Z
 tr a-z A-Z < filename   #  Same effect, but starts one less process,
                         #+ and also dispenses with the pipe.

tac is the inverse of cat, listing a file backwards from its end.


rev

Reverses each line of a file, and outputs to stdout. This does not have the same effect as tac, as it preserves the order of the lines, but flips each one around.

bash$ cat file1.txt
 This is line 1.
  This is line 2.
 bash$ tac file1.txt
 This is line 2.
  This is line 1.
 bash$ rev file1.txt
 .1 enil si sihT
  .2 enil si sihT


cp

This is the file copy command. cp file1 file2 copies file1 to file2, overwriting file2 if it already exists (see Example 12-6).


Particularly useful are the -a archive flag (for copying an entire directory tree) and the -r and -R recursive flags.


mv

This is the file move command. It is equivalent to a combination of cp and rm. It may be used to move multiple files to a directory, or even to rename a directory. For some examples of using mv in a script, see Example 9-18 and Example A-2.


When used in a non-interactive script, mv takes the -f (force) option to bypass user input.

When a directory is moved to a preexisting directory, it becomes a subdirectory of the destination directory.

bash$ mv source_directory target_directory
 bash$ ls -lF target_directory
 total 1
  drwxrwxr-x    2 bozo  bozo      1024 May 28 19:20 source_directory/


rm

Delete (remove) a file or files. The -f option forces removal of even readonly files, and is useful for bypassing user input in a script.


The rm command will, by itself, fail to remove filenames beginning with a dash.

bash$ rm -badname
 rm: invalid option -- b
  Try `rm --help' for more information.

One way to accomplish this is to preface the filename to be removed with a dot-slash (./).
bash$ rm ./-badname
Another method is to precede the filename with a " -- ".
bash$ rm -- -badname


When used with the recursive flag -r, this command removes files all the way down the directory tree from the current directory. A careless rm -rf * can wipe out a big chunk of a directory structure.


rmdir

Remove directory. The directory must be empty of all files -- including "invisible" dotfiles [1] -- for this command to succeed.


mkdir

Make directory, creates a new directory. For example, mkdir -p project/programs/December creates the named directory. The -p option automatically creates any necessary parent directories.


chmod

Changes the attributes of an existing file (see Example 11-12).

chmod +x filename
 # Makes "filename" executable for all users.
 chmod u+s filename
 # Sets "suid" bit on "filename" permissions.
 # An ordinary user may execute "filename" with same privileges as the file's owner.
 # (This does not apply to shell scripts.)

chmod 644 filename
 # Makes "filename" readable/writable to owner, readable to
 # others
 # (octal mode).

chmod 1777 directory-name
 # Gives everyone read, write, and execute permission in directory,
 # however also sets the "sticky bit".
 # This means that only the owner of the directory,
 # owner of the file, and, of course, root
 # can delete any particular file in that directory.


chattr

Change file attributes. This is analogous to chmod above, but with different options and a different invocation syntax, and it works only on an ext2 filesystem.

One particularly interesting chattr option is i. A chattr +i filename marks the file as immutable. The file cannot be modified, linked to, or deleted, not even by root. This file attribute can be set or removed only by root. In a similar fashion, the a option marks the file as append only.

root# chattr +i file1.txt
 root# rm file1.txt
 rm: remove write-protected regular file `file1.txt'? y
  rm: cannot remove `file1.txt': Operation not permitted

If a file has the s (secure) attribute set, then when it is deleted its block is zeroed out on the disk.

If a file has the u (undelete) attribute set, then when it is deleted, its contents can still be retrieved (undeleted).

If a file has the c (compress) attribute set, then it will automatically be compressed on writes to disk, and uncompressed on reads.


The file attributes set with chattr do not show in a file listing (ls -l).


ln

Creates links to pre-existing files. A "link" is a reference to a file, an alternate name for it. The ln command permits referencing the linked file by more than one name and is a superior alternative to aliasing (see Example 4-6).

The ln command creates only a reference, a pointer to the file, only a few bytes in size.

The ln command is most often used with the -s, symbolic or "soft" link flag. An advantage of using the -s flag is that it permits linking across file systems.

The syntax of the command is a bit tricky. For example: ln -s oldfile newfile links the previously existing oldfile to the newly created link, newfile.
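A quick sketch (the file names are arbitrary):

```shell
#!/bin/bash
touch oldfile              # The pre-existing target file.
ln -s oldfile newfile      # 'newfile' is the newly created symbolic link.

ls -l newfile              # Listing shows:  newfile -> oldfile
readlink newfile           # Prints the link target: oldfile

rm newfile oldfile         # Removing the link does not touch the target.
```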


If a file named newfile already exists, ln will refuse to overwrite it and exits with an error message; the -f option forces creation of the link.

Links give the ability to invoke a script (or any other type of executable) with multiple names, and to have that script behave according to how it was invoked.

Example 12-2. Hello or Good-bye

 # Saying "hello" or "goodbye"
 #+          depending on how script is invoked.
 # Make a link in current working directory ($PWD) to this script:
 #    ln -s hello.sh goodbye   # (Assuming this script is saved as "hello.sh".)
 # Now, try invoking this script both ways:
 # ./hello.sh
 # ./goodbye
 if [ $0 = "./goodbye" ]
 then
   echo "Good-bye!"
   # Some other goodbye-type commands, as appropriate.
   exit
 fi
 echo "Hello!"
 # Some other hello-type commands, as appropriate.
man, info

These commands access the manual and information pages on system commands and installed utilities. When available, the info pages usually contain a more detailed description than do the man pages.



[1] Dotfiles are files whose names begin with a dot, such as ~/.Xdefaults. Such filenames do not appear in a normal ls listing (although an ls -a will show them), and they cannot be deleted by an accidental rm -rf *. Dotfiles are generally used as setup and configuration files in a user's home directory.

Complex Commands

Commands for more advanced users


find

-exec COMMAND \;

Carries out COMMAND on each file that find matches. The command sequence terminates with ; (the ";" is escaped to make certain the shell passes it to find literally, without interpreting it as a special character).

bash$ find ~/ -name '*.txt'

If COMMAND contains {}, then find substitutes the full path name of the selected file for "{}".

find ~/ -name 'core*' -exec rm {} \;
 # Removes all core dump files from user's home directory.

find /home/bozo/projects -mtime 1
 #  Lists all files in /home/bozo/projects directory tree
 #+ that were modified within the last day.
 #  mtime = last modification time of the target file
 #  ctime = last status change time (via 'chmod' or otherwise)
 #  atime = last access time
 DIR=/home/bozo/junk_files
 find "$DIR" -type f -atime +5 -exec rm {} \;
 #                                        ^^
 #  Curly brackets are placeholder for the path name output by "find."
 #  Deletes all files in "/home/bozo/junk_files"
 #+ that have not been accessed in at least 5 days.
 #  "-type filetype", where
 #  f = regular file
 #  d = directory, etc.
 #  (The 'find' manpage has a complete listing.)

find /etc -exec grep '[0-9][0-9]*[.][0-9][0-9]*[.][0-9][0-9]*[.][0-9][0-9]*' {} \;
 # Finds all IP addresses (xxx.xxx.xxx.xxx) in /etc directory files.
 # There are a few extraneous hits. How can they be filtered out?
 # Perhaps by:
 find /etc -type f -exec cat '{}' \; | tr -c '.[:digit:]' '\n' \
 | grep '^[^.][^.]*\.[^.][^.]*\.[^.][^.]*\.[^.][^.]*$'
 #  [:digit:] is one of the character classes
 #+ introduced with the POSIX 1003.2 standard.
 # Thanks, Stephane Chazelas.


The -exec option to find should not be confused with the exec shell builtin.

Example 12-3. Badname, eliminate file names in current directory containing bad characters and whitespace.

 # Delete filenames in current directory containing bad characters.
 for filename in *
 do
   badname=`echo "$filename" | sed -n /[\+\{\;\"\\\=\?~\(\)\<\>\&\*\|\$]/p`
 # badname=`echo "$filename" | sed -n '/[+{;"\=?~()<>&*|$]/p'`  also works.
 # Deletes files containing these nasties:     + { ; " \ = ? ~ ( ) < > & * | $
   rm $badname 2>/dev/null
 #             ^^^^^^^^^^^ Error messages deep-sixed.
 done
 # Now, take care of files containing all manner of whitespace.
 find . -name "* *" -exec rm -f {} \;
 # The path name of the file that "find" finds replaces the "{}".
 # The '\' ensures that the ';' is interpreted literally, as end of command.
 exit 0
 # Commands below this line will not execute because of "exit" command.
 # An alternative to the above script:
 find . -name '*[+{;"\\=?~()<>&*|$ ]*' -exec rm -f '{}' \;
 # (Thanks, S.C.)

Example 12-4. Deleting a file by its inode number

 # Deleting a file by its inode number.
 #  This is useful when a filename starts with an illegal character,
 #+ such as ? or -.
 ARGCOUNT=1                      # Filename arg must be passed to script.
 E_WRONGARGS=70
 E_FILE_NOT_EXIST=71
 E_CHANGED_MIND=72
 if [ $# -ne "$ARGCOUNT" ]
 then
   echo "Usage: `basename $0` filename"
   exit $E_WRONGARGS
 fi
 if [ ! -e "$1" ]
 then
   echo "File \""$1"\" does not exist."
   exit $E_FILE_NOT_EXIST
 fi
 inum=`ls -i | grep "$1" | awk '{print $1}'`
 # inum = inode (index node) number of file
 # ----------------------------------------------------------------------
 # Every file has an inode, a record that holds its physical address info.
 # ----------------------------------------------------------------------
 echo; echo -n "Are you absolutely sure you want to delete \"$1\" (y/n)? "
 # The '-v' option to 'rm' also asks this.
 read answer
 case "$answer" in
 [nN]) echo "Changed your mind, huh?"
       exit $E_CHANGED_MIND
       ;;
 *)    echo "Deleting file \"$1\".";;
 esac
 find . -inum $inum -exec rm {} \;
 #                           ^^
 #        Curly brackets are placeholder
 #+       for text output by "find."
 echo "File "\"$1"\" deleted!"
 exit 0

See Example 12-27, Example 3-4, and Example 10-9 for scripts using find. Its manpage provides more detail on this complex and powerful command.


xargs

A filter for feeding arguments to a command, and also a tool for assembling the commands themselves. It breaks a data stream into small enough chunks for filters and commands to process. Consider it as a powerful replacement for backquotes. In situations where command substitution fails with a too many arguments error, substituting xargs often works. Normally, xargs reads from stdin or from a pipe, but it can also be given the output of a file.

The default command for xargs is echo. This means that input piped to xargs may have linefeeds and other whitespace characters stripped out.
bash$ ls -l
 total 0
  -rw-rw-r--    1 bozo  bozo         0 Jan 29 23:58 file1
  -rw-rw-r--    1 bozo  bozo         0 Jan 29 23:58 file2
 bash$ ls -l | xargs
 total 0 -rw-rw-r-- 1 bozo bozo 0 Jan 29 23:58 file1 -rw-rw-r-- 1 bozo bozo 0 Jan 29 23:58 file2

ls | xargs -p -l gzip gzips every file in current directory, one at a time, prompting before each operation.


An interesting xargs option is -n NN, which limits to NN the number of arguments passed.

ls | xargs -n 8 echo lists the files in the current directory in 8 columns.


Another useful option is -0, in combination with find -print0 or grep -lZ. This allows handling arguments containing whitespace or quotes.

find / -type f -print0 | xargs -0 grep -liwZ GUI | xargs -0 rm -f

grep -rliwZ GUI / | xargs -0 rm -f

Either of the above will remove any file containing "GUI". (Thanks, S.C.)

Example 12-5. Logfile: Using xargs to monitor system log

 # Generates a log file in current directory
 # from the tail end of /var/log/messages.
 # Note: /var/log/messages must be world readable
 # if this script invoked by an ordinary user.
 #         # As root:  chmod 644 /var/log/messages
 LINES=5           # Number of lines from the tail of the log to save.
 ( date; uname -a ) >>logfile
 # Time and machine name
 echo --------------------------------------------------------------------- >>logfile
 tail -$LINES /var/log/messages | xargs |  fmt -s >>logfile
 echo >>logfile
 echo >>logfile
 exit 0
 #  Note:
 #  ----
 #  As Frank Wang points out,
 #+ unmatched quotes (either single or double quotes) in the source file
 #+ may give xargs indigestion.
 #  He suggests the following substitution for line 15:
 #     tail -$LINES /var/log/messages | tr -d "\"'" | xargs | fmt -s >>logfile
 #  Exercise:
 #  --------
 #  Modify this script to track changes in /var/log/messages at intervals
 #+ of 20 minutes.
 #  Hint: Use the "watch" command. 

As in find, a curly bracket pair serves as a placeholder for replacement text.

Example 12-6. Copying files in current directory to another

 #  Copy (verbose) all files in current directory ($PWD)
 #+ to directory specified on command line.
 E_NOARGS=65
 if [ -z "$1" ]   # Exit if no argument given.
 then
   echo "Usage: `basename $0` directory-to-copy-to"
   exit $E_NOARGS
 fi
 ls . | xargs -i -t cp ./{} $1
 #            ^^ ^^      ^^
 #  -t is "verbose" (output command line to stderr) option.
 #  -i is "replace strings" option.
 #  {} is a placeholder for output text.
 #  This is similar to the use of a curly bracket pair in "find."
 #  List the files in current directory (ls .),
 #+ pass the output of "ls" as arguments to "xargs" (-i -t options),
 #+ then copy (cp) these arguments ({}) to new directory ($1).  
 #  The net result is the exact equivalent of
 #+   cp * $1
 #+ unless any of the filenames has embedded "whitespace" characters.
 exit 0

Example 12-7. Killing processes by name

 # Killing processes by name.
 # Compare this script with
 #  For instance,
 #+ try invoking this script with "xterm" as its argument --
 #+ and watch all the xterms on your desktop disappear.
 #  Warning:
 #  -------
 #  This is a fairly dangerous script.
 #  Running it carelessly (especially as root)
 #+ can cause data loss and other undesirable effects.
 E_BADARGS=66
 if test -z "$1"  # No command line arg supplied?
 then
   echo "Usage: `basename $0` Process(es)_to_kill"
   exit $E_BADARGS
 fi
 PROCESS_NAME="$1"
 ps ax | grep "$PROCESS_NAME" | awk '{print $1}' | xargs -i kill {} 2&>/dev/null
 #                                                       ^^      ^^
 # -----------------------------------------------------------
 # Notes:
 # -i is the "replace strings" option to xargs.
 # The curly brackets are the placeholder for the replacement.
 # 2&>/dev/null suppresses unwanted error messages.
 # -----------------------------------------------------------
 exit $?

Example 12-8. Word frequency analysis using xargs

 # Crude word frequency analysis on a text file.
 # Uses 'xargs' to decompose lines of text into single words.
 # Compare this example to the "" script later on.
 # Check for input file on command line.
 ARGS=1
 E_BADARGS=65
 E_NOFILE=66
 if [ $# -ne "$ARGS" ]
 # Correct number of arguments passed to script?
 then
   echo "Usage: `basename $0` filename"
   exit $E_BADARGS
 fi
 if [ ! -f "$1" ]       # Check if file exists.
 then
   echo "File \"$1\" does not exist."
   exit $E_NOFILE
 fi
 cat "$1" | xargs -n1 | \
 #  List the file, one word per line. 
 tr A-Z a-z | \
 #  Shift characters to lowercase.
 sed -e 's/\.//g'  -e 's/\,//g' -e 's/ /\
 /g' | \
 #  Filter out periods and commas, and
 #+ change space between words to linefeed,
 sort | uniq -c | sort -nr
 #  Finally prefix occurrence count and sort numerically.
 #  This does the same job as the "" example,
 #+ but a bit more ponderously, and it runs more slowly (why?).
 exit 0

expr

All-purpose expression evaluator: Concatenates and evaluates the arguments according to the operation given (arguments must be separated by spaces). Operations may be arithmetic, comparison, string, or logical.

expr 3 + 5

returns 8

expr 5 % 3

returns 2

expr 5 \* 3

returns 15

The multiplication operator must be escaped when used in an arithmetic expression with expr.

y=`expr $y + 1`

Increment a variable, with the same effect as let y=y+1 and y=$(($y+1)). This is an example of arithmetic expansion.

z=`expr substr $string $position $length`

Extract substring of $length characters, starting at $position.
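For instance (the string and offsets are arbitrary; expr counts positions from 1):

```shell
#!/bin/bash
string=ABCDEF
position=3
length=2
z=`expr substr $string $position $length`
echo $z     # CD  (characters 3 and 4 of "ABCDEF")
```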

Example 12-9. Using expr

 # Demonstrating some of the uses of 'expr'
 # =======================================
 # Arithmetic Operators
 # ---------- ---------
 echo "Arithmetic Operators"
 a=`expr 5 + 3`
 echo "5 + 3 = $a"
 a=`expr $a + 1`
 echo "a + 1 = $a"
 echo "(incrementing a variable)"
 a=`expr 5 % 3`
 # modulo
 echo "5 mod 3 = $a"
 # Logical Operators
 # ------- ---------
 #  Returns 1 if true, 0 if false,
 #+ opposite of normal Bash convention.
 echo "Logical Operators"
 x=24
 y=25
 b=`expr $x = $y`         # Test equality.
 echo "b = $b"            # 0  ( $x -ne $y )
 a=3
 b=`expr $a \> 10`
 echo 'b=`expr $a \> 10`, therefore...'
 echo "If a > 10, b = 0 (false)"
 echo "b = $b"            # 0  ( 3 ! -gt 10 )
 b=`expr $a \< 10`
 echo "If a < 10, b = 1 (true)"
 echo "b = $b"            # 1  ( 3 -lt 10 )
 # Note escaping of operators.
 b=`expr $a \<= 3`
 echo "If a <= 3, b = 1 (true)"
 echo "b = $b"            # 1  ( 3 -le 3 )
 # There is also a "\>=" operator (greater than or equal to).
 # String Operators
 # ------ ---------
 echo "String Operators"
 a=1234zipper43231
 echo "The string being operated upon is \"$a\"."
 # length: length of string
 b=`expr length $a`
 echo "Length of \"$a\" is $b."
 # index: position of first character in substring
 #        that matches a character in string
 b=`expr index $a 23`
 echo "Numerical position of first \"2\" in \"$a\" is \"$b\"."
 # substr: extract substring, starting position & length specified
 b=`expr substr $a 2 6`
 echo "Substring of \"$a\", starting at position 2,\
 and 6 chars long is \"$b\"."
 #  The default behavior of the 'match' operations is to
 #+ search for the specified match at the ***beginning*** of the string.
 #        uses Regular Expressions
 b=`expr match "$a" '[0-9]*'`               #  Numerical count.
 echo Number of digits at the beginning of \"$a\" is $b.
 b=`expr match "$a" '\([0-9]*\)'`           #  Note that escaped parentheses
 #                   ==      ==              + trigger substring match.
 echo "The digits at the beginning of \"$a\" are \"$b\"."
 exit 0


The : operator can substitute for match. For example, b=`expr $a : [0-9]*` is the exact equivalent of b=`expr match $a [0-9]*` in the above listing.
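A quick check of the equivalence, using an arbitrary test string (match, like the other string operators, is a GNU extension to expr):

```shell
a=1234zipper43231
b=`expr "$a" : '[0-9]*'`       # Colon form.
c=`expr match "$a" '[0-9]*'`   # 'match' keyword form.
echo "$b $c"                   # 4 4 -- both count the leading digits.
```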

 echo "String operations using \"expr \$string : \" construct"
 echo "==================================================="
 echo "The string being operated upon is \"`expr "$a" : '\(.*\)'`\"."
 #     Escaped parentheses grouping operator.            ==  ==
 #       ***************************
 #+          Escaped parentheses
 #+           match a substring
 #       ***************************
 #  If no escaped parentheses...
 #+ then 'expr' converts the string operand to an integer.
 echo "Length of \"$a\" is `expr "$a" : '.*'`."   # Length of string
 echo "Number of digits at the beginning of \"$a\" is `expr "$a" : '[0-9]*'`."
 # ------------------------------------------------------------------------- #
 echo "The digits at the beginning of \"$a\" are `expr "$a" : '\([0-9]*\)'`."
 #                                                             ==      ==
 echo "The first 7 characters of \"$a\" are `expr "$a" : '\(.......\)'`."
 #         =====                                          ==       ==
 # Again, escaped parentheses force a substring match.
 echo "The last 7 characters of \"$a\" are `expr "$a" : '.*\(.......\)'`."
 #         ====                  end of string operator  ^^
 #  (actually means skip over one or more of any characters until specified
 #+  substring)
 exit 0

Time / Date Commands

Time/date and timing


date

Simply invoked, date prints the date and time to stdout. Where this command gets interesting is in its formatting and parsing options.

Example 12-10. Using date

 # Exercising the 'date' command
 echo "The number of days since the year's beginning is `date +%j`."
 # Needs a leading '+' to invoke formatting.
 # %j gives day of year.
 echo "The number of seconds elapsed since 01/01/1970 is `date +%s`."
 #  %s yields number of seconds since "UNIX epoch" began,
 #+ but how is this useful?
 prefix=temp
 suffix=$(date +%s)  # The "+%s" option to 'date' is GNU-specific.
 filename=$prefix.$suffix
 echo $filename
 #  It's great for creating "unique" temp filenames,
 #+ even better than using $$.
 # Read the 'date' man page for more formatting options.
 exit 0

The -u option gives the UTC (Coordinated Universal Time).

bash$ date
 Fri Mar 29 21:07:39 MST 2002
 bash$ date -u
 Sat Mar 30 04:07:42 UTC 2002

The date command has quite a number of output options. For example %N gives the nanosecond portion of the current time. One interesting use for this is to generate six-digit random integers.
date +%N | sed -e 's/000$//' -e 's/^0//'
 # Strip off leading and trailing zeroes, if present.

There are many more options (try man date).
date +%j
 # Echoes day of the year (days elapsed since January 1).
 date +%k%M
 # Echoes hour and minute in 24-hour format, as a single digit string.
 # The 'TZ' parameter permits overriding the default time zone.
 date                 # Mon Mar 28 21:42:16 MST 2005
 TZ=EST date          # Mon Mar 28 23:42:16 EST 2005
 # Thanks, Frank Kannemann and Pete Sjoberg, for the tip.
 SixDaysAgo=$(date --date='6 days ago')
 OneMonthAgo=$(date --date='1 month ago')  # Four weeks back (not a month).
 OneYearAgo=$(date --date='1 year ago')

See also Example 3-4.


zdump

Time zone dump: echoes the time in a specified time zone.

bash$ zdump EST
 EST  Tue Sep 18 22:09:22 2001 EST


time

Outputs very verbose timing statistics for executing a command.

time ls -l / gives something like this:
0.00user 0.01system 0:00.05elapsed 16%CPU (0avgtext+0avgdata 0maxresident)k
  0inputs+0outputs (149major+27minor)pagefaults 0swaps

See also the very similar times command in the previous section.


As of version 2.0 of Bash, time became a shell reserved word, with slightly altered behavior in a pipeline.


touch

Utility for updating access/modification times of a file to current system time or other specified time, but also useful for creating a new file. The command touch zzz will create a new file of zero length, named zzz, assuming that zzz did not previously exist. Time-stamping empty files in this way is useful for storing date information, for example in keeping track of modification times on a project.


The touch command is equivalent to : >> newfile or >> newfile (for ordinary files).
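A sketch of the equivalence, run in a throwaway scratch directory (created here with mktemp -d; the filenames are arbitrary):

```shell
workdir=$(mktemp -d) && cd "$workdir"
touch zzz        # Creates zero-length file 'zzz'.
: >> zzz2        # Null command + append redirection: same effect.
ls -l zzz zzz2   # Both files exist, both zero length.
```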


at

The at job control command executes a given set of commands at a specified time. Superficially, it resembles cron; however, at is chiefly useful for one-time execution of a command set.

at 2pm January 15 prompts for a set of commands to execute at that time. These commands should be shell-script compatible, since, for all practical purposes, the user is typing in an executable shell script a line at a time. Input terminates with a Ctl-D.

Using either the -f option or input redirection (<), at reads a command list from a file. This file is an executable shell script, though it should, of course, be noninteractive. Particularly clever is including the run-parts command in the file to execute a different set of scripts.

bash$ at 2:30 am Friday < at-jobs.list
 job 2 at 2000-10-27 02:30


batch

The batch job control command is similar to at, but it runs a command list when the system load drops below .8. Like at, it can read commands from a file with the -f option.


cal

Prints a neatly formatted monthly calendar to stdout. Will do current year or a large range of past and future years.


sleep

This is the shell equivalent of a wait loop. It pauses for a specified number of seconds, doing nothing. It can be useful for timing or in processes running in the background, checking for a specific event every so often (polling), as in Example 29-6.
sleep 3     # Pauses 3 seconds.


The sleep command defaults to seconds, but minutes, hours, or days may also be specified with a suffix.
sleep 3h    # Pauses 3 hours!


The watch command may be a better choice than sleep for running commands at timed intervals.


usleep

Microsleep (the "u" may be read as the Greek "mu", or micro- prefix). This is the same as sleep, above, but "sleeps" in microsecond intervals. It can be used for fine-grained timing, or for polling an ongoing process at very frequent intervals.

usleep 30     # Pauses 30 microseconds.

This command is part of the Red Hat initscripts / rc-scripts package.


The usleep command does not provide particularly accurate timing, and is therefore unsuitable for critical timing loops.

hwclock, clock

The hwclock command accesses or adjusts the machine's hardware clock. Some options require root privileges. The /etc/rc.d/rc.sysinit startup file uses hwclock to set the system time from the hardware clock at bootup.

The clock command is a synonym for hwclock.

Text Processing Commands

Commands affecting text and text files


sort

File sorter, often used as a filter in a pipe. This command sorts a text stream or file forwards or backwards, or according to various keys or character positions. Using the -m option, it merges presorted input files. The info page lists its many capabilities and options. See Example 10-9, Example 10-10, and Example A-8.


tsort

Topological sort, reading in pairs of whitespace-separated strings and sorting according to input patterns.


uniq

This filter removes duplicate lines from a sorted file. It is often seen in a pipe coupled with sort.
cat list-1 list-2 list-3 | sort | uniq > final.list
 # Concatenates the list files,
 # sorts them,
 # removes duplicate lines,
 # and finally writes the result to an output file.

The useful -c option prefixes each line of the input file with its number of occurrences.

bash$ cat testfile
 This line occurs only once.
  This line occurs twice.
  This line occurs twice.
  This line occurs three times.
  This line occurs three times.
  This line occurs three times.
 bash$ uniq -c testfile
       1 This line occurs only once.
        2 This line occurs twice.
        3 This line occurs three times.
 bash$ sort testfile | uniq -c | sort -nr
       3 This line occurs three times.
        2 This line occurs twice.
        1 This line occurs only once.

The sort INPUTFILE | uniq -c | sort -nr command string produces a frequency of occurrence listing on the INPUTFILE file (the -nr options to sort cause a reverse numerical sort). This template finds use in analysis of log files and dictionary lists, and wherever the lexical structure of a document needs to be examined.

Example 12-11. Word Frequency Analysis

 # Crude word frequency analysis on a text file.
 # This is a more efficient version of the script in Example 12-8.
 ARGS=1         # Script expects one argument.
 E_BADARGS=65
 E_NOFILE=66
 # Check for input file on command line.
 if [ $# -ne "$ARGS" ]  # Correct number of arguments passed to script?
 then
   echo "Usage: `basename $0` filename"
   exit $E_BADARGS
 fi
 if [ ! -f "$1" ]       # Check if file exists.
 then
   echo "File \"$1\" does not exist."
   exit $E_NOFILE
 fi
 # main ()
 sed -e 's/\.//g'  -e 's/\,//g' -e 's/ /\
 /g' "$1" | tr 'A-Z' 'a-z' | sort | uniq -c | sort -nr
 #                           =========================
 #                            Frequency of occurrence
 #  Filter out periods and commas, and
 #+ change space between words to linefeed,
 #+ then shift characters to lowercase, and
 #+ finally prefix occurrence count and sort numerically.
 #  Arun Giridhar suggests modifying the above to:
 #  . . . | sort | uniq -c | sort +1 [-f] | sort +0 -nr
 #  This adds a secondary sort key, so instances of
 #+ equal occurrence are sorted alphabetically.
 #  As he explains it:
 #  "This is effectively a radix sort, first on the
 #+ least significant column
 #+ (word or string, optionally case-insensitive)
 #+ and last on the most significant column (frequency)."
 exit 0
 # Exercises:
 # ---------
 # 1) Add 'sed' commands to filter out other punctuation,
 #+   such as semicolons.
 # 2) Modify the script to also filter out multiple spaces and
 #    other whitespace.

bash$ cat testfile
 This line occurs only once.
  This line occurs twice.
  This line occurs twice.
  This line occurs three times.
  This line occurs three times.
  This line occurs three times.
 bash$ ./ testfile
       6 this
        6 occurs
        6 line
        3 times
        3 three
        2 twice
        1 only
        1 once

expand, unexpand

The expand filter converts tabs to spaces. It is often used in a pipe.

The unexpand filter converts spaces to tabs. This reverses the effect of expand.
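A minimal sketch (tab stops set to 4 here, chosen arbitrarily):

```shell
printf 'a\tb\n' | expand -t 4                   # "a   b" -- the tab becomes spaces.
printf 'a\tb\n' | expand -t 4 | unexpand -t 4   # Round trip restores the tab.
```

Note that by default unexpand converts only leading whitespace; the -t option implies -a (convert all blanks).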


cut

A tool for extracting fields from files. It is similar to the print $N command set in awk, but more limited. It may be simpler to use cut in a script than awk. Particularly important are the -d (delimiter) and -f (field specifier) options.

Using cut to obtain a listing of the mounted filesystems:
cat /etc/mtab | cut -d ' ' -f1,2

Using cut to list the OS and kernel version:
uname -a | cut -d" " -f1,3,11,12

Using cut to extract message headers from an e-mail folder:
bash$ grep '^Subject:' read-messages | cut -c10-80
 Re: Linux suitable for mission-critical apps?
  Spam complaint
  Re: Spam complaint

Using cut to parse a file:
 # List all the users in /etc/passwd.
 FILENAME=/etc/passwd
 for user in $(cut -d: -f1 $FILENAME)
 do
   echo $user
 done
 # Thanks, Oleg Philon for suggesting this.

cut -d ' ' -f2,3 filename is equivalent to awk -F'[ ]' '{ print $2, $3 }' filename
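The equivalence can be checked on a throwaway line of text:

```shell
line="alpha beta gamma delta"
echo "$line" | cut -d ' ' -f2,3                # beta gamma
echo "$line" | awk -F'[ ]' '{ print $2, $3 }'  # beta gamma (identical output)
```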

See also Example 12-42.


paste

Tool for merging together different files into a single, multi-column file. In combination with cut, useful for creating system log files.


join

Consider this a special-purpose cousin of paste. This powerful utility allows merging two files in a meaningful fashion, which essentially creates a simple version of a relational database.

The join command operates on exactly two files, but pastes together only those lines with a common tagged field (usually a numerical label), and writes the result to stdout. The files to be joined should be sorted according to the tagged field for the matchups to work properly.

File: 1.data

 100 Shoes
 200 Laces
 300 Socks

File: 2.data

 100 $40.00
 200 $1.00
 300 $2.00

bash$ join 1.data 2.data
  100 Shoes $40.00
  200 Laces $1.00
  300 Socks $2.00


The tagged field appears only once in the output.
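The listing above can be reproduced with a self-contained sketch (the filenames 1.data and 2.data are arbitrary; join matches on the first field by default):

```shell
workdir=$(mktemp -d) && cd "$workdir"        # Throwaway scratch directory.
printf '%s\n' '100 Shoes' '200 Laces' '300 Socks'  > 1.data
printf '%s\n' '100 $40.00' '200 $1.00' '300 $2.00' > 2.data
join 1.data 2.data
# 100 Shoes $40.00
# 200 Laces $1.00
# 300 Socks $2.00
```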


head

Lists the beginning of a file to stdout (the default is 10 lines, but this can be changed). It has a number of interesting options.

Example 12-12. Which files are scripts?

 # Detects scripts within a directory.
 TESTCHARS=2    # Test first 2 characters.
 SHABANG='#!'   # Scripts begin with a "sha-bang."
 for file in *  # Traverse all the files in current directory.
 do
   if [[ `head -c$TESTCHARS "$file"` = "$SHABANG" ]]
   #      head -c2                      #!
   #  The '-c' option to "head" outputs a specified
   #+ number of characters, rather than lines (the default).
   then
     echo "File \"$file\" is a script."
   else
     echo "File \"$file\" is *not* a script."
   fi
 done
 exit 0
 #  Exercises:
 #  ---------
 #  1) Modify this script to take as an optional argument
 #+    the directory to scan for scripts
 #+    (rather than just the current working directory).
 #  2) As it stands, this script gives "false positives" for
 #+    Perl, awk, and other scripting language scripts.
 #     Correct this.

Example 12-13. Generating 10-digit random numbers

 # Outputs a 10-digit random number
 # Script by Stephane Chazelas.
 head -c4 /dev/urandom | od -N4 -tu4 | sed -ne '1s/.* //p'
 # =================================================================== #
 # Analysis
 # --------
 # head:
 # -c4 option takes first 4 bytes.
 # od:
 # -N4 option limits output to 4 bytes.
 # -tu4 option selects unsigned decimal format for output.
 # sed: 
 # -n option, in combination with "p" flag to the "s" command,
 # outputs only matched lines.
 # The author of this script explains the action of 'sed', as follows.
 # head -c4 /dev/urandom | od -N4 -tu4 | sed -ne '1s/.* //p'
 # ----------------------------------> |
 # Assume output up to "sed" --------> |
 # is 0000000 1198195154\n
 #  sed begins reading characters: 0000000 1198195154\n.
 #  Here it finds a newline character,
 #+ so it is ready to process the first line (0000000 1198195154).
 #  It looks at its <range><action>s. The first and only one is
 #   range     action
 #   1         s/.* //p
 #  The line number is in the range, so it executes the action:
 #+ tries to substitute the longest string ending with a space in the line
 #  ("0000000 ") with nothing (//), and if it succeeds, prints the result
 #  ("p" is a flag to the "s" command here, this is different from the "p" command).
 #  sed is now ready to continue reading its input. (Note that before
 #+ continuing, if -n option had not been passed, sed would have printed
 #+ the line once again).
 # Now, sed reads the remainder of the characters, and finds the end of the file.
 # It is now ready to process its 2nd line (which is also numbered '$' as
 # it's the last one).
 # It sees it is not matched by any <range>, so its job is done.
 #  In a few words, this sed command means:
 #  "On the first line only, remove any character up to the right-most space,
 #+ then print it."
 # A better way to do this would have been:
 #           sed -e 's/.* //;q'
 # Here, two <range><action>s (could have been written
 #           sed -e 's/.* //' -e q):
 #   range                    action
 #   nothing (matches line)   s/.* //
 #   nothing (matches line)   q (quit)
 #  Here, sed only reads its first line of input.
 #  It performs both actions, and prints the line (substituted) before quitting
 #+ (because of the "q" action) since the "-n" option is not passed.
 # =================================================================== #
 # An even simpler alternative to the above one-line script would be:
 #           head -c4 /dev/urandom | od -An -tu4
 exit 0
See also Example 12-35.


tail

Lists the end of a file to stdout (the default is 10 lines). Commonly used to keep track of changes to a system logfile, using the -f option, which outputs lines appended to the file.

Example 12-14. Using tail to monitor the system log

 filename=sys.log
 cat /dev/null > $filename; echo "Creating / cleaning out file."
 #  Creates file if it does not already exist,
 #+ and truncates it to zero length if it does.
 #  : > filename   and   > filename also work.
 tail /var/log/messages > $filename  
 # /var/log/messages must have world read permission for this to work.
 echo "$filename contains tail end of system log."
 exit 0


To list a specific line of a text file, pipe the output of head to tail -1. For example head -8 database.txt | tail -1 lists the 8th line of the file database.txt.

To set a variable to a given block of a text file:
var=$(head -$m $filename | tail -$n)
 # filename = name of file
 # m = from beginning of file, number of lines to end of block
 # n = number of lines to set variable to (trim from end of block)
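A sketch with concrete numbers (the file contents and the m/n values are arbitrary): extracting a 2-line block ending at line 4 yields lines 3 and 4.

```shell
filename=$(mktemp)   # Throwaway demo file.
printf '%s\n' line1 line2 line3 line4 line5 > "$filename"
m=4; n=2
var=$(head -$m "$filename" | tail -$n)   # Lines 3 and 4.
echo "$var"
rm "$filename"
```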

See also Example 12-5, Example 12-35 and Example 29-6.


grep

A multi-purpose file search tool that uses Regular Expressions. It was originally a command/filter in the venerable ed line editor: g/re/p -- global - regular expression - print.

grep pattern [file...]

Search the target file(s) for occurrences of pattern, where pattern may be literal text or a Regular Expression.

bash$ grep '[rst]ystem.$' osinfo.txt
 The GPL governs the distribution of the Linux operating system.

If no target file(s) specified, grep works as a filter on stdin, as in a pipe.

bash$ ps ax | grep clock
 765 tty1     S      0:00 xclock
  901 pts/1    S      0:00 grep clock

The -i option causes a case-insensitive search.

The -w option matches only whole words.

The -l option lists only the files in which matches were found, but not the matching lines.

The -r (recursive) option searches files in the current working directory and all subdirectories below it.

The -n option lists the matching lines, together with line numbers.

bash$ grep -n Linux osinfo.txt
 2:This is a file containing information about Linux.
  6:The GPL governs the distribution of the Linux operating system.

The -v (or --invert-match) option filters out matches.
grep pattern1 *.txt | grep -v pattern2
 # Matches all lines in "*.txt" files containing "pattern1",
 # but ***not*** "pattern2".	      

The -c (--count) option gives a numerical count of matches, rather than actually listing the matches.
grep -c txt *.sgml   # (number of occurrences of "txt" in "*.sgml" files)
 #   grep -cz .
 #            ^ dot
 # means count (-c) zero-separated (-z) items matching "."
 # that is, non-empty ones (containing at least 1 character).
 printf 'a b\nc  d\n\n\n\n\n\000\n\000e\000\000\nf' | grep -cz .     # 4
 printf 'a b\nc  d\n\n\n\n\n\000\n\000e\000\000\nf' | grep -cz '$'   # 5
 printf 'a b\nc  d\n\n\n\n\n\000\n\000e\000\000\nf' | grep -cz '^'   # 5
 printf 'a b\nc  d\n\n\n\n\n\000\n\000e\000\000\nf' | grep -c '$'    # 9
 # By default, newline chars (\n) separate items to match. 
 # Note that the -z option is GNU "grep" specific.
 # Thanks, S.C.

When invoked with more than one target file, grep specifies which file contains matches.

bash$ grep Linux osinfo.txt misc.txt
 osinfo.txt:This is a file containing information about Linux.
  osinfo.txt:The GPL governs the distribution of the Linux operating system.
  misc.txt:The Linux operating system is steadily gaining in popularity.


To force grep to show the filename when searching only one target file, simply give /dev/null as the second file.

bash$ grep Linux osinfo.txt /dev/null
 osinfo.txt:This is a file containing information about Linux.
  osinfo.txt:The GPL governs the distribution of the Linux operating system.

If there is a successful match, grep returns an exit status of 0, which makes it useful in a condition test in a script, especially in combination with the -q option to suppress output.
SUCCESS=0                      # if grep lookup succeeds
 word=Linux
 filename=data.file
 grep -q "$word" "$filename"    # The "-q" option causes nothing to echo to stdout.
 if [ $? -eq $SUCCESS ]
 # if grep -q "$word" "$filename"   can replace lines 5 - 7.
 then
   echo "$word found in $filename"
 else
   echo "$word not found in $filename"
 fi

Example 29-6 demonstrates how to use grep to search for a word pattern in a system logfile.

Example 12-15. Emulating "grep" in a script

 # Very crude reimplementation of 'grep'.
 E_BADARGS=65
 if [ -z "$1" ]    # Check for argument to script.
 then
   echo "Usage: `basename $0` pattern"
   exit $E_BADARGS
 fi
 for file in *     # Traverse all files in $PWD.
 do
   output=$(sed -n /"$1"/p $file)  # Command substitution.
   if [ ! -z "$output" ]           # What happens if "$output" is not quoted?
   then
     echo -n "$file: "
     echo $output
   fi              #  sed -ne "/$1/s|^|${file}: |p"  is equivalent to above.
 done
 exit 0
 # Exercises:
 # ---------
 # 1) Add newlines to output, if more than one match in any given file.
 # 2) Add features.

How can grep search for two (or more) separate patterns? What if you want grep to display all lines in a file or files that contain both "pattern1" and "pattern2"?

One method is to pipe the result of grep pattern1 to grep pattern2.

For example, given the following file:

# Filename: tstfile
 This is a sample file.
 This is an ordinary text file.
 This file does not contain any unusual text.
 This file is not unusual.
 Here is some text.

Now, let's search this file for lines containing both "file" and "text" . . .

bash$ grep file tstfile
 # Filename: tstfile
  This is a sample file.
  This is an ordinary text file.
  This file does not contain any unusual text.
  This file is not unusual.
 bash$ grep file tstfile | grep text
 This is an ordinary text file.
  This file does not contain any unusual text.


egrep - extended grep - is the same as grep -E. This uses a somewhat different, extended set of Regular Expressions, which can make the search a bit more flexible.

fgrep - fast grep - is the same as grep -F. It does a literal string search (no Regular Expressions), which usually speeds things up a bit.


On some Linux distros, egrep and fgrep are symbolic links to, or aliases for grep, but invoked with the -E and -F options, respectively.
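A quick sketch of the difference on a three-line input (the test strings are arbitrary):

```shell
printf 'cat\ncot\nc.t\n' | grep 'c.t'          # All three lines (. is a BRE wildcard).
printf 'cat\ncot\nc.t\n' | grep -E 'c(a|o)t'   # cat, cot (ERE alternation).
printf 'cat\ncot\nc.t\n' | grep -F 'c.t'       # Only the literal "c.t" line.
```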

Example 12-16. Looking up definitions in Webster's 1913 Dictionary

 #  This script looks up definitions in the 1913 Webster's Dictionary.
 #  This Public Domain dictionary is available for download
 #+ from various sites, including
 #+ Project Gutenberg.
 #  Convert it from DOS to UNIX format (only LF at end of line)
 #+ before using it with this script.
 #  Store the file in plain, uncompressed ASCII.
 #  Set DEFAULT_DICTFILE variable below to path/filename.
 E_BADARGS=65
 MAXCONTEXTLINES=50                        # Maximum number of lines to show.
 DEFAULT_DICTFILE="/usr/share/dict/webster1913-dict.txt"
                                           # Default dictionary file pathname.
                                           # Change this as necessary.
 #  Note:
 #  ----
 #  This particular edition of the 1913 Webster's
 #+ begins each entry with an uppercase letter
 #+ (lowercase for the remaining characters).
 #  Only the *very first line* of an entry begins this way,
 #+ and that's why the search algorithm below works.
 if [[ -z $(echo "$1" | sed -n '/^[A-Z]/p') ]]
 #  Must at least specify word to look up, and
 #+ it must start with an uppercase letter.
 then
   echo "Usage: `basename $0` Word-to-define [dictionary-file]"
   echo "Note: Word to look up must start with capital letter,"
   echo "with the rest of the word in lowercase."
   echo "--------------------------------------------"
   echo "Examples: Abandon, Dictionary, Marking, etc."
   exit $E_BADARGS
 fi
 if [ -z "$2" ]                            #  May specify different dictionary
                                           #+ as an argument to this script.
 then
   dictfile=$DEFAULT_DICTFILE
 else
   dictfile="$2"
 fi
 # ---------------------------------------------------------
 Definition=$(fgrep -A $MAXCONTEXTLINES "$1 \\" "$dictfile")
 #                  Definitions in form "Word \..."
 #  And, yes, "fgrep" is fast enough
 #+ to search even a very large text file.
 # Now, snip out just the definition block.
 echo "$Definition" |
 sed -n '1,/^[A-Z]/p' |
 #  Print from first line of output
 #+ to the first line of the next entry.
 sed '$d' | sed '$d'
 #  Delete last two lines of output
 #+ (blank line and first line of next entry).
 # ---------------------------------------------------------
 exit 0
 # Exercises:
 # ---------
 # 1)  Modify the script to accept any type of alphabetic input
 #   + (uppercase, lowercase, mixed case), and convert it
 #   + to an acceptable format for processing.
 # 2)  Convert the script to a GUI application,
 #   + using something like "gdialog" . . .
 #     The script will then no longer take its argument(s)
 #   + from the command line.
 # 3)  Modify the script to parse one of the other available
 #   + Public Domain Dictionaries, such as the U.S. Census Bureau Gazetteer.

agrep (approximate grep) extends the capabilities of grep to approximate matching. The search string may differ by a specified number of characters from the resulting matches. This utility is not part of the core Linux distribution.


To search compressed files, use zgrep, zegrep, or zfgrep. These also work on non-compressed files, though slower than plain grep, egrep, fgrep. They are handy for searching through a mixed set of files, some compressed, some not.

To search bzipped files, use bzgrep.


The command look works like grep, but does a lookup on a "dictionary", a sorted word list. By default, look searches for a match in /usr/dict/words, but a different dictionary file may be specified.

Example 12-17. Checking words in a list for validity

 # lookup: Does a dictionary lookup on each word in a data file.
 file=words.data           # Data file from which to read words to test.
 while [ "$word" != end ]  # Last word in data file.
 do
   read word      # From data file, because of redirection at end of loop.
   look $word > /dev/null  # Don't want to display lines in dictionary file.
   lookup=$?      # Exit status of 'look' command.
   if [ "$lookup" -eq 0 ]
   then
     echo "\"$word\" is valid."
   else
     echo "\"$word\" is invalid."
   fi
 done <"$file"    # Redirects stdin to $file, so "reads" come from there.
 exit 0
 # ----------------------------------------------------------------
 # Code below line will not execute because of "exit" command above.
 # Stephane Chazelas proposes the following, more concise alternative:
 while read word && [[ $word != end ]]
 do if look "$word" > /dev/null
    then echo "\"$word\" is valid."
    else echo "\"$word\" is invalid."
    fi
 done <"$file"
 exit 0
sed, awk

Scripting languages especially suited for parsing text files and command output. May be embedded singly or in combination in pipes and shell scripts.


sed

Non-interactive "stream editor", permits using many ex commands in batch mode. It finds many uses in shell scripts.


awk

Programmable file extractor and formatter, good for manipulating and/or extracting fields (columns) in structured text files. Its syntax is similar to C.


wc

wc gives a "word count" on a file or I/O stream:
bash$ wc /usr/share/doc/sed-4.1.2/README
 13  70  447 README
 [13 lines  70 words  447 characters]

wc -w gives only the word count.

wc -l gives only the line count.

wc -c gives only the byte count.

wc -m gives only the character count.

wc -L gives only the length of the longest line.
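The options above can be sketched on a tiny two-line input (counts verified by hand: 2 lines, 3 words, 14 bytes; note that -L is GNU-specific):

```shell
printf 'one two\nthree\n' | wc -l    # 2  (lines)
printf 'one two\nthree\n' | wc -w    # 3  (words)
printf 'one two\nthree\n' | wc -c    # 14 (bytes)
printf 'one two\nthree\n' | wc -L    # 7  (longest line: "one two")
```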

Using wc to count how many .txt files are in current working directory:
bash$ ls *.txt | wc -l
 # Will work as long as none of the "*.txt" files have a linefeed in their name.
 # Alternative ways of doing this are:
 #      find . -maxdepth 1 -name \*.txt -print0 | grep -cz .
 #      (shopt -s nullglob; set -- *.txt; echo $#)
 # Thanks, S.C.

Using wc to total up the size of all the files whose names begin with letters in the range d - h
bash$ wc [d-h]* | grep total | awk '{print $3}'

Using wc to count the instances of the word "Linux" in the main source file for this book.
bash$ grep Linux abs-book.sgml | wc -l

See also Example 12-35 and Example 16-8.

Certain commands include some of the functionality of wc as options.
... | grep foo | wc -l
 # This frequently used construct can be more concisely rendered.
 ... | grep -c foo
 # Just use the "-c" (or "--count") option of grep.
 # Thanks, S.C.


tr

Character translation filter.


Must use quoting and/or brackets, as appropriate. Quotes prevent the shell from reinterpreting the special characters in tr command sequences. Brackets should be quoted to prevent expansion by the shell.

Either tr "A-Z" "*" <filename or tr A-Z \* <filename changes all the uppercase letters in filename to asterisks (writes to stdout). On some systems this may not work, but tr A-Z '[**]' will.

The -d option deletes a range of characters.
echo "abcdef"                 # abcdef
 echo "abcdef" | tr -d b-d     # aef
 tr -d 0-9 <filename
 # Deletes all digits from the file "filename".

The --squeeze-repeats (or -s) option deletes all but the first instance of a string of consecutive characters. This option is useful for removing excess whitespace.
bash$ echo "XXXXX" | tr --squeeze-repeats 'X'
 X
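For the whitespace case, a minimal sketch:

```shell
echo "too    many    spaces" | tr -s ' '   # too many spaces
```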

The -c "complement" option inverts the character set to match. With this option, tr acts only upon those characters not matching the specified set.

bash$ echo "acfdeb123" | tr -c b-d +
 +c+d+b++++

Note that tr recognizes POSIX character classes. [1]

bash$ echo "abcd2ef1" | tr '[:alpha:]' -
 ----2--1

Example 12-18. toupper: Transforms a file to all uppercase.

 # Changes a file to all uppercase.
 E_BADARGS=65
 if [ -z "$1" ]  # Standard check for command line arg.
 then
   echo "Usage: `basename $0` filename"
   exit $E_BADARGS
 fi
 tr a-z A-Z <"$1"
 # Same effect as above, but using POSIX character set notation:
 #        tr '[:lower:]' '[:upper:]' <"$1"
 # Thanks, S.C.
 exit 0
 #  Exercise:
 #  Rewrite this script to give the option of changing a file
 #+ to *either* upper or lowercase.

Example 12-19. lowercase: Changes all filenames in working directory to lowercase.

 #  Changes every filename in working directory to all lowercase.
 #  Inspired by a script of John Dubois,
 #+ which was translated into Bash by Chet Ramey,
 #+ and considerably simplified by the author of the ABS Guide.
 for filename in *                # Traverse all files in directory.
 do
    fname=`basename $filename`
    n=`echo $fname | tr A-Z a-z`  # Change name to lowercase.
    if [ "$fname" != "$n" ]       # Rename only files not already lowercase.
    then
      mv $fname $n
    fi
 done
 exit $?
 # Code below this line will not execute because of "exit".
 # To run it, delete script above line.
 # The above script will not work on filenames containing blanks or newlines.
 # Stephane Chazelas therefore suggests the following alternative:
 for filename in *    # Not necessary to use basename,
                      # since "*" won't return any file containing "/".
 do n=`echo "$filename/" | tr '[:upper:]' '[:lower:]'`
 #                             POSIX char set notation.
 #                    Slash added so that trailing newlines are not
 #                    removed by command substitution.
    # Variable substitution:
    n=${n%/}          # Removes trailing slash, added above, from filename.
    [[ $filename == $n ]] || mv "$filename" "$n"
                      # Checks if filename already lowercase.
 exit $?

Example 12-20. Du: DOS to UNIX text file conversion.

 #!/bin/bash
 # DOS to UNIX text file converter.

 E_WRONGARGS=65

 if [ -z "$1" ]
 then
   echo "Usage: `basename $0` filename-to-convert"
   exit $E_WRONGARGS
 fi

 NEWFILENAME=$1.unx

 CR='\015'  # Carriage return.
            # 015 is octal ASCII code for CR.
            # Lines in a DOS text file end in CR-LF.
            # Lines in a UNIX text file end in LF only.

 tr -d $CR < $1 > $NEWFILENAME
 # Delete CR's and write to new file.

 echo "Original DOS text file is \"$1\"."
 echo "Converted UNIX text file is \"$NEWFILENAME\"."

 exit 0

 # Exercise:
 # --------
 # Change the above script to convert from UNIX to DOS.
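The core of the conversion is easy to try directly at the prompt; this sketch fabricates a single DOS-style line rather than using a real file:

```shell
# A DOS line ends in CR-LF; a UNIX line in LF only.
printf 'hello\r\n' | wc -c                 # 7 bytes: 5 letters + CR + LF
printf 'hello\r\n' | tr -d '\015' | wc -c  # 6 bytes: the CR (octal 015) is deleted
```

Counting bytes with wc -c makes the deleted carriage return visible even though it does not print.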

Example 12-21. rot13: rot13, ultra-weak encryption.

 #!/bin/bash
 # Classic rot13 algorithm,
 #           encryption that might fool a 3-year old.

 # Usage: ./rot13.sh filename
 # or     ./rot13.sh <filename
 # or     ./rot13.sh and supply keyboard input (stdin)

 cat "$@" | tr 'a-zA-Z' 'n-za-mN-ZA-M'   # "a" goes to "n", "b" to "o", etc.
 #  The 'cat "$@"' construction
 #+ permits getting input either from stdin or from files.

 exit 0
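Since rot13 shifts each letter exactly half the alphabet, it is its own inverse: running the same tr mapping twice restores the original text.

```shell
# rot13 maps a->n ... m->z, n->a ... z->m, so a second pass undoes the first.
echo "Secret" | tr 'a-zA-Z' 'n-za-mN-ZA-M'
# → Frperg
echo "Secret" | tr 'a-zA-Z' 'n-za-mN-ZA-M' | tr 'a-zA-Z' 'n-za-mN-ZA-M'
# → Secret  (round trip)
```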

Example 12-22. Generating "Crypto-Quote" Puzzles

 #!/bin/bash
 # Encrypt quotes

 #  Will encrypt famous quotes in a simple monoalphabetic substitution.
 #  The result is similar to the "Crypto Quote" puzzles
 #+ seen in the Op Ed pages of the Sunday paper.

 key=ETAOINSHRDLUBCFGJMQPVWZYXK
 # The "key" is nothing more than a scrambled alphabet.
 # Changing the "key" changes the encryption.

 # The 'cat "$@"' construction gets input either from stdin or from files.
 # If using stdin, terminate input with a Control-D.
 # Otherwise, specify filename as command-line parameter.

 cat "$@" | tr "a-z" "A-Z" | tr "A-Z" "$key"
 #        |  to uppercase  |     encrypt
 # Will work on lowercase, uppercase, or mixed-case quotes.
 # Passes non-alphabetic characters through unchanged.

 # Try this script with something like:
 # "Nothing so needs reforming as other people's habits."
 # --Mark Twain
 #
 # Output is:
 # "CFPHRCS QF CIIOQ MINFMBRCS EQ FPHIM GIFGUI'Q HETRPQ."
 # --BEML PZERC

 # To reverse the encryption:
 # cat "$@" | tr "$key" "A-Z"

 #  This simple-minded cipher can be broken by an average 12-year old
 #+ using only pencil and paper.

 exit 0

 #  Exercise:
 #  --------
 #  Modify the script so that it will either encrypt or decrypt,
 #+ depending on command-line argument(s).

fold

A filter that wraps lines of input to a specified width. This is especially useful with the -s option, which breaks lines at word spaces (see Example 12-23 and Example A-1).
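The effect of -s is easy to see at the prompt; the string and width below are arbitrary choices for the demonstration:

```shell
# "hello world" is 11 characters; wrap it to an 8-column width.
echo "hello world" | fold -w 8      # breaks at column 8, mid-word: second line is "rld"
echo "hello world" | fold -w 8 -s   # breaks at the space instead: second line is "world"
```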


fmt

Simple-minded file formatter, used as a filter in a pipe to "wrap" long lines of text output.

Example 12-23. Formatted file listing.

 #!/bin/bash

 WIDTH=40                    # 40 columns wide.

 b=`ls /usr/local/bin`       # Get a file listing...

 echo $b | fmt -w $WIDTH

 # Could also have been done by
 #    echo $b | fold - -s -w $WIDTH

 exit 0

See also Example 12-5.


A powerful alternative to fmt is Kamil Toman's par utility.


col

This deceptively named filter removes reverse line feeds from an input stream. It also attempts to replace whitespace with equivalent tabs. The chief use of col is in filtering the output from certain text processing utilities, such as groff and tbl.


column

Column formatter. This filter transforms list-type text output into a "pretty-printed" table by inserting tabs at appropriate places.

Example 12-24. Using column to format a directory listing

 #!/bin/bash
 # This is a slight modification of the example file in the "column" man page.

 (printf "PERMISSIONS LINKS OWNER GROUP SIZE MONTH DAY HH:MM PROG-NAME\n" \
 ; ls -l | sed 1d) | column -t

 #  The "sed 1d" in the pipe deletes the first line of output,
 #+ which would be "total        N",
 #+ where "N" is the total number of files found by "ls -l".

 # The -t option to "column" pretty-prints a table.

 exit 0

colrm

Column removal filter. This removes columns (characters) from a file and writes the file, lacking the range of specified columns, back to stdout. colrm 2 4 <filename removes the second through fourth characters from each line of the text file filename.


If the file contains tabs or nonprintable characters, this may cause unpredictable behavior. In such cases, consider using expand and unexpand in a pipe preceding colrm.
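A quick sketch of why expanding tabs first matters; cut -c is used here only to make the character positions visible (the input line is a made-up example):

```shell
# A tab is one character but spans several screen columns,
# so character positions and screen columns disagree.
printf 'a\tbc\n' | expand -t 4               # tab widened to spaces: "a   bc"
# After expand, character columns match what you see on screen,
# so column-oriented tools act predictably:
printf 'a\tbc\n' | expand -t 4 | cut -c 5-   # characters 5 onward: "bc"
# (colrm 5 6 on the same expanded input would instead *delete* "bc".)
```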


nl

Line numbering filter. nl filename lists filename to stdout, but inserts consecutive numbers at the beginning of each non-blank line. If filename is omitted, nl operates on stdin.

The output of nl is very similar to that of cat -n; by default, however, nl does not number blank lines.
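The difference shows up immediately on a three-line input with a blank middle line:

```shell
# nl skips the blank line when numbering; cat -n counts every line.
printf 'alpha\n\nbeta\n' | nl       # beta is numbered 2
printf 'alpha\n\nbeta\n' | cat -n   # beta is numbered 3
```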

Example 12-25. nl: A self-numbering script.

 #!/bin/bash
 # line-number.sh

 # This script echoes itself twice to stdout with its lines numbered.

 # 'nl' sees this as line 4 since it does not number blank lines.
 # 'cat -n' sees the above line as number 6.

 nl `basename $0`

 echo; echo  # Now, let's try it with 'cat -n'

 cat -n `basename $0`
 # The difference is that 'cat -n' numbers the blank lines.
 # Note that 'nl -ba' will also do so.

 exit 0
 # -----------------------------------------------------------------
 # -----------------------------------------------------------------

pr

Print formatting filter. This will paginate files (or stdout) into sections suitable for hard copy printing or viewing on screen. Various options permit row and column manipulation, joining lines, setting margins, numbering lines, adding page headers, and merging files, among other things. The pr command combines much of the functionality of nl, paste, fold, column, and expand.

pr -o 5 --width=65 fileZZZ | more gives a nice paginated listing to screen of fileZZZ with margins set at 5 and 65.

A particularly useful option is -d, forcing double-spacing (same effect as sed G).
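A minimal check of -d at the prompt; -t (suppress the page header and trailer) is added so only the text itself shows:

```shell
# Double-space two lines; -t omits pr's header and trailer.
printf 'alpha\nbeta\n' | pr -d -t
# Each input line is followed by a blank line, just as 'sed G' would do.
```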


gettext

The GNU gettext package is a set of utilities for localizing and translating the text output of programs into foreign languages. While originally intended for C programs, it now supports quite a number of programming and scripting languages.

The gettext program works on shell scripts. See the info page.


msgfmt

A program for generating binary message catalogs. It is used for localization.


iconv

A utility for converting file(s) to a different encoding (character set). Its chief use is for localization.
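A round trip through another encoding is an easy sanity check; UTF-16BE is an arbitrary choice for the demonstration:

```shell
# Convert UTF-8 text to UTF-16BE and back; the round trip is lossless.
printf 'hello\n' | iconv -f UTF-8 -t UTF-16BE | iconv -f UTF-16BE -t UTF-8
# → hello
# The intermediate form uses two bytes per ASCII character:
printf 'hello\n' | iconv -f UTF-8 -t UTF-16BE | wc -c   # 12 bytes for 6
```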


recode

Consider this a fancier version of iconv, above. This very versatile utility for converting a file to a different encoding is not part of the standard Linux installation.

TeX, gs

TeX and Postscript are text markup languages used for preparing copy for printing or formatted video display.

TeX is Donald Knuth's elaborate typesetting system. It is often convenient to write a shell script encapsulating all the options and arguments passed to one of these markup languages.

Ghostscript (gs) is a GPL-ed Postscript interpreter.


enscript

Utility for converting a plain text file to PostScript.

For example, enscript filename.txt -p filename.ps produces the PostScript output file filename.ps.

groff, tbl, eqn

Yet another text markup and display formatting language is groff. This is the enhanced GNU version of the venerable UNIX roff/troff display and typesetting package. Manpages use groff.

The tbl table processing utility is considered part of groff, as its function is to convert table markup into groff commands.

The eqn equation processing utility is likewise part of groff, and its function is to convert equation markup into groff commands.

Example 12-26. manview: Viewing formatted manpages

 #!/bin/bash
 # Formats the source of a man page for viewing.

 #  This script is useful when writing man page source.
 #  It lets you look at the intermediate results on the fly
 #+ while working on it.

 E_WRONGARGS=65

 if [ -z "$1" ]
 then
   echo "Usage: `basename $0` filename"
   exit $E_WRONGARGS
 fi

 # ---------------------------
 groff -Tascii -man $1 | less
 # From the man page for groff.
 # ---------------------------

 #  If the man page includes tables and/or equations,
 #+ then the above code will barf.
 #  The following line can handle such cases.
 #   gtbl < "$1" | geqn -Tlatin1 | groff -Tlatin1 -mtty-char -man
 #   Thanks, S.C.

 exit 0
lex, yacc

The lex lexical analyzer produces programs for pattern matching. This has been replaced by the nonproprietary flex on Linux systems.

The yacc utility creates a parser based on a set of specifications. This has been replaced by the nonproprietary bison on Linux systems.



[1] This is only true of the GNU version of tr, not the generic version often found on commercial UNIX systems.
