Bash
Notes for Professionals
100+ pages
GoalKicker.com
Disclaimer
This is an unofficial free book created for educational purposes and is
not affiliated with official Bash group(s) or company(s).
All trademarks and registered trademarks are
the property of their respective owners
Contents
About ........................................................................ 1
Chapter 1: Getting started with Bash ......................................... 2
Section 1.1: Hello World ..................................................... 2
Section 1.2: Hello World Using Variables ..................................... 4
Section 1.3: Hello World with User Input ..................................... 4
Section 1.4: Importance of Quoting in Strings ................................ 5
Section 1.5: Viewing information for Bash built-ins .......................... 6
Section 1.6: Hello World in "Debug" mode ..................................... 6
Section 1.7: Handling Named Arguments ........................................ 7
Chapter 2: Script shebang .................................................... 8
Section 2.1: Env shebang ..................................................... 8
Section 2.2: Direct shebang .................................................. 8
Section 2.3: Other shebangs .................................................. 8
Chapter 3: Navigating directories ........................................... 10
Section 3.1: Absolute vs relative directories ............................... 10
Section 3.2: Change to the last directory ................................... 10
Section 3.3: Change to the home directory ................................... 10
Section 3.4: Change to the Directory of the Script .......................... 10
Chapter 4: Listing Files .................................................... 12
Section 4.1: List Files in a Long Listing Format ............................ 12
Section 4.2: List the Ten Most Recently Modified Files ...................... 13
Section 4.3: List All Files Including Dotfiles .............................. 13
Section 4.4: List Files Without Using `ls` .................................. 13
Section 4.5: List Files ..................................................... 14
Section 4.6: List Files in a Tree-Like Format ............................... 14
Section 4.7: List Files Sorted by Size ...................................... 14
Chapter 5: Using cat ........................................................ 16
Section 5.1: Concatenate files .............................................. 16
Section 5.2: Printing the Contents of a File ................................ 16
Section 5.3: Write to a file ................................................ 17
Section 5.4: Show non printable characters .................................. 17
Section 5.5: Read from standard input ....................................... 18
Section 5.6: Display line numbers with output ............................... 18
Section 5.7: Concatenate gzipped files ...................................... 18
Chapter 6: Grep ............................................................. 20
Section 6.1: How to search a file for a pattern ............................. 20
Chapter 7: Aliasing ......................................................... 21
Section 7.1: Bypass an alias ................................................ 21
Section 7.2: Create an Alias ................................................ 21
Section 7.3: Remove an alias ................................................ 21
Section 7.4: The BASH_ALIASES is an internal bash assoc array ............... 22
Section 7.5: Expand alias ................................................... 22
Section 7.6: List all Aliases ............................................... 22
Chapter 9: Redirection ...................................................... 27
Section 9.1: Redirecting standard output .................................... 27
Section 9.2: Append vs Truncate ............................................. 27
Section 9.3: Redirecting both STDOUT and STDERR ............................. 28
Section 9.4: Using named pipes .............................................. 28
Section 9.5: Redirection to network addresses ............................... 30
Section 9.6: Print error messages to stderr ................................. 30
Section 9.7: Redirecting multiple commands to the same file ................. 31
Section 9.8: Redirecting STDIN .............................................. 31
Section 9.9: Redirecting STDERR ............................................. 32
Section 9.10: STDIN, STDOUT and STDERR explained ............................ 32
Chapter 12: Arrays .......................................................... 42
Section 12.1: Array Assignments ............................................. 42
Section 12.2: Accessing Array Elements ...................................... 43
Section 12.3: Array Modification ............................................ 43
Section 12.4: Array Iteration ............................................... 44
Section 12.5: Array Length .................................................. 45
Section 12.6: Associative Arrays ............................................ 45
Section 12.7: Looping through an array ...................................... 46
Section 12.8: Destroy, Delete, or Unset an Array ............................ 47
Section 12.9: Array from string ............................................. 47
Section 12.10: List of initialized indexes .................................. 47
Section 12.11: Reading an entire file into an array ......................... 48
Section 12.12: Array insert function ........................................ 48
Chapter 14: Functions ....................................................... 52
Section 14.1: Functions with arguments ...................................... 52
Section 14.2: Simple Function ............................................... 53
Section 14.3: Handling flags and optional parameters ........................ 53
Section 14.4: Print the function definition ................................. 54
Section 14.5: A function that accepts named parameters ...................... 54
Section 14.6: Return value from a function .................................. 55
Section 14.7: The exit code of a function is the exit code of its last command 55
Chapter 19: Sourcing ........................................................ 74
Section 19.1: Sourcing a file ............................................... 74
Section 19.2: Sourcing a virtual environment ................................ 74
Chapter 21: Quoting ......................................................... 80
Section 21.1: Double quotes for variable and command substitution ........... 80
Section 21.2: Difference between double quote and single quote .............. 80
Section 21.3: Newlines and control characters ............................... 81
Section 21.4: Quoting literal text .......................................... 81
Chapter 27: Scoping ........................................................ 100
Section 27.1: Dynamic scoping in action .................................... 100
Chapter 33: Debugging ...................................................... 113
Section 33.1: Checking the syntax of a script with "-n" .................... 113
Section 33.2: Debugging using bashdb ....................................... 113
Section 33.3: Debugging a bash script with "-x" ............................ 113
Chapter 39: Read a file (data stream, variable) line-by-line (and/or field-by-field)? ... 135
Section 39.1: Looping through a file line by line .......................... 135
Section 39.2: Looping through the output of a command field by field ....... 135
Section 39.3: Read lines of a file into an array ........................... 135
Section 39.4: Read lines of a string into an array ......................... 136
Section 39.5: Looping through a string line by line ........................ 136
Section 39.6: Looping through the output of a command line by line ......... 136
Section 39.7: Read a file field by field ................................... 136
Section 39.8: Read a string field by field ................................. 137
Section 39.9: Read fields of a file into an array .......................... 137
Section 39.10: Read fields of a string into an array ....................... 137
Section 39.11: Reads file (/etc/passwd) line by line and field by field .... 138
Chapter 43: Pipelines ...................................................... 143
Section 43.1: Using |& ..................................................... 143
Section 43.2: Show all processes paginated ................................. 144
Section 43.3: Modify continuous output of a command ........................ 144
Chapter 63: Parallel ....................................................... 182
Section 63.1: Parallelize repetitive tasks on list of files ................ 182
Section 63.2: Parallelize STDIN ............................................ 183
Chapter 66: Pitfalls ....................................................... 187
Section 66.1: Whitespace When Assigning Variables .......................... 187
Section 66.2: Failed commands do not stop script execution ................. 187
Section 66.3: Missing The Last Line in a File .............................. 187
Appendix A: Keyboard shortcuts ............................................. 189
Section A.1: Editing Shortcuts ............................................. 189
Section A.2: Recall Shortcuts .............................................. 189
Section A.3: Macros ........................................................ 189
Section A.4: Custom Key Bindings ........................................... 189
Section A.5: Job Control ................................................... 190
Credits .................................................................... 191
You may also like .......................................................... 195
About
Please feel free to share this PDF with anyone for free; the
latest version of this book can be downloaded from:
https://goalkicker.com/BashBook
This Bash Notes for Professionals book is compiled from Stack Overflow
Documentation, the content is written by the beautiful people at Stack Overflow.
Text content is released under Creative Commons BY-SA; see the credits at the end
of this book for the people who contributed to the various chapters. Images may be
copyright of their respective owners unless otherwise specified
This is an unofficial free book created for educational purposes and is not
affiliated with official Bash group(s) or company(s) nor Stack Overflow. All
trademarks and registered trademarks are the property of their respective
company owners
The information presented in this book is not guaranteed to be correct nor
accurate, use at your own risk
Please send feedback and corrections to [email protected]
Chapter 1: Getting started with Bash
Version  Release Date
0.99     1989-06-08
1.01     1989-06-23
2.0      1996-12-31
2.02     1998-04-20
2.03     1999-02-19
2.04     2001-03-21
2.05b    2002-07-17
3.0      2004-08-03
3.1      2005-12-08
3.2      2006-10-11
4.0      2009-02-20
4.1      2009-12-31
4.2      2011-02-13
4.3      2014-02-26
4.4      2016-09-15
Section 1.1: Hello World
Interactive Shell
The Bash shell is commonly used interactively: It lets you enter and edit commands,
then executes them when
you press the Return key. Many Unix-based and Unix-like operating systems use Bash
as their default shell
(notably Linux and macOS). The terminal automatically enters an interactive Bash
shell process on startup.
Output Hello World by typing the following:
echo "Hello World"
#> Hello World # Output Example
Notes
You can change the shell by just typing the name of the shell in terminal. For
example: sh, bash, etc.
echo is a Bash builtin command that writes the arguments it receives to the
standard output. It appends a newline to the output by default.
Non-Interactive Shell
The Bash shell can also be run non-interactively from a script, making the shell
require no human interaction.
Interactive behavior and scripted behavior should be identical – an important
design consideration of Unix V7
Bourne shell and transitively Bash. Therefore anything that can be done at the
command line can be put in a script
file for reuse.
Follow these steps to create a Hello World script:
1. Create a new file called hello-world.sh

touch hello-world.sh

2. Make the script executable by running

chmod +x hello-world.sh

3. Add the following two lines, described below:

#!/bin/bash
echo 'Hello World'

Line 1: The first line of the script must start with the character sequence #!,
referred to as the shebang. The shebang instructs the operating system to run
/bin/bash, the Bash shell, passing it the script's path as an argument.

E.g. /bin/bash hello-world.sh

Line 2: Uses the echo command to write Hello World to the standard output.
4. Execute the hello-world.sh script from the command line using one of the
following:
./hello-world.sh – most commonly used, and recommended
/bin/bash hello-world.sh
bash hello-world.sh – assuming /bin is in your $PATH
sh hello-world.sh
For real production use, you would omit the .sh extension (which is misleading
anyway, since this is a Bash script,
not a sh script) and perhaps move the file to a directory within your PATH so that
it is available to you regardless of
your current working directory, just like a system command such as cat or ls.
Common mistakes include:
1. Forgetting to apply execute permission on the file, i.e., chmod +x hello-world.sh,
resulting in the output of ./hello-world.sh: Permission denied.
2. Editing the script on Windows, which produces incorrect line ending characters
that Bash cannot handle.
A common symptom is : command not found where the carriage return has forced the
cursor to the
beginning of line, overwriting the text before the colon in the error message.
The script can be fixed using the dos2unix program.
An example use: dos2unix hello-world.sh
dos2unix edits the file inline.
3. Using sh ./hello-world.sh, not realizing that bash and sh are distinct shells
with distinct features (though
since Bash is backwards-compatible, the opposite mistake is harmless).
Anyway, simply relying on the script's shebang line is vastly preferable to
explicitly writing bash or sh (or
python or perl or awk or ruby or...) before each script's file name.
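A quick way to convince yourself that bash and sh really are distinct shells is to run a bashism under both. This is a sketch; it assumes your sh is a strict POSIX shell such as dash (on systems where sh is a link to bash, the second command behaves like the first):

```shell
# Brace expansion is a bash feature, not part of POSIX sh:
bash -c 'echo {1..3}'    # 1 2 3
sh -c 'echo {1..3}'      # a strict POSIX sh prints: {1..3}

# The [[ ]] conditional is another bashism:
bash -c '[[ "abc" == a* ]] && echo "pattern matched"'
```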
A common shebang line to use in order to make your script more portable is to use
#!/usr/bin/env bash instead of hard-coding a path to Bash. That way, /usr/bin/env
has to exist, but beyond that point, bash just needs to be on your PATH. On many
systems, /bin/bash doesn't exist, and you should use /usr/local/bin/bash or some
other absolute path; this change avoids having to figure out the details of that.
Section 1.2: Hello World Using Variables

#!/usr/bin/env bash
# Note that spaces cannot be used around the `=` assignment operator
whom_variable="World"
# Use printf to safely output the data
printf "Hello, %s\n" "$whom_variable"
#> Hello, World
The following code accepts an argument $1, which is the first command line argument,
and outputs it in a
formatted string, following Hello,.
Execute/Run via: ./hello.sh World
#!/usr/bin/env bash
printf "Hello, %s\n" "$1"
#> Hello, World
It is important to note that $1 has to be quoted in double quotes, not single
quotes. "$1" expands to the first command line argument, as desired, while '$1'
evaluates to the literal string $1.
Security Note:
Read Security implications of forgetting to quote a variable in bash shells to
understand the
importance of placing the variable text within double quotes.
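The difference can be demonstrated without a separate script; this sketch uses set -- to simulate a positional argument that contains a space:

```shell
#!/usr/bin/env bash
set -- "two words"        # pretend the script received one argument: "two words"

printf '[%s]' "$1"; echo  # double quotes keep it one field:    [two words]
printf '[%s]' $1;   echo  # unquoted, it is word-split in two:  [two][words]
```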
Section 1.3: Hello World with User Input

#!/usr/bin/env bash
echo "Who are you?"
read name
echo "Hello, $name."
The command read here reads one line of data from standard input into the variable
name. This is then referenced
using $name and printed to standard out using echo.
Example output:
$ ./hello_world.sh
Who are you?
Matt
Hello, Matt.
Here the user entered the name "Matt", and this code was used to say Hello, Matt..
And if you want to append something to the variable value while printing it, use
curly brackets around the variable
name as shown in the following example:
#!/usr/bin/env bash
echo "What are you doing?"
read action
echo "You are ${action}ing."
Example output:
$ ./hello_world.sh
What are you doing?
Sleep
You are Sleeping.
Here, when the user enters an action, "ing" is appended to that action in the output.

Section 1.4: Importance of Quoting in Strings

If you don't want Bash to expand your argument, you can use strong quoting:
#!/usr/bin/env bash
world="World"
echo 'Hello $world'
#> Hello $world
For more detailed information, see the Quoting chapter later in this book.
Section 1.5: Viewing information for Bash built-ins

help <builtin-name>

This will display the Bash help (manual) page for the specified built-in.
For example, help unset will show:
unset: unset [-f] [-v] [-n] [name ...]
Unset values and attributes of shell variables and functions.
For each NAME, remove the corresponding variable or function.
Options:
-f
treat each NAME as a shell function
-v
treat each NAME as a shell variable
-n
treat each NAME as a name reference and unset the variable itself
rather than the variable it references
Without options, unset first tries to unset a variable, and if that fails,
tries to unset a function.
Some variables cannot be unset; also see `readonly'.
Exit Status:
Returns success unless an invalid option is given or a NAME is read-only.
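The lookup order described in the last paragraph ("first tries to unset a variable, and if that fails, tries to unset a function") can be seen in a short sketch:

```shell
#!/usr/bin/env bash
x="variable"
x() { echo "function"; }

unset x                               # without options, the variable goes first
[ -z "${x+set}" ] && echo "variable is gone"
x                                     # the function still exists; prints "function"

unset -f x                            # now remove the function too
type x >/dev/null 2>&1 || echo "function is gone"
```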
Section 1.6: Hello World in "Debug" mode

The -x argument enables you to walk through each line in the script. A good example:
$ cat hello.sh
#!/bin/bash
echo "Hello World\n"
adding_string_to_number="s"
v=$(expr 5 + $adding_string_to_number)
$ ./hello.sh
Hello World
expr: non-integer argument
The error above is not enough to show where the problem lies in the script;
however, running it the following way gives you a better sense of where to look
for the error:
$ bash -x hello.sh
+ echo Hello World\n
Hello World
+ adding_string_to_number=s
+ expr 5 + s
expr: non-integer argument
+ v=
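The same tracing can be limited to a region of the script with set -x and set +x, so you do not have to trace every line. A minimal sketch:

```shell
#!/usr/bin/env bash
echo "before tracing"
set -x                    # start printing each command (to stderr) before it runs
v=$(expr 5 + 2)
set +x                    # stop tracing
echo "result: $v"
```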
Chapter 2: Script shebang
Section 2.1: Env shebang
To execute a script file with the bash executable found in the PATH environment
variable by using the executable
env, the first line of a script file must indicate the absolute path to the env
executable with the argument bash:
#!/usr/bin/env bash
The env path in the shebang is resolved and used only if the script is launched
directly, like this:

./script.sh

(When the script is instead passed to an interpreter explicitly, e.g.
bash script.sh, the shebang line is ignored.)
Given a broken shebang such as #!/bin/bash something wrong, bash tries to execute
its argument "something wrong", which doesn't exist. The name of the script file
is appended to the argument list too. To see this clearly, use an echo shebang:
To see this clearly use an echo shebang:
#!/bin/echo something wrong
# and now call this script named "thisscript" like so:
# ./thisscript one two
# the output will be:
something wrong ./thisscript one two
Some programs like awk use this technique to run longer scripts residing in a disk
file.
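A sketch of that technique with awk (the path /usr/bin/awk is an assumption; it holds on most Linux systems): the kernel turns ./count-lines sample.txt into /usr/bin/awk -f ./count-lines sample.txt.

```shell
# Create an executable awk script with an awk shebang:
cat > count-lines <<'EOF'
#!/usr/bin/awk -f
END { print NR }
EOF
chmod +x count-lines

printf 'a\nb\nc\n' > sample.txt
./count-lines sample.txt    # prints the number of lines: 3
```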
Chapter 3: Navigating directories
Section 3.1: Absolute vs relative directories
To change to an absolutely specified directory, use the entire name, starting with a
slash /, thus:
cd /home/username/project/abc
If you want to change to a directory near your current one, you can specify a
relative location. For example, if you are already in /home/username/project,
you can enter the subdirectory abc thus:
cd abc
If you want to go to the directory above the current directory, you can use the
special name .. (dot dot). For example, if you were in
/home/username/project/abc and wanted to go to /home/username/project, then you
would do the following:
cd ..
Section 3.2: Change to the last directory

To change to the previous working directory, use:

cd -

Doing it multiple times effectively "toggles" you between the current directory
and the previous one.
Section 3.4: Change to the Directory of the Script

cd "$(dirname "$(readlink -f "$0")")"
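Note that readlink -f is a GNU extension that is not available everywhere (e.g. on older macOS). A sketch of an alternative that needs only bash itself, at the cost of not resolving symlinks:

```shell
#!/usr/bin/env bash
# Change to the directory containing this script.
# ${BASH_SOURCE[0]} also works when the script is sourced, unlike $0.
cd "$(dirname "${BASH_SOURCE[0]}")" || exit
pwd
```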
Chapter 4: Listing Files
Section 4.1: List Files in a Long Listing Format

Option            Description
-a, --all         List all entries including ones that start with a dot
-A, --almost-all  List all entries excluding the implied . and ..
-c                Sort files by change time
-d, --directory   List directory entries themselves, not their contents
-l                Use a long listing format
-o                Long listing format without group information
-r, --reverse     Reverse order while sorting
-s, --size        Print the allocated size of each file, in blocks
-S                Sort by file size, largest first
--sort=WORD       Sort by WORD instead of name, e.g. size (-S), time (-t), version (-v)
-t                Sort by modification time, newest first
-u                Sort by access time
-v                Sort by version
-1                List one file per line
Example Output:

total 1204
drwxr-xr-x 3 root root ...
-rw-r--r-- 1 root root ...
drwxr-xr-x 2 root root ...
...
The output first displays total, which indicates the total size in blocks of all
the files in the listed directory. It then displays eight columns of information
for each file in the listed directory. Below are the details for each column in
the output:

Column No.  Example      Description
1.1         d            File type (see table below)
1.2         rwxr-xr-x    Permission string
2           3            Number of hard links
3           root         Owner name
4           root         Owner group
5           4096         File size in bytes
6           Feb 2 15:15  Time of last modification
7           acpi         File name
File Type
The file type can be one of any of the following characters.

Character  File Type
-          Regular file
d          Directory
l          Symbolic link
s          Socket
Section 4.3: List All Files Including Dotfiles

The -a or --all option will list all files, including dotfiles:

$ ls -a
.bash_logout
.bash_profile
.bashrc
bin
.lesshst
pki
.puppetlabs
.ssh
.viminfo
The -A or --almost-all option will list all files, including dotfiles, but does not
list implied . and ... Note that . is
the current directory and .. is the parent directory.
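The difference between -a and -A is easiest to see in an empty directory, where only the implied entries exist:

```shell
mkdir demo && cd demo
ls -a    # prints . and ..
ls -A    # prints nothing
```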
$ ls -A
.ansible
.bash_history
.bash_logout
.bash_profile
.bashrc
bin
.lesshst
pki
.puppetlabs
.ssh
.viminfo
Documents
eclipse
Fonts
git
Music
Pictures
Programming
Public
Templates
Videos
workspace
Section 4.6: List Files in a Tree-Like Format

Use the tree command's -L option to limit the display depth and the -d option to
only list directories.
Example Output:
$ tree -L 1 -d /tmp
/tmp
└── evince-20965
Section 4.7: List Files Sorted by Size

The -S option sorts by file size, largest first:

$ ls -l -S ./Fruits
total 444
-rw-rw-rw- 1 root root 295303 Jul 28 19:19 apples.jpg
-rw-rw-rw- 1 root root 102283 Jul 28 19:19 kiwis.jpg
-rw-rw-rw- 1 root root 50197 Jul 28 19:19 bananas.jpg
Chapter 5: Using cat
Option  Details
-n      Number all output lines
-v      Show non-printing characters using ^ and M- notation, except LFD and TAB
-T      Display TAB characters as ^I
-E      Display $ at the end of each line
-e      Same as -vE
-b      Number nonempty output lines only
-A      Equivalent to -vET
-s      Squeeze multiple adjacent blank lines into one
Section 5.2: Printing the Contents of a File

cat file.txt

Very often, for interactive use, you are better off using an interactive pager
like less or more, though. (less is far more powerful than more, and it is
advised to use less more often than more.)
less file.txt
In case the content needs to be listed backwards from its end the command tac can
be used:
tac file.txt
If you want to print the contents with line numbers, then use -n with cat:
cat -n file.txt
To display the contents of a file in a completely unambiguous byte-by-byte form, a
hex dump is the standard
solution. This is good for very brief snippets of a file, such as when you don't
know the precise encoding. The
standard hex dump utility is od -cH, though the representation is slightly
cumbersome; common replacements
include xxd and hexdump.
$ printf 'Hëllö wörld' | xxd
0000000: 48c3 ab6c 6cc3 b620 77c3 b672 6c64       H..ll.. w..rld
Section 5.3: Write to a file

cat > file

It will let you write the text on the terminal, which will be saved in a file named file.
cat >>file
will do the same, except it will append the text to the end of the file.
N.B: Ctrl+D to end writing text on terminal (Linux)
A here document can be used to inline the contents of a file into a command line or
a script:
cat <<END >file
Hello, World.
END
The token after the << redirection symbol is an arbitrary string which needs to
occur alone on a line (with no leading
or trailing whitespace) to indicate the end of the here document. You can add
quoting to prevent the shell from
performing command substitution and variable interpolation:
cat <<'fnord'
Nothing in `here` will be $changed
fnord
(Without the quotes, here would be executed as a command, and $changed would be
substituted with the value of
the variable changed -- or nothing, if it was undefined.)
e.g.
$ echo '”' | cat -v
M-bM-^@M-^]
You may also want to use cat -A (A for All), which is equivalent to cat -vET. It will display TAB characters (displayed as ^I), non-printing characters, and the end of each line:
$ echo '” `' | cat -A
M-bM-^@M-^]^I`$
(The whitespace between ” and ` in the input is actually a literal TAB character, which is why it shows up as ^I.)
cat < file.txt
Output is the same as cat file.txt, but it reads the contents of the file from standard input instead of directly from the file.
printf "first line\nSecond line\n" | cat -n
The printf command before | outputs two lines. The cat command acts on the output to add line numbers.
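For reference, the pipeline above produces output like this (GNU cat -n right-justifies each number in a six-column field followed by a TAB):

```shell
printf "first line\nSecond line\n" | cat -n
#      1  first line
#      2  Second line
```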
$ cat file
line 1
line 2

line 4
line 5
To skip empty lines when counting lines, use the --number-nonblank option, or simply -b.
$ cat -b file
     1  line 1
     2  line 2

     3  line 4
     4  line 5
Files compressed by gzip can be concatenated directly into a larger gzipped file:
cat file1.gz file2.gz file3.gz > combined.gz
This is a property of gzip; it is, however, less efficient than concatenating the input files and gzipping the result:
cat file1 file2 file3 | gzip > combined.gz
A complete demonstration:
echo 'Hello world!' > hello.txt
echo 'Howdy world!' > howdy.txt
gzip hello.txt
gzip howdy.txt
cat hello.txt.gz howdy.txt.gz > greetings.txt.gz
gunzip greetings.txt.gz
cat greetings.txt
Which results in
Hello world!
Howdy world!
Notice that greetings.txt.gz is a single file and is decompressed as the single file greetings.txt. Contrast this with tar -czf greetings.tar.gz hello.txt howdy.txt, which keeps the files separate inside the tarball.
Chapter 6: Grep
Section 6.1: How to search a file for a pattern
To find the word foo in the file bar :
grep foo ~/Desktop/bar
To find all lines that do not contain foo in the file bar:
grep -v foo ~/Desktop/bar
To find all lines that end with foo (note that grep patterns are regular expressions, not shell wildcards):
grep "foo$" ~/Desktop/bar
Chapter 7: Aliasing
Shell aliases are a simple way to create new commands or to wrap existing commands
with code of your own. They
somewhat overlap with shell functions, which are however more versatile and should
therefore often be preferred.
Now suppose ls has been aliased (for example, alias ls='ls --color=auto') and you want to use the ls command once without triggering the alias, while leaving it defined. You have several options:
Use the command builtin: command ls
Use the full path of the command: /bin/ls
Add a \ anywhere in the command name, for example: \ls, or l\s
Quote the command: "ls" or 'ls'
Invoking word will run command. Any arguments supplied to the alias are simply
appended to the target of the alias:
alias myAlias='some command --with --options'
myAlias foo bar baz
To include multiple commands in the same alias, you can string them together with
&&. For example:
alias print_things='echo "foo" && echo "bar" && echo "baz"'
Example:
# create an alias
$ alias now='date'
# preview the alias
$ now
Thu Jul 21 17:11:25 CEST 2016
# remove the alias
$ unalias now
# test if removed
$ now
-bash: now: command not found
Chapter 8: Jobs and Processes
Section 8.1: Job handling
Creating jobs
To create a job, just append a single & after the command:
$ sleep 10 &
[1] 20024
To bring the job to the foreground, use the fg command:
$ fg %1
sleep 10
Now you can interact with the process. To bring it back to the background you can use the bg command. Because the process occupies the terminal session, you need to stop it first by pressing Ctrl + z .
$ sleep 10
^Z
[1]+ Stopped
sleep 10
$ bg %1
[1]+ sleep 10 &
Due to the laziness of some programmers, all these commands also work with a single % if there is only one process, or for the first process in the list. For example:
$ sleep 10 &
[1] 20024
$ fg %
sleep 10
or just
$ %
sleep 10
Additionally, just typing fg or bg without any argument handles the last job:
$ sleep 20 &
$ sleep 10 &
$ fg
sleep 10
^C
$ fg
sleep 20
sleep 10
The sleep process runs in the background with process id (pid) 20024 and job number
1. In order to reference the
process, you can use either the pid or the job number. If you use the job number,
you must prefix it with %. The
default kill signal sent by kill is SIGTERM, which allows the target process to
exit gracefully.
Some common kill signals are shown below. To see a full list, run kill -l.
Signal name  Signal value  Effect
SIGHUP       1             Hangup
SIGINT       2             Interrupt from keyboard (Ctrl+C)
SIGKILL      9             Kill signal (cannot be caught or ignored)
SIGTERM      15            Termination signal
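As a minimal sketch, terminating a background process by its pid (captured from $!):

```shell
sleep 100 &                      # start a background job
pid=$!                           # $! expands to the pid of the last background job
kill "$pid"                      # sends SIGTERM (15) by default
wait "$pid" 2>/dev/null || true  # reap the job; its status reflects the signal
kill -0 "$pid" 2>/dev/null || echo "process has exited"
```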
Or, a more fool-proof way, using pgrep to search for the actual process id:
kill $(pgrep -f 'python test.py')
The same result can be obtained using grep over ps -ef | grep name_of_process, then killing the process associated with the resulting pid (process id). Selecting a process by its name is convenient in a testing environment but can be really dangerous when the script is used in production: it is virtually impossible to be sure that the name will match the process you actually want to kill. In those cases, the following approach is much safer.
Start the script that will eventually be killed as follows. Let's assume that the command you want to execute and eventually kill is python test.py.
#!/bin/bash
# Check if the pid file already exists,
# and if so do not run another process.
if [[ ! -e /tmp/test.py.pid ]]; then
    python test.py &
    echo $! > /tmp/test.py.pid
else
    echo -n "ERROR: The process is already running with pid "
    cat /tmp/test.py.pid
    echo
fi
This will create a file in the /tmp directory containing the pid of the python test.py process. If the file already exists, we assume that the command is already running and the script returns an error.
Then, when you want to kill it, use the following script:
#!/bin/bash
if [[ -e /tmp/test.py.pid ]]; then
    kill $(cat /tmp/test.py.pid)
    rm /tmp/test.py.pid
else
    echo "test.py is not running"
fi
This will kill exactly the process associated with your command, without relying on any volatile information (like the string used to run the command). Even in this case, if the file does not exist, the script assumes the process is not running and reports it instead of killing anything.
This last example can be easily improved for running the same command multiple
times (appending to the pid file
instead of overwriting it, for example) and to manage cases where the process dies
before being killed.
Detaching a job from the shell (with the disown builtin, or by starting it with nohup) allows a long-running process to continue once your shell (terminal, ssh, etc.) is closed.
Example:
root@server7:~# ps aux | grep nginx
root       315  0.0  0.3 144392 1020 ?     Ss  May28  0:00 nginx: master process /usr/sbin/nginx
www-data  5647  0.0  1.1 145124 3048 ?     S   Jul18  0:00 nginx: worker process
www-data  5648  0.0  0.1 144392  376 ?     S   Jul18  0:00 nginx: worker process
root     13134  0.0  0.3   4960  920 pts/0 S+  14:33  0:00 grep nginx
root@server7:~#
Here, the second column is the process id. For example, if you want to kill the nginx process, you can use the command kill 5647. It is always advised to use the kill command with SIGTERM rather than SIGKILL.
This can be used to check if a given application is running. For example, to check
if the SSH server (sshd) is running:
ps -ef | grep sshd
Chapter 9: Redirection
Parameter                         Details
internal file descriptor          An integer.
direction                         One of >, < or <>
external file descriptor or path  & followed by an integer for a file descriptor, or a path.
These examples write the output of the ls command into the file file.txt
ls >file.txt
> file.txt ls
The target file is created if it doesn't exist; otherwise this file is truncated.
The default redirection descriptor is the standard output or 1 when none is
specified. This command is equivalent
to the previous examples with the standard output explicitly indicated:
ls 1>file.txt
Note: the redirection is initialized by the executed shell and not by the executed
command, therefore it is done
before the command execution.
Append >>
1. Create specified file if it does not exist.
2. Append file (writing at end of file).
# Overwrite existing file
$ echo "first line" > /tmp/lines
# Append a second line
$ echo "second line" >> /tmp/lines
$ cat /tmp/lines
first line
second line
Section 9.3: Redirecting both STDOUT and STDERR
File descriptors like 0 and 1 are pointers. We change what file descriptors point to
with redirection. >/dev/null
means 1 points to /dev/null.
First we point 1 (STDOUT) to /dev/null then point 2 (STDERR) to whatever 1 points
to.
# STDERR is redirect to STDOUT: redirected to /dev/null,
# effectually redirecting both STDERR and STDOUT to /dev/null
echo 'hello' > /dev/null 2>&1
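The order of the two redirections matters. A sketch of the classic mistake:

```shell
# Wrong order: 2>&1 first points stderr at what stdout points to *now*
# (the terminal), and only then is stdout sent to /dev/null --
# so the error message still appears.
ls /nonexistent 2>&1 >/dev/null || true

# Right order: stdout is redirected first, then stderr is pointed at
# the same place, silencing both streams.
ls /nonexistent >/dev/null 2>&1 || true
```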
Version ≥ 4.0
Suppose you write output to a temporary file (ls -l > tempFile) and read it back from there. This works fine for most applications; however, nobody will know what tempFile does, and someone might remove it if it contains the output of ls -l in that directory. This is where a named pipe comes into play:
mkfifo myPipe
ls -l > myPipe
grep ".log" < myPipe
myPipe is technically a file (everything is a file in Linux), so let's run ls -l in an empty directory in which we just created a pipe:
mkdir pipeFolder
cd pipeFolder
mkfifo myPipe
ls -l
Notice the first character in the permissions column of the ls -l output: p indicates a named pipe, not a regular file.
Now let's do something cool.
Open one terminal, and make note of the directory (or create one so that cleanup is easy), make a pipe, and write to it:
mkfifo myPipe
echo "hello from the other side" > myPipe
You'll notice this hangs; the other side of the pipe is still closed. Let's open up the other side of the pipe and let that stuff through.
Open another terminal and go to the directory that the pipe is in (or if you know
it, prepend it to the pipe):
cat < myPipe
You'll notice that after hello from the other side is output, the program in the
first terminal finishes, as does
that in the second terminal.
Now run the commands in reverse. Start with cat < myPipe and then echo something
into it. It still works, because
a program will wait until something is put into the pipe before terminating,
because it knows it has to get
something.
Named pipes can be useful for moving information between terminals or between
programs.
Pipes are small. Once full, the writer blocks until some reader reads the contents,
so you need to either run the
reader and writer in different terminals or run one or the other in the background:
ls -l /tmp > myPipe &
cat < myPipe
$ cat <mypipe
#Output: This prints on screen the contents of mypipe.
Mind that the contents of file3 are displayed first, and then the ls -l data is displayed (LIFO ordering).
Example 3 - all commands on the same terminal / same shell
$ { pipedata=$(<mypipe) && echo "$pipedata"; } &
$ ls >mypipe
# Output: Prints the output of ls directly on screen
Mind that the variable $pipedata is not available for usage in the main terminal /
main shell since the use of
& invokes a subshell and $pipedata was only available in this subshell.
This correctly prints the value of the $pipedata variable in the main shell, thanks to the export declaration of the variable. The main terminal/main shell is not hanging, due to the invocation of a background shell (&).
Bash treats some paths as special and can do some network communication by writing to /dev/{udp|tcp}/host/port. Bash cannot set up a listening server, but can initiate a connection, and for TCP can read the results back. For example, to send a simple web request:
exec 3</dev/tcp/www.google.com/80
printf 'GET / HTTP/1.0\r\n\r\n' >&3
cat <&3
and the results of www.google.com's default web page will be printed to stdout.
Similarly
printf 'HI\n' >/dev/udp/192.168.1.1/6666
cmd || echo 'cmd failed'
may work for simple cases but it's not the usual way. In this example, the error
message will pollute the actual
output of the script by mixing both errors and successful output in stdout.
In short, error messages should go to stderr, not stdout. It's pretty simple:
cmd || echo 'cmd failed' >/dev/stderr
(writing >&2 instead of >/dev/stderr is an equivalent and more common idiom).
Another example:
if cmd; then
echo 'success'
else
echo 'cmd failed' >/dev/stderr
fi
In the above example, the success message will be printed on stdout while the error
message will be printed on
stderr.
To feed the contents of a file to a command's STDIN, we read /tmp/a_file and write it into STDIN, i.e. 0</tmp/a_file.
Note: the internal file descriptor defaults to 0 (STDIN) for <.
$ echo "b" > /tmp/list.txt
$ echo "a" >> /tmp/list.txt
$ echo "c" >> /tmp/list.txt
$ sort < /tmp/list.txt
a
b
c
Section 9.9: Redirecting STDERR
2 is STDERR.
Definitions:
echo_to_stderr is a command that writes "stderr" to STDERR:
echo_to_stderr () {
    echo stderr >&2
}
$ echo_to_stderr
stderr
$ echo_to_stderr 2>/dev/null # echoes nothing
Standard input is used to provide input to a program. (Here we're using the read
builtin to read a line from STDIN.)
STDOUT
root@server~# ls file
file
Standard output is generally used for "normal" output from a command. For example,
ls lists files, so the files are
sent to STDOUT.
STDERR
root@server~# ls anotherfile
ls: cannot access 'anotherfile': No such file or directory
Standard error is (as the name implies) used for error messages. Because this
message is not a list of files, it is sent
to STDERR.
STDIN, STDOUT and STDERR are the three standard streams. They are identified to the
shell by a number rather
than a name:
0 = Standard in
1 = Standard out
2 = Standard error
By default, STDIN is attached to the keyboard, and both STDOUT and STDERR appear in
the terminal. However, we
can redirect either STDOUT or STDERR to whatever we need. For example, let's say
that you only need the standard
out and all error messages printed on standard error should be suppressed. That's
when we use the descriptors 1
and 2.
Redirecting STDERR to /dev/null
Taking the previous example,
root@server~# ls anotherfile 2>/dev/null
root@server~#
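STDERR can also be captured in a file instead of discarded (the file name errors.log is illustrative):

```shell
ls anotherfile 2> errors.log || true  # nothing appears on the terminal
cat errors.log                        # ls: cannot access 'anotherfile': No such file or directory
```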
Chapter 10: Control Structures
Parameter to [ or test

File Operators     Details
-e "$file"         True if the file exists
-d "$file"         True if the file exists and is a directory
-f "$file"         True if the file exists and is a regular file
-h "$file"         True if the file exists and is a symbolic link

String Comparators   Details
-z "$str"            True if the length of string $str is zero
-n "$str"            True if the length of string $str is non-zero
"$str" = "$str2"     True if string $str is equal to string $str2. Not best for integers; it may work but will be inconsistent.
"$str" != "$str2"    True if the strings are not equal

Integer Comparators  Details
When combining multiple statements in this manner, it's important to remember that (unlike many C-style languages) these operators have equal precedence and are left-associative.
Thus, this statement will work as expected...
cd my_directory && pwd || echo "No such directory"
If the cd succeeds, the && pwd executes and the current working directory name is
printed. Unless pwd fails (a
rarity) the || echo ... will not be executed.
If the cd fails, the && pwd will be skipped and the || echo ... will run.
But this will not (if you're thinking if...then...else)...
cd my_directory && ls || echo "No such directory"
If the cd fails, the && ls is skipped and the || echo ... is executed.
If the cd succeeds, the && ls is executed.
If the ls succeeds, the || echo ... is ignored. (so far so good)
BUT... if the ls fails, the || echo ... will also be executed.
It is the ls, not the cd, that is the previous command.
The closing fi is necessary, but the elif and/or the else clauses can be omitted.
The semicolons before then are standard syntax for combining two commands on a
single line; they can be omitted
only if then is moved to the next line.
It's important to understand that the brackets [[ are not part of the if syntax, but behave like a command; it is the exit code from this command that is being tested. Therefore, you must always include spaces around the brackets.
This also means that the result of any command can be tested. If the exit code from
the command is a zero, the
statement is considered true.
if grep "foo" bar.txt; then
echo "foo was found"
else
echo "foo was not found"
fi
You may also come across if statements with single brackets. These are defined in
the POSIX standard and are
guaranteed to work in all POSIX-compliant shells including Bash. The syntax is very
similar to that in Bash:
if [ "$1" -eq 1 ]; then
echo "1 was passed in the first parameter"
elif [ "$1" -gt 2 ]; then
echo "2 was not passed in the first parameter"
else
echo "The first parameter was not 1 and is not more than 2."
fi
Or
for ((i=0;i<${#arr[@]};i++));do
echo "${arr[$i]}"
done
while loop:
i=0
while [ $i -lt ${#arr[@]} ];do
echo "${arr[$i]}"
i=$(expr $i + 1)
done
Or
i=0
while (( $i < ${#arr[@]} ));do
echo "${arr[$i]}"
((i++))
done
arr=(a b c d e f)
for i in "${arr[@]}";do
    echo "$i"
    for j in "${arr[@]}";do
        echo "$j"
        break 2
    done
done
Output:
a
a
Using break instead of break 2 exits only the inner loop, so the outer loop continues:
Output:
a
a
b
a
c
a
d
a
e
a
f
a
Watch that there are spaces around the brackets during the test (after the while
statement). These spaces are
necessary.
This loop outputs:
i is currently 0
i is currently 1
i is currently 2
i is currently 3
i is currently 4
Notes:
The assignment of the variable inside C-style for loop can contain spaces unlike
the usual assignment
Variables inside C-style for loop aren't preceded with $.
Example:
for (( i = 0; i < 10; i++ ))
do
echo "The iteration number is $i"
done
Output:
The iteration number is 0
The iteration number is 1
...
The iteration number is 9

An until loop runs until its condition becomes true:
i=5
until [[ $i -eq 10 ]]; do
    echo "i=$i"
    ((i++))
done
Output:
i=5
i=6
i=7
i=8
i=9
When i reaches 10 the condition in the until loop becomes true and the loop ends.
case "$BASH_VERSION" in
[34]*)
echo {1..4}
;;
*)
seq -s" " 1 4
esac
Patterns are not regular expressions but shell pattern matching (aka globs).
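A runnable sketch of glob patterns in case (the function name and sample values are illustrative):

```shell
classify() {
    case "$1" in
        *.txt)  echo "text file";;            # glob, not regex: * matches anything
        [0-9]*) echo "starts with a digit";;  # character class plus glob
        *)      echo "something else";;       # catch-all pattern
    esac
}
classify notes.txt   # prints: text file
classify 42abc       # prints: starts with a digit
```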
A for loop without a list of words parameter will iterate over the positional
parameters instead. In other words, the
above example is equivalent to this code:
for arg in "$@"; do
echo arg=$arg
done
In other words, if you catch yourself writing for i in "$@"; do ...; done, just
drop the in part, and write simply
for i; do ...; done.
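A quick sketch of that shorthand inside a function (the function name is illustrative):

```shell
print_args() {
    for i; do            # no "in" list: iterates over "$@"
        echo "arg=$i"
    done
}
print_args one two       # prints arg=one, then arg=two
```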
Chapter 11: true, false and : commands
Section 11.1: Infinite Loop
while true; do
echo ok
done
or
while :; do
echo ok
done
or
until false; do
echo ok
done
Chapter 12: Arrays
Section 12.1: Array Assignments
List Assignment
If you are familiar with Perl, C, or Java, you might think that Bash would use
commas to separate array elements,
however this is not the case; instead, Bash uses spaces:
# Array in Perl
my @array = (1, 2, 3, 4);
# Array in Bash
array=(1 2 3 4)
Subscript Assignment
Create an array with explicit element indices:
array=([3]='fourth element' [4]='fifth element')
Assignment by index
array[0]='first element'
array[1]='second element'
Assignment by name (associative array):
declare -A array
array[first]='First element'
array[second]='Second element'
Dynamic Assignment
Create an array from the output of other command, for example use seq to get a
range from 1 to 10:
array=(`seq 1 10`)
echo ${array[@]}
# output: 1 2 3 4 5 6 7 8 9 10
String Operations
If referring to a single element, string operations are permitted:
array=(zero one two)
echo "${array[0]:0:3}" # gives out zer (chars at position 0, 1 and 2 in the string
zero)
echo "${array[0]:1:3}" # gives out ero (chars at position 1, 2 and 3 in the string
zero)
so ${array[$i]:N:M} gives out a string from the Nth position (starting from 0) in
the string ${array[$i]} with M
following chars.
Version ≥ 3.1
Append
Modify array, adding elements to the end if no subscript is specified.
array+=('fourth element' 'fifth element')
Insert
Insert an element at a given index:
arr=(a b c d)
# insert an element at index 2
i=2
arr=("${arr[@]:0:$i}" 'new' "${arr[@]:$i}")
echo "${arr[2]}" #output: new
Delete
Delete array indexes using the unset builtin:
arr=(a b c)
echo "${arr[@]}"   # outputs: a b c
echo "${!arr[@]}"  # outputs: 0 1 2
unset -v 'arr[1]'
echo "${arr[@]}"   # outputs: a c
echo "${!arr[@]}"  # outputs: 0 2
Merge
array3=("${array1[@]}" "${array2[@]}")
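A short demonstration of the merge idiom with sample values:

```shell
array1=(a b)
array2=(c d)
array3=("${array1[@]}" "${array2[@]}")  # concatenates, preserving elements
echo "${array3[@]}"    # a b c d
echo "${#array3[@]}"   # 4
```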
for y in "${a[@]}"; do
# act on $y
echo "$y"
done
# classic for-loop
for ((idx=0; idx < ${#a[@]}; ++idx)); do
# act on ${a[$idx]}
echo "${a[$idx]}"
done
echo "${!aa[@]}"
#Out: hello ab key with space
Using while loop with numerical conditional:
i=0
while (( $i < ${#arr[@]} )); do
echo "${arr[$i]}"
((i++))
done
Each space in the string denotes a new item in the resulting array.
echo ${arrayVar[0]} # will print Apple
echo ${arrayVar[3]} # will print Mango
$ arr[2]='second'
$ arr[10]='tenth'
$ arr[25]='twenty five'
$ echo ${!arr[@]}
2 10 25
Reading in a loop:
arr=()
while IFS= read -r line; do
arr+=("$line")
done
Version ≥ 4.0
Example:
arr=(a b c d)
echo "${arr[2]}" # output: c
# Now call the insert function and pass the array variable name,
# index to insert at
# and the element to insert
insert arr 2 'New Element'
# 'New Element' was inserted at index 2 in arr, now print them
echo "${arr[2]}" # output: New Element
echo "${arr[3]}" # output: c
Chapter 13: Associative arrays
Section 13.1: Examining assoc arrays
All needed usage shown with this snippet:
#!/usr/bin/env bash
declare -A assoc_array=([key_string]=value \
                        [one]="something" \
                        [two]="another thing" \
                        [ three ]='mind the blanks!' \
                        [ " four" ]='count the blanks of this key later!' \
                        [IMPORTANT]='SPACES DO ADD UP!!!' \
                        [1]='there are no integers!' \
                        [info]="to avoid history expansion " \
                        [info2]="quote exclamation mark with single quotes" \
                        )
echo # just a blank line
echo now here are the values of assoc_array:
echo ${assoc_array[@]}
echo not that useful,
echo # just a blank line
echo this is better:
declare -p assoc_array
# -p == print
echo have a close look at the spaces in entries with keys two, three and four
above\!\!\!
echo # just a blank line
echo # just another blank line
echo there is a difference using integers as keys\!\!\!
i=1
echo declaring an integer var i=1
echo # just a blank line
echo Within an integer_array bash recognizes arithmetic context.
echo Within an assoc_array bash DOES NOT recognize arithmetic context.
echo # just a blank line
echo this works: \${assoc_array[\$i]}: ${assoc_array[$i]}
echo this NOT!!: \${assoc_array[i]}: ${assoc_array[i]}
echo # just a blank line
echo # just a blank line
echo an \${assoc_array[i]} has a string context within braces in contrast to an
integer_array
declare -i integer_array=( one two three )
echo "doing a: declare -i integer_array=( one two three )"
echo # just a blank line
echo both forms do work: \${integer_array[i]} : ${integer_array[i]}
echo and this too: \${integer_array[\$i]} : ${integer_array[$i]}
Chapter 14: Functions
Section 14.1: Functions with arguments
In helloJohn.sh:
#!/bin/bash
greet() {
local name="$1"
echo "Hello, $name"
}
greet "John Doe"
# running above script
$ bash helloJohn.sh
Hello, John Doe
1. If you don't modify the argument in any way, there is no need to copy it to a
local variable - simply echo
"Hello, $1".
2. You can use $1, $2, $3 and so on to access the arguments inside the function.
Note: for arguments beyond 9, $10 won't work (bash will read it as ${1} followed by 0); you need to write ${10}, ${11} and so on.
Note: You should practically always use double quotes around "$@", like here.
Omitting the quotes will cause the shell to expand wildcards (even when the user
specifically quoted them in
order to avoid that) and generally introduce unwelcome behavior and potentially
even security problems.
foo "string with spaces;" '$HOME' "*"
# output => string with spaces; $HOME *
foo    # output => 25
foo 30 # output => 30
Note that sourcing a file with functions makes them available in your current bash
session.
$ source helloWorld.sh
$ greet
Hello World!
You can export a function in some shells, so that it is exposed to child processes.
bash -c 'greet' # fails
export -f greet # export function; note -f
bash -c 'greet' # success
failwith() {
    local OPTIND OPTION OPTARG status
    status=1
    OPTIND=1
    while getopts 'x:' OPTION; do
        case ${OPTION} in
            x) status="${OPTARG}";;
            *) 1>&2 printf 'failwith: %s: Unsupported option.\n' "${OPTION}";;
        esac
    done
    shift $(( OPTIND - 1 ))
    {
        printf 'Failure: '
        printf "$@"
        printf '\n'
    } 1>&2
    exit "${status}"
}
and so on.
Note that as for printf, variables should not be used as first argument. If the
message to print consists of the
content of a variable, one should use the %s specifier to print it, like in
failwith '%s' "${message}"
Output:
func ()
{
echo "I am a sample function"
}
foo() {
    while [[ $# -gt 0 ]]
    do
case $1 in
-f|--follow)
local FOLLOW="following"
;;
-t|--tail)
local TAIL="tail=$2"
;;
esac
shift
done
echo "FOLLOW: $FOLLOW"
echo "TAIL: $TAIL"
}
Example usage:
foo -f
foo -t 10
foo -f --tail 10
foo --follow --tail 10
Section 14.7: The exit code of a function is the exit code of its
last command
Consider this example function to check if a host is up:
is_alive() {
ping -c1 "$1" &> /dev/null
}
This function sends a single ping to the host specified by the first function
parameter. The output and error output
of ping are both redirected to /dev/null, so the function will never output
anything. But the ping command will
have exit code 0 on success, and non-zero on failure. As this is the last (and in
this example, the only) command of
the function, the exit code of ping will be reused for the exit code of the
function itself.
This fact is very useful in conditional statements.
For example, if host graucho is up, then connect to it with ssh:
if is_alive graucho; then
ssh graucho
fi
Another example: repeatedly check until host graucho is up, and then connect to it
with ssh:
while ! is_alive graucho; do
sleep 5
done
ssh graucho
Chapter 15: Bash Parameter Expansion
The $ character introduces parameter expansion, command substitution, or arithmetic
expansion. The parameter
name or symbol to be expanded may be enclosed in braces, which are optional but
serve to protect the variable to
be expanded from characters immediately following it which could be interpreted as
part of the name.
Read more in the Bash User Manual.
To uppercase
$ v="hello"
# Just the first character
$ printf '%s\n' "${v^}"
Hello
# All characters
$ printf '%s\n' "${v^^}"
HELLO
# Alternative
$ v="hello world"
$ declare -u string="$v"
$ echo "$string"
HELLO WORLD
To lowercase
$ v="BYE"
# Just the first character
$ printf '%s\n' "${v,}"
bYE
# All characters
$ printf '%s\n' "${v,,}"
bye
# Alternative
$ v="HELLO WORLD"
$ declare -l string="$v"
$ echo "$string"
hello world
Toggle Case
$ v="Hello World"
# All chars
$ echo "${v~~}"
hELLO wORLD
$ echo "${v~}"
# Just the first char
hello World
# String length
$ var='12345'
$ echo "${#var}"
5
Note that it's the length in number of characters which is not necessarily the same
as the number of bytes (like in
UTF-8 where most characters are encoded in more than one byte), nor the number of
glyphs/graphemes (some of
which are combinations of characters), nor is it necessarily the same as the
display width.
# Number of array elements
$ myarr=(1 2 3)
$ echo "${#myarr[@]}"
3
# Works for positional parameters as well
$ set -- 1 2 3 4
$ echo "${#@}"
4
# But more commonly (and portably to other shells), one would use
$ echo "$#"
4
All matches:
$ echo "${a//a/A}"
I Am A string
Section 15.4: Substrings and subarrays
var='0123456789abcdef'
# Define a zero-based offset
$ printf '%s\n' "${var:3}"
3456789abcdef
# Offset and length of substring
$ printf '%s\n' "${var:3:4}"
3456
Version ≥ 4.2
The same expansions apply if the parameter is a positional parameter or the element
of a subscripted array:
# Set positional parameter $1
set -- 0123456789abcdef
# Define offset
$ printf '%s\n' "${1:5}"
56789abcdef
# Assign to array element
myarr[0]='0123456789abcdef'
# Define offset and length
$ printf '%s\n' "${myarr[0]:7:3}"
789
# Set the positional parameters
$ set -- 1 2 3 4 5 6 7 8 9 0 a b c d e f
# Define an offset and a length
$ printf '%s\n' "${@:10:3}"
0
a
b
# No negative lengths allowed for positional parameters
$ printf '%s\n' "${@:10:-2}"
bash: -2: substring expression < 0
# Negative offset counts from the end
# Needs a space to avoid confusion with ${@:-10:2}
$ printf '%s\n' "${@: -10:2}"
7
8
# ${@:0} is $0 which is not otherwise a positional parameters or part
# of $@
$ printf '%s\n' "${@:0:2}"
/usr/bin/bash
1
Longest match:
$ echo "${a##*a}"
string
Section 15.6: Parameter indirection
Bash indirection permits to get the value of a variable whose name is contained in
another variable. Variables
example:
$ red="the color red"
$ green="the color green"
$ color=red
$ echo "${!color}"
the color red
$ color=green
$ echo "${!color}"
the color green
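Since Bash 4.3, a nameref (declare -n) provides a more robust alternative to ${!var} indirection; a short sketch:

```shell
red="the color red"
declare -n ref=red   # ref now refers to the variable named "red"
echo "$ref"          # the color red
ref="changed"        # assignment goes through the nameref
echo "$red"          # changed
```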
Given FILENAME="/tmp/myfile.txt", to emulate basename $FILENAME:
echo "${FILENAME##*/}"
#Out: myfile.txt
To emulate basename $FILENAME .txt and return the filename without the .txt extension:
BASENAME="${FILENAME##*/}"
echo "${BASENAME%%.txt}"
#Out: myfile
$ unset var
$ echo "${var:-XX}"
XX
$ var=""
$ echo "${var:-XX}"
XX
$ var=23
$ echo "${var:-XX}"
23
${parameter:=word}
$ unset var
$ echo "${var:=XX}"
XX
$ echo "$var"
XX
$ var=""
$ echo "${var:=XX}"
XX
$ echo "$var"
XX
$ var=23
$ echo "${var:=XX}"
23
$ echo "$var"
23
Longest match:
$ echo "${a%%a*}"
I
It's also possible to expand a variable using a default value - say I want to
invoke the user's editor, but if they've not
set one I'd like to give them vim.
$ EDITOR=nano
$ ${EDITOR:-vim} /tmp/some_file
# opens nano
$ unset EDITOR
$ ${EDITOR:-vim} /tmp/some_file
# opens vim
There are two different ways of performing this expansion, which differ in whether
the relevant variable is empty or
unset. Using :- will use the default if the variable is either unset or empty,
whilst - only uses the default if the
variable is unset, but will use the variable if it is set to the empty string:
$ a="set"
$ b=""
$ unset c
$ echo ${a:-default_a} ${b:-default_b} ${c:-default_c}
set default_b default_c
$ echo ${a-default_a} ${b-default_b} ${c-default_c}
set default_c
Noting that these expansions can be nested, using alternatives becomes particularly
useful when supplying
arguments to command line flags;
$ output_file=/tmp/foo
$ wget ${output_file:+"-o ${output_file}"} www.stackexchange.com
# expands to wget -o /tmp/foo www.stackexchange.com
$ unset output_file
$ wget ${output_file:+"-o ${output_file}"} www.stackexchange.com
# expands to wget www.stackexchange.com
To run the full example above, each of the erroring echo statements needs to be commented out to proceed.
Chapter 16: Copying (cp)
Option           Description
-a, --archive    Combines the -d, -p and -R options
-b, --backup     Make a backup of each existing destination file
-p, --preserve   Preserve mode, ownership and timestamp attributes
-R, --recursive  Copy directories recursively
-R, --recursive
When copying recursively, e.g. cp -r foo bar: if folder bar exists before issuing the command, then foo and its contents will be copied into the folder bar. However, if bar does not exist before issuing the command, then the folder bar will be created and the contents of foo will be placed into bar.
Chapter 17: Find
find is a command to recursively search a directory for files(or directories) that
match a criteria, and then perform
some action on the selected files.
find search_path selection_criteria action
To find files/directories which name begin with abc and end with one alpha character
following a one digit:
$ find . -name "abc[a-z][0-9]"
find . -type d | xargs -r chmod 770
The above command will recursively find all directories (-type d) relative to . (which is your current working directory), and execute chmod 770 on them. The -r option specifies to xargs not to run chmod if find did not find any files.
If your files names or directories have a space character in them, this command may
choke; a solution is to use the
following
find . -type d -print0 | xargs -r -0 chmod 770
In the above example, the -print0 and -0 flags specify that the file names will be
separated using a null byte, and
allows the use of special characters, like spaces, in the file names. This is a GNU
extension, and may not work in
other versions of find and xargs.
The preferred way to do this is to skip the xargs command and let find call the
subprocess itself:
find . -type d -exec chmod 770 {} \;
Here, the {} is a placeholder indicating that you want to use the file name at that
point. find will execute chmod on
each file individually.
You can alternatively pass all file names to a single call of chmod, by using
find . -type d -exec chmod 770 {} +
This is also the behaviour of the above xargs snippets. (To call on each file
individually, you can use xargs -n1).
A third option is to let bash loop over the list of filenames find outputs:
find . -type d | while read -r d; do chmod 770 "$d"; done
This is syntactically the most clunky, but convenient when you want to run multiple
commands on each found file.
However, this is unsafe in the face of odd file names.
find . -type f | while read -r d; do mv "$d" "${d// /_}"; done
which will replace all spaces in file names with underscores. (This example also won't work if there are spaces in leading directory names.)
The problem with the above is that while read -r expects one entry per line, but
file names can contain newlines
(and also, read -r will lose any trailing whitespace). You can fix this by turning
things around:
find . -type d -exec bash -c 'for f; do mv "$f" "${f// /_}"; done' _ {} +
This way, the -exec receives the file names in a form which is completely correct
and portable; the bash -c receives
them as a number of arguments, which will be found in $@, correctly quoted etc.
(The script will need to handle
these names correctly, of course; every variable which contains a file name needs to
be in double quotes.)
The mysterious _ is necessary because the first argument to bash -c 'script' is used
to populate $0.
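To see the -exec bash -c form work end to end, here is a self-contained sketch that creates files with spaces in a temporary directory and renames them. (The directory comes from mktemp, whose own path contains no spaces, so the substitution only touches the file names.)

```shell
#!/bin/bash
# Sketch: replace spaces with underscores in file names, using find -exec.
tmpdir=$(mktemp -d)
touch "$tmpdir/a b" "$tmpdir/c d e"

# The _ populates $0 of the inner bash; the found names land in "$@".
find "$tmpdir" -type f -exec bash -c 'for f; do mv "$f" "${f// /_}"; done' _ {} +

ls "$tmpdir"   # lists a_b and c_d_e
rm -r "$tmpdir"
```

Because the names travel as arguments rather than through a pipe, newlines and other odd characters in file names are handled correctly.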
To find files that have not been modified within the last 2 hours:
$ find . -mmin +120
The above example searches only on modification time; to search on access time or inode change time, use -amin or -cmin accordingly:
$ find . -amin -120
$ find . -cmin +120
General format:
-mmin n : File was modified n minutes ago
-mmin -n : File was modified less than n minutes ago
-mmin +n : File was modified more than n minutes ago
Find files that have been modified within the last 2 days:
find . -mtime -2
Find files that have not been modified within the last 2 days
find . -mtime +2
Use -atime and -ctime for access time and status change time respectively.
General format:
-mtime n : File was modified nx24 hours ago
-mtime -n : File was modified less than nx24 hours ago
-mtime +n : File was modified more than nx24 hours ago
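A quick way to experiment with these tests is to backdate a file with touch -d (a GNU coreutils extension; BSD touch spells this differently). This sketch assumes GNU find and touch:

```shell
#!/bin/bash
# Sketch (GNU coreutils assumed): backdate one file, then select it by age.
tmpdir=$(mktemp -d)
touch "$tmpdir/new"
touch -d '4 days ago' "$tmpdir/old"

# -mtime +2 selects files modified more than 2x24 hours ago.
find "$tmpdir" -type f -mtime +2   # prints only .../old
rm -r "$tmpdir"
```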
Find files accessed in a range of timestamps (using files as timestamp), from 1 hour
ago to 10 minutes ago:
touch -t $(date -d '1 HOUR AGO' +%Y%m%d%H%M.%S) start_date
touch -t $(date -d '10 MINUTE AGO' +%Y%m%d%H%M.%S) end_date
timeout 10 find "$LOCAL_FOLDER" -newerat "start_date" ! -newerat "end_date" -print
General format:
-newerXY reference : Compares the timestamp of the current file with reference. X refers to the timestamp of the file being considered and Y to that of reference; each may be a (access time), c (inode change time), or m (modification time), and Y may also be t, in which case reference is interpreted as a date string rather than a file name.
To find files larger than 15 MB:
find -type f -size +15M
Or
find -type f -size 12288c
Or
find -type f -size 24b
Or
find -type f -size 24
General format:
find [options] -size n[cwbkMG]
Find files of n-block size, where +n means more than n blocks, -n means less than n blocks, and n (without any sign) means exactly n blocks.
Block size:
1. c: bytes
2. w: 2 bytes
3. b: 512 bytes (default)
4. k: 1 KB
5. M: 1 MB
6. G: 1 GB
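The suffixes can be tried out on files of known size; truncate -s (GNU coreutils, an assumption here) creates a sparse file of any length:

```shell
#!/bin/bash
# Sketch (GNU coreutils assumed): create files of known size, match by -size.
tmpdir=$(mktemp -d)
truncate -s 1M  "$tmpdir/big.bin"    # exactly 1 MB
truncate -s 100 "$tmpdir/small.bin"  # 100 bytes

# +500k means "larger than 500 KB", so only big.bin matches.
find "$tmpdir" -type f -size +500k
rm -r "$tmpdir"
```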
To find only files within a folder called log (on any level):
find . -type f -path '*/log/*'
To find files inside folders called log or data:
find . -type f -path '*/log/*' -o -path '*/data/*'
To find all files except the ones contained in a folder called bin:
find . -type f -not -path '*/bin/*'
To find all files except those inside a folder called bin or whose path ends with log:
find . -type f -not -path '*log' -not -path '*/bin/*'
To find all files of type .txt from the current directory alone, do
find . -maxdepth 1 -type f -name "*.txt"
Chapter 18: Using sort
Option  Meaning
-u      Make each line of output unique
sort is a Unix command to order the data in one or more files in a sequence.
Reversing sort order: to reverse the order of the sort, use the -r option.
To reverse the sort order of the above file use:
sort -rn file
Rowena Ravenclaw
Lockhart
Olivander
Flitwick
Helga Hufflepuff
Tonks
Newt
Sprout
Flitwick
Lockhart
Rowena Ravenclaw
Olivander
Sprout
Tonks
Helga Hufflepuff
Newt
Now if we have to sort the file with a secondary key along with the primary key use:
sort -k 2,2 -k 1,1 Hogwarts
This will first sort the file with column 2 as primary key, and then sort the file
with column 1 as secondary key:
Hermione
Ron
Harry
Gryffindor
Ron
Goyle
Goyle
Malfoy
Slytherin
Snape
Lockhart
Flitwick
Rowena Ravenclaw
Olivander
Tonks
Sprout
Helga Hufflepuff
Newt
If we need to sort a file on more than one key, then for every -k option we must specify where the sort key ends. So -k 1,1 means: start the sort at the first column and end it at the first column.
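The two-key sort can be reproduced on inline data (the rows below are sample data, not the original Hogwarts file):

```shell
#!/bin/bash
# Sketch: sort by column 2 (primary key), then column 1 (secondary key).
printf '%s\n' 'Harry Gryffindor' 'Goyle Slytherin' 'Hermione Gryffindor' |
    sort -k 2,2 -k 1,1
# Output:
# Harry Gryffindor
# Hermione Gryffindor
# Goyle Slytherin
```

Both Gryffindor rows sort before Slytherin (column 2), and the tie is broken alphabetically on column 1.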
-t option
In the previous example the file used the default delimiter (whitespace). To sort a file that has a non-default delimiter, we need the -t option to specify the delimiter. Suppose we have the file below:
test>>cat file
5.|Gryffindor
4.|Hogwarts
2.|Harry
3.|Dumbledore
1.|The sorting hat
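A sketch of how -t applies to data like this, treating '|' as the delimiter and sorting numerically on the first field (the key choice here is an assumption for illustration):

```shell
#!/bin/bash
# Sketch: sort a '|'-delimited file numerically on field 1.
printf '%s\n' '5.|Gryffindor' '4.|Hogwarts' '2.|Harry' '3.|Dumbledore' '1.|The sorting hat' |
    sort -t '|' -k 1,1 -n
# Output:
# 1.|The sorting hat
# 2.|Harry
# 3.|Dumbledore
# 4.|Hogwarts
# 5.|Gryffindor
```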
Chapter 19: Sourcing
Section 19.1: Sourcing a file
Sourcing a file is different from execution, in that all commands are evaluated within the context of the current bash session; this means that any variables, functions, or aliases defined will persist throughout your session.
Create the file you wish to source, sourceme.sh:
#!/bin/bash
export A="hello_world"
alias sayHi="echo Hi"
sayHello() {
echo Hello
}
Henceforth, you have all the resources of the sourced file available in your shell:
$ echo $A
hello_world
$ sayHi
Hi
$ sayHello
Hello
Note that the command . is synonymous to source, such that you can simply use
$ . sourceme.sh
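A runnable variant of the example (the /tmp path is just for illustration; the alias is omitted because aliases are disabled in non-interactive shells by default):

```shell
#!/bin/bash
# Sketch: definitions from a sourced file persist in the current shell.
cat > /tmp/sourceme_demo.sh <<'EOF'
export A="hello_world"
sayHello() { echo Hello; }
EOF

. /tmp/sourceme_demo.sh   # same as: source /tmp/sourceme_demo.sh
echo "$A"                 # hello_world
sayHello                  # Hello
rm /tmp/sourceme_demo.sh
```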
A common use is activating a Python virtual environment:
source my_env/bin/activate
Chapter 20: Here documents and here
strings
Section 20.1: Execute command with here document
ssh -p 21 [email protected] <<EOF
echo 'printing pwd'
echo "\$(pwd)"
ls -a
find '*.txt'
EOF
$ is escaped because we do not want it to be expanded by the current shell, i.e. $(pwd) is to be executed on the remote shell.
Another way:
ssh -p 21 [email protected] <<'EOF'
echo 'printing pwd'
echo "$(pwd)"
ls -a
find '*.txt'
EOF
Note: The closing EOF should be at the beginning of the line (No whitespaces
before). If indentation is required,
tabs may be used if you start your heredoc with <<-. See the Indenting here
documents and Limit Strings examples
for more information.
One practical use case of this (as mentioned in man bash) is in shell scripts, for
example:
if cond; then
cat <<- EOF
hello
there
EOF
fi
It is customary to indent the lines within code blocks as in this if statement, for better readability. Without the <<- operator syntax, we would be forced to write the above code like this:
if cond; then
cat << EOF
hello
there
EOF
fi
That's very unpleasant to read, and it gets much worse in a more complex realistic
script.
cat > fruits.txt << EOF
apple
orange
lemon
EOF
The here-document is the lines between the << EOF and the closing EOF. This here document becomes the input of the cat command. The cat command simply outputs its input, and using the output redirection operator > we redirect it to the file fruits.txt.
As a result, the fruits.txt file will contain the lines:
apple
orange
lemon
The usual rules of output redirection apply: if fruits.txt did not exist before, it
will be created. If it existed before,
it will be truncated.
Section 20.5: Run several commands with sudo
sudo -s <<EOF
a='var'
echo 'Running several commands with sudo'
mktemp -d
echo "\$a"
EOF
$a needs to be escaped to prevent it from being expanded by the current shell.
Or
sudo -s <<'EOF'
a='var'
echo 'Running several commands with sudo'
mktemp -d
echo "$a"
EOF
Incorrect use:
cat <<limitstring
line 1
line 2
limitstring
Since limitstring on the last line is not exactly at the start of the line, the shell will continue to wait for further input, until it sees a line that starts with limitstring and doesn't contain anything else. Only then will it stop waiting for input, and proceed to pass the here-document to the cat command.
Note that when you prefix the initial limitstring with a hyphen, any tabs at the
start of the line are removed before
parsing, so the data and the limit string can be indented with tabs (for ease of
reading in shell scripts).
cat <<-limitstring
line 1
has a tab each before the words line and has
line 2 has two leading tabs
limitstring
will produce
line 1
has a tab each before the words line and has
line 2 has two leading tabs
with the leading tabs (but not the internal tabs) removed.
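This chapter's title also covers here strings: <<< feeds a single string (plus a trailing newline) to a command's standard input. A minimal sketch:

```shell
#!/bin/bash
# Sketch: a here string supplies the string as stdin.
tr 'a-z' 'A-Z' <<< "hello world"   # HELLO WORLD

# Equivalent to: echo "hello world" | tr 'a-z' 'A-Z'
# but without spawning a subshell for the left side of a pipe.
```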
Chapter 21: Quoting
Section 21.1: Double quotes for variable and command
substitution
Variable substitutions should only be used inside double quotes.
calculation='2 * 3'
echo "$calculation"      # prints 2 * 3
echo $calculation        # prints 2, the list of files in the current directory, and 3
echo "$(($calculation))" # prints 6
Outside of double quotes, $var takes the value of var, splits it into whitespace-
delimited parts, and interprets each
part as a glob (wildcard) pattern. Unless you want this behavior, always put $var
inside double quotes: "$var".
The same applies to command substitutions: "$(mycommand)" is the output of
mycommand, $(mycommand) is the
result of split+glob on the output.
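The split+glob behaviour is easy to observe directly; printf prints each argument it receives inside brackets here:

```shell
#!/bin/bash
# Sketch: unquoted expansion is split into words; quoted expansion is one word.
var='a  b'
printf '[%s]' $var;   echo   # [a][b]   - split on whitespace into two words
printf '[%s]' "$var"; echo   # [a  b]   - one word, inner spaces preserved
```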
echo "$var"            # good
echo "$(mycommand)"    # good
another=$var           # also works, assignment is implicitly double-quoted
make -D THING=$var     # BAD! This is not a bash assignment.
make -D THING="$var"   # good
make -D "THING=$var"   # also good
Command substitutions get their own quoting contexts. Writing arbitrarily nested
substitutions is easy because the
parser will keep track of nesting depth instead of greedily searching for the first
" character. The StackOverflow
syntax highlighter parses this wrong, however. For example:
echo "formatted text: $(printf "a + b = %04d" "${c}")"  # “formatted text: a + b = 0000”
Double quote: allows variable expansion; $, `, ", \ keep their special meaning and can be escaped with \ to prevent it.
Single quote: prevents variable expansion; all characters are literals.
$ echo "!cat"
echo "cat file"
cat file
$ echo '!cat'
!cat
echo "\"'\""
"'"
$ a='var'
$ echo '$a'
$a
$ echo "$a"
var
A backslash quotes the next character, i.e. the next character is interpreted
literally. The one exception is a newline:
backslash-newline expands to the empty string.
echo \!\"\#\$\&\'\(\)\*\;\<\=\>\?\ \ \@\[\\\]\^\`\{\|\}\~
All text between single quotes (forward quotes ', also known as apostrophe) is
printed literally. Even backslash
stands for itself, and it's impossible to include a single quote; instead, you can
stop the literal string, include a literal
single quote with a backslash, and start the literal string again. Thus the 4-character sequence '\'' effectively allows you to include a single quote in a literal string.
echo '!"#$&'\''()*;<=>?
@[\]^`{|}~'
Double quotes " delimit semi-literal strings where only the characters " \ $ and `
retain their special meaning.
These characters need a backslash before them (note that if backslash is followed
by some other character, the
backslash remains). Double quotes are mostly useful when including a variable or a
command substitution.
echo "!\"#\$&'()*;<=>?
@[\\]^\`{|}~"
For example, \[ prints \[: the backslash is retained before a character that isn't special inside double quotes.
Interactively, beware that ! triggers history expansion inside double quotes: "!oops" looks for an older command containing oops; "\!oops" doesn't do history expansion but keeps the backslash. This does not happen in scripts.
Chapter 22: Conditional Expressions
Section 22.1: File type tests
The -e conditional operator tests whether a file exists (including all file types:
directories, etc.).
if [[ -e $filename ]]; then
echo "$filename exists"
fi
For a symbolic link, apart from -L, these tests apply to the target, and return
false for a broken link.
if [[ -L $filename || -e $filename ]]; then
echo "$filename exists (but may be a broken symbolic link)"
fi
if [[ -L $filename && ! -e $filename ]]; then
echo "$filename is a broken symbolic link"
fi
If the right-hand side is not quoted, then it is a wildcard pattern that $string is matched against.
string='abc'
pattern1='a*'
pattern2='x*'
if [[ "$string" == $pattern1 ]]; then
# the test is true
echo "The string $string matches the pattern $pattern1"
fi
if [[ "$string" != $pattern2 ]]; then
# the test is false
echo "The string $string does not match the pattern $pattern2"
fi
The < and > operators compare the strings in lexicographic order (there are no
less-or-equal or greater-or-equal
operators for strings).
There are unary tests for the empty string.
if [[ -n "$string" ]]; then
echo "$string is non-empty"
fi
if [[ -z "${string// }" ]]; then
echo "$string is empty or contains only spaces"
fi
if [[ -z "$string" ]]; then
echo "$string is empty"
fi
Above, the -z check may mean $string is unset, or it is set to an empty string. To
distinguish between empty and
unset, use:
if [[ -n "${string+x}" ]]; then
echo "$string is set, possibly to the empty string"
fi
if [[ -n "${string-x}" ]]; then
echo "$string is either unset or set to a non-empty string"
fi
if [[ -z "${string+x}" ]]; then
echo "$string is unset"
fi
if [[ -z "${string-x}" ]]; then
echo "$string is set to an empty string"
fi
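These parameter-expansion tests can be checked quickly in a script:

```shell
#!/bin/bash
# Sketch: ${var+x} expands to x only if var is set, even when set to "".
unset v
[[ -z "${v+x}" ]] && echo "v is unset"             # printed
v=''
[[ -n "${v+x}" && -z "$v" ]] && echo "v is empty"  # printed
```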
case $string in
  *[[:blank:]]*) echo "The string contains blank characters" ;;
  *) echo "The string contains no blank characters" ;;
esac
Where [:blank:] is the locale-specific set of horizontal spacing characters (tab, space, etc.).
“Same file” means that modifying one of the files in place affects the other. Two files
can be the same even if they
have different names, for example if they are hard links, or if they are symbolic
links with the same target, or if one
is a symbolic link pointing to the other.
If two files have the same content, but they are distinct files (so that modifying
one does not affect the other), then
-ef reports them as different. If you want to compare two files byte by byte, use the
cmp utility.
if cmp -s -- "$file1" "$file2"; then
echo "$file1 and $file2 have identical contents"
else
echo "$file1 and $file2 differ"
fi
To produce a human-readable list of differences between text files, use the diff
utility.
if diff -u "$file1" "$file2"; then
echo "$file1 and $file2 have identical contents"
else
: # the differences between the files have been listed
fi
These tests take permissions and ownership into account to determine whether the
script (or programs launched
from the script) can access the file.
Beware of race conditions (TOCTOU): just because the test succeeds now doesn't mean
that it's still valid on the
next line. It's usually better to try to access a file, and handle the error, rather
than test first and then have to
handle the error anyway in case the file has changed in the meantime.
Note that the < and > operators inside [[ … ]] compare strings, not numbers.
if [[ 9 -lt 10 ]]; then
echo "9 is before 10 in numeric order"
fi
if [[ 9 > 10 ]]; then
echo "9 is after 10 in lexicographic order"
fi
The two sides must be numbers written in decimal (or in octal with a leading zero).
Alternatively, use the ((…))
arithmetic expression syntax, which performs integer calculations in a C/Java/…-
like syntax.
x=2
if ((2*x == 4)); then
echo "2 times 2 is 4"
fi
((x += 1))
echo "2 plus 1 is $x"
Chapter 23: Scripting with Parameters
Section 23.1: Multiple Parameter Parsing
To parse lots of parameters, the preferred way of doing this is using a while loop,
a case statement, and shift.
shift is used to pop the first parameter in the series, making what used to be $2,
now be $1. This is useful for
Opt   Alt. Opt     Details
-h    --help       Show help
-v    --version    Show version
-dr   --doc-root   An option which takes a secondary parameter (a path)
-i    --install    Install
-*    --           Invalid option
#!/bin/bash
dr=''
install=false
skip=false
for op in "$@";do
if $skip;then skip=false;continue;fi
case "$op" in
-v|--version)
echo "$ver_info"
shift
exit 0
;;
-h|--help)
echo "$help"
shift
exit 0
;;
-dr|--doc-root)
shift
if [[ "$1" != "" ]]; then
dr="${1/%\//}"
shift
skip=true
else
echo "E: Arg missing for -dr option"
exit 1
fi
;;
-i|--install)
install=true
shift
;;
-*)
echo "E: Invalid option: $1"
shift
exit 1
;;
esac
done
For example, the actual egrep in newer GNU/Linux systems is replaced by a wrapper script named egrep. This is how it looks:
#!/bin/sh
exec grep -E "$@"
So, when you run egrep in such systems, you are actually running grep -E with all
the arguments forwarded.
In the general case, if you want to wrap an example script/command exmp with another script mexmp, then the wrapper script mexmp will look like:
#!/bin/sh
exmp "$@" # Add other options before "$@"
# or
#full/path/to/exmp "$@"
$*: Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, it expands to a single word with the value of each parameter separated by the first character of the IFS special variable.
$@: Expands to the positional parameters, starting from one. When the expansion occurs within double quotes, each parameter expands to a separate word.
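The difference shows up as soon as parameters contain spaces (the function name demo is arbitrary):

```shell
#!/bin/bash
# Sketch: "$*" joins with the first character of IFS; "$@" keeps words separate.
demo() {
    local IFS=','
    echo "$*"                    # one word:  a,b c
    printf '[%s]' "$@"; echo     # two words: [a][b c]
}
demo a "b c"
```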
Example 1
Loop through all arguments and check if they are files:
for item in "$@"
do
if [[ -f $item ]]; then
echo "$item is a file"
fi
done
Example 2
Loop through all arguments and check if they are files:
for (( i = 1; i <= $#; ++ i ))
do
item=${@:$i:1}
if [[ -f $item ]]; then
echo "$item is a file"
fi
done
Here, IFS is a special variable called the Internal Field Separator, which defines the character or characters used to separate a pattern into tokens for some operations.
To access an individual element:
echo "${array[0]}"
Chapter 24: Bash history substitutions
Section 24.1: Quick Reference
Interaction with the history
# List all previous commands
history
# Clear the history, useful if you entered a password by accident
history -c
Event designators
# Expands to line n of bash history
!n
# Expands to last command
!!
# Expands to last command starting with "text"
!text
# Expands to last command containing "text"
!?text
# Expands to command n lines ago
!-n
# Expands to last command with first occurrence of "foo" replaced by "bar"
^foo^bar^
# Expands to the current command
!#
Word designators
These are separated by : from the event designator they refer to. The colon can be
omitted if the word designator
doesn't start with a number: !^ is the same as !:^.
# Expands to the first argument of the most recent command
!^
# Expands to the last argument of the most recent command (short for !!:$)
!$
# Expands to the third argument of the most recent command
!:3
# Expands to arguments x through y (inclusive) of the last command
# x and y can be numbers or the anchor characters ^ $
!:x-y
# Expands to all words of the last command except the 0th
# Equivalent to :^-$
!*
Modifiers
These modify the preceding event or word designator.
# Replacement in the expansion using sed syntax
# Allows flags before the s and alternate separators
:s/foo/bar/ #substitutes bar for first occurrence of foo
:gs|foo|bar| #substitutes bar for all foo
# Remove leading path from last argument ("tail")
:t
# Remove trailing path from last argument ("head")
:h
# Remove file extension from last argument
:r
For example, if you recently executed man 5 crontab, you can find it quickly by pressing Ctrl+R and starting to type "crontab". The prompt will change like this:
(reverse-i-search)`cr': man 5 crontab
The `cr' there is the string I typed so far. This is an incremental search, so as
you continue typing, the search result
gets updated to match the most recent command that contained the pattern.
Press the left or right arrow keys to edit the matched command before running it,
or the enter key to run the
command.
By default the search finds the most recently executed command matching the pattern. To go further back in the history, press Ctrl+R again. You may press it repeatedly until you find the desired command.
This will substitute the Nth argument of the current command. In the example !#:1
is replaced with the first
argument, i.e. backup_download_directory.
Notice that in the last example we did not get ping pong, a great game, because the last argument passed to the previous command was pong; we can avoid issues like this by adding quotes. Continuing with the example, our last argument was game:
$ echo "it is !$ time"
it is game time
$ echo "hooray, !$!"
hooray, it is game time!
This command will replace 1 with 2 in the previously executed command. It will only
replace the first occurrence of
the string and is equivalent to !!:s/1/2/.
If you want to replace all occurrences, you have to use !!:gs/1/2/ or !!:as/1/2/.
Chapter 25: Math
Section 25.1: Math using dc
dc is one of the oldest programs on Unix.
It uses reverse Polish notation, which means that you first stack numbers, then operations. For example 1+1 is written as 1 1+.
To print an element from the top of the stack use command p
echo '2 3 + p' | dc
5
or
dc <<< '2 3 + p'
5
You can also use capital letters from A to F for numbers between 10 and 15 and . as
a decimal point
dc <<< 'A.4 p'
10.4
dc uses arbitrary precision, which means that the precision is limited only by the available memory. By default the precision is 0 decimal places; it can be changed with the k command.
Division:
echo $((5 / 2))
2
Modulo:
echo $((5 % 2))
1
Exponentiation:
echo $((5 ** 2))
25
echo '10 == 10 && 8 > 3' | bc
1
Basic arithmetic
expr 2 + 3
5
Chapter 26: Bash Arithmetic
Parameter    Details
EXPRESSION   Expression to evaluate
echo $((1 + 2))
Output: 3
#!/bin/bash
# Using variables
var1=4
var2=5
((output=$var1 * $var2))
printf "%d\n" "$output"
Output: 20
You need quotes if there are spaces or globbing characters. So these will get errors:
let num = 1 + 2      #wrong
let 'num = 1 + 2'    #right
let a[1] = 1 + 1     #wrong
let 'a[1] = 1 + 1'   #right
(( ))
((a=$a+1))     #add 1 to a
((a = a + 1))  #like above
((a += 1))     #like above
Section 26.3: Simple arithmetic with expr
#!/bin/bash
expr 1 + 2
Output: 3
Chapter 27: Scoping
Section 27.1: Dynamic scoping in action
Dynamic scoping means that variable lookups occur in the scope where a function is
called, not where it is defined.
$ x=3
$ func1 () { echo "in func1: $x"; }
$ func2 () { local x=9; func1; }
$ func2
in func1: 9
$ func1
in func1: 3
In a lexically scoped language, func1 would always look in the global scope for the
value of x, because func1 is
defined in the local scope.
In a dynamically scoped language, func1 looks in the scope where it is called. When
it is called from within func2, it
first looks in the body of func2 for a value of x. If it weren't defined there, it
would look in the global scope, where
func2 was called from.
Chapter 28: Process substitution
Section 28.1: Compare two files from the web
The following compares two files with diff using process substitution instead of
creating temporary files.
diff <(curl http://www.example.com/page1) <(curl http://www.example.com/page2)
A command like
cat header.txt body.txt > body.txt
doesn’t do what you want. By the time cat reads body.txt, it has already been truncated by the redirection and is empty. The final result is that body.txt will hold the contents of header.txt only.
One might think to avoid this with process substitution, that is, that the command
$ cat header.txt <(cat body.txt) > body.txt
will force the original contents of body.txt to be somehow saved in some buffer
somewhere before the file is
truncated by the redirection. It doesn’t work. The cat in parentheses begins
reading the file only after all file
descriptors have been set up, just like the outer one. There is no point in trying
to use process substitution in this
case.
The only way to prepend a file to another file is to create an intermediate one:
$ cat header.txt body.txt >body.txt.new
$ mv body.txt.new body.txt
which is what sed or perl or similar programs do under the hood when called with an edit-in-place option (usually -i).
Normally tee writes its input to one or more files (and stdout). We can write to commands instead of files with tee >(command).
tee >(wc -l >&2) < bigfile | gzip > bigfile.gz
Here the command wc -l >&2 counts the lines read from tee (which in turn is reading from bigfile). (The line count is sent to stderr (>&2) to avoid mixing it with the input to gzip.) The stdout of tee is simultaneously fed into gzip.
The constructs below are run in a sub-shell context, and the scope of the variables used within them is lost after the sub-shell terminates:
command &
command | command
( command )
Process substitution will solve the problem by avoiding the use of the pipe | operator, as in:
count=0
while IFS= read -r _; do
((count++))
done < <(find . -maxdepth 1 -type f -print)
This will retain the count variable value as no sub-shells are invoked.
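A self-contained check of the technique, using three files in a throwaway directory:

```shell
#!/bin/bash
# Sketch: the while loop runs in the current shell, so count survives it.
tmpdir=$(mktemp -d)
touch "$tmpdir"/{a,b,c}

count=0
while IFS= read -r _; do
    ((count++))
done < <(find "$tmpdir" -maxdepth 1 -type f -print)

echo "$count"   # 3
rm -r "$tmpdir"
```

Had the loop been written as find ... | while ..., count would still be 0 afterwards, because the loop body would run in a subshell.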
Chapter 29: Programmable completion
Section 29.1: Simple completion using function
_mycompletion() {
local command_name="$1" # not used in this example
local current_word="$2"
local previous_word="$3" # not used in this example
# COMPREPLY is an array which has to be filled with the possible completions
# compgen is used to filter matching completions
COMPREPLY=( $(compgen -W 'hello world' -- "$current_word") )
}
complete -F _mycompletion mycommand
Usage Example:
$ mycommand [TAB][TAB]
hello world
$ mycommand h[TAB][TAB]
$ mycommand hello
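compgen can be exercised on its own, outside of a completion function, which makes completion logic easy to debug:

```shell
#!/bin/bash
# Sketch: compgen -W filters the word list against the given prefix.
compgen -W 'hello world' -- h    # prints: hello
compgen -W 'hello world' -- ''   # prints: hello and world, one per line
```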
Chapter 30: Customizing PS1
Section 30.1: Colorize and customize terminal prompt
This is how the author sets their personal PS1 variable:
gitPS1(){
gitps1=$(git branch 2>/dev/null | grep '*')
gitps1="${gitps1:+ (${gitps1/#\* /})}"
echo "$gitps1"
}
#Please use the below function if you are a mac user
gitPS1ForMac(){
git branch 2> /dev/null | sed -e '/^[^*]/d' -e 's/* \(.*\)/ (\1)/'
}
timeNow(){
echo "$(date +%r)"
}
if [ "$color_prompt" = yes ]; then
if [ x$EUID = x0 ]; then
PS1='\[\033[1;38m\][$(timeNow)]\[\033[00m\]
\[\033[1;31m\]\u\[\033[00m\]\[\033[1;37m\]@\[\033[00m\]\[\033[1;33m\]\h\[\033[00m\]
\[\033[1;34m\]\w\[\033[00m\]\[\033[1;36m\]$(gitPS1)\[\033[00m\] \[\033[1;31m\]:/#\
[\033[00m\] '
else
PS1='\[\033[1;38m\][$(timeNow)]\[\033[00m\]
\[\033[1;32m\]\u\[\033[00m\]\[\033[1;37m\]@\[\033[00m\]\[\033[1;33m\]\h\[\033[00m\]
\[\033[1;34m\]\w\[\033[00m\]\[\033[1;36m\]$(gitPS1)\[\033[00m\] \[\033[1;32m\]:/$\
[\033[00m\] '
fi
else
PS1='[$(timeNow)] \u@\h \w$(gitPS1) :/$ '
fi
Color reference:
# Colors
txtblk='\e[0;30m' # Black - Regular
txtred='\e[0;31m' # Red
txtgrn='\e[0;32m' # Green
txtylw='\e[0;33m' # Yellow
txtblu='\e[0;34m' # Blue
txtpur='\e[0;35m' # Purple
txtcyn='\e[0;36m' # Cyan
txtwht='\e[0;37m' # White
bldblk='\e[1;30m' # Black - Bold
bldred='\e[1;31m' # Red
bldgrn='\e[1;32m' # Green
bldylw='\e[1;33m' # Yellow
bldblu='\e[1;34m' # Blue
bldpur='\e[1;35m' # Purple
bldcyn='\e[1;36m' # Cyan
bldwht='\e[1;37m' # White
undblk='\e[4;30m' # Black - Underline
undred='\e[4;31m' # Red
undgrn='\e[4;32m' # Green
undylw='\e[4;33m' # Yellow
undblu='\e[4;34m' # Blue
undpur='\e[4;35m' # Purple
undcyn='\e[4;36m' # Cyan
undwht='\e[4;37m' # White
bakblk='\e[40m' # Black - Background
bakred='\e[41m' # Red
bakgrn='\e[42m' # Green
bakylw='\e[43m' # Yellow
bakblu='\e[44m' # Blue
bakpur='\e[45m' # Purple
bakcyn='\e[46m' # Cyan
bakwht='\e[47m' # White
txtrst='\e[0m'  # Text Reset
Notes:
Make the changes in ~/.bashrc, /etc/bashrc, ~/.bash_profile or ~/.profile (depending on the OS) and save the file.
For root you might also need to edit the /etc/bash.bashrc or /root/.bashrc file.
Run source ~/.bashrc (distro specific) after saving the file.
Note: if you have saved the changes in ~/.bashrc, then remember to add source ~/.bashrc in your ~/.bash_profile so that this change in PS1 is applied every time the terminal application starts.
A simpler, colorless variant of the same prompt:
PS1='[$(timeNow)] \u@\h:\w$ '
Escape  Action
\a      an ASCII bell character (07)
\d      the date in “Weekday Month Date” format (e.g., “Tue May 26”)
\D{format}  the format is passed to strftime(3) and the result is inserted into the prompt string; an empty format results in a locale-specific time representation; the braces are required
\e      an ASCII escape character (033)
\h      the hostname up to the first ‘.’
\H      the hostname
\j      the number of jobs currently managed by the shell
\l      the basename of the shell's terminal device name
\n      newline
\r      carriage return
\s      the name of the shell, the basename of $0 (the portion following the final slash)
\t      the current time in 24-hour HH:MM:SS format
\T      the current time in 12-hour HH:MM:SS format
\@      the current time in 12-hour am/pm format
\A      the current time in 24-hour HH:MM format
\u      the username of the current user
\v      the version of bash (e.g., 2.00)
\V      the release of bash, version + patch level (e.g., 2.00.0)
\w      the current working directory, with $HOME abbreviated with a tilde
\W      the basename of the current working directory, with $HOME abbreviated with a tilde
\!      the history number of this command
\#      the command number of this command
\$      if the effective UID is 0, a #, otherwise a $
\nnn    the character corresponding to the octal number nnn
\\      a backslash
\[      begin a sequence of non-printing characters, which can be used to embed a terminal control sequence into the prompt
\]      end a sequence of non-printing characters
Color_Off="\033[0m"
Red="\033[0;31m"
Green="\033[0;32m"
Yellow="\033[0;33m"
function __stat() {
if [ $? -eq 0 ]; then
echo -en "$Green ✔ $Color_Off "
else
echo -en "$Red ✘ $Color_Off "
fi
}
PS1='$(__stat)'
PS1+="[\t] "
PS1+="\e[0;33m\u@\h\e[0m:\e[1;34m\w\e[0m \n$ "
export PS1
Chapter 31: Brace Expansion
Section 31.1: Modifying filename extension
$ mv filename.{jar,zip}
$ mkdir 20{09..11}-{01..12}
Entering the ls command will show that the following directories were created:
2009-01 2009-04 2009-07 2009-10 2010-01 2010-04 2010-07 2010-10 2011-01 2011-04
2011-07 2011-10
2009-02 2009-05 2009-08 2009-11 2010-02 2010-05 2010-08 2010-11 2011-02 2011-05
2011-08 2011-11
2009-03 2009-06 2009-09 2009-12 2010-03 2010-06 2010-09 2010-12 2011-03 2011-06
2011-09 2011-12
Putting a 0 in front of 9 in the example ensures the numbers are padded with a
single 0. You can also pad numbers
with multiple zeros, for example:
$ echo {001..10}
001 002 003 004 005 006 007 008 009 010
# reversed alphabet
$ echo {z..a}
z y x w v u t s r q p o n m l k j i h g f e d c b a
# digits
$ echo {1..20}
1 2 3 4 5 6 7 8 9 10 11 12 13 14 15 16 17 18 19 20
# with leading zeros
$ echo {01..20}
01 02 03 04 05 06 07 08 09 10 11 12 13 14 15 16 17 18 19 20
# reverse digit
$ echo {20..1}
20 19 18 17 16 15 14 13 12 11 10 9 8 7 6 5 4 3 2 1
# reversed with leading zeros
$ echo {20..01}
20 19 18 17 16 15 14 13 12 11 10 09 08 07 06 05 04 03 02 01
# combining multiple braces
$ echo {a..d}{1..3}
a1 a2 a3 b1 b2 b3 c1 c2 c3 d1 d2 d3
Brace expansion is the very first expansion that takes place, so it cannot be combined with any other expansions. Only characters and digits can be used.
This won't work: echo {$(date +%H)..24}
$ mkdir -p toplevel/sublevel_{01..09}/child{1..3}
This will create a top level folder called toplevel, nine folders inside of toplevel named sublevel_01, sublevel_02, etc. Then inside of those sublevels: child1, child2, child3 folders, giving you:
toplevel/sublevel_01/child1
toplevel/sublevel_01/child2
toplevel/sublevel_01/child3
toplevel/sublevel_02/child1
and so on. I find this very useful for creating multiple folders and sub folders for
my specific purposes, with one
bash command. Substitute variables to help automate/parse information given to the
script.
Chapter 32: getopts : smart positional-parameter parsing
Parameter  Detail
optstring  The option characters to be recognized
name       The variable name where the parsed option is stored
Output
$ ./pingnmap -nt -i google.com -p 80
Starting Nmap 6.40 ( http://nmap.org ) at 2016-07-23 14:31 IST
Nmap scan report for google.com (216.58.197.78)
Host is up (0.034s latency).
rDNS record for 216.58.197.78: maa03s21-in-f14.1e100.net
PORT
STATE SERVICE
80/tcp open http
Nmap done: 1 IP address (1 host up) scanned in 0.22 seconds
PING google.com (216.58.197.78) 56(84) bytes of data.
64 bytes from maa03s21-in-f14.1e100.net (216.58.197.78): icmp_seq=1 ttl=57
time=29.3 ms
64 bytes from maa03s21-in-f14.1e100.net (216.58.197.78): icmp_seq=2 ttl=57
time=30.9 ms
64 bytes from maa03s21-in-f14.1e100.net (216.58.197.78): icmp_seq=3 ttl=57
time=34.7 ms
64 bytes from maa03s21-in-f14.1e100.net (216.58.197.78): icmp_seq=4 ttl=57
time=39.6 ms
64 bytes from maa03s21-in-f14.1e100.net (216.58.197.78): icmp_seq=5 ttl=57
time=32.7 ms
--- google.com ping statistics ---
5 packets transmitted, 5 received, 0% packet loss, time 4007ms
rtt min/avg/max/mdev = 29.342/33.481/39.631/3.576 ms
$ ./pingnmap -v
pingnmap version 1.0.0
$ ./pingnmap -h
Invalid option ?
Usage :
pingmap -[n|t[i|p]|v]
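A minimal getopts loop, stripped to its essentials (the options -a and -b here are placeholders for illustration, not the pingnmap options):

```shell
#!/bin/bash
# Sketch: getopts parses short options; a letter followed by ':' takes an argument.
parse() {
    local opt aflag='' barg='' OPTIND=1
    while getopts 'ab:' opt; do
        case $opt in
            a) aflag=1 ;;
            b) barg=$OPTARG ;;          # the argument lands in OPTARG
            *) echo "usage: parse [-a] [-b arg]" >&2; return 1 ;;
        esac
    done
    echo "aflag=$aflag barg=$barg"
}

parse -a -b hello   # aflag=1 barg=hello
```

Resetting OPTIND inside the function lets parse be called more than once per shell session.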
Chapter 33: Debugging
Section 33.1: Checking the syntax of a script with "-n"
The -n flag enables you to check the syntax of a script without having to execute
it:
~> $ bash -n testscript.sh
testscript.sh: line 128: unexpected EOF while looking for matching `"'
testscript.sh: line 130: syntax error: unexpected end of file
You can install the bashdb debugger with your package manager, or get it from its homepage. Then you can run it with your script as a parameter:
bashdb <YOUR SCRIPT>
$ bash -x myscript.sh
Or
$ bash --debug myscript.sh
Turn on debugging within a bash script using set -x. It may optionally be turned back off with set +x; debug output also stops automatically when the script exits.
#!/bin/bash
set -x
# Enable debugging
# some code here
set +x
# Disable debugging output.
Chapter 34: Pattern matching and regular
expressions
Section 34.1: Get captured groups from a regex match
against a string
a='I am a simple string with digits 1234'
pat='(.*) ([0-9]+)'
[[ "$a" =~ $pat ]]
echo "${BASH_REMATCH[0]}"
echo "${BASH_REMATCH[1]}"
echo "${BASH_REMATCH[2]}"
Output:
I am a simple string with digits 1234
I am a simple string with digits
1234
In case the glob does not match anything, the result is determined by the options nullglob and failglob. If neither of them is set, Bash will return the glob itself if nothing is matched:
$ echo no*match
no*match
Notice that the failglob option supersedes the nullglob option, i.e., if nullglob and failglob are both set, then in case of no match an error is returned.
Output:
I am a string with some digits 1024
1024
Explanation
The [[ $s =~ $pat ]] construct performs the regex matching
The captured groups, i.e. the match results, are available in an array named BASH_REMATCH
The 0th index in the BASH_REMATCH array is the total match
The i'th index in the BASH_REMATCH array is the i'th captured group, where i = 1, 2, 3 ...
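As a further sketch, captured groups can pull apart a key=value pair (the string, pattern, and names here are illustrative, not from the original examples):

```shell
#!/bin/bash
s='timeout=30'
pat='^([a-z]+)=([0-9]+)$'        # group 1: the key, group 2: the numeric value
if [[ $s =~ $pat ]]; then
    echo "key:   ${BASH_REMATCH[1]}"
    echo "value: ${BASH_REMATCH[2]}"
fi
```

Note that the pattern is kept in an unquoted variable: quoting $pat inside [[ ... =~ ... ]] would make Bash treat it as a literal string instead of a regex.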
The asterisk * is probably the most commonly used glob. It simply matches any string.
$ echo *acy
macy stacy tracy
A single * will not match files and folders that reside in subfolders
$ echo *
emptyfolder folder macy stacy tracy
$ echo folder/*
folder/anotherfolder folder/subfolder
Preparation
$ mkdir globbing
$ cd globbing
$ mkdir -p folder/{sub,another}folder/content/deepfolder/
touch macy stacy tracy "file with space"
folder/{sub,another}folder/content/deepfolder/file
.hiddenfile
$ shopt -u nullglob
$ shopt -u failglob
$ shopt -u dotglob
$ shopt -u nocaseglob
$ shopt -u extglob
$ shopt -s globstar
Bash is able to interpret two adjacent asterisks as a single glob. With the
globstar option activated this can be used
to match folders that reside deeper in the directory structure
echo **
emptyfolder folder folder/anotherfolder folder/anotherfolder/content
folder/anotherfolder/content/deepfolder
folder/anotherfolder/content/deepfolder/file
folder/subfolder folder/subfolder/content folder/subfolder/content/deepfolder
folder/subfolder/content/deepfolder/file macy stacy tracy
The ** can be thought of as a path expansion, no matter how deep the path is. This example matches any file or folder that starts with deep, regardless of how deep it is nested:
$ echo **/deep*
folder/anotherfolder/content/deepfolder folder/subfolder/content/deepfolder
$ shopt -u nocaseglob
$ shopt -u extglob
$ shopt -u globstar
If there is a need to match specific characters then '[]' can be used. Any character
inside '[]' will be matched exactly
once.
$ echo [m]acy
macy
$ echo [st][tr]acy
stacy tracy
The [] glob, however, is more versatile than just that. It also allows for a
negative match and even matching ranges
of characters and character classes. A negative match is achieved by using ! or ^
as the first character following [.
We can match stacy by
$ echo [!t][^r]acy
stacy
Here we are telling Bash that we want to match only files which do not start with a t, whose second letter is not an r, and which end in acy.
Ranges can be matched by separating a pair of characters with a hyphen (-). Any character that falls between those two enclosing characters, inclusive, will be matched. E.g., [r-t] is equivalent to [rst]:
$ echo [r-t][r-t]acy
stacy tracy
Character classes can be matched by [:class:], e.g., in order to match files that
contain a whitespace
$ echo *[[:blank:]]*
file with space
Section 34.9: Matching hidden files
Preparation
$ mkdir globbing
$ cd globbing
$ mkdir -p folder/{sub,another}folder/content/deepfolder/
touch macy stacy tracy "file with space"
folder/{sub,another}folder/content/deepfolder/file
.hiddenfile
$ shopt -u nullglob
$ shopt -u failglob
$ shopt -u dotglob
$ shopt -u nocaseglob
$ shopt -u extglob
$ shopt -u globstar
The Bash built-in option dotglob allows matching hidden files and folders, i.e., files and folders that start with a .
$ shopt -s dotglob
$ echo *
file with space folder .hiddenfile macy stacy tracy
Setting the option nocaseglob will match the glob in a case insensitive manner
$ echo M*
M*
$ shopt -s nocaseglob
$ echo M*
macy
Preparation
$ mkdir globbing
$ cd globbing
$ mkdir -p folder/{sub,another}folder/content/deepfolder/
touch macy stacy tracy "file with space"
folder/{sub,another}folder/content/deepfolder/file
.hiddenfile
$ shopt -u nullglob
$ shopt -u failglob
$ shopt -u dotglob
$ shopt -u nocaseglob
$ shopt -u extglob
$ shopt -u globstar
The pattern-list itself can be another, nested extended glob. In the above example we have seen that we can match tracy and stacy with *([r-t]). This extended glob itself can be used inside the negated extended glob !(pattern-list) in order to match macy:
$ echo !(*([r-t]))acy
macy
It matches anything that does not start with zero or more occurrences of the
letters r, s and t, which leaves only
macy as possible match.
Chapter 35: Change shell
Section 35.1: Find the current shell
There are a few ways to determine the current shell
echo $0
ps -p $$
echo $SHELL
Example:
$ cat /etc/shells
# /etc/shells: valid login shells
/bin/sh
/bin/dash
/bin/bash
/bin/rbash
To change the shell that opens on startup, edit ~/.profile and add those lines.
Chapter 36: Internal variables
An overview of Bash's internal variables, where, how, and when to use them.
$* / $@
Details
Function/script positional parameters (arguments). Expand as follows:
$* and $@ are the same as $1 $2 ... (note that it generally makes no sense to leave
those
unquoted)
"$*" is the same as "$1 $2 ..." 1
"$@" is the same as "$1" "$2" ...
1. Arguments are separated by the first character of $IFS, which does not have to be
a space.
$#
$!
Process ID of the last (right-most for pipelines) command in the most recent job put into the background (note that it's not necessarily the same as the job's process group ID when job control is enabled)
$$
$?
$n
${n}
$0
In scripts, the path with which the script was invoked; with bash -c 'printf "%s\n" "$0"' name args: name (the first argument after the inline script); otherwise, the argv[0] that bash received.
$_
$IFS
Internal field separator
$PATH
$OLDPWD
$PWD
$FUNCNAME
$BASH_SOURCE
Array containing source paths for elements in FUNCNAME array. Can be used to get
the script path.
$PS1
$PS2
$PS3
$PS4
Quaternary command line prompt (used to prefix each line of tracing output)
$RANDOM
Each reference to this variable yields a pseudo-random integer between 0 and 32767; assigning it a value seeds the random number generator
$REPLY
Variable used by read by default when no variable is specified. Also used by select to return the user-supplied value
$PIPESTATUS
Array variable that holds the exit status values of each command in the most
recently executed
foreground pipeline.
Variable assignment must have no space before or after the equal sign: a=123, not a = 123. The latter (an equal sign surrounded by spaces) in isolation means run the command a with the arguments = and 123, though an equal sign surrounded by spaces is also seen in the string comparison operator (which syntactically is an argument to [ or [[ or whichever test you are using).
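A short sketch contrasting the two meanings of the equal sign:

```shell
#!/bin/bash
a=123                        # assignment: no spaces around =
echo "$a"                    # 123
if [ "$a" = 123 ]; then      # comparison operator: spaces around = are required
    echo "matched"
fi
```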
Section 36.2: $@
"$@" expands to all of the command line arguments as separate words. It is different
from "$*", which expands to
Consider we are in a script that we invoked with two arguments, like so:
$ ./script.sh "␣1␣2␣" "␣3␣␣4␣"
The variables $* or $@ will expand into $1␣$2, which in turn expand into 1␣2␣3␣4 so
the loop below:
for var in $*; do # same for var in $@; do
echo \<"$var"\>
done
While "$*" will be expanded into "$1␣$2" which will in turn expand into
"␣1␣2␣␣␣3␣␣4␣" and so the loop:
for var in "$*"; do
echo \\<"$var"\\>
done
<␣1␣2␣␣␣3␣␣4␣>
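The character that "$*" joins with is the first character of $IFS, which can be observed directly; a minimal sketch (the parameter values are illustrative):

```shell
#!/bin/bash
set -- alpha beta gamma    # set the positional parameters
IFS=','                    # "$*" joins with the FIRST character of IFS
joined="$*"
echo "$joined"             # alpha,beta,gamma
```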
And finally "$@" will expand into "$1" "$2", which will expand into "␣1␣2␣" "␣3␣␣4␣"
and so the loop
for var in "$@"; do
echo \\<"$var"\\>
done
will print
<␣1␣2␣>
<␣3␣␣4␣>
thereby preserving both the internal spacing in the arguments and the arguments
separation. Note that the
construction for var in "$@"; do ... is so common and idiomatic that it is the
default for a for loop and can be
shortened to for var; do ....
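A sketch of the shortened form; the function name count_args is illustrative:

```shell
#!/bin/bash
count_args() {
    local n=0
    for arg; do              # equivalent to: for arg in "$@"; do
        n=$((n + 1))
        echo "arg $n: <$arg>"
    done
}
count_args " 1 2 " " 3  4 "
```

The internal spacing of each argument survives, just as with the explicit "$@" loop above.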
Section 36.3: $#
To get the number of command line arguments or positional parameters - type:
#!/bin/bash
echo "$#"
When run with three arguments the example above will result with the output:
~> $ ./testscript.sh firstarg secondarg thirdarg
3
my_function() {
echo "This function is $FUNCNAME"
}
my_function
This instruction will return nothing if you type it outside the function.
done
Notes:
This is responsible for the phenomenon known as word splitting.
If the number of a positional argument is greater than nine, curly braces must be used.
# "set --" sets the positional parameters
set -- 1 2 3 4 5 6 7 8 nine ten eleven twelve
# the following line will output 10, not ten: the value of $1 (the digit 1)
# will be concatenated with the following literal 0
echo $10 # outputs 10
echo ${10} # outputs ten
# to show this clearly:
set -- arg{1..12}
echo $10
echo ${10}
Section 36.11: $*
Will return all of the positional parameters in a single string.
testscript.sh:
#!/bin/bash
echo "$*"
Output:
firstarg secondarg thirdarg
Section 36.12: $!
The Process ID (pid) of the last job run in the background:
~> $ ls &
testfile1 testfile2
[1]+ Done                  ls
~> $ echo $!
21715
Section 36.13: $?
The exit status of the last executed function or command. Usually 0 means OK; anything else indicates a failure:
~> $ ls *.blah;echo $?
ls: cannot access *.blah: No such file or directory
2
~> $ ls;echo $?
testfile1 testfile2
0
Section 36.14: $$
The Process ID (pid) of the current process:
~> $ echo $$
13246
Assigning a value to this variable seeds the random number generator.
~> $ echo $RANDOM
27119
~> $ echo $RANDOM
1349
Section 36.22: $HOSTTYPE
This variable identifies the hardware, it can be useful in determining which
binaries to execute:
~> $ echo $HOSTTYPE
x86_64
So, for example, given the above $PATH, if you type lss at the prompt, the shell will look for /usr/kerberos/bin/lss, then /usr/local/bin/lss, then /bin/lss, then /usr/bin/lss, in this order, before reporting that the command was not found.
Section 36.28: $SHELLOPTS
A read-only list of the options bash was supplied on startup to control its behaviour:
~> $ echo $SHELLOPTS
braceexpand:emacs:hashall:histexpand:history:interactive-comments:monitor
Section 36.29: $_
Outputs the last field from the last command executed, useful to get something to
pass onwards to another
command:
~> $ ls *.sh;echo $_
testscript1.sh testscript2.sh
testscript2.sh
Output:
~> $ ./test.sh # running test.sh
./test.sh
In a new terminal window, executing the following command will produce different
results based on the Linux
distribution in use.
echo $SHLVL
Using Fedora 25, the output is "3". This indicates, that when opening a new shell,
an initial bash command executes
and performs a task. The initial bash command executes a child process (another
bash command) which, in turn,
executes a final bash command to open the new shell. When the new shell opens, it is
running as a child process of
2 other shell processes, hence the output of "3".
In the following example (given the user is running Fedora 25), the output of
$SHLVL in a new shell will be set to "3".
As each bash command is executed, $SHLVL increments by one.
~> $ echo $SHLVL
3
~> $ bash
~> $ echo $SHLVL
4
~> $ bash
~> $ echo $SHLVL
5
One can see that executing the 'bash' command (or executing a bash script) opens a
new shell. In comparison,
sourcing a script runs the code in the current shell.
test1.sh
#!/usr/bin/env bash
echo "Hello from test1.sh. My shell level is $SHLVL"
source "test2.sh"
test2.sh
#!/usr/bin/env bash
echo "Hello from test2.sh. My shell level is $SHLVL"
run.sh
#!/usr/bin/env bash
echo "Hello from run.sh. My shell level is $SHLVL"
./test1.sh
Execute:
chmod +x test1.sh && chmod +x run.sh
./run.sh
Output:
Hello from run.sh. My shell level is 4
Hello from test1.sh. My shell level is 5
Hello from test2.sh. My shell level is 5
Section 36.33: $UID
A read-only variable that stores the user's ID number:
~> $ echo $UID
12345
Chapter 37: Job Control
Section 37.1: List background processes
$ jobs
[1]   Running
[2]-  Running
[3]+  Running
The first field shows the job ids. The + and - signs that follow the job ids denote, respectively, the default job and the job that becomes the default when the current default job ends. The default job is used when the fg or bg commands are used without any argument.
The second field gives the status of the job. The third field is the command used to start the process.
The last field (wd: ~) says that the sleep commands were started from the working
directory ~ (Home).
%2 specifies job no. 2. If fg is used without any arguments, it brings the last process put in the background to the foreground.
$ fg %?sle
sleep 500
?sle refers to the background process command containing "sle". If multiple background commands contain the string, it will produce an error.
$ sleep 600 &
[1] 7582
Puts the sleep command in the background. 7582 is the process id of the background process.
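The PID captured in $! is commonly combined with wait to block until the background job finishes and to collect its exit status; a sketch:

```shell
#!/bin/bash
sleep 1 &          # start a background job
pid=$!             # remember its process id
wait "$pid"        # block until that job finishes
echo "job $pid finished with status $?"
```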
Chapter 38: Case statement
Section 38.1: Simple case statement
In its simplest form, supported by all versions of bash, the case statement executes the case that matches the pattern. The ;; operator breaks after the first match, if any.
#!/bin/bash
var=1
case $var in
1)
echo "Antartica"
;;
2)
echo "Brazil"
;;
3)
echo "Cat"
;;
esac
Outputs:
Antartica
Since bash 4.0, a new operator ;& was introduced which provides a fall-through mechanism.
#!/bin/bash
var=1
case $var in
1)
echo "Antartica"
;&
2)
echo "Brazil"
;&
3)
echo "Cat"
;&
esac
Outputs:
Antartica
Brazil
Cat
Since Bash 4.0, another operator ;;& was introduced; it also provides fall-through, but only if the patterns of the subsequent case statement(s), if any, match.
#!/bin/bash
var=abc
case $var in
a*)
echo "Antartica"
;;&
xyz)
echo "Brazil"
;;&
*b*)
echo "Cat"
;;&
esac
Outputs:
Antartica
Cat
In the above example, abc matches both the first and third cases but not the second case, so the second case is not executed.
Chapter 39: Read a file (data stream, variable) line-by-line (and/or field-by-field)?
Parameter
Details
IFS
Internal field separator
file
A file name/path
-r
Prevent backslash from acting as an escape character (read the input raw)
-t
Remove the trailing newline from each line read
-d DELIM
Continue until the first character of DELIM is read (with read), rather than newline
Or with a pipe:
ping google.com | while IFS= read -d : -r field || [ -n "$field" ];do
echo "**$field**"
done
Or with a loop:
arr=()
while IFS= read -r line; do
arr+=("$line")
done <file
Section 39.4: Read lines of a string into an array
var='line 1
line 2
line3'
readarray -t arr <<< "$var"
or with a loop:
arr=()
while IFS= read -r line; do
arr+=("$line")
done <<< "$var"
or
readarray -t arr <<< "$var"
for i in "${arr[@]}";do
echo "-$i-"
done
or with a pipe:
ping google.com |
while IFS= read -r line;do
echo "**$line**"
done
For a content:
first : se
con
d:
Thi rd:
Fourth
Output:
-line 1-
-line 2-
-line3-
echo "${arr[4]}"
Output:
newline
In the Unix password file, user information is stored line by line, each line consisting of information for a user separated by the colon (:) character. In this example, while reading the file line by line, the line is also split into fields using the colon character as delimiter, which is indicated by the value given for IFS.
Sample input
mysql:x:27:27:MySQL Server:/var/lib/mysql:/bin/bash
pulse:x:497:495:PulseAudio System Daemon:/var/run/pulse:/sbin/nologin
sshd:x:74:74:Privilege-separated SSH:/var/empty/sshd:/sbin/nologin
tomcat:x:91:91:Apache Tomcat:/usr/share/tomcat6:/sbin/nologin
webalizer:x:67:67:Webalizer:/var/www/usage:/sbin/nologin
Sample Output
mysql, 27, MySQL Server /var/lib/mysql
pulse, 497, PulseAudio System Daemon /var/run/pulse
sshd, 74, Privilege-separated SSH /var/empty/sshd
tomcat, 91, Apache Tomcat /usr/share/tomcat6
webalizer, 67, Webalizer /var/www/usage
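A loop of the following shape would produce such output; the field names are illustrative, chosen to match the layout of /etc/passwd:

```shell
#!/bin/bash
# split each line of the password file on ":" into named fields
while IFS=: read -r user pass uid gid comment home shell; do
    echo "$user, $uid, $comment $home"
done < /etc/passwd
```

Setting IFS only for the read command keeps the rest of the script's word splitting unchanged.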
To read line by line and have the entire line assigned to a variable, the following is a modified version of the example. Note that we have only one variable by the name line mentioned here.
#!/bin/bash
FILENAME="/etc/passwd"
while IFS= read -r line
do
echo "$line"
done < $FILENAME
Sample Input
mysql:x:27:27:MySQL Server:/var/lib/mysql:/bin/bash
pulse:x:497:495:PulseAudio System Daemon:/var/run/pulse:/sbin/nologin
sshd:x:74:74:Privilege-separated SSH:/var/empty/sshd:/sbin/nologin
tomcat:x:91:91:Apache Tomcat:/usr/share/tomcat6:/sbin/nologin
webalizer:x:67:67:Webalizer:/var/www/usage:/sbin/nologin
Sample Output
mysql:x:27:27:MySQL Server:/var/lib/mysql:/bin/bash
pulse:x:497:495:PulseAudio System Daemon:/var/run/pulse:/sbin/nologin
sshd:x:74:74:Privilege-separated SSH:/var/empty/sshd:/sbin/nologin
tomcat:x:91:91:Apache Tomcat:/usr/share/tomcat6:/sbin/nologin
webalizer:x:67:67:Webalizer:/var/www/usage:/sbin/nologin
Chapter 40: File execution sequence
.bash_profile, .bash_login, .bashrc, and .profile all do pretty much the same thing: set up and define functions, variables, and the like. (.profile is also read by other similar shells, which bash is based off.) Bash will fall back to .profile if .bash_profile isn't found.
.bash_login is a fallback for .bash_profile, if it isn't found. It is generally best to use .bash_profile or .profile instead.
Chapter 41: Splitting Files
Sometimes it's useful to split a file into multiple separate files. If you have large files, it might be a good idea to break them into smaller chunks:
split file
This will create files named xaa, xab, xac, etc., each containing up to 1000 lines. As you can see, all of them are prefixed with the letter x by default. If the initial file was less than 1000 lines, only one such file would be created.
To change the prefix, add your desired prefix to the end of the command line
split file customprefix
Or, to change the number of lines per output file, use the -l or --lines option:
split --lines=5000 file
Alternatively, you can specify a maximum number of bytes instead of lines. This is
done by using the -b or --bytes
options. For example, to allow a maximum of 1MB
split --bytes=1MB file
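Since split names its pieces in sorted order, they can be reassembled with cat; a small sketch (the scratch directory and sample file are illustrative):

```shell
#!/bin/bash
cd "$(mktemp -d)" || exit 1    # work in a scratch directory
seq 1 2500 > file              # sample file with 2500 lines
split --lines=1000 file        # creates xaa (1000), xab (1000), xac (500 lines)
cat x?? > rebuilt              # the glob expands in sorted order, restoring the file
cmp -s file rebuilt && echo "identical"
```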
Chapter 42: File Transfer using scp
Section 42.1: scp transferring file
To transfer a file securely to another machine - type:
scp file1.txt tom@server2:$HOME
This example presents transferring file1.txt from our host to server2's user tom's
home directory.
The below example transfers all files from the my_folder directory with extension .txt to server2. All files will be transferred to user tom's home directory.
scp /my_folder/*.txt tom@server2:$HOME
This example shows how to download the file named file.txt from user tom's home directory to our local machine's current directory:
scp tom@server2:file.txt .
Chapter 43: Pipelines
Section 43.1: Using |&
|& connects the standard output and the standard error of the first command to the second one, while | only connects standard output.
Output is:
> Host: www.google.com
Google
But with | a lot more information will be printed, i.e. those that are sent to
stderr because only stdout is piped to
the next command. In this example all lines except the last line (Google) were sent
to stderr by curl:
* Hostname was NOT found in DNS cache
*
Trying 172.217.20.228...
* Connected to www.google.com (172.217.20.228) port 80 (#0)
> GET / HTTP/1.1
> User-Agent: curl/7.35.0
> Host: www.google.com
> Accept: */*
>
* HTTP 1.0, assume close after body
< HTTP/1.0 200 OK
< Date: Sun, 24 Jul 2016 19:04:59 GMT
< Expires: -1
< Cache-Control: private, max-age=0
< Content-Type: text/html; charset=ISO-8859-1
< P3P: CP="This is not a P3P policy! See
https://www.google.com/support/accounts/answer/151657?hl=en for more info."
< Server: gws
< X-XSS-Protection: 1; mode=block
< X-Frame-Options: SAMEORIGIN
< Set-Cookie:
NID=82=jX0yZLPPUE7u13kKNevUCDg8yG9Ze_C03o0IMEopOSKL0mMITEagIE816G55L2wrTlQwgXkhq4Ap
FvvYEoaWFoEoq2T0sBTuQVdsIFULj9b2O8X35O0sAgUnc3a3JnTRBqelMcuS9QkQA; expires=Mon, 23-
Jan-2017 19:04:59 GMT;
path=/; domain=.google.com; HttpOnly
< Accept-Ranges: none
< Vary: Accept-Encoding
< X-Cache: MISS from jetsib_appliance
< X-Loop-Control: 5.202.190.157 81E4F9836653D5812995BA53992F8065
< Connection: close
<
{ [data not shown]
* Closing connection 0
Google
Section 43.2: Show all processes paginated
ps -e | less
ps -e shows all the processes; its output is connected to the input of less via |, and less paginates the results.
The pipe (|) connects the stdout of ping to the stdin of grep, which processes it
immediately. Some other
commands like sed default to buffering their stdin, which means that it has to
receive enough data, before it will
print anything, potentially causing delays in further processing.
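Where that buffering gets in the way, GNU coreutils provides stdbuf to force line-buffered output; a sketch (the sed expression is illustrative):

```shell
# force sed to flush each output line immediately instead of block-buffering,
# so downstream commands in the pipeline see lines as they are produced
printf 'a\nb\n' | stdbuf -oL sed 's/^/line: /'
```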
Chapter 44: Managing PATH environment
variable
Parameter
Details
PATH
Path environment variable
But this will modify the PATH only in the current shell (and its subshell). Once
you exit the shell, this modification
will be gone.
To make it permanent, we need to add that bit of code to the ~/.bashrc (or
whatever) file and reload the file.
If you run the following code (in terminal), it will add ~/bin to the PATH
permanently:
echo 'PATH=~/bin:$PATH' >> ~/.bashrc && source ~/.bashrc
Explanation:
echo 'PATH=~/bin:$PATH' >> ~/.bashrc appends the line PATH=~/bin:$PATH to the end of the ~/.bashrc file, and source ~/.bashrc reloads it in the current shell.
This is a bit of code (run in terminal) that will check if a path is already
included and add the path only if not:
path=~/bin
# path to be included
bashrc=~/.bashrc
# bash file to be written and reloaded
# run the following code unmodified
echo $PATH | grep -q "\(^\|:\)$path\(:\|/\{0,1\}$\)" || echo "PATH=\$PATH:$path" >>
"$bashrc";
source "$bashrc"
To make it permanent, you will need to add it at the end of your bash configuration
file.
Chapter 45: Word splitting
Parameter
IFS
Internal field separator
-x
Details
Print commands and their arguments as they are executed (Shell option)
In the above example this is how the fun function is being executed:
fun I am a multiline string
The test command, for which I am a string with spaces is not a single argument; rather, it's 6 arguments!
The grep command returns a multiline string with spaces, so you can just imagine how many arguments there are... :D
See what, when and why for the basics.
This will fill up arr with all numeric values found in file
Looping through space separated words:
words='foo bar baz'
for w in $words;do
echo "W: $w"
done
Output:
W: foo
W: bar
W: baz
or
packs='
apache2
php
php-mbstring
php-mysql
'
sudo apt-get install $packs
This will install the packages. If you double-quote $packs, it will throw an error.
Unquoted, $packs sends all the space-separated package names as arguments to apt-get, while quoting it sends the $packs string as a single argument; apt-get will then try to install a package named apache2 php php-mbstring php-mysql (for the first one), which obviously doesn't exist.
GoalKicker.com – Bash Notes for Professionals
See what, when and why for the basics.
It'll split the value of the variable sentence and show it line by line
respectively.
$var is split into 4 args. By default, IFS contains the whitespace characters, and thus word splitting occurred at the spaces.
$ var="This/is/an/example"
$ showarg $var
1 args: <This/is/an/example>
In the above, word splitting didn't occur because no IFS characters were found.
Now let's set IFS=/
$ IFS=/
$ var="This/is/an/example"
$ showarg $var
4 args: <This> <is> <an> <example>
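showarg is not a standard command; a minimal definition one might use for these experiments:

```shell
#!/bin/bash
# print the number of arguments received, then each argument in angle brackets
showarg() {
    printf '%d args:' "$#"
    printf ' <%s>' "$@"
    echo
}
showarg This is an example    # 4 args: <This> <is> <an> <example>
```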
var='I am
a
multiline string'
IFS=' '
fun() {
echo "-$1-"
echo "*$2*"
echo ".$3."
}
fun $var
This time word splitting will only work on spaces. The fun function will be
executed like this:
fun I 'am
a
multiline' string
Quoting the variable ("$var") will prevent word splitting in all the cases discussed above, i.e. the fun function will be executed with only one argument.
Chapter 46: Avoiding date using printf
In Bash 4.2, a shell built-in time conversion for printf was introduced: the format
specification %(datefmt)T makes
printf output the date-time string corresponding to the format string datefmt as
understood by strftime.
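A small sketch (Bash 4.2 or later); the argument -1 denotes the current time:

```shell
#!/bin/bash
printf '%(%Y-%m-%d %H:%M:%S)T\n' -1    # current date and time, no external date call
printf '%(%A)T\n' -1                   # weekday name of the current time
```

This avoids forking an external date process, which matters in tight loops.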
Chapter 47: Using "trap" to react to
signals and system events
Parameter
Meaning
-p
List currently installed traps
-l
And a variant which still allows you to quit the main program by pressing ^C twice
in a second:
last=0
allow_quit() {
[ $(date +%s) -lt $(( $last + 1 )) ] && exit
echo "Press ^C twice in a row to quit"
last=$(date +%s)
}
trap allow_quit INT
Chapter 48: Chain of commands and
operations
There are some means to chain commands together. Simple ones like just a ; or more
complex ones like logical
chains which run depending on some conditions. The third one is piping commands,
which effectively hands over
the output data to the next command in the chain.
In this case the output of the ls command is used as the input of the grep command.
The result will be the number
of files that include ".conf" in their name.
This can be used to construct chains of subsequent commands as long as needed:
ls -1 | grep ".conf" | grep -c .
# if you want to run more commands within a logical chain, use curly braces
# which designate a block of commands
# They do need a ; before the closing bracket so bash can differentiate from other uses
# of curly braces
[ a = b ] && { echo "let me see."
echo "hmmm, yes, i think it is true" ; } \
|| { echo "as i am in the negation i think "
echo "this is false. a is a not b." ; }
# mind the use of line continuation sign \
# only needed to chain yes block with || ....
Section 48.5: chaining commands with |
The | takes the output of the left command and pipes it as input to the right command. Mind that this is done in a subshell. Hence you cannot set values of variables of the calling process within a pipe.
find . -type f -a -iname '*.mp3' | \
while read filename; do
mute --noise "$filename"
done
Chapter 49: Type of Shells
Section 49.1: Start an interactive shell
bash
Chapter 50: Color script output (cross-platform)
Section 50.1: color-output.sh
In the opening section of a bash script, it's possible to define some variables that
function as helpers to color or
otherwise format the terminal output during the run of the script.
Different platforms use different character sequences to express color. However,
there's a utility called tput which
works on all *nix systems and returns platform-specific terminal coloring strings
via a consistent cross-platform API.
For example, to store the character sequence which turns the terminal text red or
green:
red=$(tput setaf 1)
green=$(tput setaf 2)
Or, to store the character sequence which resets the text to default appearance:
reset=$(tput sgr0)
Then, if the BASH script needed to show different colored outputs, this can be
achieved with:
cho "${green}Success!${reset}" echo "${red}Failure.${reset}"
Chapter 51: co-processes
Section 51.1: Hello World
# create the co-process
coproc bash
# send a command to it (echo Hello World)
echo 'echo Hello World' >&"${COPROC[1]}"
# read a line from its output
read line <&"${COPROC[0]}"
# show the line
echo "$line"
Chapter 52: Typing variables
Section 52.1: declare weakly typed variables
declare is an internal command of bash (for internal commands, use help to display the "manpage"). It is used to show and define variables or show function bodies.
Syntax: declare [options] [name[=value]]...
# options are used to define
# an integer
declare -i myInteger
declare -i anotherInt=10
# an array with values
declare -a anArray=( one two three)
# an assoc Array
declare -A assocArray=( [element1]="something" [second]=anotherthing )
# note that bash recognizes the string context within []
# some modifiers exist
# uppercase content
declare -u big='this will be uppercase'
# same for lower case
declare -l small='THIS WILL BE LOWERCASE'
# readonly array
declare -ra constarray=( eternal true and unchangeable )
# export integer to environment
declare -xi importantInt=42
You can also use the + which takes away the given attribute. Mostly useless, just for completeness.
To display variables and/or functions there are some options too
# printing defined vars and functions
declare -f
# restrict output to functions only
declare -F # when debugging, it also prints the line number and filename it was defined in
Chapter 53: Jobs at specific times
Section 53.1: Execute job once at specific time
Note: at is not installed by default on most modern distributions.
To execute a job once at some other time than now, in this example 5pm, you can use
echo "somecommand &" | at 5pm
If you want to catch the output, you can do that in the usual way:
echo "somecommand > out.txt 2>err.txt &" | at 5pm
at understands many time formats, so you can also say
echo "somecommand &" | at now + 2 minutes
echo "somecommand &" | at 17:00
echo "somecommand &" | at 17:00 Jul 7
echo "somecommand &" | at 4pm 12.03.17
If no year or date is given, it assumes the next time the time you specified occurs. So if you give an hour that has already passed today, it will assume tomorrow, and if you give a month that has already passed this year, it will assume next year.
This also works together with nohup like you would expect.
echo "nohup somecommand > out.txt 2>err.txt &" | at 5pm
Chapter 54: Handling the system prompt
Escape
Details
\a
A bell character.
\d
The date, in "Weekday Month Date" format (e.g., "Tue May 26").
\D{FORMAT}
The FORMAT is passed to `strftime'(3) and the result is inserted into the prompt
string; an empty
FORMAT results in a locale-specific time representation. The braces are required.
\e
\h
\H
\j
\l
\n
A newline.
\r
A carriage return.
\s
The name of the shell, the basename of `$0' (the portion following the final slash).
\t
\T
\A
\u
\v
\V
\w
The current working directory, with $HOME abbreviated with a tilde (uses the
$PROMPT_DIRTRIM
variable).
\W
\NNN
The character corresponding to the octal number NNN.
\\
A backslash.
\[
\]
printf "\033[5;1;31mmind the lunch break\033[0m\n";
else
printf "\033[33mstill working...\033[0m\n";
fi;
}
# activating it
export PROMPT_COMMAND=lunchbreak
Section 54.5: Using PS1
PS1 is the normal system prompt indicating that bash waits for commands to be typed in. It understands some escape sequences and can execute functions or programs. As bash has to position the cursor after the displayed prompt, it needs to know how to calculate the effective length of the prompt string. To indicate non-printing sequences of chars within the PS1 variable, escaped brackets are used: \[ a non printing sequence of chars \]. All that is said here holds true for all PS* vars.
(The black caret indicates cursor)
#everything not being an escape sequence will be literally printed
export PS1="literal sequence " # Prompt is now:
literal sequence ▉
# \u == user \h == host \w == actual working directory
# mind the single quotes avoiding interpretation by shell
export PS1='\u@\h:\w > ' # \u == user, \h == host, \w actual working dir
looser@host:/some/path > ▉
# executing some commands within PS1
# following line will set foreground color to red, if user==root,
# else it resets attributes to default
# $( (($EUID == 0)) && tput setaf 1)
# later we do reset attributes to default with
# $( tput sgr0 )
# assuming being root:
PS1="\[$( (($EUID == 0)) && tput setaf 1 )\]\u\[$(tput sgr0)\]@\w:\w \$ "
looser@host:/some/path > ▉ # if not root else <red>root<default>@host....
Chapter 55: The cut command
Parameter
Details
-f, --fields
Field-based selection
-d, --delimiter
Delimiter to use for field-based selection
-c, --characters
Character-based selection
cut cannot be used to parse arguments as the shell and other programs do.
John,Smith
$ cut -d, -f2,2 <<<'John,Smith,USA' ## Just like -f2
Smith
Chapter 56: Bash on Windows 10
Section 56.1: Readme
The simpler way to use Bash in Windows is to install Git for Windows. It's shipped
with Git Bash which is a real Bash.
You can access it with shortcut in :
Start > All Programs > Git > Git Bash
Chapter 57: Cut Command
Option
Description
-b LIST, --bytes=LIST
Print the bytes listed in the LIST parameter
-d DELIMITER
Use DELIMITER instead of the default tab as the field delimiter
In Bash, the cut command is useful for selecting fields (columns) from each line of a file.
The sample file has 3 columns separated by spaces. To select only the first column, do the following.
cut -d ' ' -f1 filename
Here the -d flag, specifies the delimiter, or what separates the records. The -f flag
specifies the field or column
number. This will display the following output
John
Robert
...
Chapter 58: global and local variables
By default, every variable in bash is global to every function, script and even the
outside shell if you are declaring
your variables inside a script.
If you want your variable to be local to a function, you can use local to have that
variable a new variable that is
independent to the global scope and whose value will only be accessible inside that
function.
Will obviously output "hello", but this works the other way around too:
function foo() {
var="hello"
}
foo
echo $var
Will output nothing, as var is a variable local to the function foo, and its value
is not visible from outside of it.
Will output:
inside function, var=sup?
outside function, var=hello
Chapter 59: CGI Scripts
Section 59.1: Request Method: GET
It is quite easy to call a CGI-Script via GET.
First you will need the encoded url of the script.
Then you add a question mark ? followed by variables.
Every variable should have two sections separated by =.
The first section should always be a unique name for each variable,
while the second part contains only the value.
Variables are separated by &.
The total length of the string should not exceed 255 characters.
Names and values need to be URL-encoded (replace characters such as space, /, ?, :, @, &, =, + and $).
Hint:
When using HTML forms, the request is generated by the form itself.
With Ajax you can encode everything via encodeURI and encodeURIComponent.
Example:
http://www.example.com/cgi-bin/script.sh?var1=Hello%20World!&var2=This%20is%20a
%20Test.&
The server should communicate via Cross-Origin Resource Sharing (CORS) only, to
make requests more secure. In this showcase we use the Content-Type header to
determine the data type we want to use.
There are many content types we can choose from; the most common are...
text/html
text/plain
application/json
When sending a request, the server will also create many environment variables. For
now the most important environment variables are $REQUEST_METHOD and $QUERY_STRING.
The request method has to be GET and nothing else!
The query string includes all the URL-encoded data.
The Script
#!/bin/bash
# The HTTP headers come first, so let's send the response headers
echo "Content-type: text/html"
# set the content type we want to use
echo ""
# we don't need more headers; the empty line terminates them.
# The headers are now set in stone and any output from now on will be read as an
# HTML document.
# Therefore we need to produce all stdout in HTML format!
# create the html structure and send it to stdout
echo "<!DOCTYPE html>"
echo "<html><head>"
# The content will be created depending on the Request Method
if [ "$REQUEST_METHOD" = "GET" ]; then
# Note that the environment variables $REQUEST_METHOD and $QUERY_STRING can be
# processed by the shell directly.
# One must filter the input to avoid cross site scripting.
Var1=$(echo "$QUERY_STRING" | sed -n 's/^.*var1=\([^&]*\).*$/\1/p')
# read value of "var1"
Var1_Dec=$(echo -e $(echo "$Var1" | sed 's/+/ /g;s/%\(..\)/\\x\1/g;'))
# html decode
Var2=$(echo "$QUERY_STRING" | sed -n 's/^.*var2=\([^&]*\).*$/\1/p')
Var2_Dec=$(echo -e $(echo "$Var2" | sed 's/+/ /g;s/%\(..\)/\\x\1/g;'))
# create content for stdout
echo "<title>Bash-CGI Example 1</title>"
echo "</head><body>"
echo "<h1>Bash-CGI Example 1</h1>"
echo "<p>QUERY_STRING: ${QUERY_STRING}<br>var1=${Var1_Dec}<br>var2=${Var2_Dec}</p>"
# print the values to stdout
else
echo "<title>456 Wrong Request Method</title>"
echo "</head><body>"
echo "<h1>456</h1>"
echo "<p>Requesting data went wrong.<br>The Request method has to be \"GET\" only!
</p>"
fi
echo "<hr>"
echo "$SERVER_SIGNATURE"
# another environment variable
echo "</body></html>"
# close html
exit 0
QUERY_STRING_POST=$(echo "$QUERY_STRING_POST" | sed -e :a -e 's/<[^>]*>//g;/</N;//ba')
# removes most html tags to prevent XSS within documents
JSON=$(echo "$QUERY_STRING_POST" | jq .)
# json encode - this is a pretty safe way to check for valid json code
;;
*)
response_with_html
exit 0
;;
esac
else
response_with_html
exit 0
fi
# Some Commands ...
response_with_json
exit 0
You will get {"message":"Hello World!"} as an answer when sending JSON data via
POST to this script. Everything else will receive the html document.
Also important is the variable $JSON. This variable is free of XSS, but could still
hold wrong values and needs to be verified first. Please keep that in mind.
This code works similarly without JSON.
You can get any data this way.
You just need to change the Content-Type for your needs.
Example:
if [ "$REQUEST_METHOD" = "POST" ]; then
case "$CONTENT_TYPE" in
application/x-www-form-urlencoded)
read -n "$CONTENT_LENGTH" QUERY_STRING_POST
;;
text/plain)
read -n "$CONTENT_LENGTH" QUERY_STRING_POST
;;
esac
fi
Last but not least, don't forget to respond to all requests; otherwise third-party
programs won't know if they succeeded.
Chapter 60: Select keyword
The select keyword can be used for getting an input argument in a menu format.
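The select_menu.sh script itself is missing from this extract; a minimal version consistent with the output shown below would be:

```shell
#!/bin/bash
# select_menu.sh: select presents the items as a numbered menu
# and reads the user's choice from stdin
select option in linux windows mac; do
    echo "$option"
    break    # without break, the menu would be shown again forever
done
```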
Explanation: Here the select keyword is used to loop through a list of items that will
be presented at the command prompt for the user to pick from. Notice the break
keyword for breaking out of the loop once the user makes a choice; otherwise, the
loop would be endless!
Results: Upon running this script, a menu of these items will be displayed and the
user will be prompted for a selection. Upon selection, the value will be displayed
and control returns to the command prompt.
>bash select_menu.sh
1) linux
2) windows
3) mac
#? 3
mac
>
Chapter 61: When to use eval
First and foremost: know what you're doing! Secondly, while you should avoid using
eval, if its use makes for
cleaner code, go ahead.
This code is often accompanied by getopt or getopts to set $@ to the output of the
aforementioned option parsers,
however, you can also use it to create a simple pop function that can operate on
variables silently and directly
without having to store the result to the original variable:
isnum()
{
# is argument an integer?
local re='^[0-9]+$'
if [[ -n $1 ]]; then
[[ $1 =~ $re ]] && return 0
return 1
else
return 2
fi
}
isvar()
{
if isnum "$1"; then
return 1
fi
local arr="$(eval eval -- echo -n "\$$1")"
if [[ -n ${arr[@]} ]]; then
return 0
fi
return 1
}
pop()
{
if [[ -z $@ ]]; then
return 1
fi
local var=
local isvar=0
local arr=()
if isvar "$1"; then # let's check to see if this is a variable or just a bare array
var="$1"
isvar=1
arr=($(eval eval -- echo -n "\${$1[@]}")) # if it is a var, get its contents
else
arr=($@)
fi
# we need to reverse the contents of $@ so that we can shift
# the last element into nothingness
arr=($(awk <<<"${arr[@]}" '{ for (i=NF; i>1; --i) printf("%s ",$i); print $1; }'))
# set $@ to ${arr[@]} so that we can run shift against it.
eval set -- "${arr[@]}"
shift # remove the last element
# put the array back to its original order
arr=($(awk <<<"$@" '{ for (i=NF; i>1; --i) printf("%s ",$i); print $1; }'))
# echo the contents for the benefit of users and for bare arrays
echo "${arr[@]}"
if ((isvar)); then
# set the contents of the original var to the new modified array
eval -- "$var=(${arr[@]})"
fi
}
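As an aside, on bash 4.3 and newer the same effect can be had without eval by using a nameref; pop_ref and the fruits array below are illustrative names, not part of the original example:

```shell
# a simpler pop that avoids eval entirely (requires bash >= 4.3 namerefs)
pop_ref() {
    local -n _arr=$1                       # nameref to the caller's array
    (( ${#_arr[@]} )) || return 1          # nothing to pop
    _arr=("${_arr[@]:0:${#_arr[@]}-1}")    # drop the last element
    echo "${_arr[@]}"                      # print the remaining elements
}

fruits=(apple banana cherry)
pop_ref fruits    # prints: apple banana
```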
Chapter 62: Networking With Bash
Bash is commonly used in the management and maintenance of servers and clusters.
This chapter collects typical commands used in network operations, notes on when to
use which command for which purpose, and examples of their use.
Some examples:
ifconfig -a
The above command will show all active interfaces of the machine and also give
information about:
1. The IP address assigned to each interface
2. The MAC address of the interface
3. The broadcast address
4. Transmitted and received bytes
ping -c2 8.8.8.8
The above command (ping stands for Packet Internet Groper) tests the connectivity
between two nodes. The -c2 option sends 2 echo requests to the Google DNS server
8.8.8.8 instead of pinging indefinitely.
traceroute
The above command is used in troubleshooting to find out the number of hops taken
to reach the destination.
netstat
The above command (network statistics) gives connection information and the state of each connection.
dig www.google.com
The above command (domain information groper) queries DNS-related information.
nslookup www.google.com
The above command queries the DNS and finds out the IP address corresponding to the
website name.
route
The above command is used to check the network route information. It basically
shows you the routing table.
route add default gw 192.168.1.1 eth0
The above command will add a default route via 192.168.1.1 for the eth0 interface
to the routing table.
route del default
The above command will delete the default route from the routing table
Chapter 63: Parallel
Option            Description
-j n              Run n jobs in parallel
-k                Keep the output in the same order as the input
-X                Pass multiple arguments per command line
--colsep regexp   Split the input on regexp for positional replacement strings
--trc {}.bar      Shorthand for --transfer --return {}.bar --cleanup
--onall           Run all the jobs on all the hosts
--nonall          Run the command on all hosts without reading arguments
--pipe            Split stdin into chunks and pipe a chunk to each job
--recend str      Record-end separator for --pipe
--recstart str    Record-start separator for --pipe
Jobs in GNU Linux can be parallelized using GNU parallel. A job can be a single
command or a small script that has
to be run for each of the lines in the input. The typical input is a list of files,
a list of hosts, a list of users, a list of
URLs, or a list of tables. A job can also be a command that reads from a pipe.
Using GNU Parallel, we can run 3 parallel jobs at once by simply doing
parallel -j 3 "bzcat {} | grep puppies" ::: $( cat filelist.txt ) | gzip > output.gz
This command is simple, concise and more efficient when the number of files and the
file sizes are large. The jobs get initiated by parallel, the option -j 3 launches
3 parallel jobs, and the input to the parallel jobs is taken in by :::. The output
is eventually piped to gzip > output.gz.
Section 63.2: Parallelize STDIN
Now, let's imagine we have 1 large file (e.g. 30 GB) that needs to be converted,
line by line. Say we have a script,
convert.sh, that does this <task>. We can pipe contents of this file to stdin for
parallel to take in and work with in
chunks such as
<stdin> | parallel --pipe --block <block size> -k <task> > output.txt
The above example takes <stdin> from bzcat data.bz2 | nl, where I included nl just
as a proof of concept that the final output output.gz will be saved in the order it
was received. Then, parallel divides the <stdin> into chunks of size 10 MB, and for
each chunk it passes it through nl -n rz, which just prepends line numbers,
right-justified with leading zeros (see nl --help for further details). The option
--pipe tells parallel to split <stdin> into multiple jobs and --block specifies the
size of the blocks. The option -k specifies that ordering must be maintained.
Your final output should look something like
000001
000002
000003
000004
000005
...
000587
000588
000589
000590
000591
1
2
3
4
5
<data>
<data>
<data>
<data>
<data>
552409
552410
552411
552412
552413
<data>
<data>
<data>
<data>
<data>
My original file had 552,413 lines. The first column represents the parallel jobs,
and the second column represents
the original line numbering that was passed to parallel in chunks. You should
notice that the order in the second
column (and rest of the file) is maintained.
Chapter 64: Decoding URL
Section 64.1: Simple example
Encoded URL
https%3A%2F%2Ffanyv88.com%3A443%2Fhttp%2Fwww.foo.com%2Findex.php%3Fid%3Dqwerty
Use this command to decode the URL
echo "https%3A%2F%2Ffanyv88.com%3A443%2Fhttp%2Fwww.foo.com%2Findex.php%3Fid%3Dqwerty" | sed -e "s/%\([0-9A-F]
[0-9AF]\)/\\\\\x\1/g" | xargs -0 echo -e
Chapter 65: Design Patterns
Accomplish some common design patterns in Bash
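The subscribe and publish helpers used in the snippet below are not included in this extract; a minimal sketch of the publish/subscribe pattern (the handler registry and function names are assumptions, simplified to one callback per call) might look like:

```shell
#!/bin/bash
# minimal publish/subscribe sketch: callbacks are stored per event name
declare -A handlers   # event -> space-separated list of callbacks

subscribe() {
    local event=$1 callback=$2
    handlers[$event]+="$callback "
}

publish() {
    local event=$1; shift
    local cb
    for cb in ${handlers[$event]}; do
        "$cb" "$@"    # invoke each registered callback with the publish args
    done
}

action1() { echo "action1 handling: $*"; }
subscribe "/do/work" action1
publish "/do/work" "payload"
# prints: action1 handling: payload
```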
#
subscribe "/do/work"           "action1" "${DIR}"
subscribe "/do/more/work"      "action2" "${DIR}"
subscribe "/do/even/more/work" "action1" "${DIR}"
#
# Execute our events
#
publish "/do/work"
publish "/do/more/work"
publish "/do/even/more/work" "again"
Run:
chmod +x pubsub.sh
./pubsub.sh
Chapter 66: Pitfalls
Section 66.1: Whitespace When Assigning Variables
Whitespace matters when assigning variables.
foo = 'bar'   # incorrect
foo= 'bar'    # incorrect
foo='bar'     # correct
The first two do not do what you intend: the first tries to run a command named foo
with the arguments = and 'bar', and the second runs 'bar' with an empty foo in its
environment. The last example will correctly set the variable $foo to the text "bar".
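What actually happens with each form can be seen in a throwaway shell (the error message text here is illustrative, not bash's own wording):

```shell
# the first form runs a command named 'foo', which usually does not exist
foo = 'bar' 2>/dev/null \
    || echo "no command named 'foo': bash tried to run it with args '=' and 'bar'"

foo='bar'      # correct: no spaces around '='
echo "$foo"    # prints: bar
```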
If cd-ing to this directory fails, Bash will ignore the failure and move on to the
next command, wiping clean the directory from where you ran the script.
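Since this extract omits the original pitfall script, here is a safe demonstration of the same failure mode, confined to a throwaway directory created with mktemp (the file name is made up):

```shell
#!/bin/bash
# demonstrate the pitfall without endangering real files
tmp=$(mktemp -d)
touch "$tmp/important-file"
cd "$tmp" || exit 1
cd /nonexistent/directory 2>/dev/null   # this cd fails...
rm -rf -- *                             # ...but the script keeps going and wipes $tmp
ls -A "$tmp"                            # prints nothing: the file is gone
```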
The best way to deal with this problem is to make use of the set command:
#!/bin/bash
set -e
cd ~/non/existent/directory
rm -rf *
set -e tells Bash to exit the script immediately if any command returns a non-zero
status.
To make sure this works correctly in the above example, add a test so that the loop
continues if the last line is not empty.
$ while read line || [ -n "$line" ] ; do echo "line $line" ; done < file.txt
one
two
three
Appendix A: Keyboard shortcuts
Section A.1: Editing Shortcuts
Shortcut         Description
Ctrl + a         Move to the beginning of the line.
Ctrl + e         Move to the end of the line.
Ctrl + k         Kill the text from the current cursor position to the end of the line.
Ctrl + u         Kill the text from the current cursor position to the beginning of the line.
Ctrl + w         Kill the word behind the cursor.
Alt + b          Move backward one word.
Alt + f          Move forward one word.
Ctrl + Alt + e   Expand the current line (aliases, history and shell expansions).
Ctrl + y         Yank the most recently killed text back into the buffer at the cursor.
Alt + y          Rotate through killed text. You can only do this if the prior command is Ctrl + y or Alt + y.
Killing text will delete text, but save it so that the user can reinsert it by
yanking. Similar to cut and paste except that
the text is placed on a kill ring which allows for storing more than one set of
text to be yanked back on to the
command line.
You can find out more in the emacs manual.
Shortcut      Description
Ctrl + p      Recall the previous command from history.
Ctrl + n      Recall the next command from history.
Ctrl + g      Escape from history-search mode.
Alt + .       Insert the last argument of the previous command.
Alt + n       Non-incremental forward search through history.
!! + Return   Execute the previous command again.
Shortcut       Description
Ctrl + x , (   Start recording a macro.
Ctrl + x , )   Stop recording a macro.
Ctrl + x , e   Execute the last recorded macro.
If you want to execute the line immediately add \C-m ( Enter ) to it:
bind '"\ew"':"\" >/dev/null 2>&1\C-m\""
Shortcut   Description
Ctrl + z   Stop (suspend) the current job.
Credits
Thank you greatly to all the people from Stack Overflow Documentation who helped
provide this content,
more changes can be sent to [email protected] for new content to be published or
updated
Ajay Sangale - Chapter 1
Ajinkya - Chapter 20
Alessandro Mascolo - Chapters 11 and 26
Alexej Magura - Chapters 9, 12, 36 and 61
Amir Rachum - Chapter 8
Anil - Chapter 1
anishsane - Chapter 5
Antoine Bolvy - Chapter 9
Archemar - Chapter 9
Arronical - Chapter 12
Ashari - Chapter 36
Ashkan - Chapters 36 and 43
Batsu - Chapter 17
Benjamin W. - Chapters 1, 9, 12, 15, 24, 31, 36, 46 and 47
binki - Chapter 21
Blachshma - Chapter 1
Bob Bagwill - Chapter 1
Bostjan - Chapter 7
BrunoLM - Chapter 14
Brydon Gibson - Chapter 9
Bubblepop - Chapters 1, 5, 8, 12 and 24
Burkhard - Chapter 1
BurnsBA - Chapter 22
Carpetsmoker - Chapter 47
cb0 - Chapter 28
Chandrahas Aroori - Chapter 6
chaos - Chapter 9
charneykaye - Chapter 50
chepner - Chapters 15, 27 and 46
Chris Rasys - Chapter 34
Christopher Bottoms - Chapters 1, 3 and 5
codeforester - Chapter 12
Cody - Chapter 66
Colin Yang - Chapter 1
Cows quack - Chapter 30
CraftedCart - Chapter 1
CrazyMax - Chapter 64
criw - Chapter 36
Daniel Käfer - Chapter 67
Danny - Chapter 1
Dario - Chapters 28, 36 and 55
David Grayson - Chapter 9
Deepak K M - Chapter 20
deepmax - Chapter 25
depperm - Chapters 4 and 35
dhimanta - Chapter 62
dimo414 - Chapter 14
dingalapadum - Chapters 7 and 16
divyum - Chapters 1 and 14
DocSalvager - Chapter 10
Doctor J - Chapter 28
DonyorM - Chapter 10
Dr Beco - Chapter 36
Dunatotatos - Chapter 51
Echoes_86 - Chapter 17
Edgar Rokjān - Chapter 10
edi9999 - Chapter 14
Eric Renouf - Chapter 9
fedorqui - Chapters 12, 15, 17, 20, 28 and 34
fifaltra - Chapters 8 and 53
Flows - Chapter 18
Gavyn - Chapters 9, 26, 33 and 36
George Vasiliou - Chapters 9, 15 and 58
Gilles - Chapters 21 and 22
glenn jackman - Chapters 1, 4, 5 and 7
Grexis - Chapter 15
Grisha Levit - Chapter 36
gzh - Chapter 10
hedgar2017 - Chapters 9, 15 and 22
Holt Johnson - Chapter 4
I0_ol - Chapter 64
Iain - Chapters 4 and 20
IamaTacos - Chapter 35
Inanc Gumus - Chapter 1
Inian - Chapters 17 and 28
intboolstring - Chapters 4, 5 and 7
Jahid - Chapters 1, 5, 9, 10, 12, 14, 15, 17, 20, 21, 22, 23, 30, 34, 39, 43, 44 and 45
James Taylor - Chapter 23
Jamie Metzger - Chapter 31
jandob - Chapter 29
janos - Chapters 7, 10, 12, 14, 20 and 24
Jeffrey Lin - Chapter 49
JepZ - Chapter 3
jerblack - Chapter 12
Jesse Chen - Chapters 15, 26 and 45
JHS - Chapters 7, 19 and 67
jimsug - Chapter 24
John Kugelman - Chapter 12
Jon - Chapter 63
Jon Ericson - Chapter 9
Jonny Henly - Chapter 4
jordi - Chapter 48
Judd Rogers - Chapters 9 and 67
Kelum Senanayake - Chapter 23
ksoni - Chapter 30
leftaroundabout - Chapter 17
Leo Ufimtsev - Chapter 33
liborm - Chapter 9
lynxlynxlynx - Chapter 43
m02ph3u5 - Chapter 67
markjwill - Chapter 12
Markus V. - Chapter 4
Mateusz Piotrowski - Chapter 12
Matt Clark - Chapters 1, 9, 14, 17, 19 and 23
mattmc - Chapters 36 and 65
Michael Le Barbier Grünewald - Chapter 14
Mike Metzger - Chapter 8
miken32 - Chapters 9 and 10
Misa Lazovic - Chapters 4 and 30
Mohima Chaudhuri - Chapters 18 and 41
nautical - Chapter 34
NeilWang - Chapter 12
Neui - Chapter 8
Ocab19 - Chapter 58
ormaaj - Chapter 12
Osaka - Chapter 4
P.P. - Chapter 38
Pavel Kazhevets - Chapter 25
Peter Uhnak - Chapter 31
phs - Chapter 47
Pooyan Khosravi - Chapter 9
Rafa Moyano - Chapter 42
Reboot - Chapter 42
Riccardo Petraglia - Chapter 8
Richard Hamilton - Chapters 4, 16, 41 and 57
Riker - Chapters 1 and 40
Roman Piták - Chapter 47
Root - Chapters 5, 8 and 9
Sameer Srivastava - Chapter 8
Samik - Chapters 4, 5, 10, 12, 14 and 37
Samuel - Chapter 5
Saqib Rokadia - Chapter 67
satyanarayan rao - Chapter 1
Scroff - Chapter 66
Sergey - Chapter 14
sjsam - Chapters 1 and 32
Sk606 - Chapters 8, 12 and 33
Skynet - Chapter 45
SLePort - Chapters 5 and 10
Stephane Chazelas - Chapters 15 and 36
Stobor - Chapter 20
suleiman - Chapter 59
Sundeep - Chapter 1
Sylvain Bugat - Chapters 2, 4, 9, 14 and 15
Thomas Champion - Chapter 56
Tim Rijavec - Chapter 25
TomOnTime - Chapter 47
Trevor Clarke - Chapter 1
tripleee - Chapters 1, 5, 14, 17 and 36
tversteeg - Chapter 30
uhelp - Chapters 2, 7, 13, 20, 31, 36, 47, 48, 52, 53 and 54
UNagaswamy - Chapters 12, 13 and 60
user1336087 - Chapters 1 and 26
vielmetti - Chapter 5
vmaroli - Chapter 39
Warren Harper - Chapter 9
Wenzhong - Chapter 30
Will - Chapters 12, 15 and 21
Will Barnwell - Chapter 24
William Pursell - Chapters 1, 36 and 49
Wojciech Kazior - Chapter 36
Wolfgang - Chapter 9
xhienne - Chapter 5
ymbirtt - Chapter 15
zarak - Chapters 8, 24 and 31
Zaz - Chapter 1
Мона_Сах - Chapter 28
南山竹 - Chapters 1, 5, 9, 12 and 17