
Shell Scripting Best Practices

A collection of conventions and patterns that make shell scripts more reliable, portable, and maintainable. These aren't opinions - they prevent real bugs.


Start Every Script Right

Every bash script should begin with:

#!/bin/bash
set -euo pipefail

What Each Option Does

set -e (errexit) - exit immediately if any command returns non-zero:

set -e
cp important.txt /backup/     # if this fails, the script stops
rm important.txt               # this won't run if cp failed

Without -e, the script would happily continue after the failed cp and delete the file.
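Sometimes a non-zero exit is expected and should not kill the script. A small sketch, assuming a hypothetical log path: append || true to the one command whose failure you want to tolerate.

```shell
#!/bin/bash
set -e

# grep exits 1 when nothing matches. Appending || true tells set -e that
# this particular failure is expected and the script should continue.
# (/tmp/demo-app.log is a hypothetical path for this sketch.)
count=$(grep -c "ERROR" /tmp/demo-app.log 2>/dev/null || true)
echo "errors found: ${count:-0}"
```

Keep the || true as narrow as possible - attaching it to a whole pipeline silences every failure in it.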

set -u (nounset) - treat references to unset variables as errors:

set -u
rm -rf "$DEPLOY_DIR/app"      # if DEPLOY_DIR is unset, script exits with an error

Without -u, an unset DEPLOY_DIR expands to empty, and you'd run rm -rf /app.

Without set -u, unset variable in rm -rf expands to empty

This is one of the most dangerous scripting bugs. rm -rf "$UNSET_VAR/" becomes rm -rf / when the variable is unset and expands to nothing. The -u option makes bash abort immediately on any reference to an unset variable, turning a catastrophic silent failure into a clear error message.
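A complementary guard is bash's ${var:?message} expansion, which aborts with that message when the variable is unset or empty - unlike set -u, it also catches empty-but-set values. A minimal sketch (DEPLOY_DIR is illustrative):

```shell
#!/bin/bash
set -euo pipefail

# ${var:?message} fails the expansion when var is unset OR empty.
# Demonstrate the failure in a subshell so this script keeps running:
if ! (unset DEPLOY_DIR; : "${DEPLOY_DIR:?DEPLOY_DIR must be set}") 2>/dev/null; then
    echo "guard fired: DEPLOY_DIR was missing"
fi

DEPLOY_DIR="/srv/demo"
echo "cleaning: ${DEPLOY_DIR:?}/app"   # inline :? as a last line of defense before rm -rf
```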

set -o pipefail - a pipeline's exit status becomes that of the rightmost command that failed (zero only when every command succeeds):

set -o pipefail
cat /nonexistent | sort        # pipeline returns non-zero (cat failed)

Without pipefail, only the exit code of the last command (sort) matters, hiding the cat failure.
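When you need to know which command in a pipeline failed, bash's PIPESTATUS array records each command's individual exit code. It is bash-only, and you must capture it immediately, because the next command overwrites it:

```shell
#!/bin/bash
set -o pipefail

# grep exits 1 (no match), so with pipefail the pipeline returns 1 even
# though sort, the last command, succeeded.
printf 'banana\napple\n' | grep 'z' | sort > /dev/null
codes=("${PIPESTATUS[@]}")   # capture right away - any later command resets it
echo "per-command exit codes: ${codes[*]}"   # 0 1 0
```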


Always Quote Your Variables

Unquoted variables undergo word splitting and glob expansion. This causes bugs with filenames containing spaces, wildcards, or empty values.

# BAD
file=$1
rm $file              # if file="my documents", this runs: rm my documents

# GOOD
file="$1"
rm "$file"            # runs: rm "my documents"
# BAD
if [ -n $var ]; then   # if var is empty, this becomes: [ -n ] (always true)

# GOOD
if [ -n "$var" ]; then # correctly tests for non-empty

The rule is simple: always double-quote variable expansions ("$var") unless you specifically want word splitting.

Exceptions where quoting is unnecessary:

- Inside [[ ]] (no word splitting, but quoting doesn't hurt)
- Inside $(( )) arithmetic
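A quick way to see word splitting for yourself - printf prints one bracketed line per argument it receives, so the unquoted expansion visibly becomes two arguments:

```shell
#!/bin/bash
f="my file.txt"

printf '[%s]\n' $f      # unquoted: two arguments -> [my] and [file.txt]
printf '[%s]\n' "$f"    # quoted: one argument   -> [my file.txt]
```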


Use [[ ]] Over [ ] in Bash

[[ ]] is a bash keyword that's safer and more powerful than [ ]:

# [ ] requires careful quoting
[ -n "$var" ]           # works
[ -n $var ]             # BUG if var is empty

# [[ ]] handles it
[[ -n $var ]]           # works even without quotes

# [[ ]] supports pattern matching
[[ $file == *.txt ]]    # glob pattern

# [[ ]] supports regex
[[ $email =~ ^[a-z]+@[a-z]+\.[a-z]+$ ]]

# [[ ]] uses familiar logical operators
[[ $a -gt 0 && $b -gt 0 ]]     # clean
[ "$a" -gt 0 ] && [ "$b" -gt 0 ]  # clunky equivalent with [ ]

If you're writing a bash script (not a POSIX sh script), always use [[ ]].
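After a successful =~ match, the BASH_REMATCH array holds the whole match and each capture group. A short sketch (the email value is illustrative):

```shell
#!/bin/bash
email="alice@example.com"

# keep the regex unquoted (or in a variable); quoting it makes bash
# treat it as a literal string instead of a pattern
if [[ $email =~ ^([a-z]+)@([a-z.]+)$ ]]; then
    echo "user:   ${BASH_REMATCH[1]}"    # alice
    echo "domain: ${BASH_REMATCH[2]}"    # example.com
fi
```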


Prefer $() Over Backticks

Both perform command substitution, but $() is clearer and nests properly:

# BAD - backticks are hard to read and don't nest
result=`echo \`date\``

# GOOD - $() nests cleanly
result=$(echo "$(date)")

Backticks also have surprising escaping rules. $() behaves predictably.


Use mktemp for Temporary Files

Never hardcode temporary file paths. Multiple script instances would collide, and predictable paths are a security risk.

# BAD
tmpfile="/tmp/mydata.tmp"

# GOOD
tmpfile=$(mktemp)
tmpdir=$(mktemp -d)

# Always clean up
trap 'rm -f "$tmpfile"' EXIT

mktemp creates a unique filename that doesn't already exist. Combine with trap to ensure cleanup.
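Putting the two together, a minimal sketch of a script-scoped scratch directory with guaranteed cleanup:

```shell
#!/bin/bash
set -euo pipefail

workdir=$(mktemp -d)
# single quotes matter: the trap command is evaluated when the trap
# fires, not when it is set
trap 'rm -rf "$workdir"' EXIT

echo "processing" > "$workdir/scratch.txt"
wc -l < "$workdir/scratch.txt"
# on exit - normal or via a set -e failure - the trap removes $workdir
```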


Use shellcheck

shellcheck is a static analysis tool that catches common mistakes in shell scripts:

shellcheck myscript.sh

It finds issues like:

- Unquoted variables
- Useless cat usage
- Incorrect test syntax
- Word splitting bugs
- Unreachable code

Install it:

sudo apt install shellcheck       # Debian/Ubuntu
sudo dnf install ShellCheck       # Fedora
brew install shellcheck           # macOS

Run it in CI/CD to catch shell script bugs before they reach production.

Use shellcheck in CI/CD pipelines

Add shellcheck *.sh to your CI pipeline to catch bugs automatically on every commit. ShellCheck's exit code is non-zero when it finds issues, making it easy to integrate. Use # shellcheck disable=SC2086 comments to suppress specific warnings when you know what you're doing.

Here's what shellcheck output looks like on a buggy script:

$ cat buggy.sh
#!/bin/bash
files=$(ls *.txt)
for f in $files; do
    if [ $f == "important" ]; then
        rm $f
    fi
done

$ shellcheck buggy.sh
In buggy.sh line 2:
files=$(ls *.txt)
        ^------^ SC2012: Use find instead of ls to better handle non-alphanumeric filenames.

In buggy.sh line 3:
for f in $files; do
         ^----^ SC2086: Double quote to prevent globbing and word splitting.

In buggy.sh line 4:
    if [ $f == "important" ]; then
         ^-- SC2086: Double quote to prevent globbing and word splitting.

Each finding includes a code (like SC2086) that you can look up at the shellcheck wiki for a detailed explanation and fix. Editor integration makes this even more useful - the VS Code ShellCheck extension and vim plugins like ALE or Syntastic show warnings inline as you type, catching bugs before you even save the file.


Don't Parse ls Output

The output of ls is meant for humans, not scripts. Filenames with spaces, newlines, or special characters will break parsing.

# BAD - breaks on filenames with spaces
for file in $(ls); do
    echo "$file"
done

# GOOD - globbing handles filenames correctly
for file in *; do
    echo "$file"
done

# GOOD - find with -print0 for truly safe handling
find . -type f -print0 | while IFS= read -r -d '' file; do
    echo "$file"
done

Globbing handles spaces correctly where ls parsing does not

Shell globbing (for f in *.txt) expands filenames as properly quoted tokens, so my file.txt stays as one argument. Parsing ls output (for f in $(ls)) splits on whitespace, turning my file.txt into two separate arguments: my and file.txt. Globbing is both safer and simpler.
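One caveat with globbing: when nothing matches, bash leaves the pattern in place as a literal string. The nullglob shell option makes an unmatched glob expand to nothing instead:

```shell
#!/bin/bash
shopt -s nullglob

# with nullglob, an unmatched pattern expands to zero words, so this
# loop runs zero times instead of once with the literal "*.txt"
count=0
for f in /no/such/dir/*.txt; do
    count=$((count + 1))
done
echo "matched: $count"   # matched: 0
```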


Use Arrays for Lists

Don't store lists of items in a string variable. Use arrays.

# BAD - breaks on filenames with spaces
files="file one.txt file two.txt"
for f in $files; do
    echo "$f"        # prints: file, one.txt, file, two.txt
done

# GOOD - arrays preserve elements
files=("file one.txt" "file two.txt")
for f in "${files[@]}"; do
    echo "$f"        # prints: file one.txt, file two.txt
done

Array operations:

arr=("one" "two" "three")
echo "${arr[0]}"              # first element
echo "${arr[@]}"              # all elements
echo "${#arr[@]}"             # number of elements
arr+=("four")                 # append

# Build arrays from command output
mapfile -t lines < file.txt   # read file into array (one line per element)

Use mapfile to read files into arrays

mapfile -t lines < file.txt (also called readarray) reads an entire file into an array with one element per line. The -t flag strips trailing newlines. This is safer than lines=($(cat file)) which breaks on spaces and globbing characters.
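mapfile also pairs well with process substitution, so command output lands in an array without a pipeline subshell discarding the variable:

```shell
#!/bin/bash
# A pipe (cmd | mapfile -t arr) would run mapfile in a subshell and the
# array would vanish; process substitution < <(cmd) keeps it in the
# current shell. The filenames here are illustrative.
mapfile -t files < <(printf '%s\n' "report one.txt" "report two.txt")

echo "count: ${#files[@]}"       # count: 2
printf '-> %s\n' "${files[@]}"   # one line per element, spaces intact
```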


Portable vs Bash-Specific

If your script needs to run on minimal systems (Docker containers, embedded systems, Debian's dash), avoid bash-specific features:

Bash-Specific        POSIX Alternative
[[ ]]                [ ]
$(( ))               $(( )) (this one is POSIX)
{1..10}              seq 1 10
function name {      name() {
source file          . file
$RANDOM              No POSIX equivalent
Arrays               No POSIX equivalent
<<< here strings     echo "str" | cmd

If you need bash features, make sure your shebang is #!/bin/bash, not #!/bin/sh. On some systems, /bin/sh is dash, which doesn't support bash extensions.

#!/bin/sh may not be bash on all systems

On Debian, Ubuntu, and Alpine, /bin/sh is dash (or ash), not bash. If your script uses [[ ]], arrays, $RANDOM, process substitution, or any bash-specific feature, it will silently break or produce wrong results under dash. Always use #!/bin/bash for bash scripts.

When to target POSIX sh: Docker scratch images and minimal containers often only have /bin/sh (usually dash or busybox ash). Cron jobs on minimal systems may run under sh by default. CI/CD pipeline scripts that need to run across different environments (Alpine Linux, Ubuntu, macOS) are safer with POSIX sh.

When bash is fine: application scripts, developer tools, interactive helpers, and anything where you control the execution environment. If you're writing a deployment script that only runs on your Ubuntu servers, use bash and enjoy its features - forcing POSIX compatibility on a known-bash environment just makes the code harder to read.
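If a script genuinely requires bash, it can fail fast with a clear message when someone runs it with sh. BASH_VERSION is only set inside bash, so a guard like this catches "sh script.sh" invocations:

```shell
#!/bin/bash
# dash and other POSIX shells leave BASH_VERSION unset, so this guard
# near the top of a bash-only script catches wrong-shell invocations.
if [ -z "${BASH_VERSION:-}" ]; then
    echo "This script requires bash, not sh" >&2
    exit 1
fi
echo "running under bash ${BASH_VERSION%%(*}"
```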


Avoid Common Pitfalls

Don't Use eval

eval is almost always a security risk

eval interprets its arguments as shell code, meaning any user-controlled input becomes executable commands. An attacker passing ; rm -rf / as input to eval "process $input" gets arbitrary command execution. Use indirect expansion (${!var}), associative arrays, or declare -n nameref variables instead.

eval executes a string as a command. It's almost always a security risk and there's usually a better way.

# BAD - eval is dangerous
eval "rm $user_input"

# GOOD - pass arguments directly
rm "$filename"

If you find yourself reaching for eval to dynamically construct variable names, bash has safer alternatives. Indirect expansion with ${!var} lets you dereference a variable name stored in another variable:

key='HOME'
echo "${!key}"    # prints the value of $HOME

Associative arrays (bash 4+) are even better for key-value lookups:

declare -A config
config[host]='localhost'
config[port]='5432'
echo "${config[host]}"

Both approaches avoid the security risk of eval interpreting arbitrary strings as code.
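The nameref variant mentioned above, declare -n (bash 4.3+), makes a variable an alias for another name - handy when the target name is computed at runtime. A sketch with illustrative host values:

```shell
#!/bin/bash
prod_host="db.prod.internal"        # illustrative config values
staging_host="db.staging.internal"

env_name="prod"
# host_ref now refers to whatever variable name it was given;
# reads and writes go through to ${env_name}_host
declare -n host_ref="${env_name}_host"
echo "connecting to: $host_ref"     # connecting to: db.prod.internal
```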

Handle Missing Commands

Check that required tools exist before using them:

for cmd in jq curl git; do
    if ! command -v "$cmd" &>/dev/null; then
        echo "Required command not found: $cmd" >&2
        exit 1
    fi
done

Use readonly for Constants

readonly CONFIG_DIR="/etc/myapp"
readonly LOG_FILE="/var/log/myapp.log"
readonly MAX_RETRIES=3

readonly prevents accidental reassignment. Use it for values that should never change.

Use readonly for configuration constants

Declare values that should never change with readonly: paths, URLs, retry counts, and other configuration. If any code accidentally tries to reassign a readonly variable, bash raises an error immediately rather than silently using the wrong value.
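The enforcement is immediate and loud. A small demonstration, using a subshell so the script itself survives the rejected assignment:

```shell
#!/bin/bash
readonly MAX_RETRIES=3

# the reassignment fails with "readonly variable" and a non-zero status;
# running it in a subshell lets us observe the failure without exiting
if ! (MAX_RETRIES=5) 2>/dev/null; then
    echo "reassignment blocked, MAX_RETRIES is still $MAX_RETRIES"
fi
```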

Use Epoch Timestamps

For log files and backups, use epoch timestamps or ISO dates to avoid collisions and ensure sorting:

# Epoch seconds
logfile="deploy_$(date +%s).log"

# ISO format (human-readable and sortable)
logfile="deploy_$(date +%Y%m%d_%H%M%S).log"

Always use --dry-run when available before destructive operations

Many commands offer a --dry-run (or -n) flag that shows what would happen without actually doing it. Use it before rsync --delete, rm -rf, apt autoremove, git clean, and similar destructive operations. The few seconds spent previewing can prevent hours of recovery work.

Script Template

(Figure: ideal script structure - shebang, readonly constants, utility functions, cleanup trap, usage, main function, argument parsing, business logic, and the main call at the bottom.)

A starting point for new scripts:

#!/bin/bash
set -euo pipefail

# BASH_SOURCE[0] is the path to this script, even when sourced from another script.
# $0 would give the caller's name instead. cd + pwd resolves symlinks and relative paths.
readonly SCRIPT_DIR="$(cd "$(dirname "${BASH_SOURCE[0]}")" && pwd)"
readonly SCRIPT_NAME="$(basename "$0")"

usage() {
    cat <<EOF
Usage: $SCRIPT_NAME [options] <argument>

Description of what this script does.

Options:
    -h, --help    Show this help message
    -v, --verbose Enable verbose output
EOF
    exit 1
}

log() {
    echo "[$(date '+%Y-%m-%d %H:%M:%S')] $*" >&2
}

# main() is defined here but called at the bottom of the file. This pattern lets you
# define helper functions anywhere in the file without worrying about order - bash reads
# function definitions before executing main(). It also makes the entry point obvious.
main() {
    local verbose=false

    while [[ $# -gt 0 ]]; do
        case "$1" in
            -h|--help) usage ;;
            -v|--verbose) verbose=true; shift ;;
            -*) echo "Unknown option: $1" >&2; usage ;;
            *) break ;;
        esac
    done

    [[ $# -lt 1 ]] && usage

    log "Starting $SCRIPT_NAME"
    # ... your logic here ...
    log "Done"
}

# "$@" passes all command-line arguments to main, preserving quoting.
main "$@"

