Shell scripts are full of tricks and hidden patterns, so it's helpful to have a few on hand. For inspiration, consider these patterns from real projects, including Homebrew, BashBlog, and nvm. By studying these examples, you can improve your own shell scripts and pick up new techniques.
Find the default configuration location
To keep your configuration files tidy
Use this example when you want to set a configuration directory for your project:
"${XDG_CONFIG_HOME:-$HOME/.config}/myproject/config"
This value will expand to the folder that your project can use to store per-user configuration files.
A very common pattern, this uses shell parameter expansion, one of the many types of expansion supported by Bash. It also follows the XDG Base Directory Specification, which helps keep your directory layout clean and standard.
The :- syntax ensures that this part of the value is either $XDG_CONFIG_HOME – if it is set and not empty – or $HOME/.config otherwise. The XDG specification recommends falling back to $HOME/.config when XDG_CONFIG_HOME is not set, and this pattern respects that.
Todo.txt, a shell script that manages a to-do list file, uses this pattern to specify one of the locations where it looks for its configuration file.
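To see the expansion in action, here is a minimal sketch; "myproject" is a placeholder name, and the override path is invented for illustration:

```shell
# Resolve the per-user config file location per the XDG spec.
config_file="${XDG_CONFIG_HOME:-$HOME/.config}/myproject/config"
echo "Default location: $config_file"

# The same expansion honors an explicit XDG_CONFIG_HOME override:
XDG_CONFIG_HOME=/tmp/custom-config
config_file="${XDG_CONFIG_HOME:-$HOME/.config}/myproject/config"
echo "With override: $config_file"
```

Either way, the script never has to special-case whether the user customized their config location.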
Find an executable program
Make sure your script’s dependencies are handled appropriately
To find a binary from a set of possible alternatives, use a pattern like this:
[[ -f Markdown.pl ]] && markdown_bin=./Markdown.pl \
    || markdown_bin=$(which Markdown.pl 2>/dev/null \
    || which markdown 2>/dev/null)
This pattern combines boolean logic, short-circuiting, and a test operator: it checks a condition (the presence of the Markdown.pl file) and sets the variable using either the local file or an alternative. The alternative uses the which command to find one of two possible fallbacks in the current user's PATH.
The resulting variable, markdown_bin, will contain either the path to a suitable executable or an empty string (which you can test for with -n). BashBlog, a simple blogging system written in a single Bash script, uses this pattern to provide optional Markdown support via an external program.
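Here is the same lookup, sketched in a throwaway directory so the "local file" branch fires deterministically; the file and command names mirror the snippet above, so adapt them to your own dependency:

```shell
# Work in a temporary directory and pretend the project bundles Markdown.pl.
workdir=$(mktemp -d)
cd "$workdir"
touch Markdown.pl

# Prefer the bundled script; otherwise fall back to whatever is on the PATH.
[ -f ./Markdown.pl ] && markdown_bin=./Markdown.pl \
    || markdown_bin=$(which Markdown.pl 2>/dev/null \
    || which markdown 2>/dev/null)

echo "Using Markdown processor: $markdown_bin"

# An empty result means no processor was found anywhere.
[ -n "$markdown_bin" ] && echo "Markdown support enabled"
```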
Create a random file name
When you need to write to a new file but don’t care about its name, try this example:
while [[ -f $out ]]; do out=${out%.html}.$RANDOM.html; done
This while loop uses the -f test operator again, this time together with the special RANDOM variable. The clever bit is the loop condition, which ensures the chosen file does not already exist; it's a brute-force solution: keep generating random filenames until one of them is unused.
$RANDOM is not the best approach if you need real randomness, but it's fine for this kind of use. In this particular case, the final file name will be something like mypost.6592.html or mypost.26005.html.
BashBlog uses this common pattern to send the generated output to a temporary file.
Require a variable
Defensive programming is good practice
Use this example to ensure that a variable is defined before doing anything else:
do_stuff() {
    [[ -z $global_variable ]] && return
}
The combination of -z to check for an empty string and the short-circuit && to return early is quite flexible and applies to a wide range of situations. In this case, it checks a global variable and terminates the function early if the variable is empty. This is good practice when a function depends on a value and can't take any reasonable action other than failing gracefully.
You can also check positional parameters this way, requiring them and returning early if they are missing:
[[ -z $1 ]] && return
Again, BashBlog uses this pattern widely. Homebrew also uses it in the brew script to check common variables like BASH_VERSION, PWD, and HOME.
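A small sketch of the guard in context; do_stuff and global_variable are the placeholder names from above, and returning 1 on the empty case is an assumption so callers can detect the early exit:

```shell
# Bail out early when a required global is unset or empty.
do_stuff() {
    [ -z "$global_variable" ] && return 1
    echo "working with: $global_variable"
}

global_variable=""
do_stuff || echo "nothing to do"    # the guard fires, so this prints

global_variable="some value"
do_stuff                            # prints: working with: some value
```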
Assign parameters to local variables
For anyone who has to maintain your code, including you
Using clear, logical names for variables makes your code easier to read and less risky to edit. Here is a simple example that helps a lot:
foo() {
    local username=$1
    local name=$2
    ...
}
Unlike most programming languages, Bash does not support named parameters for scripts or functions. Instead, there are positional parameters: $1 for the first argument, $2 for the second, and so on. But within a script or function, especially a longer one, $1 and $2 soon become awkward to work with.
This pattern solves that problem by assigning positional parameters to local variables with readable names at the first opportunity.
Reassigning parameters also helps prevent problems when they are processed in a loop or modified with shift. If your function takes a username in $1, reassigning it keeps the value available later, no matter what happens to $1. It also helps if you need to reorder your function's parameters; then you only change the initial assignments at the top of the function, not every $1 usage inside it.
The nvm script uses this pattern, and it is widely used in many other shell scripts.
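A complete sketch with illustrative names (create_user and its parameters are invented for the example):

```shell
# Assign positional parameters to named local variables up front.
create_user() {
    local username=$1
    local full_name=$2

    # From here on, the readable names are used instead of $1/$2,
    # so later reorderings only touch the two lines above.
    echo "Creating account '$username' for $full_name"
}

create_user jdoe "Jane Doe"    # prints: Creating account 'jdoe' for Jane Doe
```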
Redirect output from multiple commands to a file
This pattern can eliminate a lot of redundancy
You probably already know how to redirect a command's output to a file, but doing it for every command can still be awkward, especially with lots of commands. Thankfully, there is a shortcut, and this pattern takes full advantage of it:
{
command1
command2
} > filename
This example will run command1 and then command2, directing the output of each to a file named filename. The more commands you have to run, the more you gain by not having to repeat the filename and redirection operator every time, and it’s very easy to use a different filename if needed.
It’s important to note that there is a very similar grouping syntax, using parentheses instead of curly braces:
(
command1
command2
) > filename
The difference is that these commands now run in a subshell, meaning that, for example, any variable assignments are not available outside of grouping.
There are additional syntax nuances, but you shouldn’t encounter them if you use this multiline format.
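Both forms can be seen side by side in this sketch (the file and variable names are invented for the example):

```shell
outfile=$(mktemp)

# Braces: one redirection covers the whole group, run in the current shell.
{
    echo "first line"
    echo "second line"
} > "$outfile"

cat "$outfile"

# Parentheses: the group runs in a subshell, so assignments don't escape it.
( subshell_var="set inside" ) > /dev/null
echo "outside the subshell: '${subshell_var:-unset}'"    # prints: outside the subshell: 'unset'
```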
Process the file line by line
Take everything one step at a time
This pattern helps you process configuration files, Markdown text, and other simple text files. It is most useful when each line of the file is largely independent of the rest:
while IFS='' read -r line; do
...
command $line
...
done < "$filename"
There's a lot here, so take it slowly. The read builtin takes a line from stdin and stores it in a variable – in this case, line. It returns false when there is nothing left to read, so combined with the input redirection from the named file, the loop continues until all input is exhausted. Setting IFS (the internal field separator) to the empty string prevents read from trimming leading and trailing whitespace, and the -r option preserves backslashes, so each line arrives exactly as written.
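A self-contained sketch, using a small generated sample file so the whitespace preservation is visible:

```shell
# Create a two-line sample file, one line with leading spaces.
filename=$(mktemp)
printf '  indented line\nplain line\n' > "$filename"

# Count and echo each line; IFS='' keeps the indentation intact.
count=0
while IFS='' read -r line; do
    count=$((count + 1))
    echo "[$count] $line"
done < "$filename"

echo "processed $count lines"    # prints: processed 2 lines
```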
Use heredocs to write to a file
This particular syntax seems cleaner
Heredoc syntax is one of those things you'll keep finding reasons to use after picking it up. And it's more powerful than you might imagine, as this example shows:
bar() {
cat <<- EOF > "$filename"
Enter lines
of text here
EOF
}
A heredoc lets you redirect multiple lines of input without awkward escape characters or repeated redirection syntax. The one in this example goes further, with two interesting features.
First, it uses the <<- variant – a dash after the two angle brackets – to strip leading tabs from each line between the delimiters. This lets you indent the heredoc body to line things up nicely, which is a particular concern inside a function or any other block construct.
Second, it uses cat with output redirection to send the contents of the heredoc to a file. This makes it trivial to populate a file from a script, be it a README, a configuration file, or something else.
If you want variable expansion inside the heredoc, make sure to leave the delimiter on the first line unquoted.
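Here is a runnable sketch of that expansion behavior; it uses the plain << form so the body's indentation stays visible, and the project name and file contents are invented for the example:

```shell
filename=$(mktemp)
project="myproject"

# The unquoted EOF delimiter allows $project to expand inside the body.
cat << EOF > "$filename"
# $project

Generated by a shell script.
EOF

cat "$filename"
```

Quoting the delimiter ('EOF') would instead write the literal text $project, which is useful when generating scripts that contain their own variables.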
Stand on the shoulders of giants
Browse popular shell projects on GitHub to get inspired and learn new techniques. If you come across a syntax or command you don't understand, look it up to learn why it's used that way. Learn from other shell programmers, and your scripts will get better and better.