Unit testing Bash scripts

Leaking Bash options and modularization challenges

Unit testing in Bash is perhaps even more important than in most other languages. Because the language has so many sharp edges, there is even more to gain from a test suite during development. Although Bash comes with its own unique set of challenges when it comes to modularization, it’s possible to isolate and test individual components.

Requirements for unit testing

In Bash you can define functions to capture part of the business logic.

function hello() {
  local username=$1
  echo "Hello, $username"
}

Functions can be invoked from the same script file, but it’s also possible to import definitions into other scripts by using the source command.

source greeting.sh
hello "John"

These tools provide the foundations for unit testing.

source greeting.sh

result=$(hello "John")
expected="Hello, John"

if [[ "${result}" == "${expected}" ]]; then
  echo "Test passed!"
else
  echo "Test failed!"
  exit 1
fi

(This example does not leverage a testing framework, but conceptually it would look similar.)
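The repeated if/else can be pulled into a small helper so each test case reads as a single line. A minimal sketch, where assert_equals is our own name rather than part of any framework:

```shell
# A tiny assertion helper (our own, not from a testing framework).
function assert_equals() {
  local expected="$1"
  local actual="$2"
  local message="$3"
  if [[ "${actual}" == "${expected}" ]]; then
    echo "PASS: ${message}"
  else
    echo "FAIL: ${message} (expected '${expected}', got '${actual}')"
    return 1
  fi
}

function hello() {
  local username=$1
  echo "Hello, $username"
}

assert_equals "Hello, John" "$(hello "John")" "greets the user by name"
# → PASS: greets the user by name
```

Returning a non-zero status from the failing branch lets a test runner (or `set -e` in the test script itself) stop on the first failure.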

However, it’s not quite that simple in Bash. Even if a script file defines several functions, it typically also invokes them itself.

# define function
function hello() {
  local username=$1
  echo "Hello, $username"
}

# call function
hello "John"

In this case, sourcing the file would not only include the function definitions but also execute them immediately.

Let’s see two approaches on how to deal with this.


Modularization

A possible solution is to separate the script into multiple files by extracting the function definitions into a separate file and leaving the rest behind. The file with the definitions could be easily unit tested.

Slicing up a big file into multiple, smaller chunks is probably not a bad idea anyway; it’s easier to reuse and understand smaller modules.

Let’s create a more modular version of the previous example by creating the following two files in the same directory.


informal_module.sh:

function hello() {
  local username=$1
  echo "Hello, $username"
}


greeting.sh:

source informal_module.sh
hello "John"


It looks clean and nice, but unfortunately, it would only work in some cases. source works relative to the current working directory (where the script is invoked from) rather than to the location of the script itself.

Depending on which directory we are at the time when we execute the previous example, it might fail to include the definitions from informal_module.sh.

According to Stack Overflow (1, 2), the BASH_SOURCE variable can be used reliably to determine the path to the directory of the script file. This can be used to include other scripts relative to its location.

Let’s see how greeting.sh can be modified to overcome this problem.

script_dir="$( cd "$( dirname "${BASH_SOURCE[0]}" )" && pwd )"
source "${script_dir}/informal_module.sh"
hello "John"

Detect sourcing

Another alternative suggested by Darin London is to do what Python does and modify the script to detect if it’s being run directly, or if it is just being sourced.

function hello() {
  local username=$1
  echo "Hello, $username"
}

# Call the function only if the script is executed directly.
if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
  hello "John"
fi

This script can be executed normally to do its job, but if it’s sourced into a test file, only the function definitions will be imported, without doing anything else.
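To see the guard in action, a short sketch can generate the guarded script in a temporary directory and exercise it both ways (the file and directory names here are ours, purely for illustration):

```shell
workdir=$(mktemp -d)

# A guarded script, as above.
cat > "${workdir}/greeting.sh" <<'EOF'
function hello() {
  local username=$1
  echo "Hello, $username"
}

if [[ "${BASH_SOURCE[0]}" == "${0}" ]]; then
  hello "John"
fi
EOF

# Executed directly, BASH_SOURCE[0] equals $0, so the call at the bottom runs.
bash "${workdir}/greeting.sh"          # → Hello, John

# Sourced, the two differ, so only the definition is imported.
source "${workdir}/greeting.sh"
hello "Jane"                           # → Hello, Jane

rm -rf "${workdir}"
```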

Dealing with leaking Bash Options

The set builtin is capable of changing how Bash works. Some examples are:

  • set -e: Exit immediately if a pipeline, which may consist of a single simple command, a list, or a compound command, returns a non-zero status.
  • set -u: Treat unset variables and parameters as an error and exit immediately.
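A quick, self-contained way to observe both options (each case runs in a subshell so the settings do not leak into the calling shell):

```shell
# Without set -e, execution continues past a failing command.
( false; echo "reached without -e" )

# With set -e, the subshell stops at the first failing command.
( set -e; false; echo "never printed" )
echo "with -e the subshell exited with status $?"

# With set -u, expanding an unset variable is treated as an error.
if ! ( set -u; : "${undefined_var}" ) 2>/dev/null; then
  echo "with -u the unset variable was rejected"
fi
```

Note that set -e has well-known exceptions: it is suppressed for commands tested in an if condition or as part of a && / || list, which is one reason its value is debated.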

Depending on your programming style, setting some of these options at the beginning of a script might be considered as a best practice (see the Unofficial Bash Strict Mode), or something to be avoided completely (Why doesn’t set -e do what I expected?, What are the advantages and disadvantages of using set -u?).

In any case, it’s important to keep this in mind when designing for testing and modularity, because sourcing a file that calls set might globally change how Bash interprets commands.

Bash options can make it harder to design reusable modules, because they have to work nicely with different combinations of options, depending on which file they are imported into.

It’s even worse when a sourced function changes one of the options, thus changing how the original script and other included scripts behave.

For unit testing, where script files are sourced into the test case, this can also be a problem.

Consider this example where we use diff in a test to compare files:

# include functions to be tested
source "${SRC}/greeting.sh"

# exercise a function
hello "John" formal_greeting.txt informal_greeting.txt

# assert that the two produced files are different
diff formal_greeting.txt informal_greeting.txt

if [ $? -eq 1 ]; then
  echo "Test passed!"
else
  echo "Test failed! Formal and informal greetings for the user should be different!"
  exit 1
fi

If greeting.sh does not change the Bash options, all is fine. However, if it sets set -e, the test will misbehave: it will terminate immediately when diff finishes with a non-zero exit code.

Similar problems can arise with set -u, as it changes how undefined variables can be detected.
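A sketch of that interaction: a common "is this variable empty or unset?" check errors out under set -u unless the ${var:-} default expansion is used (the variable name here is ours, purely for illustration):

```shell
# Plain expansion of an unset variable fails under set -u.
if ! ( set -u; [[ -z "${maybe_unset}" ]] ) 2>/dev/null; then
  echo "plain check fails under -u"
fi

# The :- default expansion keeps the same check working.
( set -u
  if [[ -z "${maybe_unset:-}" ]]; then
    echo "defaulted check works under -u"
  fi
)
```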

It’s possible to query the current Bash options with shopt and set -o, and according to this SO post it’s also possible to save and later restore shell options, but it might not work in all environments and adds quite some complexity. However, this might come in handy in cases where a function is sourced into multiple scripts using different options.
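The save-and-restore approach can be sketched with set +o, which prints the current settings as a series of replayable set commands (this sketch assumes the script starts with errexit disabled):

```shell
# Snapshot the current option settings as replayable `set` commands.
saved_options=$(set +o)

set -e   # temporarily enable exit-on-error
set -u   # and treat unset variables as errors

# Replay the snapshot to restore the original settings.
eval "$saved_options"

# $- lists the currently enabled single-letter options;
# errexit should be off again if it was off at the start.
if [[ $- != *e* ]]; then
  echo "errexit restored to off"
fi
```

Note that set +o only covers set options; shopt options would need a separate snapshot via shopt -p.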

Although a good testing framework can help a lot by isolating the individual test cases from each other, keep Bash options in mind when designing your tests.

Conclusion

Testing Bash scripts is a rarely used practice; after all, Bash doesn’t make it any easier. However, general best practices for software development, such as modularity and unit testing—even if they are not always easy to achieve—can greatly enhance the experience of programming such scripts.

12 May 2020