# Bash Scripting for QA
Bash is the glue that holds CI/CD pipelines, test environments, and automation workflows together. You do not need to be a shell scripting expert, but you must be comfortable writing scripts that set up test environments, run test suites, and process results.
## The Essential Script Template

Every bash script should start with this:

```bash
#!/bin/bash
set -euo pipefail
```
| Flag | Meaning | Why It Matters |
|---|---|---|
| `set -e` | Exit on any error | Prevents scripts from silently continuing after a failure |
| `set -u` | Error on undefined variables | Catches typos in variable names |
| `set -o pipefail` | Pipe fails if any command in the pipe fails | `cmd1 \| cmd2` fails even when only `cmd1` fails |
Without these flags, a script can fail silently — the database reset fails, but the tests still run against stale data.
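The difference is easy to demonstrate without any real tooling. In this minimal sketch, `false` stands in for a failing setup step:

```shell
#!/bin/bash
# Without pipefail, a pipeline's exit code is that of its LAST command,
# so the failure of `false` is masked by `true`:
bash -c 'false | true'; echo "no pipefail: $?"                # prints "no pipefail: 0"

# With pipefail, the failure propagates:
bash -c 'set -o pipefail; false | true'; echo "pipefail: $?"  # prints "pipefail: 1"

# With set -e, the script stops at the first failing command:
bash -c 'set -e; false; echo "still running"' || echo "aborted at false"
```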
## Test Environment Setup Script

```bash
#!/bin/bash
set -euo pipefail

echo "=== Resetting test database ==="
psql -h localhost -U testuser -d testdb -f reset.sql

echo "=== Starting test server ==="
npm run start:test &
SERVER_PID=$!

# Wait for server to be ready (up to 30 seconds)
echo "=== Waiting for server ==="
for i in {1..30}; do
  if curl -s http://localhost:3000/health > /dev/null 2>&1; then
    echo "Server ready after ${i}s"
    break
  fi
  if [ "$i" -eq 30 ]; then
    echo "ERROR: Server did not start within 30s"
    kill "$SERVER_PID" 2>/dev/null || true
    exit 1
  fi
  sleep 1
done

echo "=== Running tests ==="
# Capture the exit code without letting `set -e` abort the script:
# a plain `pytest ...` followed by `TEST_EXIT=$?` would never reach the
# assignment on failure, because `set -e` exits first.
TEST_EXIT=0
pytest tests/ --junitxml=results.xml || TEST_EXIT=$?

echo "=== Stopping server ==="
kill "$SERVER_PID" 2>/dev/null || true

echo "=== Done (exit code: $TEST_EXIT) ==="
exit $TEST_EXIT
```
### Key Patterns in This Script

- **Background process (`&` and `$!`):** Start the server in the background and capture its PID
- **Health check loop:** Wait for the server to be ready before running tests
- **Exit code capture (`$?`):** Save the test exit code before running cleanup
- **Cleanup on exit:** Kill the server regardless of test outcome
- **Graceful error handling:** `kill $PID 2>/dev/null || true` does not fail if the process is already dead
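The patterns above can be exercised as a self-contained sketch, with `sleep` standing in for the server and `false` for a failing test run:

```shell
#!/bin/bash
set -euo pipefail

# `sleep` stands in for a long-running test server
sleep 60 &
SERVER_PID=$!

# `false` stands in for a failing test suite; putting it on the left
# of `||` records the exit code without tripping `set -e`
TEST_EXIT=0
false || TEST_EXIT=$?

# Graceful cleanup: `|| true` tolerates an already-dead process
kill "$SERVER_PID" 2>/dev/null || true

echo "tests exited with $TEST_EXIT"   # prints "tests exited with 1"
# A real script would end with: exit "$TEST_EXIT"
```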
## Environment Variables

Environment variables parameterize test runs across environments.

```bash
#!/bin/bash
set -euo pipefail

# Default values with fallback
BASE_URL="${API_BASE_URL:-http://localhost:3000}"
DB_HOST="${TEST_DB_HOST:-localhost}"
WORKERS="${TEST_WORKERS:-4}"

echo "Running tests against: $BASE_URL"
echo "Database: $DB_HOST"
echo "Workers: $WORKERS"

API_BASE_URL="$BASE_URL" pytest tests/ -n "$WORKERS"
```

```bash
# Run against different environments
API_BASE_URL=https://staging.example.com ./run_tests.sh
API_BASE_URL=https://production.example.com TEST_WORKERS=1 ./run_tests.sh
```
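The `${VAR:-default}` expansion is worth internalizing. A small sketch showing its behavior in the three cases (unset, set, empty):

```shell
#!/bin/bash
set -euo pipefail

unset API_BASE_URL
echo "${API_BASE_URL:-http://localhost:3000}"   # unset: prints the default

API_BASE_URL="https://staging.example.com"
echo "${API_BASE_URL:-http://localhost:3000}"   # set: prints the real value

API_BASE_URL=""
echo "${API_BASE_URL:-http://localhost:3000}"   # empty also triggers :-
# (use ${VAR-default}, without the colon, to treat empty as set)
```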
## Useful Bash Constructs

### Conditional Execution

```bash
# Run smoke tests first; only run full suite if smoke passes
pytest tests/smoke/ && pytest tests/full/

# Run cleanup regardless of test outcome
pytest tests/ || true
./cleanup.sh
```
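Under the hood these are just exit-code checks, which you can see with `true` and `false` standing in for passing and failing test commands:

```shell
#!/bin/bash
# `true`/`false` stand in for passing/failing test commands

true && echo "full suite runs"         # && right side runs only on success
false || echo "cleanup fallback runs"  # || right side runs only on failure

# `|| true` forces a zero exit code, so a `set -e` script keeps going
false || true
echo "script continues"
```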
### File and Directory Operations

```bash
# Create test output directory (including parents)
mkdir -p test-results/screenshots

# Check if file exists
if [ -f "test-data.sql" ]; then
  psql testdb < test-data.sql
else
  echo "WARNING: test-data.sql not found"
fi

# Clean up test artifacts older than 7 days
find test-results/ -name "*.png" -mtime +7 -delete
```
### Looping and Parallel Execution

```bash
# Run tests for each environment
for env in staging production; do
  echo "Testing $env..."
  API_BASE_URL="https://${env}.example.com" pytest tests/smoke/ \
    --junitxml="results-${env}.xml"
done

# Parallel execution with xargs (-P 3 runs three jobs at once)
echo "staging production sandbox" | tr ' ' '\n' | \
  xargs -P 3 -I {} bash -c 'API_BASE_URL="https://{}.example.com" pytest tests/smoke/'
```
### String Processing

```bash
# Extract the summary line from pytest output
RESULT=$(pytest tests/ --tb=no 2>&1 | tail -1)
echo "Result: $RESULT"
# e.g. "5 passed, 2 failed in 12.3s"

# Check if tests failed
if echo "$RESULT" | grep -q "failed"; then
  echo "TESTS FAILED"
  exit 1
fi
```
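A canned summary line (hypothetical counts) shows the same pattern without a live pytest run, plus a `grep -oE` variant that pulls out the individual numbers:

```shell
#!/bin/bash
set -euo pipefail

# A canned pytest summary line stands in for real output
RESULT="5 passed, 2 failed in 12.3s"

if echo "$RESULT" | grep -q "failed"; then
  echo "TESTS FAILED"
fi

# Extract the numeric counts: -o prints only the matched text
PASSED=$(echo "$RESULT" | grep -oE '[0-9]+ passed' | cut -d' ' -f1)
FAILED=$(echo "$RESULT" | grep -oE '[0-9]+ failed' | cut -d' ' -f1)
echo "passed=$PASSED failed=$FAILED"   # prints "passed=5 failed=2"
```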
## CI/CD Pipeline Scripts

### GitHub Actions Helper Script

```bash
#!/bin/bash
set -euo pipefail

# Parse command line arguments
SUITE="${1:-smoke}"
ENVIRONMENT="${2:-staging}"

echo "Running $SUITE tests against $ENVIRONMENT"

# Set environment-specific variables
case "$ENVIRONMENT" in
  staging)
    export API_BASE_URL="https://api.staging.example.com"
    export DB_HOST="staging-db.internal"
    ;;
  production)
    export API_BASE_URL="https://api.example.com"
    export DB_HOST="prod-db.internal"
    ;;
  *)
    echo "Unknown environment: $ENVIRONMENT"
    exit 1
    ;;
esac

# Run the appropriate test suite
case "$SUITE" in
  smoke)
    pytest tests/ -m smoke --junitxml=results.xml
    ;;
  full)
    pytest tests/ --junitxml=results.xml -n 4
    ;;
  api)
    pytest tests/api/ --junitxml=results.xml
    ;;
  *)
    echo "Unknown suite: $SUITE"
    exit 1
    ;;
esac
```
### Docker-Based Test Runner

```bash
#!/bin/bash
set -euo pipefail

echo "=== Building test image ==="
docker build -t test-runner -f Dockerfile.test .

echo "=== Starting dependencies ==="
docker compose -f docker-compose.test.yml up -d db redis

echo "=== Waiting for database ==="
for i in {1..30}; do
  if docker compose -f docker-compose.test.yml exec -T db pg_isready; then
    break
  fi
  sleep 1
done

echo "=== Running tests ==="
# `|| TEST_EXIT=$?` keeps `set -e` from aborting before cleanup runs
TEST_EXIT=0
docker run --rm \
  --network test-network \
  -e API_BASE_URL=http://app:3000 \
  -e DB_HOST=db \
  -v "$(pwd)/test-results:/app/test-results" \
  test-runner pytest tests/ --junitxml=/app/test-results/results.xml \
  || TEST_EXIT=$?

echo "=== Cleaning up ==="
docker compose -f docker-compose.test.yml down

exit $TEST_EXIT
```
## Trap for Cleanup

Ensure cleanup runs even if the script crashes:

```bash
#!/bin/bash
set -euo pipefail

cleanup() {
  echo "Cleaning up..."
  # ${SERVER_PID:-} avoids a `set -u` error if the trap fires
  # before the server has started
  kill "${SERVER_PID:-}" 2>/dev/null || true
  docker compose down 2>/dev/null || true
}

# Register cleanup to run on EXIT, regardless of how the script ends
trap cleanup EXIT

# Now start things up; cleanup is guaranteed to run
docker compose up -d
npm run start:test &
SERVER_PID=$!    # $! is the PID of the last *backgrounded* command

pytest tests/
```
## Common Bash Reference

| Concept | Syntax | Why It Matters |
|---|---|---|
| `set -euo pipefail` | Script header | Catches errors instead of silently continuing |
| `$?` | Exit code of last command | Check if a tool succeeded before proceeding |
| `$!` | PID of last background process | Track and kill background servers |
| `${VAR:-default}` | Default value | Parameterize without requiring the variable |
| `command &` | Run in background | Start servers without blocking |
| `trap cleanup EXIT` | Run on exit | Guaranteed cleanup |
| `xargs -P N` | Parallel execution | Run repetitive tasks in parallel |
| `2>/dev/null` | Suppress stderr | Hide expected error messages |
| `\|\| true` | Ignore failure | Run cleanup commands without tripping `set -e` |
## Practical Exercise
Write a bash script that:
- Accepts an environment name as a command-line argument (staging/production)
- Starts a local database using Docker
- Waits for the database to be ready (health check loop)
- Runs smoke tests with pytest
- Captures the test exit code
- Cleans up Docker containers (using trap)
- Exits with the test exit code
## Key Takeaways

- Always start scripts with `set -euo pipefail` to catch errors
- Use environment variables to parameterize across environments
- Health check loops prevent tests from running against unready services
- Capture test exit codes before running cleanup
- Use `trap cleanup EXIT` to guarantee cleanup runs
- Bash is the glue between tools: learn enough to orchestrate your test infrastructure
Interview Talking Point: "I write test automation in Python and TypeScript. I use OOP for framework structure — page objects with composition over inheritance — and functional patterns like map/filter for data transformation. I am comfortable with async/await, regex for log parsing, and Bash scripting for CI pipeline tasks. Programming is not a secondary skill for me; it is how I build reliable, maintainable test infrastructure."