Build an HTTP Status Code Checker Using Bash For Loop + curl
Monitoring website availability is a common task for developers, sysadmins, and DevOps engineers. While there are plenty of website uptime monitoring tools available, sometimes you just need a lightweight, quick solution right from the terminal. In this post, we'll walk through how to build a simple HTTP status code checker using a Bash loop and the ubiquitous `curl` command. The script reads URLs from a plain text file and prints their status codes, making it perfect for lightweight uptime checks or automation scripts.
1. Why Use Bash and curl?
Bash is installed on virtually every Unix-based system, and `curl` is the Swiss Army knife of network tools. Together, they let us make HTTP requests and process data without relying on third-party dependencies. This makes the solution perfect for scripting environments, cron jobs, and CI pipelines.
Some reasons to use Bash + curl:
- No external dependencies beyond what’s typically pre-installed
- Lightweight and fast
- Automatable and scriptable for cron jobs or CI/CD workflows
- Useful for quick diagnostics and bulk URL checks
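Before writing the full script, you can sanity-check the core building block with a one-off command. This one-liner prints just the status code for a single URL (example.com here is a stand-in for any address):

```bash
# Fetch the URL, discard the body, and print only the HTTP status code
curl -o /dev/null -s -w "%{http_code}\n" https://example.com
```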
2. Preparing the Input: The URLs File
Start by creating a text file named `urls.txt` that contains the list of websites or endpoints you want to check. Each URL should go on its own line:
```
https://google.com
https://github.com
https://nonexistent.example.com
https://httpstat.us/404
```
This plain text format makes it easy to modify and extend the list without editing code. Keep in mind that URLs should be full addresses including the `http://` or `https://` scheme.
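As a quick, optional sanity check on the file, this one-liner flags any lines (with their line numbers) that don't start with a scheme; note that blank lines will be flagged too:

```bash
# List lines in urls.txt that do not begin with http:// or https://
grep -nEv '^https?://' urls.txt
```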
3. Writing the Bash Script
Let's write a Bash script named `status_checker.sh` that loops through all URLs in `urls.txt` and prints their HTTP status codes using `curl`:
```bash
#!/bin/bash

# Check that the input file exists
if [ ! -f urls.txt ]; then
  echo "Error: urls.txt not found!"
  exit 1
fi

# Loop through each URL in the file; the || test also catches
# a final line that lacks a trailing newline
while IFS= read -r url || [[ -n "$url" ]]; do
  # Skip blank lines
  if [[ -z "$url" ]]; then
    continue
  fi
  status=$(curl -o /dev/null -s -w "%{http_code}" "$url")
  echo "$url -> HTTP $status"
done < urls.txt
```
Let's break down how this works:
- `curl -o /dev/null` discards the body of the response
- `-s` enables silent mode, suppressing progress output
- `-w "%{http_code}"` outputs only the HTTP status code
- We loop over lines in the file using `while IFS= read -r` to handle spaces and special characters robustly
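To try it out, make the script executable and run it from the directory containing `urls.txt`. The exact codes you see will depend on each site's behavior (google.com, for instance, typically answers with a redirect when fetched without `-L`):

```bash
chmod +x status_checker.sh
./status_checker.sh
# Example output (yours will vary):
# https://google.com -> HTTP 301
# https://github.com -> HTTP 200
# https://nonexistent.example.com -> HTTP 000
# https://httpstat.us/404 -> HTTP 404
```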
4. Making It Smarter: Output Formatting and Error Handling
Let's enhance the script by adding color-coded output and handling unreachable URLs (e.g., DNS errors and connection timeouts):
```bash
#!/bin/bash

if [ ! -f urls.txt ]; then
  echo "Error: urls.txt not found!"
  exit 1
fi

# ANSI color codes
reset="\033[0m"
green="\033[0;32m"
yellow="\033[1;33m"
red="\033[0;31m"

while IFS= read -r url || [[ -n "$url" ]]; do
  if [[ -z "$url" ]]; then
    continue
  fi
  status=$(curl --max-time 10 -o /dev/null -s -w "%{http_code}" "$url")
  # curl prints 000 when no HTTP response was received (DNS error, timeout, ...)
  if [[ "$status" =~ ^[0-9]{3}$ && "$status" != "000" ]]; then
    if [[ "$status" -ge 200 && "$status" -lt 300 ]]; then
      echo -e "$url -> ${green}HTTP $status${reset}"
    elif [[ "$status" -ge 400 && "$status" -lt 600 ]]; then
      echo -e "$url -> ${red}HTTP $status${reset}"
    else
      echo -e "$url -> ${yellow}HTTP $status${reset}"
    fi
  else
    echo -e "$url -> ${red}Failed to retrieve status${reset}"
  fi
done < urls.txt
```
Here's what's changed:
- We added `--max-time 10` to prevent hanging on slow connections
- If `curl` fails to retrieve a valid status code (on a DNS error or timeout, it reports `000`), we catch it and print a failure
- We use ANSI escape codes for green (200s), yellow (other valid responses), and red (errors)
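One optional refinement, sketched below: if you redirect the output to a file (as in the next section), the log will be littered with raw escape codes. A common idiom is to define the colors only when stdout is a terminal:

```bash
# Define colors only when stdout is a terminal; otherwise leave them empty
if [ -t 1 ]; then
  reset="\033[0m"; green="\033[0;32m"; yellow="\033[1;33m"; red="\033[0;31m"
else
  reset=""; green=""; yellow=""; red=""
fi
```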
5. Automation and Performance Tips
This Bash script is great for small lists, but you may run into bottlenecks if checking hundreds of URLs serially. Here are some tips to scale or automate it:
- Parallel Processing: Use GNU `parallel` or background jobs to make simultaneous `curl` requests (see the sketch after this list).
- Cron Integration: Schedule checks by placing a line like `0 * * * * /path/status_checker.sh > result.txt` in your crontab.
- Logging: Redirect output to a timestamped file for historical tracking: `./status_checker.sh > logs/status_$(date +%Y%m%d).log`
- Webhook Alerts: You could extend the script to trigger Slack or Discord webhooks on failures (also sketched below).
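Here is what a parallel variant could look like, as a minimal sketch using `xargs -P` (the worker count of 8 is an arbitrary choice; GNU `parallel` would work similarly):

```bash
#!/bin/bash
# Run status checks concurrently: xargs -P keeps up to 8 curl processes running
check_url() {
  local status
  status=$(curl --max-time 10 -o /dev/null -s -w "%{http_code}" "$1")
  echo "$1 -> HTTP $status"
}
export -f check_url  # make the function available to the bash -c subshells

grep -v '^[[:space:]]*$' urls.txt | xargs -P 8 -I {} bash -c 'check_url "$1"' _ {}
```

And for the webhook idea, a hedged sketch of a Slack-style notifier you could call from the failure branch (`WEBHOOK_URL` is a placeholder for your own incoming-webhook URL; Discord expects a `content` field instead of `text`):

```bash
# Post a failure alert to a Slack-style incoming webhook (a sketch, adjust to taste)
notify_failure() {
  local url="$1" status="$2"
  curl -s -X POST -H 'Content-Type: application/json' \
       -d "{\"text\": \"Status check failed: $url (HTTP $status)\"}" \
       "$WEBHOOK_URL" > /dev/null
}
```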
For large-scale monitoring, you may want to move to a language with better concurrency primitives, such as Go or Python's asyncio, but for day-to-day tasks, Bash gets the job done quickly and simply.
6. Final Thoughts
This minimal HTTP status checker is a powerful addition to your toolbox, especially when working on remote servers or constrained environments. With just a few lines of shell scripting, you can gain visibility into server uptime, verify deployments, or validate endpoints. By building a habit of scripting small utilities like this in Bash, you’ll improve both your automation skills and your operational efficiency.
Happy scripting!