@CronMaster3000: Schedule Daily Reports via Bash + crontab + SQL
In a world overflowing with data, automating daily reporting is no longer just nice-to-have—it’s a must. Thankfully, with a simple combination of Bash scripting, crontab scheduling, and SQL queries, you can automate daily KPI report generation and delivery effortlessly. This guide walks you through setting up a complete workflow that extracts data from a database, formats it as CSV, and sends it via email.
1. The Use Case: Automating the Daily KPI Report
Imagine you’re managing a sales dashboard, and you want a daily summary email showing yesterday’s KPIs (e.g., total sales, number of new users, etc.). Instead of pulling this data manually each day, why not automate it via a scheduled job using cron, Bash, and SQL?
Here’s the overall flow:
- Schedule a cron job at 6:00 AM every day
- Run a Bash script that connects to the database
- Execute SQL queries and export results to a CSV file
- Email the report to a predefined list of recipients
2. Writing the Bash Script
Let’s create a script called daily_report.sh that runs a SQL query and emails the results. We’ll assume you’re working with PostgreSQL, but you can adapt this for MySQL or SQLite easily.
#!/bin/bash
# Configuration
DB_NAME=my_database
DB_USER=admin
DB_HOST=localhost
REPORT_DIR="/home/reports"
DATE=$(date +"%Y-%m-%d")
FILENAME="kpi_report_$DATE.csv"
EMAIL="team@company.com"
# SQL query
QUERY="COPY (SELECT date, total_sales, new_customers FROM daily_kpis WHERE date = CURRENT_DATE - INTERVAL '1 day') TO STDOUT WITH CSV HEADER;"
# Generate report
mkdir -p "$REPORT_DIR"
psql -U "$DB_USER" -h "$DB_HOST" -d "$DB_NAME" -c "$QUERY" > "$REPORT_DIR/$FILENAME"
# Send email
mail -s "Daily KPI Report - $DATE" -A "$REPORT_DIR/$FILENAME" "$EMAIL" < /dev/null
Why this works: We use PostgreSQL’s COPY command to dump query output directly to CSV. The mail command sends this file as an attachment. Simple and effective. One caveat: the -A attachment flag is GNU mailutils syntax; other implementations such as s-nail use -a instead, so check which mail you have installed.
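For context, the query assumes a pre-aggregated daily_kpis table. Here is a minimal hypothetical schema so you can test the pipeline end to end (your real column types may differ):
-- Hypothetical schema matching the columns the query selects
CREATE TABLE daily_kpis (
    date          date PRIMARY KEY,
    total_sales   numeric(12,2) NOT NULL,
    new_customers integer NOT NULL
);
Also worth knowing: on PostgreSQL 12 or newer, psql --csv -c "SELECT ..." produces the same CSV output without the COPY wrapper.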
3. Scheduling with crontab
Now, let’s schedule this script to run automatically every day at 6 AM. Run crontab -e and add the following line:
0 6 * * * /bin/bash /home/reports/daily_report.sh > /home/reports/cron.log 2>&1
Make sure your script is executable:
chmod +x /home/reports/daily_report.sh
This cron expression means: at minute 0 of hour 6 (6:00 AM), every day, run the script. We also redirect stdout and stderr to a log file for debugging. Keep in mind that cron runs with a minimal environment (limited PATH, no login shell profile), so stick to absolute paths inside the script.
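Two handy variations on the schedule, both standard cron syntax (the admin address below is a placeholder):
# Weekdays only, at 6:00 AM (1-5 = Monday through Friday)
0 6 * * 1-5 /bin/bash /home/reports/daily_report.sh > /home/reports/cron.log 2>&1

# MAILTO makes cron email any unredirected output to this address
MAILTO=admin@company.com
0 6 * * * /bin/bash /home/reports/daily_report.sh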
4. Tips for Robustness & Scalability
Here are a few best practices to make your report pipeline more reliable:
- Error Handling: Use set -e in the script to exit on failure (see the hardened sketch after this list).
- Logging: Store logs for both success and error tracking. Use logger or append timestamped entries to log files.
- Notification on Failure: Add conditional logic to notify an admin (e.g., send a failure email if the query fails).
- SQL Optimization: Maintain indexes for queried columns to reduce execution time.
- Security: Avoid hardcoding passwords; use .pgpass file with restricted permissions.
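Putting several of these tips together, here is a minimal hardened sketch of the same script. The log path and admin address are placeholders, and the failure email reuses the same mail setup as before:
#!/bin/bash
set -euo pipefail   # exit on errors, unset variables, and failed pipeline stages

LOG="/home/reports/daily_report.log"
ADMIN="admin@company.com"          # placeholder failure-notification address
DATE=$(date +"%Y-%m-%d")
REPORT="/home/reports/kpi_report_$DATE.csv"

# On any failure, write to syslog and alert the admin before exiting
notify_failure() {
    logger -t daily_report "KPI report failed on $DATE"
    echo "KPI report generation failed on $DATE. Check $LOG." \
        | mail -s "KPI report FAILED - $DATE" "$ADMIN"
}
trap notify_failure ERR

echo "$(date +'%F %T') starting report" >> "$LOG"

# Credentials come from ~/.pgpass (one line: host:port:db:user:password,
# permissions 0600), so no password appears in the script or in ps output.
psql -U admin -h localhost -d my_database -c "COPY (SELECT date, total_sales, new_customers FROM daily_kpis WHERE date = CURRENT_DATE - INTERVAL '1 day') TO STDOUT WITH CSV HEADER;" > "$REPORT"

mail -s "Daily KPI Report - $DATE" -A "$REPORT" team@company.com < /dev/null
echo "$(date +'%F %T') report sent" >> "$LOG"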
5. Advanced Enhancements
When you’re ready to get fancier:
- Multiple Queries: Combine several KPIs into a single report using JOINs, or run separate queries and aggregate the output.
- Email Formatting: Switch to mutt or a Python script for rich HTML emails with tables and graphs (see the sketch after this list).
- Version Control: Store your report scripts in Git and separate environments (dev/staging/prod).
- Containerize: Run the script in a Docker container for portable repeatability.
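For the richer-email route mentioned above, mutt can send an HTML body directly. A minimal sketch, assuming you have generated report.html from the query results:
# HTML body from stdin, CSV still attached; note that mutt requires
# `--` between the attachment list and the recipients
mutt -e "set content_type=text/html" \
     -s "Daily KPI Report - $(date +%F)" \
     -a "/home/reports/kpi_report_$(date +%F).csv" -- \
     team@company.com < report.html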
This system is already powerful, but as your KPIs evolve and teams grow, you can layer on sophistication without rewriting the core system.
6. Wrapping Up
Combining Bash, SQL, and cron jobs is a classic but incredibly efficient pattern for daily data automation. With only a few lines of code, you’ve created a robust pipeline for extracting and distributing actionable insights. You can now sleep well knowing the KPIs are always delivered before everyone gets to work. 💻☕
Happy scripting!