Backing up critical system files, configurations, and application data is essential for any production-grade Linux server. In this article, we walk you through building a powerful yet flexible backup script using Bash, ideal for developers, sysadmins, and DevOps engineers who want control over their backup logic without relying on external tools.
Why Use a Custom Backup Script?
While tools like rsnapshot, Bacula, or Duplicity are robust and mature, a custom script allows:
- Complete control over what gets backed up
- Seamless integration into existing workflows (e.g., GitLab, cron)
- Minimal dependencies
- Lightweight and portable logic
Key Features of the Script
This Bash script:
- Creates daily full or incremental backups
- Automatically includes the latest GitLab backup archive
- Backs up vital directories like /var/www, /etc, and more
- Uploads the final compressed archive to a remote FTP server
- Cleans up temporary files only after a successful upload
Let's explore it step by step.
1. Initial Setup & Parameters
We begin by defining whether the backup is full or incremental, based on the input argument:
#!/bin/bash -x
INCREMENTAL=$1
This allows calling the script as either:
./backup.sh # full backup
./backup.sh Y # incremental backup
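The upload step in section 8 also expects the FTP connection details to be defined. A minimal sketch with placeholder values (the host, user, and password below are assumptions; see the security tip at the end for safer ways to supply them):
BACKUPHOST="ftp.example.com"   # placeholder FTP server
BACKUPUSER="backupuser"        # placeholder account
BACKUPPASS="changeme"          # placeholder password; avoid hardcoding in shared scripts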
2. Trigger GitLab Backup (If Installed)
If your server hosts a GitLab instance, triggering its backup ensures Git repositories and metadata are saved:
gitlab-backup create
This assumes GitLab is installed via Omnibus and accessible via PATH.
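On hosts without GitLab, the command above would simply fail. A hedged guard that skips the step when the binary is absent:
if command -v gitlab-backup >/dev/null 2>&1; then
  gitlab-backup create
fi
Note that step 3 will still abort if no archive exists under /backup.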
3. Locate the Latest GitLab Backup
Once the GitLab backup completes, the script captures the most recent .tar file from the /backup directory:
LATEST_BACKUP=$(ls -t /backup/*.tar 2>/dev/null | head -n1)
if [ ! -f "$LATEST_BACKUP" ]; then
echo "Error: No GitLab backup found."
exit 1
fi
This ensures that the latest GitLab archive is included in the final backup, and errors are caught early.
4. Define What to Back Up
Here we define a comprehensive list of directories and files to include in the backup archive. This includes configuration files, website data, system settings, and the latest GitLab backup:
FILESTOBACKUP="/var/www* /etc/host* /etc/dhcp* /etc/sysconfig/network* $LATEST_BACKUP /usr/local/src* /etc/firewalld* /etc/group* /etc/shadow* /etc/gshadow* /etc/postfix* /etc/passwd* /etc/openldap* /etc/crontab*"
Using shell wildcards (*) ensures flexibility across file variations (e.g., the passwd- backup copies).
5. Define Output Filename & Target Location
The remote target directory is derived from the server's local IP address and hostname, and the archive filename includes the hostname and date to ensure uniqueness:
BACKUPLOC="$(ip addr | grep inet | grep 192.168 | awk '{print $2}' | cut -f1 -d"/")_$(hostname)"
BACKUPOUTPUT="/backup/$(hostname)-$(date +%Y%m%d)-full.tgz"
This dynamic naming is especially helpful when managing backups from multiple servers.
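Note that the filename above always ends in -full.tgz, even on incremental runs. A small hedged tweak (the SUFFIX variable is an assumption) would distinguish the two:
if [ "$INCREMENTAL" == "Y" ]; then
  SUFFIX="incr"
else
  SUFFIX="full"
fi
BACKUPOUTPUT="/backup/$(hostname)-$(date +%Y%m%d)-$SUFFIX.tgz"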
6. Include OS and Package Version Info
Before creating the archive, it's a good idea to capture OS and package versions, which is helpful for post-disaster recovery:
cat /etc/*release > /backup/vers.info
rpm -qa >> /backup/vers.info
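The rpm -qa call assumes an RPM-based distribution (RHEL, CentOS, Fedora); on Debian or Ubuntu hosts, dpkg -l records the same information:
cat /etc/*release > /backup/vers.info
dpkg -l >> /backup/vers.info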
7. Create the Backup Archive
Depending on the INCREMENTAL flag, the script uses tar to compress:
- Full backups: all defined files + GitLab backup + version info + recently modified files in /repos-old
- Incremental backups: only recently modified files
if [ ! -z "$INCREMENTAL" ] && [ "$INCREMENTAL" == "Y" ]; then
tar czvf "$BACKUPOUTPUT" $(find $FILESTOBACKUP -mtime -1)
else
tar czvf "$BACKUPOUTPUT" /backup/vers.info $FILESTOBACKUP $(find /repos-old -mtime -1)
fi
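tar's exit status is not checked here, so a failed or partial archive would still be uploaded and then deleted in step 9. A minimal guard, assuming you prefer to abort instead:
if [ $? -ne 0 ]; then
  echo "Error: tar failed, aborting backup."
  exit 1
fi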
8. Upload Backup to Remote Server
To ensure offsite safety, the script uploads the .tgz archive to a remote FTP server:
/usr/bin/ncftpput -u"$BACKUPUSER" -p"$BACKUPPASS" -m "$BACKUPHOST" "/NetBackup/$BACKUPLOC" "$BACKUPOUTPUT"
The -m option ensures the destination directory is created if it doesn't exist.
9. Cleanup on Success
To avoid cluttering the local disk, the script removes the archive and version info only if the upload was successful:
if [ $? -eq 0 ]; then
rm -f "$BACKUPOUTPUT" /backup/vers.info
fi
If the upload fails because of network or authentication issues, the local archive is kept, so no data is lost.
Example Cron Job
You can schedule this script via cron to run daily:
0 2 * * * /path/to/backup.sh >> /var/log/backup.log 2>&1
Or use systemd timers for more flexibility.
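A minimal sketch of a systemd service/timer pair (unit names and the script path are assumptions):
# /etc/systemd/system/backup.service
[Unit]
Description=Daily server backup

[Service]
Type=oneshot
ExecStart=/path/to/backup.sh

# /etc/systemd/system/backup.timer
[Unit]
Description=Run the backup script daily at 02:00

[Timer]
OnCalendar=*-*-* 02:00:00
Persistent=true

[Install]
WantedBy=timers.target
Enable it with systemctl enable --now backup.timer.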
Security Tip
Never hardcode credentials in scripts that are shared or versioned. Use:
- Environment variables
- .netrc files (for FTP/SFTP)
- External secrets managers (Vault, AWS Secrets Manager)
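For example, a minimal sketch that sources the FTP credentials from a root-only file instead of hardcoding them (the path and variable names are assumptions):
# Contents of /root/.backup.env (chmod 600, owned by root):
BACKUPHOST="ftp.example.com"
BACKUPUSER="backupuser"
BACKUPPASS="changeme"

# In backup.sh, load the credentials before the upload step:
. /root/.backup.env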
Final Thoughts
This Bash script is a great starting point for:
- Backing up key Linux system files
- Including GitLab archives
- Automating uploads to offsite locations
It's a practical DevOps solution that can be tailored to any environment, from startups to enterprise systems.