To keep a Linux server reliable, we need a stable, continuous backup system in case our data is lost or the server crashes.

The solution that works best for me is a bash script that compresses the data into a .tar.gz archive. To achieve this, create a file called backup_file.sh and paste the following command:

[code]#!/bin/bash
cd /home/backup/files/ && tar czf public_html-$(date +%Y-%m-%d).tar.gz /var/www/public_html[/code]
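
It is worth checking that the archive was actually created before relying on it. A quick sanity check, assuming the archive name and backup path used in the script above (the /tmp/restore_test target is only an example location):

[code]# List the contents of today's archive without extracting it
tar tzf /home/backup/files/public_html-$(date +%Y-%m-%d).tar.gz

# Restore it into a scratch directory if needed
mkdir -p /tmp/restore_test
tar xzf /home/backup/files/public_html-$(date +%Y-%m-%d).tar.gz -C /tmp/restore_test[/code]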

That covers the stored files; now we should back up the database as well. Open another bash file named backup_db.sh:
[code]#!/bin/bash
USER="database_user"
PASSWORD="user_password"
OUTPUT="/backup_directory/"

# Remove the previous dumps before creating new ones
rm -f "$OUTPUT"/*.gz > /dev/null 2>&1

# List all databases, skipping the header line
databases=$(mysql --user="$USER" --password="$PASSWORD" -e "SHOW DATABASES;" | tr -d "| " | grep -v Database)

for db in $databases; do
    if [[ "$db" != "information_schema" ]] && [[ "$db" != _* ]]; then
        echo "Dumping database: $db"
        mysqldump --force --opt --user="$USER" --password="$PASSWORD" --databases "$db" > "$OUTPUT"/$(date +%Y-%m-%d)."$db".sql
        gzip "$OUTPUT"/$(date +%Y-%m-%d)."$db".sql
    fi
done[/code]

USER = the MySQL user to log in with
PASSWORD = the MySQL user's password
OUTPUT = the directory where the database backups will be stored
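
As a side note, passing the password on the command line can expose it to other users through the process list. One possible alternative (not part of the script above) is a MySQL option file read with --defaults-extra-file; a minimal sketch, where the file path /root/.backup.my.cnf is just an assumed example:

[code]# /root/.backup.my.cnf  (restrict it with chmod 600)
[client]
user=database_user
password=user_password

# Then call the clients without --user/--password; the option must come first:
# mysql --defaults-extra-file=/root/.backup.my.cnf -e "SHOW DATABASES;"
# mysqldump --defaults-extra-file=/root/.backup.my.cnf --databases some_db[/code]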

With this we have both a file and a database backup; now we need to run the scripts periodically, and a good way to do that is with cron. Add the following lines to /etc/crontab (the system crontab, which includes the user field):

[code]00 1 * * * root /directory_path/backup_file.sh
00 1 * * * root /directory_path/backup_db.sh[/code]
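
If you prefer a per-user crontab edited with crontab -e instead of /etc/crontab, drop the user field; a sketch with the same assumed paths:

[code]# crontab -e (runs as the invoking user, so there is no "root" column)
00 1 * * * /directory_path/backup_file.sh
00 1 * * * /directory_path/backup_db.sh[/code]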

Cron will run the file and database backups every day at 1 AM, which means the backup storage will eventually fill up.

To solve this, I added an extra bash script that deletes files older than, for example, 3 days.

[code]#!/bin/bash
# Delete file backups older than 3 days
find /home/backup/files -type f -mtime +3 -exec rm -f {} \;[/code]
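
Before letting this delete anything, it is worth doing a dry run to see exactly which files match; replacing the -exec action with -print leaves everything in place:

[code]# Dry run: only print what would be deleted
find /home/backup/files -type f -mtime +3 -print[/code]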

And another one for the database backup directory:

[code]#!/bin/bash
# Delete database dumps older than 3 days (this path should match OUTPUT in backup_db.sh)
find /home/backup/db -type f -mtime +3 -exec rm -f {} \;[/code]
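
These cleanup scripts also need to run regularly, so schedule them in cron as well; the names cleanup_files.sh and cleanup_db.sh below are just example names for the two snippets above:

[code]30 1 * * * root /directory_path/cleanup_files.sh
30 1 * * * root /directory_path/cleanup_db.sh[/code]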

Remember to chmod +x the bash scripts so that cron can execute them.
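
For example, assuming the scripts live in /directory_path/ as in the crontab entries above:

[code]chmod +x /directory_path/backup_file.sh /directory_path/backup_db.sh
chmod +x /directory_path/cleanup_files.sh /directory_path/cleanup_db.sh[/code]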
