Because the data on the server's websites needs to be backed up regularly, I wrote a simple one-click script that packs up every website directory. I'm sharing it here for anyone who needs a reference.
#!/bin/bash
#--------------------------------------------
#Pack and compress every website directory under the current directory into a tar.gz archive.
#All packaged tar.gz files are saved to the /backup/ directory,
#and the packing records and results are written to the
#/backup/backup.log log file.
#--------------------------------------------
date=$(date "+%Y%m%d")
date2=$(date "+%Y-%m-%d %H:%M:%S") # human-readable timestamp for the log
# Make sure the backup directory exists
mkdir -p /backup
# Loop over entries in the current directory and back up each subdirectory
for i in *
do
if [ -d "$i" ]
then
/bin/tar -zcvf "/backup/${i}_backup_${date}.tar.gz" "$i"
if [ $? -eq 0 ]
then
echo -e "Folder $i was packed and compressed successfully at ${date2}\n" >> /backup/backup.log
else
echo -e "Folder $i failed to pack and compress at ${date2}\n" >> /backup/backup.log
fi
fi
done
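
To actually run the backup regularly, the usual approach is a cron job. Here is a minimal sketch, assuming the script is saved as /root/backup_sites.sh and the website directories live under /data/www (both paths and the 02:30 schedule are my own examples, not part of the original script):

chmod +x /root/backup_sites.sh

# crontab -e entry: run the backup every day at 02:30.
# cd into the directory that holds the website folders first,
# since the script scans its current working directory.
30 2 * * * cd /data/www && /root/backup_sites.sh

Restoring a site is just the reverse operation, e.g. tar -zxvf /backup/sitename_backup_20240101.tar.gz (the filename here is only an example).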