backup strategy: reduce size of daily backup dumps

We need a new backup strategy.
In the meantime, dramatically reduce the size by not dumping large tables.
TODO: additionally increase the frequency of dumping the more volatile data?
TODO: implement some kind of incremental backup.
This commit is contained in:
Monocots 2019-06-30 15:48:30 +00:00
parent 36709ef8cf
commit 58f74c3095


@@ -28,6 +28,11 @@ function perform_backups()
 	SUFFIX=$1
 	FINAL_BACKUP_DIR=$BACKUP_DIR"`date +\%Y-\%m-\%d`$SUFFIX/"
+	EX_TABLES=""
+	if [ "$SUFFIX" = "-daily" ]; then
+		EX_TABLES=" -T cdr -T sms -T prefix_mexico -T rates "
+	fi
+
 	echo "Making backup directory in $FINAL_BACKUP_DIR"
 	if ! mkdir -p $FINAL_BACKUP_DIR; then
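The exclusion flags are presumably consumed by a `pg_dump` call later in the script, outside the hunk shown. A minimal sketch of that pattern, assuming the database name, output path, and the exact downstream invocation (only the table names and the `-daily` suffix come from the diff):

```shell
#!/bin/sh
# Build the exclusion flags as in the hunk: only the daily run
# skips the large, fast-growing tables.
SUFFIX="-daily"

EX_TABLES=""
if [ "$SUFFIX" = "-daily" ]; then
	EX_TABLES=" -T cdr -T sms -T prefix_mexico -T rates "
fi

# Assumed downstream use (hypothetical database/path): EX_TABLES is
# left unquoted so each "-T table" pair word-splits into separate
# arguments for pg_dump.
echo pg_dump -Fc $EX_TABLES mydb -f "backup$SUFFIX.dump"
```

Note that `-T` makes `pg_dump` skip both the schema and the data of the named tables, so a restore of the daily dump will not recreate them at all; that is the trade-off the commit message accepts until an incremental scheme exists.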