backup strategy: reduce size of daily backup dumps
We need a new backup strategy. In the meantime, dramatically reduce the dump size by not dumping the largest tables. TODO: additionally increase the frequency of dumping the more volatile data? TODO: implement some kind of incremental backup.
parent 36709ef8cf
commit 58f74c3095
1 changed file with 5 additions and 0 deletions
@@ -28,6 +28,11 @@ function perform_backups()
 	SUFFIX=$1
 	FINAL_BACKUP_DIR=$BACKUP_DIR"`date +\%Y-\%m-\%d`$SUFFIX/"
 
+	EX_TABLES=""
+	if [ $SUFFIX = "-daily" ] ; then
+		EX_TABLES=" -T cdr -T sms -T prefix_mexico -T rates "
+	fi
+
 	echo "Making backup directory in $FINAL_BACKUP_DIR"
 
 	if ! mkdir -p $FINAL_BACKUP_DIR; then
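The added hunk builds a table-exclusion list for the daily run; `-T` is pg_dump's exclude-table flag. Below is a minimal self-contained sketch of how that list would plug into the dump command. The pg_dump invocation and the database name `mydb` are assumptions for illustration; the actual dump line lives outside this hunk.

```shell
#!/bin/sh
# Sketch of the daily-backup table exclusion (assumed context, not the real script).

SUFFIX="-daily"

# Same logic as the committed hunk: skip the largest tables on daily runs only.
EX_TABLES=""
if [ "$SUFFIX" = "-daily" ] ; then
	EX_TABLES=" -T cdr -T sms -T prefix_mexico -T rates "
fi

# Each "-T name" tells pg_dump to leave that table out of the dump, e.g.
# (hypothetical -- the real script's dump line is not in this diff):
#   pg_dump $EX_TABLES mydb | gzip > "$FINAL_BACKUP_DIR/mydb.sql.gz"
echo "exclusions:$EX_TABLES"
```

On non-daily runs `EX_TABLES` stays empty, so the dump command sees no exclusion flags and backs up everything.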