unidef network
Backups are extremely important. I suggest the cloud, or, if you really have the money, a dedicated storage system where you can save your files.
This bash script creates a timestamped tarball, stores it locally, and leaves you with a file you can transfer over NFS, Samba, or whatever medium you use to network your computers (see the sketches after the script).
I shared this with my web host, and hopefully they implement it in their system, along with my other suggestions!
Here you go:
#!/bin/bash
echo "hi $1"
echo "welcome to $0"
echo "setting variables"
TIME=$(date +%Y-%m-%d_%H-%M-%S)
echo "TIME $TIME"
DIR_TO_BACKUP=~/.nvm/
echo "DIR_TO_BACKUP $DIR_TO_BACKUP"
BACKUP_FILE_PREFIX=~
echo "BACKUP_FILE_PREFIX $BACKUP_FILE_PREFIX"
BACKUP_DIR=$BACKUP_FILE_PREFIX/backup/cloudflare/$TIME
echo "BACKUP_DIR $BACKUP_DIR"
BACKUP_FILENAME=$BACKUP_FILE_PREFIX/$TIME.tar
echo "BACKUP_FILENAME $BACKUP_FILENAME"
BACKUP_FILENAME_GZ=$BACKUP_FILE_PREFIX/$TIME.tar.gz
echo "BACKUP_FILENAME_GZ $BACKUP_FILENAME_GZ"
echo "variables set"
echo "starting backup"
echo "please wait until script denotes that backup is done"
# -p creates the parent directories (backup/cloudflare) if they don't exist yet
mkdir -p "$BACKUP_DIR"
echo "made $BACKUP_DIR"
echo "creating tarball in $BACKUP_DIR for directory $DIR_TO_BACKUP"
echo "dont worry about the error from tar !"
echo "the error regards preserving filename structure!"
echo "saving tarball to $BACKUP_FILENAME"
tar cf "$BACKUP_FILENAME" "$DIR_TO_BACKUP"
echo "tarball done"
echo "copy tarball ($TIME.tar) to $BACKUP_DIR"
cp -r $BACKUP_FILENAME $BACKUP_DIR
echo "copying done"
echo "gzip of $BACKUP_FILENAME started"
gzip "$BACKUP_FILENAME"
echo "gzip done"
echo "copying $BACKUP_FILENAME but gzipped to $BACKUP_DIR"
cp $BACKUP_FILENAME_GZ $BACKUP_DIR
echo "done at $TIME"
echo "be sure to delete/copy/archive/etc $BACKUP_FILENAME and $BACKUP_FILENAME_GZ !!"
Should work, but use it at your own discretion.
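And since "at your own discretion" should include a sanity check, here's a quick way to test an archive before you trust it. Nothing here is specific to my script beyond the filename layout; it's just standard gzip and tar flags:

#!/bin/bash
# quick integrity check on a gzipped backup before relying on it;
# pass the timestamp as $1 (the one the backup script printed)
BACKUP_FILENAME_GZ=~/"$1.tar.gz"

gzip -t "$BACKUP_FILENAME_GZ" && echo "gzip integrity ok"          # test the compression itself
tar tzf "$BACKUP_FILENAME_GZ" > /dev/null && echo "tar listing ok" # walk every entry in the archive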
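As for the network transfer part, here's a minimal sketch of shipping the gzipped tarball off the machine. The mount point /mnt/backup-nfs, the server name backup-server, and the share name backups are placeholders I made up for illustration; it also assumes the NFS export is already mounted and that smbclient is installed for the Samba route:

#!/bin/bash
# minimal sketch: ship the gzipped tarball to a network location.
# /mnt/backup-nfs, backup-server, and "backups" are hypothetical --
# substitute your own mount point, server, and share.
TIME=$1                             # pass the timestamp the backup script printed
BACKUP_FILENAME_GZ=~/"$TIME.tar.gz"

# option 1: NFS -- copy onto an already-mounted export, e.g. one mounted with:
#   sudo mount -t nfs backup-server:/exports/backups /mnt/backup-nfs
cp "$BACKUP_FILENAME_GZ" /mnt/backup-nfs/

# option 2: Samba -- push the file with smbclient (no mount needed;
# -U makes it prompt for that user's share password)
smbclient //backup-server/backups -U "$USER" -c "put $BACKUP_FILENAME_GZ $TIME.tar.gz"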