With all the “disappearing” Google Drive content, I needed a way to quickly back up my Drive. Here’s how to replicate my steps.
I’ve been feeling uneasy about Google Drive since the reports about users’ data disappearing, so I looked up how to back up a drive with rclone (https://rclone.org/). It’s surprisingly easy! For future reference, I’ve written it down here for myself (and others).
The following quick guide covers installing rclone, pulling a Drive down into a local folder, wrapping that in a shell script you can run on a cron schedule, and (optionally) pushing the zipped backup to Cloudflare R2.

Other thoughts / notes:

- The rclone config lives at `~/.config/rclone/rclone.conf`; I open mine with `subl /Users/janzheng/.config/rclone/rclone.conf`
- Add `--drive-skip-dangling-shortcuts` to skip shortcuts and suppress the error they otherwise throw
- Sometimes `-vv` (very verbose) doesn’t output anything; it just sits there. Quitting + restarting it seems to work.
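For context, once the Google Drive remote from the steps below is set up, that config file will contain a section along these lines. This is just a sketch; `rclone config` generates the real fields and OAuth token:

```
[jan_gdrive]
type = drive
scope = drive
token = {"access_token":"...","token_type":"Bearer","refresh_token":"...","expiry":"..."}
```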
The steps:

- `brew install rclone`
- `rclone config` to set up a Google Drive remote; follow https://rclone.org/drive/ (I named mine `jan_gdrive`)
- `rclone lsd jan_gdrive:` lists top-level directories
- `rclone ls jan_gdrive:` lists all the files in the drive
- `rclone copy jan_gdrive:/MyFolder .` copies MyFolder from Google Drive to the local folder
- A fancier copy of a folder to a local folder (source: https://www.youtube.com/watch?v=GvPI9ls0ahw):

```bash
rclone copy --update --verbose --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s --stats-file-name-length 0 "jan_gdrive:FOLDER" .
```

- `which rclone` to check where it’s installed (`/opt/homebrew/bin/rclone` is where mine is)
- `cd` into `/Users/yourusername/Documents` or `/Downloads` or wherever you want to download your files
- `mkdir jan_gdrive`: I find this more convenient when you have multiple Drives to back up; it makes managing folders way neater
- `zip -r jan_gdrive.zip ./jan_gdrive/` zips up the backup folder
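Before kicking off a big copy, it can help to preview what rclone would actually transfer. Both of these are built-in rclone commands, shown here against the `jan_gdrive` remote from above:

```bash
# how much data am I about to download?
rclone size jan_gdrive:

# preview the copy without transferring anything
rclone copy --dry-run -v "jan_gdrive:FOLDER" .
```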
To script all this, `subl jan_gdrive.sh` creates an empty file in the Sublime editor. Paste in the following:
```bash
#!/bin/bash
/opt/homebrew/bin/rclone copy --update --verbose --drive-skip-dangling-shortcuts --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s --stats-file-name-length 0 "jan_gdrive:FOLDER" ./jan_gdrive/FOLDER

# optionally zip up the folder for upload
zip -r ./jan_gdrive.zip ./jan_gdrive/

# optionally remove the original folder
# rm -r ./jan_gdrive/FOLDER
```
- `chmod +x jan_gdrive.sh` gives permission to run the file
- `./jan_gdrive.sh` runs the backup
- `pwd` to get the folder path; for me the script lives at `/Users/janzheng/Desktop/Projects/rclonedrive/jan_gdrive.sh` (yes, I put everything under Desktop)
- `crontab -e`, then add the following line to run the script every Sunday at midnight:

```
0 0 * * 0 /Users/janzheng/Desktop/Projects/rclonedrive/jan_gdrive.sh
```
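Cron runs with no terminal attached, so two small things are worth doing: `crontab -l` confirms the job was actually saved, and redirecting output gives you a log to inspect afterwards. The log path here is my own suggestion, not part of the setup above:

```bash
# confirm the job was registered
crontab -l

# optional variant of the cron line that keeps a log you can inspect later:
# 0 0 * * 0 /Users/janzheng/Desktop/Projects/rclonedrive/jan_gdrive.sh >> /Users/janzheng/Desktop/Projects/rclonedrive/backup.log 2>&1
```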
If you want to put these backups online, I find Cloudflare R2 to be a cheap and easy alternative to AWS.
- Create an R2 bucket named `rclone-backups`
- Create an R2 API token with read/write access scoped to `rclone-backups`, and set a 1-year TTL (it’s good practice in case your keys leak)
- Your bucket URL will look something like `https://***********.r2.cloudflarestorage.com/rclone-backups`
- Point rclone to your Cloudflare R2 URL by adding a new remote to the rclone config file: `subl /Users/janzheng/.config/rclone/rclone.conf`. Note that you have to set `no_check_bucket = true` because you set specific read/write permissions for your bucket earlier:
```
[r2backups]
type = s3
provider = Cloudflare
access_key_id = abc123
secret_access_key = xyz456
endpoint = https://accountid.r2.cloudflarestorage.com
acl = private
no_check_bucket = true
```
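Before copying anything over, it’s worth sanity-checking the new remote. If the keys and endpoint are right, listing the bucket should succeed (and simply return nothing while the bucket is empty):

```bash
# list directories in the bucket to confirm the remote is wired up
rclone lsd r2backups:rclone-backups
```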
With the remote in place, `rclone copy ./jan_gdrive.zip r2backups:rclone-backups/jan_gdrive` will copy the file into R2 storage as `rclone-backups/jan_gdrive/jan_gdrive.zip`.
- `rclone tree r2backups:rclone-backups` shows what’s in the bucket
- `rclone copy r2backups:rclone-backups/jan_gdrive ./mynewfolder` downloads the backup back to a local folder
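For a full restore, the download pairs naturally with an unzip. A quick sketch using the paths from above (`./restore` is just an example folder name):

```bash
# pull the zipped backup down from R2
rclone copy r2backups:rclone-backups/jan_gdrive ./restore

# unpack it in place
unzip ./restore/jan_gdrive.zip -d ./restore
```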
Finally, here’s the `jan_gdrive.sh` file with a few additions (this copy happens to be configured for another drive, `phaus_gdrive`):
```bash
#!/bin/bash

# Google Drive folder to back up (leave empty to back up the whole drive)
FOLDER=""

# Name for local copy (also the name of the rclone remote)
NAME="phaus_gdrive"

# R2 bucket to upload to
# BUCKET="rclone-backups"
BUCKET="phageaus"

# run this by typing ./phaus_gdrive.sh

echo "---->> [Download] Starting!"
/opt/homebrew/bin/rclone copy --update --verbose --drive-skip-dangling-shortcuts --transfers 30 --checkers 8 --contimeout 60s --timeout 300s --retries 3 --low-level-retries 10 --stats 1s --stats-file-name-length 0 "$NAME:$FOLDER" "./$NAME${FOLDER:+/$FOLDER}"
echo "<<---- [Download] Finished!"

# optionally zip up the folder for upload
echo "---->> [Compressing] Starting!"
zip -r "./$NAME.zip" "./$NAME/"
echo "<<---- [Compressing] Finished!"

# optionally remove the original folder
# rm -r "./$NAME/$FOLDER"

echo "---->> [R2 Upload] Starting!"
rclone copy "./$NAME.zip" "r2backups:$BUCKET/$NAME" --progress --verbose --stats 10s
rclone tree "r2backups:$BUCKET"
echo "<<---- [R2 Upload] Finished!"
```
I haven’t run this with cron yet, and I’ve never actually used cron, so let’s see what happens!
A couple of notes on the changes:

- `${FOLDER:+/$FOLDER}` was added to the destination path for more flexibility when specifying the folder
- Change `--verbose` to `-vv` for “very verbose” output, so you know what’s being added and what’s going on under the hood
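If bash’s `${VAR:+...}` expansion is unfamiliar, here’s what the destination path resolves to in both cases (`Photos` is just a made-up folder name):

```bash
NAME="phaus_gdrive"

FOLDER=""
echo "./$NAME${FOLDER:+/$FOLDER}"   # -> ./phaus_gdrive (back up the whole drive)

FOLDER="Photos"
echo "./$NAME${FOLDER:+/$FOLDER}"   # -> ./phaus_gdrive/Photos (just one folder)
```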