I needed to securely back up a significant amount of data. My plan was to script the job on Linux overnight, when nobody in the office would be affected, and my first thought was SCP. That lasted until I realized that SCP is not the right tool. Rsync is by far the best tool for this project, because incremental transfers are exactly what it is designed for: rsync sends only the changed blocks of a file, not the entire file. When you are pushing an 800MB file, that changes backup time significantly. Check it out:
Using SCP (secure copy)
scp -Cpr f-prot/ administrator@<server>:/home/administrator/f-prot
The options I used: -C = compression, -p = preserve times, and -r = recursive. This results in every file and folder being copied over the connection each and every time the script runs:
Using Rsync
The command I issued is as follows:
rsync --stats --progress -czave ssh /home/administrator/f-prot administrator@<server>:/home/administrator/
The options I used: -c = checksum, -z = compress, -a = archive, -v = verbose, and -e = specify the remote shell to use (ssh here). The first time I run this it takes around the same amount of time as the SCP command, because it has to send all of the original data:
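If you want to watch those stats without touching a remote server, the same flags work between two local directories (the `src`/`dst` paths below are just examples, and `-e ssh` is dropped since no remote shell is involved). This is a minimal sketch of the behavior described above:

```shell
# Create a tiny source tree to sync (example paths).
mkdir -p src dst
echo "hello" > src/a.txt

# First run: everything gets transferred, just like the first SCP-equivalent run.
rsync --stats --progress -cza src/ dst/

# Second run: the stats show almost nothing transferred, since a.txt is unchanged.
rsync --stats --progress -cza src/ dst/
```

The "Total transferred file size" line in the stats output is the number to watch between the two runs.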
The beauty is in the next run, when only a couple of files have changed. For instance, let's double a file in size:
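One simple way to double a file in place (the filename below is just an example, not the one from my test) is to append a copy of the file to itself. Going through a temporary copy matters, because `cat file >> file` reads and writes the same file and can grow it forever:

```shell
# 'payload.bin' is a hypothetical filename for this sketch.
cp payload.bin payload.bin.tmp      # snapshot the current contents
cat payload.bin.tmp >> payload.bin  # append the snapshot, doubling the size
rm payload.bin.tmp                  # clean up the snapshot
```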
So now we have a 9MB file, which is larger and has a newer modification date on the source server, and we run the same rsync command:
And if nothing has changed, it doesn’t send anything, saving you time and bandwidth:
So in conclusion, if you are deciding between SCP and rsync, go with rsync.
Note: I did this with small files, but only because it was a quick test and I didn't want to hog bandwidth at work or waste my time pushing 2GB files.
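Since the whole point was to run this overnight, the last step is scheduling it. A crontab entry along these lines would do it (this is a sketch: the 2:30 AM time and the log path are assumptions, and `<server>` is the same placeholder as above):

```shell
# Hypothetical crontab entry: run the backup at 2:30 AM every day,
# appending both stdout and stderr to a log file for review.
# 30 2 * * * rsync --stats -cza -e ssh /home/administrator/f-prot administrator@<server>:/home/administrator/ >> /var/log/fprot-backup.log 2>&1
```

Note that `--progress` is dropped here; interactive progress output is noise in an unattended log.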