
I have a directory called my_projects_linux inside the Ubuntu file system, which contains many years of my work. The directory contains files, subdirectories, and so on.

For backup purposes, I occasionally copy this directory and all its contents to an external hard drive.

Hence, the contents of my external drive look like:

/mounted_drive/my_projects_linux
/mounted_drive/my_projects_windows  # the same idea to backup Windows work

Hence, what I am looking for is a command that:

  1. Would work as

    cp /home/my_projects_linux /mounted_drive/my_projects_linux
    

    It should replace old files, subdirectories, files inside subdirectories, etc. on the external disk with the new content from my PC.

  2. Be fast. It should only copy files that have been modified or newly created. Given that my_projects_linux is >50 GB, copying everything takes more than an hour, which is too slow. In reality, often only a few MB have changed since the last backup, so in theory the copy can be made much faster.

How to do that?

From some googling, it seems that cp with the -u flag might match my needs. Would that work (e.g. would it correctly handle subdirectories of subdirectories)?
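
The form I have in mind (if I am reading the flags right) would be something like:

cp -r -u /home/my_projects_linux /mounted_drive/

i.e. -r to recurse into subdirectories and -u to copy only when the source file is newer than the copy on the drive, or when the copy is missing.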

Also, is keeping a copy on an external disk an appropriate way of doing a backup, or is there a fancier way, for example using the cloud? Note that the fancier way should be simple, as otherwise it will not outweigh the ease of executing one shell command.

3 Answers


You're sort of describing what rsync was designed for. From man rsync:

    Rsync finds files that need to be transferred using a "quick check" algorithm (by default) that looks for files that have changed in size or in last-modified time. Any changes in the other preserved attributes (as requested by options) are made on the destination file directly when the quick check indicates that the file’s data does not need to be updated.

There's also a lot of guff about networking, but it's happy doing local too.
Essentially all you need is:

rsync -rtv /home/my_projects_linux /mounted_drive/my_projects_linux

There are a ton of options available, but -rtv will sync recursively, keeping the timestamps the same, while being verbose about what it's doing.
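
One detail from the same man page: a trailing slash on the source means "copy the contents of this directory" rather than the directory itself. So to update the existing /mounted_drive/my_projects_linux in place, something like this should work:

# Trailing slashes: sync the contents of the source into the existing
# backup directory. --delete (optional) removes files from the backup
# that no longer exist locally; leave it out if you prefer to keep them.
rsync -rtv --delete /home/my_projects_linux/ /mounted_drive/my_projects_linux/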

Oli

To answer your second question, a fancier way to back up is rdiff-backup.

rdiff-backup backs up one directory to another, possibly over a network. The target directory ends up a copy of the source directory, but extra reverse diffs are stored in a special subdirectory of that target directory, so you can still recover files lost some time ago. The idea is to combine the best features of a mirror and an incremental backup. rdiff-backup also preserves subdirectories, hard links, dev files, permissions, uid/gid ownership, modification times, extended attributes, acls, and resource forks.
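
A minimal sketch of the commands, assuming rdiff-backup is installed and reusing the paths from the question (the 10-day restore target is just an illustration):

# Back up; unchanged files are skipped, and reverse diffs are kept
# in the rdiff-backup-data subdirectory of the target.
rdiff-backup /home/my_projects_linux /mounted_drive/my_projects_linux

# Restore the tree as it was 10 days ago into a separate directory.
rdiff-backup -r 10D /mounted_drive/my_projects_linux /tmp/my_projects_10_days_ago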

And if you really want to go cloud-based: I use CrashPlan with a paid subscription. Then you never have to think about manual backups any more.

Amedee Van Gasse

I know this question already has an accepted answer. However, I wanted to add a cross-platform GUI solution, as I see you also use Windows. For the same purpose, I use FreeFileSync. It is also FLOSS.

The configuration is absolutely intuitive and you can save different synchronization jobs.

There is also a PPA, though it does not include the latest version for 14.04 at the moment. To install via the PPA:

sudo apt-add-repository ppa:freefilesync/ffs
sudo apt-get update
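
Then install the package itself (the package name freefilesync is, as far as I know, what the PPA ships):

sudo apt-get install freefilesync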
Bruni