How to prevent corruption?
- Posts: 4
- Joined: 15 Mar 2009
I've been searching for good backup software for some time and I think I've
found it here. However, I haven't been able to find an answer to what I take
to be a simple question: How, if at all, can one prevent corrupted files on
the source disk from being sent to the backup disk, overwriting good copies?
Here's the scenario I am imagining:
I back up disk A to disk B. File 1 on A becomes corrupt for some reason. When I
run the backup software as part of my regular routine, it checks for file
differences and finds that file 1 on A is different from file 1 on B (because
the one on A is corrupt), so it goes ahead and overwrites file 1 on B with the
corrupt copy.
How, if at all, can one prevent this from happening, or am I imagining
something that is impossible or too unlikely to worry about?
- Site Admin
- Posts: 7212
- Joined: 9 Dec 2007
First, there is nothing that can prevent data failure or corruption. Since
we're not carving information into stone anymore, every medium is liable to
lose its information sooner or later; even CDs and hard disks are only said to
last multiple decades. So the question is not how to avoid corruption, but how
to detect it in order to repair it (e.g. from another, still valid copy of the
data).
Making sure data is unaltered is done via checksums. This is built into file
packer software, so the easiest way to make sure you can detect data
corruption is to put all your data into archives, so that each file is stored
next to an MD5 or CRC32 checksum. The packer software will notify you
immediately if you try to extract data that has been damaged.
There may be other means to achieve the same; personally, I use my own tool
that creates and maintains a database of MD5 sums, which I validate each time
I migrate my data from one hard disk to another.
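A checksum database like that can be scripted in a few lines; the following is
only a rough Python sketch with placeholder paths, not the actual tool
mentioned above:

```
import hashlib, json, os

def md5_of(path, chunk=1 << 20):
    # Hash the file in 1 MiB chunks so large files need not fit in RAM.
    h = hashlib.md5()
    with open(path, "rb") as f:
        for block in iter(lambda: f.read(chunk), b""):
            h.update(block)
    return h.hexdigest()

def build_database(root, db_file="checksums.json"):
    # Record an MD5 sum for every file below 'root'.
    db = {}
    for dirpath, _, files in os.walk(root):
        for name in files:
            path = os.path.join(dirpath, name)
            db[path] = md5_of(path)
    with open(db_file, "w") as f:
        json.dump(db, f, indent=2)

def verify_database(db_file="checksums.json"):
    # Re-hash every recorded file and report anything that no longer matches.
    with open(db_file) as f:
        db = json.load(f)
    for path, old_sum in db.items():
        if md5_of(path) != old_sum:
            print("MISMATCH (possible corruption):", path)

# build_database("D:/data")   # run once while the data is known to be good
# verify_database()           # run before/after migrating to a new disk
```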
(Caveat: the file is "safe" only after its checksum has been generated. Even
during the packing process it is theoretically possible for data corruption to
creep in. So if you're paranoid, you'll have to binary-compare the original
file against the freshly packed copy.)
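If you do want that extra check, a byte-for-byte comparison is a one-liner in
most scripting languages; e.g. in Python (placeholder paths again):

```
import filecmp

# shallow=False forces a full byte-for-byte comparison instead of only
# looking at file size and modification time.
if not filecmp.cmp("original/file1.dat", "unpacked/file1.dat", shallow=False):
    print("the packed copy does not match the original")
```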
> imagining something that is impossible or too unlikely to worry about?
It is "very" unlikey, which is a quite subjective statement. Personally I had
found a file corrupted just twice in my lifetime so far. Finally you'll need
to evaluate how important a particular piece of information is to you
personally and how important 100% integrity is. For example for movie or image
data a currupted bit will not make a big visual impact, but for executable
data, it may entail silent erratic behavior and application crashes.
- Posts: 4
- Joined: 15 Mar 2009
Thanks for the reply, it is quite helpful. Is this corruption scenario
something I can avoid by keeping a keen eye on what FreeFileSync reports? For
example, suppose I've used the program to sync disk 1 to disk 2 some time ago.
I know that File A has not changed on either disk since that time. Now,
suppose File A becomes corrupt on disk 1. Will FreeFileSync report File A as
"changed" when I press "compare", or is the comparison done in a way that
won't reflect this fact?
- Site Admin
- Posts: 7212
- Joined: 9 Dec 2007
> Will FreeFileSync report File A as "changed" when I press "compare"?
No, there is no way to detect that a file is corrupted except by verifying its
complete content, that is, by binary-comparing it against its allegedly
uncorrupted source or against a previously recorded checksum of it.
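To illustrate what such a verification amounts to, here is a Python sketch
with placeholder paths (this is not something FreeFileSync does during a
normal compare):

```
import hashlib

def md5_of(path):
    with open(path, "rb") as f:
        return hashlib.md5(f.read()).hexdigest()

# Comparing the copy on disk 1 against the one on disk 2 only shows that they
# differ, not which one is still good; for that you need a checksum taken
# while the file was known to be intact.
if md5_of("disk1/FileA.dat") != md5_of("disk2/FileA.dat"):
    print("the two copies have diverged")
```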
- Posts: 4
- Joined: 15 Mar 2009
Thanks again! How does FreeFileSync determine whether a file has changed?