Dear community,
sometimes I have to copy several terabytes of data (many big files, but also small ones),
which are copied in alphabetical order.
While a comparison by time and size only takes a few seconds, a
100% comparison by file content takes a long time, sometimes multiple days,
depending on network or drive speeds.
At the moment, FreeFileSync compares two (or more) folders in alphabetical order, from A to Z / from the first folder to the last. I assume that if a drive, RAM, or something else causes problems during a backup task, the problem will persist until the end of the copy task. So I have to wait until 100% of the data is compared to be sure that there was no fundamental problem during, or at the end of, the copy task.
For my needs, it would be enough to compare not all of the data but a
randomized sample, to verify that there was no fundamental problem
during the copy process. For example, if a fault had corrupted 1% of the files,
content-checking a random sample of 500 files would catch it with more than 99% probability.
So, I would like to see a randomized comparison which initially scans the source
and destination folders, then compares them by size (and optionally also by
file date) to determine which files are worth comparing by
content, and finally compares those files by content in a
randomized order.
By default, this randomized compare should include 100% of the files and folders in the source and destination, but it should also be possible to set a value to compare only n% of the matching files.
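The workflow described above (scan both trees, pre-filter by size, then content-compare a random n% sample) can be sketched roughly like this. This is only an illustration of the idea, not FreeFileSync code; all function names, the `percent` parameter, and the block-by-block comparison strategy are my own assumptions:

```python
import os
import random

def matching_files(src_root, dst_root):
    """Yield relative paths that exist in both trees with equal size."""
    for dirpath, _dirs, files in os.walk(src_root):
        for name in files:
            src = os.path.join(dirpath, name)
            rel = os.path.relpath(src, src_root)
            dst = os.path.join(dst_root, rel)
            if os.path.isfile(dst) and os.path.getsize(src) == os.path.getsize(dst):
                yield rel

def same_content(path_a, path_b, chunk=1 << 20):
    """Compare two files block by block (1 MiB chunks)."""
    with open(path_a, 'rb') as fa, open(path_b, 'rb') as fb:
        while True:
            block_a, block_b = fa.read(chunk), fb.read(chunk)
            if block_a != block_b:
                return False
            if not block_a:          # both files exhausted, no mismatch found
                return True

def randomized_compare(src_root, dst_root, percent=100, seed=None):
    """Content-compare a random `percent` of the size-matching files.

    Returns (checked, mismatches): the sampled relative paths and the
    subset whose content differed between source and destination.
    """
    candidates = list(matching_files(src_root, dst_root))
    rng = random.Random(seed)
    rng.shuffle(candidates)                       # randomized order
    take = len(candidates) * percent // 100
    sample = candidates[:take]
    mismatches = [rel for rel in sample
                  if not same_content(os.path.join(src_root, rel),
                                      os.path.join(dst_root, rel))]
    return sample, mismatches
```

Because the candidate list is shuffled before sampling, running this with `percent=10` on different days checks different files, which is exactly what makes a partial check meaningful for spotting systematic copy problems.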
Randomized compare should also be possible with more than one source/destination pair.
Also, it should be possible to stop the randomized comparison at any
time to see the results so far (and to resume it if needed).
Feature request: Randomized filecontent compare
There are existing utilities designed for this. They are typically marketed with ransomware protection as their ideal use case.