Feature request: Make <WarnSignificantDifference> trigger on updated files as well

Discuss new features and functions
Posts: 4
Joined: 16 Apr 2016

hereiam

Hello,

First of all, I want to thank Zenju for coding such a great tool. I find it really useful.

However, I'm getting really concerned about crypto-malware, and I wanted to know what parameters FFS uses to warn the user when <WarnSignificantDifference> is enabled, so I don't overwrite a good backup with an encrypted one.

To find these limits I've been testing with some bogus directories. I created two folders (source and destination), placed 500 files in the source, and created a batch job which mirrors the source to the destination folder. Then I tried increasingly massive deletions (first deleting 10 files, then 50, etc.), updates (changing the timestamps of 10 files, then 50, etc.), and additions (creating 10 new files, then 50, etc.), and watched for when FFS shows a warning about significant differences being detected.
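For anyone who wants to reproduce this, here is a minimal Python sketch of my test setup (the folder name and file counts are just the ones I used; the actual sync is still done by the FFS batch job):

```python
import os
import time

def make_test_tree(src="source", count=500):
    """Create a source folder with dummy files to probe the warning thresholds."""
    os.makedirs(src, exist_ok=True)
    for i in range(count):
        with open(os.path.join(src, f"file_{i:03}.txt"), "w") as f:
            f.write(f"dummy content {i}\n")

def touch_files(src="source", n=50):
    """Simulate 'updated' files by bumping the timestamps of the first n files."""
    now = time.time()
    for name in sorted(os.listdir(src))[:n]:
        os.utime(os.path.join(src, name), (now, now))

def delete_files(src="source", n=50):
    """Simulate deletions by removing the first n files."""
    for name in sorted(os.listdir(src))[:n]:
        os.remove(os.path.join(src, name))
```

After each run of one of these helpers, I ran the mirror batch job again and noted whether the significant-differences warning appeared.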

Based on the results of my tests, it seems that if FFS detects that over 50% of the files were deleted or added (with a minimum of 20 affected items), it warns the user about significant differences.
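In other words, the behavior I observed is consistent with a rule like the following sketch. This is only my guess from black-box testing, not FreeFileSync's actual code; the 50% and 20-item values are the ones I measured:

```python
def significant_difference(deleted, added, total_before):
    """Guess at the warning rule inferred from testing: warn when more
    than 50% of the files were deleted or added, with a minimum of
    20 affected items. Not FreeFileSync's actual implementation."""
    if total_before <= 0:
        return False
    for changed in (deleted, added):
        if changed >= 20 and changed / total_before > 0.5:
            return True
    return False
```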

However, I haven't yet managed to trigger any warning about updated files, no matter how many files I updated in the source directory. Please take into account that when I talk about updated files I mean files with a different size and/or timestamp, since comparing the contents is too slow in most scenarios.
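To be precise, by "updated" I mean what a metadata-only comparison can see, roughly like this sketch (a general illustration of size/timestamp comparison, not FFS's code):

```python
import os

def looks_updated(src_path, dst_path):
    """A file counts as 'updated' when its size or modification time
    differs between source and destination; the contents are never read.
    Sketch of a metadata-only comparison, not FreeFileSync's code."""
    src, dst = os.stat(src_path), os.stat(dst_path)
    return src.st_size != dst.st_size or int(src.st_mtime) != int(dst.st_mtime)
```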

Currently, ransomware seems to encrypt your files and rename them (usually changing their extension), and this behavior would trigger the warning about significant differences, since a lot of new (encrypted) and deleted (the originals) files would appear.

But I wonder: what would happen if the next generation of ransomware encrypted the files and replaced the originals without changing their names? In that case, FFS wouldn't find any deleted or new files, as the encrypted ones would have the same names as the unencrypted ones. According to my tests, these encrypted files would be treated as updated files and wouldn't trigger any warning at all!

Would it be possible to warn the user about a massive number of updated files as well? If you think this wouldn't be useful for everyone, maybe a new specific parameter (something like <WarnSignificantUpdates>) could be added to the config file?
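For illustration, the new flag could sit next to the existing one in the config file. The <WarnSignificantUpdates> element and its placement here are purely my hypothetical suggestion, not an existing setting:

```xml
<!-- Hypothetical fragment: <WarnSignificantUpdates> does not exist yet;
     it is shown next to the existing <WarnSignificantDifference> flag
     only to illustrate the suggestion. -->
<WarnSignificantDifference>true</WarnSignificantDifference>
<WarnSignificantUpdates>true</WarnSignificantUpdates>
```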

Thank you very much.

Best regards.
Posts: 23
Joined: 18 Mar 2016

zlhnxqgf

What about the preview? Don't you check the list for changes anyway? I don't just blindly hit "Synchronize".
Posts: 4
Joined: 16 Apr 2016

hereiam

Hello zlhnxqgf. While I understand and respect your point of view, in my personal case the only way to get full backups (which are quite massive) done on a regular basis is to make them so easy that I cannot find any excuse not to make them. To achieve this I try to automate as much as possible, so batch sync without human intervention is by far the best scenario. In fact, if I had to supervise every change I would end up not backing up at all, which is worse than anything else.

With all that said, I use mirror batch syncing for full backups, so I really appreciate it when they finish without pop-ups if everything went as expected. That's why a warning about massive updates would be really useful (just like the warnings about massive deletions and additions): to notice that something is not going as usual and take the proper action.

Anyway, this is only my personal opinion, which is why I've suggested including an option to turn the warning about massive updates on or off, so everyone can use it or not, as they prefer.

Best regards.
Posts: 23
Joined: 18 Mar 2016

zlhnxqgf

I see the benefit of a warning but I don't see a way to make it work reliably.
How do you set a practical threshold for the warning: 25%, 50% or 75% of files changed?
What if the ransomware has just started and has only encrypted 15% of your files?
There is no way to detect unwanted changes in an automatic backup setup.
A better solution is to use file versioning:
https://freefilesync.org/manual.php?topic=versioning
And also running a second backup (another target) manually once a week/month.
But again malware has the same access as FFS so it could just encrypt all backups over time.
Site Admin
Posts: 7054
Joined: 9 Dec 2007

Zenju

I agree with zlhnxqgf: the best solution is to have multiple backups and versioning. FreeFileSync is not in the best position to detect ransomware, since it only compares time and file size. Both of these metadata could be left unchanged by a more elaborate ransomware, and the situation could not be detected except by doing a binary comparison.
Posts: 4
Joined: 16 Apr 2016

hereiam

Hi again. Thanks for your tip, but I'm afraid that versioning is not an option when you work with lots of big files. What I described was just a brief summary of what I do (so as not to bore anybody with the details): I didn't mention that I use a backup rotation scheme, I encrypt the backup media, I keep those media offline all the time (except when I use them in their respective turns), etc.

I think you're right that if you back up at a point where the ransomware has only encrypted a percentage of files below the threshold which triggers the warning, you'll end up with a partially encrypted backup (and that's one reason to use a rotation scheme). In fact, you could apply your very same argument to the warnings about deleted and added files as well. What if the ransomware has only encrypted (encrypted and renamed, as current generations do) 15% or 20% of your files? FFS probably wouldn't trigger the warning, because that's below its threshold. However, it will trigger it at some point (probably at the next backup, or maybe the one after) once over 50% of the files have been encrypted, so telling the user about massive differences really is useful. I'm just asking you to consider warning about updates as well, just as deletions and additions are currently treated.

Anyway, this is only my personal choice, and everybody can use whatever backup policy they prefer. That's why I think that adding an option to the config file to warn the user about massive updates would give everyone the choice to do things the way they want.

Best regards.
Posts: 4
Joined: 16 Apr 2016

hereiam

Hi Zenju, I'm afraid I answered zlhnxqgf at the same time you were replying. ;¬)

Please let me rebut your point. I think that if the metadata are not changed, then FFS will not copy those encrypted files, because it will think they haven't changed since the last synchronization, so the backups will remain clean.

If the ransomware also encrypts the backup while FFS is examining the differences (which would only be a partial encryption, since the backup media, by definition, should be offline when not in use), you could fight that again with a proper rotation scheme.

Maybe FFS is not the best option to detect ransomware, as you say, but please don't underestimate your tool! In any security model, the more layers of protection you add, the harder it is to break the system.

I think that warning about massive updates would be beneficial, but as my opinion may not be shared by everyone, adding it as a separate option would make FFS even more versatile.

Anyway, thank you for your time and consideration.

Best regards.