I am using the donation edition.
I synchronize (with Mirror mode) a local folder with Google Drive. During the synchronization, FreeFileSync creates duplicate files (two or more files with the same name) on Google Drive. This causes errors during the next synchronization, like this:
Cannot find "(file name)". The name "(file name)" is used by more than one item in the folder.
The only solution is to delete the duplicate files (the files with the same names). But this is not a workable solution, because my folder has over 18,000 subfolders and over 500,000 files!
I use 5 parallel file operations for Google Drive and 100 for the local drive. (I don't know if this causes the problem, but fewer than this makes the synchronization very slow.)
The above problem makes FreeFileSync unusable for backup! Isn't there any solution to this? Why can't FreeFileSync check whether the files already exist on Google Drive?
I came here to report the same bug. As best I can tell, what is happening is that I have multiple computers trying to update the file at the same time, and Google Drive is making duplicate copies. Google Drive stores files in an unusual way, so having multiple files with the same name is technically allowed: it is just storing a record with a name associated with it.
This happens to me after a couple of days, and I periodically need to clean up any duplicate files on Google Drive for the process to continue. The simple solution would be for the sync tool to just delete the oldest; generally the files are identical.
I was hoping this would be fixed by now, but it has persisted for well over 15 versions.
The best you can do is to somehow make sure that only a single sync is running against a particular Google Drive folder at a time.
PS: If it were a local sync, FFS could work around this by placing sync.ffs_lock files in the base directories to serialize access by multiple FFS instances. But this requires transactional file creation and is therefore not possible for Google Drive.
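The transactional creation mentioned here can be illustrated with a small sketch (a toy example, not FFS's actual implementation; the lock path is made up). Opening a file with O_CREAT | O_EXCL makes the existence check and the creation a single atomic operation, which is exactly the primitive the Google Drive API lacks:

```python
import os
import tempfile

def try_acquire_lock(lock_path: str) -> bool:
    """Atomically create a lock file; fail if it already exists.

    O_CREAT | O_EXCL turns "check whether it exists" and "create it"
    into one filesystem operation, so two processes can never both
    succeed. Google Drive's API offers no equivalent primitive.
    """
    try:
        fd = os.open(lock_path, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
        os.close(fd)
        return True
    except FileExistsError:
        return False

lock_dir = tempfile.mkdtemp()
lock = os.path.join(lock_dir, "sync.ffs_lock")

first = try_acquire_lock(lock)   # acquires the lock
second = try_acquire_lock(lock)  # refused: the file already exists

os.remove(lock)
os.rmdir(lock_dir)
```

Only the first caller wins; every other instance sees the lock and can wait or abort, which is what serializes concurrent local syncs.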
The best you can do is to somehow make sure that only a single sync is running against a particular Google Drive folder at a time.
Zenju, 02 Aug 2022, 17:51
The behavior of allowing multiple files on Google Drive isn't the issue; Google gonna Google. I'm not asking you to fix Google. The problem is that it becomes a blocking issue for FreeFileSync, as it's unable to perform any sync functions when there are two files with the same name on Google Drive. I was hoping there was something that we could add to the program to do any (or several) of the following:
1) set a rule to clean up duplicate names
2) or always overwrite if there are duplicate names
3) or set a rule that automatically deletes the older of the two files if a duplicate is found.
4) Block FFS from changing files on the target when another instance of FFS is currently running a sync operation on the target, and try again after a user-defined amount of time.
None of these suggestions work, because Google Drive does not offer transactional file creation. There's just a gap between checking whether a file exists and creating a new one. It's a race condition by Google's design.
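The check-then-create gap can be reproduced with a toy simulation (plain Python, no real Drive API calls; a list stands in for a Drive folder, since Drive identifies files by ID and therefore tolerates several entries with the same name):

```python
import threading

# Toy model of a Drive folder: a list of name records. Appending a
# name that is already present models Drive accepting a duplicate.
folder = []
append_lock = threading.Lock()
barrier = threading.Barrier(2)

def upload_if_missing(name: str) -> None:
    # Step 1: check whether the file exists (not atomic with step 2).
    exists = any(entry == name for entry in folder)
    barrier.wait()  # hold both clients in the gap between check and create
    # Step 2: create. By now the other client has also passed its check.
    if not exists:
        with append_lock:
            folder.append(name)

threads = [threading.Thread(target=upload_if_missing, args=("config.ini",))
           for _ in range(2)]
for t in threads:
    t.start()
for t in threads:
    t.join()

print(folder)  # ['config.ini', 'config.ini'] -- a duplicate was created
```

The barrier forces both "clients" through the existence check before either creates, which is the interleaving that multiple computers syncing the same folder can hit at any time.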
None of these suggestions work, because Google Drive does not offer transactional file creation. There's just a gap between checking whether a file exists and creating a new one. It's a race condition by Google's design.
Zenju, 03 Aug 2022, 07:06
You are focusing on preventing it from happening.
While some of my suggestions were about preventing it, I understand how that may not be possible.
I am mostly asking for a way to handle/clean up the condition after it happens.
Currently FFS just gets locked in an error condition and requires manual intervention.
I am not sure how the Google Drive API works at this time, without having looked into it personally.
But please explain to me why FFS is able to detect that more than one file exists with the same name but is unable to delete one or both of those files when that happens. I would be 100% OK with setting a sync rule that says: if duplicate files exist that are older, just nuke both and upload the new file.
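The "keep the newest, trash the rest" rule proposed here is easy to express on listing metadata. A minimal sketch operating on plain (id, name, modifiedTime) records; the field names mirror Drive v3 file metadata, but the function is purely illustrative and makes no API calls:

```python
from collections import defaultdict

def pick_duplicates_to_trash(files):
    """Given (file_id, name, modified_time) records from one folder,
    keep the newest entry per name and return the IDs of the rest.

    ISO 8601 timestamps (as Drive reports them) sort chronologically
    as strings, so a plain sort puts the oldest entries first.
    """
    by_name = defaultdict(list)
    for file_id, name, mtime in files:
        by_name[name].append((mtime, file_id))
    to_trash = []
    for entries in by_name.values():
        entries.sort()  # oldest first; the last element is the keeper
        to_trash.extend(fid for _, fid in entries[:-1])
    return to_trash

listing = [
    ("id1", "save.cfg", "2022-08-01T10:00:00Z"),
    ("id2", "save.cfg", "2022-08-03T09:30:00Z"),  # newest, kept
    ("id3", "notes.txt", "2022-08-02T12:00:00Z"),
]
print(pick_duplicates_to_trash(listing))  # ['id1']
```

As the thread's later replies note, blindly keeping one of several same-named files is only safe when the user has opted in, which is why this would have to be an explicit rule rather than default behavior.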
I would be 100% OK with setting a sync rule that says: if duplicate files exist that are older, just nuke both and upload the new file.
BWal, 03 Aug 2022, 15:54
This might be good enough for you, but it isn't good enough as a general rule. Just picking one of a set of equally-named files is like gambling with files.
Zenju, 03 Aug 2022, 16:50
Which is why it could be an optional rule, similar to how I have my custom sync set to always overwrite Google Drive in the event of a difference.
For my use case, I'm trying to sync game config files between multiple computers so that I can switch to any computer and my settings will move over; this game lacks any form of cloud saves. There is never a condition where the same file name will be anything else, and I always want the newest file, since that means I played on that computer most recently. In the absolute worst-case scenario, an older file overwrites a newer one and I go back to the previous day's configuration, which is a minor inconvenience at best. What is a major inconvenience is having the syncs suddenly stop working for a long period and fail silently (because I run them unattended), letting the files get massively out of sync, and then having to manually log into Google Drive to find all the duplicated files and clean them up so that the syncs will work again.
Could we not have a condition where we choose what to do when a duplicate file exists?
If users are concerned about losing files/data, they can choose to skip files with duplicate names.
BWal, 02 Aug 2022, 15:42
It isn't FreeFileSync, it is Google Drive, and Google Drive is a revision control system. It saves older copies of files: if you screw up and overwrite a file, you still have a copy of the previous revision. Google doesn't store files the way your hard drive does. Every file you upload to Google Drive is stored in a database and isn't written to any kind of file system.
Screenshot 2024-11-09 142106.png
Screenshot 2024-11-09 142344.png
Google will automatically remove older revisions of the file itself.
phpjunkie, 09 Nov 2024, 23:16
That is not what was happening. It was two files with the exact same name in the same "folder", and it would break the sync whenever this condition occurred. It happened because more than one PC would try to upload a file at the exact same time; it was basically a race condition. I tried several things to prevent this, but it would likely have required stricter scheduling, and it was six computers all trying to sync at staggered intervals.
The solutions that worked were either using an FTP site or just using Windows file-share drive mapping.
I now only have the Google Drive sync run on one PC, for when I need access outside my own network.