Versioning Issue - Wrong Folder Date

goni05 (Posts: 5, Joined: 3 Sep 2024)

I am just starting out with FFS, and so far it has closed a gap I had with syncing Google Drive to a local NAS. I eventually want to look at getting it set up in a Docker container, but until then, I wanted to ask about an issue I am seeing with the Versioning feature.

I have a batch job set up on a Windows machine to do a two-way sync between Google Drive and a network share (NAS). I have been reviewing the logs regularly since I am still new to the tool and want to make sure I am not messing anything up. For Delete Files, I have set up Versioning with folder timestamps, and I have also set it to retain a maximum of 5 copies. This worked fine for the first couple of days, but now that time has passed and files have changed, I am starting to see the logs fill with errors.

Specifically, I am getting this error:
8:02:22 AM    Error:    Cannot delete file "\\jarvis\gdrive_backup\_VERSIONS_\2024-09-03 080002\BACKUPS\UNRAID\APPDATA\ab_20240721_050002\VPN.tar.gz".
ERROR_FILE_NOT_FOUND: The system cannot find the file specified. [DeleteFile]
This same error is repeated a few times, which I thought was interesting. I started looking, and something stuck out as odd. This did not seem to be an issue with creating the copies of a file when a change or deletion is detected, but rather with the attempt to clean up old copies according to the max days and the minimum and maximum number of versions to keep. The file it was attempting to delete did not exist. I then realized the path was using the date the batch job runs as the folder date instead of the date at which the version was created (or the date of the log message).

Might just be me, but I think this is a bug.

xCSxXenon (Posts: 3953, Joined: 11 Jun 2019)

What is your versioning location set to in the settings?

edwardg (Posts: 1, Joined: 4 Sep 2024)

I am having the same issue. I made a folder called "D:\versions" on the external SSD I am using, and I am getting the exact same error on 45 files coming off the Google cloud.

goni05 (Posts: 5, Joined: 3 Sep 2024)

xCSxXenon wrote (03 Sep 2024, 14:27): What is your versioning location set to in the settings?
Here are my settings:
[Attachment: 20240905_121747.png]

xCSxXenon (Posts: 3953, Joined: 11 Jun 2019)

@Zenju
The only correlation here seems to be Google Drive access. I can't imagine how that would cause FFS to behave any differently than it does with local drives only, but then I would also expect this to be an issue for every single user using versioning, which doesn't seem to be the case. I would wait for the dev to chime in; I don't think this is something an end user can remedy.

goni05 (Posts: 5, Joined: 3 Sep 2024)

Do you know if it's possible to read the sync.ffs_db file with any sort of software? I was thinking I could poke around inside to see if I can confirm what I was seeing, but have not found anything that worked.

xCSxXenon (Posts: 3953, Joined: 11 Jun 2019)

No, it's encrypted

Plerry (Posts: 2410, Joined: 22 Aug 2012)

"Encrypted" suggests the contents would be something secret or shielded.
Obviously that is not the case.
I suppose Zenju chose the most compact and convenient format for the sync.ffs_db files and, because they are neither designed to be accessed nor intended to be modified by users, did not publish any details about the format.
I never tried, but I suppose if you study the FFS code, you could find out.
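
If you only want to confirm for yourself that sync.ffs_db is a binary format rather than readable text, a raw byte dump is enough. A minimal sketch using the built-in Windows certutil tool; both paths below are just placeholders for wherever your copy of the database sits and where you want the dump written:
certutil -encodehex "D:\backup\sync.ffs_db" "%USERPROFILE%\Desktop\ffs_db_hexdump.txt"
The output file shows the bytes as hex alongside their (mostly unprintable) character representation.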

Zenju (Site Admin, Posts: 7170, Joined: 9 Dec 2007)

The versioning cleanup doesn't construct paths, but scans all existing files in the given directory, i.e. "\\jarvis\gdrive_backup\_VERSIONS_". This means the file to be deleted in the error message did exist while this folder was being scanned, but not during the deletion attempt.
Most likely two FFS sessions were running at nearly the same time, both cleaning up the same files, with the one showing the error message being the slower one.

goni05 (Posts: 5, Joined: 3 Sep 2024)

So I have this running through a batch script so I can capture some custom logs on versions, times, etc. Here is the code:
@ECHO OFF
SET SCRIPT_VERSION=1.1

REM The substring expansions pull the year, month and day out of %DATE% (assuming a
REM US-style "Ddd MM/DD/YYYY" format) to build a YYYY-MM-DD stamp; %TIME:~0,8% keeps HH:MM:SS.
REM %~dp0 expands to the folder this script lives in, %~nx0 to the script's file name.
ECHO [%DATE:~10,4%-%DATE:~4,2%-%DATE:~7,2% %TIME:~0,8%] Starting %~nx0, Version %SCRIPT_VERSION% >> "%~dp0log.txt"
"C:\Program Files\FreeFileSync\FreeFileSync.exe" "%~dp0Google Drive - Cloud Sync.ffs_batch"
ECHO [%DATE:~10,4%-%DATE:~4,2%-%DATE:~7,2% %TIME:~0,8%] %~nx0 complete. >> "%~dp0log.txt"
Now, when this runs, I get 2 processes running. I don't know why, but both come from the single command line in the batch script above. When the job finishes, both processes disappear. Is this normal? Should I be calling the x64 executable directly instead? Also, if it were 2 processes overlapping, wouldn't I also get 2 reports generated? In this case, I do not.
[Screenshot: Screenshot 2024-09-06 143257.png]
I have the job run every 2 hours (via Windows Task Scheduler), and it finishes each time in 9-10 minutes, so the runs shouldn't be overlapping.
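
If overlapping runs ever did turn out to be the cause, a simple lock-file guard in the wrapper script would rule that out. This is only a sketch of the idea (the lock file name is arbitrary and nothing FreeFileSync provides itself), not a definitive fix:
IF EXIST "%~dp0sync.lock" (
    ECHO [%TIME:~0,8%] Previous run still active, skipping this run. >> "%~dp0log.txt"
    EXIT /B 1
)
ECHO running > "%~dp0sync.lock"
"C:\Program Files\FreeFileSync\FreeFileSync.exe" "%~dp0Google Drive - Cloud Sync.ffs_batch"
DEL "%~dp0sync.lock"
It is not bullet-proof against two scripts starting at the exact same instant, but for a two-hour schedule it is enough to prove or disprove overlap.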

therube (Posts: 1006, Joined: 8 May 2006)

FreeFileSync.exe is simply a "stub" whose only purpose is to call either FreeFileSync_x64.exe or FreeFileSync_Win32.exe (depending on circumstances).

So seeing 2 FreeFileSync .exe processes is normal & expected.
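
For anyone who wants to confirm this on their own machine, listing the FreeFileSync processes while a sync is running should show both the stub and the worker. A quick check with the built-in tasklist command (the x64 image name is the one mentioned above):
tasklist /FI "IMAGENAME eq FreeFileSync.exe"
tasklist /FI "IMAGENAME eq FreeFileSync_x64.exe"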

goni05 (Posts: 5, Joined: 3 Sep 2024)

OK, so I kept digging here, and I don't think the issue is what I originally thought it was. I was only reading the Errors and Warnings summary at first, but when I look at the full log, I am starting to see a bit more of what is happening.

To summarize, I am attempting to 2-way sync Google Drive with local NAS storage (an SMB share on the computer). What is happening is that I get a CURLE_OPERATION_TIMEDOUT message saying it cannot read the file from Google Drive, followed immediately (same timestamp) by a message that the file cannot be deleted. I'm assuming this is some cleanup activity so that one does not mistake the file for a good copy when it is not, especially if the transfer stops midway or, as in my case, never actually starts.

Now, I'm not sure of the expected behavior in all cases (whether and when it creates a file), but it seems this could be alleviated by doing a simple file-exists check before deleting it when this timeout occurs, so the error message is avoided altogether. Having only the error about not being able to "sync" the file might be enough. It also seems this issue only happens when copying to the directory where my "versioned" files go, so I'm not sure if it's actually syncing anything or what is happening. What I can be sure of is that it only happens on certain files and not all of them. I recognized this after I changed from Folder Date to Replace versioning, and it was still happening.
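
To make the suggestion above concrete, here is a minimal sketch of the "check that the file exists before deleting it" idea, written as a standalone batch snippet rather than anything inside FFS itself; the path below is only an illustrative placeholder:
SET "VFILE=\\jarvis\gdrive_backup\_VERSIONS_\example\partial_file.tar.gz"
IF EXIST "%VFILE%" (
    DEL /F "%VFILE%"
) ELSE (
    ECHO [%TIME:~0,8%] Nothing to clean up, "%VFILE%" does not exist. >> "%~dp0log.txt"
)
Whether FFS should do something like this internally, or simply downgrade the follow-up delete error when the copy never started, is of course up to the dev.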