I have a datalogging application that logs 1-second data to a SQLite database file. The file is updated continuously, and when the day rolls over, the file is closed and a new file is started. I want to periodically (say every 15 minutes) replicate the current, active file to a remote SFTP server, both as a backup and for access to the data. I'm trying to use FreeFileSync (FFS) in batch mode, running as a scheduled task under Windows. A full day's file can be 800 MB and can take 4 minutes to transfer.

I've set up the folder pair, and assuming the previous day's files have already been transferred, the FFS comparison detects the single newest file as changed and starts to transfer it. At the end of the transfer, though, it fails with an error:
Cannot read file xxxx
Unexpected size of data stream
Expected: 728260628 bytes
Actual: 730116096 bytes (notifyUnbufferedRead)
I assume this is because the file is being continually appended to, so it has changed between the start and the end of the transfer. I also expect this is specific to the SFTP transfer, because I don't see the problem when copying within the same drive.
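For what it's worth, the difference reported in the error above is 730,116,096 - 728,260,628 = 1,855,468 bytes, i.e. the file grew by roughly 1.8 MB while the transfer was running, which is about what I'd expect from continuous 1-second logging over a transfer that takes a few minutes.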
This leads to a workaround: I have a batch job with two folder pairs and a filter that selects files with modification times in the last two days. The first folder pair copies the latest two days' files to a temporary folder; this happens relatively quickly since it's on the same drive. The second folder pair syncs the temp folder to the SFTP server, which it can do without problems, since it now sees a static snapshot of each file. This works well but has two issues. First, the current active file is copied only every other run, which suggests that the internal sequence is to compare both folder pairs first and then copy based on the results of both comparisons (rather than comparing and copying pair 1 before pair 2). Second, the temp folder accumulates old files that have to be purged manually.
So questions:
1) Is there a different way to configure FFS that would make this work better? I could always create a batch file that runs two separate FFS jobs in sequence and deletes the old temp files afterwards (a rough sketch is below), but I'm wondering if there is a clean way to do it just with FFS.
2) Is the lack of support for SFTP transfers of continually changing files a bug in FFS, or is supporting them a feature that could be added?
3) Or is FFS just not the proper tool for (large) dynamically changing files?
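For reference, here is roughly the batch file I mean in question 1. It's only a sketch: the install path, job names, and temp folder are placeholders, and the two .ffs_batch jobs (one copying the last two days' files to the temp folder, one syncing the temp folder to the SFTP server) would hold the folder pairs described above.

@echo off
rem Step 1: snapshot the last two days' files to the local temp folder (fast, same drive)
"C:\Program Files\FreeFileSync\FreeFileSync.exe" "C:\Jobs\CopyToTemp.ffs_batch"
rem Step 2: sync the now-static copies in the temp folder to the SFTP server
"C:\Program Files\FreeFileSync\FreeFileSync.exe" "C:\Jobs\TempToSFTP.ffs_batch"
rem Step 3: purge temp copies older than two days so the folder doesn't grow forever
forfiles /p "C:\DataTemp" /d -2 /c "cmd /c del @path"

The scheduled task would then just run this script every 15 minutes, e.g. schtasks /Create /SC MINUTE /MO 15 /TN "FFS replicate" /TR "C:\Jobs\replicate.cmd" (again, names are placeholders).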
- Posts: 1
- Joined: 15 Oct 2021
- Posts: 4056
- Joined: 11 Jun 2019
What if you use the Volume Shadow Copy Service (VSS) to copy instead?
- Posts: 4056
- Joined: 11 Jun 2019
"Copy locked files" in options
- Posts: 2
- Joined: 7 Dec 2023
My problem is exactly the same as this thread's title, but the sync direction is the opposite: download.
A CSV file is being constantly appended to on a third-party SFTP server, and I am not able to download such files during sync:

16:11:36 Error Cannot read file "sftp://somesftp.com:1251/stats/some_table.csv".
Unexpected size of data stream: 20,793,931
Expected: 20,767,388 [notifyUnbufferedRead]

The "Copy locked files" option had no effect. What can be done?
Cloned thread: viewtopic.php?t=2483
- Posts: 4056
- Joined: 11 Jun 2019
That is because you can only use VSS on local storage devices, and VSS is required for that option to have any effect.