Hello, using FTP (not SFTP), the directory traversal starts out fast and then bogs down. The server I'm syncing with has more than 100,000 files on it; FFS got to 50,000 very quickly and is now crawling. It would be useful to have an option to control the number of connections used for directory traversal *and* downloads, like SFTP has.
Thanks!
M
Edit #1: It would also be useful to have a connection-retry option when scanning over FTP. Right now, if I get momentarily disconnected, a dialog pops up saying the connection timed out and I have to click Retry. A setting to retry automatically would be handy.
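To be concrete about what I mean by auto-retry: something along these lines, sketched in Python (this is just my idea of the behavior, not FFS code; the function and parameter names are made up):

```python
import time

def with_retries(operation, attempts=3, delay=5.0, retry_on=(TimeoutError,)):
    """Run `operation`; if it raises a transient error, wait `delay`
    seconds and try again instead of popping an error dialog.
    Re-raises the last error once all attempts are used up."""
    for attempt in range(1, attempts + 1):
        try:
            return operation()
        except retry_on:
            if attempt == attempts:
                raise
            time.sleep(delay)
```

i.e. wrap each directory listing in something like `with_retries(lambda: ftp.nlst(path), attempts=5, delay=10)` so a momentary disconnect just causes a pause instead of a modal dialog.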
Edit #2: If you can get this working for me as a state-of-the-art FTP sync tool, I'd be happy to donate via Bitcoin. I don't see Bitcoin listed as a way of receiving donations, though.
Slow FTP directory traversal
- Posts: 4
- Joined: 10 Jul 2017
- Site Admin
- Posts: 7212
- Joined: 9 Dec 2007
> option to control the number of connections used for directory traversal
Definitely! This is planned for the next release.
> *and* downloads, like SFTP has?
File upload/download speed should already be optimal, except if you're on Windows 7, in which case issues have been fixed in the current beta:
http://www.mediafire.com/file/w5ch7lyz9i29kuw/FreeFileSync_9.3_beta_Windows_Setup.exe
- Posts: 4
- Joined: 10 Jul 2017
This is the error I keep getting during the directory scan; it works fine with FileZilla:
Cannot open directory "<name omitted>"
CURLE_OPERATION_TIMEDOUT: 421 No-transfer-time exceeded. Closing control connection. [curl_easy_perform]
Clicking Retry works, but it would be useful if FFS applied the retry logic I thought I had set in the options.
M
- Posts: 4
- Joined: 10 Jul 2017
From my experience, having multiple connections streaming at once is much faster than just one. I only recently got to try the FTP sync part itself, because each run takes 2+ hours just to traverse the directories. Then I find a setting that's wrong, change it, and start again. 2+ hours later...

> Zenju, 10 Jul 2017, 09:11:
> > *and* downloads, like SFTP has?
> File upload/download speed should already be optimal, except if you're on Windows 7, in which case issues have been fixed in the current beta:
> http://www.mediafire.com/file/w5ch7lyz9i29kuw/FreeFileSync_9.3_beta_Windows_Setup.exe
Now it's downloading, at 220 KB/s, and this is on Windows 10. When I use FileZilla, streaming 10 files at once, I average between 2 and 4 MB/s.
Unfortunately, there seems to be more room for improvement on the FTP sync side. :(
M
- Site Admin
- Posts: 7212
- Joined: 9 Dec 2007
I've implemented support for multiple connections during FTP traversal, which gives a nice N× speed-up:
http://www.mediafire.com/file/b3edl60wv5ushko/FreeFileSync_9.3_beta_Windows_Setup.exe
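Conceptually, the speed-up comes from listing several directories at once instead of one after another. A simplified sketch of that idea (not the actual FFS implementation, which is C++; `list_dir` here is a stand-in for one FTP directory-listing round trip):

```python
from concurrent.futures import ThreadPoolExecutor

def traverse(root, list_dir, workers=4):
    """Traverse a directory tree level by level, listing up to `workers`
    directories concurrently. `list_dir(path)` must return a pair
    (subdirectory_paths, file_paths) for that directory."""
    all_files, level = [], [root]
    with ThreadPoolExecutor(max_workers=workers) as pool:
        while level:
            next_level = []
            # One listing round trip per directory, N of them in flight:
            for subdirs, files in pool.map(list_dir, level):
                next_level.extend(subdirs)
                all_files.extend(files)
            level = next_level
    return all_files
```

With high-latency FTP servers, the round-trip time per listing dominates, so N connections come close to an N× speed-up as long as there are enough directories per level to keep the workers busy.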
- Posts: 4
- Joined: 10 Jul 2017
Thanks. I just gave the new production version a try.
It traverses the directories much faster now. However, it gives me random errors about directories not existing (they really do exist), and when I retry, it suddenly finds them.
Also, it still only downloads one file at a time via FTP. That's a problem. :( Downloading 10 files at once is faster than downloading the same 10 files one by one; I don't know why, but that's the case. As it stands, FileZilla is much quicker at downloading because of this.
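The FileZilla-style behavior I mean is roughly this: keep N transfers in flight at once, each on its own connection. A Python sketch of the shape of it (my illustration only; `fetch` is a placeholder for one complete file transfer):

```python
from concurrent.futures import ThreadPoolExecutor

def download_all(paths, fetch, connections=10):
    """Download every file in `paths` with up to `connections` transfers
    in flight at once. `fetch(path)` performs one transfer and returns
    its result (e.g. bytes written). Returns {path: result}."""
    with ThreadPoolExecutor(max_workers=connections) as pool:
        return dict(zip(paths, pool.map(fetch, paths)))
```

Each worker would need its own FTP login (e.g. its own `ftplib.FTP` instance doing a RETR), since a single FTP control connection can only run one data transfer at a time. That's presumably why one connection caps out well below what the link can carry.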
M
- Site Admin
- Posts: 7212
- Joined: 9 Dec 2007
> mdude, 15 Aug 2017, 21:44:
> It traverses the directories much faster now. However, it gives me random errors about directories not existing (they really do exist), and when I retry, it suddenly finds them.

See here: viewtopic.php?t=4575

> mdude, 15 Aug 2017, 21:44:
> Also, it still only downloads one file at a time via FTP. That's a problem. :( Downloading 10 files at once is faster than downloading the same 10 files one by one. As it stands, FileZilla is much quicker at downloading because of this.

One step at a time. But this is a todo for a future release.
- Posts: 1
- Joined: 13 Nov 2016
So, was this feature removed in version 10.0? Why?