I've made a somewhat bizarre observation over the past few months, running my standard backup batch jobs from FreeFileSync's GUI. Let me explain:
I'm currently running Windows 10 21H1 and Windows Defender. I have 24 different FreeFileSync batch jobs that I use to back up my computer's data files and select configuration files. Some of the batch jobs have only one folder pair. Most of them have from around 6 to around 14 folder pairs. But one of them has around 100 folder pairs.
Well, when my computer is freshly booted, and all autostart items have finished doing their thing and either exited or gone semi-dormant, if I launch FFS and select that 100-folder-pair batch job — whether right away or after selecting other batch jobs first — FFS parses the file and renders its Compare and Synchronize buttons "active" within maybe 3–4 seconds. If I do the same thing after the computer has been running for several hours, with maybe half a dozen programs launched, used, and exited, that same parsing takes 90–120 seconds. I noticed similar "inflated parsing delays" in some of my other "larger" (in terms of folder pairs) batch jobs.
I'm guessing this is the result of Microsoft's failure to do adequate "garbage collection" between reboots in certain key indexes or databases that FFS relies on. But honestly, having no background in coding, I just don't know.
Has anyone else run into this issue in Windows? In Linux? In MacOS? Has anyone found a workaround short of rebooting?
Anyway, it's a problem. When you can play and win a game of Freecell while waiting for FFS's GUI to parse and load a batch file, something is wrong! ;-)
Huge increase in time required to parse batch jobs if Windows has been running for a while
- Posts: 32
- Joined: 7 Aug 2018
- Site Admin
- Posts: 7212
- Joined: 9 Dec 2007
My guess would be GDI handles. They're an expensive resource, make for a slow GUI if too many are used, and cause crashes if their limit is exhausted. And 100 folder pairs is a lot.
The only way to handle such huge lists would be for FFS to program a custom folder pair control. Not really an option. So the best solution would be to use filter settings instead of folder pairs.
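For what it's worth, FreeFileSync's GUI is built on wxWidgets, and a "custom folder pair control" would roughly mean a virtual list-style control that draws rows on demand instead of creating real child windows (and their USER/GDI handles) for every pair. Purely as an illustration of that idea, here is a minimal sketch; the class and data below are hypothetical, not FFS code:

```cpp
// Minimal sketch (not FFS code): a "virtual" wxListCtrl that can display
// 100+ folder pairs without creating per-row child windows.
// Assumes wxWidgets 3.x; all names are illustrative.
#include <wx/wx.h>
#include <wx/listctrl.h>
#include <utility>
#include <vector>

class FolderPairList : public wxListCtrl {
public:
    FolderPairList(wxWindow* parent, std::vector<std::pair<wxString, wxString>> pairs)
        : wxListCtrl(parent, wxID_ANY, wxDefaultPosition, wxDefaultSize,
                     wxLC_REPORT | wxLC_VIRTUAL),
          pairs_(std::move(pairs)) {
        InsertColumn(0, "Left folder");
        InsertColumn(1, "Right folder");
        SetItemCount(static_cast<long>(pairs_.size())); // rows exist only logically
    }

private:
    // Called on demand for visible rows only, so handle usage stays flat
    // no matter how many folder pairs the job contains.
    wxString OnGetItemText(long item, long column) const override {
        const auto& p = pairs_[static_cast<size_t>(item)];
        return column == 0 ? p.first : p.second;
    }

    std::vector<std::pair<wxString, wxString>> pairs_;
};

class DemoApp : public wxApp {
public:
    bool OnInit() override {
        std::vector<std::pair<wxString, wxString>> pairs;
        for (int i = 0; i < 100; ++i) // 100 made-up folder pairs
            pairs.emplace_back(wxString::Format("C:\\Source\\%d", i),
                               wxString::Format("D:\\Backup\\%d", i));

        auto* frame = new wxFrame(nullptr, wxID_ANY, "Virtual folder-pair list (sketch)");
        new FolderPairList(frame, std::move(pairs));
        frame->Show();
        return true;
    }
};

wxIMPLEMENT_APP(DemoApp);
```

The only point of the sketch is that a virtual control keeps handle usage roughly constant; whether retrofitting something like it into FFS is worth the effort is exactly the judgment call being made above.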
- Posts: 1038
- Joined: 8 May 2006
You can check "handles" with a program like Process Hacker (or Process Explorer).
Right-click the FreeFileSync_x64.exe entry (on Windows, presumably), choose Properties, and on the Statistics tab the Other box will show the number of handles.
I'm pretty sure 10K handles is the (single) process limit.
So if you're nowhere close to that, then it might be something else.
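For reference, the GDI and USER object counts that Process Explorer and Process Hacker report can also be read programmatically via the Win32 GetGuiResources call (the default per-process GDI quota is 10,000). A minimal sketch; pass FreeFileSync's PID on the command line:

```cpp
// Rough sketch (Windows only, link against user32): print the GDI and USER
// object counts for a given process, i.e. the counters the GDI-handle
// theory above is about. Usage: gdicount <pid>
#include <windows.h>
#include <cstdio>
#include <cstdlib>

int main(int argc, char** argv) {
    if (argc < 2) {
        std::fprintf(stderr, "usage: gdicount <pid>\n");
        return 1;
    }
    const DWORD pid = static_cast<DWORD>(std::strtoul(argv[1], nullptr, 10));

    HANDLE proc = OpenProcess(PROCESS_QUERY_INFORMATION, FALSE, pid);
    if (!proc) {
        std::fprintf(stderr, "OpenProcess failed (error %lu)\n", GetLastError());
        return 1;
    }

    // GetGuiResources reports per-process GDI/USER object counts; the default
    // per-process GDI quota on Windows is 10,000.
    const DWORD gdi  = GetGuiResources(proc, GR_GDIOBJECTS);
    const DWORD user = GetGuiResources(proc, GR_USEROBJECTS);
    std::printf("GDI objects:  %lu\nUSER objects: %lu\n", gdi, user);

    CloseHandle(proc);
    return 0;
}
```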
- Posts: 32
- Joined: 7 Aug 2018
> [T]he best solution would be to use filter settings instead of folder pairs. (Zenju, 01 Jul 2022, 11:25)

First, I really appreciate the prompt response!
I would love to be able to use filters (especially exclusion filters) instead of "too many" folder pairs. Unfortunately, my "100-folder-pair" batch job is one of many that I designed to be runnable with RealTimeSync, and RealTimeSync doesn't support filters. Because the parent folder of most of the folder pairs in my 100-folder-pair batch job contains OTHER folders whose content is CONTINUOUSLY CHANGING, if I used filters instead of numerous folder pairs, RealTimeSync would trigger "filtered-batch-job" runs CONSTANTLY whenever those other programs are in use — even though nothing in those programs' continuously changing folders would ever get synced (because they are filtered out of the FFS batch job).
But wait. Are you suggesting that if I used inclusion filters in lieu of folder pairs, it might NOT trigger those mind-numbing "GDI-handle delays"? If so, I suppose I could use a RTS task to monitor all 100 folder pairs and to trigger the "filtered FFS batch job" whenever a change is detected. It would double the amount of RTS/FFS "editing work" I'd have to do, but if it made for substantially faster syncs, it might nonetheless be worth it.
Still, though, I'd greatly prefer it if RTS tasks could automatically import filters from the FFS batch jobs they're based on. I'd much rather be able to, for example, sync all of ...\AppData\Roaming and filter out the small minority of problematic subfolders (e.g., browsers and LibreOffice) than have to create dozens of folder pairs for all of the subfolders that aren't problematic. But again, with no background in coding, I have no idea whether this is practical and worthwhile in terms of resource usage, or even if it's technically feasible. I'm just sayin' it seems like it would be pretty nice!
I suppose I could always just chop the 100-folder-pair batch job into four 25-folder-pair batch jobs or five 20-folder-pair jobs, but it's not a very appealing prospect. I already have 24 distinct batch jobs to back up key files from my system drive (distinct from imaging), and I have four different copies of each of those for safely maintaining different backup drives. The thought of having an additional 12 or 16 batch jobs to wrangle makes me feel tired!
- Posts: 32
- Joined: 7 Aug 2018
> You can check "handles" with a program like ... Process Explorer). (therube, 01 Jul 2022, 16:15)

Thanks! I already have Process Explorer on board and will use it to look at FFS's handles the next several times I run FFS — both immediately after a reboot and after Windows has been running for a while. I appreciate the suggestion.
- Posts: 32
- Joined: 7 Aug 2018
INTERIM UPDATE:
I don't yet have enough data points to say much useful about GDI handles, other than that my "big" batch job seems to show a little over 500 handles for FreeFileSync x64 in Process Explorer shortly after a computer reboot, and, other things being equal, over 1500 handles when the computer has been running for many hours. I don't understand why this would be the case.
On the non-GDI front, as an experiment I made a modified copy of my "100-folder-pair" batch job, using exclusion filters instead of inclusionary folder pairs for my AppData\Roaming folders. That brought the total number of folder pairs in the job down from 103 to 31. When my computer has been running for many hours, it takes FFS's GUI around 15–20 seconds to parse and load the "filtered" job, versus 90–120 seconds for the "inclusionary-folder-pairs" job. That's a pretty big improvement, but it means that I can't make a usable RealTimeSync task for the job the quick and easy way. I would have to create an RTS task with nothing but inclusionary folder pairs — which is tedious but doable.
> 100 folder pairs is a lot. The only way to handle such huge lists would be for FFS to program a custom folder pair control. Not really an option. (Zenju, 01 Jul 2022, 11:25)

Is it really, really, REALLY not an option? Because reducing the duration and frequency of those "not responding" messages in FFS's GUI would be really, really, REALLY nice! ;-) This problem is bothering me on a laptop with a fast, recently optimized 4-lane NVMe SSD with 260GB of free space, a 10th-generation i7 CPU, and typically at least 6GB of free dual-channel DDR4 RAM to spare. Okay, so my external backup drives are all UASP USB 3.0, but still, I feel frustrated by FreeFileSync's performance when my computer has been up and running for a while (which is usually the case).
I'm still wondering, incidentally, whether this is a problem specific to Windows as opposed to Linux and MacOS. Was I on the right track in suspecting that Microsoft does a poor job of "garbage collection" [or consolidation/integration of continual changes in certain resources] between boots?
- Posts: 32
- Joined: 7 Aug 2018
I'm not a programmer, unfortunately, so I don't have a very good idea of how to systematically look for bottlenecks when performance issues like this come up. That said, I do have some newish observations:
• When my computer is fresh off a reboot, and no foreground programs have been launched (with the exception of FFS), it takes FFS's GUI around 1.5 seconds to parse and load the "filtered" version of my biggest batch job (which has only 31 folder pairs compared to 103 in the "unfiltered" version). When the computer has been running for over 120 hours, with various foreground programs (including FFS) launched, used, and exited in that time, it takes FFS's GUI a full 60 seconds to parse and load that same batch file. In fact, all of my batch jobs take around 2 seconds per folder pair to get parsed and loaded.
• Regarding GDI handles used by FreeFileSync_x64.exe, I'm not seeing a clear correlation with slower batch-job parsing. Shortly after a fresh boot, the process might start out using around 450 handles and perform normally (reasonably quickly). A day later, it might be up to >1700 handles and perform poorly. Still later in that same Windows session, it might be back down to around 700 handles and still perform just as poorly.
• I only just started experimenting with Sysinternals' Process Monitor (filtered to display only events spawned by processes whose name contains "freefilesync"). I've exported the over 18,000 lines of results to a CSV file and imported it into LibreOffice Calc in the hope that I might be able to identify what kind of operations are taking the most time. I don't have a lot of big chunks of time to do this kind of work, but I'll do my best to squeeze some in over the coming month and post any pertinent-looking findings.
In the meantime, any tips on using Process Monitor to best effect would be appreciated!
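On the Process Monitor front, one quick way to mine the CSV export is to sum the Duration column per Operation and see which operation types dominate. A rough sketch, under the assumption that the export actually contains "Operation" and "Duration" columns (the file name is a placeholder):

```cpp
// Rough sketch: total Duration per Operation from a Process Monitor CSV export.
// Assumes the Duration column was added before exporting; file/column names
// are assumptions about the export, not guaranteed.
#include <algorithm>
#include <fstream>
#include <iostream>
#include <map>
#include <string>
#include <vector>

// Minimal CSV splitter that understands double-quoted fields.
static std::vector<std::string> splitCsv(const std::string& line) {
    std::vector<std::string> fields;
    std::string cur;
    bool inQuotes = false;
    for (size_t i = 0; i < line.size(); ++i) {
        const char c = line[i];
        if (c == '"') {
            if (inQuotes && i + 1 < line.size() && line[i + 1] == '"') { cur += '"'; ++i; }
            else inQuotes = !inQuotes;
        } else if (c == ',' && !inQuotes) {
            fields.push_back(cur); cur.clear();
        } else {
            cur += c;
        }
    }
    fields.push_back(cur);
    return fields;
}

int main() {
    std::ifstream in("procmon_export.csv");   // placeholder file name
    std::string line;
    if (!std::getline(in, line)) return 1;    // header row

    const std::vector<std::string> header = splitCsv(line);
    size_t opCol = 0, durCol = 0;
    for (size_t i = 0; i < header.size(); ++i) {
        if (header[i] == "Operation") opCol = i;
        if (header[i] == "Duration")  durCol = i;
    }

    std::map<std::string, double> totalSeconds;   // Operation -> summed Duration
    while (std::getline(in, line)) {
        const auto f = splitCsv(line);
        if (f.size() <= std::max(opCol, durCol)) continue;
        try { totalSeconds[f[opCol]] += std::stod(f[durCol]); } catch (...) {}
    }

    for (const auto& [op, secs] : totalSeconds)
        std::cout << op << ": " << secs << " s total\n";
}
```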
- Posts: 1038
- Joined: 8 May 2006
It sounds as if your hardware is certainly capable.
Could it be something as simple as a drive sleeping, taking time to awaken?
(I'd not heard of UASP before.)
> Microsoft's failure to do adequate "garbage collection" between reboots

I rarely ever actually reboot. (Months or more at a time.)
I do sleep. (And awakening, to a fully ready state, takes only a few seconds.)
(I run Win7.
Not any 10 / 11. Who knows what goes on with those beasts.)
So I would think (I would hope) that GC would not be an issue in Win 10/11.
> "handles"
> I'm pretty sure 10K handles is the (single) process limit.
> So if you're nowhere close to that, then it might be something else.

So 700 or 1700 you would think wouldn't be an issue.
(I have seen - a program in development, that did eat 10K, & that certainly did cause issues.)
> that same parsing takes 90–120 seconds. I noticed similar "inflated parsing delays" in some of my other "larger" (in terms of folder pairs) batch jobs.

Maybe post your settings for said .ffs_batch & describe the hardware of the source & target.
(After your system has been running) if you were to simply do a DIR /S on the source tree, & then on the target, do they both complete speedily?
One of the computers on my network is an old XP system. And for whatever reason, network connection speed between my particular computer & that XP system is very slow. I've never been able to figure that one out. Tried all types of drivers & settings of such, but always slow. (And of course that is most apparent when browsing large directories.) All other computers connect to the same XP system without issue. (I've not tried an entirely different NIC.)
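As a crude stand-in for the DIR /S suggestion, here is a self-contained sketch that times a recursive walk of a directory tree (the default path is a placeholder). Running it right after a boot and again after long uptime should show whether plain directory enumeration is what slows down, independently of FFS:

```cpp
// Rough timing of a recursive directory walk (a stand-in for DIR /S),
// to compare "fresh boot" vs. "long uptime" runs. Path is a placeholder.
#include <chrono>
#include <cstdio>
#include <filesystem>
#include <system_error>

int main(int argc, char** argv) {
    namespace fs = std::filesystem;
    const fs::path root = (argc > 1) ? fs::path(argv[1]) : fs::path("C:\\Users");

    const auto start = std::chrono::steady_clock::now();

    std::size_t files = 0, dirs = 0;
    std::error_code ec;
    for (fs::recursive_directory_iterator
             it(root, fs::directory_options::skip_permission_denied, ec), end;
         it != end; it.increment(ec)) {
        if (ec) { ec.clear(); continue; }          // skip entries we can't read
        if (it->is_directory(ec)) ++dirs; else ++files;
    }

    const auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
        std::chrono::steady_clock::now() - start);

    std::printf("%zu dirs, %zu files enumerated in %lld ms\n",
                dirs, files, static_cast<long long>(ms.count()));
    return 0;
}
```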
- Posts: 32
- Joined: 7 Aug 2018
@therube:
I really appreciate your reply and am sorry I didn't spot and respond to it earlier. (I'm a full-time family caregiver to an elderly parent with serious medical problems and don't have a lot of spare time and energy for "my own stuff"! ;-) I'm grateful for your input and will follow up on it as soon as I can. (I'm not sure exactly when that will be, but I will try not to let it sit for too long.)
Again, thank you, and all the best.
- Posts: 32
- Joined: 7 Aug 2018
First, I created two Windows command scripts and two voidtools Everything GUI search strings that list the contents of the 62 folders in my 31-folder-pair batch job (the biggest batch job I'm currently using). In each pair, one lists folders and subfolders only, and the other lists folders, subfolders, and files. (I've been able to use a timing utility in the command scripts, so those times-to-completion are precise. I haven't figured out the syntax for Everything's "se.exe" command-line utility yet, so those times-to-completion are approximate. Ditto for how long it takes FFS's GUI to parse and load the "big batch job" in question.)
Second, I created a spreadsheet with the following column headings:
• Windows 10 Uptime (time elapsed since last boot)
• FFS GUI (time required to parse and load the big job)
• Windows Dir /S /AD (time required to list the big job's folders and subfolders)
• Everything GUI nofiles: (time required to list the big job's folders and subfolders)
• Windows Dir /S (time required to list the big job's folders, subfolders, and files)
• Everything GUI (time required to list the big job's folders, subfolders, and files)
Now that I have a systematic way of recording my findings, I just need to build up some data sets for different uptimes.
Anecdotally, I can already say that the Windows "dir /s" script for folders, subfolders, and files took around 24 seconds to complete not long after a reboot and around 75 seconds to complete when the computer had been running for over 40 hours. Also (at the risk of stating the obvious), I can already confirm that Everything is always DRAMATICALLY faster than Windows at both types of search (folders/subfolders only, and folders/subfolders/files).
If anyone can think of other parameters I should be tracking, please let me know. The number of GDI handles used by FFS did not seem to correlate very closely to parsing slowdowns in FFS's GUI.
[A quick reminder: the problem I'm focusing on is how long it takes FreeFileSync's GUI to switch to a different batch job in the listing on the left after it's been clicked on, not how long it takes Compare and Synchronize to run. I'm talking about how long it takes FFS's GUI to parse the newly selected job and load its folder pairs in the righthand pane.]
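For recording those data sets, the uptime column can also be captured programmatically with GetTickCount64. A small sketch that appends one "uptime hours, measurement" row to a CSV; the file name, column layout, and command-line argument are just placeholders:

```cpp
// Sketch of logging one spreadsheet row automatically: Windows uptime in hours
// plus a hand-measured value passed on the command line (e.g. FFS parse-and-load
// seconds). File name and layout are illustrative only.
#include <windows.h>
#include <cstdio>

int main(int argc, char** argv) {
    const double uptimeHours = GetTickCount64() / 3600000.0; // ms since boot -> hours
    const char* measurement  = (argc > 1) ? argv[1] : "";    // e.g. "15.3"

    if (FILE* f = std::fopen("ffs_timing_log.csv", "a")) {   // placeholder file name
        std::fprintf(f, "%.2f,%s\n", uptimeHours, measurement);
        std::fclose(f);
    }
    std::printf("uptime: %.2f h\n", uptimeHours);
    return 0;
}
```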
- Posts: 1038
- Joined: 8 May 2006
One (among many) way to get "uptime" (in Windows):
https://docs.microsoft.com/en-us/sysinternals/downloads/psinfo
psinfo.exe uptime
Everything reads the MFT, so Everything will be faster than anything else.
> the problem I'm focusing on is how long it takes FreeFileSync's GUI to switch to a different batch job in the listing on the left after it's been clicked on

Instead of clicking on a different listing, if you were to Quit FFS, then restart it, & at that point switching to the new listing, is it again slow?
Oh, & a "timer", Timethis, https://www.softpedia.com/get/System/System-Miscellaneous/Microsoft-Timethis.shtml
- Posts: 32
- Joined: 7 Aug 2018
Thanks for the tips about timers and determining uptime. (I used a CLI utility called Gammadyne Timer for the timer and just looked at Task Manager > Performance > CPU > bottom panel for the uptime!)
> Instead of clicking on a different listing, if you were to Quit FFS, then restart it, & at that point switching to the new listing, is it again slow?

I haven't really done that particular test at different-length uptimes, but just now, the "big batch file" took around 15 seconds to parse-and-load during the same session I'd run it and around 28 other batch files in, and after I relaunched FreeFileSync's GUI, it took around 13 seconds. I see that kind of small variation within the same FFS GUI session, and the differences don't even begin to approach the difference between parsing time fresh after a boot (~1.5 seconds) and one I recall timing after around 1½ to 2 days of uptime (~60 seconds). In other words, exiting and restarting FreeFileSync's GUI does NOT fix the problem.
Now that I think of it, in my spreadsheet, I should probably keep a record of how many times I've launched FFS in a given Windows session, and what batch jobs I ran. It's conceivable that it's FreeFileSync rather than Windows that's doing an imperfect job of "garbage collection." (I appreciate very much that FFS caches the scan results of Compare operations for a while, as it makes subsequent Compares much faster, but maybe it's using a counterproductive technique for "GUI parsing scans," like using a hodgepodge of temporary files that never get consolidated but that have to be read sequentially. I have no idea; I'm just spitballing.)
- Posts: 32
- Joined: 7 Aug 2018
A quick but important update:
First, I've been accumulating more data sets on my spreadsheet (and more columns for more non-FFS things my FFS slowdowns might correlate with).
Second, I very recently made a couple of fluke observations that might obviate the need for testing very much longer:
In the course of one testing run, at 118.5 hours of uptime, I got the longest parse-and-load delay for my "big" batch job yet: 157 seconds (versus ~1.5 seconds shortly after a boot). I had the idea of trying to launch Privazer — which clearly scans a lot of stuff just to load, before you've even told it what to scan for — to see how long it took to fully load, and it took dramatically longer than it usually does. (I didn't time that particular Privazer load, but it's one of my tests now.)
Then, in a subsequent Windows session, I mounted a FAT32-formatted thumb drive with nothing but data on it, and the four or five File Explorer windows I had open all began experiencing extreme lags and hangs. I checked Task Manager, and explorer.exe was thrashing the CPU to the tune of around 35% when no actual operations were taking place.
That prompted me to begin researching fixes for excessive CPU usage by Explorer. DISM and sfc.exe were not an issue, since I run those checks pretty regularly (and almost always get a clean bill of health), but I did disable the SysMain (formerly Superfetch) service, temporarily prevented System Explorer and Synctrayzor from autostarting, and turned off "Show all folders" in File Explorer. I also checked a couple of additional boxes in Disk Cleanup and ran that, and for good measure I reformatted my thumb drive as NTFS, in case Everything was somehow stumbling on it as an unindexable FAT32 drive.
I did that only yesterday, so it's too early to say whether explorer.exe was the problem underlying my FFS delays, and whether the "fixes" I applied have made a difference. I'll do my best to follow up here when I have more information.
- Posts: 32
- Joined: 7 Aug 2018
My apologies for the long hiatus; my family-caregiver duties have been even more demanding recently.
To cut to the chase, this appears to have been a Windows 10 issue. Having gotten fed up with accidental disconnects with my external USB backup drives, I began using a 16-port USB 3.0 hub, and because my laptop's (big, flat, rectangular, older) USB Type A ports are "looser" than its (smaller, oval, newer) USB Type C port, I began using a USB Type C to USB Type B cable to connect to the hub. Unfortunately, though the hub is supposed to be "dumb" and not require a driver, I eventually discovered that when I connected to it via my USB Type C port, Device Manager began flagging an unidentified USB device that would soon pop up again after being uninstalled. (That doesn't seem to happen when I connect via a USB Type A port.) My current hypothesis is that this degraded Windows/File Explorer's performance and that the longer the laptop had been running, the worse Explorer's performance got.
I switched back to connecting via a USB Type A port maybe a couple/few weeks ago. I don't think the "unidentified USB device" has popped up since the last time I uninstalled it, nor have I noticed any significant parsing-and-loading delays in FreeFileSync's GUI since then. (PrivaZer also loads a lot more quickly.) I did have one episode of laggy/hangy behavior in File Explorer a couple of days ago, but that may have been because I was running defrags on four external drives.
Anyway, while it's possible that the September 2022 Patch Tuesday updates fixed something buggy in File Explorer, I suspect the problem was caused by Windows 10 being unable to recognize when a combination Thunderbolt 3 / USB 3 Type C port is being used to connect to a "dumb," driverless USB 3.0 device. I wish I could nail down the problem more systematically and definitively, but even if I had the skills, I simply don't have the time.
My apologies to everyone who spent time trying to help me diagnose this issue. I did my best to figure out the problem as much as I could on my own, but as an ordinary user with limited administrator skills and even more limited free time, I just didn't do a very good job. Sorry! The good news is, I don't seem to be having significant parsing-and-loading delays in FreeFileSync anymore (and I hope I didn't just jinx it by writing that!).
Many thanks and all the best.
- Site Admin
- Posts: 7212
- Joined: 9 Dec 2007
@MartinPC:
Sounds as if the reason for the hang could be connected with the file accesses that FFS is making when populating the list controls. If that's the case (FWIW I'm not able to reproduce this, but I'm also not having USB issues), then FFS could in fact do something about it.
Can you send me a trace file for a slow FFS startup? https://freefilesync.org/faq.php#trace