Huge increase in time required to parse batch jobs if Windows has been running for a while

Get help for specific problems
MartinPC
Posts: 28
Joined: 7 Aug 2018

Post by MartinPC

I've made a somewhat bizarre observation over the past few months, running my standard backup batch jobs from FreeFileSync's GUI. Let me explain:

I'm currently running Windows 10 21H1 and Windows Defender. I have 24 different FreeFileSync batch jobs that I use to back up my computer's data files and select configuration files. Some of the batch jobs have only one folder pair. Most have somewhere between 6 and 14. But one of them has around 100 folder pairs.

Well, when my computer is freshly booted and all autostart items have finished doing their thing and either exited or gone semi-dormant, if I launch FFS and select that 100-folder-pair batch job (whether right away or after selecting other batch jobs first), FFS parses the file and renders its Compare and Synchronize buttons "active" within maybe 3–4 seconds. If I do the same thing after the computer has been running for several hours, with maybe half a dozen programs launched, used, and exited in the meantime, that same parsing takes 90–120 seconds. I noticed similar "inflated parsing delays" in some of my other "larger" (in terms of folder pairs) batch jobs.

I'm guessing this is the result of Microsoft's failure to do adequate "garbage collection" between reboots in certain key indexes or databases that FFS relies on. But honestly, having no background in coding, I just don't know.

Has anyone else run into this issue in Windows? In Linux? In MacOS? Has anyone found a workaround short of rebooting?

Anyway, it's a problem. When you can play and win a game of Freecell while waiting for FFS's GUI to parse and load a batch file, something is wrong! ;-)
User avatar
Zenju
Site Admin
Posts: 6285
Joined: 9 Dec 2007

Post by Zenju

My guess would be GDI handles. They're an expensive resource, make for a slow GUI if too many are used, and cause crashes if their limit is exhausted. And 100 folder pairs is a lot.
The only way to handle such huge lists would be for FFS to implement a custom folder-pair control. Not really an option. So the best solution would be to use filter settings instead of folder pairs.
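For example (the subfolder names here are just placeholders; FFS filter entries are paths relative to each folder pair's base directory): instead of dozens of pairs under ...\AppData\Roaming, one pair with an exclude filter like

    \Mozilla\
    \Google\
    \LibreOffice\

covers everything except the noisy subfolders. One pair, one filter, same result.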
therube
Posts: 426
Joined: 8 May 2006

Post by therube

You can check "handles" with a program like Process Hacker (or Process Explorer).

Right click the FreeFileSync_x64.exe name (on Windows, presumably), Properties, then on the Statistics tab, in the Other box it will show the number of handles.

I'm pretty sure 10K is the default per-process limit on GDI handles.
So if you're nowhere close to that, then it might be something else.
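
(If you'd rather log the numbers over time than eyeball them, a rough Python sketch like the one below can query both counts via the Win32 API. Note that the "Handles" figure in the Statistics tab counts kernel object handles; GDI objects, which Zenju mentioned, are a separate count, shown here too. Pass the PID of FreeFileSync_x64.exe from Task Manager.)

    # handle_check.py - rough sketch: query kernel-handle & GDI-object
    # counts for a given PID (e.g. FreeFileSync_x64.exe).
    import ctypes, sys

    PROCESS_QUERY_LIMITED_INFORMATION = 0x1000
    GR_GDIOBJECTS, GR_USEROBJECTS = 0, 1

    kernel32 = ctypes.windll.kernel32
    user32 = ctypes.windll.user32

    pid = int(sys.argv[1])
    hproc = kernel32.OpenProcess(PROCESS_QUERY_LIMITED_INFORMATION, False, pid)
    if not hproc:
        sys.exit("OpenProcess failed (bad PID or insufficient rights)")

    n = ctypes.c_ulong()
    kernel32.GetProcessHandleCount(hproc, ctypes.byref(n))
    gdi = user32.GetGuiResources(hproc, GR_GDIOBJECTS)    # GDI objects
    usr = user32.GetGuiResources(hproc, GR_USEROBJECTS)   # USER objects
    kernel32.CloseHandle(hproc)

    print(f"kernel handles: {n.value}")
    print(f"GDI objects:    {gdi}  (default per-process quota: 10,000)")
    print(f"USER objects:   {usr}")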
MartinPC
Posts: 28
Joined: 7 Aug 2018

Post by MartinPC

[T]he best solution would be to use filter settings instead of folder pairs. Zenju, 01 Jul 2022, 11:25
First, I really appreciate the prompt response!

I would love to be able to use filters (especially exclusion filters) instead of "too many" folder pairs. Unfortunately, my "100-folder-pair" batch job is one of many that I designed to be runnable with RealTimeSync, and RealTimeSync doesn't support filters. The parent folder of most of the folder pairs in that batch job also contains OTHER programs' folders whose content is CONTINUOUSLY CHANGING. So if I used filters instead of numerous folder pairs, RealTimeSync would trigger "filtered-batch-job" runs CONSTANTLY whenever those other programs are in use, even though nothing in their continuously changing folders would ever get synced (because they are filtered out of the FFS batch job).

But wait. Are you suggesting that if I used inclusion filters in lieu of folder pairs, it might NOT trigger those mind-numbing "GDI-handle delays"? If so, I suppose I could use a RTS task to monitor all 100 folder pairs and to trigger the "filtered FFS batch job" whenever a change is detected. It would double the amount of RTS/FFS "editing work" I'd have to do, but if it made for substantially faster syncs, it might nonetheless be worth it.

Still, though, I'd greatly prefer it if RTS tasks could automatically import filters from the FFS batch jobs they're based on. I'd much rather be able to, for example, sync all of ...\AppData\Roaming and filter out the small minority of problematic subfolders (e.g., browsers and LibreOffice) than have to create dozens of folder pairs for all of the subfolders that aren't problematic. But again, with no background in coding, I have no idea whether this is practical and worthwhile in terms of resource usage, or even if it's technically feasible. I'm just sayin' it seems like it would be pretty nice!
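
For anyone more code-savvy than I am: I imagine a small helper script along these lines could pull the folder-pair paths out of a .ffs_batch file for pasting into an RTS task. (The XML tag names are just my guess from opening one of my batch files in a text editor, so treat this as a rough sketch, not gospel.)

    # pairs_to_list.py - sketch: list the left-side folder paths from a
    # .ffs_batch file, to seed a RealTimeSync task's watch-folder list.
    # Tag names ("Pair", "Left") are guesses; adjust to your actual file.
    import sys
    import xml.etree.ElementTree as ET

    root = ET.parse(sys.argv[1]).getroot()
    folders = []
    for pair in root.iter("Pair"):
        left = pair.find("Left")
        if left is not None and left.text:
            folders.append(left.text)

    for f in dict.fromkeys(folders):  # de-duplicate, keeping order
        print(f)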

I suppose I could always just chop the 100-folder-pair batch job into four 25-folder-pair batch jobs or five 20-folder-pair jobs, but it's not a very appealing prospect. I already have 24 distinct batch jobs to back up key files from my system drive (distinct from imaging), and I keep four copies of each of those, one for each of the different backup drives I maintain. The thought of having an additional 12 or 16 batch jobs to wrangle makes me feel tired!
MartinPC
Posts: 28
Joined: 7 Aug 2018

Post by MartinPC

You can check "handles" with a program like ... Process Explorer). therube, 01 Jul 2022, 16:15
Thanks! I already have Process Explorer on board and will use it to look at FFS's handles the next several times I run FFS — both immediately after a reboot and after Windows has been running for a while. I appreciate the suggestion.
MartinPC
Posts: 28
Joined: 7 Aug 2018

Post by MartinPC

INTERIM UPDATE:

I don't yet have enough data points to say much useful about GDI handles, other than that my "big" batch job seems to show a little over 500 handles for FreeFileSync x64 in Process Explorer shortly after a computer reboot, and, other things being equal, over 1500 handles when the computer has been running for many hours. I don't understand why this would be the case.

On the non-GDI front, as an experiment I made a modified copy of my "100-folder-pair" batch job, using exclusion filters instead of inclusionary folder pairs for my AppData\Roaming folders. That brought the total number of folder pairs in the job down from 103 to 31. When my computer has been running for many hours, it takes FFS's GUI around 15–20 seconds to parse and load the "filtered" job, versus 90–120 seconds for the "inclusionary-folder-pairs" job. That's a pretty big improvement, but it means that I can't make a usable RealTimeSync task for the job the quick and easy way. I would have to create an RTS task with nothing but inclusionary folder pairs — which is tedious but doable.
100 folder pairs is a lot. The only way to handle such huge lists, would be for FFS to program a custom folder pair control. Not really an option. Zenju, 01 Jul 2022, 11:25
Is it really, really, REALLY not an option? Because reducing the duration and frequency of those "not responding" messages in FFS's GUI would be really, really, REALLY nice! ;-) This problem is bothering me on a laptop with a fast, recently optimized 4-lane NVMe SSD with 260GB of free space, a 10th-generation i7 CPU, and typically at least 6GB of free dual-channel DDR4 RAM to spare. Okay, so my external backup drives are all UASP USB 3.0, but still, I feel frustrated by FreeFileSync's performance when my computer has been up and running for a while (which is usually the case).

I'm still wondering, incidentally, whether this is a problem specific to Windows as opposed to Linux and MacOS. Was I on the right track in suspecting that Microsoft does a poor job of "garbage collection" [or consolidation/integration of continual changes in certain resources] between boots?
MartinPC
Posts: 28
Joined: 7 Aug 2018

Post by MartinPC

I'm not a programmer, unfortunately, so I don't have a very good idea of how to systematically look for bottlenecks when performance issues like this come up. That said, I do have some newish observations:

• When my computer is fresh off a reboot, and no foreground programs have been launched (with the exception of FFS), it takes FFS's GUI around 1.5 seconds to parse and load the "filtered" version of my biggest batch job (which has only 31 folder pairs compared to 103 in the "unfiltered" version). When the computer has been running for over 120 hours, with various foreground programs (including FFS) launched, used, and exited in that time, it takes FFS's GUI a full 60 seconds to parse and load that same batch file. In fact, after that much uptime, all of my batch jobs take around 2 seconds per folder pair to get parsed and loaded.

• Regarding GDI handles used by FreeFileSync_x64.exe, I'm not seeing a clear correlation with slower batch-job parsing. Shortly after a fresh boot, the process might start out using around 450 handles and perform normally (reasonably quickly). A day later, it might be up to >1700 handles and perform poorly. Still later in that same Windows session, it might be back down to around 700 handles and still perform just as poorly.

• I only just started experimenting with Sysinternals' Process Monitor (filtered to display only events from processes whose names contain "freefilesync"). I've exported the 18,000-plus lines of results to a CSV file and imported them into LibreOffice Calc in the hope that I might be able to identify which kinds of operations are taking the most time. I don't have a lot of big chunks of time to do this kind of work, but I'll do my best to squeeze some in over the coming month and post any pertinent-looking findings.
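
(For what it's worth, the tally I'm attempting in Calc amounts to something like the rough script below, cobbled together from web examples. It assumes Procmon's Duration column was enabled under Options > Select Columns before exporting; no promises about the exact header names.)

    # procmon_tally.py - sketch: total time per operation type from a
    # Process Monitor CSV export (requires the optional Duration column).
    import csv
    from collections import defaultdict

    totals = defaultdict(float)   # operation -> total seconds
    counts = defaultdict(int)

    with open("Logfile.CSV", newline="", encoding="utf-8-sig") as f:
        for row in csv.DictReader(f):
            op = row["Operation"]
            totals[op] += float(row.get("Duration") or 0)
            counts[op] += 1

    for op, secs in sorted(totals.items(), key=lambda kv: -kv[1]):
        print(f"{op:<28} {counts[op]:>7} events  {secs:>9.3f} s")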

In the meantime, any tips on using Process Monitor to best effect would be appreciated!
therube
Posts: 426
Joined: 8 May 2006

Post by therube

It sounds as if your hardware is certainly capable.
Could it be something as simple as a drive sleeping, taking time to awaken?

(I'd not heard of UASP before.)
Microsoft's failure to do adequate "garbage collection" between reboots
I rarely ever actually reboot. (Months or more at a time.)
I do sleep. (And awakening, to a fully ready state, takes only a few seconds.)
(I run Win7.
Not any 10 / 11. Who knows what goes on with those beasts.)
So I would think (I would hope) that GC would not be an issue in Win 10/11.
"handles"
I'm pretty sure 10K is the default per-process limit on GDI handles.
So if you're nowhere close to that, then it might be something else.
So at 700 or 1700, you would think it wouldn't be an issue.
(I have seen a program in development that did eat 10K, & that certainly did cause issues.)
that same parsing takes 90–120 seconds. I noticed similar "inflated parsing delays" in some of my other "larger" (in terms of folder pairs) batch jobs.
Maybe post your settings for said .ffs_batch & describe the hardware of the source & target.

(After your system has been running) if you were to simply do a DIR /S on the source tree, & then on the target, do they both complete speedily?
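
(Or the same idea in a few lines of Python, if timing DIR /S by eye is too crude. Walking the tree exercises roughly the same metadata lookups; just a quick sketch.)

    # walk_time.py - sketch: time a full directory-tree walk
    # (a crude analogue of timing DIR /S).
    import os, sys, time

    root = sys.argv[1]
    t0 = time.perf_counter()
    ndirs = nfiles = 0
    for _, dirnames, filenames in os.walk(root):
        ndirs += len(dirnames)
        nfiles += len(filenames)
    print(f"{root}: {ndirs} dirs, {nfiles} files "
          f"in {time.perf_counter() - t0:.1f} s")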

One of the computers on my network is an old XP system. And for whatever reason, network connection speed between my particular computer & that XP system is very slow. I've never been able to figure that one out. Tried all types of drivers & settings of such, but always slow. (And of course that is most apparent when browsing large directories.) All other computers connect to the same XP system without issue. (I've not tried an entirely different NIC.)
MartinPC
Posts: 28
Joined: 7 Aug 2018

Post by MartinPC

@therube:

I really appreciate your reply and am sorry I didn't spot and respond to it earlier. (I'm a full-time family caregiver to an elderly parent with serious medical problems and don't have a lot of spare time and energy for "my own stuff"! ;-) I'm grateful for your input and will follow up on it as soon as I can. (I'm not sure exactly when that will be, but I will try not to let it sit for too long.)

Again, thank you, and all the best.