Exponential slowdown when dealing with large numbers of small files

Get help for specific problems
mike loeven
Posts: 2
Joined: 4 Nov 2018

Post by mike loeven • 14 Apr 2019, 22:21

I've been noticing some issues syncing a specific application's data folder between my desktop and laptop.

I'm running a backup with 8 threads. When the sync initially starts, it scans and analyzes about 15K files within a few seconds, but it then takes about 30 seconds to reach 25K and about 5 minutes to reach 30K, and the slowdown keeps increasing exponentially as the number of small loose files being scanned grows. I'm not sure what is causing the exponential performance loss, but a simple file scan operation should theoretically maintain a consistent scan rate regardless of the number of files in a directory, especially since the comparison mode uses size and date rather than actual file data.
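(Editor's note: the degradation pattern described above, where each additional batch of files takes longer than the last, is what you'd see if some per-file step re-traverses the directory listing. The sketch below is purely illustrative and says nothing about FreeFileSync's actual implementation; `scan_once` and `scan_naive` are hypothetical names used to contrast a linear one-pass scan with an accidentally quadratic one.)

```python
import os

def scan_once(path):
    """Linear scan: one pass with os.scandir; size/mtime come straight
    from each directory entry, so total work is O(n) for n files."""
    results = {}
    with os.scandir(path) as it:
        for entry in it:
            st = entry.stat()
            results[entry.name] = (st.st_size, st.st_mtime)
    return results

def scan_naive(path):
    """Accidentally quadratic scan: for every file, the whole directory
    listing is walked again to locate it. With n files this performs n
    full listings, i.e. O(n^2) work -- cheap at 15K entries, crippling
    at 65K, which matches the "slows to a crawl" symptom."""
    results = {}
    for name in os.listdir(path):
        # Hypothetical per-file lookup that re-lists the directory each time.
        for other in os.listdir(path):
            if other == name:
                st = os.stat(os.path.join(path, name))
                results[name] = (st.st_size, st.st_mtime)
                break
    return results
```

Both functions return the same size/date metadata; only the cost differs, which is why the problem only shows up once a directory holds tens of thousands of small files.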

User avatar
Zenju
Site Admin
Posts: 4827
Joined: 9 Dec 2007

Post by Zenju • 16 Apr 2019, 17:34

Hard to tell in general. How are other tools behaving? As always, you need to consider caching effects, which could already explain this.

wm-sf
Posts: 71
Joined: 13 Nov 2003

Post by wm-sf • 16 Apr 2019, 18:26

I have had exactly the same problem since I installed 10.11 yesterday. Directories with many files bring it to a standstill. I noticed this most obviously on Google Chrome's \Code Cache\js directory, which contained 65K or so very small files. I trimmed that last night to under 30K, but my mirror this afternoon hadn't completed after more than an hour (it normally takes about 10 minutes), and it was very obviously stepping through the files in that directory one by one. I'm on Win 8.1, and no other system problems have suddenly appeared in the last day or so. It's probably best for me to go back to 10.8 (the last version I was using), which didn't display this problem (I kept the install file).