The problem case and suggested resolution is well stated in the original post, so here I will simply quote that suggested resolution for reference (abbreviated), outline the specific use case I have, and then explain why this feature would save me large amounts of time, noting some possible implementation variations as food for thought.
I often find myself having to resolve conflicts for files that are actually identical, simply because the timestamp is different. To resolve this, I think it would be useful to include a new comparison option in FFS that would combine the advantages of both current comparison methods:
- Compare using timestamp and length as the criterion.
- For those files with the same length but different timestamps (including conflicting timestamps), and only for those, proceed to compare the file contents.
With this procedure, only a relatively small number of files would actually have to be compared by content, which would speed up the comparison tremendously. The feature could be integrated into the original difference-detection scan as a compare option (e.g. "Compare contents for files with same length and different timestamps"), or run as a subsequent pass that compares all files with different timestamps and the same length. I'd prefer the former.
Note 1: This is a middle ground between comparing only by timestamp+length and comparing all files by contents.
Note 2: The intent is to identify files with different timestamps but identical content without having to compare contents for all files, and to then either visually deprecate (e.g. grey-out) or hide those files which differ only in timestamp. When synchronizing, these files would only update the timestamp in one direction or the other.
Note 3: It's not so much about reducing transfer times (though that would happen), but about reducing overall time to determine material differences and make an intelligent and accurate determination about which file needs to move in which direction.
This conflict occurs routinely for me because (a) I use a version control system (git) that sets timestamps to the current time whenever I switch branches; and (b) for efficiency in development testing, I routinely copy new build artifacts directly to remote systems so I can immediately restart and test them, and those systems are later fully synced with FFS. This feature would also eliminate 99% of the conflicts I encounter working across a home computer, a work computer, and a laptop, all synced via central cloud-based servers.
Currently I have to resolve these conflicts manually, one at a time, by launching an external tool on each file that might be identical. This is time-consuming, tedious, and error-prone (it's easy to miss some files and unnecessarily compare others), and worse, I can't do anything else productive while working through them. Usually only a handful of files are actually different, and sometimes that handful is scattered among 100+ other conflicted but identical files.
For some systems I am trying to detect actual differences across over 5,000 files in order to build a patch for an older system -- meaning I need an accurate accounting of all files that actually changed since the last release, excluding those where only the VCS changed the timestamp. This use case is especially time-consuming because I work remotely and the comparison targets are on remote systems using Windows file systems; eliminating content comparison for upwards of 90% of the files, because they have the same timestamp and length, would be a huge time saving.
I cannot overstate how beneficial this feature would be for my workflow and, I believe, for everyone who works remotely.