Improving Backup Workflow (Mirror mode vs Two-way mode)

Posts: 9
Joined: 18 Jun 2018

citizen8

Hi, I am not able to try it at the moment, but should the two-way mode increase scanning speed because it uses the database file, or is it the same as Mirror mode for very large folders?
Last edited by citizen8 on 19 Jun 2018, 08:41, edited 1 time in total.
Posts: 2288
Joined: 22 Aug 2012

Plerry

I would expect a two-way compare to be about equal in speed to, or marginally slower than, a "normal" compare.
The entire file and folder tree still needs to be read. The difference is that the current state of the left and right locations is first compared against the file and folder tree status at the end of the previous FFS sync, as stored in the respective local databases, and those two per-side outcomes are then compared to derive the required actions.
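Conceptually it works along these lines (a deliberately simplified Python sketch of the idea only, not the actual FFS implementation or database format):

```python
import os
from enum import Enum

class Change(Enum):
    UNCHANGED = "unchanged"
    CREATED = "created"
    MODIFIED = "modified"
    DELETED = "deleted"

def scan(root: str) -> dict:
    """Full scan of one side: every folder still has to be read once."""
    result = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            full = os.path.join(dirpath, name)
            result[os.path.relpath(full, root)] = os.path.getmtime(full)
    return result

def classify(current: dict, last_sync_db: dict) -> dict:
    """Compare one side's current scan (path -> mtime) against the state
    recorded at the end of the previous sync."""
    changes = {}
    for path, mtime in current.items():
        if path not in last_sync_db:
            changes[path] = Change.CREATED
        elif mtime != last_sync_db[path]:
            changes[path] = Change.MODIFIED
        else:
            changes[path] = Change.UNCHANGED
    for path in last_sync_db:
        if path not in current:
            changes[path] = Change.DELETED
    return changes

def decide(left: Change, right: Change) -> str:
    """Combine the two per-side classifications into a sync action."""
    if left == Change.UNCHANGED and right == Change.UNCHANGED:
        return "nothing to do"
    if right == Change.UNCHANGED:
        return "propagate left change to right"
    if left == Change.UNCHANGED:
        return "propagate right change to left"
    return "conflict: changed on both sides"
```

So the database only tells FFS which side changed; it does not spare it the scan itself.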
Posts: 9
Joined: 18 Jun 2018

citizen8

Thanks for your reply. I have changed the title according to what I am interested in learning. As a developer, I am wondering whether there is a good workflow for doing an incremental backup to an external HD without having to scan folders that have not changed since the previous backup every time. Let's assume I have an external HD of 100 GB and I do 2 or more backups every day. Currently I do a Mirror from my local folder (FFS left side) to my external one (FFS right side). Since I don't change a lot of files on a daily basis, scanning the whole local folder every time is time consuming, especially when all I need to update is two papers, some project code, a few photos, and maybe a movie. So I was thinking, as an alternative workflow, of syncing only a subset of the current local folder: the things I really need to update while working locally. But that way, I think that after a while of incremental updates I will end up with a huge subset that I will eventually need to reorganize into the external folder, which will cost even more time.


I am interested to understand, maybe also from Zenju, what your workflow is for backing up local changes incrementally without scanning things that have not changed. From a developer's point of view, I understand that FFS needs to scan every folder to know what changes have been made, but isn't there a better way to do this? If I have 10 folders and have made changes in only 5 of them, what is the point of scanning all the remaining folders? Wouldn't it be more efficient to skip folders that don't have at least one change, based on the timestamp of that folder and its child folders?
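To illustrate what I mean, something like this rough Python sketch (just the idea, not a proposal for FFS itself; the comment notes the limitation of folder timestamps, which may be exactly why FFS doesn't do this):

```python
import os

def scan_changed_only(root: str, last_dir_mtimes: dict) -> list:
    """Walk the tree but skip descending into folders whose recorded
    modification time has not changed since the previous run.

    Caveat: on common filesystems (NTFS, ext4, ...) a folder's mtime
    only changes when entries are added, removed, or renamed directly
    inside it. Editing a file's content, or changing anything deeper
    in the subtree, does not touch the parent folder's mtime, so this
    shortcut would silently miss those changes."""
    changed_files = []
    for dirpath, dirnames, filenames in os.walk(root):
        rel = os.path.relpath(dirpath, root)
        mtime = os.path.getmtime(dirpath)
        if last_dir_mtimes.get(rel) == mtime:
            dirnames[:] = []   # prune: do not descend into this subtree
            continue
        last_dir_mtimes[rel] = mtime
        changed_files.extend(os.path.join(dirpath, f) for f in filenames)
    return changed_files
```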

Thanks for your time in advance.
Posts: 9
Joined: 18 Jun 2018

citizen8

I have figured out how to improve my backup workflow. Essentially, I use FFS in Mirror mode to do an incremental backup from a local subset folder to the corresponding subset folder on the external HD. Then, to keep the content consistent between the superset folder and the subset folder on the external HD, I run FFS again, this time in Update mode. Because the subset folder on the external HD shares the same folder paths as the superset, after months of incremental Mirror updates I don't need to manually reorganize the subset into the corresponding superset folder; I just run an FFS Update and it's done. If you have better ways to do this, I would like to hear them.
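In case it helps anyone, this is roughly how I chain the two steps so they always run in the same order (a small Python sketch; the FreeFileSync path and the .ffs_batch job names are placeholders for my own setup, and both jobs were saved from the FFS GUI beforehand):

```python
import subprocess
import sys

# Placeholder paths: adjust to your own installation and batch jobs.
FFS_EXE = r"C:\Program Files\FreeFileSync\FreeFileSync.exe"

# Step 1: Mirror the local subset folder onto the external HD subset.
# Step 2: Update-merge the external subset into the external superset.
BATCH_JOBS = [
    r"D:\Backup\mirror_local_subset_to_external.ffs_batch",
    r"D:\Backup\update_external_subset_to_superset.ffs_batch",
]

for job in BATCH_JOBS:
    # FreeFileSync accepts a batch-job file as a command-line argument
    # and runs it; a non-zero exit code generally indicates warnings,
    # errors, or an aborted run.
    result = subprocess.run([FFS_EXE, job])
    if result.returncode != 0:
        print(f"Sync job failed or had warnings: {job}", file=sys.stderr)
        sys.exit(result.returncode)

print("Both sync steps completed.")
```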