Keep previous versions of files

Discuss new features and functions
Posts: 24
Joined: 25 Nov 2009

bkeadle

I should add: the way you put it, "Recycle bin for USB and network shares"
does sound attractive, but again, it seems an arbitrary way of keeping
backups, and risks deleting the only backup of a (random) file.
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

I'm beginning to see your point: We're dealing with two similar but different
scenarios:
1. The user is manually managing his files and, being human, makes mistakes. So he needs a way to quickly undo a deletion via the Recycle Bin.
2. File management is handled by a synchronization tool. Conceptually this can be seen as one layer above 1: since the individual operations are guided by rules (automatic, mirror sync) and sync is automated, there is less demand for a facility to undo an accidental deletion. Further, sync is initiated at regular times, when the user considers his data consistent. So the demand is more for keeping backups of different versions than for undoing ill-considered deletions.

A "limit by total size" respects limited disk space more than the implicit
knowledge that more recent versions contain more relevant data. Considering
today's big backup hard drives, this may not be the right tradeoff anymore.

This leaves "limit revision count" and "by date". The former doesn't limit the
total size directly; it still scales with the total size of the user's backup
data. E.g. setting a revision count of 10 limits the total size of the
revisions directory to roughly 10 x the user data size (assuming the file
sizes stay the same). It will also keep at least one version per file. The
overall semantics look quite useful.
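
To make these semantics concrete, here is a minimal VBScript sketch of such a
per-file revision-count limit (illustrative only, not FFS code):

    ' Keep the newest MaxRevisions revisions of one file, delete the rest.
    Const MaxRevisions = 10
    Dim fso: Set fso = CreateObject("Scripting.FileSystemObject")

    Sub PruneByCount(revPaths)   ' array of one file's revision paths, newest first
        Dim i
        For i = MaxRevisions To UBound(revPaths)
            fso.DeleteFile revPaths(i)
        Next
    End Sub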

For "limit by x days" there are two variants: I) Apply to each file after
sync. This will delete all revisions of a particular file if it was not
updated in the recent x days. This seems to be similar a behavior like Recycle
Bin: Ensure to be able to recover data within a limited time frame (for
recycler the time frame is implicitly defined by its size and number and size
of new deletions) From a backup perspective it's less useful as you generally
may not find any "old" versions.

II) Apply only to newly revisioned files: This ensures there will always be at
least one old version per file, similar to "limit revision count". On the
other hand there is no (implicit) limit on the total size of revisioned data.
Large collections of old revisions will not be cleaned until a new revision is
added for a particular file.
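
For comparison, a rough VBScript sketch of the two variants (illustrative
only, not FFS code; AgeInDays and MaxAgeDays are made up for the example):

    Const MaxAgeDays = 30                     ' illustrative "x days" limit
    Dim fso: Set fso = CreateObject("Scripting.FileSystemObject")

    Function AgeInDays(path)
        AgeInDays = DateDiff("d", fso.GetFile(path).DateLastModified, Now)
    End Function

    ' Variant I: applied to every file after sync; may delete ALL revisions
    ' of a file that simply hasn't changed within MaxAgeDays.
    Sub PruneVariantI(revPaths)               ' array of one file's revisions
        Dim i
        For i = 0 To UBound(revPaths)
            If AgeInDays(revPaths(i)) > MaxAgeDays Then fso.DeleteFile revPaths(i)
        Next
    End Sub

    ' Variant II: applied only when a new revision is added, so at least one
    ' old version always survives; the total size, however, is unbounded.
    Sub PruneVariantII(revPaths, newRevisionAdded)
        If newRevisionAdded Then PruneVariantI revPaths
    End Sub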

So far a "limit revision count" seems to offer the greatest advantages and the
least drawbacks.
Posts: 24
Joined: 25 Nov 2009

bkeadle

Well said. On your latter point, it seems that II) is the best option.
Posts: 2453
Joined: 22 Aug 2012

Plerry

My earlier suggestion to use an AND condition of X-versions and Y-days:

> Versions only get deleted if they are at least Y-days old and if there are
at least X newer versions.


applied to each file after sync seems to allow the benefits of both option I)
and II) above.
* setting X to 0 would effectively give option I) :
all revisions older than Y-days will get deleted
(X=0 AND Y=0 might need to be flagged as deleting all / preventing any
revisions)
* setting X to any integer >0 would give option II) :
at least one most recent previous version is available (if the file was ever
changed ...)
(Y=0 would just keep the latest X versions, if any)
A disadvantage for X>0 and Y>0 might be that the size explodes for frequently
changing files.
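
Expressed as a single test per revision, this combined rule looks roughly like
the sketch below (hypothetical; X, Y and newerCount are illustrative
parameters):

    ' Delete a revision only if it is at least Y days old AND at least
    ' X newer revisions of the same file exist.
    Function ShouldDelete(ageInDays, newerCount, X, Y)
        ShouldDelete = (ageInDays >= Y) And (newerCount >= X)
    End Function

    ' X=0        -> option I : everything older than Y days is deleted
    ' X>0, Y=0   -> keep only the newest X revisions
    ' X>0, Y>0   -> option II, plus an age floor on what may be removed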

Plerry
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

For v5.7 I've implemented the limit on revision count which has received the
greatest consensus. Here is the beta for testing:
[404, Invalid URL: http://freefilesync.sourceforge.net/FreeFileSync_5.7_beta_setup.exe]
Posts: 7
Joined: 12 Oct 2010

wolfcry

Looks like I am late to the party... I have created a script that purges the
deletion folder, keeping X copies and maintaining a minimum time gap between
copies. The gaps are defined by the command below:

TimeGaps=Array(1,2,3,4,8,16,32,90)

' Array that defines the time gaps for the backup copies
' Default to:
' The 1st copy is older than 1 day but younger than 2 days
' The 2nd copy is older than 2 days but younger than 3 days
' ...
' The 5th copy is older than 8 days but younger than 16 days
' For all cases, if there is more than one copy within the range, we keep the
oldest copy.

The unit of measurement may be days, hours, minutes, etc. Below is the code:
    Option Explicit
   
    ' Unit of measurement: "d" for time gap in Days, "m" for Months, "h" for Hours, "n" for Minutes
   
    Const stUOM="h"
   
   
    ' Array that defines the time gaps for the backup copies
    ' Default to:
    ' The 1st copy is older than 1 day but younger than 2 days
    ' The 2nd copy is older than 2 days but younger than 3 days
    ' ...
    ' The 5th copy is older than 8 days but younger than 16 days
    ' For all cases, if there is more than one copy within the range, we keep the oldest copy.
   
    TimeGaps=Array(1,2,3,4,8,16,32,90)
   
   
    ' Keep a minimum number of copies, regardless of timegap
    Const MinNumOfCopies=3
   
   
    'Delete files older than 32 days, regardless of # of copies to keep
    Const DeleteExpiredCopies=TRUE
   
   
    Dim f , LogFile, CSVFile
    Dim FirstRun
   
    'index starts from 0
    Dim TimeGapsIndex
    Dim TimeGaps
    Dim NoOfCopiesToDelete
    Dim CopiesDeleted
    Dim bAlreadyKeptACopy
   
    Const Simulate=FALSE
    Const logCSV=TRUE
    Const logLOG=FALSE
   
    ' This script complements FreeFileSync by purging the files deleted to the user-defined directory.
    ' Credit to [url]http://sogeeky.blogspot.com/2006/08/vbscript-using-disconnected-recordset.html[/url] where I learned ador.recordset
    ' and [url]http://www.scriptinganswers.com/forum2/forum_posts.asp?TID=2099&PN=1[/url] where I learned how to delete the files.
    ' and [url]http://www.tek-tips.com/viewthread.cfm?qid=1472748&page=4[/url] on processing named arguments
    '
    ' Parameters
    '
    ' /Path: The user-defined directory defined within FreeFileSync
    '
   
    Dim fso, startFolder, rs
    Set fso = CreateObject("Scripting.FileSystemObject")
   
    Dim Args
    'Process Argument
        set Args = wScript.Arguments.Named
           
        If Args.Exists("Path") Then
            startFolder= Args.Item("Path")
        else
            ' Default path to folder where the script is.
            startFolder= fso.GetParentFolderName(Wscript.ScriptFullName)
        end if
   
    Set Args=Nothing
   
    set rs = createobject("ador.recordset")
    '  Constants for ador.recordset
    Const adVarChar = 200
    Const adWVarChar = 202
    Const adDate = 7
    Const adBSTR = 8
    Const adDouble = 5
    Const MaxCharacters = 255
    Const adNumeric=131
   
    with rs.fields
        .append "FileNameFullPath",adWVarChar , MaxCharacters
        .append "FileName",adWVarChar , MaxCharacters
        .append "FileAge",adDouble
    end with
    rs.open
   
    Const ForReading = 1, ForWriting = 2, ForAppending = 3
    Const Tristate=-1 ' 0=ASCII, -1: Unicode, -2: system default
   
    if logLOG then Set LogFile = fso.OpenTextFile(startFolder&"\DelOldFileStep.log", 8, True, Tristate)
    if logCSV then Set CSVFile = fso.OpenTextFile(startFolder&"\DelOldFileStep.CSV", 8, True, Tristate)
   
    For TimeGapsIndex=0 to UBound(TimeGaps)
        if logCSV then CSVFile.Write(TimeGaps(TimeGapsIndex)&":" )
    Next
   
    if logCSV then CSVFile.Writeline()
   
   
    '---------------------- Collect file records------------------
    FirstRun=True  'used in GetFilesRecords
    GetFilesRecords startFolder
   
    rs.Sort = "FileName,FileAge DESC,FileNameFullPath DESC" ' DESC/ASC
   
    '----------------------- Off load records for debug ---------------
   
    rs.MoveFirst
   
    Do Until rs.EOF
        if logCSV then CSVFile.Writeline(rs.Fields.Item("FileName") &vbtab &rs.Fields.Item("FileAge")&vbtab & _
            left(rs.Fields.Item("FileNameFullPath"),len(rs.Fields.Item("FileNameFullPath"))-len(rs.Fields.Item("FileName")) ))
        rs.MoveNext
    Loop
    if logCSV then CSVFile.Writeline
   
    '----------------------- Purge files---------------
   
    f=""
    rs.MoveFirst
   
   
    TimeGapsIndex=UBound(TimeGaps)
   
    if logCSV then CSVFile.WriteLine("Purge "&startFolder&" on"&vbtab&Date&vbtab&time)
   
   
    Do Until rs.EOF
           if NOT (rs.Fields.Item("FileName") = f) then
            if DeleteExpiredCopies then  ' Should files older than the oldest gap be deleted?
                NoOfCopiesToDelete=DeleteFileCount(MinNumOfCopies,TimeGaps(UBound(TimeGaps)))
            else
                NoOfCopiesToDelete=DeleteFileCount(MinNumOfCopies,0)'
            End if
           
            if NoOfCopiesToDelete > 0 then
                f=rs.Fields.Item("FileName")
                TimeGapsIndex=UBound(TimeGaps)
                bAlreadyKeptACopy=FALSE
                CopiesDeleted =0
                'Do not move next so that the first entry is tested too
                'rs.MoveNext
            End if       
        end if
        if  NoOfCopiesToDelete>0 then       
            if rs.Fields.Item("FileAge")>=TimeGaps(TimeGapsIndex) then
                if logCSV then CSVFile.Write(f &vbtab & rs.Fields.Item("FileAge") &vbtab & left(rs.Fields.Item("FileNameFullPath"),len(rs.Fields.Item("FileNameFullPath"))-len(f) ))
                if bAlreadyKeptACopy then
                ' If we already kept a copy for this range...
                    if NoOfCopiesToDelete> CopiesDeleted then
                        if NOT Simulate then fso.DeleteFile(rs.Fields.Item("FileNameFullPath"))
                        CopiesDeleted =CopiesDeleted +1
                        if logCSV then CSVFile.Write(vbtab&"deleted "&CopiesDeleted& "/" &NoOfCopiesToDelete)
                    End if
                else
                    ' Delete all files older than the oldest time gap if DeleteExpiredCopies=TRUE
                    if ((TimeGapsIndex)=UBound(TimeGaps) AND DeleteExpiredCopies) AND NoOfCopiesToDelete> CopiesDeleted then
                        bAlreadyKeptACopy=FALSE
                        if NOT Simulate then fso.DeleteFile(rs.Fields.Item("FileNameFullPath"))
                        CopiesDeleted =CopiesDeleted +1
                        if logCSV then CSVFile.Write(vbtab&"deleted "&CopiesDeleted& "/" &NoOfCopiesToDelete)
                    else
                        bAlreadyKeptACopy=TRUE
                        if logCSV then CSVFile.Write(vbtab&"Keep Index "&TimeGapsIndex& " "&TimeGaps(TimeGapsIndex))
                    End if               
                end if
                rs.MoveNext
                if logCSV then CSVFile.Writeline()
            else
                if TimeGapsIndex > 0 then
                    TimeGapsIndex=TimeGapsIndex-1
                    bAlreadyKeptACopy=FALSE
                end if
            end if       
        End if   
    Loop
   
    if logCSV then CSVFile.WriteLine("Purge "&startFolder&" ended on "&Date&" "&time)
    if logLOG then LogFile.WriteLine("Purge "&startFolder&" ended on "&Date&" "&time)
    if logLOG then LogFile.Close
    if logCSV then CSVFile.Close
    set fso=Nothing
    set rs=Nothing
   
   
    '----------------------
    Function  DeleteFileCount(ByVal NoOfCopies,ByVal MaxAge)
    Dim BookMark
    Dim iCount
    Dim ExitLoop
    Dim refFile
    ExitLoop=False
   
    iCount =0
    Bookmark=rs.Bookmark
    refFile=rs.Fields.Item("FileName")
   
    'Assume EOF when function ends
    DeleteFileCount=0
   
    Do Until rs.EOF  OR ExitLoop
        if (rs.Fields.Item("FileName") = refFile)  then
            iCount=iCount+1
            if (MaxAge> 0 AND rs.Fields.Item("FileAge") > MaxAge )  then iCount=iCount+1
            rs.MoveNext
        else
            if iCount > NoOfCopies then 
                rs.Bookmark=Bookmark
                ExitLoop=TRUE
                DeleteFileCount=iCount-NoOfCopies
                if DeleteFileCount < 0 then DeleteFileCount=0
            else
                iCount =0
                DeleteFileCount=0
                Bookmark=rs.Bookmark
                refFile=rs.Fields.Item("FileName")
            end if
        End if
    loop
    End Function
   
   
   
    '------------------------
    Function GetFilesRecords(folderName)
    Dim folder, file, fileCollection, folderCollection, subFolder
    Dim FileRelPath
   
        Set folder = fso.GetFolder(folderName)
        Set fileCollection = folder.Files
       
        if NOT (startFolder=folderName) then
                For Each file In fileCollection
                FileRelPath=right(file.Path,len(file.Path)-len(startFolder)-2)
                if len(FileRelPath) > 15 then
                rs.addnew
                rs("FileNameFullPath").Value=CStr(file.Path)
                rs("FileName").Value=CStr(right(FileRelPath,len(FileRelPath)-instr(FileRelPath,"\")+1))
                rs("FileAge").Value=DateDiff(stUOM,file.DateLastModified,Now)/24
                rs.update
                end if
   
                Next
        end if
   
        Set folderCollection = folder.SubFolders
        For Each subFolder In folderCollection
        ' Add a simple check to ensure that the start folder is correct.
        ' FreeFileSync folder is named as yyyy-mm-dd tttttt
        If FirstRun AND Not mid(subFolder.Path,len(subFolder.Path)-6,1)=" " then
            wscript.echo subFolder.Path&" does not look like a folder from FreeFileSync"
            wscript.quit
        else
            FirstRun=False
        End if
            GetFilesRecords subFolder.Path
        ' Delete empty folders
            If fso.getfolder(subFolder.Path).SubFolders.Count = 0 AND fso.getfolder(subFolder.Path).Files.Count = 0 Then
                fso.DeleteFolder(subFolder.Path)
        End If
   
        Next
    End Function

Anonymous

Hi, I have a question about the new behavior of versioning in 5.7

If there are two configurations:

-moms-stuff: syncs /users/mom/pictures and moves older versions to /versions
-dads-stuff: syncs /users/dad/pictures and moves older versions to /versions

If I run both, FFS 5.7 will create a directory called /versions/pictures,
where mom's and dad's pictures will be mixed. It would seem more sensible to
create a directory for each sync configuration:

- /versions/moms-stuff/pictures
- /versions/dads-stuff/pictures

This is the same behavior as 5.6, without the timestamp.

Congratulations on this program: after trying many similar ones, I stopped
searching a very long time ago. FFS does exactly what I need.
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

> sensible to create a directory for each sync configuration


Likewise the user might want to put both into the same versions directory. By
not adding the jobname (which is not even available for a non-saved GUI
config) both scenarios can be fulfilled.

Anonymous

Would it make sense to have variables like these?

/versions/%job_name% %job_timestamp%

This would add flexibility.
Posts: 74
Joined: 17 Mar 2008

mfreedberg

I am seeing something odd with regards to file modified dates after migrating
to the 5.7 version and using the new versioning option. The new version is
being created in the right place, using the new (correct, as in original)
folder name with versions of the file there, but the versioned file does not
seem to have its original modified date intact, all dates on the file are now
the original created date.

Using PowerShell to get the lastwritedate for the versioned file:
(get-item "...:\backup\old
versions\2012-September\Documents\...\the_dream_was_the_same_every_n.txt
2012-09-08 203257.txt").lastwritetime

Sunday, January 30, 2011 11:58:44 AM

The modified file that was backed up:
(get-item "...\backup\...\Documents\...\the_dream_was_the_same_every_n.txt").l
astwritetime

Saturday, September 08, 2012 7:53:54 PM

Is this by design or an issue with the file copy/versioning routine? It does
not seem correct to me that the version-backup file no longer has its original
last-modified date. Is this a bug, or something FFS does by design? Changing
last-modified dates on existing files does not seem like a good idea to me.
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

> Would it make sense to have variables like these?


It's probably a sign that a thread has become too long, if items discussed at
the beginning are rediscovered at the end ;)

@mfreedberg:
It seems you're comparing the modification time of the revisioned file against
the modification time of the current active version, which is unrelated.
Generally FFS preserves file modification times.
Posts: 74
Joined: 17 Mar 2008

mfreedberg

@Zenju - the current active version does have the right created date and
last modified date, but the revisioned file does not. What do you see in your
tests with revisioned files?

What date *should* we see on the revisioned file? I totally agree that FFS
normally preserves file modification times, but I do not see that to be the
case for the revisioned file.

Maybe this should be a new thread (grin).
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

There is probably a misunderstanding as to what versioning does: When FFS
updates a file, it revisions the old file first, by moving it to a custom
folder, preserving its modification time. Then the new version is copied
over. This is the sequence of steps, at least conceptually. This means the
versioning folder never has the most current version, but only the
second-most current one.
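
Conceptually, the sequence looks like this in a minimal sketch (illustrative
paths, not FFS internals; the point is that a move preserves the modification
time):

    Dim fso: Set fso = CreateObject("Scripting.FileSystemObject")
    ' 1) move the outdated target file into the versioning folder,
    '    appending a timestamp; the move keeps its modification time
    fso.MoveFile "D:\target\file.txt", "D:\revisions\file.txt 2012-09-08 203257.txt"
    ' 2) copy the new version from the source over to the target
    fso.CopyFile "C:\source\file.txt", "D:\target\file.txt"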

Anonymous

> It's probably a sign that a thread has become too long



It's a sure sign of laziness and disrespect on my side, my apologies.

Regarding the %variables% issue, I liked them because using them in the
Deletion handling dropdown wouldn't clutter the UI and would be transparent to
those who did not need them. Nevertheless, I understand the arguments made for
simplicity.
Posts: 74
Joined: 17 Mar 2008

mfreedberg

thanks for the update on the approach:

> When FFS updates a file, it revisions the old file first, by moving it to a
custom folder, preserving its modification time. Then the new version is
copied over. This is the sequence of steps, at least conceptually. This means
the versioning folder never has the most current version, but only the
second-most current one.



The problem I am seeing on my system is that the old file, when moved to the
custom folder and also renamed, does not seem to be retaining its last
modified date. I definitely understand that it is the second most current
version of the file, but I am not sure that the process is successfully
retaining the last modified date.

I will do some more testing at home, but if you can also verify on your end
that files moved to the custom folder and renamed retain their last modified
date, that would be great!
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

> if you can also verify


I checked, it works fine in my tests.
Posts: 5
Joined: 12 Sep 2012

marquesjm

Hi all!

Sorry to bother, but I preferred the way FFS versioned files and folders in
version 5.6.
I could easily copy a full version structure over a full backed-up folder and
recover its complete state at a point in time.
Incrementing a version number in the filename subverts the original data,
since version 5.7 changes the original filename.

I would suggest that the new versioning give the user the option to choose
which way he wants to version his file changes.

I know it's difficult to please everyone, and for that reason I'm still using
5.6 instead of 5.7.
Anyway, as I see that you are great at accepting suggestions, here is mine.

I also want to give my congratulations for the amazing application that is
FFS. I've tried a lot of them and for me FFS is the Top.

Thank you all.

Best Regards

Jorge
Posts: 5
Joined: 12 Sep 2012

marquesjm

Sorry for my typos :\
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

I don't see how you could take advantage of the old (<5.7) versioning scheme:
If you want to restore a specific state, you would have to start with the
current version, and then manually copy over all version folders beginning
with the most recent down to the date you want to restore. Also this approach
would leave files behind that have been newly created in between "now" and the
date you are restoring.
It's possible I'm overlooking an important scenario, but right now I don't see
much of a functional loss. But clearly, the new scheme places even more
emphasis on single file revisioning.
Posts: 5
Joined: 12 Sep 2012

marquesjm

OK, in my options I have:

Mirror
Versioning

With mirror, the destination always equals the origin. With versioning, all
the files that are deleted or changed are moved to a version folder with a
date.

If I want to recover the full origin at a certain date for which I have a
version, I just need to copy the version folder over the full mirrored
version.

I have several schedules, one per day of the week, in order to have recovery
folders in versioning (5.6).

:)
Posts: 5
Joined: 12 Sep 2012

marquesjm

I'm using FFS as a file backup solution :)
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

> copy the version folder over the full mirrored version.


There are two types of versioning that need to be distinguished:
1. single file versioning: this is possible with both the old and the newer versioning scheme. The newer scheme however makes it easier since all versions are listed in the same folder.
2. restore all files for a given point in time: This is possible with neither the old nor the new versioning scheme. In the old scheme one might pull stunts like the one described in my last post, but essentially this is no real solution.

This is how it currently looks. 2 doesn't sound very good, but if there is a
better way, I'll probably implement it once I become aware of it.
Posts: 4
Joined: 4 Apr 2010

jobz

Hmm, just discovered the new versioning algorithm after upgrading to 5.7. I
guess it's too late to vote no.

I guess it's age, but I now hate change unless it brings significant benefits.
With the new versioning algorithm, you still can't properly restore because
there is no full backup to restore from, so what's the point?

Rant over. On a more constructive note, read up on 'DeLorean Copy' at
http://schinagl.priv.at/nt/ln/ln.html#deloreancopy. It may give you insight
into how to improve FreeFileSync further.
Posts: 1
Joined: 26 Aug 2008

billybuerger

Thought I would add that I upgraded to 5.7 and didn't think to check my batch
runs afterwards. I noticed a couple of weeks later that none of my versioning
was working: I was getting folders but no files. I have everything set up
using ffs_batch files that run on a schedule. I see the format of the XML
changed, which left things partially working, with no messages that I've
seen. I've just updated all my ffs_batch files and will be checking over the
next couple of days to make sure I didn't miss anything. Just something to be
aware of if you upgrade: make sure you open your files in FFS and save them
to reformat the XML. Or, as I did, save one and then manually update the rest
with the updated format.
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

> I guess it's age, but I now hate change unless it brings significant
benefits. With the new versioning algorithm, you still can't properly restore
because there is no full backup to restore from, so what's the point?


If you want to find all versions of a particular file, you now have them all
listed next to each other. So it's at least one improvement, without giving
up much.


> DeLorean Copy


Looks like what has been discussed here: viewtopic.php?t=1843


> Noticed a couple weeks later that all of my versioning wasn't working


This is unfortunately a very lame config migration bug.


> make sure you open your files in FFS and save them to reformat the xml


And this is the workaround until the fix in the next release.
Posts: 1
Joined: 11 May 2009

tgeri

Zenju: I think the problem with versioning currently is the limit: right now I can only limit the number of versions. The problem is if I have small files which change frequently and large files which change rarely. If I set a limit of 10, it is not OK: I will have 10 versions of the small file, but the oldest may be from yesterday. That is not enough. And I will have 10 versions of the large one, but the oldest may be a year old.

The other problem: if I create a file and then run FFS, it is mirrored, for example. Then I delete this file. It goes to the versioning folder. And that's it. It won't be deleted EVER. Right? If the user could set "keep versions for 2 weeks", then it would be deleted after 2 weeks. Hmm?

Br,
TG
Posts: 7
Joined: 12 Oct 2010

wolfcry

For your first problem, you may like to consider setting up two sync jobs, both with the same source and target folders. You can limit the first to files below a certain size with a version limit of, say, 50, and the second to files above that size with a version limit of, say, 5.


I did post a VB script that purges the folder based on the age of the file. For example, by defining TimeGaps=Array(1,2,8,16,32,90):

The 1st copy is older than 1 day but younger than 2 days, if indeed there is such a file
The 2nd copy is older than 2 days but younger than 8 days, if indeed there is such a file
The 3rd copy is older than 8 days but younger than 16 days, if indeed there is such a file
The 4th copy is older than 16 days but younger than 32 days, if indeed there is such a file
The 5th copy is older than 32 days but younger than 90 days, if indeed there is such a file

See viewtopic.php?t=1827&p=7447#p7447
Posts: 2
Joined: 20 Nov 2012

valeriy-egorov

Suggestion: make the deletion handling from version 5.6 available as a new type of "Deletion handling" = Backup.

example:
Suppose we have a file SyncName.ffs_gui with the following contents:
line 1) from \\Server\PC1\DirSync\*.* to \\PC1\DirSync
line 2) from \\Server\PC2\DirSync\*.* to \\PC2\DirSync
backup folder: \\Server\Backup

After the first synchronization (variant=mirror) files have changed as follows:
\\Server\PC1\DirSync\File1.ini - changed content
\\Server\PC1\DirSync\DirA\File2.bin - new file
\\Server\PC2\DirSync\File1.ini - changed content
\\Server\PC2\DirSync\DirA\File2.bin - file deleted

After synchronization (variant=mirror and "Deletion handling"=Backup) we get the files in the \\Server\Backup:
\\Server\Backup\SyncName_2012-12-10_204101\1\File1.ini
\\Server\Backup\SyncName_2012-12-10_204101\2\File1.ini
\\Server\Backup\SyncName_2012-12-10_204101\2\DirA\File2.bin


If a log file is created (\\Server\Backup\SyncName_2012-12-10_204101\SyncName.log), it is possible to do an automatic rollback using a batch file.
Example of a compact version of the protocol (listing only new files):
\\PC1\DirSync\DirA\File2.bin

Example of a full report:
B|1\
F|\\Server\PC1\DirSync\
T|\\PC1\DirSync\
C|File1.ini
N|DirA\File2.bin
B|2\
F|\\Server\PC2\DirSync\
T|\\PC2\DirSync\
C|File1.ini
D|DirA\File2.bin

Where:
B - backup path
F - from path
T - to path
C - changed file
N - new file
D - deleted file
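
To illustrate the idea, here is a hypothetical VBScript sketch (rather than a
batch file) that replays the full report above to roll back the target; the
paths come from the example, everything else is assumption:

    Const ForReading = 1
    Dim fso: Set fso = CreateObject("Scripting.FileSystemObject")
    Dim logDir, logFile, line, code, arg, backupSub, toPath
    logDir = "\\Server\Backup\SyncName_2012-12-10_204101\"
    Set logFile = fso.OpenTextFile(logDir & "SyncName.log", ForReading)
    Do Until logFile.AtEndOfStream
        line = logFile.ReadLine
        code = Left(line, 1)
        arg = Mid(line, 3)              ' text after the "X|" prefix
        Select Case code
            Case "B": backupSub = arg   ' backup subfolder of this pair, e.g. "1\"
            Case "F"                    ' source root; not needed to roll back the target
            Case "T": toPath = arg      ' target root to roll back
            Case "C", "D"               ' changed/deleted: restore the backed-up copy
                fso.CopyFile logDir & backupSub & arg, toPath & arg, True
            Case "N"                    ' new file: remove it again
                fso.DeleteFile toPath & arg
        End Select
    Loop
    logFile.Close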
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

I'm coming back to this issue and want to see if there isn't an improvement, maybe even a "solution" to:
[404, Invalid URL: https://sourceforge.net/p/freefilesync/feature-requests/315/]
[404, Invalid URL: https://sourceforge.net/p/freefilesync/feature-requests/316/]

Summing up the discussion so far, there are two relevant naming conventions:

1. C:\REVISIONS\folder\file.txt <timestamp>.txt
2. C:\REVISIONS\<job name> <timestamp>\folder\file.txt

and two relevant limits

I. Limit versions per file
II. Cleanup after x days

Unfortunately 1 is tied to I and 2 is tied to II for performance reasons, which severely constrains a more generic solution. So a replacement for what is currently implemented could be 2/II.
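
For illustration: under naming convention 2, the "cleanup after x days" limit
reduces to one date comparison per top-level folder, since each sync run is a
single "<job name> <timestamp>" directory (sketch with illustrative names,
not FFS code):

    Dim fso: Set fso = CreateObject("Scripting.FileSystemObject")

    ' Limit II under naming convention 2: expired runs are dropped wholesale.
    Sub DeleteRunsOlderThan(revRoot, maxAgeDays)
        Dim run
        For Each run In fso.GetFolder(revRoot).SubFolders
            If DateDiff("d", run.DateCreated, Now) > maxAgeDays Then
                fso.DeleteFolder run.Path
            End If
        Next
    End Sub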

Question: Am I missing some clever idea, or are there only these two major building blocks, which happen to be incompatible? Seeing how the limits depend on the naming convention seems to indicate that the naming convention needs to be hard-coded.
Site Admin
Posts: 7212
Joined: 9 Dec 2007

Zenju

Of the two limits, II is the more useful: There is an inverse correlation between the time passed since a file was deleted and the likelihood that one might need it again. A "limit versions", on the other hand, is largely artificial and can only barely be justified by a weak correlation between version count and time passed.

In real-world terms: If you want to clean up your house and get rid of superfluous items, you don't set a numeric limit for each item type and throw away everything beyond x items (= limit versions). You also don't place a size limit and throw away everything that doesn't fit into x cubic meters (= size limit; yes, one could argue that the size of the rooms is a natural size limit, but in practice, if the stuff is useful, you'll see to getting more space rather than throwing stuff away). But you may decide to throw away everything which you haven't used for more than a year (= limit by days).

The naming conventions have the following advantages:

1.
-> easy to find a specific version of a file via a file browser

2.
-> easy to delete all versions older than a certain date (either manually or via a new limit in FFS)

-> easy to do an "undo" by copying over deleted and overwritten files from last sync (can be useful in practice even if this does not cleanup the newly created files)

-> a single revisions directory can be used by multiple folder pairs and sync jobs since the job name is part of the temporary directory name.
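
A sketch of that "undo" under naming convention 2, assuming a single job uses
the folder (illustrative paths; FFS has no such built-in command): pick the
most recent "<job name> <timestamp>" folder and copy it back over the target.

    Dim fso: Set fso = CreateObject("Scripting.FileSystemObject")
    Dim run, newest
    newest = ""
    For Each run In fso.GetFolder("D:\revisions").SubFolders
        ' the timestamped names sort chronologically as plain strings
        If run.Name > newest Then newest = run.Name
    Next
    If newest <> "" Then
        fso.CopyFolder "D:\revisions\" & newest, "D:\target", True
    End If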

So, reevaluating the requirements and the *feedback*, I am inclined to implement 2/II, replacing the current 1/I handling of revisions.