Is it possible ...

Discuss new features and functions
Posts: 7
Joined: 24 Jun 2023

c0rupted

Hello all,
It goes without saying, I'm new here, but I like what I've seen is possible so far.

Here's my "is it possible to achieve" question for FFS, aimed at the experienced and advanced users:

Let's say I have 50+ machines that I want to copy a dataset from.

PC1 C:\Folder\Dataset1
PC2 C:\Folder\Dataset2
PC3 C:\Folder\Dataset3
...
PC50 C:\Folder\Dataset50

I want to copy the Dataset# folder to \\Server\Path\Backups\Dataset#

The issue is (and this issue seems to exist across many 'sync' programs I have looked at or used): inside each of those Dataset# folders is a folder for SQL database backups (i.e. C:\Folder\Dataset1\DatabaseBackup) that is written to every hour of every day, and the DatabaseBackup folder holds 48 hours' worth of backups (so C:\Folder\Dataset#\DatabaseBackup\backup1.7z, ..\backup2.7z, etc.).
So, I only want the newest "Backup%.7z" file out of that folder.
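To be concrete, the selection logic I'm after is something like this PowerShell snippet (the paths are just examples):

#Pick only the newest backup*.7z out of a DatabaseBackup folder (example path).
$Newest = Get-ChildItem -Path "C:\Folder\Dataset1\DatabaseBackup" -Filter "backup*.7z" |
    Sort-Object -Property LastWriteTime -Descending |
    Select-Object -First 1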

If this wasn't a big enough issue, the second "is it possible" issue I have is that all of the remote machines sit randomly in a range of IPs (i.e. 10.100.x.211, 10.101.x.211, 10.102.x.211, etc., up to something like 10.200.x.211).
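(For scale: rather than typing those out, the candidate list could be generated; a rough sketch, where the range bounds and CSV path are only examples:)

#Generate candidate addresses of the form 10.<n>.x.211 (bounds are examples).
$Candidates = foreach ($Second in 100..200) {
    foreach ($Third in 0..255) {
        "10.$Second.$Third.211"
    }
}
#Write them out with the IP_Address header the script below expects.
$Candidates | ForEach-Object { [pscustomobject]@{ IP_Address = $_ } } |
    Export-Csv "C:\Scripts\IPs_To_Check.csv" -NoTypeInformation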

So, I can normally grab most of the Dataset# folders using a PowerShell script, but I am looking for a better solution than PowerShell so I can get all of the data instead of only part of the dataset.
#Define the $Tab character.  This is only used to space different values in the terminal in a presentable way.
$Tab = [char]9
$LocalFolder1 = "\\ServerPath\Folder_For_Storage\" #enclose in quotes in case of spaces in folder path names
$RemoteFolder1 = "Folder_To_Copy_From\" #relative to the C$ share root (a leading "C:\" here would break the UNC path built below); enclose in quotes in case of spaces

#Clear the terminal screen.
Clear-Host

#Import the list of IPs from the file IPs.csv in the same folder as the script. File must have column header with "IP_Address" and list of addresses below.
$ComputerIPs = Import-Csv  "C:\Scripts\IPs_To_Check.csv" #Currently checking against 2032 different IP addresses for 'Life'

#Save the start time and display it so at the end I can calculate total process time.
$StartTime = Get-Date
Write-Host "`n`n`n`n`n`nStart Time: $StartTime"

#Count the number of IPs in the file so I know how many times to iterate through the main process.
$IPCount = $ComputerIPs.Count

#Get the date so I can save the results to a file named "yyyymmddhhmmss - Results.csv" in the same folder as the script.
$Date = Get-Date -UFormat "%Y%m%d%H%M%S"

<#
This is the guts of it.
 - Read one IP
 - Ping it
 - If it returns successfully, copy the files from the remote PC.
 - Display the results on the terminal in a readable fashion
 - Write the results to the "yyyymmddhhmmss - Results.csv".
#>
for ($i = 0; $i -lt $IPCount; $i++)
{
    #Read the IP from the list now in memory.
    $IP = $ComputerIPs.IP_Address.GetValue($i)

    #Ping once and keep the result, so the response time can be reported without a second ping.
    #If unsuccessful, print a message in the terminal and skip to the next IP.
    $Ping = Test-Connection -ComputerName $IP -Count 1 -ErrorAction SilentlyContinue
    if($Ping){
        $Response = (($Ping | Measure-Object -Property ResponseTime -Average).Average -as [int])
       
        Write-Host "The response time for$Tab" -ForegroundColor Green -NoNewline
        Write-Host "$IP$Tab is " -ForegroundColor Green -NoNewline
        Write-Host "$Response ms" -ForegroundColor Black -BackgroundColor white

        $RemotePath1 = "\\$($IP)\c$\$($RemoteFolder1)"

        #Delete the remote connection if present. It won't work if multiple connections are created.
        Write-Host "Checking if a remote connection already exists for $($IP) and deleting it if so (cannot have multiple connections)..."
        NET USE /delete "\\$($IP)\c$" 2>&1 | Out-Null

        #Create Remote Connection
        Write-Host "Creating remote connection for $($IP)..."
        NET USE \\$($IP)\c$ /u:$($IP)\RemoteUserName RemotePassword

        #Copy the dataset, excluding the DatabaseBackUps folder.
        Write-Host "Copying files from $($IP)..."
        robocopy $RemotePath1 $LocalFolder1 /E /Z /COPY:DAT /R:10 /W:30 /V /ETA /TEE /XD DatabaseBackUps /UNILOG+:".\Transfer.log"
       
        #Delete Remote Connection (clean up).
        Write-Host "Deleting remote connection to $($IP)..."
        NET USE /delete \\$($IP)\c$

    }
    else{
        Write-Host "Could not connect to$Tab$IP" -ForegroundColor Red
    }

    #Update Progress Bar
    $Percent = 100*$i/$IPCount
    $Percent = [math]::Round($Percent, 2)
    Write-Progress -Activity "Testing Connections and Gathering Job Info..." -Status "Progress: $Percent %" -PercentComplete $Percent
}

#Set progress bar to complete.
Write-Progress -Activity "Writing Locations..." -Status "Ready" -Completed

#Save the end time and calculate execution time
$EndTime = Get-Date
Write-Host "End Time: $EndTime"

$ExecutionTime = $EndTime - $StartTime
Write-Host "Execution Time: $ExecutionTime"
Posts: 2288
Joined: 22 Aug 2012

Plerry

1)
If I understand you correctly, the names of all 48 backupXX.7z files are different/unique (per Dataset#), and this is arranged in one way or another outside FreeFileSync (FFS).
Don't expect FFS to do this unique renaming for you.
What is not clear is:
• Does each C:\Folder\Dataset#\DatabaseBackup folder already contain these 48 files, or does it contain just one, with only its counterpart \\Server\Path\Backups\Dataset#\DatabaseBackup folder containing all 48?
• Are the backupXX.7z files constantly renamed every hour (e.g. XX becoming XX+1), or are the newer files assigned the next higher available XX?

This needs to be clear before I may be able to help you get further.

2)
It seems you are running FFS on your server.
Did you consider running FFS on the PCs instead?
Assuming your server has a fixed IP or network name, this would then avoid the issue related to the dynamic IP of the PCs.
PC1 would run an FFS sync C:\Folder\Dataset1 <-> \\Server\Path\Backups\Dataset1,
PC2 would run an FFS sync C:\Folder\Dataset2 <-> \\Server\Path\Backups\Dataset2,
...
PC50 would run an FFS sync C:\Folder\Dataset50 <-> \\Server\Path\Backups\Dataset50

These syncs can be scheduled to run via the Task Scheduler on each of the PCs.
No challenging server script required.
These syncs could even be run simultaneously, as there does not seem to be an overlap between the Dataset# folders on the server.
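As a sketch, registering such an hourly FFS batch run from PowerShell could look like this (the task name, schedule, and paths are only examples):

#Example only: run a FreeFileSync batch job every hour via the Task Scheduler.
$Action = New-ScheduledTaskAction -Execute "C:\Program Files\FreeFileSync\FreeFileSync.exe" -Argument '"C:\Scripts\Dataset.ffs_batch"'
$Trigger = New-ScheduledTaskTrigger -Once -At (Get-Date) -RepetitionInterval (New-TimeSpan -Hours 1)
Register-ScheduledTask -TaskName "FFS Dataset Sync" -Action $Action -Trigger $Trigger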
Next to that, you can then also look into using FFS macros and defining the local and server base locations as PC environment variables. This would allow you to use a single *.ffs_batch sync configuration (which could even be stored on your server) for all PCs.
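For example (the variable names are purely illustrative), each PC could define its base locations once, and the shared *.ffs_batch could then reference them as %DatasetLocal% and %DatasetRemote% in its folder-pair paths:

#Illustrative only: per-PC environment variables for a shared FFS configuration.
#(Setting Machine-level variables requires an elevated prompt.)
[Environment]::SetEnvironmentVariable("DatasetLocal", "C:\Folder\Dataset1", "Machine")
[Environment]::SetEnvironmentVariable("DatasetRemote", "\\Server\Path\Backups\Dataset1", "Machine")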
Posts: 7
Joined: 24 Jun 2023

c0rupted

1)
If I understand you correctly, the names of all 48 backupXX.7z files are different/unique (per Dataset#), and this is arranged in one way or another outside FreeFileSync (FFS).
Don't expect FFS to do this unique renaming for you.
What is not clear is:
• Does each C:\Folder\Dataset#\DatabaseBackup folder already contain these 48 files, or does it contain just one, with only its counterpart \\Server\Path\Backups\Dataset#\DatabaseBackup folder containing all 48?
• Are the backupXX.7z files constantly renamed every hour (e.g. XX becoming XX+1), or are the newer files assigned the next higher available XX?

This needs to be clear before I may be able to help you get further. Plerry, 24 Jun 2023, 12:08
The DatabaseBackup folder will contain between 1 and 48 uniquely named files. (So the Dataset# folder will be unique, as will each and every database backup .7z file; there is an extremely small chance that I could end up with two database backups named exactly the same, but the Dataset# folder makes the paths unique anyway.) The Dataset# folder is also unique per 'server'.

2)
It seems you are running FFS on your server. Plerry, 24 Jun 2023, 12:08
I haven't actually set it up yet, but the server/PC 'master' would have to do the retrieval due to firewall restrictions on each of the various remote machines (things like LDAP ports are blocked from the remote side).
Did you consider running FFS on the PCs instead? Plerry, 24 Jun 2023, 12:08
Yes. I am not opposed to it, and I have spare PCs that can run the task(s); however, I don't want to use more than one PC to do it if possible.
Assuming your server has a fixed IP or network name, this would then avoid the issue related to the dynamic IP of the PCs.
PC1 would run an FFS sync C:\Folder\Dataset1 <-> \\Server\Path\Backups\Dataset1,
PC2 would run an FFS sync C:\Folder\Dataset2 <-> \\Server\Path\Backups\Dataset2,
...
PC50 would run an FFS sync C:\Folder\Dataset50 <-> \\Server\Path\Backups\Dataset50 Plerry, 24 Jun 2023, 12:08

Can't - \\Server\ is blocked from PC1, etc. That is due to firewall rules governing the remote machines' access (these remote machines sit behind cellular 'hot spots' that are firewalled, so many things are blocked).
These syncs can be scheduled to run via the Task Scheduler on each of the PCs.
No challenging server script required.
These syncs could even be run simultaneously, as there does not seem to be an overlap between the Dataset# folders on the server.
Next to that, you can then also look into using FFS macros and defining the local and server base locations as PC environment variables. This would allow you to use a single *.ffs_batch sync configuration (which could even be stored on your server) for all PCs. Plerry, 24 Jun 2023, 12:08
The issue that I foresee is this:
1. The remote IP addresses might rotate within various ranges, so they are not static, per se: an address can belong to PC1 for a month, then PC30 the next month.
2. I have applied for a firewall exception that would unblock a specific server, but have not tested it successfully yet (they say it's open, but I can't seem to connect -- working through this with my networking team, but they are slow AF).

IF, and only IF, I can get this firewall exception working, then yes, I would reverse the setup: I would run FFS on the PCs and have them sync one-way to the server location, as mentioned in your #2.
Posts: 7
Joined: 24 Jun 2023

c0rupted

Oh, and let's not forget that I basically have to authenticate to each machine too.
        #Create Remote Connection
        Write-Host "Creating remote connection for $($IP)..."
        NET USE \\$($IP)\c$ /u:$($IP)\LocalUserName LocalPassword

        #Copy the dataset, excluding the DatabaseBackUps folder.
        Write-Host "Copying files from $($IP)..."
        robocopy $RemotePath1 $LocalFolder1 /E /Z /COPY:DAT /R:10 /W:30 /V /ETA /TEE /XD DatabaseBackUps /UNILOG+:".\Transfer.log"
       
        #Delete Remote Connection (clean up).
        Write-Host "Deleting remote connection to $($IP)..."
        NET USE /delete \\$($IP)\c$
Posts: 2288
Joined: 22 Aug 2012

Plerry

The answer to my question 1) is still unclear.

If PC-side files get renamed after having been synced (e.g. XX becoming XX+1), using a sync program makes little sense, unless you are able to use FFS's 'Detect Moved Files' option (requires both sides to have cross-session stable file IDs).
Otherwise you may just as well always copy over the full relevant PC content. Obviously FFS can do that for you, but a simple xcopy command would then suffice.
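For instance, such an unconditional copy-over could be as simple as this (paths are examples):

#Example only: copy over the full PC-side content, including subfolders, overwriting without prompting.
xcopy "C:\Folder\Dataset1" "\\Server\Path\Backups\Dataset1" /E /Y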

Conversely, if PC-side files have stable names, and added files are named differently than already existing files, FFS could be a good option for you, as it will then only copy over the files that were added since the last sync.
Posts: 7
Joined: 24 Jun 2023

c0rupted

The answer to my question 1) is still unclear.

If PC-side files get renamed after having been synced (e.g. XX becoming XX+1), using a sync program makes little sense, unless you are able to use FFS's 'Detect Moved Files' option (requires both sides to have cross-session stable file IDs).
Otherwise you may just as well always copy over the full relevant PC content. Obviously FFS can do that for you, but a simple xcopy command would then suffice.

Conversely, if PC-side files have stable names, and added files are named differently than already existing files, FFS could be a good option for you, as it will then only copy over the files that were added since the last sync. Plerry, 28 Jun 2023, 12:04
The remote PC sets the file name; it does that via an automated SQL backup task. But for every new backup file created, one is deleted. On a scheduled task, I only ever want the newest created file copied, while removing the old file (so basically a mirror of the "DatabaseBackup" folder, but only for the newest backup .7z file).
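In script form, the per-machine behaviour I'm after is roughly this (the paths are only examples):

#Rough sketch: mirror only the newest backup*.7z into the server-side folder (example paths).
$Source = "\\10.100.1.211\c$\Folder\Dataset1\DatabaseBackup"
$Dest   = "\\Server\Path\Backups\Dataset1\DatabaseBackup"

$Newest = Get-ChildItem -Path $Source -Filter "backup*.7z" |
    Sort-Object LastWriteTime -Descending |
    Select-Object -First 1

if ($Newest) {
    #Drop previously mirrored backups, then copy the current newest one.
    Get-ChildItem -Path $Dest -Filter "backup*.7z" |
        Where-Object { $_.Name -ne $Newest.Name } |
        Remove-Item
    Copy-Item -Path $Newest.FullName -Destination $Dest -Force
}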