How to transfer a 200 GB file from Linux to Windows 10?

postcd

Hello,

I need to download backups of a couple of files monthly, one of which is 200 GB in size. I calculated that downloading a file that size would take around 30 hours, and I don't want to have to babysit the computer to keep it turned on.

I'm looking for a way to download with a resume/continue option, so that an interrupted or failed download picks up where it left off. The SCP programs I tried did not work for this purpose.

SOURCE:
a remote CentOS Linux server (over the internet, in a different country) to which I have only SSH access (I can install additional software on it)

DESTINATION:
a Windows 10 home computer

I was using rsync between Linux machines, but I can't find any good equivalent for Windows.

There are two Windows programs, but I can't tell whether they support resuming a failed transfer the way rsync does: Difference between Syncrify and DeltaCopy

PS: I'm looking for a free (preferably open-source) solution.

I could use VirtualBox, but running a virtualized Linux just for backups doesn't seem very convenient, unless it had a really small footprint, could somehow autorun in the background, and could easily access the host machine's HDD or create flexible guest storage (since the backups would be quite large).

Another solution would be to run something like a Raspberry Pi with Raspbian, which I assume includes rsync, but then I would have to run a noisy external HDD with moving parts.
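For reference, what I'd ideally keep using is a retry loop around rsync's --partial option, which resumes an interrupted transfer instead of restarting it; the host, user, and paths below are placeholders. I just don't have a sane way to run it on Windows, unless something lightweight like WSL (which Windows 10 supports) or Cygwin counts:

    #!/bin/bash
    # Pull the backup over SSH; --partial keeps the partly transferred file,
    # --append-verify resumes by appending to it and re-verifies the result.
    SRC="backupuser@example.com:/var/backups/monthly.tar.gz"   # placeholder
    DST="/mnt/c/backups/"                          # WSL path for C:\backups

    until rsync --partial --append-verify --progress -e ssh "$SRC" "$DST"; do
        echo "transfer interrupted, retrying in 60 s..." >&2
        sleep 60
    done

The loop just reruns rsync until it exits cleanly, so the whole thing could be left unattended overnight.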
 
I think Syncrify or DeltaCopy would work for you, because they transfer files in small chunks. If there's a failure, only that small chunk is retried, so not much time is lost. I looked into DeltaCopy before but ended up going with CrashPlan for simplicity. Unfortunately, I think you would need a GUI to set that up.
 
I had to search to remember this, and I may actually have seen the name and thought "that was it" when it's actually not...

Win get.
It supports pausing downloads, and it also supports multi-connection downloads. So if individual connections are throttled on the server, you can connect with two streams: one stream fetching 0-50% of the file while the other downloads 50-100%.

You can pause and resume.
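For the plain resume part, standard wget builds do this with -c, and the two-stream idea can be sketched with curl byte ranges. The URL and sizes below are placeholders, and the server has to honour HTTP range requests:

    # Resume a partial download where it left off:
    wget -c https://example.com/backups/monthly.tar.gz

    # Two parallel streams, roughly 0-50% and 50-100% of a 200 GB file
    # (100 GiB = 107374182400 bytes), stitched together afterwards:
    curl -o part1 --range 0-107374182399 https://example.com/backups/monthly.tar.gz &
    curl -o part2 --range 107374182400-  https://example.com/backups/monthly.tar.gz &
    wait
    cat part1 part2 > monthly.tar.gz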

Why do you feel you'd need to manually keep the machine on, instead of just pressing download and walking away?

And is the file still 200 GB when zipped? You could zip it and split it into smaller chunks.
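If you go the split route, checksums make the retries safe, since only a bad or missing chunk gets re-fetched rather than the whole 200 GB. A rough sketch with placeholder names; the client side assumes some Unix-ish shell on the Windows box (WSL, Cygwin, Git Bash):

    # On the CentOS server: split into 1 GB parts and record checksums.
    cd /var/backups
    split -b 1G monthly.tar.gz part_
    sha256sum part_* > parts.sha256

    # On the client: fetch the checksum list, then each part with retries,
    # verify everything, and reassemble.
    scp backupuser@example.com:/var/backups/parts.sha256 .
    while read -r sum name; do
        until scp backupuser@example.com:/var/backups/"$name" .; do sleep 30; done
    done < parts.sha256
    sha256sum -c parts.sha256 && cat part_* > monthly.tar.gz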

(If I remember correctly, you should be able to copy over SCP as well as HTTP with that program.)

You can also use WinSCP. It creates partial files, so when you get bored you can terminate the program; the next day, start the copy again, and when it asks whether to resume or overwrite, press Resume.
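If you want that unattended, WinSCP also has a scripting mode you can run from a .bat file via winscp.com. A sketch with placeholder host, user, key fingerprint, and paths; the -resumesupport switch is from memory of the WinSCP docs, so treat it as an assumption and check the "get" command's switches in your version:

    rem Placeholder host/user/paths; hostkey fingerprint elided.
    winscp.com /command ^
      "open sftp://backupuser@example.com/ -hostkey=""ssh-ed25519 255 ...""" ^
      "get -resumesupport=on /var/backups/monthly.tar.gz C:\backups\" ^
      "exit"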
 

I didn't know WinSCP could do that. Good to know.
 