Solution for transferring large backup files to other instances?

Hi,
I am hosting from home and now want to transfer a large backup file of around 25 GB to a VPS, restore it there, and run the services from the VPS.
My upload speed at home is very poor, and I am also worried about the frequent disconnections.
What would be a good solution for transferring such a large file?
I am thinking of sending it via torrent, encrypted of course.

Have you thought about sftp? With two servers it should be quite fast…

But there have been quite a lot of disconnections over the past few weeks. Does sftp resume a transfer from where it left off?

As for me, I use sftp with screen installed on the VPS.
Screen gives you a terminal that is “resistant” to disconnections.
That will make it easier to resume an interrupted sftp transfer.
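
A minimal sketch of that workflow (hostname and file names are placeholders); OpenSSH’s sftp has a reget command that resumes a partial download:

    # On the VPS: start a screen session that survives disconnections
    screen -S transfer

    # Inside screen: pull the archive from the home server
    sftp user@home.example.org
    sftp> get /etc/yunohost.backup/archives/backup.tar.gz

    # If the link drops later, reattach to the session and resume the partial file
    screen -r transfer
    sftp user@home.example.org
    sftp> reget /etc/yunohost.backup/archives/backup.tar.gz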

borg can transfer large backups (it can restart from the point of the last interruption).
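
If I understand borg’s behaviour correctly, that works through periodic checkpoints written during borg create, so an interrupted run resumes from the last checkpoint instead of starting over. A sketch (the repository URL is a placeholder):

    # Checkpoint every 5 minutes instead of the default 30, so a nightly
    # disconnection loses at most a few minutes of upload
    borg create --checkpoint-interval 300 \
        ssh://user@vps.example.org/~/borg-repo::yunohost-{now} \
        /etc/yunohost.backup/archives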

For the first big backup you need to measure your uplink rate in Mbps and calculate how many hours the transfer will take.

If you are on a good ADSL line (1 Mbps up) it should be possible to send about 4.5 GB per 10 hours.
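
A quick back-of-the-envelope check of that estimate, applied to the 25 GB archive (assuming 1 GB = 1024 MB and a steady 1 Mbps uplink):

    # hours = size_GB * 1024 MB * 8 bit / uplink_Mbps / 3600 s
    echo "25 * 1024 * 8 / 1 / 3600" | bc -l   # ≈ 56.9 hours, roughly 2.5 days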

Yes, my upload speed is 1 Mbps. Nice guess. :wink:
But my connection drops a lot of times during the night.

Can you please give me a quick “how to” on transferring the files from /etc/yunohost.backup/archives?

This is a good question. I would be grateful for any kind of tutorial…

If you want to back up with borg on another server:
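
A minimal sketch of that setup, assuming borg is installed on both machines and you have SSH access to the VPS (hostname, repository path, and archive names are placeholders):

    # On the home server: create an encrypted repository on the VPS
    borg init --encryption=repokey ssh://user@vps.example.org/~/borg-repo

    # Push the YunoHost archives into it
    borg create --progress \
        ssh://user@vps.example.org/~/borg-repo::yunohost-{now} \
        /etc/yunohost.backup/archives

    # Later, on the VPS: list the archives and extract one for restoring
    borg list ~/borg-repo
    borg extract ~/borg-repo::yunohost-<archive-name>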

If you just want to move a big tar.gz and survive interruptions, rsync seems to be your friend:

 rsync -P --rsh=ssh userid@remotehost.com:bigdata.tgz ./bigdata.tgz
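
With -P (short for --partial --progress) rsync keeps the partly-transferred file, so re-running the same command after a disconnection picks up roughly where it stopped. The same idea in the push direction, run from the home server (hostname and paths are examples):

    rsync -P --rsh=ssh /etc/yunohost.backup/archives/backup.tar.gz \
        userid@vps.example.org:~/backup.tar.gz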