Hello selfhosted! Sometimes I have to transfer big files or large amounts of small files in my homelab. I’ve used rsync, but specifying the IP address, the folders, and everything is a bit fiddly. I thought about writing a bash script, but before I do that I wanted to ask you about your favourite way to achieve this. Maybe I am missing out on an awesome tool I wasn’t even thinking about.
Edit: I settled on SFTP in my GUI file manager for now. When I have some spare time I will look into the other options too. Thank you for the helpful information.
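For what it’s worth, the kind of wrapper script I had in mind would be something small like this — `nas.lan` and `/srv/incoming/` are just made-up defaults for illustration, not real paths:

```shell
#!/usr/bin/env bash
# Sketch of a tiny rsync wrapper -- "nas.lan" and "/srv/incoming/" are
# placeholder defaults; swap in your own host and directory.

# Build the rsync invocation for a given source path.
#   -a  archive mode (recursive, keeps permissions and timestamps)
#   -P  show progress and resume partial transfers
#   -z  compress in transit
build_cmd() {
  local src="$1" host="${2:-nas.lan}" dest="${3:-/srv/incoming/}"
  printf 'rsync -aPz -- %s %s:%s\n' "$src" "$host" "$dest"
}

# Print the command so you can sanity-check it before running it for real.
build_cmd ./big-file.iso
```

Printing the command first (instead of executing it blindly) makes it easy to verify the fiddly host/path parts before anything transfers.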
Depends on what I’m transferring and to/from where:
- `scp` is my go-to since I’m a Linux household and have SSH keys set up and LDAP SSO as a fallback
- `sshfs` if I’m too lazy to connect via SMB/NFS (or I don’t feel like installing the tools for them) or I’m traversing a WAN
- `rsync` for bulk transfer and backups
- Snapdrop/Pairdrop for one-off file/text shares between devices with GUIs (mostly phone <–> PC)
- SMB if I’m on a client PC and need to work with the files directly from the fileserver
- NFS between servers
- To get bulk data to my phone (e.g. updating my music library), I connect via USB in MTP mode and copy from the server via SMB or sshfs.
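Roughly, the one-liners behind those bullets look like this — `box.lan`, `/tank`, and `/mnt/tank` are stand-in names, and the commands are printed here rather than run:

```shell
# Cheat-sheet of the tools above; "box.lan" and the paths are placeholders.
cat <<'EOF'
scp -r ./photos box.lan:/tank/photos/     # one-off recursive copy over SSH
sshfs box.lan:/tank /mnt/tank             # mount a remote dir over SSH
fusermount -u /mnt/tank                   # unmount the sshfs mount when done
rsync -aP ./music/ box.lan:/tank/music/   # bulk sync, resumable, sends only changes
EOF
```

Note the trailing slash on `./music/` in the rsync line: with it, rsync copies the directory’s *contents* into the target; without it, it creates a `music` subdirectory there.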