Hello selfhosted! Sometimes I have to transfer big files or large amounts of small files in my homelab. I used rsync, but specifying the IP address, the folders, and everything else is a bit fiddly. I thought about writing a bash script, but before I do that I wanted to ask you about your favourite way to achieve this. Maybe I am missing out on an awesome tool I wasn't even thinking about.

Edit: I settled on SFTP in my GUI file manager for now. When I have some spare time I will look into the other options too. Thank you for the helpful information.

  • @sugar_in_your_tea@sh.itjust.works

    What’s wrong with rsync? If you don’t like IP addresses, use a domain name. If you use certificate authentication, you can tab complete the folders. It’s a really nice UX IMO.

    If you’ll do this a lot, just mount the target directory with sshfs or NFS. Then use rsync or a GUI file manager.
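For the OP's "write a bash script" idea, the comment above boils down to a couple of shell functions. A minimal sketch, assuming a host alias nas in ~/.ssh/config with key auth set up (host name and paths are hypothetical):

```shell
# Hypothetical host alias "nas" from ~/.ssh/config; paths are examples.
sync_videos() {
    # With key auth in place, the remote path tab-completes interactively.
    rsync -avh --progress ~/Videos/ nas:/tank/media/videos/
}

mount_nas() {
    # For frequent use, mount the remote tree once with sshfs,
    # then browse it with any file manager or rsync against the mountpoint.
    mkdir -p ~/mnt/nas
    sshfs nas:/tank/media ~/mnt/nas
}
```

The -a flag preserves permissions and timestamps, -h prints human-readable sizes, and a trailing slash on the source copies the directory's contents rather than the directory itself.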

    • Grumuk

      I never even set up DNS for things that aren’t public facing. I just keep /etc/hosts updated everywhere and ssh/scp/rsync things around using their non-fqdn hostnames.
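The hosts-file approach is only a couple of lines per machine. A sketch, with hypothetical addresses and names:

```
# /etc/hosts on every box (addresses and names are examples)
192.168.1.10   nas
192.168.1.11   mediabox
```

After that, something like scp big.iso nas:/srv/incoming/ works without remembering any IPs.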

  • @boreengreen@lemm.ee

    rsync is indeed fiddly. Consider SFTP in your GUI of choice. I mount the folder I need in my file browser and grab the files I need. No terminal needed, and I can put the folders as favourites in the sidebar.

  • @Xanza@lemm.ee

    rclone. I have a few helper functions:

    fn mount { rclone mount http: X: --network-mode }
    fn kdrama {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/KDrama/$x --filter-from ~/.config/filter.txt }
    fn tv {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/TV/$x --filter-from ~/.config/filter.txt }
    fn downloads {|x| rclone --multi-thread-streams=8 --checkers=2 --transfers=2 --ignore-existing --progress copy http:$x nas:Media/Downloads/$x --filter-from ~/.config/filter.txt }
    

    So I download something to my seedbox, use rclone lsd http: to get the exact name of the folder or files, and run tv "filename". That pulls all the files (matching filter.txt) to the correct folder on my NAS, using multiple threads. Works great, and maxes out my connection.
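For anyone not using the shell those helpers are written in, the same idea translates to a POSIX sh function. A sketch, assuming rclone remotes named "http" (the seedbox) and "nas" already exist in rclone.conf, and a filter file as in the comment above:

```shell
# Hypothetical rclone remotes "http" and "nas"; filter file path is an example.
tv() {
    rclone copy "http:$1" "nas:Media/TV/$1" \
        --multi-thread-streams=8 --checkers=2 --transfers=2 \
        --ignore-existing --progress \
        --filter-from ~/.config/filter.txt
}
# Usage: tv "Show Name S01"
```

--ignore-existing makes the function safe to re-run: already-transferred files are skipped rather than overwritten.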

  • @lemmylommy@lemmy.world

    WinSCP for editing server config

    Rsync for manual transfers over slow connections

    ZFS send/receive for what it was meant for

    Samba for everything else that involves mounting on clients or other servers.
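The ZFS send/receive line refers to snapshot replication between pools. A rough sketch, with the pool, dataset, and host names all hypothetical:

```shell
# Hypothetical dataset "tank/media", target host "backupbox",
# and target dataset "backup/media".
replicate_media() {
    snap="tank/media@$(date +%F)"
    zfs snapshot "$snap"
    # -c sends the stream compressed; receive -u skips mounting on the target.
    zfs send -c "$snap" | ssh backupbox zfs receive -u backup/media
}
```

For ongoing replication you would send incrementally (zfs send -i old-snap new-snap) so only changed blocks cross the wire.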

  • @motsu@lemmy.world

    SMB share if it's desktop to desktop. If it's from phone to PC, I throw it on Nextcloud on the phone, then grab it from the web UI on the PC.

    SMB is the way to go if you have identity set up, since your PC auth will carry over to the connection to the SMB share. If not, Nextcloud is less typing, since you can keep persistent auth in the app / web UI.
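A sketch of the identity-carryover point on Linux: with a Kerberos ticket from your domain login, a cifs mount can authenticate without a password prompt. Host, share, and mountpoint here are hypothetical:

```shell
# Hypothetical share //nas/media and mountpoint /mnt/media.
mount_media() {
    # sec=krb5 reuses the existing Kerberos ticket; cruid tells the kernel
    # whose ticket cache to use, uid makes the files appear owned by you.
    sudo mount -t cifs //nas/media /mnt/media \
        -o sec=krb5,cruid="$(id -u)",uid="$(id -u)"
}
```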

    • @boreengreen@lemm.ee

      As I understand it, establishing the connection relies on a relay server. So this would not work on a local network without a relay server, and would by default try to reach a server on the internet to make connections.

  • @PerogiBoi@lemmy.ca

    Ye old samba share.

    But I do like using Nextcloud. I use it for syncing my video projects so I can pick up where I left off on another computer.

  • @neidu3@sh.itjust.works

    rsync if it’s a from/to I don’t need very often

    More common transfer locations are done via NFS
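For the NFS side, a one-line fstab entry keeps the mount across reboots. A sketch with a hypothetical export, host, and mountpoint:

```
# /etc/fstab -- example export, host, and mountpoint
nas:/export/media  /mnt/media  nfs  defaults,_netdev  0  0
```

The _netdev option tells the system to wait for the network before attempting the mount.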