• 4 Posts
  • 67 Comments
Joined 2 years ago
Cake day: December 12th, 2023

  • I use rsync for many of the reasons covered in the video. It’s widely available and has a long history. To me that feels important because it’s had time to become stable and reliable. Using Linux is a hobby for me so my needs are quite low. It’s nice to have a tool that just works.

    I use it for all my backups and moving my backups to off network locations as well as file/folder transfers on my own network.

    I even made my own tool (https://codeberg.org/taters/rTransfer) to simplify all my rsync commands into readable files because rsync commands can get quite long and overwhelming. It’s especially useful chaining multiple rsync commands together to run under a single command.
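
    The idea, stripped down, is something like this (the paths and host below are placeholders, not how rTransfer actually stores things):

    #!/bin/sh
    # Run several rsync jobs back to back under one command
    # (paths and host are placeholders)
    set -e

    # Local documents to the backup partition
    rsync --archive --verbose --human-readable --delete \
        /home/user/Documents/ /mnt/backup/Documents/

    # Photos to a remote machine over SSH
    rsync --archive --verbose --human-readable --delete \
        -e ssh /home/user/Pictures/ user@192.168.1.50:/backup/Pictures/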

    I’ve tried other backup and syncing programs and I’ve had bad experiences with all of them. Other backup programs have failed to restore my system. Syncing programs constantly stop working and I got tired of always troubleshooting. Rsync, when set up properly, has given me far fewer headaches.


  • I don’t have root access on my phone but I still copy backups of my media and apps that export data to accessible files.

    I keep my process very simple, using Termux with the rsync, openssh, and termux-services packages.

    I created a dedicated folder on my phone called sync for syncing between phone and computer, but you can change this for your needs.

    From a fresh Termux install, the setup should look something like the following:

    # Update package list and packages
    pkg update && pkg upgrade
    # Install required packages
    pkg install rsync openssh termux-services
    # Setup Termux's access to your phone's files
    termux-setup-storage
    # Make the required folder
    mkdir ~/storage/shared/sync/
    cd ~/storage/shared/sync/
    # Automatically start your SSH server when you open Termux
    sv-enable sshd
    
    • Get your phone’s username:
    ~ $ whoami
    u0_a205
    
    • Optional: Set a password with the passwd command (I can’t remember if this step is important, but you’ll need it if you plan to log in with a password rather than SSH keys)

    A quick note: Termux on Android has a file system layout quite different from a regular computer, so file and directory paths can get quite long. The pwd command shows /data/data/com.termux/files/home/storage/shared/sync for my sync folder.

    This can be made simpler by using the realpath command. realpath /data/data/com.termux/files/home/storage/shared/sync then shows /storage/emulated/0/sync as a result. If you’re using CLI, this may make your commands easier to read.

    Now you can start to build your rsync command to transfer your files. When setting up an rsync command, ALWAYS use the --dry-run option first. This performs a “transfer” without any files actually being moved.

    • From my computer (data transfer direction: Phone -> Computer):
    rsync --dry-run --archive --verbose --human-readable --partial --progress --compress -e 'ssh -p 8022' u0_a205@192.168.40.210:/storage/emulated/0/sync/ /home/computer_username/backup/
    
    • From my phone (data transfer direction: Phone -> Computer):
    rsync --dry-run --archive --verbose --human-readable --partial --progress --compress /storage/emulated/0/sync/ computer_username@192.168.40.205:/home/computer_username/backup/
    

    Explanation:

    • --archive preserves several file attributes
    • --verbose --human-readable --partial --progress creates a readable output to see what is happening
    • --compress compresses the data during the actual transfer (good for over a network)
    • -e 'ssh -p 8022' tells rsync to connect over SSH on port 8022, since sshd on Termux listens on 8022 instead of the usual 22
    • u0_a205@192.168.40.210:/storage/emulated/0/sync/ and computer_username@192.168.40.205:/home/computer_username/backup/ are how rsync identifies remote folders. Basic format is <username>@<remote IP address>:/path/to/folder/
    • /home/computer_username/backup/ and /storage/emulated/0/sync/ are the local folders, relative to the machine the rsync command is run from.

    To reverse the direction of a transfer relative to the machine you are running on, simply swap the remote folder and local folder in the command. Example, running both from my computer:

    # Direction: Phone -> Computer
    rsync --dry-run --archive --verbose --human-readable --partial --progress --compress -e 'ssh -p 8022' u0_a205@192.168.40.210:/storage/emulated/0/sync/ /home/computer_username/backup/
    
    # Direction: Computer -> Phone
    rsync --dry-run --archive --verbose --human-readable --partial --progress --compress -e 'ssh -p 8022' /home/computer_username/backup/ u0_a205@192.168.40.210:/storage/emulated/0/sync/
    

    In order to actually transfer files, remove the --dry-run option from the previous rsync commands. The output in your terminal will show additional information regarding transfer status.

    Additionally, you can add the --delete option to the rsync command. This mirrors the source, meaning the destination folder is forced to match the source folder file by file: any file in the destination folder that is not in the source folder gets deleted.

    A command WITHOUT --dry-run and WITH --delete would look like the following (CAUTION: THIS CAN DELETE FILES IF UNTESTED):

    rsync --delete --archive --verbose --human-readable --partial --progress --compress -e 'ssh -p 8022' u0_a205@192.168.40.210:/storage/emulated/0/sync/ /home/computer_username/backup/
    
    

    I personally transfer my backups manually onto an encrypted external drive which I manually decrypt. /u/emhl@feddit.org has a suggestion for automated encrypted backups if that’s more to your needs.
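
    If it helps picture it, a manual decrypt-and-copy step like mine might look roughly like this, assuming a LUKS-encrypted drive (device, mapper, and mount point names are placeholders):

    # Unlock and mount the encrypted external drive (placeholder names)
    sudo cryptsetup open /dev/sdb1 backup_drive
    sudo mount /dev/mapper/backup_drive /mnt/backup_drive

    # Copy the backups over
    rsync --archive --verbose --human-readable /home/computer_username/backup/ /mnt/backup_drive/backup/

    # Unmount and lock the drive again when done
    sudo umount /mnt/backup_drive
    sudo cryptsetup close backup_drive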


  • Yeah, a few weeks ago I achieved my state of “secure” for my server. I just happened to notice a dramatic decrease in activity, and that’s what prompted this question that’s been sitting in the back of my mind for weeks now.

    I do think it’s important to talk about it though because there seems to be a lack of talk about security in general for self hosting. So many guides focus on getting services up and running as fast as possible but don’t give security much thought.

    I just so happened to have gained an interest in the security aspect of self hosting over hosting actual services. My risk from self hosting is extremely low, so I’ve reached a point of diminishing returns on security, but the mind is still curious and wants to know more.

    I might write up a guide/walkthrough of my setup in the future but that’s low priority. I have some other not self hosting related things I want to focus on first.


  • I think I am already doing that. My Kiwix docker container port is set to 127.0.0.1:8080:8080 and my reverse proxy is only open on port 12345, but it will redirect kiwi.example.com:12345 to port 8080 on the local machine.
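
    For anyone setting up the same thing, publishing a container on the loopback interface only looks something like this (the image name is a placeholder):

    # Bind the container port to 127.0.0.1 so only the local machine
    # (and therefore the reverse proxy running on it) can reach it
    # (image name is a placeholder)
    docker run -d --name kiwix -p 127.0.0.1:8080:8080 some/kiwix-image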

    I’ve learned that Docker likes to manipulate iptables without giving any notice to other programs like UFW. I have to be specific in making sure Docker containers announce themselves to the local machine only.

    I’ve also used this guide to harden Caddy and adjusted it to my needs. I took the advice of another user and use wildcard domain certs instead of issuing certs for each subdomain, so that only the wildcard domain is visible when I look it up at https://crt.sh/. That way I’m not advertising which subdomains I’m using.



  • My ISP blocks incoming data on common ports unless you get a business account. That’s why I used Cloudflare’s tunnel service initially. I’ve changed my plans for the domain name I currently own, and I don’t feel comfortable giving more power and data to an American tech company, so this is my alternative path.

    I use Caddy as my reverse proxy, so I only have one uncommon port open. My plans changed from many people accessing my site to just me and a select few friends, which doesn’t require a business account.


  • I get that.

    I was generally (in my head) speaking about all my devices. If someone stole my computer, the full disk encryption is more of a deterrent than a guarantee that my data is fully secure. My hope is that the third party is more likely to delete my data than to access it. If I catch the attention of someone who actually wants my data, I have bigger issues to worry about than the security of my electronic devices.


  • I agree with the last point. I only mentioned it because I don’t really know which other setting in my sshd config is hiding my SSH port from nmap scans. That just happened to be the last change I remember making before running an nmap scan again and finding my SSH port no longer showed up.

    Accessing SSH still works as expected with my keys and for my use case, I don’t believe I need an additional passphrase. Self hosting is just a hobby for me and I am very intentional with what I place on my web facing server.

    I want to be secure enough but I’m also very willing to unplug and walk away if I happen to catch unwanted attention.


  • Thanks for the insight. It’s useful to know what tools are out there and what they can do. I was only aware of nmap before, which I use to make sure the only ports open are the ones I want open.
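
    For reference, the check itself is nothing fancy (the IP is a placeholder):

    # Default scan of the most common ports
    nmap 192.168.40.2
    # Full scan of all 65535 TCP ports (slower, but thorough)
    nmap -p- 192.168.40.2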

    My web-facing device only serves static sites and a file server with non-identifiable data that I feel indifferent about being on the internet. No databases, and no stress if it gets targeted or goes down.

    Even then, I still like to know how things work. Technology today is built on so many layers of abstraction, it all feels like an infinite rabbit hole now. It’s hard to look at any piece of technology as secure these days.


  • I use a different port for SSH, and I also use authorized keys. My sshd is set up to only accept keys, with no passwords and no keyboard-interactive input. Also, when I run nmap on my server, the SSH port does not show up. I’ve never been too sure how hidden the SSH port really is beyond the nmap scan, but I just assume it would be discovered somehow if someone was determined enough.
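
    For anyone wanting to do the same, the relevant sshd_config directives look roughly like this (the port number is just an example, not mine):

    # /etc/ssh/sshd_config (excerpt); the port number is an example
    # Listen on a non-default port
    Port 2222
    # Keys only: no passwords, no keyboard-interactive prompts
    PasswordAuthentication no
    KbdInteractiveAuthentication no
    PubkeyAuthentication yes
    PermitRootLogin no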

    In the past month I renamed my devices and accounts to things less obvious. I also took the suggestion from someone in this community and set up my TLS to use wildcard domain certs. That way my subdomains aren’t being advertised in the public certificate transparency logs used by Certificate Authorities. I simply don’t use the base domain name anymore.


  • Early when I was learning self hosting, I lost my work and progress a lot. Through all that I learned how to make a really solid backup/restore system that works consistently.

    Each device I own has its own local backup. I copy those backups to a partition on my computer dedicated to backups, and that partition gets copied again to an external SSD which can be disconnected. Restoring from the external SSD to my computer’s backup partition and then to each device all works to my liking. I feel quite confident with my setup. It took a lot of failure to gain that confidence.

    I also spent time hardening my system. I went through this Linux hardening guide and applied what I thought would be appropriate for my web-facing server. Since the guide seems aimed more at a personal computer (I think), the majority of it didn’t apply to my use case. I also use Alpine Linux, so there was even less I could do for my system, but it was still helpful in understanding how much effort it takes to secure a computer.


  • That’s been my main goal throughout securing my personal devices, including my web-facing server: to make things as inconvenient as possible for potential outside interference, even if it means simply wasting their time.

    With how complex computers and other electronic devices have become, I never expect anything I own to be 100% secure even if I take steps I think will make me secure.

    I’ve been on the internet long enough to have built a habit of obscuring my online and digital presence. It won’t save me, but it makes me less of a target.


  • I found BashWrite, which is just a very simple static site generator written entirely in Bash as a single-file script.

    The only dependency is an up-to-date sed command, which most systems should have. I use Alpine Linux, which ships a minimal (BusyBox) sed, so I had to install the full GNU sed through my package manager.
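
    On Alpine that ends up being a one-liner (run as root):

    # Install GNU sed to replace the minimal BusyBox version
    apk add sed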

    It’s simple, basic and has support for the majority of markdown formatting. There’s some limitations due to it being written in Bash only but I am personally okay with that.

    I found it on this list of static site generators if you’re curious to see more options.



  • Since my logs barely move, I just made aliases pointing to where the logs are, so it’s quick to display and scan them in the terminal. I’m basically just viewing the system logs, the fail2ban log, and Caddy’s log, so it’s fairly quick and simple for me.
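
    Roughly like this (the log paths are placeholders; they differ by distro and setup):

    # Quick views of the logs I care about (placeholder paths)
    alias syslog='less /var/log/messages'
    alias f2blog='less /var/log/fail2ban.log'
    alias caddylog='less /var/log/caddy/access.log'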

    The only change I’d like to make is to the output of Caddy’s log file, so each entry isn’t one long single line of information. I’ll have to do a bit more reading on that so I know what information I want to keep and how I want to organize it visually. At least for the moment, I am familiarising myself with what I am looking at and am slowly figuring out what information is relevant to me.
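
    Since Caddy writes each access log entry as a single line of JSON by default, something like jq can at least pretty-print it or pull out a few fields while I figure out what I actually want (the log path is a placeholder):

    # Pretty-print every JSON log entry (placeholder path)
    jq '.' /var/log/caddy/access.log
    # Or follow the log and keep only a few fields
    tail -f /var/log/caddy/access.log | jq '{ts: .ts, status: .status, uri: .request.uri}'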

    I like to keep my systems as simple and lean as possible which seems to strongly reflect my general approach to life. I find that kind of interesting.


  • I feel like my little Pi server is set up nicely now. At least I’m at the point where I’m not concerned about technically maintaining it. It’s as secure as I want it to be and I’ve tweaked my maintenance scripts slightly to avoid any unexpected issues.

    I tried installing Snikket but I couldn’t figure out how to get it to work with my Caddyfile using my current wildcard domain cert configuration. I’ll try again another time when I’m feeling motivated. It’s a low priority for me.

    The last changes I made were adding logs and making them accessible to myself. So far they are all boring and predictable. Which is good news. It’s also nice to see that I’m the only person accessing it. The bots haven’t found my little corner of the internet yet.

    Right now I’m taking a break from self-hosted stuff to work on my gardens and two artsy projects. A wooden carving for a friend’s birthday and an overly complicated shell script that has no real purpose. Although I’ve learned lots from it already so it’s not a complete waste of time.


  • I use rsync too. It’s older and, from what I understand, was designed at a time when data storage was much smaller, so it may not be as fast as other backup options. It also doesn’t do encrypted backups like some other backup options do (I think).

    Rsync has been the most reliable option for me though. Every syncing option I’ve tried seems too complicated and breaks down every time I look away. Since my entire backup is around 550 GB and I’m not concerned with encrypted backups, rsync works just fine for me.

    I even created my own tool that puts my rsync commands into easy-to-read/modify files so I can organize my most common transfers. I can now back up my phone, Home Assistant server, home server, and computer to my two backup locations with a single alias or cron job.
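
    The cron side of it is nothing special (the script path and schedule are placeholders):

    # Run the combined backup script every night at 02:00 (placeholder path)
    0 2 * * * /home/computer_username/scripts/backup.sh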

    It was a bit of a pain learning how to make proper backups that restore successfully every time, but once I figured it out, I’ve been very confident in my backup strategy.



  • That on its own is fine.

    But I said no. I shouldn’t have to say no more than once because it’s annoying to continually say no. It is weird that they put nearly two weeks of effort into trying to get me to do something when I already said no.

    We already worked a physically demanding job and I rode a bike to and from work. I was already happy with my body but they weren’t happy with my arms.