

I’m a ~/tmp man myself.


Maybe not a service in the typical sense, but setting up your router+server to route your home network traffic through a VPN is a fun project.
My router (MikroTik) supports WireGuard, so I can use it with Mullvad for the whole house—but wg is demanding and it’s a slow router, so while it can NAT at ~1Gbps, it can’t do WireGuard at more than ~90Mbps. So, I set up WireGuard/Mullvad on a little SBC with a fast processor, and have my router use that instead. Using policy based routing and/or mangling, I can have different VLANs/subnets/individual hosts selectively routed through the VPN.
It’s a fun exercise, not sure I implemented it in a smart way, but it works :)
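Roughly, the selective-routing half looks something like this on the RouterOS side (subnet, gateway address, and mark name are made up; on RouterOS v7 you declare the routing table first and use routing-table= on the route instead of routing-mark=). The first rule marks traffic from one subnet, the second sends marked traffic to the WireGuard SBC instead of the normal WAN gateway:

    /ip firewall mangle add chain=prerouting src-address=192.168.30.0/24 action=mark-routing new-routing-mark=via-vpn passthrough=yes
    /ip route add dst-address=0.0.0.0/0 gateway=192.168.1.2 routing-mark=via-vpn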
Whenever I have a Linux box without Internet I just USB tether an Android phone—if the phone is on WiFi then it uses that (not cell), so it’s basically just a WiFi adapter that’s almost universally supported. (I think it NATs, so in some circumstances won’t work, but good enough for most emergency use cases.)
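For what it’s worth, the tethered phone just shows up as another network interface on the Linux side; the exact name varies by phone and kernel, but something like this is enough to sanity-check it:

    ip addr show            # look for a new usb0 / rndis0 / enx... interface
    ip route show default   # the phone should now be (one of) the default gateways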


I would recommend PoE security cameras. You probably want support for RTSP / ONVIF.
I have some Amcrest cameras talking to Frigate. It is completely local—cameras on a separate VLAN that can’t talk to the Internet, footage is recorded on a server running Frigate. Works very well for me. No vendor lock-in is also nice!
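For reference, a minimal Frigate camera entry looks roughly like this (camera name, IP, and credentials are placeholders; the RTSP path is the usual Amcrest/Dahua one, so check your model’s docs):

    cameras:
      front_yard:
        ffmpeg:
          inputs:
            - path: "rtsp://user:password@192.168.40.11:554/cam/realmonitor?channel=1&subtype=0"
              roles:
                - detect
                - record
        detect:
          width: 1280
          height: 720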


640k 780k ought to be enough for anybody…
I know right? What a poser!
/s


If you search around you might find free ones. Oracle has/had a free tier (though it’s Oracle, so…).


Sadly not really. I use Oracle’s free tier, which honestly has worked very well, but I’m not going to recommend Oracle beyond saying that it functionally works for me.
If I were to switch I would probably go to RackNerd.


Yes, but you can run multiple VPS, from different providers, simultaneously.
What I like is that while it does depend on an external provider, it doesn’t depend on a specific external provider. Any VPS with a public IPv4 would work.


VPS+VPN (WireGuard for me), with Tailscale as an emergency alternative, has worked very well for me. Knock on wood the only outages have been my own fault.


VPS+VPN, this is what I do.
VPS has public IP and runs WireGuard “server”* and a reverse proxy (and fail2ban…). Reverse proxy points to my home computer over the WireGuard link. No open ports on my home router.
For private facing/LAN-only services I just don’t have an entry in the VPS reverse proxy. DNS on the router points everything to my local server, so if at home I access everything directly. To access internal services remotely requires VPN (i.e., WireGuard to the VPS).
Works well; I have a tiny free tier VPS but even so, no complaints.
*Yes, I know there are no wg clients or servers, only peers, but it plays a server-like role.
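For anyone wanting to copy the setup, the WireGuard part is just a normal two-peer config; something like the below (keys, addresses, and the hostname are all placeholders). The VPS reverse proxy then points at the home peer’s tunnel address (e.g. proxy_pass http://10.8.0.2:8080; in nginx, port made up).

    # /etc/wireguard/wg0.conf on the VPS (the "server" peer)
    [Interface]
    Address = 10.8.0.1/24
    ListenPort = 51820
    PrivateKey = <vps private key>

    [Peer]
    # home server
    PublicKey = <home public key>
    AllowedIPs = 10.8.0.2/32

    # /etc/wireguard/wg0.conf at home (no inbound ports needed)
    [Interface]
    Address = 10.8.0.2/24
    PrivateKey = <home private key>

    [Peer]
    PublicKey = <vps public key>
    Endpoint = vps.example.com:51820
    AllowedIPs = 10.8.0.1/32
    # keeps the NAT mapping open so the VPS can always reach home
    PersistentKeepalive = 25

The PersistentKeepalive on the home side is the important bit: it’s what lets the VPS reach back through the home NAT without any port forwarding.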


I used Photoprism years ago, so my knowledge is probably pretty outdated.
My experience of Photoprism was that mobile was not tightly integrated. At the time I used Syncthing to sync photos — it worked ok for me, but I wasn’t going to set it up on my partner’s phone, for example.
Immich Just Works on both mobile and desktop. Multi user is great, sharing is great, and the local ML and face detection work remarkably well.
Whatever works for you is the best of course! Immich fits the bill for me, and it was very much worth it for me to “buy” it.


There’s a joke about whitespace here somewhere, I just know it.
xscreensaver of course! Note that this is not an option on Windows—jwz hates Microsoft, and any xscreensaver port to Windows is against his wishes.
I use yabai and sketchybar for a tiling WM feel. It’s nowhere near as nice as my preferred i3, but it’s ok. Unfortunately it often breaks with major OS updates, so I make sure to hold back updating my system until yabai is working.
IIRC sshfs will work on macOS but it’s more work to install. Worth it if allowed by your IT policies and your work can benefit from it.
Vim, tmux, and the usual *NIX stuff you might want.
The coreutils are not the GNU coreutils you typically find on a Linux system, so you may find a few differences. I believe sed is slightly different, and the flags for ls must be before the filename arguments, but I’ve found it’s mostly silly stuff like that (I used zsh before using macOS, so no problem there).
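Two concrete examples of the kind of thing that bites people:

    # in-place sed: GNU takes an optional backup suffix, BSD/macOS requires one (empty is fine)
    sed -i 's/foo/bar/' file.txt      # GNU (Linux)
    sed -i '' 's/foo/bar/' file.txt   # BSD (macOS)

    # BSD ls stops option parsing at the first operand, so flags go first
    ls -l ~/tmp    # fine everywhere
    ls ~/tmp -l    # GNU ls reorders the flag; BSD ls looks for a file named "-l"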


Regarding DNS servers, what router do you have? Some routers have simple enough DNS capabilities — I have a MikroTik, and have it set up with DNS entries for internal services (including wildcard). Publicly accessible services just use my registrar’s DNS (namecheap — no complaints).
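On RouterOS the static entries (wildcard included) are just a couple of lines; names and addresses below are made up, and the regexp escaping may need tweaking depending on your version:

    /ip dns static add name=nas.home.lan address=192.168.1.10
    /ip dns static add regexp=".*\\.apps\\.home\\.lan" address=192.168.1.10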
> Especially after adding in all the power draw the automation requires…
What exactly is the incremental power draw for automation? My network gear and server (a little nuc) are sunk power costs as I self host other services.
Idling, my home uses around 100W with the fridge off. One 10W light is an additional 10% of my power budget, and I have a lot more than one light in my house. I also pay about $0.40/kWh.
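In other words, a single 10W bulb left on around the clock is 0.24kWh/day, which at $0.40/kWh works out to roughly $0.10/day, or about $35/year, and that’s just one bulb.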
I can be a bit neurotic about turning off lights when I leave a room, so Home Assistant was a nice way to free up brain space for me. A few motion sensors here and there + some simple automations (see the sketch below), and the lights mostly handle themselves. Zigbee sensors and Zigbee or Matter-over-WiFi bulbs, so everything is local. A free VPS+WireGuard setup means I can access them remotely should I need to, with Tailscale as a backup.
Cloud failures mean I can’t access remotely, but local control is unaffected—if my smart devices stop working it’s almost certainly my fault :)
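The automations really are simple; a sketch of the kind of thing I mean (entity names are made up):

    automation:
      - alias: "Hallway light off after no motion"
        trigger:
          - platform: state
            entity_id: binary_sensor.hallway_motion
            to: "off"
            for: "00:05:00"
        action:
          - service: light.turn_off
            target:
              entity_id: light.hallway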


Matter is also local—provisioning can be a PITA but once done I’ve been pretty happy with even the cheap Matter WiFi smart bulbs. Home Assistant supports them very well.
Cheap bulbs can be a little buggy, which usually means I need to power cycle some of them now and then.


My lights and motion sensors were obviously unaffected (Home Assistant). My Emporia Vue 2 power monitor would possibly have stopped working, except I flashed it with ESPHome firmware, so it’s local only, and of course it was fine. My security cameras (Frigate) were also fine.
If my smart home devices are going to stop working, it will almost certainly be my fault, thank you very much!
nc is useful. For example: if you have a disk image downloaded on computer A but want to write it to an SD card on computer B, you can run something like

    user@B: nc -l 1234 | pv > /dev/$sdcard

and

    user@A: nc B.local 1234 < /path/to/image.img

(I may have the syntax messed up–also don’t transfer sensitive information this way!)
Similarly, no need to store a compressed file if you’re going to uncompress it as soon as you download it—just pipe wget or curl to tar or xz or whatever. I once burnt a CD of a Linux ISO by wget-ing directly to cdrecord. It was actually kinda useful because it was on a laptop that was running out of HD space. Luckily the University Internet was fast and the CD was successfully burnt :)
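The pipe-to-tar version, for the record (the URL is just a placeholder):

    curl -sL https://example.com/something.tar.gz | tar xzf -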