
Since you seem knowledgeable on this topic I'd like to ask - how risky is it to expose a computer on your network to the internet, if you're somewhat tech-savvy but not very familiar with networking? Is it relatively "safe" with modern tools and VMs, or do you need to stay on top of it and (for example) always make sure you're updating software weekly?

I've thought of setting up and running a server for a long time and finally have a spare laptop so I'm thinking of actually running a NAS at least.



I've been doing it for about 13 years now with HTTP/s (80, 443), SSH (22), MOSH (lol idk), and IRC (6697) exposed to the internet. You don't need it, but something like fail2ban or crowdsec is a good idea. You will get spammed with attempts to break in using default passwords for commodity routers (Ubiquiti's `ubnt` is rather popular), but if you're up to date and take a few minor precautions it's not all that hard and/or dangerous. That being said, there are alternatives such as Tailscale that are strictly more secure but far less flexible. I've heard of people using Cloudflare tunnels as well, but I'd rather not rely on big players for stuff like that if I'm going through the effort to self host (and don't have any real risk of DDoS).
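
If you do go the fail2ban route, a minimal jail is roughly this (a sketch rather than my exact config; these are fail2ban's standard jail options):

    # /etc/fail2ban/jail.local
    [sshd]
    enabled  = true
    port     = ssh
    maxretry = 5
    findtime = 10m
    bantime  = 1h

That alone quiets down most of the brute-force noise in the logs.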

I would try to set up automatic updates for critical security patches, or update about weekly. I know people that self host and do it monthly and they seem fine too. Most anything super scary vulnerability-wise is on the front page here for a while, so if you read regularly you'll probably see when a quick update is prudent. I personally use NixOS for all of my servers and have auto-updates configured to run daily.
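
On NixOS that's just a few lines in configuration.nix (a sketch of the standard system.autoUpgrade options, not my exact config):

    # configuration.nix -- daily unattended upgrades
    system.autoUpgrade = {
      enable = true;
      dates = "daily";       # systemd calendar expression
      allowReboot = false;   # don't reboot the box on its own
    };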

An old laptop is exactly how I got started 13 years ago, they're great because they tend to be pretty power efficient and quiet too.


My stuff is always out of date and hasn't gotten hacked yet.

I don't see why you'd want to run ssh on port 22. I run it on a different port and never get login attempts. Yes, if someone targeted me specifically of course they'd find out, but I guess that hasn't happened yet.
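
For anyone curious, moving it off 22 is a one-line change plus a reload (the port number here is arbitrary):

    # /etc/ssh/sshd_config
    Port 2222

    # then reload the daemon; the unit is "ssh" or "sshd" depending on distro
    sudo systemctl restart sshd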


> I don't see why you'd want to run ssh on port 22.

I run ssh on port 22 because I like wasting the time of those script kiddies. Also I like to brag about half a million "hacker attacks" on my server per month.



jwz does not like hacker news, maybe copy that link instead of clicking it...


Thanks, I had no idea I was flirting with posting abusive content!


> I've heard of people using Cloudflare tunnels as well...

As a Cloudflare Tunnels user who only recently discovered Tailscale - just go with Tailscale straight off the bat. It's magic, and smooth as butter.


Tailscale Funnel [0] is limited to TLS-based protocols (maybe even just HTTPS) which is a non-starter for many cases.

[0]: https://tailscale.com/kb/1223/funnel


Which cases? Tailscale has eliminated all the fears I had about self hosting and I've been using it a ton. The only issue I've run into has been a single service (Withings) that uses a webhook to trigger updates for my sleep mat. Their server isn't on my tailnet, so I would need to expose at least one service to the wider internet.


I'm talking specifically about Tailscale Funnel, which gives ingress access to services on the tailnet from outside (i.e. on the general internet). Any case that doesn't use TLS as a transport won't work; SSH is a notable one, but I can think of several others.


Check out the selfhosted-gateway. You can do arbitrary tcp/udp port forwarding from a VPS: https://github.com/fractalnetworksco/selfhosted-gateway


I'd rather use https://tuns.sh, same idea.


How does Tailscale help with securely self-hosting from home? I have it set up to interface securely with my PCs across networks (like at my in-laws'), but I'm not sure how it helps if I were to expose something to the world.

Thanks!


On top of this, having IPv6 configured makes things harder to discover, but not impossible (as long as you don't use ${ipv6_subnet}::xxxx for your hosts). You can avoid NAT and just expose the nodes you need. Most ISPs assign a /56 or /64, which is a humongous amount of IPs. It's nice if you are just using a flat virtual network in your home lab. The number of scanners I see hitting my subnet is nonexistent at the moment.
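
A rough shell sketch of what I mean by avoiding guessable suffixes (the prefix and interface name are placeholders):

    # give a host a random, non-sequential interface ID inside your /64
    # 2001:db8:1:1::/64 is a documentation prefix; substitute your own
    SUFFIX=$(openssl rand -hex 8 | sed 's/\(....\)/\1:/g; s/:$//')
    sudo ip -6 addr add "2001:db8:1:1:${SUFFIX}/64" dev eth0

Sweeping the low ::1 through ::ffff range is trivial for a scanner; sweeping 2^64 random interface IDs is not.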


That's if your ISP supports IPv6. My current one does, but my last one did not.


The approach most people use is to tunnel out from the server. You install a daemon on the machine you want to reach, and it establishes an outbound tunnel that you can then log into from outside your network. Cloudflare and Tailscale have solutions for this that are very popular among the self-hosted crowd.

https://developers.cloudflare.com/cloudflare-one/application...

https://tailscale.com/kb/1151/what-is-tailscale
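
The Tailscale flow on the server side is roughly this (these are their standard install script and commands, but check the docs for your distro):

    # on the home server
    curl -fsSL https://tailscale.com/install.sh | sh   # official convenience installer
    sudo tailscale up                                   # authenticate this node to your tailnet
    tailscale status                                    # confirm the node and its 100.x.y.z address

After that you reach the server by its tailnet address (or MagicDNS name) from your other devices, without opening anything on your home router.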


A good option is to set up a WireGuard connection between your workstation and servers, and route all admin traffic through it.

Because WireGuard runs over UDP and silently drops packets that don't come from a valid peer, nothing even appears to be open from the outside. Not even SSH.
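
A minimal server-side wg0.conf sketch (keys, addresses, and the port are placeholders):

    # /etc/wireguard/wg0.conf on the server
    [Interface]
    Address    = 10.0.0.1/24
    ListenPort = 51820
    PrivateKey = <server-private-key>

    [Peer]
    # your workstation
    PublicKey  = <workstation-public-key>
    AllowedIPs = 10.0.0.2/32

Bring it up with `wg-quick up wg0`, point the workstation's peer config at your home IP or a dynamic DNS name, and bind SSH and friends to the 10.0.0.0/24 addresses only.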


Additionally, you can use Tailscale (which is WireGuard under the hood) for added convenience. Tailscale is a paid service, but for a simple home server you can get away with the free plan, and their mobile apps work rather well.

Not affiliated with Tailscale at all, just shouting them out because they make things very easy and I often recommend them to hobbyists.


I've been at it for over a decade. Home router has firewall exceptions for SSH (not port 22 though), TLS IRC, and 80/443, which are forwarded to my home server with fail2ban.

I run SSH (requires PKI outside the local network), IRC, Nextcloud, and Ampache (though I don't really use Ampache anymore :( ).
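
The "PKI outside the local network" part is just a Match block in sshd_config, roughly like this (my LAN range will differ from yours):

    # /etc/ssh/sshd_config -- keys required from the internet, passwords allowed on the LAN
    PasswordAuthentication no
    Match Address 192.168.1.0/24
        PasswordAuthentication yes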

Home server is encrypted RAID6 Arch Linux. If I had to do it again I'd forego rolling releases and use something more stable, like Debian.

Encrypted backups are done to backblaze once a month. I also have a backup drive that I plug in on occasion, encrypted of course.
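
If you want something similar, restic's B2 backend is one way to get encrypted offsite backups like this (a sketch, not my exact setup; bucket and paths are placeholders):

    export B2_ACCOUNT_ID=<key-id>
    export B2_ACCOUNT_KEY=<application-key>
    export RESTIC_PASSWORD_FILE=/root/.restic-pass          # repo encryption passphrase
    restic -r b2:my-bucket:server-backups init              # one-time: creates the encrypted repo
    restic -r b2:my-bucket:server-backups backup /srv/data  # run monthly from cron or a systemd timer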

Which reminds me my RAID6 drives are getting old now... I'm tempted to move to a VPS.


It is very service-dependent. If you want to run a NAS for, e.g., a media server, you may want to look into Cloudflare Tunnels or Tailscale.

I set up Jellyfin and Kavita, and those are internet-exposed; Nextcloud, Portainer, and Calibre are behind GitHub SSO auth via Cloudflare. Basically, before you can hit the Nextcloud login page, you have to auth to GitHub (as me) with 2FA first, so no one can sit there and try to brute-force my Nextcloud login.
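
The tunnel side of that is roughly a cloudflared config like the one below (hostnames and ports are placeholders; the GitHub SSO requirement itself is an Access application you attach to the hostname in the Cloudflare Zero Trust dashboard, not something in this file):

    # ~/.cloudflared/config.yml
    tunnel: <tunnel-id>
    credentials-file: /home/user/.cloudflared/<tunnel-id>.json
    ingress:
      - hostname: nextcloud.example.com
        service: http://localhost:8080
      - hostname: jellyfin.example.com
        service: http://localhost:8096
      - service: http_status:404   # catch-all for anything else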


Keep things up to date and, ideally, put your public-facing servers in a DMZ/their own VLAN (a separate network from your private stuff).

Administrative things like SSH and RDP are best accessed over a VPN, but you can configure SSH in particular to use key-based authentication only, which is very secure.
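
Key-only SSH is a couple of lines in sshd_config (just make sure your key is in authorized_keys before you turn passwords off):

    # /etc/ssh/sshd_config
    PubkeyAuthentication   yes
    PasswordAuthentication no
    PermitRootLogin        no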


> Is it relatively "safe" with modern tools and VMs or do you need to stay on top and (for eg) always ensure you're updating software weekly?

The first step is to figure out whether you actually need to access it from the outside at all. If you just want a NAS, chances are you can put it on a separate VLAN/network that is only reachable within your LAN, so it wouldn't even be accessible from the outside.

If you really need it to be accessible from the outside, make sure you start with everything locked down/not accessible at all, then step-by-step open up exactly what you want, and nothing else. Make sure every accessible endpoint runs software you keep up to date, at least weekly if not daily.


You'll want to make sure everything stays up to date in case someone finds a vulnerability in whatever software you're currently running. If you have to expose stuff to the outside world, only open the ports you need. Only allow access to a specific user with a non-default username (or at the very least disable root SSH access), and use long passwords or SSH keys. I think that's generally the bare minimum, but there are online guides to harden your stuff further, like using WireGuard and fail2ban.
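
Concretely, that bare minimum looks something like this ("alice" and the port list are placeholders; ufw is just one example of a host firewall):

    # firewall: only open what you actually serve
    sudo ufw default deny incoming
    sudo ufw allow 80,443/tcp
    sudo ufw allow 22/tcp
    sudo ufw enable

    # /etc/ssh/sshd_config: one named user, no root logins
    AllowUsers      alice
    PermitRootLogin no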


I sat on the fence for a long time wanting to do this, and finally pulled the trigger and picked up a Synology NAS last year. I've had a blast setting up a handful of handy little self-hosted services on the thing. Highly recommend giving it a go!

I haven't had any security issues yet (knock on wood). But it seems pretty low-risk if you follow basic best practices. The only thing I have exposed to the internet is a reverse proxy that proxies to a handful of docker containers.
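
Something like this Caddyfile is one common way to do that last part (illustrative only; the hostnames and container ports are placeholders):

    # one public entry point, proxying to local containers
    jellyfin.example.com {
        reverse_proxy localhost:8096
    }
    kavita.example.com {
        reverse_proxy localhost:5000
    }

Caddy also takes care of the TLS certificates for those hostnames automatically.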


Just add `apt update && apt upgrade -y` to root's crontab.


A better solution is probably: https://wiki.debian.org/UnattendedUpgrades
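
Which on Debian/Ubuntu boils down to roughly:

    sudo apt install unattended-upgrades
    sudo dpkg-reconfigure --priority=low unattended-upgrades   # enables the periodic security updates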



