Comment by chrisweekly 5 days ago

36 replies

> "Warning

Do not put this on the Internet if you do not know what you are doing.

By default this container has no authentication and the optional environment variables CUSTOM_USER and PASSWORD to enable basic http auth via the embedded NGINX server should only be used to locally secure the container from unwanted access on a local network. If exposing this to the Internet we recommend putting it behind a reverse proxy, such as SWAG, and ensuring a secure authentication solution is in place. From the web interface a terminal can be launched and it is configured for passwordless sudo, so anyone with access to it can install and run whatever they want along with probing your local network."

I hope everyone intrigued by this interesting and potentially very useful project takes heed of this warning.

satertek 5 days ago

That warning applies to anything you run locally. And going further, in this day and age, I would never put up any home service without it being behind Cloudflare Access or some form of wireguard tunnel.

  • Timber-6539 5 days ago

    Just put basic auth in front of your services and be done with it.
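
    A minimal sketch of that, assuming nginx as the reverse proxy (the user name, password, file paths, and upstream port are all placeholders):

    ```shell
    # Generate an htpasswd entry without needing apache2-utils, via openssl:
    printf 'admin:%s\n' "$(openssl passwd -apr1 'change-me')" > ./htpasswd

    # Then point nginx at it (illustrative server block, adjust to taste):
    #   location / {
    #       auth_basic           "Restricted";
    #       auth_basic_user_file /etc/nginx/htpasswd;
    #       proxy_pass           http://127.0.0.1:3000;
    #   }
    ```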

    • KronisLV 5 days ago

      I've done that in the past, even for securing the admin pages of some software (there was once an issue where the admin page auth could be bypassed, this essentially adds another layer). With TLS it's okay for getting something up and running quickly.

      Of course, for the things that matter a bit more, you can also run your own CA and do mTLS, even without any of the other fancy cloud services.
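
      For reference, a bare-bones private CA for mTLS can be done with openssl alone; this is a hedged sketch with placeholder names and lifetimes, not a production PKI:

      ```shell
      # Create the CA key and a self-signed CA certificate:
      openssl req -x509 -newkey rsa:2048 -nodes -days 365 \
          -keyout ca.key -out ca.crt -subj "/CN=homelab-ca"

      # Issue a client certificate signed by that CA:
      openssl req -newkey rsa:2048 -nodes \
          -keyout client.key -out client.csr -subj "/CN=laptop"
      openssl x509 -req -in client.csr -CA ca.crt -CAkey ca.key \
          -CAcreateserial -days 365 -out client.crt

      # A server like nginx then enforces it with something like:
      #   ssl_client_certificate /path/to/ca.crt;
      #   ssl_verify_client      on;
      ```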

    • baq 5 days ago

      the fact that we have to keep reinventing kerberos all the time because it doesn't speak http is starting to legitimately annoy me.

      • rlkf 5 days ago

        Firefox can be configured to use Kerberos for authentication (search for "Configuring Firefox to use Kerberos for SSO"); on Windows, Chrome is supposed to do so too by adding the domain as an intranet zone.

      • j16sdiz 5 days ago

        HTTP auth can work with kerberos.

        Chrome, Firefox, Internet Explorer -- all support some form of kerberos auth in HTTP/HTTPS.

    • mschuster91 5 days ago

      Good luck when the TCP or SSL stack has an issue. These bugs are rare but they do exist and you're getting fucked royally if your entire perimeter defense was a basic auth prompt.

      Windows and Linux have both had their fair share of network stack bugs, OpenSSL had Heartbleed and a few other bugs, and hell you might even run into bugs in Apache or whatever other webserver you are using.

      • nurettin 5 days ago

        It would have taken several days to heartbleed your private key in 2013 if you also added fail2ban. Your home lab probably isn't on the high priority target list.

        • mschuster91 4 days ago

          > Your home lab probably isn't on the high priority target list.

          Yeah but these days with botnets widely available to hire? Everything is fair game and whatever you run gets indexed on Shodan and whatever almost immediately. The game has never been easier for skiddies and other low-skill attackers, and mining cryptocoins or hosting VPN exit nodes makes even a homelab a juicy target.

          My homelab for example sports four third-hand HP servers with a total of about 256GB RAM and 64 CPU cores on a 200/50 DSL link. That's more than enough horsepower to cause serious damage with.

hifikuno 5 days ago

Yeah, I made a mistake with my config. I had set up SWAG, with Authelia (I think?). Got password login working with 2FA. But my dumbass didn't realize I had left ports open. Logged in one day to find a terminal open with a message from someone who had found my instance and got in. Called me stupid (I mean, they're not wrong) and all kinds of things, and deleted everything from my home drive to "teach me a lesson". Lesson painfully learnt.

But before that happened, Webtop was amazing! I had Obsidian set up so I could have access on any computer. It felt great having "my" computer anywhere I went. The only reason I don't have it set up is because I made the mistake of closing my free tier Oracle Cloud, thinking I could spin up a fresh new instance, and since then I haven't been able to get the free tier again.

  • 7bit 5 days ago

    > deleted everything from my home drive to "teach me a lesson". Lesson painfully learnt.

    I had a mentor in my teenage years who was the same kind of person. To this day the only meaningful memory I have of him is that he was an asshole. You can teach a lesson and be empathetic towards people who make mistakes. You don't have to be an asshole.

    • Dalewyn 5 days ago

      The lessons we learn best are those which we are emotionally invested in and sometimes that emotion can be negative, but a lesson will be learned regardless.

      • 7bit 3 days ago

        LOL! That's why we still smack kids' hands with a stick if they answer a question in school wrong. Because it emotionally sticks and definitely does not cause any psychological issues.

      • ano-ther 5 days ago

        Sure. But you don’t have to deliberately destroy all data and be mean about it as in GP’s case to get an emotional reaction.

  • elashri 5 days ago

    > The only reason I don't have it set up is because I made the mistake of closing my free tier Oracle Cloud, thinking I could spin up a fresh new instance, and since then I haven't been able to get the free tier again.

    People are automating the process of requesting new ARM instances on the free tier [1]. You would find it near impossible to compete without playing the same game.

    [1] https://github.com/mohankumarpaluru/oracle-freetier-instance...

    • 7thpower 5 days ago

      Well, I know what I’m doing tomorrow when I get up.

      • hrrsn 5 days ago

        I had the same thing happen to me. I tried running a script for a month without luck (Sydney region). What did work was adding a credit card to upgrade to a paid account - no issues launching an instance, and it's still covered under the free tier.

  • Maakuth 5 days ago

    There are operations that put cryptominers into any unauthenticated remote desktops they can find. Ask me how I know... Way friendlier than wiping your data though.

    • unixhero 5 days ago

      There are groups of people who hunt for writeable FTP servers to use for random filesharing. At least, this used to be a thing.

  • dspillett 5 days ago

    > Lesson painfully learnt.

    There are actually two lessons there:

    1. Be careful what you open to the public internet, including testing from outside to make sure you haven't accidentally left insecure defaults in place.

    2. Backups. Set them up, test them, make sure someone successfully gaining access to the source box(es) can't from there wipe all the backups.

    • doubled112 5 days ago

      An offline backup is incredibly inconvenient, but also very effective against shenanigans like these.

      Also agree that backups should be "pulled" with no way to access them from the machine being backed up.
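
      A pull-based setup can be as simple as rsync run from the backup host; the hostname and paths here are placeholders:

      ```shell
      # Run on the backup host: it reaches out to the source machine, never
      # the other way around, so the source holds no credentials that could
      # reach the backups.  -a preserves metadata, --delete mirrors removals.
      rsync -a --delete backup-user@source-host:/home/ /srv/backups/source-host/
      ```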

      • dspillett 5 days ago

        I use a soft-offline backup for most things: sources push to an intermediate, backups pull from the intermediate, and neither source nor backup can touch the other directly.

        Automated testing for older snapshots is done by verifying checksums made at backup time. For the latest snapshot, both ends push fresh checksums to the middle for comparison: any file with a timestamp older than the last backup whose checksum differs indicates an error on one side or the other (or perhaps the intermediate) that needs investigating, as does any file whose timestamp differs by more than the inter-backup gap, or anything that unexpectedly doesn't exist in the backup.

        I have real offline backups for a few key bits of data: my main KeePass file, and the encryption & auth details for the backup hosts & process, since those don't want to exist in the main backup (that would create a potential hole in the source/backup separation).
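
        The comparison step above can be sketched with standard tools (directories and file names are illustrative):

        ```shell
        # Build sorted checksum lists on both ends, then diff them; any
        # difference flags a file that needs investigating.
        mkdir -p /tmp/src /tmp/bak
        echo "data" > /tmp/src/f.txt
        cp /tmp/src/f.txt /tmp/bak/f.txt
        ( cd /tmp/src && find . -type f -exec sha256sum {} + | sort -k2 ) > /tmp/src.sums
        ( cd /tmp/bak && find . -type f -exec sha256sum {} + | sort -k2 ) > /tmp/bak.sums
        diff /tmp/src.sums /tmp/bak.sums && echo "backup matches"
        ```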

  • nsteel 5 days ago

    But you can already have Obsidian access from any device if you set up syncing using the official method (and support the project by doing so) or one of the community plugins. Doing it this normal way avoids opening up a massive security hole too.

    • jazzyjackson 5 days ago

      * Any device you have admin rights to install software on; they are talking about being able to log in from any computer, not just their own.

      It surprises and annoys me that obsidian, logseq, etc don't have self hosted web front ends available. I think logseq will once they wrap up the db fork, and maybe someday we'll have nuclear fusion powerplants too.

      • nsteel 5 days ago

        Ahhh that makes perfect sense, thank you. I'm so used to always having my phone this didn't even cross my mind.

gbraad 5 days ago

I created a personalized image with Tailscale and KasmVNC for this particular reason, ... not on a public VPS. You can find images on my GitHub as inspiration; do not copy them directly unless you understand what you are doing.

fulafel 5 days ago

Also note that their example Docker config will allow anyone from the internet to connect, and will even add an incoming rule to your host firewall to allow it. This is because they don't restrict the published port with -p 127.0.0.1:hostport:containerport (or the analog in the docker-compose config).
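
The fix is to bind the published port to loopback explicitly; the port numbers and image name below are illustrative:

```shell
# Only reachable from the host itself, not from the network:
docker run -d -p 127.0.0.1:3000:3000 lscr.io/linuxserver/webtop

# docker-compose analog:
#   ports:
#     - "127.0.0.1:3000:3000"
```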

asyx 5 days ago

No they won’t. OctoPrint (a 3D printing server) had a similar warning, but they had to introduce actual user accounts to secure the system because people ignored it.

macinjosh 5 days ago

If a good password is used, HTTP basic auth over HTTPS is plenty secure, since everything is encrypted in transit.