• 0 Posts
  • 29 Comments
Joined 4 months ago
Cake day: December 4th, 2025



  • There are a lot of options. Countless paid services offer exactly that.

    If you wanna build something yourself for free, you could set up a site accessible via HTTP on your server and create a script on your phone that pings it every 30 seconds or so. Afaik, termux ships a termux-notification command that lets you send a notification.

    Codewise, it would look something like this, I think:

    #!/usr/bin/env bash

    # --- Config ---
    NOTIFY_TITLE="Server Alert"
    NOTIFY_MESSAGE="Server returned a non-200 status."

    HOST="funnysite.com"
    PORT=8080
    ENDPOINT="/healthcheck"   # don't call this PATH: that would clobber the shell's command search path

    URL="http://${HOST}:${PORT}${ENDPOINT}"
    # --- End config ---

    # -s: silent, -o /dev/null: discard the body, -w: print only the status code,
    # --max-time 10: don't hang forever if the server is unreachable
    HTTP_CODE=$(curl -s -o /dev/null --max-time 10 -w "%{http_code}" "$URL")

    if [[ "$HTTP_CODE" != "200" ]]; then
        termux-notification -t "$NOTIFY_TITLE" -c "$NOTIFY_MESSAGE $HOST:$PORT"
    fi

    exit 0
    

    Afaik, termux doesn’t ship a cron daemon, but you can install cronie or use an external task scheduler. Then just set it to run the script every minute or so. Whatever you need.
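    For reference, assuming the script above was saved as ~/check-server.sh and made executable (chmod +x), the cronie crontab entry (added via crontab -e after pkg install crond/cronie) for an every-minute check, which is cron's smallest interval, could look like this:

    ```
    * * * * * ~/check-server.sh
    ```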

    I haven’t tested any of this, but in my head, it sounds like it should work fine.










  • Heimdall or Dashy are the first things that come to mind. However, what I would do in your case is use local URLs that you can resolve via a local DNS server like pihole. That way, you don’t have to remember IPs and ports, just service names. If you need different ports, you might need a proxy in between, which is also fairly quick to set up with nginx.
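    As a rough sketch of the proxy part: assuming a pihole DNS entry that points a made-up name like jellyfin.lan at the proxy host, an nginx server block forwarding that name to the service’s port (8096 here, also made up) might look like this:

    ```
    server {
        listen 80;
        server_name jellyfin.lan;

        location / {
            proxy_pass http://127.0.0.1:8096;
            proxy_set_header Host $host;
            proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
        }
    }
    ```

    One server block per service name, and you never have to type a port again.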




  • You’re still querying search engines with your IP

    The IP in itself might not be as much of a problem unless you have a static IP, which most consumers don’t. And even if you do, you’re also hiding a lot of baggage like your user agent and other fingerprintable settings. The IP alone is rarely used as the sole point to link your traffic to other datapoints. On top of that, you can still just exclude google, bing etc. from your search results and rely more on “open” ones like DDG or ecosia.

    Another huge upside of searxng is the aggregation of results. Google’s search results are entirely up to, well, google. Same with bing, which is controlled by microsoft. If these companies decide to “suppress” certain information, people using only those engines directly will no longer see it. However, if you get your results from multiple search engines, you are not - or let’s say less - affected by that kind of nonsense.

    As always with news and information, the truth usually lies somewhere in the middle. And that’s where searxng helps out tremendously.





  • Adding certificates is a five-step process: Settings -> Privacy and Security -> View Certificates -> Import -> select the file and confirm. That’s on firefox at least; idk about chrome, but it’s probably not significantly more complex. With screenshots, a small guide would be fairly easy to follow.

    Don’t get me wrong, I do get your point, but I don’t feel like making users add client certs to their browser storage is more work than helping them every 2 weeks because they forgot their password or shit like that lol. At least, that’s my experience. And the cool thing about client certs is users can’t really break them, unlike passwords, which they can forget, or change because they forgot, just to then forget they changed them. Once it runs, it runs.
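    Generating the cert to hand out isn’t much work either. A minimal sketch, assuming openssl is installed; all file names and the CN are made up, and a real setup would have the client cert signed by a CA the server is configured to trust rather than being self-signed:

    ```shell
    # Generate a private key and a self-signed certificate
    # (hypothetical names; real deployments would sign with a CA
    # the server trusts)
    openssl req -x509 -newkey rsa:2048 -nodes \
      -keyout client.key -out client.crt \
      -days 365 -subj "/CN=alice"

    # Bundle key and cert into a PKCS#12 file, which the browser's
    # Import dialog accepts (it will ask for this password)
    openssl pkcs12 -export -in client.crt -inkey client.key \
      -out client.p12 -passout pass:changeme
    ```

    The resulting client.p12 is the single file the user imports in that five-step process.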


  • The “average user” shouldn’t selfhost anything. That might sound mean or like gatekeeping, but it’s the truth. It can be dangerous. There’s a reason I hire an electrician to do my house’s wiring even tho I theoretically know how to do it myself: I’m not amazingly well versed in it and might burn down my house, or worse, burn down other people’s houses.

    People who are serious about selfhosting need to learn how to do it. Half-assing it will only lead to it getting breached, integrated into a botnet, and becoming a burden on the rest of humanity.