• 0 Posts
  • 64 Comments
Joined 1 year ago
Cake day: August 9th, 2023


  • Deckweiss@lemmy.world to Selfhosted@lemmy.world: Multi system synced/living OS possible? (edited 15 hours ago)

    Even when my internet doesn’t suck for a minute, I have yet to find Linux remote desktop software that isn’t sluggish or marred by compression artifacts, low resolution and inaccurate colors.

    I tried my usual workflows, and doing any graphic design or 3D work was impossible. Even coding or writing notes had me mistyping A LOT and backspacing 3-5 times, since the visual feedback was delayed by at least half a second.


  • I run this somewhat. The question I asked myself was: do I R-E-A-L-L-Y need a clone of the root disk on two devices? And the answer was: no.


    I have a desktop and a laptop.

    Both run the same OS (the installed packages overlap, but aren’t identical).

    I use Syncthing, together with a Syncthing instance on a VPS, to sync some directories from my home folder: Downloads, project files, my .bashrc, .local/bin scripts and everything else that I actually need on both machines.

    The Syncthing VPS is always on, so I don’t need both computers running at the same time to sync the files. It also acts as an offsite backup this way, in case of a catastrophic destruction of both my computers.

    (The trick with Syncthing is to give the same directory the same folder ID on each machine before syncing. Otherwise it creates a second directory like “Downloads_2”.)
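
    For illustration, matching folder entries in Syncthing’s config.xml could look roughly like this on each machine (the folder ID, label, paths and device IDs here are made up, and in practice this is usually set up through the web UI; exact attributes vary by version):

        <folder id="downloads" label="Downloads" path="/home/me/Downloads" type="sendreceive">
            <device id="DESKTOP-DEVICE-ID"></device>
            <device id="LAPTOP-DEVICE-ID"></device>
            <device id="VPS-DEVICE-ID"></device>
        </folder>

    As long as the id attribute is identical on every machine (including the VPS), Syncthing treats it as one shared folder instead of creating a “Downloads_2”.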

    That setup is easy and gets me 95% there.

    The remaining 5% that isn’t synced is packages (which are sometimes only needed on one of the computers, not both) and system modifications (which I wouldn’t even want to sync, since a lot of those are hardware specific, like screen resolution and display layout).


    The downsides:

    • I have to configure some settings twice. Like the printer that is used by both computers.

    • I have to install some packages twice. Like when I find a new tool and want it on both machines.

    • I have to run updates separately on both systems. I have been thinking about setting up a shared package cache somehow, but was ultimately too lazy to do it, so I just run the update twice (one way it could look is sketched below).
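
    The comment doesn’t say which distro this is, so purely as an illustration: on an Arch-like system with pacman, a shared cache could be pointed at a common directory (for example an NFS mount) with a one-line config change on both machines:

        # /etc/pacman.conf (both machines) -- assumes a shared mount at /mnt/shared,
        # which is an assumption for this sketch, not something from the comment
        [options]
        CacheDir = /mnt/shared/pacman-cache/

    Sharing the cache through Syncthing itself would risk sync conflicts while the package manager writes to it, which is probably why a network mount or a caching proxy is the more common route.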


    I find the downsides acceptable; the whole thing was a breeze to set up, and it has been running like this for about a year now without any hiccups.

    And as a bonus, I also sync some important documents to my phone.

  • Deckweiss@lemmy.world to Selfhosted@lemmy.world: Non-Cloudflare AI blocking? (edited 12 days ago)

    The only way I can think of is blacklisting everything by default, directing visitors to a proper, challenging captcha (which can be self-hosted), and temporarily whitelisting proven human IPs.

    When you try to “enumerate badness” and block all AI user agents and IP ranges, you’ll always let some new ones through, and you’ll never be done adding to the list.

    Only allow proven humans.
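
    As a rough sketch of that allow-by-proof idea at the firewall level, using ipset/iptables (the set name, port and timeout are purely illustrative, and in practice the allowlisting would more likely live in the reverse proxy so HTTPS keeps working):

        # allowlist whose entries expire automatically (24 h here)
        ipset create verified_humans hash:ip timeout 86400

        # anyone not on the allowlist gets redirected to the self-hosted captcha service
        iptables -t nat -A PREROUTING -p tcp --dport 80 \
            -m set ! --match-set verified_humans src \
            -j REDIRECT --to-port 8080

        # after a visitor solves the captcha, its backend adds their IP temporarily
        ipset add verified_humans 203.0.113.5 timeout 86400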


    A captcha will inconvenience your users. If you just want to make things harder for the crawlers, let them spend compute resources through something like https://altcha.org/ (which would still allow them to crawl your site, but makes DDoSing very expensive) or AI honeypots.

  • Here is my personal approach to this.

    • I have set my bash history to a ridiculous 1000000 max length, so that I can use CTRL+R to search for commands that I have run before (the relevant settings are sketched at the end of this comment)

    • I write down a lot of commands in a searchable plain-text notes document

    • Ask ChatGPT

    • Use the tldr command

    • Added A LOT of verbose custom aliases and scripts. For example, instead of

    inotifywait -m -r --exclude "(/tmp.*|/var/cache.*|/dev/pts/|/var/log.*)" -e MOVED_TO -e CREATE -e CLOSE_WRITE -e DELETE -e MODIFY .

    (nobody can remember that alphabet gibberish) I just type

    watch_for_changes .

    Since it is verbose and straight from my brain, I always remember it, and it works with autocomplete. I have ~30 such commands so far. (One possible way to define such a wrapper is sketched below.)
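
    For illustration, a wrapper like the one above could be a tiny script in ~/.local/bin (this definition is a guess; only the inotifywait command itself comes from the comment):

        #!/usr/bin/env bash
        # ~/.local/bin/watch_for_changes -- verbose wrapper around the inotifywait call above
        # Usage: watch_for_changes [directory]   (defaults to the current directory)
        inotifywait -m -r \
            --exclude "(/tmp.*|/var/cache.*|/dev/pts/|/var/log.*)" \
            -e MOVED_TO -e CREATE -e CLOSE_WRITE -e DELETE -e MODIFY \
            "${1:-.}"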
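
    And for the history point above, the relevant ~/.bashrc settings could look something like this (the 1000000 length is from the comment; the other lines are common companion settings, not something the comment specifies):

        # keep a (practically) unlimited history so CTRL+R can find old commands
        HISTSIZE=1000000
        HISTFILESIZE=1000000
        shopt -s histappend   # append to the history file instead of overwriting it on exit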