InfoSec Person | Alt-Account#2

  • 4 Posts
  • 29 Comments
Joined 2 years ago
Cake day: September 28th, 2023

  • Also fuck off with this attitude man. I’m not attacking you, learn how to speak to people.

    Sorry. I get quite triggered when people attach pseudo-labels to distributions, mainly Debian being called outdated. Looking back, I was quite harsh and I apologize.

    However, you’re actively spreading a false narrative by saying Debian’s not good for “general computing” - this is what triggers me. A distribution is essentially just its package manager plus some defaults; distros differ mainly in which defaults and package manager they ship.

    Older packages can be difficult for new users who want a computer to “just work”.

    The only place this makes a difference is with the latest hardware, which OP does not have. I have more recent hardware than OP, and Debian 13 + KDE Plasma 6 works out of the box.

    It’s fine for general computing, but not great.

    Again, I really hate this sentence. I will tone down the rudeness this time in explaining why. I have daily-driven Debian for years with AMD + Intel CPUs and Nvidia GPUs (1070, 3060), with use cases ranging wildly through the years. I cannot fathom what kind of general computing wouldn’t work. If you say specialized computing, I would still disagree, as there are always ways to make things work.

    Just off the top of my head, where things are iffy with Debian: bat can’t be installed via the package manager, but it’s not in most distros’ repositories anyway. There’s a .deb package which works, though (rough install sketch at the end of this comment). Similar with dust, although more distros have it in their package managers.

    Debian, like you said, is rock-solid stable. In my many years of developing code, university courses, daily work (research), and maintaining servers with wildly different usages, Debian’s “outdated” packages have only let me down once, and that was with a LaTeX package which could be installed via CTAN anyway.
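
    Since I mentioned the .deb route, here’s a quick sketch of what that looks like for bat. The release URL and version number are just illustrative - check the project’s releases page for the current one.

```sh
# Illustrative only: fetch the .deb from bat's GitHub releases and install it.
# The version will differ; see https://github.com/sharkdp/bat/releases
wget https://github.com/sharkdp/bat/releases/download/v0.24.0/bat_0.24.0_amd64.deb
sudo apt install ./bat_0.24.0_amd64.deb
```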


  • Debian is rock-solid stable, but lacks newer packages. It’s great for a server, not so great for […] general computing.

    What the fuck??? I’ve been daily driving Debian for years now on my personal laptops, desktop, mini PC, and multiple servers. I’ve found and reported Linux kernel vulnerabilities on my trusty Debian systems.

    What do you mean it’s not so great for general computing? What can’t you do with Debian computing-wise that you can do with other distros? The only issues I’ve ever had were with some LaTeX packages being older versions. You just grab those from CTAN and install them manually.

    This is such a ridiculous comment. What do you do on a server that’s not general computing? You’re doing a subset of general computing??? How does a fucking distro actively prevent you from doing general computing???


    Installed it on my desktop and the process was painful (my fault) because I ran out of space on my boot SSD (128 GB) while doing the upgrade.

    I don’t really have much on my boot SSD, and all my important data is on my laptop, backed up to my servers, or on my desktop’s HDD. I did a fresh install with a KDE live USB stick and that went smoothly, until something with the Nvidia drivers prevented the display server from launching.

    Thankfully, I’ve been through this charade multiple times in the past, and I’m significantly more experienced in dealing with the kernel these days. Adding the nvidia-drm modeset kernel command-line parameter worked (rough sketch at the end of this comment), and my system is now running Debian 13. I’m so happy I have KDE Plasma 6.

    Overall, a one-hour process. Could have been faster if I’d had free space on my system lol. I’m a bit more reluctant to upgrade my servers at the moment, but I may in the upcoming months.

    One minor thing: they updated the apt sources format to deb822 (https://repolib.readthedocs.io/en/latest/deb822-format.html, https://unix.stackexchange.com/questions/498021/deb822-style-etc-apt-sources-list#583015). Idk why, but the installer didn’t create & populate the .sources file. After a quick check of the man page, I created the file myself and it worked.
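
    For anyone hitting the same two snags, rough sketches. First the Nvidia bit - this assumes GRUB; the parameter is the important part, where you put it is up to you:

```sh
# /etc/default/grub: add the modeset parameter to the kernel command line,
# then regenerate the GRUB config and reboot.
GRUB_CMDLINE_LINUX_DEFAULT="quiet nvidia-drm.modeset=1"
# then: sudo update-grub
```

    And a minimal deb822-style sources file - mirror and components here are illustrative defaults, adjust to taste:

```
# e.g. /etc/apt/sources.list.d/debian.sources
# (security updates go in a second stanza pointing at
#  https://security.debian.org/debian-security with suite trixie-security)
Types: deb
URIs: https://deb.debian.org/debian
Suites: trixie trixie-updates
Components: main contrib non-free-firmware
Signed-By: /usr/share/keyrings/debian-archive-keyring.gpg
```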


  • A Basil Plant@lemmy.world to Technology@lemmy.world · fake keepass repo on github · 8 months ago

    I need a recognisable domain name website that google or duckduckgo has picked as the product.

    This doesn’t always work. For example, I used to (and still do) see a lot of fake websites when I type revanced (https://revanced.app/) into DuckDuckGo, and I’ve nearly fallen for two of the fake ones before (I think two of .com / .org / .to…?)

    Thankfully, uBlock Origin warns users about this.

    Otherwise, I’d have 100% downloaded some malware-loaded crap.


  • Not exactly what you asked, but do you know about ufw-blocklist?

    I’ve been using this on my multiple VPSes for some time now and the number of fail2ban failed/banned entries has gone down like crazy. Previously, I had 20k failed attempts after a few months and 30-50 currently banned IPs at all times; now it’s less than 1k failed after a year and maybe 3-ish banned at any time (a quick way to check these numbers yourself is at the end of this comment).

    There was also that paid service where users share the IP addresses behind spammy attempts with a centralized network, which does some dynamic intelligence monitoring. I forgot the name, and search these days isn’t great. Something to do with “Sense”? It was paid, but well recommended as far as I remember.

    Edit: seems like the keyword is "threat intelligence platform"
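
    If you want to pull the same failed/banned numbers on your own box, fail2ban reports them per jail - assuming the usual sshd jail is enabled, something like:

```sh
# "Total failed" and "Currently banned" are the numbers I'm quoting above.
sudo fail2ban-client status sshd
```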


  • That seems to be the consensus online. But thanks for that tidbit! It feels even more bizarre now knowing that.

    I wonder why a handful of people think the way I presented in the post. Perhaps American/British influences in certain places? Reading books by British authors and books by American authors at the same time? Feels unlikely.



    My bachelor’s thesis was about detecting comment amplification/deamplification on Reddit using Graph Neural Networks (PyTorch-Geometric).

    Essentially: there used to be commenters who would constantly agree / disagree with a particular sentiment, and these would be used to amplify / deamplify opinions, respectively. Using a set of metrics [1], I fed them into a Graph Neural Network (GNN) and it produced reasonably good results back in the day (a rough sketch of this kind of setup is at the end of this comment). In the time PyTorch-Geometric has been out, there have been numerous advancements in GNN research as a whole, and I suspect it would work significantly better now.

    Since upvotes are known to the instance administrator (for brevity, not getting into the fediverse aspect of this), and since users’ email addresses are known too, I believe these two pieces of information can be factored in to detect patterns. This would lead to much better results.

    In the beginning, such a solution would need to look for patterns first, and these patterns would need to be flagged as true (bots) or false (users) by the instance administrator - maybe 200 manual flaggings. Afterwards, the GNN could act based on the confidence of its previous pattern matching.

    This may be an interesting bachelor’s / master’s thesis (or a side project in general) for anyone looking for one. Of course, there are a lot of nuances I’ve missed. Plus, I haven’t kept up with GNNs in a very long time, so that should be accounted for too.

    Edit: perhaps IP addresses could be used too? That’s one way Reddit would detect vote manipulation.

    [1] account age, comment time, comment time difference with parent comment, sentiment agreement/disagreement with parent commenters, number of child comments after an hour, post karma, comment karma, number of comments, number of subreddits participated in, number of posts, and more I can’t remember.
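
    A rough sketch of the kind of setup I mean, in case it helps anyone. This is not my thesis code - the graph, features, and labels below are made up for illustration; it just shows a small GCN over a reply graph with PyTorch-Geometric, trained on a handful of admin-flagged labels and then queried for per-node confidence.

```python
import torch
import torch.nn.functional as F
from torch_geometric.data import Data
from torch_geometric.nn import GCNConv

# Toy reply graph: 4 comments, each edge points from a reply to its parent.
edge_index = torch.tensor([[1, 2, 3],
                           [0, 0, 1]], dtype=torch.long)

# Per-comment features, e.g. [account age, time gap to parent, sentiment agreement, karma].
x = torch.rand((4, 4))

# Labels from manual flagging: 1 = suspected amplifier, 0 = regular user.
y = torch.tensor([0, 1, 0, 1])

data = Data(x=x, edge_index=edge_index, y=y)

class CommentGCN(torch.nn.Module):
    def __init__(self, in_dim, hidden_dim=16, num_classes=2):
        super().__init__()
        self.conv1 = GCNConv(in_dim, hidden_dim)
        self.conv2 = GCNConv(hidden_dim, num_classes)

    def forward(self, x, edge_index):
        x = F.relu(self.conv1(x, edge_index))
        return self.conv2(x, edge_index)

model = CommentGCN(in_dim=data.num_node_features)
optimizer = torch.optim.Adam(model.parameters(), lr=0.01)

model.train()
for _ in range(100):
    optimizer.zero_grad()
    out = model(data.x, data.edge_index)
    loss = F.cross_entropy(out, data.y)
    loss.backward()
    optimizer.step()

# Per-node confidence; high-confidence "amplifier" predictions would go to the
# instance admin for review rather than being acted on automatically.
probs = F.softmax(model(data.x, data.edge_index), dim=1)
print(probs)
```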