• 1 Post
  • 42 Comments
Joined 8 months ago
Cake day: March 22nd, 2024

  • brucethemoose@lemmy.world to Technology@lemmy.world · *Permanently Deleted*
    15 hours ago

    > I’m not sure how you’d solve the problem of big corpos becoming cheap content farms while avoiding harming the people who use these tools to make something rich and beautiful, but I have to believe there’s a way to thread that needle.

    Easy, local AI.

    Keep generative AI locally runnable instead of corporate-hosted. Make it free, open, and accessible. This gives the little guys the cost advantage and takes away the scaling advantages of mega publishers. Lemmy users should be familiar with this concept.

    Whenever I hear people rail against AI, I tell them they are handing the world to Sam Altman and his dystopia, and he does not care about stealing content, equality, or them. I get a lot of hate for it. But the battle they need to be fighting is corporate vs. open AI instead.






  • brucethemoose@lemmy.world to Europe@feddit.org · *Permanently Deleted*
    6 days ago

    Or just stop being so freaking stingy about withholding weapons.

    What are they waiting for? The UK to declare war on France? NO, they stockpiled all these freaking arms against Soviet aggression or fascist threats, and now it’s at their doorstep.

    I don’t understand what good Gripens and Typhoons, tanks, missile systems and such do rusting in storage when they could do exactly what they were built to do, right now.





  • Maybe I am just out of touch, but I smell another bubble bursting when I look at how enshittified all major web services are simultaneously becoming.

    It feels like something has to give, right?

    We have YouTube, Reddit, Twitter, and more racing to enshittify faster than I can believe, and Google Search is racing to destroy the internet, yet they have also hit the ‘critical mass’ of ‘too big to fail’ and have already shoved out all their major competitors (other than Discord, I guess).







  • It’s useful.

    I keep Qwen 32B loaded on my desktop pretty much whenever it’s on, as an (unreliable) assistant to analyze or parse big texts, to do quick chores or write scripts, to bounce ideas off of, or even as an offline replacement for Google Translate (though I specifically use Aya 32B for that).

    It does “feel” different when the LLM is local: you can manipulate the prompt syntax so easily, hammer it with multiple requests that come back really fast when it seems to get something wrong, and not worry about refusals, data leakage, and such. A rough sketch of that workflow is below.
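
    Here is a minimal sketch of what that looks like in practice. It assumes Qwen 32B is being served by llama.cpp’s llama-server (or any OpenAI-compatible local endpoint) on localhost; the port, model name, and input file are placeholders, not a prescription.

```python
# Minimal sketch: querying a locally served LLM (e.g. Qwen 32B behind
# llama.cpp's llama-server) through its OpenAI-compatible chat endpoint.
# The URL, model name, and input file below are placeholders.
import requests

LOCAL_API = "http://localhost:8080/v1/chat/completions"  # assumed local server


def ask_local(prompt: str, system: str = "You are a concise assistant.") -> str:
    """Send one request to the local server and return the reply text."""
    payload = {
        "model": "qwen-32b",  # placeholder; the server answers with whatever it loaded
        "messages": [
            {"role": "system", "content": system},
            {"role": "user", "content": prompt},
        ],
        "temperature": 0.3,
        "max_tokens": 512,
    }
    resp = requests.post(LOCAL_API, json=payload, timeout=120)
    resp.raise_for_status()
    return resp.json()["choices"][0]["message"]["content"]


if __name__ == "__main__":
    # Everything stays on the local machine, so it is cheap to re-ask with a
    # tweaked system prompt or a different chunk when the first answer misses.
    text = open("big_document.txt").read()  # placeholder input
    print(ask_local("Summarize the key points of this text:\n\n" + text[:4000]))
```

    Because the server and the prompt template are both under your control, retries and prompt tweaks cost nothing but local compute, which is the part that feels different from a hosted API.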