• 0 Posts
  • 6 Comments
Joined 24 days ago
Cake day: February 9th, 2025

  • One bad thing doesn’t make a different but also bad thing okay. And in my opinion it is worse; imagine if their whole worldview could only come from 5-second videos. Throw those history books away.

    And I don’t know that it’s overstated, and it’s definitely not perpetual. Look at… everything these days. People “disagree” with fundamental facts and are blindly letting our planet burn to the ground.

    It takes concentrated effort to build and maintain an educated populace. The spread of printed books and rising literacy helped drive the Renaissance and the Enlightenment, tearing down the old status quo and giving us modern medicine and literally every right and luxury you enjoy today.


  • I mean it’s the same use; it’s all literacy. It’s about how much you depend on it and don’t use your own brain. It might be for a mindless email today, but in 20 years the next generation can’t read the news without running it through an LLM. They have no choice but to accept whatever it says because they never develop the skills to challenge it, kind of like simplifying things for a toddler.

    The models can never be totally fixed; the underlying technology isn’t built for that. It doesn’t have “knowledge” or “reasoning” at all. It approximates them by weighing your input against a statistical model of how words connect, then choosing a slightly randomized continuation. Depending on the sampling settings, it can give you a different answer on every run.
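
    That “slightly random continuation” behavior can be sketched in a few lines. This is a toy illustration, not any real model’s code: the candidate words and their scores are invented, and real LLMs pick from tens of thousands of tokens using learned scores.

```python
import math
import random

# Invented scores for a handful of candidate next words (purely illustrative).
logits = {"the": 2.0, "a": 1.5, "banana": 0.2}

def sample_next_token(logits, temperature=1.0, seed=None):
    """Pick a next word by temperature-scaled softmax sampling.

    Higher temperature flattens the distribution (more randomness);
    different seeds can yield different words for the same input.
    """
    rng = random.Random(seed)
    words = list(logits)
    scaled = [logits[w] / temperature for w in words]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    probs = [e / total for e in exps]
    return rng.choices(words, weights=probs, k=1)[0]

# Same "prompt", different seeds: the continuation can differ per run.
print(sample_next_token(logits, temperature=1.0, seed=1))
print(sample_next_token(logits, temperature=1.0, seed=7))
```

    At a very low temperature the highest-scoring word wins almost every time; at higher temperatures even “banana” gets picked now and then, which is exactly why the same question doesn’t always get the same answer.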


  • It can’t ever accurately convey any more information than you give it, it just guesses details to fill in. If you’re doing something formulaic, then it guesses fairly accurately. But if you tell it “write a book report on Romeo and Juliet”, it can only fill in generic details on what people generally say about the play; it sounds genuine but can’t extract your thoughts.

    Not to get too deep into the politics of it but there’s no reason most people couldn’t get there if we invested in their core education. People just work with what they’re given, it’s not a personal failure if they weren’t taught these skills or don’t have access to ways to improve them.

    And not everyone has to be hyper-literate, if daily life can be navigated at a 6th grade level that’s perfectly fine. Getting there isn’t an insurmountable task, especially if you flex those cognitive muscles more. The main issue is that current AI doesn’t improve these skills, it atrophies them.

    It doesn’t push back, use logical reasoning, or seek context. It’s specifically made to be quick and easy, the same as fast food. We’ll be facing the intellectual equivalent of the diabetes epidemic if it sees widespread use.


  • LLMs work by extrapolation; they can’t output anything better than the context you give them. They get used in completely inappropriate situations because they’re dead easy and produce very digestible content.

    Your brain is the only thing in the universe that knows the context of what you’re writing and why. At a sixth grade level, you could technically describe almost anything but it would be clunky and hard to read. But you don’t need an LLM to fix that.

    We’ve had tools for years that help with the technical details of writing (basic grammar, punctuation, and spelling). There are also already tools to help with phrasing and specifying a concept (“hey Google, define [X]” or “what’s the word for when…”).

    This is more time-consuming than an LLM, but it guarantees that what you write is exactly what you intend to communicate. As a bonus, your reading comprehension gets better. You might remember that definition of [X] the next time you read it.

    If you have access to those tools but can’t or won’t use them, then you’ll never be able to write effectively. There’s no magic substitute for literacy.


  • The reason it feels like that is that it’s addressed to someone you don’t know personally, even if you know them professionally. You never really know whether a specific reference would offend them, if their dog just died, how “this email finds” them, etc…

    And in the context of both of you doing your jobs, you shouldn’t care. It’s easier to get day-to-day stuff done with niceties, even if they’re hollow.

    That’s just the tone though. People trying to insist they give a shit when everyone knows they don’t is what bothers me. If you’re firing someone, don’t sugarcoat it.


  • In all of those examples, the user knows exactly what they want and the tool is a way to expedite or enable getting there. This isn’t quite the same thing.

    If we were talking about a tool like augmented audio-to-text, I’d agree. I’d probably even agree if it were an AI proofreader-style model where you feed it what you have to make sure it’s generally comprehensible.

    Writing as a skill is about solidifying and conveying thoughts so they can be understood. The fact that it ends up as text is almost incidental. Hand-waving that process is just rubber-stamping something you kinda-sorta started the process of maybe thinking about.