• MortUS@lemmy.world · 7 days ago

    Once both major world militaries and hobbyists are using it, it’s over. You can’t close Pandora’s box. Whatever you want to call the current versions of “AI”, it’s only going to get better. Short of a major world catastrophe, I expect it to drive not only technological advances but also energy-efficiency advances. The big internet conglomerates are already integrating it into search, and I fully expect that within the next 5 years search will be transformed into an assistant-like chatbot (or something of the sort).

    I think it’s shortsighted not to see the potential of accumulating society’s knowledge and being able to present that to people in an understandable way.

    I don’t expect it to happen overnight. I’m not expecting I, Robot or Android levels of consciousness any time soon, but the world is progressing toward the automation of many things - driven by Capital(ism) - which is powerful in itself.

    • Blackmist@feddit.uk · 8 days ago

      Many of our customers store their backups in our “cloud storage solution”.

      I think they’d be rather less impressed to see the cloud is in fact a jumble of PCs scattered all around our office.

    • Colonel Panic@lemm.ee · 8 days ago

      Naming it “The Cloud” and not “Someone else’s old computer running in their basement” was a smart move though.

      It just sounds better.

      • Jankatarch@lemmy.world · 8 days ago (edited)

        There is still a difference.

        The cloud was FOR the IT people. Machine learning is for predicting patterns from data.

        Maybe stock predictors will adapt or be replaced, but the average programmer didn’t have to switch to Replit just because it’s a “cloud IDE”.

      • Ferk@programming.dev · 8 days ago (edited)

        I mean, isn’t that what “get on or get left behind” means?

        It does not necessarily mean you’ll lose your job. Nor does “get on” mean you have to become a specialist in it.

        The post specifically picks on things that didn’t catch on (or that caught on only for a while before being superseded), but doesn’t apply the same test to other, successful technologies.

    • Rusty@lemmy.ca · 8 days ago

      I don’t think it was supposed to replace everyone in IT, but every company had system administrators or IT administrators who worked with physical servers, and now they’re gone. You can say that the new SREs are their replacement, but it’s a different set of skills, closer to an SDE’s than to a system administrator’s.

      • MinFapper@startrek.website · 8 days ago

        And some companies (like mine) just have their SDEs do the SRE job as well. Apparently it incentivizes us to write more stable code, or something.

  • humanspiral@lemmy.ca · 8 days ago

    I’m skeptical of the author’s credibility and vision of the future if he hasn’t even reached blink-tag technology in his progress.

  • Maxxie@lemmy.blahaj.zone · 8 days ago (edited)

    (Allow me to preach for a bit; I have to listen to my boss gush about AI every meeting.)

    Compare AI tools: now vs 3 years ago. All those 2022 “Prompt engineer” courses are totally useless in 2025.

    Extrapolate into the future and realize that you’re not losing anything valuable by not learning AI tools today. The whole point of them is that they don’t require any proficiency. It “just works”.

    Instead focus on what makes you a good developer: understanding how things work, which solution is good for what problem, centering your divs.

    • Dr. Moose@lemmy.world · 8 days ago (edited)

      The key skill is being able to communicate your problem and requirements, which turns out to be really hard.

      • Pennomi@lemmy.world · 7 days ago

        It’s also a damn useful skill whether you’re working with AI or humans. Probably worth investing some effort into that regardless of what the future holds.

        • jmp242@sopuli.xyz · 7 days ago

          Though it’s more work with current AI, at least compared to another team member: the AI can’t have access to a lot of context due to data security rules.

  • teodorista@lemm.ee · 9 days ago (edited)

    Thanks for summing it up so succinctly. As an aging dev, I’ve seen quite a lot of tech come and go. I wish more people interested in technology would spend more time learning the basics and the history of things.

    • sidelove@lemmy.world · 8 days ago

      Which is honestly its best use case. That and occasionally asking it to generate a one-liner for a library call I don’t feel like looking up. Any significant generation tends to go off the rails fast.

      • merc@sh.itjust.works · 8 days ago

        If you use it basically like you’d use an intern or junior dev, it could be useful.

        You wouldn’t allow them to check anything in themselves. You wouldn’t trust anything they did without carefully reading it over. You’d have to expect that they’d occasionally completely misunderstand the request. You’d treat them as someone completely lacking in common sense.

        If, with all those caveats, you can get this assistance for free or nearly free, it might be worth it. But, right now, all the AI companies are basically setting money on fire to try to drive demand. If people had to pay enough that the AI companies were able to break even, it might be so expensive it was no longer worth it.

      • T156@lemmy.world · 8 days ago

        Getting it to format documentation for you seems to work a treat. Nothing too complex, just “move this bit here, split that into points”.

      • Omgpwnies@lemmy.world · 8 days ago

        I’ve been using it to write unit tests. I still need to edit them to mock out some things and change a bit of logic here and there, but it saves me probably 50-75% of the time it used to take, just from not having to hand-write all that code.
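
        To give a flavor of the mocking edits (a hypothetical Python sketch using unittest.mock; the invoice code and function names are made up for illustration, not from any real codebase):

            import unittest
            from unittest.mock import patch

            # Hypothetical code under test (made up for illustration).
            def fetch_tax_rate() -> float:
                raise RuntimeError("would call a real tax service")

            def calculate_invoice(subtotal: float) -> float:
                return subtotal * (1 + fetch_tax_rate())

            class TestCalculateInvoice(unittest.TestCase):
                # The generated test called the real tax service; patching
                # that out by hand is the kind of edit I mean.
                @patch(f"{__name__}.fetch_tax_rate", return_value=0.20)
                def test_applies_tax_rate(self, mock_rate):
                    self.assertEqual(calculate_invoice(100), 120)
                    mock_rate.assert_called_once()

            if __name__ == "__main__":
                unittest.main()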

    • Kissaki@programming.dev · 9 days ago (edited)

      I’d love to read a list of those instances/claims/technologies.

      I imagine one of them was low-code/no-code?

      /edit: I see such a list is what the posted link is about.

      I’m surprised low-code/no-code isn’t on that list.

      • pinball_wizard@lemmy.zip · 8 days ago

        You’re right. It belongs on the list.

        I was told several times that my programming career was ending when the first low-code/no-code platforms were released.

        • Kissaki@programming.dev · 8 days ago (edited)

          At my work we explored a low-code platform. It was not low on code at all. Beyond the simplest demos you had to code everything in JavaScript, but in a convoluted, opaque, undocumented environment with a horrendous editing UI. Of course, their marketing said something quite different.

          That was not the early days of low-code, mind you; it was fairly recent, maybe three or four years ago.

      • jubilationtcornpone@sh.itjust.works · 8 days ago

        “We’re gonna make a fully functioning e-commerce website with only this WYSIWYG site builder. See? No need to hire any devs!”

        Several months later…

        “Well that was a complete waste of time.”

      • andioop@programming.dev · 8 days ago (edited)

        I do wonder about inventions that actually changed the world or the way people do things, and if there is a noticeable pattern that distinguishes them from inventions that came and went and got lost to history, or that did get adopted but never reached mass adoption. Hindsight is 20/20, but we live in the present and have to make our guesses about what will succeed and what will fail, and it would be nice to have better guesses.

        • pinball_wizard@lemmy.zip · 8 days ago (edited)

          I do wonder about inventions that actually changed the world or the way people do things, and if there is a noticeable pattern that distinguishes them from inventions that came and went and got lost to history,

          Cool thought experiment.

          Comparing the first iPhone with the release of blockchain is a pretty solid way to consider the differences.

          We all knew that modern phones were going to be huge. We didn’t need tech bros to tell us to trust them about it. The usefulness was obvious.

          After I got my first iPhone, I learned a new thing I could do with it - by word-of-mouth - pretty much every week for the first year.

          Even so, Google supposedly underestimated demand for the first Android phones by almost a factor of ten.

          Blockchain works fine, but it’s not changing my daily routine every week.

          AI is somewhere in between. I do frequently learn something new and cool that AI can do for me, from a peer. It’s not as impactful as my first pocket computer phone, but it’s still useful.

          Even with the iPhone release, I was told to learn iPhone programming or I wouldn’t have a job. I actually did not learn iPhone programming, and I still have a job. But I did need to learn some things about making code run on phones.

        • Lightfire228@pawb.social · 8 days ago

          Quality work will always need human craftsmanship

          I’d wager that most revolutionary technologies are either those that expand human knowledge and understanding, or (to a lesser extent) those that increase replicability (like assembly lines).

          • Transtronaut@lemmy.blahaj.zone · 8 days ago

            It’s tricky, because there’s no hard definition for what it means to “change the world”, either. To me, it brings to mind technologies like the Internet, the telephone, aviation, or the steam engine. In those cases, it seems like the common thread is to enable us to do something that simply wasn’t possible before, and is also reliably useful.

            To me, AI fails on both those points. It doesn’t really enable us to do anything new. We already had chat bots, we already had Photoshop, we already had search algorithms and auto complete. It can do some of those things a lot more quickly than older technologies, but until they solve the hallucination problems it doesn’t seem reliable enough to be consistently useful.

            These things make it come off more as a potentially useful incremental improvement, still in its infancy, than as something truly revolutionary.

            • jmp242@sopuli.xyz · 7 days ago

              It needs to be more trustworthy. If I have to double-check everything, I still have to figure out how to do whatever it’s doing, then figure out how it’s doing the thing, then verify whether it did it right. By then, I probably could have just done it myself back at step 1.5.

            • zqwzzle@lemmy.ca · 8 days ago

              Well, it’ll change the world by consuming a shit-ton of electricity and even more precious water to cool the data centres. So “changing the world” is correct in that regard.

  • someacnt@sh.itjust.works · 9 days ago

    It pains me so much when I see my colleagues pay OpenAI to do their programming assignments… they find it faster to ask GPT than to learn things properly. Sadly, I can say nothing to them without risking worsening our relations.

  • entwine413@lemm.ee · 9 days ago

    I can see this partly being true in that it’ll be part of a dev’s toolkit. The devs at my previous job loved using it to do busy work coding.

    • adrian@50501.chat · 8 days ago (edited)

      I agree that it will continue to be a useful tool. I’ve gotten a similar productivity boost from AI auto-complete as I did from regular auto-complete. It’s also pretty good at identifying potential issues with code: again, a similar productivity boost to a good linter. The chatbot does make a good sounding board, especially when you don’t remember the name of the concept you’re trying to implement, or when you need to weigh the pros and cons of two solutions and can’t find articles about it.

      But all these claims of 10x improvements in development speed are horse shit. Yeah, you might be able to shit out a 5-10,000 LOC tutorial app in an hour or two with prompt engineering, but try implementing a feature in a 100,000 LOC codebase and it promptly shits the bed: hallucinating internal frameworks and microservices, ignoring internal practices, writing straight-up non-functional code, etc. If you spend enough time prompting it, you can eventually massage the solution you need out of it; problem is, it takes longer to do that than to write the damn thing yourself.

    • Valmond@lemmy.world · 9 days ago

      “Busy work coding”: is that what you do when you try to look like you’re working (like a real dev)?

      • 3abas@lemm.ee · 9 days ago

        Real-world development isn’t creating exciting apps all the time; it’s writing the same boring, convention-based code, sticking to an established pattern.

        It can be really boring and unchallenging to create your millionth repository, or you can prompt your IDE to create a new repo, and with one sentence it will stub out 10 minutes’ worth of tedious prep work. It makes programming fun again.

        In one prompt, it can look at my finished code and stub out half-decent documentation that otherwise wouldn’t have been written at all. It does hallucinate sometimes, or completely misunderstands the code, so you have to correct a few sentences, but the brain drain of coming up with the sentence structure for useful documentation is completely lifted, and the code ends up well documented.

        AI programming is more than just vibe coding, and it’s way more useful than everyone here insists.

      • dermanus@lemmy.ca · 8 days ago

        We’re using it for closing security flaws identified by another tool. It’s boring, unchallenging work that is nonetheless still important. It’s also repetitive and uncreative enough that I’m comfortable having a machine do it.

        There’s still human review, but when it’s stuff like “your error messages should escape variables” or “use a longer function name”, having a tool that can do most of the grunt work is valuable.
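
        For the escaping case, the fix is usually a one-liner (a hypothetical Python sketch; html.escape is the real standard-library call, the function around it is made up):

            import html

            def invalid_value_message(user_input: str) -> str:
                # Escaping keeps attacker-controlled input from injecting
                # markup into an HTML error page.
                return f"<p>Invalid value: {html.escape(user_input)}</p>"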

    • TheSealStartedIt@feddit.org · 9 days ago

      Oh god, the hate in this sub. It is definitely another tool for a dev to use, like autocomplete or a lot of the other stuff a good IDE does to help you. If you don’t want to use it, fine; perhaps you’re such a pro that you don’t need anything but a text editor. If you’re not, and you’re ignoring it for whatever petty reason, you’ll probably fall behind all the devs who learned how to use it to get more productive (or, in developer terms, lazier).

      • fmstrat@lemmy.nowsci.com · 8 days ago

        Agreed. Like it or not, old-school autocomplete was the same thing, just not as advanced. That being said, comment OP probably didn’t click the link.