cross-posted from: https://lemmy.zip/post/49954591

“No Duh,” say senior developers everywhere.

The article explains that vibe code is often close to functional, but not quite, requiring developers to go in and track down where the problems are - resulting in a net slowdown of development rather than productivity gains.
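As a purely hypothetical illustration (not from the article), this is the kind of "close, but not quite" output the piece describes: a helper that looks right at a glance but quietly drops data, shown next to the correction a reviewer would still have to write.

```python
# Hypothetical sketch (not from the article): a plausible-looking helper that
# is almost functional, alongside the corrected version a reviewer has to write.

def chunk_naive(items, size):
    """Looks reasonable, but silently drops a trailing partial chunk."""
    return [items[i:i + size] for i in range(0, len(items) - size + 1, size)]

def chunk_fixed(items, size):
    """Correct version: the final, shorter chunk is kept."""
    return [items[i:i + size] for i in range(0, len(items), size)]

assert chunk_naive([1, 2, 3, 4, 5], 2) == [[1, 2], [3, 4]]       # the 5 is lost
assert chunk_fixed([1, 2, 3, 4, 5], 2) == [[1, 2], [3, 4], [5]]  # as intended
```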

Then there’s the issue of finding an agreed-upon way of tracking productivity gains, a glaring omission given the billions of dollars being invested in AI.

According to Bain & Company, companies will need to fully commit themselves to realize the gains they’ve been promised.

“Fully commit” to see the light? That… sounds more like a kind of religion, not like critical or even rational thinking.

  • flatbield@beehaw.org · 1 hour ago

    I have a friend who is a professional programmer. They think AI will generate lots of work fixing the shit code it creates. I guess we will see.

  • thingsiplay@beehaw.org · 3 hours ago

    Billions of dollars are spent, an unimaginable amount of power is used, tons of programmers are fired, millions upon millions of lines of code are copied without license or credit, and nasty bugs and security issues are added because people trust the AI system or get lazy. Was it worth it? Many programmers become disposable because they have to use AI. That means “all” programmers are the same and differ only in which model they use - at least that’s the future if everyone is using AI from now on.

    AI = productivity increases, quality decreases… oh wait: productivity only seems to increase, while quality really does decrease.

  • abbadon420@sh.itjust.works · 4 hours ago

    They say the same about scrum.

    “It doesn’t work in your company, because you haven’t fully implemented all aspects of scrum”

    Coincidentally it costs about a gazillion dollars to become fully Scrum certified.

  • Not a newt@piefed.ca · 5 hours ago

    “Fully commit” to see the light? That… sounds more like a kind of religion, not like critical or even rational thinking.

    It also gives these shovel peddlers an excuse: “Oh, you’re not seeing gains? Are you even ~~lifting~~ AI-ing, bro? You probably have some employees not using enough AI, you have to blame them instead of us.”

  • Ŝan@piefed.zip · 1 hour ago

    LLMs are no different than any other technology: when the people making decisions to bring in the tech aren’t the people doing the work, you get shit decisions. You get LLMs, or Low Code/No Code platforms, or cloud migrations. Technical people make mistakes, too, but any decision made with the input of salespeople will be made on the glossiness of brochures and will be bad. Also, any technology decision made with the Gartner Magic Quadrant - fuck the GMQ. Any decision process using it smells bad.