Just want to clarify: this is not my Substack; I’m just sharing it because I found it insightful.

The author describes himself as a “fractional CTO” (no clue what that means, don’t ask me) and advisor. His clients asked him how they could leverage AI, so he decided to experience it for himself. From the author (emphasis mine):

I forced myself to use Claude Code exclusively to build a product. Three months. Not a single line of code written by me. I wanted to experience what my clients were considering—100% AI adoption. I needed to know firsthand why that 95% failure rate exists.

I got the product launched. It worked. I was proud of what I’d created. Then came the moment that validated every concern in that MIT study: I needed to make a small change and realized I wasn’t confident I could do it. My own product, built under my direction, and I’d lost confidence in my ability to modify it.

Now when clients ask me about AI adoption, I can tell them exactly what 100% looks like: it looks like failure. Not immediate failure—that’s the trap. Initial metrics look great. You ship faster. You feel productive. Then three months later, you realize nobody actually understands what you’ve built.

  • ignirtoq@feddit.online · 117 points · 18 hours ago

    We’re about to face a crisis nobody’s talking about. In 10 years, who’s going to mentor the next generation? The developers who’ve been using AI since day one won’t have the architectural understanding to teach. The product managers who’ve always relied on AI for decisions won’t have the judgment to pass on. The leaders who’ve abdicated to algorithms won’t have the wisdom to share.

    Except we are talking about that, and the tech bro response is “in 10 years we’ll have AGI and it will do all these things all the time permanently.” In their roadmap, there won’t be a next generation of software developers, product managers, or mid-level leaders, because AGI will do all those things faster and better than humans. There will just be CEOs, the capital they control, and AI.

    What’s most absurd is that, if that were all true, that would lead to a crisis much larger than just a generational knowledge problem in a specific industry. It would cut regular workers entirely out of the economy, and regular workers form the foundation of the economy, so the entire economy would collapse.

    “Yes, the planet got destroyed. But for a beautiful moment in time we created a lot of value for shareholders.”

    • UnspecificGravity@piefed.social · 18 points · 16 hours ago

      Yep, and now you know why all the tech companies suddenly became VERY politically active. This future isn’t compatible with democracy. Once these companies no longer provide employment, their benefit to society becomes a big fat question mark.

    • HasturInYellow@lemmy.world · 21 points · 16 hours ago

      According to a study, the lower top 10% accounts for something like 68% of cash flow in the economy. Us plebs are being cut out altogether.

      That being said, I think if people can’t afford to eat, things might get bad. We will probably end up a kept population in these ghouls’ fever dreams.

      Edit: I’m an idiot.

      • Prior_Industry@lemmy.world · 8 points · 16 hours ago

        Once Boston Dynamics-style dogs and androids can operate independently over a number of days, I’d say all bets are off on whether we’d even be kept around as pets.

        I’m fairly certain your Musks and Altmans would be content with a much smaller human population existing only to maintain their little bubble, and damn everything else.

      • kreskin@lemmy.world · 7 points · 15 hours ago

        Edit: I’m an idiot.

        Same here. Nobody knows what the eff they are doing. Especially the people in charge. Much of life is us believing confident people who talk a good game but don’t know wtf they are doing and really shouldn’t be allowed to make even basic decisions outside a very narrow range of competence.

        We have an illusion of broad meritocracy and accountability in life, but it’s mostly just not there.

    • Randelung@lemmy.world · 9 points · 17 hours ago

      Also, even if we make it through a wave of bullshit and all these companies fail in 10 years, the next wave will be ready and waiting, spouting the same crap - until it’s actually true (or close enough to be bearable financially). We can’t wait any longer to get this shit under control.