Just want to clarify, this is not my Substack, I’m just sharing this because I found it insightful.
The author describes himself as a “fractional CTO” (no clue what that means, don’t ask me) and advisor. His clients asked him how they could leverage AI. He decided to experience it for himself. From the author (emphasis mine):
I forced myself to use Claude Code exclusively to build a product. Three months. Not a single line of code written by me. I wanted to experience what my clients were considering—100% AI adoption. I needed to know firsthand why that 95% failure rate exists.
I got the product launched. It worked. I was proud of what I’d created. Then came the moment that validated every concern in that MIT study: I needed to make a small change and realized I wasn’t confident I could do it. My own product, built under my direction, and I’d lost confidence in my ability to modify it.
Now when clients ask me about AI adoption, I can tell them exactly what 100% looks like: it looks like failure. Not immediate failure—that’s the trap. Initial metrics look great. You ship faster. You feel productive. Then three months later, you realize nobody actually understands what you’ve built.

I found the article interesting, but I agree with you. Good programmers have to be able to debug other people’s code, and they can. But, to be fair, there are also a lot of bad programmers, and a lot who can’t debug for shit…
@JuvenoiaAgent@piefed.ca @technology@lemmy.world
Often, those are developers who “specialized” in one or two programming languages, without specializing in computer/programming logic.
I used to repeat a personal saying across job interviews: “A good programmer knows a programming language. An excellent programmer knows programming logic”. IT positions often require a dev to have a specific language/framework in their portfolio (with Rust being the Current Thing™ now), and they reject people who have vast experience across several languages/frameworks except the one required, as if these people weren’t able to learn the specific language/framework they require.
Languages and frameworks differ in syntax, naming and paradigms; sometimes they’re extremely different from other common languages (such as (Lisp (parenthetic-hell)) or .asciz "Assembly-x86_64"), but they all talk to the same computer logic under the hood. Once a dev becomes fluent in bitwise logic (or, even better, so fluent in talking with computers that they can say 41 53 43 49 49 20 63 6f 64 65 without tools, as if it were English), it’s just a matter of accustoming oneself to the specific syntax and naming conventions of a given language.

Back when I was enrolled in college, I lost count of how many colleagues struggled with the entire course as soon as they were faced with Data Structures classes: binary trees, linked lists, queues, stacks… And Linear Programming: maximization and minimization, data fitness… To the majority of my colleagues, those classes were painful, especially because the teachers were somewhat rigid.
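For the curious, that hex string a couple of paragraphs up is just a row of ASCII byte values; a minimal Python sketch (any language would do) decodes it:

```python
# Decode the space-separated hex byte values quoted above
# into the text they spell.
hex_bytes = "41 53 43 49 49 20 63 6f 64 65"
decoded = bytes(int(pair, 16) for pair in hex_bytes.split()).decode("ascii")
print(decoded)  # prints: ASCII code
```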
And this sentiment echoes across companies and corps. Corps (especially the wannabe-programmer managers) don’t want to deal with computers; they want to deal with consumers and their sweet money. But a civil engineer and their masons can’t possibly build a house without being willing to deal with a blueprint and the physics of building materials. This is part of the root of the whole problem.
The hard thing about debugging other people’s code is understanding what they’re trying to do. Once you’ve figured that out it’s just like debugging your own code. But not all developers stick to good patterns, good conventions or good documentation, and that’s when you can spend a long time figuring out their intention. Until you’ve got that, you don’t know what’s a bug.
That feels like the trap here. There is no intention, just patterns.