It’s interesting that all the devs I already respected don’t use it or use it very sparingly, and many of the devs I least respected sing its praises incessantly. Seems to me like “skill issue” is what leads to thinking this garbage is useful.
Everyone is talking past each other because there are so many different ways of using AI and so many things you can use it for. It works ok for some, it fails miserably for others.
Lots of people only see one half of that and conclude “it’s shit” or “it’s amazing” based on an incomplete picture.
The devs you respect probably aren’t working on CRUD apps and landing pages and little hacky Python scripts. They’re probably writing compilers and game engines or whatever. So of course it isn’t as useful for them.
That doesn’t mean it doesn’t work for people mocking up a website or whatever.
I’d rather hone my skills at writing better, more intelligible code than spend that same time learning how to make LLMs output slightly less shit code.
Whenever we don’t actively use and train our skills, they will inevitably atrophy. Something I think about quite often on this topic is Plato’s argument against writing. His view is that writing things down is “a recipe not for memory, but for reminder”, leading to a reduction in one’s capacity for recall and thinking. I don’t disagree with this, but where I differ is that I find it a worthwhile tradeoff when accounting for all the ways that writing increases my mental capacities.
For me, weighing the tradeoff is the most important gauge of whether a given tool is worthwhile or not. And personally, using an LLM for coding is not worth it when considering what I gain vs. lose from prioritising that over growing my existing skills and knowledge.
Your experience isn’t other people’s experience. Just because you can’t get results doesn’t mean the technology is invalid, just your use of it.
“Skill issue”, as the youngsters say.