

because it does make prototyping/MVP faster, and unfortunately it’s difficult for some to see that having no human who can actually understand the codebase is a terrible idea in the medium and long term


I’ve never had to look into disabling middle-click paste because it broke panning or orbiting in a 2D or 3D canvas. Same thing for games: middle click works just fine there.


yeah, I use it all the time to copy username and password in one go instead of fiddling with the clipboard history


704kB!



But if you are looking for a job that AI will definitely create, by the millions, I have a suggestion: digital asbestos removal.
already happening



more managers should be aware of this


or just run topgrade to update everything
I use it every day and I still hate pacman’s flags with a passion
sounds like he’s frustrated that a clean architecture didn’t magically solve all of his problems


“Anywhere you have your password manager” is useful


Zen has also committed to not including AI features


It could be considered biochemical warfare


along with the compose.yaml file, unless I need it on a different drive for some reason


almost as if using a memory-safe language actually reduces memory-related CVEs


it’s not like the whole driver is written in unsafe Rust
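Roughly how that plays out in practice: the unsafe parts are kept to small, audited blocks behind safe wrappers, so the bulk of the driver is ordinary safe Rust. A minimal sketch of that pattern (the register layout and names are made up for illustration, and plain memory stands in for an MMIO mapping so it runs without hardware):

```rust
use std::ptr;

// Hypothetical device registers; in a real driver this layout would
// mirror the hardware and live behind an MMIO mapping.
#[repr(C)]
struct Regs {
    status: u32,
    #[allow(dead_code)]
    data: u32,
}

struct Device {
    // Raw pointer to the register block (MMIO in a real driver).
    regs: *mut Regs,
}

impl Device {
    /// Safe wrapper: callers never touch raw pointers or `unsafe`.
    fn read_status(&self) -> u32 {
        // SAFETY: `regs` points to a valid, live register block for the
        // lifetime of `Device` (upheld by whoever constructed it).
        unsafe { ptr::read_volatile(&(*self.regs).status) }
    }
}

fn main() {
    // Stand-in for a mapped register block.
    let mut fake = Regs { status: 0xA5, data: 0 };
    let dev = Device { regs: &mut fake as *mut Regs };
    // Everything from here on is ordinary safe code.
    println!("status = {:#x}", dev.read_status());
}
```

The point being that only the volatile access needs `unsafe`; the rest of the driver builds on the safe wrapper.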


Maybe. The problems I have with Codeberg are the lack of support for private repos and the 100-repo limit. I have some personal stuff in version control that I prefer to keep private, like notes, dotfiles, and shell history.
At the same time, I’m not sure I want to maintain a self-hosted forge.


but the whole thing is self-hosted, not just the action runner, right?
No, this is about adding guidelines for tool-generated submissions to the kernel. The Tailwind conversation was about making their documentation more accessible to AI tools.
Linus doesn’t want to add guidelines, so as not to fuel either side of the whole discussion, and says that adding them wouldn’t solve the problem because it’s often not trivial to detect whether a contribution was written with AI tools. After all, “documentation is for good actors”, hinting that anyone contributing AI slop isn’t expected to respect it anyway.