

IMO automated changelogs like these are not especially useful. Better than no changelog I guess, but nowhere near as good as a proper changelog. But proper changelogs take actual effort.


IDEs tend to work out of the box while the likes of vim or emacs need configuration and have an initially steep learning curve.
Not in my experience. It’s very easy to design systems that break IDE support. People love adding all sorts of ad hoc build scripts that mean you can’t just press F5 or whatever. It takes discipline and caring about IDEs to not do that.
And while people might love tweaking Emacs and Vim, it isn’t required.
There’s definitely an element of snobbery, and also of being lazy about tooling. Do you think once you become a talented dev you lose all human vices?
Some of the smartest people in the world believe in an imaginary dad who lives in the sky and grants imperceptible wishes. Everyone is human.
I completely agree. Also almost all of the fancy editing you can do with Vim can be done just with multiple cursors, and it’s less annoying because you do it incrementally (rather than typing a long sequence of commands and then seeing the result), and there’s much less to memorise.


I agree. C2 continuity does matter for aesthetics sometimes, but not for a button.


Even KISS. Sometimes things just have to be complex. Of course you should aim for simplicity where possible, but I’ve seen people fight against better and more capable options just because they weren’t as simple and thus violated the KISS “rule”.


One example is creating an interface for every goddamn class I make because of “loose coupling” when in reality none of these classes are ever going to have an alternative implementation.
Sounds like you’ve learned the answer!
Virtually all programming principles like that should never be applied blindly. You basically need to develop taste through experience… and caring about code quality (lots of people have experience but don’t give a shit what they’re excreting).
Stuff like DRY and SOLID are guidelines not rules.


Thanks for highlighting your username - made me notice that you post a lot of nonsense here so I can easily block it!


AI AI blah blah AI.
Also why is HCL supposedly the 9th most popular “programming language” (which it isn’t anyway)?


There are some examples in the very first list I found googling for “cancel culture examples”.
Not all of them are political (e.g. cancelling someone for sexual assault is clearly not, and that Heineken one… how??), but a decent number are, e.g. number 6 is about as partisan as you can get.


It’s a fairly inevitable reaction to cancel culture. This was predicted and warned against when left-wing cancel culture was at its height, but people didn’t listen. Now we have right-wing cancel culture instead.


I wouldn’t recommend the Gang of Four book. Many of the design patterns it espouses are way overcomplicated, from the days of peak OOP. You know, FactoryFactoryVisitor stuff. Usually best avoided.


Yeah, I use Claude/ChatGPT sometimes for:
I haven’t got around to setting up any of that agentic stuff yet. Based on my experience of the chat stuff, I’m a bit skeptical it will be good enough to be useful on anything of the complexity I work on. Fine for CRUD apps, but it’s not going to understand niche compiler internals or do stuff with WASM runtimes that nobody has ever done before.


He’s right, zstd is incredibly popular, very widely used, and generally believed to be the best general-purpose compression algorithm overall.


They use QAM and similar because it’s the best way to transmit data over a small number of long wires. Exactly the opposite of wires inside a CPU.


This video confuses at least three different concepts - quantum uncertainty, ternary computers, and “unknown” values.
Ternary computers are just not as good as binary computers. The way silicon works, it’s always going to be much much slower.
“Unknown” values can be useful - they are common in SystemVerilog for example. But you rarely just have true, false and unknown, so it makes zero sense to bake that into the hardware. Verilog has 4 values - true, false, unknown and disconnected. VHDL has something like 9!
And even then the “unknown” isn’t as great as you might think. It’s basically poor-man’s symbolic execution and is unable to cope with things like let foo = some_unknown_value ? true : true. Yes that does happen and you won’t like the “solution”.
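To make that concrete, here’s a toy Python model (my own sketch, not real Verilog semantics) of a 4-state mux with pessimistic X-propagation. The point is that an unknown select makes the output unknown even when both arms are identical, which is exactly the `x ? true : true` problem:

```python
from enum import Enum

class Bit(Enum):
    ZERO = "0"
    ONE = "1"
    X = "x"   # unknown
    Z = "z"   # disconnected / high-impedance

def mux(sel: Bit, a: Bit, b: Bit) -> Bit:
    """Naive X-pessimistic mux: an unknown select gives an unknown output."""
    if sel is Bit.ONE:
        return a
    if sel is Bit.ZERO:
        return b
    # sel is X or Z: give up, even though here a == b would pin the answer
    return Bit.X

# `some_unknown_value ? true : true` is obviously true, but naive
# X-propagation says X:
print(mux(Bit.X, Bit.ONE, Bit.ONE))  # Bit.X
```

A smarter tool would notice both arms are equal, but in general deciding that requires symbolic reasoning, which is the point: X-values buy you a very shallow version of it.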
High level programming concepts like option will always map more cleanly onto binary numbers.
Overall, very confused video that is trying to make it sound like there’s some secret forgotten architecture or alternative history when there definitely isn’t.


Yeah I’m watching Ty. Pytype and Pyre are not serious options. Nobody really uses them, and Pytype is discontinued. Facebook have a new project called Pyrefly that’s also worth watching.
But for now, use Pyright. No argument. If you’re really worried about Microsoft (and not Facebook or Google for some reason) then use BasedPyright.


I would say:
Just practice, do projects. Also, if you can, work on projects with other people, because you’ll read a lot of bad code and learn how not to do things (hopefully).
Learn lots of programming languages. They often have different and interesting ways of doing things that can teach you lessons that you can bring to any language. For example Haskell will teach you the benefit of keeping functions pure (and also the costs!).
If you only know Python I would recommend:
Learn Python with type hints. Run Pyright (don’t use mypy; it sucks) on your project and get it to pass.
Go is probably a sensible next step. Very quick to learn but you’ll start to learn about proper static typing, multithreading, build tools (Go has the best tooling too so unfortunately it’s all downhill from here…), and you can easily build native executables that aren’t dog slow.
C++ or Rust. Big step up but these languages (especially C++) will teach you about how computers actually work. Pointers, memory layouts, segfaults (in C++). They also let you write what we’re now calling “foundational software” (formerly “systems software” but that was too vague a term).
Optionally, if you want to go a bit niche, one of the functional programming languages like Haskell or OCaml. I’d probably say OCaml because it’s way easier (it doesn’t force everything to be pure). I don’t really like OCaml so I wouldn’t spend too much time on this but it has lots of interesting ideas.
Final boss is probably a dependently typed language like Lean or Idris. Pretty hardcore and not really of much practical use if you aren’t writing software that Must Not Fail Ever. You’ll learn loads about type systems though.
Also read programming articles on Hacker News.


Clean Code was pretty effectively debunked in this widely shared article from 2020. We probably don’t need to talk about it anymore.
Frankly I’m surprised it was ever recommended. Some of the things it says are so obviously insane, why would anyone think it was good?
My only guess is the title? “Your code sucks; maybe read this book that I haven’t vetted about clean code.” sort of thing?
I’d say it would be good to have a modern replacement with good advice to recommend… But in my experience you can’t really learn these things by reading about them. You have to experience it (and have good natural taste).
This list of code smells is pretty decent at least: https://luzkan.github.io/smells/
Interesting. But can’t you do basically the same thing with @Nonnull annotations? I remember using something like that a decade ago when I last wrote Java.