

It’s already been made perfect once. What updates would you make to divinely inspired code?
I distro hopped about every 4 months from roughly ages 12 to 22, never really feeling like I’d found the right platform. Sometimes I would dual boot (or just run) Windows, and for a while I had Windows XP in a state I could tolerate.
For several years after 22, I ran Windows at home, and kept Linux for work. I basically just wanted to game, and Windows was good enough for that. Finally, something came up that I needed a home server for, and I chose Arch, based largely on my experiences from several years ago. Arch had been more stable for me, and when it did break, it always felt like the tools to fix it existed. Ubuntu and derivatives broke for me mostly in “Oops, system is dead. Maybe reinstall?” ways, which I didn’t want on my server. Other distros gave me an assortment of problems, from updates taking too long, to lacking support for a WM I enjoyed, to driver issues.
Once I was regularly SSHing from Windows to Arch, I missed the things I could do on Linux (more than just games), and Steam had made Linux support for a lot of games better, so I reinstalled my gaming PC as Arch too.
I added a lot of things to my server, and had more problems with some third-party tools (e.g. Elasticsearch, MongoDB, or Postgres) every time they updated, so I added a Kubernetes cluster with an immutable OS. I tried 3 before settling on Talos, and now when a workload on the server breaks, I move it to Kubernetes. That pace has worked out for me, but now the server does no heavy lifting, so I’m experimenting with local LLMs on it.
Huh, I always thought Linux stood for “Linus eXtreme”. The more you know…
“Trunk records” for indie music seems 110% appropriate to me.
WaterSeer was initially somehow related to some part of UC Berkeley, rather than MIT.
You may be thinking of this device: https://www.cell.com/joule/fulltext/S2542-4351(20)30444-X
… Or the metal-organic framework predecessor to it, or the newer thing that uses some sort of gel.
I’m not aware of a commercial product based on this work.
I once forgot to install the Linux package when I was installing Arch on a system. Linux even lets you not use Linux, if you like.
It didn’t boot.
We used to have good, strong open source tools made out of C (which is a lot like steel - it can only be worked by blue-collar computer nerds with muscly brains). Now that steel core is corroding because of the influence of hackers and other white-collar computer sorts with their creative problem solving, and unintended uses of memory.
That new corrosion is called rust, and it eventually appears on every C project that’s left outside, unless someone comes along to brush it off occasionally.
Ideas can only be patented, not copyrighted. If a company designs something novel enough to qualify for a patent, and so good that people willingly pay for the feature, that’s impressive, and arguably still a good thing. If instead they design a better user experience, or an improvement in performance, the ideas can be used in open source, even when the code cannot be.
I’m bragging when I say this: A decade ago, I rewrote an indecipherable mess of code into an elegant and transparent procedure, nestled comfortably inside every sanity/insanity check I could think of for the situation. Today, that code (aside from an update for a vulnerable dependency) is still running just the way I wrote it.
Releases should be fast and rare.
The secret is that rats are committed Luddites. They chew through wires, but maintain and restore the elegant mechanisms of times long past.
You may expect that a few rats couldn’t roll a 6-ton boulder back up a hill, but rats are also capable of growing to very different sizes depending on their environment.
You can skip 3 of these adapters if you upgrade to the latest libraries, downgrade your microcode, turn off WiFi, and bench press a goat. It turns out it was the goat involved in the process, rather than the sacrifice, that made that stuff work.
“Graphical UI” it is
Systemd is trying to stop a service. To perform an action on a service (or any unit), it runs a job. The job to stop a service is called a stop job. Once the stop job is taken off the job queue, the stop job is running.
The method of stopping a service is configurable, but the default is to send a kill signal to the MainPID, then wait for the process to exit. If it doesn’t, after a timeout, the kill is reattempted with a harsher signal.
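The relevant knobs live in the unit’s [Service] section. A minimal sketch using systemd’s documented options, shown with their usual default values rather than anything from a specific unit:

    [Service]
    # Signal sent to the service's processes when the stop job runs (SIGTERM by default)
    KillSignal=SIGTERM
    # How long systemd waits for the process to exit before escalating
    TimeoutStopSec=90s
    # After the timeout, follow up with SIGKILL
    SendSIGKILL=yes

If a unit sets ExecStop= instead, systemd runs that command first and only falls back to signalling whatever processes are left.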
Companies try to maximize green per red. By paying less, and getting the same, they maximize that, year after year until (in a temporary and unforeseeable setback) you leave for… Bluer pastures, apparently.
There are different sorts of companies, and the more they think of employees as a number of years of experience plus a stack of skills, the more susceptible they are to believing that replacing humans with other equally skilled humans is a productive way to spend their time.
Arch Linux is a spectrum mean it says tomorrow
I ran out of CRTCs, but I wanted another monitor. I widened a virtual display, and drew the left portion of it on one monitor, like regular. Then I had a cron job that would copy chunks of it into the frame buffer of a USB to DVI-D adapter. It could do 5 fps redrawing the whole screen, but I chose things to put there where it wouldn’t matter too much. The only painful thing was arranging the windows on that monitor, with the mouse updating very infrequently, and routinely being drawn in 2 or more places in the frame buffer.
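Not the original script, but a rough sketch of that copying loop, assuming the widened virtual display shows up at /dev/fb0, the USB adapter at /dev/fb1, and both use 32-bit pixels; the resolutions and offset below are made up for illustration:

    import time

    FB_SRC = "/dev/fb0"        # widened virtual display (assumed device node)
    FB_DST = "/dev/fb1"        # USB-to-DVI adapter (assumed device node)
    SRC_W = 1920 + 1280        # total width of the widened virtual display (made up)
    DST_W, DST_H = 1280, 1024  # adapter resolution (made up)
    X_OFF = 1920               # where the extra monitor's slice starts (made up)
    BPP = 4                    # bytes per pixel, assuming 32-bit color

    def copy_frame():
        # Copy the right-hand slice of the virtual display into the adapter's
        # frame buffer, one scanline at a time.
        with open(FB_SRC, "rb") as src, open(FB_DST, "r+b") as dst:
            for row in range(DST_H):
                src.seek((row * SRC_W + X_OFF) * BPP)
                dst.seek(row * DST_W * BPP)
                dst.write(src.read(DST_W * BPP))

    while True:
        copy_frame()
        time.sleep(0.2)  # roughly the 5 fps mentioned above

The row-by-row copy is unavoidable: frame buffers are linear in memory, so a rectangular slice of a wider display has to be copied one scanline at a time.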
Have you tried turning them off, then turning them on again?
I think we’re still headed up the peak of inflated expectations. Quantum computing may be better at a category of problems that do a significant amount of math on a small amount of data. Traditional computing is likely to stay better at anything that requires a large amount of input data, or a large amount of output data, or only uses a small amount of math to transform the inputs to the outputs.
Anything you do with SQL, spreadsheets, images, music and video, and basically anything involved in rendering is pretty much untouchable. On the other hand, a limited number of use cases (cryptography, cryptocurrencies, maybe even AI/ML) might be much cheaper and faster with a quantum computer. There are possible military applications, so countries with big militaries are spending until they know whether that’s a weakness or not. If it turns out they can’t do any of the things that looked possible from the expectation peak, the whole industry will fizzle.
As for my opinion, comparing QC to early silicon computers is very misleading, because early computers improved by becoming way smaller. QC is far closer to the minimum possible size already, so there won’t be a comparable, “then grow the circuit size by a factor of ten million” step. I think they probably can’t do anything world shaking.
You can buy high (97-99) CRI LEDs for things like the film industry, where it really does matter. They are very expensive, but can pay for themselves with longer service life, and lower power draw for long term installations.
The CRI on regular LED bulbs was climbing for a long time, but it seems as though 90ish is “good enough” most of the time.
I dislike systemd less than I dislike sysvinit, so it has that going for it.