“Oh hi! Here’s some code. I didn’t write it and don’t understand it, but you should totally run it on your machine.”
Canadian software engineer living in Europe.
The bit of information you’re missing is that du aggregates the size of all subfolders, so when you say du /, you’re saying: “how much stuff is in / and everything under it?”
If you’re sticking with du, then you’ll need to traverse your folders, working downward until you find the culprit folder:
$ du /*
(Note which folder looks the biggest)
$ du /home/*
(If /home looks the biggest)
… and so on.
The trouble with this method, however, is that * won’t match folders with a . in front, which are often the culprits: .cache, .local/share, etc. For those, you can do:
$ du /home/.[!.]*
(The more obvious /home/.* also matches .., which would drag in the parent directory, so the .[!.]* pattern is safer.)
Which should do the job, I think.
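To save yourself some squinting at raw byte counts, a one-liner like this summarises everything one level down, dotfolders included, sorted so the biggest offender lands at the bottom. (Sketched here against a throwaway demo directory so it’s harmless to run; in practice you’d point the globs at / or /home.)

```shell
# Demo directory so the command has something to chew on; in practice
# you'd point the globs at / or /home instead.
DIR=/tmp/du-demo
mkdir -p "$DIR/.cache" "$DIR/docs"
dd if=/dev/zero of="$DIR/.cache/blob" bs=1024 count=64 2>/dev/null

# -s summarises each argument, -h prints human-readable sizes,
# and .[!.]* catches the dotfolders that * misses (without matching ..).
du -sh "$DIR"/* "$DIR"/.[!.]* 2>/dev/null | sort -h
```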
If you’ve got a GUI though, things get a lot easier 'cause you have access to GNOME Disk Usage Analyzer which will draw you a fancy tree graph of your filesystem state all the way down to the smallest folder. It’s pretty handy.


Plus the FF extension is really full-featured. I can clip in different formats or even take a screenshot if the webpage makes clipping hard.
I didn’t even know there was a Firefox extension! I might give it a look.
I was a Windows user as a kid in the 80s & 90s doing pirate installs of 3.11 and later 95 for friends and family. I got into “computers” early and was pretty dedicated to the “Windows is the best!” camp from a young age. I had a friend who was a dedicated Mac user though, and she was bringing me around. The idea of a more-stable, virus-free desktop experience was pretty compelling.
That all changed when I went to school and had access to a proper “Mac lab” though. Those motherfuckers crashed multiple times an hour, and took the whole OS with them when they did it. What really got to me though was the little “DAAAAAAAAAAA!” noise it would make when you had to hard reboot it. It was as if it was celebrating its inadequacy and expected you to participate… every time it fucked you over and erased your work.
So yeah, Macs were out.
I hadn’t even heard of Linux in 2000 when I first discovered the GPL, which (for some reason) I conflated with GNOME. I guess I thought that GNOME was a new OS based on what I could only describe as communist licensing. I loved the idea, but was intimidated by the “ix” in the name. “Ix” meant “Unix” to me, and Unix was using Pine to check email, so not a real computer as far as I was concerned.
It wasn’t until 2000, when I joined a video game company called “Moshpit Entertainment”, that I tried it. You see, the CEO, CTO, and the majority of the tech people at Moshpit were huge Linux nerds, and they indoctrinated me into their cult. I started with SuSE (their favourite), then Red Hat, then used Gentoo for 10 years before switching to Arch for another 10+.
TL;DR: Anticapitalism and FOSS cultists led me into the light.
What exactly is an external drive case? Are you just talking about a USB enclosure for a single drive or something that can somehow hold multiple drives and interface over something more stable than USB?


Joplin will do this for you. It comes ready to sync with all sorts of cloud options, as well as “local folder” which works well with Syncthing. It’s offline-first, cross-platform, and FOSS.


…or contribute to Mozilla’s work while getting something in return.
That’s exactly the reasoning Google has followed with its development and promotion of webp. Unfortunately, whether the website cares or not, CO₂ emissions are markedly higher due to increased client energy consumption, and that does directly affect you, so it’s worth considering the implications of using webp in a popular site.
Webp is pretty great, actually. Supporting full 32-bit RGBA (that is, a proper alpha channel) means I’ve managed to reduce the file sizes of what were formerly PNGs by something like 80%, which drastically improved performance (and the size of my project). I don’t get where the complaint about image quality came from either, as it seems to perform better than JPEG at the same file size.
The worst part is that you missed the real problem with the format: the CPU overhead (and therefore the energy cost) of handling the file. A high-traffic site can dramatically increase the energy required for the images processed by the thousands/millions of clients in a single day, which places a drain on the grid and bumps up CO₂ (yes, this is really a thing that people measure now).
Basically Google invented the format to externalise their costs. Now, rather than footing the bill for bigger datacentres and greater bandwidth, they made everyone else pay for decompression.


Serious question: could we not just fork the project under the GPL and use that?


There’s a GNOME extension called “Just Perfection” that may be exactly what you’re looking for. It lets you hide/disable pretty much any visual element you can think of.


from datetime import datetime
from dateutil.relativedelta import relativedelta  # third-party: pip install python-dateutil
print(datetime.now() + relativedelta(years=10))  # 2035-08-24 12:02:49.795177
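For what it’s worth, if you’d rather not pull in the third-party python-dateutil dependency, plain datetime can fake the same year shift. The only wrinkle is Feb 29, which relativedelta quietly handles for you and replace() doesn’t:

```python
from datetime import datetime

now = datetime.now()
try:
    shifted = now.replace(year=now.year + 10)
except ValueError:  # started on Feb 29 and the target year isn't a leap year
    shifted = now.replace(year=now.year + 10, month=2, day=28)
print(shifted)
```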
It does get better, but… it’s kinda like river rafting.
Coming from Windows, Linux can and does often feel like you’ve spent your whole life trapped in a box. Suddenly “that thing that’s always annoyed you” is something you can turn off, replace, or improve with very little effort. I remember for example that when I switched back in 2000 I was blown away by a checkbox in the KDE PDF viewer. You could, in the basic settings, with no special hackery required, simply uncheck the box labelled Respect Adobe DRM. Suddenly, my computer was actually mine.
Using Linux these days is still just as amazing. You go from an OS that spies on you, pushes ads into your eyeballs, and has some of the worst design patterns ever, to a literal bazaar of Free options. It’s different for everyone, and that’s sort of the point: Linux is “Free” in all senses of the word, as you can make your machine do whatever you want.
It takes some time to get there though, and a lot of it is hardware unfortunately. A lot of the machines out there are built exclusively for Windows and the companies that make these things hide a lot of their inadequacies in their (proprietary) Windows drivers. So, when you try to use not-Windows, you end up using drivers written by people who had to reverse engineer or just do some guesswork to get that hardware working. This arrangement works very well for both Microsoft and these budget hardware vendors because it provides lock-in for the former, and a steady market for the latter.
The reality is that if you want to make the switch to Linux, you’re more likely to have a hard time if your hardware falls into this camp. For example, sometimes it’s just easier to buy a €12 USB WiFi or Bluetooth adapter that you know works with Linux than it is to rely on the chip that came with your laptop. Things are better now than they once were, but Nvidia cards, the occasional webcam, and a few WiFi devices have all been problems for me in the last few years.
My advice is to embrace that “patience and stubbornness” and temper it with an honest pricing of your time vs. the cost of replacing the problematic hardware. When buying new stuff, look up its Linux support online before buying anything. You’ll save yourself a lot of pain.
In cases when you really want to dig in and understand/fix your problem (because it’s Linux, you’re allowed to understand and fix things on your computer!) then I recommend looking at the Arch Wiki and even using Arch Linux since (a) that’s the basis for most of the information there, and (b) Arch tends to favour “bleeding edge” stuff, so you’re more able to install the latest version of things that may well support your hardware.
I know it’s probably not the answer you were hoping for, but if you stick it out, I promise it’s worth it. I’ve been doing this for 25 years now and I’m never going back. Windows makes me so inexplicably angry with its constant nagging, spying, and inadequacies; I just can’t do it.


Most of the comments here seem to be from the consumer perspective, but if you want broader adoption, you need to consider the corporate market too. Most corporate software these days is web-based, so the problem is less with the software and more with the people responsible for it.
The biggest hurdle is friction with the internal IT team. They like Windows because that’s all they ever learnt and they’re not interested in maintaining a diverse set of company laptops. They won’t entertain Linux in a corporate environment unless it’s mandated by management, and even if the bosses approve it, IT will want a way to lock you out of your laptop, force updates, do a remote wipe, etc.
There are (proprietary) tools to do some of this, but they generally suck and often clash with your package manager. Microsoft is just way ahead of Linux in the “bloatware that ties your hands” department.


That sounds like a nice feature we could use for the AUR, actually. We already have the votes value, but some sort of verification body could help rescue the AUR’s reputation.


It might be interesting to test whether changes in this area via KDE persist through the switch to game mode. Much of what you do in KDE is really just changing lower-level system settings, so you may find you can set rules in KDE and use them in Steam.


This all appears to be based on the user agent, so wouldn’t that mean that bad-faith scrapers could just declare themselves to be a typical search engine by spoofing its user agent?


I’ve been thinking about setting up Anubis to protect my blog from AI scrapers, but I’m not clear on whether this would also block search engines. It would, wouldn’t it?


I had a job interview a few weeks ago where the lead developer straight-up said that he doesn’t have any tests in the codebase because “it’s just writing your code twice”. I thought he was joking. Unfortunately he was not.
I didn’t end up getting the job, perhaps because I made it clear that I thought he was very wrong. I think I dodged a bullet.
If you build for a containerised environment, standing up your service in Kubernetes with HPA gives you all the scalability (and potentially cost) benefits of serverless without all the drawbacks.
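For context, here’s roughly what that looks like; a minimal sketch where all the names and numbers are made-up examples, not anything from a real deployment:

```yaml
# Hypothetical HPA for a Deployment called "my-service".
apiVersion: autoscaling/v2
kind: HorizontalPodAutoscaler
metadata:
  name: my-service
spec:
  scaleTargetRef:
    apiVersion: apps/v1
    kind: Deployment
    name: my-service
  minReplicas: 2        # never drops to zero, so no serverless-style cold starts
  maxReplicas: 20
  metrics:
    - type: Resource
      resource:
        name: cpu
        target:
          type: Utilization
          averageUtilization: 70   # add pods once average CPU passes 70%
```

You can get the same effect imperatively with `kubectl autoscale deployment my-service --min=2 --max=20 --cpu-percent=70`, but keeping the manifest in version control is usually the better habit.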