

History has shown it many times: people who annex never ever make things better for the annexed. They always exploit, and things get worse.


Unfortunately, that is not really possible.
The UEFI specification, a PDF that describes in detail the unified system all motherboards use during the boot process, is 1200+ pages long. And that's only one of the many subsystems in a modern machine (that gigantic PDF tells you nothing about PCI, ACPI, USB, or any other hardware peripheral). And since you are talking about a modern system, you would also need documentation for the kernel, the drivers and the operating system calls. All of this exists (for an open source OS like Linux, and if you follow the aforementioned standards), but bundling it into a book, and keeping it updated, would simply be impossible.


Because 1) EU laws protect consumers a lot more, and 2) US companies already have so much power and money that they can fuck you over more easily, and you don't have easier alternatives, or at least some people pretend you don't.


I wonder why apt search on Ubuntu and Debian has to be so bad: on Mint each package gets a single line and a simple letter telling you whether the program is installed or not. On Debian/Ubuntu each package takes multiple lines, they're all green, and the only way to spot the installed ones is to look for an "(installed)" string at the end of the first line. I like Mint's apt version so much.
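For what it's worth, you can get something close to Mint's one-line output yourself with the python3-apt bindings. This is only a rough sketch of the idea (attribute names are the ones from python-apt as I remember them, not Mint's actual code):

    #!/usr/bin/env python3
    # Rough approximation of Mint's compact "apt search": one line per package,
    # with a letter flag showing whether it is installed. Needs python3-apt.
    import sys
    import apt

    pattern = sys.argv[1].lower() if len(sys.argv) > 1 else ""

    cache = apt.Cache()
    for pkg in sorted(cache, key=lambda p: p.name):
        if pattern not in pkg.name:
            continue
        cand = pkg.candidate
        if cand is None:  # known to apt but not installable from any repo
            continue
        flag = "i" if pkg.is_installed else " "
        print(f"[{flag}] {pkg.name:40} {cand.summary}")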


That feels so bad for signal integrity, especially at 5+ GT/s


That’s good, AppImage is still my favourite of the “distro-agnostic” package systems and I think it really is missing a central repository solution.


It’s a package repository, but I would hardly call it “central”


I’m not saying that’s not true.
I'm saying I've almost never downloaded a Flatpak that didn't require downloading a new dependency.
When I removed all my Flatpaks some time ago, I had Steam, Viking, Discord, FreeCAD and Flatseal to manage them. All of them and their dependencies used something around 17 GB of disk space (most of which was, of course, several versions of dependency runtimes), and that was after I removed all the unused runtimes that for some reason don't get removed when I uninstall an app or when they are upgraded.
I'm sure if I installed more Flatpaks, some dependencies would eventually be reused, but you still need a good collection of them at any given time. So in practice you still need a lot of space, unfortunately.
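If it helps anyone: the runtimes are listed separately from the apps, and flatpak does have a command to drop the ones nothing references anymore (it just never seems to run on its own for me). A small sketch, only wrapping the standard flatpak CLI, so double-check it against your own setup:

    #!/usr/bin/env python3
    # List installed Flatpak runtimes (usually the bulk of the disk usage),
    # then remove the ones no installed app still depends on.
    import subprocess

    # Runtimes only, apps excluded
    subprocess.run(["flatpak", "list", "--runtime"], check=True)

    # Uninstall runtimes/extensions that nothing references anymore
    subprocess.run(["flatpak", "uninstall", "--unused", "-y"], check=True)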


I don't know if it's still the case, but up to a couple of years ago Flatpak was configured so that externally mounted folders were not accessible. I discovered that when Steam on Flatpak refused to install games on my HDD, and it was quite frustrating to figure out how to enable it. Still, it's hard to criticize how "bloated" Electron apps are (they are) when I need to download 2 GB of runtime for an 80 MB Telegram binary.
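In case anyone else runs into the same external-drive thing: the usual fix is to grant the app extra filesystem access with "flatpak override", which is what Flatseal does behind its GUI. A quick sketch, where the mount point is just an example path:

    #!/usr/bin/env python3
    # Give the Flatpak'd Steam access to an extra mount point so it can
    # install games there. com.valvesoftware.Steam is the Flathub app ID.
    import subprocess

    external_drive = "/mnt/hdd"  # example: use wherever your drive is mounted

    subprocess.run([
        "flatpak", "override", "--user",
        f"--filesystem={external_drive}",
        "com.valvesoftware.Steam",
    ], check=True)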
Snap integration is even worse: I've seen browser extensions state outright that they don't work on snapped browsers. Desktop integration on GNOME (even file drag and drop between snaps) is also broken on the Ubuntu installations I tried.
AppImages have the fewest drawbacks and are my preferred method of the three (at least they take less storage space than an equivalent Flatpak for some reason, though they are still broken sometimes), yet they still lack a central package repository, and that's a big problem.


AppImages are usually quite reasonable in size; it's Flatpaks that usually require 2-3 GB per app, since every package pins its own version of the KDE/GNOME or other runtimes, so every app still has to download a new one.


Also, each is pretty bad in terms of usability and practicality, either losing integration because it's "containerized", or taking GBs of space, or both.
Edit: guys, relax, I'm not a Linux hater, I use it daily. But Windows does have a unified environment, which makes deployment so much easier, while Linux doesn't. And that's a problem, since you either get old, broken apps from distro repositories, or impractical, potentially bloated, and even more fragmented environments like the ones I mentioned. They are patches, and we should work towards a more standard environment, not keep adding more and more levels of abstraction like Electron does.
Even Torvalds says so.


If you only want to retrieve your files, run the crack in a VM; it will probably run like shit, but you don't need to do it all the time.


CS:GO is a free game, and I was wondering about pirating a game that is 100% online. Are the people downloading this only playing against bots? Are there going to be private servers also rolled back to a previous version? I'm just curious.


What's the benefit of this? Cracking an online game that is free anyway


I think it's because they try so hard to be edgy (eheh) and different from the others, constantly trashing known, tested paradigms, refusing to fix known problems, all while trying to invent the "brand new thing" that nobody wants and that never really works out.


Since some extensions are "Mozilla-approved", I guess they test them regularly; it wouldn't be hard to verify whether one is really sending anything despite its disclosure.


You can easily download planet.osm; I think it's a couple of TB for the compressed file.


Can't really advise you on what to do, but here are some considerations:
• I still use a 4th gen i7 with 16 GB of DDR3 and a GTX 970, still going fine in its 10th year. Just recently upgraded to a GTX 1060 I found around.
• 10-year-old technology isn't really any different from today's, only slower, and luckily architectural incompatibility is becoming less and less of a problem (except when it's forced on you for no particular reason, see Win11).
• GPUs especially are extremely backward and forward compatible: if you only need more VRAM, you can use a modern GPU with a very old mobo and CPU and chances are you'll be fine, and even if you need to upgrade them later because you are CPU-bottlenecked, you can still keep the GPU. I'm guessing in 90% of cases, PCIe lane speed is relatively unimportant, whether it's gen3 or gen5.
Basically, upgrade when you feel limited in what you can do, and ignore the pressure of the generations passing by. As time goes on, I predict we'll need fewer and fewer hardware upgrades until a new revolutionary technology comes about that changes everything.


Not a good choice for a name; at first I thought it was just another Linux phone that would be useless for 90% of people.
Very cool project though, I hope it can lay the foundation for a 100% open source mobile OS.


Are you sure about this? As far as I know, Debian modernized its repos quite a bit, even compared to Ubuntu, which also sparked some controversy among long-time Debian fans, especially those who wanted more dated, stable software. I've never used LMDE though, so I'm not sure if it applies.