

Yes, I understand that, I meant that it somehow felt to me like a dinosaur term in English that nobody uses. Dunno why.
Rephrasing a common quote - talk is cheap, that’s why I talk a lot.


I mean, a PC from 1999 is within reach of far more localized production chains than the ones needed to build that monster with Ryzen in the name.
And it’s not unreasonable to expect such a scattering of production. It happened with plenty of technologies. It’s also not unreasonable to expect a return from a more sophisticated and powerful material culture to one that’s less of both, but more accessible.
That’s what happened with automobiles a few times in history, that’s what happened with construction technologies and money many times in history, with food, with warfare.
The fact that semiconductors are so challenging to produce actually makes such scattering more probable.
It’s not much different from chinaware or late medieval metallurgy needed for firearms. Strategic technologies are hard to achieve and it’s simpler to purchase their output, but eventually everyone realizes they need their own.
So I really hope that instead of the same not-very-diverse ecosystem of powerful Intel, AMD and ARM hardware, we’ll have a thousand different local manufacturers of partially compatible hardware that’s far weaker, like the Amiga 1200, but more interesting.
Perhaps this will also be similar to the transition from late Rome to early Middle Ages.
It just makes sense historically. A more distributed production environment supports less efficiency - it can’t make and sell at the same scale - but there will be constant pressure to have it anyway.
Of course, in reality this is all alarmism for no reason. Suppose there’s a bubble burst - well, then there’ll be plenty of cheap hardware thrown out. The RAM manufacturers will have hard times, but it’ll balance out eventually, just as it did after the dotcom bubble - not in the best way, perhaps with only a few manufacturers remaining, but it will. Or suppose there’s no bubble burst and all that computing power finds an application with non-speculative value - well, there’s still a long way to go before your typical PC usage starts requiring really expensive amounts of RAM. If we drop the Web, even with modern Linux or FreeBSD one could survive on 2GB RAM and an Intel C2D in 2019. On 4GB it was almost comfortable, even playing some games.
One good thing I’m seeing - those RAM prices could eventually kill the Web. It’s the most RAM-hungry part of our computing needs, for no good reason. Perhaps Gemini is not what can replace it - it’s too basic - but I can see it becoming a corporate interest to support a leaner, non-compatible replacement for the same niche. And corporate interest kills.
Or perhaps they’ll do some sort of semantic web gone the wrong way - some kind of “web” intended for AI agents, not humans, with humans left with a chat prompt.


(offtopic: I’ve noticed the word “notebook”, used to mean the same as “laptop”, becoming more common recently. At some point I even thought it was a Russian localization curiosity, using the transcription of “notebook” for that.)
Nice machine. I wonder if it’d be deliverable to where I am if I needed it.


Yes, but this only works if said concentrated manufacturer group also holds all the IP and the power to prevent competition in the markets they don’t want filled.
It’s like a monopoly protected by a navy, something right out of the 1600s. If such a state of things is established in some countries, all the others will have an advantageous route of peaceful development, except with a higher risk of war and sabotage from the former group. Almost like the colonial unpleasantness between the Iberian monarchies on one side and England and the Netherlands on the other. From the point of view of the former, they had the Papal blessing and divine ownership of the New World divided between them, and the latter were heretics and thieves. From the point of view of the latter, the former didn’t have any exclusive rights to overseas lands not populated by Europeans. While the popular narrative (right out of Sabatini’s books and such) portrays the former as bad and the latter as good, I’ll note that the former did less racism, slavery and genocide, and their former colonies are culturally mixed, unique nations. Unlike British colonies, which are all, even the USA, a sort of England overseas with diverged dialects.
The point is - there are legal arguments which might eventually become bigger conflicts.
So - you won’t do anything to already consolidated power. This might become a new global split, political in dimension but driven by economic interest. It’s already being tested, in fact, with Gaza and other recent conflicts. And it would be a shame if most western countries ended up on the wrong side, because that wouldn’t make the other side better than it really is, but it really would have an advantage in development. You can forbid people to produce and own universal personal computers for all kinds of use only if they live under your control.


It’s the web as it exists now. It can’t be fixed gradually - or at least that’s harder than designing a replacement from scratch with the same abilities but fewer levels of abstraction and less bloat, so that making a client application in reasonable time is possible. Probably with architecture and semantics centered around how social networks and messengers work, not just hypertext. Visiting a webpage and reading a group chat are different ideas; the latter doesn’t imply connecting to one specific location. Again, that’s something that was understood since Usenet. It’s just that no public system like Usenet, only not morally obsolete, ever emerged to be popular.


Yes. This will hurt mostly small investors and wage workers in everything less essential than factory jobs - the service sector, people like that.
Humans that will take back control already have much of that control, the difference mostly lies in slavery being formally illegal.
But the biggest effect will be similar to that of the dotcom bubble, I think.
Ah, whatever. I’m just bored of this stage of development; let them be done with it and show the results. I want to see what happens after. It must be something interesting. There must be some movement in the opposite direction, even if temporary - like how the Web in the 00s for some time had this emotional flavor of a rainy Tokyo street in spring, and peer-to-peer networks had lots of popularity.
I think with all the doom and gloom we should remember that the next stage is always interesting if it doesn’t involve more murder and torture.
Say, when this bubble bursts, conversational LLM-based user interfaces will, instead of being the solution to every problem that solves none, become just another thing. That might be good; it might make typical UX sane again.
Also it’s almost logical that the way you’d communicate with another person is also how you’d want to communicate with a service. EDIT: so it’s good that they’ll exist.
Then after that we might see less of that very bloat and the omnipresent telemetry that nobody likes. Data will be less of “the new digital oil” and more of a bother that doesn’t buy that much.


Perhaps. I’ve recently watched a video on historical coins and the quality of relief.
Things like the Athenian tetradrachm were very rare, and thus done properly. The more numerous recognizable coins had to be (due to inflation, and because for much of history in much of the world old or foreign money remained valid currency), the simpler the designs got. Medieval European mints had to make such amounts of coins that they were very simple and streamlined.
So what does this have to do with computing - building something in an empty place works differently than replacing or changing something in the middle of an old city. When computers were new, things worked differently than now, economically and socially.
It’s possible that when this settles into something more stable, normal amounts of RAM for personal computers will again be in megabytes.


Windows 2000/ME
One is good but slower, the other is buggier than 98SE but a bit faster. Not much in common between them other than the year.


It’s more of an emotional antipode to how tracking everyone gets justified - “you have nothing to fear if you have nothing to hide” and all that.
Whether, say, a convicted rapist (I suppose that’s dishonest enough) should be tracked or not is a question in the system of values my previous comment represents.
First, whether their being a confirmed (by a proven deed) threat justifies tracking them; second, whether tracking them violates the rights of those around them - their coworkers, their family members, their friends, and so on; third, whether it’s possible to make tools for tracking them without introducing a technical possibility of tracking random people.
The second and third are not the same: the second is about how tracking technically only them still exposes everyone on their social graph; the third is about use that’s initially illegal but technically possible, which would eventually become legal because of slippery slopes.


Perhaps that’s intentional. A whole country dependent on something that’s fully based on things not being unique.
Perhaps that’s some sort of national salvation plan. Of course, with shares of power in the post-crash USA being divided by the right people beforehand.
To crash it and then create a manufacturing economy, poor, but resilient. Sort of trying to repeat China. Or USSR.
Perhaps some important people judged that the USA as a developing union of colonies on a sparsely populated continent is less stable than a normal old-world nation. All the American solutions become less efficient over time, because there’s less and less use of unoccupied spaces, in various senses, and more and more continuing of the same old tree. And their solution is to destroy the legacy of the time when the USA was a leading country, to allow some modern muscle to grow on that skeleton. Otherwise, when that legacy ran out, the decay would be worse.
Well, I can imagine such a thought process. If plenty of people on Lemmy think most of tech is scams, then why wouldn’t billionaires - it’s not as if they were different creatures from Mars.


It’s the “common sense” part of the laws.
An honest person has the right to live without being tracked. You shouldn’t care how they’ll do it, and you shouldn’t care if they go out of business.
And of course you shouldn’t fear being public about it and demanding answers. LOL, the most notable part of today’s politics, for me personally, is that in English-speaking countries that fear seems to have become a thing. Well, because any protest that’s more than a demonstration is becoming dangerous and costly.
While literal legalism always helps tyranny.
It’s not much different from the USSR in the 70s and 80s: “yeah, you can have all your rights as a defendant and all, and your correspondence, and you won’t be tortured for submitting a complaint, and Soviet laws will be followed to the letter, but good luck, prove you’re not a camel”.
Since the USSR and western nations no longer exist in the same time period, it’s easy to discard even the thought that the latter are gradually becoming similar to the former in some regards, and might even overshoot it.
Anyway, I live in Russia; for the last few months things here have been at the point where I can get jailed for writing even this, just because. LOL again.


Using XP was almost the same as using W2K, except uglier, but more sci-fi-feeling. IIRC.
But yes, I too remember W2K as the best one.
From the PoV of a kid visiting websites, reading books on the Web, playing forum RPGs and some video games, and downloading MP3s. And talking over ICQ.
From that PoV it was fast, clean and without distractions and I liked the icons, the sounds and the wallpapers.


It’s part of that FOSS I’m calling neutered and sterile.
FOSS is about following laws and making contracts that work conveniently within existing laws.
That’s fine, but it’s just a cultural stage, a projection of the wider idea onto our reality in its local form.
When you reduce that wider idea to FOSS with copyleft, you kill it.
Underground culture is important.


That’s how evolution works, mathematically. What’s more likely to survive and give offspring, survives and gives offspring.
And unless unlimited time and resources become firm reality, that’s how it will work further.
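That selection logic is easy to sketch as a toy simulation (entirely my own illustration - the two variants and the 1.1x reproduction rate are made-up numbers, nothing from the thread):

```python
import random

# Two variants competing for a fixed carrying capacity; B leaves
# slightly more offspring on average than A. Nothing else differs.
random.seed(0)
pop = ["A"] * 500 + ["B"] * 500
REPRO = {"A": 1.0, "B": 1.1}  # average offspring per individual (assumed)

for generation in range(200):
    offspring = []
    for ind in pop:
        # expected count REPRO[ind]: guaranteed part plus a chance
        # at one extra for the fractional part
        n = int(REPRO[ind]) + (random.random() < REPRO[ind] % 1)
        offspring.extend([ind] * n)
    # resources are limited, so only 1000 survive each generation
    pop = random.sample(offspring, 1000)

share_b = pop.count("B") / len(pop)
print(f"share of B after 200 generations: {share_b:.2f}")
```

Even a small reproductive edge compounds every generation, so under a fixed carrying capacity the slightly-fitter variant almost always ends up as the whole population - exactly the “what’s more likely to give offspring, gives offspring” point.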
Anyway, that’s so galaxy-scale that I don’t want to think about it.


It doesn’t.
Imagine displaying a window list without GPU acceleration.


Which would mean it should have been rewritten long before. =\


More like preserved them since saner times. I think I’ve read that in Japan you can choose from 20+ payment systems, everywhere. And it’s also normal to use cash.
They have also kept a saner version of IP laws for fan fiction. You can legally produce and sell fan fiction there, with some limitations on genre and kind of media.


Because you did in the past and taught them you will do that again.
There’s a rule - don’t deal with assholes. No matter how big and visible their shop is.


You are mistaken. “Distro” is a word for Linux distributions because they all have kernels from the same single upstream, plus userspace programs assembled from many different projects into different versions of the same dish.
BSDs are different operating systems; they don’t share one upstream, they share one ancestor (from like 30 years ago, so not very relevant now). That includes the userspace, except for common third-party software, of course.
And Darwin is another operating system, including its own userspace tools, which are partially derived from BSD code, but its kernel is different - it’s Mach plus some BSD-derived code. It’s not a BSD.
And while mostly Apple’s OSes are Darwin, I think I’ve read some of them are NetBSD. Not sure which.
And it’s a store, not a package manager. It’s in the name.
I had one such, HP something, with Atom and 1024x600 display. Ran Linux with dwm, conkeror, mpd and even some games in Wine. I really liked that, actually. Except for the 1024x600 part; at least 1366x768 would be better.