The gold standard is interesting, but wouldn’t the gold end up in a few hands eventually?
Ya the SENS repair approach is the way to go IMO.
Well, the “not having extreme longevity” part doesn’t seem to be working; they’re here anyway.
Seems your plan doesn’t work; they’re here anyway.
Solar parking lots!! Park your car over solar panels!! Solar pools, put them at the bottom of your pool!! Put them INSIDE!
It’s like an idiot manically obsessed with solar panels got their hands on some heavy drugs.
*“Podcast-bro”
Eat my shorts!
I remember iTunes only letting you change computers like 2-3 times max before the DRM would make the music stop working, but maybe it was DRM-free in the beginning.
I had a Chinese 1GB shuffle though, so IDK if that’s correct.
The Chinese shuffle also doubled as a USB key (very useful back then) and didn’t need iTunes to function, smh.
The chief engineer of Intel predicted that it’d probably be impossible to get to 1µm when they broke the 3µm barrier.
Yeah, µm, not nm. So my bet is it will probably continue; it’s not really a law, and everyone interprets it differently anyway.
Yeah true, no need to nitpick about transistors per mm² :-)
Stuff gets denser, sometimes more efficient, and it goes on and on, more to the rhythm of international markets today than trying to keep up with old Moore.
It also feels like the target market has changed (again), from better PCs to handhelds, for example.
You missed the /s. I don’t think Moore’s law is that dead.
No worries, we’ll hit a hard wall somewhere around 5nm because Moore’s law is dead!! \s
And if it happens: Rebrand-Time!!
Hook it up as a subnet for yourself!
I have tons of Firefox tabs open, even on my phone. I’m quite sure they just get unloaded / not loaded if not used?
I wonder what the hell they are doing with it? I mean, I have the 3B with IIRC 1GB, and I can use the desktop and run Python scripts to fiddle with all the I/O ports and stuff. What do you do with a Raspberry Pi that needs eight times the RAM??
I’m seriously curious!
No problem, but I mean if you’re just tinkering around, you could do with even less memory, as long as the model stays in it and you sample small pieces in small batches.
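The “small pieces in small batches” idea can be sketched in a few lines. This is a hypothetical toy example (a NumPy array standing in for the dataset, made-up sizes), not anyone’s actual setup; the point is just that only one batch is materialized at a time:

```python
import numpy as np

def minibatches(data, batch_size, seed=0):
    """Yield small random batches; only one batch lives in memory at a time."""
    rng = np.random.default_rng(seed)
    idx = rng.permutation(len(data))  # shuffle once, then slice
    for start in range(0, len(idx), batch_size):
        yield data[idx[start:start + batch_size]]

# Toy "dataset": 1000 samples with 8 features each.
data = np.arange(8000, dtype=np.float32).reshape(1000, 8)
batches = list(minibatches(data, batch_size=32))
```

Since the generator only indexes one slice per step, the working set stays around `batch_size` rows regardless of how big the dataset is.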
We all had P-series GPUs, and we had to upgrade because the trainees’ model didn’t fit in 16GB (they probably had too much money), so I don’t remember which card it was for the 24GB.
We had GPUs with 24GB in 2017; go buy a pro one if you do AI and need that much RAM.
I mean, those cards are for gaming, right?
That’s golden!
I’m all for it, and it’s just the usual “Moore’s law” trend; I just wonder if we won’t hit a wall where (most!) users just won’t need it?