

Bigots challenging other people’s rights, group.


… why the fuck would proving Valve’s superdupermajority market share hinge on WON?


Lucille: “I don’t care for Tetris.bas.”


Software path-tracing has been on my bucket list, mostly to test a concept: physically based instant radiosity. If an eye ray bounces camera, A, B, C, then the light arriving at B from C turns B into an anisotropic point source. The material at B scatters light from C directly toward onscreen geometry. This allows cheating akin to photon mapping, where you assume nearby pixels are also visible from B. Low-frequency lighting should look decent at well under one sample per pixel.
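The "B as a point source" idea above can be sketched in a few lines. This is a toy Python sketch under my own assumptions: `vpl_contribution` and its parameters are hypothetical names, the scattering is plain Lambertian at both ends, and a real renderer would add a visibility test and clamping near the VPL.

```python
import math

def vpl_contribution(p, n, vpl_pos, vpl_normal, vpl_flux, albedo=1.0):
    """Shade surface point p (unit normal n) from a virtual point light.

    The VPL at vpl_pos stands in for light that arrived at B from C and
    gets re-scattered toward nearby onscreen geometry."""
    d = [vpl_pos[i] - p[i] for i in range(3)]
    dist2 = sum(c * c for c in d)
    dist = math.sqrt(dist2)
    w = [c / dist for c in d]  # unit direction from p toward the VPL
    # Lambertian cosine terms at both ends, clamped to the hemisphere.
    cos_p = max(0.0, sum(n[i] * w[i] for i in range(3)))
    cos_l = max(0.0, sum(vpl_normal[i] * -w[i] for i in range(3)))
    # Inverse-square falloff; pi normalizes the diffuse BRDF.
    return albedo * vpl_flux * cos_p * cos_l / (math.pi * dist2)

# Example: a VPL one unit above a surface point, facing it head-on.
print(vpl_contribution((0, 0, 0), (0, 0, 1), (0, 0, 1), (0, 0, -1), math.pi))
```

The unbounded 1/dist² term is why real instant-radiosity implementations clamp or bias near the VPL; otherwise pixels close to B blow out.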


It doesn’t use any mappers or added chips. There are quicksaves, a level editor, jump-in two-player co-op, and SNES mouse support.
I have not been arsed to add music.


All both of them?
Or like a hundred million?
Sony pulls this shit with every new PlayStation, through the ingenious and difficult process of not making enough. “PS3 sold out at launch! New shipment sold out again! And again!” Meanwhile they’d moved fewer total units than the 360 in the same timeframe, but Microsoft made one big shipment instead of three small ones.


I have officially lost the plot on what’s happening here.


Oh damn, we’re back to custom Windows 95 boot screen bitmaps. I had one that did the Matrix digital rain effect.


Oh it’s definitely not just Google. Apple’s been this fucked since 2007. But since this is the Android community, it’s helpful to stay on-message.


Shatter this corporation.


‘LLMs specifically won’t work.’
‘No, see, LLMs won’t work.’
Okay.


Right, should say deep neural networks. Perceptrons hit a brick wall because there are problems a single layer flat-out cannot handle, like XOR. Multi-layer networks stalled because nobody went ‘what if we just pretend there’s a gradient?’ until twenty-goddamn-twelve.
Broad applications will emerge and succeed. LLMs kinda-sorta-almost work for nearly anything. What current grifters have proven is that billions of dollars won’t overcome fundamental problems in network design. “What’s the next word?” is simply the wrong question, for a combination chatbot / editor / search engine / code generator / puzzle solver / chess engine / air fryer. But it’s obviously possible for one program to do all those things. (Assuming you place your frozen shrimp directly atop the video card.) Developing that program will closely resemble efforts to uplift LLMs. We’re just never gonna get there from LLMs specifically.
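The perceptron brick wall mentioned above is linear separability: no single linear threshold unit computes XOR, per Minsky and Papert. A quick brute-force sketch (a coarse grid search, not a proof, and all the names here are my own):

```python
import itertools

XOR = {(0, 0): 0, (0, 1): 1, (1, 0): 1, (1, 1): 0}

def perceptron(w1, w2, b, x):
    """Single linear threshold unit."""
    return 1 if w1 * x[0] + w2 * x[1] + b > 0 else 0

# Try every weight/bias combo on a coarse grid from -4 to 4 in 0.5 steps.
grid = [i / 2 for i in range(-8, 9)]
found = any(
    all(perceptron(w1, w2, b, x) == y for x, y in XOR.items())
    for w1, w2, b in itertools.product(grid, repeat=3)
)
print(found)  # no single perceptron on this grid fits XOR
```

The same search trivially finds weights for AND or OR; XOR needs a hidden layer, which is exactly where the field stalled until training deep stacks became practical.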


People have run LLMs on a Raspberry Pi.
The bubble is straining to burst because there’s not much difference between the high end and the low end.


Neural networks will inevitably be a big deal for a wide variety of industries.
LLMs are the wrong approach to basically all of them.
There are five decades of what-ifs, waiting to be defictionalized, now that we can actually do neural networks. Training them became practical, and ‘just train more’ was proven effective. Immense scale is useful but not necessary.
But all hype has been forced into spicy autocomplete and a denoiser, and only the denoiser is doing the witchcraft people want.


Why are you even paying for it now, instead of doing it locally?


“keeping our free societies ahead”
Why is my dog barking?
Ooh, fair point. We don’t know that any of these options boot.
I cannot fathom having my shit together to such a degree that my bootloader has a theme.
Folks, they train on Disney movies. Intellectual property is not a factor.