  • There’s no committee that approves words being added to the English language. Anything that’s understood by the group that uses it is a real word. We make up new words and change the definitions of old ones all the time; dictionaries are descriptive, not prescriptive.

    That doesn’t stop the concept of ‘agentic AI’ being a pile of bullshit peddled by snake-oil salesmen, of course, but you don’t have to be Shakespeare to be permitted to make up new words.



  • I think it’s in the nature of capital cities that they tend to attract quite a lot of people who want to try “life in the city” for a while and then move on? I’ve a few friends who moved down to London to see if they could make it in the music industry, which they did not, and then moved on to somewhere else with a less insane cost of living, after a decade or so. I’d observe that, while there’s quite a lot of Brits in London, there’s a massive shortage of Londoners. When people have kids, they generally want a bigger house somewhere with a decent school nearby, which in many cases means moving to the outskirts, or to a different city altogether.

    That’s very much to London’s benefit, though. They have everything that you can imagine: specialist shops of every variety, and opportunities in every industry. However, I don’t think ‘London weighting’ of wages is really sufficient. Even if the wages are, say, 20% more for doing the ‘same job’ as in the rest of the UK, you aren’t going to get a lot for that, and a lot of people in entry-level jobs are going to be living in big shared houses and struggling to scrape by, until they gain the experience or the inclination to leave. That’s a tale as old as time, though, and probably to the benefit of the city - without a massive turnover of people, wages would probably need to be even higher.

    Diversity is strength. If you don’t like that, then a bustling metropolitan capital city is not for you, and London is no exception. They’ve a nice bridge for the racists to throw themselves off; cry while you do, dickheads.


  • Did you install Ubuntu to Btrfs? (The installer offers it, though ext4 is the default.) In that case, you should have an @ subvolume with Ubuntu that’s mounted to /, and an @home subvolume that’s mounted to /home.

    Make a new subvolume, install a new operating system into it, and choose that subvolume in the bootloader - something like the sketch below - and you should be able to have Ubuntu and ‘your favourite OS’ (I use Arch btw) living side-by-side with the same home directory.
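
    A rough sketch of the subvolume side of that, assuming the filesystem lives on /dev/nvme0n1p2 and using @newos as a made-up name for the new subvolume - adjust both for your machine:

    ```
    # Mount the top level of the btrfs filesystem (subvolid=5 is always the root)
    sudo mount -o subvolid=5 /dev/nvme0n1p2 /mnt

    # Should list Ubuntu's @ and @home subvolumes
    sudo btrfs subvolume list /mnt

    # Create a subvolume for the second distro to live in
    sudo btrfs subvolume create /mnt/@newos
    ```

    Then install the second OS with its root on @newos (rootflags=subvol=@newos on the kernel command line of its bootloader entry) and mount the same @home at /home in its fstab.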




  • I had one of the Macintosh iBook G4s with the notoriously shitty graphics card soldering. Early days of lead-free soldering. Mine started to fail just outside of warranty. The ‘fix’ was to put a lot of pressure on the chip so that all the connections were held in place, but that was quite difficult to do while it was still a laptop.

    Dismantled the damn thing, yeeted the plastic shell, and screwed the remains onto a sheet of plywood. Looked a lot like a pizza-box PC in the corner there. Got another couple of years out of it. Made it a lot more convenient for watching videos, since you could just prop the whole thing against a wall or whatever. A couple of USB extension leads meant that you could still use a mouse and keyboard in comfort.


  • Yeah. I’ve got MangoHud capping it at 36 fps for that reason (snippet below) - if it tries to run 4K @ 144 fps then my graphics card sounds like a Spitfire getting ready for take-off. It’s not a game that needs twitch response for any reason, so it’s not harmed by that.

    It’s an amazing game, but the graphics are a small part of that, which makes how badly it runs a bit of a mystery. Complicated lighting and long view distances in the Underdark? No problem. Just a row of houses in Act 3? Enjoy your stutters and framerate dips.
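
    For anyone wanting the same cap, it’s roughly this in the game’s Steam launch options (36 fps is just my number; pick your own):

    ```
    # Have MangoHud limit the framerate as well as draw its overlay
    MANGOHUD_CONFIG="fps_limit=36" mangohud %command%
    ```

    The same fps_limit=36 line in ~/.config/MangoHud/MangoHud.conf makes it apply everywhere rather than per-game.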


  • Reads like Intel will be using Nvidia’s stuff for integrated graphics, and says nothing at all about discrete graphics cards.

    If you’re integrating a GPU, then it’s either for a laptop, where performance-per-watt and total die size are very important, or for a generic business PC, where ‘as cheap as they can get away with’ takes over. The B580 might be the best mid-range graphics card, but those aren’t the areas where it shines. Using someone else’s tech makes sense.


  • I got myself a reMarkable after seeing a colleague use one and thinking they were cool. An astonishing price for what is essentially a Kindle that you can write on, but that’s the entirety of its functionality right there. No web browser, no ebook integration, no keyboard, just a thing for scribbling notes with a long battery life. No distractions.

    As such, it’s completely ideal for my work diary, meeting notes, D’n’D notes, maps for games that I’ve been playing, random scribbles, all sorts. Quite a lot lighter than the thousands of sheets of paper that would be required otherwise. Also not as rude as popping open a laptop when you’re meeting someone - they can see you’re just making notes and writing to-dos.



  • From the couple of games where I’ve been able to compare: frames per second are exactly the same, but the CPU runs a great deal cooler. No concern on a desktop, but that would make a difference on the Steam Deck.

    (Mark of the Ninja has a deadlocking issue on native that it doesn’t have on Proton; quite a few of Frictional’s games - Penumbra, Amnesia - just won’t open on my main monitor when native; and most recently Silksong has been really funny about my 8BitDo controller when native. Works great with my fightstick, though.)





  • 4K for me as a developer means that I can have a couple of source files and a browser with the API documentation open at the same time. I reckon I could legitimately use an 8K screen - get a terminal window or two open as well, keep an eye on builds and deployments while I’m working on a ticket.

    Now yes - gaming and watching video at 8K. That’s phenomenally niche, and very much a case of diminishing returns. But some of us have to work for a living as well, alas, and would like them pixels.
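
    Them pixels, quantified (standard UHD figures, nothing exotic):

    ```
    # 8K is exactly four 4K screens' worth of pixels
    echo $((3840 * 2160))   # 4K:  8294400  (~8.3 megapixels)
    echo $((7680 * 4320))   # 8K: 33177600  (~33.2 megapixels)
    ```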


  • Speaking as a developer: I’ve a 4K screen, which is amazing for having loads of source files open at the same time, and also works for old or undemanding games. Glorious Eggroll’s version of Proton has all the FSR patches in it, so you can ‘upscale anything’ (sketch below). Almost any modern game I’m going to be running at a lower resolution, usually either 1440p or the slightly odd 2954 x 1662. Generally, highest-quality graphics plus upscaling look better to me than medium-quality native, for games where I have to compromise.

    I would be interested in an 8K display for coding, as long as the price is reasonable - I’m not spending five grand, that would be crazy. But I’d still be upscaling for playing games, as basically no GPU could drive that many pixels.
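
    For reference, the launch options I mean for GE-Proton’s FSR patches - exact variable support differs a little between GE releases, so treat this as a starting point rather than gospel:

    ```
    # Run the game fullscreen at a lower resolution and let FSR upscale it;
    # sharpening strength ranges from 0 (sharpest) to 5
    WINE_FULLSCREEN_FSR=1 WINE_FULLSCREEN_FSR_STRENGTH=2 %command%
    ```

    Newer GE builds also accept WINE_FULLSCREEN_FSR_CUSTOM_MODE=2954x1662 for odd in-between resolutions like mine, if I remember rightly.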