• 0 Posts
  • 78 Comments
Joined 1 year ago
Cake day: August 18th, 2023

  • I agree that most people won’t care, but I take issue with calling them “dumb”. Everyone has a limited amount of time on this planet to build skills and chase hobbies. A lot of people on this site have tech-related jobs and hobbies, so of course this matters to us. I might expect someone who buys pre-built gaming PCs to keep this on their radar, but the vast majority of folks who use computers as email and social media machines, including those who only use them for data-entry jobs, have little reason to care about the specifics of their CPU or any other single component of their computer. If their computer breaks, that’s annoying, but that’s life. They’ll spend the same amount on a new laptop as we might spend on a new CPU and get on with their day.

    I don’t know what brand of spark plugs are in my car, and maybe a mechanic or car enthusiast would find that dumb. But hey, I’m too busy caring about my CPU to spend time worrying about my car unless it breaks.




  • Part of the problem is that we have relatively little insight into or control over what the machine has actually “learned”. Once it has learned itself into a dead end with bad data, you can’t correct it, only work around it. Your only real shot at a better model is to start over.

    When the first models were created, we had a whole internet of “pure” training data made by humans, and developers could basically blindly firehose all that content into a model. Additional tuning could be done by seeing which responses humans tended to reject or accept, and what language they used to refine their results. The latter still works, and better heuristics (the criteria that grade the quality of AI output) can be developed, but with how much AI content is out there, they will never have a better training set than the one they started with. The whole of the internet now contains the results of every dead end AI has worked itself into, with no way to determine at scale what is AI generated.
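
    To make the “better heuristics” idea concrete, here’s a toy sketch in Python (every name here is invented for illustration, not any real pipeline) of grading candidate training text before it enters a training set:

        # Toy quality heuristic for filtering candidate training data.
        # Purely illustrative; real pipelines are far more involved.

        def quality_score(text: str) -> float:
            """Penalize very short or highly repetitive documents."""
            words = text.split()
            if len(words) < 20:
                return 0.0
            return len(set(words)) / len(words)  # higher = less repetitive

        def filter_corpus(corpus: list[str], threshold: float = 0.5) -> list[str]:
            """Keep only documents whose score clears the threshold."""
            return [doc for doc in corpus if quality_score(doc) > threshold]

    The catch is that no score like this reliably separates human writing from fluent AI output, which is exactly why the contamination can’t be filtered back out.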


  • I think the rub here is that most developers aren’t developing/publishing their own software, but honing their skills writing proprietary code while also putting food on the table. To that end, a permissively licensed library is better because the company will actually use it, and the developer will gain experience with it that they can then apply outside the proprietary environment to contribute to FOSS projects (some of which may well use the GPL). If a GPL end-user product gets popular enough, it will eventually be able to draw on all of that gained experience to compete with the proprietary alternatives, so I do think the two can work in tandem.



  • I think we generally agree, but I worry that a new platform couldn’t do more than GoG+Lutris already do. Perhaps, though, it could be done with a reputable foundation.

    And the lawsuit is more or less what I was referring to with Steam’s price rules. I would definitely be on board with striking the requirement for publishers to offer the same price on all platforms at the same time.

    On that note, though, I wouldn’t take the whole case at face value, as I think parts of it are pretty frivolous (unless they prove that Steam is actually actively stifling competition and, you know, not just a decent platform that entered the space first). I also think it’s silly to point out Epic’s lower commission rate, since they’ve been giving out free games like candy and actually making third-party games exclusive to their platform in a very clear attempt to compete with Steam. There’s absolutely no guarantee that they won’t raise their commission once they have a foothold in the market (though I do concede that their licensing terms for Unreal Engine have remained fairly reasonable).


  • On the one hand, yeah it’s absolutely important not to idolize any company, because they have no sense of loyalty or generosity. Telling yourself otherwise is a guaranteed path to disappointment.

    On the other hand, of all the shit sandwiches we’ve been served, Steam is one of the fresher ones. Though they developed Proton for their own benefit, it’s pretty undeniable that it has made gaming on Linux way more viable than it has ever been, and it’s open source. I mean no shade to FOSS solutions like Lutris, but having paid developers work on a project full-time certainly has its advantages.

    I do think that the concerns about Steam’s pricing rules are valid, as are gripes with its DRM for first party games. But, overall, they’ve brought a lot of convenience to PC gaming that is hard to find elsewhere in the gaming world.



  • I definitely agree, but that’s true of any system. The particulars of the pitfalls may vary, but a good system can’t overpower bad management. We mitigate the stakeholder issue by having BAs who act as the liaison between devs and stakeholders, knowing just enough about the dev side to manage expectations while helping to prioritize the things stakeholders want most. Our stakeholders are also, mercifully, pretty aware that they don’t always know what will be complex and what will be trivial, so they accept the effort we assign to items.


  • Honestly, I’m a little confused by the hatred of agile. As with anything that is heavily maligned or exalted in tech, it’s a tool that may or may not work for your team and project. Personally, I like agile, or at least the version of it that I’ve been exposed to. No days or weeks of design meetings, just “hey, we want this feature,” and it’s in an item and ready to go. I also find effort points to be one of the fairer ways to gauge dev performance.

    “Projects where engineers felt they had the freedom to discuss and address problems were 87 percent more likely to succeed.”

    I’m not really sure how this relates to agile. A good team listens to the concerns of its members regardless of what strategy they use.

    “A neverending stream of patches indicates that quality might not be what it once was, and code turning up in an unfinished or ill-considered state have all been attributed to Agile practices.”

    Again, I’m not sure how shipping with bugs is an agile issue. My understanding of “fail fast” is “try out individual features to quickly see if they work instead of bundling them into a large update,” not “release features as fast as possible even if they’re poorly tested and full of bugs.” Our team got itself into a “quality crisis” while using agile, but we got back out of it with the same system. The fix was way more about improving QA practices than about the methodology itself.
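
    To illustrate what I mean by trying out individual features, here’s a minimal feature-flag sketch (the flag name and rollout fraction are made up): one change ships to a slice of users and can be evaluated, and rolled back, on its own.

        import hashlib

        # Hypothetical flag rolled out to ~10% of users.
        FLAGS = {"new_checkout_flow": 0.10}

        def is_enabled(flag: str, user_id: int) -> bool:
            """Deterministically bucket a user into the rollout fraction."""
            digest = hashlib.sha256(f"{flag}:{user_id}".encode()).digest()
            return digest[0] / 255 < FLAGS.get(flag, 0.0)

    If the feature misbehaves, you zero out the fraction instead of reverting a whole release.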

    The article kinda hand-waves the fact that the study was not only commissioned by Engprax but also published by the author of the book “Impact Engineering,” conveniently available on Engprax’s site. Not to say this necessarily invalidates the study, or that agile hasn’t had its fair share of cash grabs, but it makes me doubt the objectivity of the research. Granted, Ali seems like he’s no hack when it comes to engineering.


  • Windows until I went to college for development and decided to check out this Linux thing. At the time, I wanted something as different from Windows as possible, so I went with Ubuntu with Gnome 3 (I know) for about a year. Tried out Fedora, couldn’t get my sound to work and accidentally uninstalled the desktop environment trying to fix it, slunk back to Ubuntu, tried Debian briefly, and eventually ended up on Linux Mint with Cinnamon and KDE.

    At one time I really wanted to try a bunch of stuff and probably would’ve hopped a lot more if Fedora hadn’t shattered my confidence, but nowadays I want as little disruption between machines as possible. I have to use Windows for work, so I keep my Linux setup pretty vanilla so that I don’t miss features much between the two. I’ll probably still play with other distros every now and then on old laptops, but I’ve fallen into an “if it ain’t broke” mindset with my daily machines.