• 0 Posts
  • 33 Comments
Joined 1 year ago
Cake day: June 12th, 2023




  • It’s an interesting angle, the hostility thing. People in the know have largely fallen out of love with Ubuntu, but imho that’s not necessarily because Ubuntu fell in quality, just that so many “better” things have come up since Ubuntu 4.10. It is definitely a sound choice for non-techy people, maybe more than ever. Personally I’d prefer (almost) any contemporary desktop over GNOME these days, but I can definitely see the appeal for others in terms of simple design language.

    Basically you can turn any old laptop into a Chromebook these days using Linux, and most people, just like your parents, most definitely do not need more than a functional web browser: basically a smartphone with a larger screen and a physical keyboard. Even if you don’t care about your privacy (or freedom from notification spam), why still pay the Microsoft tax?


  • Which are about as related as the knowledge required to mount drywall and the knowledge required to run a ham radio station. You tell me which is more complicated, but either way there are most certainly radio amateurs out there who don’t know the first thing about handiwork, and handymen who could barely find the on/off switch on a broadcast rig.



  • Oftentimes (and this is more of a general statement) throwing into Google exactly what you would otherwise type into your shell of choice should get you on the right track, e.g. searching for “man systemctl”.

    As far as the inability to reboot goes, if a regular sudo reboot can’t bring the machine back up either, then this is probably a hardware issue outside the sphere of the operating system’s influence. I can’t say I’ve experienced something like that myself. I guess the closest thing I’ve witnessed would be a computer that, when rebooted with an old USB keyboard plugged in, just refused to get past the POST screen. The keyboard worked fine if plugged in later, but the computer couldn’t reliably get through the boot process with the thing present. Maybe there’s a similar variable in your setup.
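    If you do want to rule the OS out before blaming hardware, it can be worth checking what got logged right before the machine went down. A minimal sketch, assuming a systemd-based distro with persistent journaling enabled (the journalctl flags are standard systemd tooling, the script itself is just an illustration):

```python
#!/usr/bin/env python3
"""Minimal sketch: dump the tail of the previous boot's journal.

Assumes a systemd-based distro with persistent journaling, so that
`journalctl -b -1` can still see the boot that failed to come back up.
"""
import subprocess


def previous_boot_tail(lines: int = 50) -> str:
    # -b -1      : the boot before the current one
    # -n <lines> : only the last <lines> entries
    # --no-pager : plain output instead of an interactive pager
    result = subprocess.run(
        ["journalctl", "-b", "-1", "-n", str(lines), "--no-pager"],
        capture_output=True,
        text=True,
        check=True,
    )
    return result.stdout


if __name__ == "__main__":
    print(previous_boot_tail())
```

    If the journal just shows a clean shutdown and the box still never came back up, that points even more strongly at firmware or hardware.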


  • I bought a used old-gen Sonos Connect about a year ago to integrate my Logitech Z906 into an existing pair of Sonos speakers. They made it deliberately tedious to downgrade those speakers (which had gotten the S2 “blessing”) back to S1 to make them work with the Sonos Connect. I’m an IT repair shop guy and I cursed all the way through that downgrade process.

    I would have gladly bought current hardware from them again if their prices were anywhere within the realm of plausibility. Credit where it’s due, that Sonos Connect hookup with the two wall-mounted first-party speakers works absolutely reliably. That company has just seriously lost its bearings since it engineered those parts.


  • I think if you’re talking wider demographics, your model OSes are (obviously) Windows and macOS. People buy into those because CLI familiarity isn’t required. Especially with Apple products, everything revolves around simplicity.

    I do dream of a day when Linux can (at least somewhat) rival that. I love Linux because I am (or consider myself) intricately familiar with it and I can (theoretically) change every aspect about it. But mutability and limitless possibilities are not what makes an OS lovable to the average user. I think the advent of immutable Linux distros is a step in the right direction for mass adoption. Stuff just needs to work. Googling for StackOverflow or AskUbuntu postings shouldn’t ever be necessary when people just want to do whatever they were doing on Windows with limited technical knowledge.

    However, on another note: if you’re talking about a home studio migration, I’m not sure what that entails, but it sounds rather technical. I don’t want to be the guy to tell you that CLI familiarity is simply par for the course. Maybe your work shouldn’t require terminal interaction. Maybe there is a certain gap between the absolutely basic Linux tutorials and the more advanced ones, like you suggest. Yet what I do want to say is that if you want to do repair work on your own car, that isn’t exactly supposed to be an accessible skill to acquire either. Even if there are videos explaining step by step what you need to do, eventually you still need to get your own practice in. Stuff will break. We make mistakes and we learn from them. That is the point I’m trying to get at. Not all knowledge can be bestowed from without. Some of it just needs to grow organically from within.



  • Not the guy you’re asking but I agree. There would be no need for Falcon Sensor on every Windows machine deployed inside an enterprise (assuming that Falcon Sensor serves a purpose worth fulfilling in the first place) if the critical devices on their network were sufficiently hardened. The main problem (presumably the basis for such a solution existing) is that as soon as you have a human factor, people who must be able to access critical infrastructure as part of their job, there will be breakages of some kind. Not all of those must be malicious or grow into an external threat. They still need to be averted, of course.

    I feel that CrowdStrike is an idea that seems appealing to those making technological decisions because it promises something that cannot be done by conventional means as we have known and deployed them before. I can’t say whether or how often this promise has ever enabled companies to thwart attacks at their inception, but again, I feel that in a sufficiently hardened environment, even with compromisable human actors in play, you do not need self-surveillance (at the deepest level of an OS) to this extent.

    And to also address OP’s question: of course there is no need for this in a *NIX environment. There hasn’t been any significant need for antivirus of any kind anywhere in the UNIX-based world, including macOS. So really this isn’t about whether an anti-malware solution in itself can satisfy the needs of a company per se; the requirements very much follow the potential attack vectors that are opened up by an existing infrastructure. In other words, when your environment is Windows-based, you are bound to deploy more extensive security countermeasures. Because they are necessary.

    Some may say that this is due to market share, but to those I say: has the risk profile of running a Linux-based server changed over the last 20 years? Linux servers have certainly become a lot more common in that timeframe. One example I can think of was a ransomware exploit against a Linux-based NAS brand, I think it was QNAP. This isn’t a holier-than-thou argument. Any system can be compromised. Period. The only thing you can ensure is that the necessary investment to break your system will always be higher than the potential gain. So I guess another way to put this is that in a Windows-based environment your own investment into ensuring said fact will always be higher.

    But don’t get me wrong, I don’t mean to say Windows needs to be removed from the desks of office workers. Really, this failure and all these photographs of publicly visible bluescreens (and all the ones in datacenters and server rooms that we didn’t see) show that Windows has way too strong a foothold in places where plenty of smart people are employed to find solutions that best serve the interests of their employers, including interests (i.e. security and privacy) that those employers are unaware of because they can’t be printed on a balance sheet.



  • Serious question: how do you get bored of Windows during its heyday?

    My first experience with Linux was Ubuntu 4.10, and it seemed super cool and all, but I could’ve never switched fully in those days. And if we’re honest, most legit Linux users up until not too long ago were forced to have a dual-boot setup because so many things just hadn’t been universalized yet.

    So, just to illustrate where I’m coming from in asking that question: my first personal computer (as opposed to the family PC) ran XP, and that was a pretty exciting time when it comes to market dominance and all the advantages that came with being a user of the biggest platform. Looking back, I just don’t see how I could’ve ever made that switch in the noughties, let alone the 90s. The adoption just wasn’t there yet.



  • Ah yea I didn’t realize the official dock has 2 ports for display output. Valve is bae.

    There are definitely docks that have 3 display outputs, which would be a viable option if you also buy the Wacom Link Plus. I personally don’t know of any docks that have 2 display outputs plus a USB-C port that is display-capable. There may be Thunderbolt ones, but the Steam Deck doesn’t do Thunderbolt, unfortunately.

    So yea, I guess your only option is a different dock plus the Wacom Link Plus. I don’t see any other option, personally.


  • So the setup is currently two external monitors? Or does that include the Deck’s screen? Is the USB connection to the Wacom just for pen input, or does it carry the image as well? If USB-C is used as the monitor port, it most definitely will not work with USB-A of any kind, not even USB-A 3.1. You either need a different dock with a display-capable USB-C port, or you need the Wacom Link Plus (which means you probably also need a different dock with at least 2 HDMI ports, or one HDMI and one DisplayPort).





  • Mark my words: don’t ever use SATA-to-USB for anything other than (temporary) access to non-critical preexisting data. I swear to god, if I had a dollar for every time USB has screwed me over while trying to simplify working with customers’ (and my own) drives. Whenever it comes to anything more advanced than data-level access, USB just doesn’t seem to offer the necessary utilities. Whether this is rooted in software, hardware or both, I don’t know.

    All I know is that you cannot realistically use USB to, for example, carbon-copy one drive to another (see the verification sketch at the end of this comment). It may end up working, it may throw errors letting you know that it failed, or it may only seem to have worked in the end. It’s hard for me to imagine, with all the individual devices I’ve gone through, that this is somehow down to the parts and that somewhere out there is something better that actually makes this work. It really does feel like whoever came up with the controlling circuits used for USB-to-SATA conversion industry-wide just didn’t do a good enough job of implementing everything in a way that makes it wholly transparent from the view of the operating system.

    TL;DR: If you want to use SATA as intended, you need SATA all the way to the motherboard.

    tbh I often ask myself why eSATA fell by the wayside. USB just isn’t up to these tasks in my experience.
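    Since “it may only seem to have worked” is the nastiest failure mode, here’s a minimal sketch of how I’d sanity-check a finished carbon copy. The device paths are placeholders and it assumes Linux block devices readable as root, so treat it as an illustration rather than a recipe:

```python
#!/usr/bin/env python3
"""Minimal sketch: check whether a 'carbon copy' actually matches its source.

/dev/sdX (source) and /dev/sdY (copy) are placeholder paths; run as root.
Only the first `size of source` bytes are compared, so a larger destination
drive does not cause a false mismatch.
"""
import hashlib
import os

CHUNK = 4 * 1024 * 1024  # read in 4 MiB chunks


def hash_first_bytes(path: str, length: int) -> str:
    """SHA-256 over the first `length` bytes of a block device or file."""
    digest = hashlib.sha256()
    remaining = length
    with open(path, "rb") as dev:
        while remaining > 0:
            chunk = dev.read(min(CHUNK, remaining))
            if not chunk:  # device ended early; hash whatever we got
                break
            digest.update(chunk)
            remaining -= len(chunk)
    return digest.hexdigest()


if __name__ == "__main__":
    src, dst = "/dev/sdX", "/dev/sdY"  # placeholders: source drive, copy

    # Size of the source device; the copy only has to match up to this point.
    with open(src, "rb") as f:
        size = f.seek(0, os.SEEK_END)

    if hash_first_bytes(src, size) == hash_first_bytes(dst, size):
        print("clone matches source")
    else:
        print("MISMATCH: the copy is not identical to the source")
```

    On a direct SATA connection this kind of check is boring and always passes; over some USB bridges it’s exactly where the “seemed to work” clones fall apart.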