  • They are excellent in the hobby world. Whenever you need a bit of quick logic, an ESP32 can be dropped in to do it, e.g. changing the colour of an LED depending on a sensor reading (see the sketch below).

    They also form the core of a lot of IoT devices: simple sensors and relays that can connect to WiFi and throw up a simple web interface. ESPHome, Tasmota and WLED exist to make this extremely easy.

    They are basically the hobbyist’s electronic multitool. Powerful enough to do most jobs without bothering with code optimisation. Cheap enough to throw in and leave there.
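
    As a minimal sketch of the LED example above (MicroPython, which runs on the ESP32; the analog sensor and the pin numbers are assumptions for illustration, not a tested build):

    ```python
    # Fade an LED from green to red as an analog sensor reading rises.
    # Hypothetical wiring: sensor on GPIO 34, red/green LED legs on GPIOs 25/26.
    from machine import ADC, Pin, PWM
    import time

    sensor = ADC(Pin(34))            # ADC-capable input pin (assumed)
    sensor.atten(ADC.ATTN_11DB)      # full 0-3.3 V input range
    red = PWM(Pin(25), freq=1000)
    green = PWM(Pin(26), freq=1000)

    while True:
        raw = sensor.read()          # 0..4095 on the ESP32 ADC
        duty = raw * 1023 // 4095    # scale to the 10-bit PWM duty range
        red.duty(duty)               # more red as the reading rises
        green.duty(1023 - duty)      # less green
        time.sleep_ms(100)
    ```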



  • I can think about ideas without either visualisations or inner monologue. My inner monologue is mainly for mapping ideas to a “transmittable” state. I also have to force visualisations.

    The best description I can give is multiple interacting “data streams”. E.g. a cat won’t be an image of a cat; it will be a mapping vaguely akin to how a computer game tracks things: a collection of data on pose, limb length, joint angles, etc. It doesn’t use actual numbers; it’s akin to how you know the angle of your elbow without knowing the angle in degrees.

    My inner monologue’s main use is mapping from this internal data blob into something that I can explain to others.





  • It’s anecdotal, but I’ve heard that Linux bug reports are actually a problem for some game developers. When 1% of your customer base submits 10-20% of your bug reports, middle managers get upset. Apparently several games have had Linux support dropped because of this.

    While Linux often has more bugs in games (and so more reports), Linux users have also been conditioned to report bugs more thoroughly. That conditioning helps a lot in FOSS development and the like.


  • While Putin was likely acting in their interests, the current situation has gone completely pear-shaped on that front. Putin is stuck: if he backs down, he’s dead; if he doesn’t win, he’s dead. He’s currently riding the limbo between those outcomes, hoping for a third option.

    If he died, the powers behind him would likely take the chance to disengage. The current situation is bad for business, and plans need to be rethought. It wouldn’t fix things long term, but short term they would likely back down.


  • There’s a lot more to teaching than just good explanations. However, I do enjoy trying to explain complex science in more understandable ways.

    As for struggling, we all do at times; pushing through is how we get better. Also, science is a little like a spider web: if you look closely at just a few strands, they don’t make obvious sense. It’s only when you build up a broader picture that it becomes obvious and easy. Building that picture, unfortunately, requires pushing through the “what the hell, I can’t make sense of this!” stage.


  • It would be a mix of relative rates and exact energies.

    If you pick an area of “empty” space where you expect very little dark matter, you get a baseline reading. When you aim at an area expected to be dense in dark matter, you expect a higher reading, e.g. 10 counts per day vs 100 per day (the sketch at the end of this comment runs these numbers). This is basically how radiation detection works on Earth, so the maths is well studied.

    The other thing is energy levels. An electron and a positron annihilating produce photons with a distinct energy (511 keV each). It will vary upwards slightly, due to kinetic energy, but not by much. We also know the annihilation energies of other forms of matter from Earth-based experiments. A reading distinct from anything normal would be a good signature of an unknown type of matter annihilating.

    There are also extra complications from things like redshift, but those can be measured in other ways and corrected for.

    The order of theory and discovery also helps. “Finding X that happens to support Y” is a lot weaker than “predicting X from theory Y, then going and finding it”. If you run a million experiments, a one-in-a-million result is quite likely by pure fluke; a one-in-a-million result from a single, focused experiment is a lot more powerful.
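
    To put numbers on both points, here’s a rough Python sketch using the made-up rates above (10 counts/day background vs 100 counts/day signal; purely illustrative, not from any real detector):

    ```python
    # Poisson surprise: how unlikely is 100 counts/day if the
    # "empty sky" background pointing gives 10 counts/day?
    from scipy.stats import poisson

    background_rate = 10   # counts/day at the baseline pointing (illustrative)
    observed = 100         # counts/day at the dark-matter-dense pointing
    p_excess = poisson.sf(observed - 1, background_rate)  # P(X >= 100 | mu = 10)
    print(f"P(background alone gives >= {observed} counts) = {p_excess:.1e}")

    # Look-elsewhere effect: a one-in-a-million fluke is *expected*
    # somewhere if you run a million independent experiments.
    p_fluke = 1e-6
    n_experiments = 1_000_000
    p_at_least_one = 1 - (1 - p_fluke) ** n_experiments
    print(f"Chance of at least one fluke in {n_experiments:,} tries: {p_at_least_one:.0%}")
    ```

    That second number comes out around 63%, which is why a single, focused experiment carries so much more weight than a trawl.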


  • In short: something is wrong with the rotation of galaxies. There is more mass than we can account for, and it’s in the wrong places.

    Either the laws of gravity are slightly wrong, or there is something out there with mass but almost no interaction with other matter (light in particular).

    More recent, more detailed studies have shown that the discrepancy is not consistent. Therefore either the laws of physics vary from galaxy to galaxy (very unlikely) or it’s something physical, rather than an error in the laws.

    That leaves dark matter, the leading candidates being WIMPs (Weakly Interacting Massive Particles). They don’t seem to interact with electromagnetism at all, and even any strong- or weak-force interaction is minimal. As far as we can tell, they interact only gravitationally.

    We know the interactions are minimal from gravity mapping. Dark matter seems to form a cloud around galaxies, rather than collapsing in. To collapse, the particles must interact to exchange momentum; if they interact only via gravity, that collapse will be extremely slow.

    That is most of what we can be fairly sure of. There’s a lot of speculation around this, and we might be barking up the wrong tree completely. However, dark matter via WIMPs seems to be the most consistent with the evidence right now.

    Edit to add.

    This experiment seems quite ingenious. It assumes that WIMPs are a mix of both matter and antimatter. Every so often a matter/antimatter pair get close enough to annihilate, creating a pair of gamma photons. Detecting these would help back the existence of physical WIMPs. The photon energy would also tell us something of their mass (photon energy = rest-mass energy + kinetic energy; see the sketch below). That would help narrow down where to look in our particle accelerator data.
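
    As a sketch of that energy bookkeeping (an idealised two-photon annihilation viewed in the centre-of-momentum frame, with χ standing for the hypothetical WIMP):

    ```latex
    % Each annihilating particle brings rest-mass energy plus a small
    % kinetic energy T; the total splits equally between the two photons:
    E_\gamma = m_\chi c^2 + T_\chi
    % With T small, a measured gamma line at E_gamma pins down m_chi,
    % just as electron-positron annihilation gives the 511 keV line.
    ```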




  • It might also be a single dev who pushed for it. With only a 1-3% market share, the company is unlikely to put resources into it. That one dev getting any working version out is a win in many ways.

    Also, most Linux users are a lot better trained at reporting bugs. Most of the time this is a good thing, letting bugs get fixed in FOSS development setups. Unfortunately, in gaming, it ends up making Linux look like a buggy mess. When 60% of your bug reports come from 0.5% of your users, companies can panic, even if the same bugs exist in Windows and no one there bothers to report them.


  • I’ll take compatible.

    Most people game on Windows. Its monolithic nature also means that they will mostly encounter the same bugs.

    Linux has a far wider spread of configurations: a bug might show up only on Debian, not Ubuntu.

    End result: they spend 60% of their effort solving bugs for 2% of their user base. That’s not cost-viable.

    Compatibility means they just have to focus on one codebase. All we ask is that they don’t actively break that compatibility. This is far less effort, and a lot easier to sell to the bean counters.

    Once Linux has a decent share, we can work on better universal standards. We likely need at least 10% to even get a chance there.