• verstra@programming.dev · 6 points · 7 days ago

    Ok, good point: most languages I know use the “C-style sequential function-calling” paradigm. Is there a specific idea you have for a language that would better utilize our CPUs?

    Notation that treats asynchronous message-passing as fundamental rather than exceptional.

    I’m pretty sure there exists at least one research paper about notation for the actor pattern.
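
    For a concrete taste of that style, here is a minimal sketch using Go’s channels, purely as an illustration of message-passing treated as core notation (the counter example and names are mine, not from any particular paper):

    ```go
    package main

    import "fmt"

    // counter acts like a tiny actor: it owns its state (total) and
    // interacts with the outside world only through messages.
    func counter(inc <-chan int, read chan<- int) {
        total := 0
        for {
            select {
            case n := <-inc:
                total += n // state changes only in response to a message
            case read <- total: // replying is itself just sending a message
            }
        }
    }

    func main() {
        inc := make(chan int)
        read := make(chan int)
        go counter(inc, read)

        inc <- 3
        inc <- 4
        fmt.Println(<-read) // prints 7: no locks, no shared mutable state
    }
    ```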

    You explain pretty well why you don’t think C is a good fit for the hardware we have today, but that warrants a proposal for something better, because I for sure don’t want to debug programs where everything happens in parallel (as it does in Pong).

  • webghost0101@sopuli.xyz · +7 / −5 · edited · 7 days ago

    Sorry, but even if this was written by a human, AI has ruined this kind of sentence for me:

    “This is Hardware Stockholm Syndrome: we optimized the hardware for C, then pointed at the hardware and said “see, C is efficient!” We forgot we made it that way.”

    Also, so-many-dashes. Even if the “human author” fact-checked all the details, it reads like slop and I can’t get through it.

    • azolus@slrpnk.net · 15 points · 7 days ago

      I hate that people now associate dashes with AI, as I used to really like using them (distinguishing between regular hyphens, en dashes, and em dashes, and using them to make structure clearer than regular commas would).

      • webghost0101@sopuli.xyz · 5 points · edited · 7 days ago

        Don’t get me wrong, I like using dashes too.

        I actually have a contract to sign which requires an em dash that changes the interpretation drastically. I am struggling to get the author to realise its importance: I have received 3 updated versions that did not include it, and that is after I replied with a self-fixed document the first time.

        But this article really does use them a lot, and that’s not the only tell, just a more obvious one.

        I don’t know how accurate ZeroGPT is, but I gave it the full text and it returned 100% AI-written, not even a mix.

      • HelloRoot@lemy.lol · 5 points · edited · 7 days ago

        keep doing it but add some fitting humour and a typo here and therw - and nobody will think you’re ai

        DON’T LET EM TAKE AWAY OUR DASHES

      • webghost0101@sopuli.xyz · 4 points · edited · 7 days ago

        A piece of writing being thoughtfully put together is far from inconsequential for me.

        I use a premium-tier AI myself and am not against using it for assistance; it can craft a decent snippet (that still needs multiple manual edits), but not at all a full, coherent text that reads efficiently.

        It applies structure without understanding the goal of the text, resulting in a paragraph salad.

        It simultaneously treats the reader like a toddler, with oversimplified metaphors, while also overcomplicating things for no other apparent reason than filling a word quota.

        The above article is twice the length it needs to be. It’s lazy, lacks actual understanding, and feels “sloppy” in the original meaning of the word.

        Having read more of the text, I feel my original comment was way too forgiving. Even the opener does not make sense if you try to digest it. “Silicon.” It even includes misinformation: stack management existed before C was a thing.

        • BananaIsABerry@lemmy.zip · 3 points · 7 days ago

          All I’m saying is that the suspected use of AI shouldn’t be the reason you don’t like it. Instead, dislike it because of all the points you made about the article.

          I think it’s safe to argue that most news articles are not thoughtfully put together, regardless of the use of AI. Bad news articles existed before AI and will continue to exist long after.

          • webghost0101@sopuli.xyz · 2 points · 7 days ago

            That’s fair.

            Ironically, I probably could have worded my criticism better myself.

            It’s not because it’s AI that I don’t like it, but rather because it has all the sloppy patterns I’ve started to recognize as prevalent in AI writing.

            Some of those become increasingly jarring, but only because I pay a perhaps disproportionate amount of attention to them. Bad human writers have an advantage in that their bad structure is at least unique to their own writing.

      • webghost0101@sopuli.xyz · 1 point · edited · 7 days ago

        The patterns are “we have this, we did this, now we believe this thing that is wrong” and “says what it is not, then says what it is instead”.

        Sometimes it’s a combination of the two.

        In principle there is nothing wrong with such sentences, but LLMs tend to heavily overuse them, and it becomes very formulaic.

        Ask an LLM to explain any concept and you are bound to find examples of it. Tell it how it made a logical error and you’re almost guaranteed to see an example of it.

        Over the entire text I count about 10 variations of that pattern.