• queermunist she/her@lemmy.ml · 57 points · 1 day ago

    Yeah, I think the focus should be on technological sovereignty, not abstinence. We need control over our data, our software, our devices, and our hardware, and through these we can gain control over our lives while still accessing these extremely useful tools. We need our own search engines, our own operating systems, our own applications, our own email, our own social media, our own video hosting, and so on. We can never go back; the only way out is through.

    This is extremely hard and expensive, though. It’ll require the mass organization of millions of people; we can’t do it as individuals.

    • survirtual@lemmy.world · 14 points · edited · 1 day ago

      That’s correct. We can’t put the genie back in the bottle. We have to increase our mastery of it instead.

      The core relationship is simple but needs to be redefined: remote compute services do not assign numbers to us; we provide them with identities we create.

      All data allowances are revocable. Systems need to be engineered to make the flow of data transparent and easy to manage.

      No one can censor us to other people without the consent of the viewer. This means moderation needs to be redefined. We subscribe to moderation, and it is curated towards what we individually want to see. No one makes the choice for us on what we can and cannot see.
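      One way to picture subscription-based moderation is as a purely client-side filter: nothing is ever hidden from a viewer unless a moderation list that viewer chose to subscribe to flags it. This is a hypothetical sketch; the function and data shapes are my own illustration, not any existing platform's API.

```python
# Hypothetical sketch of opt-in moderation as a client-side filter.
# Each subscribed modlist is a set of post IDs that list hides.

def visible(post_id: str, subscribed_modlists: list[set[str]]) -> bool:
    """Show a post unless one of the viewer's chosen lists hides it."""
    return not any(post_id in modlist for modlist in subscribed_modlists)

# A viewer with no subscriptions sees everything:
print(visible("post-1", []))            # True
# A viewer subscribed to a list that hides post-1 does not:
print(visible("post-1", [{"post-1"}]))  # False
```

      The key design property is that the default (no subscriptions) shows everything, so censorship requires the viewer's consent.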

      This, along with much more in the same line of thinking, is what’s needed: power back to the people, entrenched by mastery.

      The more you think like this, the clearer the pattern becomes, and you know what technology to look for. The nice thing is that all of this is possible right now, at our current tech level. That can bring a lot of hope.

        • survirtual@lemmy.world · 1 point · 3 hours ago

          That is just the tip of the iceberg with the moderation framework I have in mind.

          Anyone can become a moderator by publishing their block / hide list.

          The more people who subscribe to a moderator or moderator team, the more “votes” they get toward becoming the default moderation profile for a topic (whatever that is on the given platform: a subreddit on Reddit, etc.).
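          The "default moderator" election described here amounts to a plurality count over subscriptions. A minimal sketch, with made-up names and a simple user-to-moderator mapping as the assumed data shape:

```python
from collections import Counter

def default_moderator(subscriptions: dict[str, str]) -> str:
    """subscriptions maps user -> the mod team they subscribe to;
    the team with the most subscribers becomes the topic default."""
    votes = Counter(subscriptions.values())
    return votes.most_common(1)[0][0]

subs = {"alice": "modteam-a", "bob": "modteam-a", "carol": "modteam-b"}
print(default_moderator(subs))  # modteam-a
```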

          By being subscribed to one or more moderation teams, when you block or hide something, the action is sent to the report queues of the teams you subscribe to. They can then review the content and decide whether to block or hide it for all their subscribers.
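          That fan-out from a user action to the report queues of their subscribed teams could look like this. A hedged sketch; the queue structure is an assumption for illustration:

```python
from collections import defaultdict

# Each mod team's report queue holds (reporting user, post) pairs.
report_queues: dict[str, list[tuple[str, str]]] = defaultdict(list)

def report_hide(user: str, post_id: str, user_subscriptions: list[str]) -> None:
    """Fan a user's hide action out to every team they subscribe to."""
    for team in user_subscriptions:
        report_queues[team].append((user, post_id))

report_hide("alice", "post-9", ["modteam-a", "modteam-b"])
print(report_queues["modteam-a"])  # [('alice', 'post-9')]
```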

          When content is blocked or hidden by a large enough mod team, its author is notified and can file an appeal. The appeal works like a trial: it is distributed among the more active users whose block and hide behavior lines up with the moderation collective.

          An appeal goes through multiple rounds of analysis by randomly selected users who participate in review. Reviewers are given the user context and all relevant data needed to make a decision, and they can leave decision comments for the appealing user to read.
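          A single appeal round, as I read it, is a random jury draw plus a majority vote. This sketch assumes a `vote` callback standing in for however a real system would collect reviewer decisions:

```python
import random

def run_round(active_users: list[str], jury_size: int, vote) -> bool:
    """One appeal round: draw a random jury, return True if a majority
    of jurors vote to uphold the block (vote(juror) -> bool)."""
    jury = random.sample(active_users, jury_size)
    upholds = sum(1 for juror in jury if vote(juror))
    return upholds > jury_size // 2

users = [f"user{i}" for i in range(20)]
# With a unanimous electorate the outcome is independent of the draw:
print(run_round(users, 5, lambda juror: True))  # True
```

          Multiple rounds would just chain this, escalating to a fresh (or higher-tier) jury when a decision is appealed again.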

          All of this moderation has a “karma” attached to it: when people make decisions in line with the general populace, they earn more justice karma, and that creates a ranking.
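          The karma rule reduces to: after each decision, reward reviewers whose vote matched the final outcome. A minimal sketch with an assumed +1/-1 scoring, which the ranking in the next paragraph could then sort on:

```python
def update_karma(karma: dict[str, int],
                 votes: dict[str, bool],
                 outcome: bool) -> None:
    """+1 justice karma for voting with the final outcome, -1 against."""
    for reviewer, vote in votes.items():
        karma[reviewer] = karma.get(reviewer, 0) + (1 if vote == outcome else -1)

karma: dict[str, int] = {}
update_karma(karma, {"alice": True, "bob": False}, outcome=True)
print(karma)  # {'alice': 1, 'bob': -1}
```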

          Those rankings can be used to build a tiered justice system that selects the best representative sample of how a topic’s community wants justice applied. Higher-ranking moderators are selected for higher-tier decisions: if a lower-level appeal decision is appealed again, it lands in their queue, and they can choose whether to take the appeal.

          All decisions are public for the benefit of users and accountability of moderators.

          When a user doesn’t like a moderator’s decision, they can unblock or unhide the content, and that counts as a vote against the moderator. This is where it gets interesting, because it forms a graph of desired content with branching decision logic. Follow that train of thought and you get some very fascinating results: everyone ends up with a personally curated content tree.
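          The override-as-vote mechanic is the feedback edge in that graph: restoring a post a moderator hid simultaneously fixes the user's own view and scores against that moderator. A hypothetical sketch of both effects in one step:

```python
def unhide(user_hidden: set[str],
           mod_scores: dict[str, int],
           post_id: str,
           moderator: str) -> None:
    """Restore a post for this user and count a vote against the
    moderator whose decision was overridden."""
    user_hidden.discard(post_id)
    mod_scores[moderator] = mod_scores.get(moderator, 0) - 1

hidden = {"post-3"}
scores: dict[str, int] = {}
unhide(hidden, scores, "post-3", "modteam-a")
print(hidden, scores)  # set() {'modteam-a': -1}
```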

          Some will have a “cute” internet, filled with adorable content. Some will have a “violent” internet, filled with war videos and martial arts. Some will have a “cozy” internet, filled with non-triggering safe content. And we will be able to share our curations and preferences so others can benefit.

          There is much more to it, but the system would make moderation not just more equitable, but more scalable, transparent, and appreciated. We’d be able to measure moderators and respect them while honoring the freedom of individuals. Everyone would win.

          I see a future where we respect the individual voices of everyone and make space for all to learn and grow. Where we can decide what we want to see and share without constant anxiety. Where everything is so fluid and decentralized that no one can be captured by money or influence, and when someone is, we have the tools to swiftly branch away with minimal impact. Passively democratic online mechanisms.