• @[email protected]
        6
        3 months ago

        I recall somebody’s working on actual E2EE Mastodon DMs, but I couldn’t give you details. I guess when it’s ready we’ll know, once people start using it.

      • @[email protected]
        3
        3 months ago

        Seems if the messages are sent in an inherently insecure fashion, all one would need to do is set up an instance that purposefully does not filter out all the things it’s supposed to be kind/competent enough to filter out, and boom it has everything.

        • @[email protected]
          1
          3 months ago

          It’s not “inherently insecure”, at least not to that degree. (One could argue that the lack of E2EE is insecure.) If you stand up an unrelated instance, you shouldn’t be able to access private messages that don’t involve an account on your instance. So only bugs in your instance, or your conversation partner’s instance, could leak those messages.
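
          To make that concrete, here’s a rough sketch (TypeScript, with made-up actor and instance URLs) of how a Mastodon-style DM federates over ActivityPub: the Create activity is addressed only to the recipient actor, never to the public collection or a followers collection, so it only ever gets delivered to the instances that host the participants. Real servers also resolve the inbox from the actor document and HTTP-sign the delivery request; both are skipped here.

          interface Activity {
            "@context": string;
            id: string;
            type: "Create";
            actor: string;
            to: string[]; // explicit recipients only -- no "as:Public", no followers collection
            object: { type: "Note"; content: string; to: string[] };
          }

          const dm: Activity = {
            "@context": "https://www.w3.org/ns/activitystreams",
            id: "https://instance-a.example/activities/123",
            type: "Create",
            actor: "https://instance-a.example/users/alice",
            // Only Bob is addressed; the public collection is deliberately absent,
            // which is what makes this a "direct"-visibility post.
            to: ["https://instance-b.example/users/bob"],
            object: {
              type: "Note",
              content: "hi, this should only ever reach instance-b",
              to: ["https://instance-b.example/users/bob"],
            },
          };

          // Server-to-server delivery: POST the activity to each addressee's inbox.
          async function deliver(activity: Activity): Promise<void> {
            for (const recipient of activity.to) {
              const inbox = `${recipient}/inbox`; // simplification of real inbox discovery
              await fetch(inbox, {
                method: "POST",
                headers: { "Content-Type": "application/activity+json" },
                body: JSON.stringify(activity),
              });
            }
          }

          deliver(dm).catch(console.error);

          An unrelated instance never appears in the recipient list at all, so it’s never sent the activity in the first place; to see those messages it would have to exploit a bug on one of the participating instances, which is the point above.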

  • @[email protected]
    17
    3 months ago

    If we hit these AI companies with targeted lawsuits, like how Scientology got their way with the IRS, maybe then they’ll listen and stop stealing our shit.

    The MPAA and RIAA have created all these laws and used our own government against us. Maybe we can use those same laws and do the same to them.

    • @[email protected]
      6
      3 months ago

      Maybe we have some bias on this topic, but I had the same thought. Maven is such a well-known tool in IT that I’m surprised they just created a social network with the same name. Until they get a bit famous, this won’t be good for their SEO.

  • @[email protected]
    5
    3 months ago

    I wouldn’t have a problem with all this scraping if these companies had to release the models trained on this data as open source.

    • @[email protected]
      4
      3 months ago

      That’s a great idea. Can we not apply a license to that social content that forces AI models trained on it to be open source?

      • @[email protected]
        2
        2 months ago

        That’s actually pretty good. And then they’re open to getting sued when caught.

        I guess it could be done on an instance basis, although I’m not sure how happy fediverse users will be if their instance has an official policy of open-sourcing (or maybe it’s public-domaining?) all their content by default.

        • @[email protected]
          2
          edit-2
          2 months ago

          Well, such a license could just obligate whoever trains an AI model on the content to open-source that model. Whether the instance prohibits or allows AI training at all would be a separate condition that’s up to the instance owner, and users can decide whether or not they want to contribute under those terms.