• bob_omb_battlefield@sh.itjust.works

    If you aren’t allowed to freely use data for training without a license, then the fear is that only large companies will own enough works or be able to afford licenses to train models.

    • Nomad Scry@lemmy.sdf.org

      If they can just steal a creator’s work, how do they suppose creators will be able to afford to keep being creators?

      Right. They think we already have enough original works that the machines can just generate any new creations from here on out.

      😠

      • MudMan@fedia.io

        It is entirely possible that the entire construct of copyright just isn’t fit to regulate this and the “right to train” or to avoid training needs to be formulated separately.

        The maximalist, knee-jerk assumption that all AI training is copying is feeding into the interests of, ironically, a bunch of AI companies. That doesn’t mean that actual authors and artists don’t have an interest in regulating this space.

        The big takeaway, in my book, is that copyright is finally broken beyond all usability. Let’s scrap it and start over with the media landscape we actually have, not the eighteenth-century version of it.

        • Grimy@lemmy.world

          Yes, precisely.

          I don’t see a situation where the actual content creators get paid.

          We either get open source AI, or we get closed AI where the big AI companies and the copyright holders make bank.

          I think people are having huge knee-jerk reactions and ending up supporting companies like Disney, Universal Music, and Google.