The biggest challenge to reaching an agreement on the European Union’s proposed AI Act has come from France, Germany and Italy, which favour letting makers of generative AI models self-regulate instead of facing hard rules.

Well, we saw what happened (allegedly) when OpenAI “self-regulated” itself.

  • Even_Adder@lemmy.dbzer0.com
    7 months ago

    I don’t think we can let the current big AI players regulate themselves, but the ESRB hasn’t been too bad at doing its job.

    • TwilightVulpine@kbin.social
      7 months ago

…it is now commonplace to find elements considered psychologically equivalent to gambling with real money in games rated E for Everyone, and therefore recommended for children of all ages.

The ESRB may be plenty harsh on violence and sexual content, but it completely neglects its job when it comes to monetization elements designed to condition players, because rating those accurately might earn the industry less money.

      • Even_Adder@lemmy.dbzer0.com
        7 months ago

The ESRB was created to head off government regulation of video game content after controversies over games like Mortal Kombat and Night Trap.

        • TwilightVulpine@kbin.social
          7 months ago

Yes, I know. My point is that as new needs for self-regulation have come up, they are playing coy, because as industry representatives it is more profitable to pretend they don’t recognize a new risk that would justify ratings and warnings for children and their parents. If they won’t catch up until the threat of government regulation appears, they are not doing their job properly.

Ironically, they are harsher on fictional depictions of gambling than on loot boxes bought with real money, so they have always known risks of this kind exist.