EU votes to ban AI “nudifier” apps after deepfake outrage

https://www.rfi.fr/en/international/20260326-eu-votes-to-ban-ai-nudifier-apps-after-deepfake-outrage

9 Comments

  1. someocculthand on

    I wonder how they’re going to attempt to monitor this. Doesn’t this cover all AI image generators, many of which can be run locally on basically any modern gaming GPU?

  2. Ya, glad to see some countries looking into this, especially after seeing what people did and continue to try to do with Grok… even discounting my dislike of gen AI, it was just fucking disgusting and creepy what this shit enabled

  3. That’s alright. I already saw dozens of fake nude pictures of Britney Spears and Christina Aguilera back in the early 2000s

  4. I don’t see this impacting anything hosted outside the EU. If you’re in it, use at your own risk, but I don’t see this impacting a single service hosted in a non-EU country. At worst you’ll need a VPN to access it.

  5. AdFeeling842 on

    a lot of pervy redditors acting like this tech is no big deal. who cares if some teenager gets their life wrecked? totally harmless ‘tech’, except that kids get seriously messed up by this, some deal with it for years, and some even end up killing themselves

  6. Cool, cool. Those apps are basically scum, and I’d be happy if the resources wasted on AI were used for less intrusive purposes.

    HOWEVER, can we also look at banning voice deepfakes? Sure, a nude deepfake is obviously a massive privacy intrusion and extremely embarrassing and should count as a crime…

    But faking someone’s voice is exactly the same, in my opinion. Whether I’m violated visually (nude pics) or audibly (my voice used to get me in trouble or for other nefarious purposes), it’s the same thing.

    Sharing either of these should count as a violation of privacy, identity theft, and, imo, assault or harassment, and should land anyone doing it in jail for a while.
