Sexual deepfakes are becoming more sophisticated, more powerful, more easily accessible, and more dangerous for the millions of women being abused with the technology.

https://www.wired.com/story/deepfake-nudify-technology-is-getting-darker-and-more-dangerous/

16 Comments

  1. EmbarrassedHelp on

    One of the people interviewed seems to blame open source AI for this problem, while conveniently ignoring all the multimillion-dollar sexual deepfake corporations that are operating lawfully.

    Going after the companies that explicitly sell nonconsensual sexual deepfake services is the only logical move here. Targeting open source AI doesn’t solve the problem.

  2. WeakApplication4095 on

    Aren’t women going to fight back and make fakes of the guys getting railed by Tyson Fury? Or a hippo? Or Elon getting it from Xi Jinping with Trump doing 69? Why are only women being subjected to this abuse?

  3. VincentNacon on

    Not just women… Men too.

    And furries…

    and cars…

    There are plenty more things that you don’t wish to see, ever.

  4. Everything produced by AI NEEDS to have an embedded watermark on it tracing it back to its source. Right in the middle, so you can’t miss it or crop it out. 
    This needs to be the number one rule across every platform.  

    I don’t think this solves every problem. But it’s a start to defend victims and an attempt to shame perpetrators.  

     **AI should not be filling creative gaps, nor should its products or results be worth any value.**    

    Text is a different beast…
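
    For illustration only, here is a minimal sketch of the centered-stamp idea this comment describes, assuming the Pillow library is available; the file names and provenance label are placeholders, and a real provenance watermark would be embedded far more tamper-resistantly than drawn text:

    ```python
    from PIL import Image, ImageDraw, ImageFont

    def stamp_center(in_path: str, out_path: str, label: str) -> None:
        """Draw a semi-transparent label across the center of an image."""
        img = Image.open(in_path).convert("RGBA")
        overlay = Image.new("RGBA", img.size, (0, 0, 0, 0))
        draw = ImageDraw.Draw(overlay)
        font = ImageFont.load_default()
        # Measure the label and place it dead center, so cropping the
        # mark out would also crop out the subject of the image.
        left, top, right, bottom = draw.textbbox((0, 0), label, font=font)
        x = (img.width - (right - left)) // 2
        y = (img.height - (bottom - top)) // 2
        draw.text((x, y), label, font=font, fill=(255, 255, 255, 160))
        Image.alpha_composite(img, overlay).convert("RGB").save(out_path)

    # Hypothetical file names and provenance label:
    stamp_center("generated.png", "generated_marked.png", "AI-GENERATED / source: model-xyz")
    ```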

  5. Kori_the_cat on

    The “how does this affect anyone” crowd came quickly to the comments. This is why this technology will continue to exist and face no consequences, despite it including CP.

  6. blackvrocky on

    When you say “women are abused by this technology” to get attention for your argument. As if men are not subject to the same thing?

  7. There are so many headlines and posts about this lately that I’m beginning to think it’s a marketing ploy to attract users.

  8. Due_Instance3068 on

    Not sure of the age group here, but there was a film made years ago named Brainstorm. It was Natalie Wood’s last film. Its premise was a technology that let you plug the computer world straight into the brain through a receptacle installed in the back of the neck. In one subplot, a tech worker stole a recorder out of the lab and wore it on a date with his girlfriend, and he continued to wear it during his sexual encounters with her through the night. Then he shared the recording with other male friends. The film didn’t get into whether the girlfriend knew about this. But if she did, and she was an integral part of the creative process, would the plug-in product be legal and saleable?

    Now what if the recording was made with her input, using artificial images of herself generated by AI? Are we looking at a simple disclosure statement?

  9. uniquelyavailable on

    I don’t understand how someone can be hurt by a fake image when there are very real threats like trafficking to worry about. Maybe focus your energy on something that actually matters.

  10. The only legit defence here is offence. Deepfake the fuck outta these dudes, and the dudes who are in office but aren’t making progress on the issue.
