
According to a study, AI generates nude images that surpass real photos in sexual appeal. While people still perceive actual photos as more authentic, the fake images reliably score higher on attractiveness and overall pleasantness.
AI generates nude images that outrank real photographs in sexual appeal, study finds
25 Kommentare
Because they’re fake and people like fake. Hence all the people botoxing their faces into lizard people. This seems like a pretty simple concept. Problem is, they’re fabricated fake images. They don’t exist. Might as well desire Barbies.
I don’t believe it, I want to take this test
Real enough to allow suspension of disbelief. Fake enough to be fantasies.
Burying the punchline that the younger participants all preferred jerking off to cartoons
Fantasy is more appealing than reality. Now with a measurable p-value.
They’re trained on whatever images exist.
My hypothesis is there are more images of photogenic people than non-photogenic people (for reasons that are probably obvious), and that adds bias to make the AI generated people more attractive than a sample of images of real people.
Supernormal Stimulus – https://en.wikipedia.org/wiki/Supernormal_stimulus
I would like to review the research data. Does anyone have the source?
I’m going to have to see the dataset before I can believe this one, both the generated and the control. For science!
A friend of mine was chatting with a different friend, and the subject turned to the physical attractiveness of the first friend’s GF. The second one said “You know her boobs are fake, right?”
To which he replied “So?”
I doubt that, I’d need to see the pictures to be convinced
It’s because well endowed green orcs don’t exist in real life.
(reality can not compete with fantasy)
I’ve had a couple girls tell me they like the super-fake CGI stuff and I’m confused as hell
How women’s bodies look in porn is already completely detached from what a woman’s body looks like in real life, so this makes sense
There is a discussion to be had about fake vs. natural imagery, and about conditioning yourself to get off to, and maybe expect, idealised and unattainable beauty standards. Those things are worth pausing on with AI porn, but as someone who fools around with Stable Diffusion, I’ve got to say it’s nice to know there is zero chance of human trafficking and exploitation if I generate explicit imagery.
It’s also a trip to customise what kind of images are made. If you want to fantasize about milfy Japanese pearl diver women from the 1920s who are also body builders with just a bit of a butterface, for example, well, you can’t just find that sort of thing on bangbros.
Nasty. Weirdos on the internet more attracted to bots than people. Go outside and talk to somebody!
This is just airbrushing with extra steps. Think about it: a general model erases all the imperfections, be they wrinkles, hair in the wrong places, or other asymmetries, and the result is presented as still real enough to not be weird.
This feels a bit like saying that a chocolate bar outranks an apple in palatable appeal. It’s not surprising that something people created to be highly appealing will score higher on a man-made ranking than something not specifically tailored to achieve that outcome. Beauty is not perfectly objective, but beauty standards are very well-observed and internalized across cultures such that an effective AI should be able to produce this result.
Let me grab some popcorn for these comments.
My anecdotal evidence is that the constant avalanche of AI porn ads makes me want to strangle the 19-year-old tech bro somewhere who is cranking them all out. Looks fake, gross, and unattractive.
It is also a perfect example of a product looking for a market. You know what the internet has plenty of? Naked people. Tons and tons of real life people happy, nay, excited to show you their bits. We don’t need fake crotches. We got all the crotches we need.
That’s cause real humans are ugly.
Watch the differences in how AI affects a population in a certain superpower where it is nationalized and regulated as opposed to the US.
Each company is for-profit. Each is aimed exclusively at growth. We have begun to develop the end game of technology, and it is being used to make porn, deepfakes of teenagers and politicians, and Facebook slop of obese people falling. Writing emails and getting recipes you could’ve gotten from the book or blog it scraped them from.
The human brain couldn’t handle social media. It’s not wired to handle AI’s impact in the media sphere, at all.
And in a post scarcity world this is something that absolutely needs to be regulated. But here we are.
Most of the non-real models were in provocative poses and most of the real models were in neutral “anatomical” poses. I wonder if this variable contributed.
I think everyone is making sweeping assumptions about *why* people might prefer what they know to be fake, but the reality is more subtle.
As someone who uses AI almost daily to render my own creations, I can tell you right now that: realistic; semi-real; anime — doesn’t matter, the **first** thing that most of these models got nearly infallibly good at rendering was boobs and butt (followed by everything that surrounds them). Not only are they trained on datasets that, by virtue of being “art”, are inherently biased toward the global collective notions of “perfect”, “attractive”, or “idealized” (because “photogenic”), but, because of the mixture of mediums in those datasets, they are also inherently geared toward literally superhuman levels of those same notions -> think classic metahumans like Starfire or She-Hulk, then turn them into genuinely believable (not “uncanny valley” inducing) photorealistic renders of those exact things.
This is not “dudes love big fake boobs” or “people like poreless skin”, this is “mathematically perfect proportions, lighting, and feature combinations that make even their freckles, somehow, indescribably, ever-so-slightly more attractive than what you can find on basically any real human being.”
By virtue of being the prediction engines that they are (basically turbo-charged denoising algorithms), they are inherently designed to hijack every single neurological response that you have to “that looks pretty”. It’s not just people or their features, it’s also cars, plants, houses, jewelry, weapons, food — you could render a half-smoked cigarette sloppily put out in a cup of yogurt, and it would somehow be less revolting than if you saw it in real life.
EDIT: because I missed a parenthesis after “photogenic”.
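The “turbo-charged denoising” framing in the comment above can be illustrated with a toy sketch. To be clear, this is a purely hypothetical caricature, not any real diffusion model’s API: the function names, numbers, and one-dimensional setup are all invented for illustration. The idea is just that a sampler starts from noise and repeatedly applies a denoiser that pulls it toward whatever the training data treated as “ideal”.

```python
import random

def toy_denoiser(x, target, strength=0.2):
    # Stand-in for a learned denoiser: nudges the sample toward
    # the "ideal" the model was trained on (here, a single number).
    return x + strength * (target - x)

def sample(target, steps=50, seed=0):
    rng = random.Random(seed)
    x = rng.gauss(0.0, 10.0)  # start from pure noise
    for _ in range(steps):
        x = toy_denoiser(x, target)  # each step strips away some noise
    return x
```

In a real diffusion model the “target” is not a single number but a learned direction in image space, re-estimated at every step; the point of the caricature is only that the sampler converges on the training data’s idea of ideal, which is why the outputs skew toward idealized imagery.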
This is a great sign for society…