- cross-posted to:
- technology@beehaw.org
Each of these reads like an extremely horny and angry man yelling their basest desires at Pornhub’s search function.
God damn, there are a lot of incel vibes in these comments.
Clutch your pearls harder, Puritan. Stay on fb if you fear seeing ankles.
Lol, I’m a fucking atheist ya weirdo
Lol
And yet you think porn is unethical which is a weird-ass puritan take
I don’t think porn is unethical. I was extending the logic of a post someone made.
I do, however, think making fake porn of a celebrity and plastering it all over the Internet requires, at the very least, a lot of moral flexibility.
Difference between ankles and women with 40-foot weenies with massive nutsacks attached to them
That’s not so much the problem as not supporting the real providers of pictures of women with 40-foot weenies with massive nutsacks, or everyone seeing your face plastered on that.
Some people here are just way too defensive over this.
I’ll just leave this here:
Automatic1111, depthmap script, image-to-image, click left-right stereogram for VR, or red-blue if you have old 3D glasses.
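For anyone curious what that workflow is doing under the hood, here’s a rough sketch of the same idea in Python: estimate per-pixel depth, shift pixels by disparity, and merge the two views into a red-blue anaglyph. The MiDaS model, the naive pixel shift, and the shift amount are my assumptions for illustration, not the actual A1111 depthmap script.

```python
# Rough sketch of the depthmap-to-anaglyph idea (not the actual A1111 script).
# Assumes the MiDaS depth model from torch.hub; the naive horizontal pixel
# shift approximates how left/right views can be synthesized from depth.
import numpy as np
import torch
from PIL import Image


def estimate_depth(img: np.ndarray) -> np.ndarray:
    """Return a depth map normalized to [0, 1] using the small MiDaS model."""
    model = torch.hub.load("intel-isl/MiDaS", "MiDaS_small")
    transform = torch.hub.load("intel-isl/MiDaS", "transforms").small_transform
    model.eval()
    with torch.no_grad():
        pred = model(transform(img))
        pred = torch.nn.functional.interpolate(
            pred.unsqueeze(1), size=img.shape[:2],
            mode="bicubic", align_corners=False,
        ).squeeze()
    depth = pred.cpu().numpy()
    return (depth - depth.min()) / (depth.max() - depth.min() + 1e-8)


def shift_view(img: np.ndarray, depth: np.ndarray, max_px: int) -> np.ndarray:
    """Shift each pixel horizontally in proportion to its depth (crude reprojection)."""
    out = img.copy()
    h, w = depth.shape
    for y in range(h):
        xs = np.clip(np.arange(w) + (depth[y] * max_px).astype(int), 0, w - 1)
        out[y, xs] = img[y]
    return out


def red_blue_anaglyph(path: str, max_shift: int = 12) -> Image.Image:
    """Build a red-blue 3D image from a single 2D picture."""
    img = np.array(Image.open(path).convert("RGB"))
    depth = estimate_depth(img)
    left = shift_view(img, depth, -max_shift)   # left-eye view
    right = shift_view(img, depth, max_shift)   # right-eye view
    out = img.copy()
    out[..., 0] = left[..., 0]                  # red channel from the left eye
    out[..., 1:] = right[..., 1:]               # green/blue from the right eye
    return Image.fromarray(out)


if __name__ == "__main__":
    red_blue_anaglyph("input.png").save("anaglyph.png")
```

Cranking `max_shift` too high produces the same artifact mentioned below: the model has no idea what’s behind foreground objects, so large disparities expose the gaps.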
How good is the stereoscopic 3d image generation? Do you have some example SFW stereogram images produced with this process?
They are pretty good. Just don’t crank the 3D up to max, as it doesn’t know what is behind things in 2D images.
That was pretty good just by unfocusing my eyes like I do for Magic Eye posters.
Wow they are pretty good! Thanks!
Oh man, these guys are gonna get sued to oblivion one day.
Yeah, reminds me of the early days of limewire.
There is porn on the internet! Give them the Pulitzer Prize! Nice research. You can order these on Fiverr and they do not even have NSFW filters.
So there’s a porn segment I’ve completely overlooked? Unbelievable.
There’s plenty of threads on /b/ with them.
What color goat are you looking for?
Cornflower blue
This AI thing may catch on
There are different communities already, just search for /c/aigen
Like I’ve been saying for years, AI doesn’t need to be sentient to royally fuck society. Just needs to be good enough to mimic you and ruin your life or take your job.
or take your job.
The unemployment line there makes for quite the mental image.
The “Erect Horse Penis - Concept LoRA,” an image generating AI model that instantly produces images of women with erect horse penises as their genitalia, has been downloaded 16,000 times, and has an average score of five out of five stars, despite criticism from users.
Listen it’s a niche market
AI can have my job. Its eyes will hurt within a week and it will be taking mental health days.
I’d love to give AI my job, but then I’d be homeless.
I should clarify that I’m not against AI as a technology. I’m against it making me poor
deleted by creator
AI will also solve the housing affordability crisis too so you won’t need to worry about that…right?!?
I mean, realistically, I do expect someone to put together a viable house-construction robot at some point.
https://www.homelight.com/blog/buyer-how-much-does-it-cost-to-build-a-house/
A rough breakdown of the overall costs of building a home will look like this:
Labor: 40%
Also, I’d bet that it cuts into materials cost, because you don’t need to provide the material in a form convenient for a human to handle.
I’ve seen people creating habitations with large-scale 3d printers, but that’s not really a practical solution. It’s just mechanically-simple, so easier to make the robot.
I don’t know if it needs to use what we’d think of as AI today to do that. Maybe it will, if that’s a way to solve some problems conveniently. But I do think that automating house construction will happen at some point in time.
Maybe we do live in the best possible world. Wow, wouldn’t it be great to get rid of this industry so you can consume porn while knowing there is zero percent chance it was made without someone’s consent?
You do know how LLMs are trained, right?
No so go ahead
Are you actually asking?
The gist is that LLMs find similar “chunks” of content from their training set and assemble a response based on this similarity score (how similar the chunks are to your prompt).
They know nothing they haven’t seen before, and the nice thing about them is they create new things from parts of their training data.
Obviously they are very impressive tools, but the concern is you can easily take a model that’s designed for porn, feed it pictures of someone you want to shame, and have it generate lifelike porn of a non-porn actor.
That, and the line around “ethical” AI porn is blurry.
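For what it’s worth, here’s a toy illustration of the “find similar chunks” idea described above, scoring training snippets against a prompt with cosine similarity over word counts. Real transformer LLMs learn statistical weights rather than storing and retrieving chunks, so treat this purely as a sketch of that mental model; all the data here is made up.

```python
# Toy sketch: score stored "training chunks" against a prompt by similarity.
# This illustrates the chunk-matching idea from the comment above, not how
# transformer LLMs actually generate text.
from collections import Counter
import math


def bag_of_words(text: str) -> Counter:
    return Counter(text.lower().split())


def cosine_similarity(a: Counter, b: Counter) -> float:
    dot = sum(a[w] * b[w] for w in set(a) & set(b))
    norm = math.sqrt(sum(v * v for v in a.values())) * math.sqrt(sum(v * v for v in b.values()))
    return dot / norm if norm else 0.0


training_chunks = [
    "a cat sits on the windowsill in the sun",
    "the stock market closed higher today",
    "a dog sleeps on the porch in the afternoon sun",
]

prompt = bag_of_words("cat sleeping in the sun")
best = max(training_chunks, key=lambda c: cosine_similarity(prompt, bag_of_words(c)))
print(best)  # the chunk most similar to the prompt
```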
They know nothing they haven’t seen before
Strictly speaking, you arguably don’t either. Your knowledge of the world is based on your past experiences.
You do have more-sophisticated models than current generative AIs do, though, to construct things out of aspects of the world that you have experienced before.
The current crop are effectively more-sophisticated than simply pasting together content – try making an image and then adding “blue hair” or something, and you can get the same hair, but recolored. And the ability to replicate artistic styles is based on commonalities in seen works, but you don’t wind up seeing chunks of material just done by that artist. But you’re right that they are considerably more limited than a human.
Like, you have a concept of relative characteristics, and the current generative AIs do not. You can tell a human artist “make those breasts bigger”, and they can extrapolate from a model built on things they’ve seen before. The current crop of generative AIs cannot. But I expect that the first bigger-breast generative AI is going to attract users, based on a lot of what generative AIs are being used for now.
There is also, as I understand it, some understanding of depth in images in some existing systems, but the current generative AIs don’t have a full 3d model of what they are rendering.
But they’ll get more-sophisticated.
I would imagine that there will be a combination of techniques. LLMs may be used, but I doubt that they will be pure LLMs.
deleted by creator
Ok, you know it’s trained on existing imagery right?
Sure the net new photos aren’t net new abuses, but whatever abuses went into the training set are literally represented in the product.
To be clear, I’m not fully porn-shaming here, but I wanted to clarify that these tools are informed by something already existing and can’t be fully separated from the training data.
I just retrained an LLM on the comment you put on the public internet. Do you feel violated enough to equate it to physical violation?
Why would I? Folks who have had real nudes of them posted on the Internet haven’t felt “physical violation” but they’ve certainly been violated.
If you had photos of me and trained a porn generating LLM on my photos and shared porn of me, in an identifiable way, I would consider that violation.
But simply taking my words in that simple sentence isn’t identifiable, unique, or revealing. So no.
Further, the original point was about the ethics of AI porn. You can’t get something from nothing.
deleted by creator
I can do this right now with Photoshop, dude. What are you talking about? This just points to the need for more revenge porn laws.
We don’t have to sit in the fire when we can crawl out. Are we still on fire? Yeah. Can we do something about that? Yeah!
It seems like so many people these days want perfect solutions but the reality is that sometimes we have to make incremental solutions to erase the problem as much as we can.
And incidentally, this need for revenge porn laws is also a symptomatic issue with a separate cause: technology always moves forward with no relation to social advancement, and there is no realistic “genie forced back into the bottle” scenario either.
That being said, easier access to more powerful technology with lackluster recognition of personal responsibility doesn’t exactly bring happy prospects. lol…
Agreed, personal responsibility went out the window a long time ago. Apathy reigns supreme.
I wouldn’t be happy about it but me not being happy about something doesn’t mean I just get an override.
I think the boat has sailed a bit on this one. You can’t really copyright your own image, and even if you were some famous person willing to do this and fight the legal battles, you’d still have to go up against the fact that no one is making money off of it. You might be able to get a news source to take down that picture of you, but it is another thing to make it so the camera company can’t even record you.
But hey, I was saying for years that we need to change the laws forbidding photography of people and property without consent, and everyone yelled at me that they have the right to use a telescoping lens to view whomever they wanted blocks away.
The creeps have inherited the earth.
Revenge porn/blackmail/exploitation will hopefully become much less obscene, not to the “let’s not prosecute this” levels, but maybe people can stop living in fear of their lives being ruined by malicious actors (unless that’s your kink, you do you).
It will take/drive/demand a massive cultural shift, but if you asked me which world I would rather live in, and the options are one where people are abused and exploited, or one where people can visualize their perversions more easily (but content creators have a harder time making a living) I’ll take the second. Though I may have straw-manned a bit, it’s not something I’ve thought of outside of this forum thread.
You are answering a question with a different question. LLMs don’t make pictures of your mom. And this particular question? One that has existed roughly since Photoshop existed.
It just gets easier every year. It was already easy. You could already pay someone 15 bucks on Fiver to do all of that, for years now.
Nothing really new here.
The technology is also easy. Matrix math. About as easy to ban as mp3 downloads. Never stopped anyone. It’s progress. You are a medieval knight asking to put gunpowder back into the box, but it’s clear it cannot be put back - it is already illegal to make non-consensual imagery just as it is illegal to copy books. And yet printers exist and photocopiers exist.
Let me be very clear - accepting the reality that the technology is out there, basic, easy to replicate, and on a million computers now is not disrespectful to victims of non-consensual imagery.
You may not want to hear it, but just like with encryption, the only other choice society has is full surveillance of every computer to prevent people from doing “bad things”. Everything you complain about is already illegal and has already been possible - it just gets cheaper every year. What you want protection from is technological progress, because society sucks at dealing with the consequences of it.
To be perfectly blunt, you don’t need to train any generative AI model for powerful deepfakes. You can use technology like Roop and ControlNet to synthesize any face onto any image from a single photograph. Training not necessary.
When you look at it that way, what point is there to try to legislate training with these arguments? None.
deleted by creator
It’s already covered under those laws. So what are you doing that’s different from ChatGPT hallucinating here?
Those laws don’t spell out the tools (photoshop); they hinge on reproducing likeness.
deleted by creator
Isn’t the main problem with those models how you can create porn of everyone without their consent with those tools, too?
Sex trafficking vs virtual photoshop of your face…
Nothing new, and it’s a huge improvement over the current status quo. Not everything needs to be a perfect solution
deleted by creator
Yeah so what. It’s not as if somebody is “sold on the market” because there’s a nude picture of them. Photoshop is not a real threat to society. We gotta stop making moral imaginations more important than physical things.
Hmmm sweetie, but what about the OnlyFans prostitutes? Racist much?
Ah yes, the porn industry. Paragon of morality and respect.
They’re also creating a lot of images of maid uniforms wearing human faces making ahegao faces while standing on massive erect penis legs.
They post the eight images that weren’t some body-horror fever dream.
There’s a lot of human work that goes into (and has gone into) AI art generation. It’s just very obscured when you only see the final product.
Remember creepy people use AI. That’s also why a lot of AI stuff is or seems creepy.
They’re also creating a lot of images of maid uniforms wearing human faces making ahegao faces while standing on massive erect penis legs.
Finally there is porn for me
I’m glad you finally found ✨️representation✨️
There is so much wrong with just the title of this article:
- What marketplace? CivitAI is free. Unstable Diffusion Discord is free. Stable Diffusion is free. All of the models and LoRAs are free to download. The only cost is a video card (even a basic one) and some time to figure this shit out.
- “Everyone is for sale”. No, that’s current fucking situation, where human trafficking runs rampant throughout the sex and porn industry. AI porn is conflict-free. You don’t need to force an underaged, kidnapped teenager to perform a sex act in front of a camera to create AI porn.
- “For Sale”. Again, where’s the sale? This shit is free.
A 404 Media investigation shows that recent developments
Get the fuck outta here! This two-bit blog wants to call itself “a 404 Media investigation”? Maybe don’t tackle subjects you have no knowledge or expertise in.
The Product
Repeat: FOR FREE! No product!
In one user’s feed, I saw eight images of the cartoon character from the children’s show Ben 10, Gwen Tennyson, in a revealing maid’s uniform. Then, nine images of her making the “ahegao” face in front of an erect penis. Then more than a dozen images of her in bed, in pajamas, with very large breasts. Earlier the same day, that user generated dozens of innocuous images of various female celebrities in the style of red carpet or fashion magazine photos. Scrolling down further, I can see the user fixate on specific celebrities and fictional characters, Disney princesses, anime characters, and actresses, each rotated through a series of images posing them in lingerie, schoolgirl uniforms, and hardcore pornography.
Have you seen Danbooru? Or F95 Zone? This shit is out there, everywhere. Rule 34 has existed for decades. So has the literal site called “Rule 34”. You remember that whole Tifa porn video that showed up in an Italian courtroom? Somebody had to animate that. 3D porn artists take their donations from Patreon. Are you going to go after Patreon, too?
These dumbasses are describing things like they’ve been living under a rock for the past 25 years, watching cable TV with no Internet access, just NOW discovered AI porn as their first vice, and decided to write an article about it to get rid of the undeserved guilt over what they found.
What a shitty, pathetic attempt at creating some sort of moral panic.
Just because something is free does not mean there is no marketplace or product. Social media is generally free, but I would still call Facebook, TikTok, or Instagram a product.
Nowadays a lot of industries start out completely free, but move into paid subscription models later.
You pay by giving up your free time, which they sell. Technically we’re just working for free, and the product is our attention.
Well, fuck, I better log off of Lemmy because it costs me too much damn money.
deleted by creator
Okay. There is still no product involved with AI porn.
People buy and sell paintings despite the fact that you could also make paintings pretty easily. You’re paying for the time they spent creating it and the expertise it required. Just because some people scan and upload their paintings for free, doesn’t mean that all paintings are not products. I don’t see why the same couldn’t be true for AI porn.
???
Repeat: FOR FREE! No product!
If it’s free, chances are you’re the product. I assume that there is a market for user-generated “prompts” somewhere.
No, that’s not how open source or open-source philosophies work. They share their work because they were able to download other people’s work, and sometimes other people improve upon theirs in turn.
These aren’t corporations. You don’t need to immediately jump to capitalistic conclusions. Just jump on Unstable Diffusion Discord or CivitAI yourself. It’s all free.
These aren’t corporations.
I know, I know: “but the website is free” (for now). However, Civit AI, Inc. is not a loose community. There must be something that pays their bills. I wonder what it is.
I feel like you’re implying people should look into things before making accusations. Like, find out if what they’re saying is true before they say it. And that’s why no one asked you to the prom.
They’re probably losing money now and just trying to build a user base as a first-mover. They accept donations and subscriptions with fairly minor benefits, but I imagine hosting and serving sizable AI models is not cheap.
They’ll probably have to transition to paid access at some point, but I don’t see it as particularly unethical as they have bills to pay and do attempt to moderate content on the site.
I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made. I don’t think there should be open avenues for sharing that kind of stuff online, and their rules should be better enforced.
I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.
wholeheartedly disagree. “real porn” is literally made by consenting adult performers. Hence, it’s ethical. Generating adult content of real people is (typically) done without the consent of the people involved, thereby making it unethical.
If you don’t think anything unethical happens in the production of porn I’m not sure what to tell you. It’s getting better but exploitation, sex trafficking, revenge porn, etc. have been a thing since pornography was invented.
AI porn at least does not necessarily need to consider consent. Plenty of AI porn involves animated figures or photorealistic humans that don’t represent any identifiable person.
The only hang up I have is producing images of actual people without their consent, and I don’t think it’s a new problem as photoshop has existed for a while.
i’m sorry to tell you but you have swallowed the propaganda from anti-porn/anti-sex work organizations like Exodus Cry and Morality in Media (who now go by the name NCOSE).
I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.
Well, even if that were the case, the “real porn” is still required to train the model in the first place.
So, it’s unethical shit on top of what you think was even more unethical.
Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat. But it’s a better alternative. Porn is not going anywhere. If generative AI means less real people get exploited that’s a win in my book.
Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat
This comparison only holds water if impossible meat were composed of bits of rearranged animal meat… Which it isn’t.
If generative AI means less real people get exploited that’s a win in my book.
That’s not necessarily a win for everyone. Some people actually like working in the porn industry. Besides that, their likenesses are being stolen and used to produce reproductions and derivative works without consent or compensation.
Also, I think you and your buddies here are missing the plot. Generated porn and generated porn of real people are related but different things. I think that’s pretty commonly understood which is why these sites have policies in the first place.
Maybe there are commissions for specific people/poses, ’cause I certainly couldn’t keep a hard-on long enough to generate a spankin’-worthy image.
There’s been a market for commission artists doing this for money since the dawn of art.
Yep
A lot of the stuff you talked about is covered in the article.
The danbooru aspect of the “AI” moral panic is what annoys me.
So many of my friends - many of whom are amateur artists - hate computer generated images because the copyright of the artists were violated, and they weren’t even asked. And I agree that does kinda suck - but - how did that happen?
Danbooru.
The art had already been “stolen” and was available online for free. Where was their morality then? For the last decade or whatever that danbooru has been up? Danbooru is who violated the copyright, not stable diffusion or whatever.
At least computer-generated imagery is different: the stuff it was trained on was exactly their art, while this stuff, though it might look like theirs, is unique. (And often with a unique number of fingers.)
And, if “copyright” is their real concern, then surely they understand that copyright only protects against someone making a profit off their work, right? Surely they’ll have looked into it and already know that “art” made by models that used copyrighted content for training is barred from being copyrighted itself, right? And that you can only buy/sell content made from models that are in the copyright clear? Surely they know all this?
No, of course not. They don’t give a shit about copyright, they just got the ickies from new tech.
no one is moral panicking over ai. people just want control over their creation, whether it’s profit sharing or not being used to train models.
you really can’t see how an imageboard has completely different considerations over image generating models?
or that people are going after ai because there is only like a couple of models that everyone uses vs uncountable image hosts?
both danbooru and stable diffusion could violate copyright, not one or the other.
why would someone want training models to ingest their creation just to spit out free forgeries that they cannot claim the copyright to?
no one is moral panicking over ai.
This is one of the most inaccurate statements I’ve seen in 2023.
Everybody is morally panicking over AI.
stable diffusion could violate copyright, not one or the other.
Or they don’t, because Stable Diffusion is a 4GB file of weights and numbers that have little to do with the content it was trained on. And, you can’t copyright a style.
Yeah. It’s pretty iffy to go “well, these other guys violated copyright so they might as well take it” as if once violated it’s all over and nobody else is liable.
deleted by creator
I just wanted to say I love your comment. You’re totally correct, and I enjoyed the passion in your words. That’s how we’ve got to deal with shit articles more often. Thx
I mean that’s kind of worse though isn’t it? The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now. Whether you gotta pay or not is beside the point. Maybe I’m misunderstanding the situation and your point though?
The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now.
So I can, but I could also do that without AI. People have photoshopped celebrities heads onto porn actors bodies for decades. It doesn’t happen as much now because there’s no point.
Realistically, what is really changed except for the tools?
Simplicity, barriers of entry, skill requirements? Kinda different to just enter a prompt “such and such actress choking on a dildo” than to photoshop it isn’t it? I for one don’t know how to do one but could probably figure out the other.
Again I’m just speculating, I don’t really know.
This is absolutely accurate. Basically, humanity is constantly reducing the cost and skill barriers for tasks and jobs. It’s weird that we are now aggressively doing it to creative work, but that’s what has been done, and it’s making a mess of garbage media and porn that could have happened before, just in much higher quantities and with less oversight/input from multiple people.
I’m guessing that the “marketplace” and “sale” refers to sites like “Mage Space” which charge money per image generated or offer subscriptions. The article mentions that the model trainers also received a percentage of earnings off of the paid renderings using their models.
Obviously you could run these models on your own, but my guess is that the crux of the article is about monetizing the work, rather than just training your own models and sharing the checkpoints.
The article is somewhat interesting as it covers the topic from an outsider’s perspective more geared towards how monetization infests open sharing, but yeah the headline is kinda clickbait.
“Mage Space” which charge money per image generated
Well, instead of bitching about the AI porn aspect, perhaps they should spend more time talking about how much of a scam it is to charge for AI-generated images.
Compute costs money, it’s more ethical to charge your users than it is to throw shady ads at them which link to malware.
I get no malware or shady ads when I generate AI images with Stable Diffusion. I don’t know what kind of sites or tools you’re using where you’re getting shady ads, but you’re getting ripped off.
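As a concrete example of what “running it locally” looks like, this is roughly all it takes with the Hugging Face diffusers library. The model ID and prompt here are just placeholders, and you’ll need a GPU with a few GB of VRAM:

```python
# Minimal local image generation with diffusers; no ads, no hosted service.
# Model ID and prompt are placeholders for illustration.
import torch
from diffusers import StableDiffusionPipeline

pipe = StableDiffusionPipeline.from_pretrained(
    "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
).to("cuda")

image = pipe("a watercolor painting of a lighthouse at dusk").images[0]
image.save("lighthouse.png")
```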
I took their comment to mean running the generation locally is almost free.
Sure, if you have hardware and/or time to generate it client side. I’m just saying that if you run a web service and decide to charge for it, that’s better than most of the alternative monetization strategies.
Also buying and eventually replacing expensive hardware. Running AI at scale requires hundreds of thousands of dollars of infrastructure.
What the fuck!?
This is not a troll: zoom in on the feet of the yellow dress image. It’s hilariously bad.
Oh no, the realism, it’s just too much! 🤡
Knees are also too sharp.
People who are into mutant porn are going to love this. No matter what your prompt is, you’re nearly guaranteed to get some horrendous mutant abomination that could be from The Thing.
Indeed, there is surely no demand for unrealistic porn.
Oh yeah. At least they got the total amount of toes correct.
I did try out one of those image generators. Wanted a picture of two girls making out in the bathroom. The index finger of one girl was grown together with the collarbone of the other one.
Click on comments hoping to find conversations on the ethics of AI porn. Instead find a 20+ comment chain scrutinizing the details of the feet and other features on the thumbnail.
Lemmy becoming Reddit went faster than I had thought.
People are people. Changing infrastructure isn’t going to make it different, only the ability to perhaps filter it better.
What’s wrong with having six toes on one foot, four toes on the other foot and your feet on backwards?
You underestimate the future growth for amputee fetishes.
You missed the heel she has on the top of the four-toed one, around where her ankle meets. Foot’s reversible.
Hot.
Her left hand is looking kind of messed up also. Only 3 fingers… or maybe more than 3 fingers but only 3 knuckles.
Wait, why did you zoom in on the feet?
I’m an elbow man, myself.
You need to check yourself into some fucking rehab or something you filthy degenerate
About these rehabs, would the staff at them happen to have short sleeve shirts that showcase their elbows?
Pffft-- Imagine not being an armpits man!
For a couple of years after Google Autocomplete rolled out it suggested adding “feet” to the end of any search I’d make on a famous woman’s name. I honestly didn’t get it at first. I’d never searched for feet in any context, so it wasn’t a personalized thing. I chalked it up either to other women wanting to see a pair of shoes she’d worn or to some weird Autocomplete bug. I’m not prudish, but the idea that so many people were into feet that they perved Google never crossed my mind.
Maybe Google uses geographic location as an input, and it was just some other correlating factor, like people in your area, rather than a global trend.
Thank you. I think it’s really weird. Nothing wrong with it, but it makes me uncomfortable.
Obligatory “Boulderchuck is a Nazi”
Fuck pebblethrow
Obligatory “Pebbleyeet is a Nazi”
How many names has stonetoss gone by at this point like what the hell
I didn’t even realize her feet were in the image
don’t kinkshame me!
Because the hands were reasonably normal. AI always fucks up one or the other.
Sorry, she appears to have only three fingers and the index is kinda shaped like a thumb?
( ͡° ͜ʖ ͡°)
Asking the real question.
She has the correct number of toes. What’s the problem?
I mean technically there are 10 but there’s 6 on 1 foot and 4 on the other.
Let’s not get pedantic here.
deleted by creator
technically we’re getting pedfootic
Podriatric*
podantic?
WOW, try to have some realistic standards buddy.
And her right foot is on her left leg and her left foot is on her right leg
deleted by creator
So I checked and nobody has put AI porn of me up for sale, yet. What the fuck, guys? Am I not desirable enough for you!?
Hard to fit that massive member in 1920x1080.
I’m working on it
We’re saving the best until last.
Welcome to the internet
Have a look around
Anything that brain of yours can think of can be found
(spoiler: Literally in this instance)
We’ve got mountains of content, some better, some worse
None of this interests me…
You’d be the first
Welcome to the internet
… where all monumental advances in technology are immediately sexualized and used for getting laid or viewing porn.
Sexualizing everything is as old as time
Pornhub is more trafficked than Amazon and Netflix, and had like 5x more visits than people on the planet last year.
Their IT infrastructure must be absolutely insane
deleted by creator