Each of these reads like an extremely horny and angry man yelling his basest desires at Pornhub’s search function.

  • @randon31415@lemmy.world
    31 points · 2 years ago

    I’ll just leave this here:

    Automatic1111, depthmap script, image to image, click Left-right stereogram for vr or red-blue if you have old 3d glasses.
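
    For anyone curious what that red-blue option is actually doing, here is a rough, untested sketch of the idea in plain Python with numpy and Pillow (file names are placeholders, and holes left by the pixel shift are not filled in):

    ```python
    # Toy anaglyph sketch: shift each pixel sideways in proportion to its depth to
    # fake a second viewpoint, then combine the red channel of the left view with
    # the green/blue channels of the right view.
    import numpy as np
    from PIL import Image

    img = np.asarray(Image.open("render.png").convert("RGB"))  # generated image
    depth = np.asarray(Image.open("depth.png").convert("L"), dtype=np.float32) / 255.0

    max_shift = 12  # pixels of parallax for the nearest objects
    h, w, _ = img.shape
    xs = np.arange(w)
    left = np.zeros_like(img)
    right = np.zeros_like(img)

    for y in range(h):
        shift = (depth[y] * max_shift).astype(int)
        left[y, np.clip(xs + shift, 0, w - 1)] = img[y]
        right[y, np.clip(xs - shift, 0, w - 1)] = img[y]

    anaglyph = np.dstack([left[..., 0], right[..., 1], right[..., 2]])
    Image.fromarray(anaglyph).save("anaglyph.png")
    ```

    The left-right stereogram option for VR is the same idea, just saving the two shifted views side by side instead of merging their color channels.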

  • @raker@lemmy.world
    11 points · 2 years ago

    There is porn on the internet! Give them the Pulitzer Prize! Nice research. You can order these on Fiverr and they do not even have NSFW filters.

  • @AlexWIWA@lemmy.ml
    18 points · 2 years ago

    Like I’ve been saying for years, AI doesn’t need to be sentient to royally fuck society. Just needs to be good enough to mimic you and ruin your life or take your job.

    • tal
      10 points · 2 years ago

      or take your job.

      The unemployment line there makes for quite the mental image.

      The “Erect Horse Penis - Concept LoRA,” an image generating AI model that instantly produces images of women with erect horse penises as their genitalia, has been downloaded 16,000 times, and has an average score of five out of five stars, despite criticism from users.

    • @pdxfed@lemmy.world
      7 points · 2 years ago

      AI can have my job. Its eyes will hurt within a week and it will be taking mental health days.

      • @AlexWIWA@lemmy.ml
        8 points · 2 years ago

        I’d love to give AI my job, but then I’d be homeless.

        I should clarify that I’m not against AI as a technology. I’m against it making me poor.

          • tal
            1 point · edited · 2 years ago

            AI will also solve the housing affordability crisis too so you won’t need to worry about that…right?!?

            I mean, realistically, I do expect someone to put together a viable house-construction robot at some point.

            https://www.homelight.com/blog/buyer-how-much-does-it-cost-to-build-a-house/

            A rough breakdown of the overall costs of building a home will look like this:

            Labor: 40%

            Also, I’d bet that it cuts into materials cost, because you don’t need to provide the material in a form convenient for a human to handle.

            I’ve seen people creating habitations with large-scale 3d printers, but that’s not really a practical solution. It’s just mechanically-simple, so easier to make the robot.

            I don’t know if it needs to use what we’d think of as AI today to do that. Maybe it will, if that’s a way to solve some problems conveniently. But I do think that automating house construction will happen at some point in time.

  • @afraid_of_zombies@lemmy.world
    42 points · 2 years ago

    Maybe we do live in the best possible world. Wow, wouldn’t it be great to get rid of this industry so you can consume porn while knowing that there is a zero percent chance it was made without someone’s consent?

        • @GBU_28@lemm.ee
          -1 points · 2 years ago

          Are you actually asking?

          The gist is that LLMs find similar “chunks” of content from their training set, and assemble a response based on this similarity score (how similar the chunks are to your prompt).

          They know nothing they haven’t seen before, and the neat part is that they create new things from parts of their training data.

          Obviously they are very impressive tools, but the concern is you can easily take a model that’s designed for porn, feed it pictures of someone you want to shame, and have it generate lifelike porn of a non-porn actor.

          That, and the line around “ethical” AI porn is blurry.

          • tal
            1 point · edited · 2 years ago

            They know nothing they haven’t seen before

            Strictly speaking, you arguably don’t either. Your knowledge of the world is based on your past experiences.

            You do have more-sophisticated models than current generative AIs do, though, to construct things out of aspects of the world that you have experienced before.

            The current crop are effectively more-sophisticated than simply pasting together content – try making an image and then adding “blue hair” or something, and you can get the same hair, but recolored. And their ability to replicate artistic styles is based on commonalities in seen works, but you don’t wind up seeing chunks of material just done by that artist. But you’re right that they are considerably more limited than a human.

            Like, you have a concept of relative characteristics, and the current generative AIs do not. You can tell a human artist “make those breasts bigger”, and they can extrapolate from a model built on things they’ve seen before. The current crop of generative AIs cannot. But I expect that the first bigger-breast generative AI is going to attract users, based on a lot of what generative AIs are being used for now.

            There is also, as I understand it, some understanding of depth in images in some existing systems, but the current generative AIs don’t have a full 3d model of what they are rendering.

            But they’ll get more-sophisticated.

            I would imagine that there will be a combination of techniques. LLMs may be used, but I doubt that they will be pure LLMs.

        • @GBU_28@lemm.ee
          0 points · edited · 2 years ago

          Ok, you know it’s trained on existing imagery right?

          Sure the net new photos aren’t net new abuses, but whatever abuses went into the training set are literally represented in the product.

          To be clear, I’m not fully porn shaming here, but I wanted to clarify that these tools are informed by something already existing and can’t be fully separated from the training data.

      • @diffuselight@lemmy.world
        13 points · 2 years ago

        I just retrained an LLM on your comment you put on the public internet. Do you feel violated enough to equate it to physical violation?

        • @GBU_28@lemm.ee
          1 point · 2 years ago

          Why would I? Folks who have had real nudes of them posted on the Internet haven’t felt “physical violation” but they’ve certainly been violated.

          If you had photos of me and trained a porn generating LLM on my photos and shared porn of me, in an identifiable way, I would consider that violation.

          But simply taking my words in that simple sentence isn’t identifiable, unique, or revealing. So no.

          Further, the original point was about the ethics of AI porn. You can’t get something from nothing.

          • stevedidWHAT
            7 points · edited · 2 years ago

            I can do this right now with Photoshop, dude. What are you talking about? This just points to the need for more revenge porn laws.

            We don’t have to sit in the fire when we can crawl out. Are we still on fire? Yeah. Can we do something about that? Yeah!

            It seems like so many people these days want perfect solutions but the reality is that sometimes we have to make incremental solutions to erase the problem as much as we can.

            • @polymer@lemmy.world
              4 points · edited · 2 years ago

              And incidentally, this need for revenge porn laws is a symptomatic issue with a separate cause. Technology has always moved forward with no relation to social advancement, and there is no realistic “genie being forced back in the bottle” scenario here either.

              That being said, easier access to more powerful technology with lackluster recognition of personal responsibility doesn’t exactly bring happy prospects. lol…

              • stevedidWHAT
                1 point · 2 years ago

                Agreed, personal responsibility went out the window a long time ago. Apathy reigns supreme.

          • @afraid_of_zombies@lemmy.world
            2 points · 2 years ago

            I wouldn’t be happy about it but me not being happy about something doesn’t mean I just get an override.

            I think the boat has sailed a bit on this one. You can’t really copyright your own image and even if you were some famous person who is willing to do this and fight the legal battles you still have to go up against the fact that no one is making money off of it. You might be able to get a news source to take down that picture of you but it is another thing to make it so the camera company can’t even record you.

            But hey, I was saying for years that we need to change the laws to forbid photography of people and property without consent, and everyone yelled at me that they have the right to use a telescoping lens to view whomever they want from blocks away.

            The creeps have inherited the earth.

          • @Donjuanme@lemmy.world
            3 points · 2 years ago

            Revenge porn/blackmail/exploitation will hopefully become much less obscene, not to the “let’s not prosecute this” levels, but maybe people can stop living in fear of their lives being ruined by malicious actors (unless that’s your kink, you do you).

            It will take/drive/demand a massive cultural shift, but if you asked me which world I would rather live in, and the options are one where people are abused and exploited, or one where people can visualize their perversions more easily (but content creators have a harder time making a living) I’ll take the second. Though I may have straw-manned a bit, it’s not something I’ve thought of outside of this forum thread.

          • @diffuselight@lemmy.world
            11 points · edited · 2 years ago

            You are answering a question with a different question. LLMs don’t make pictures of your mom. And this particular question? One that has existed roughly since Photoshop existed.

            It just gets easier every year. It was already easy. You could already pay someone 15 bucks on Fiverr to do all of that, for years now.

            Nothing really new here.

            The technology is also easy. Matrix math. About as easy to ban as mp3 downloads. Never stopped anyone. It’s progress. You are a medieval knight asking to put gunpowder back into the box, but it’s clear it cannot be put back - it is already illegal to make non-consensual imagery just as it is illegal to copy books. And yet printers exist and photocopiers exist.

            Let me be very clear - accepting the reality that the technology is out there, that it’s basic, easy to replicate, and on a million computers now, is not disrespectful to victims of non-consensual imagery.

            You may not want to hear it, but just like with encryption, the only other choice society has is full surveillance of every computer to prevent people from doing “bad things”. Everything you complain about is already illegal and has already been possible - it just gets cheaper every year. What you want is protection from technological progress, because society sucks at dealing with the consequences of it.

            To be perfectly blunt, you don’t need to train any generative AI model for powerful deepfakes. You can use technology like Roop and ControlNet to synthesize any face onto any image from a single photograph. Training not necessary.

            When you look at it that way, what point is there to try to legislate training with these arguments? None.
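
            To illustrate the “training not necessary” point, this is roughly what using an off-the-shelf ControlNet looks like with the Hugging Face diffusers library (a sketch, not a recipe; the model names and pose-image URL are assumptions, and the Roop face-swap step is a separate tool that is not shown here):

            ```python
            # Sketch: pose-conditioned generation with a pretrained ControlNet.
            # Both models are downloaded and used as-is; nothing is fine-tuned.
            import torch
            from diffusers import ControlNetModel, StableDiffusionControlNetPipeline
            from diffusers.utils import load_image

            pose = load_image("https://example.com/openpose_skeleton.png")  # placeholder conditioning image

            controlnet = ControlNetModel.from_pretrained(
                "lllyasviel/sd-controlnet-openpose", torch_dtype=torch.float16
            )
            pipe = StableDiffusionControlNetPipeline.from_pretrained(
                "runwayml/stable-diffusion-v1-5", controlnet=controlnet, torch_dtype=torch.float16
            ).to("cuda")

            image = pipe("a person standing in a park, photorealistic",
                         image=pose, num_inference_steps=30).images[0]
            image.save("out.png")
            ```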

    • hh93
      13 points · 2 years ago

      Isn’t the main problem with those models that you can create porn of anyone without their consent, too?

      • stevedidWHAT
        15 points · 2 years ago

        Sex trafficking vs virtual photoshop of your face…

        Nothing new, and it’s a huge improvement over the current status quo. Not everything needs to be a perfect solution

      • @gandalf_der_12te@feddit.de
        4 points · 2 years ago

        Yeah so what. It’s not as if somebody is “sold on the market” because there’s a nude picture of them. Photoshop is not a real threat to society. We gotta stop making moral imaginations more important than physical things.

    • ax1900kr
      -39 points · 2 years ago

      hmmm sweetie but what about the only fans prostitutes? Racist much?

  • @ZombiFrancis@sh.itjust.works
    17 points · 2 years ago

    They’re also creating a lot of images of maid uniforms wearing human faces making ahegao faces while standing on massive erect penis legs.

    They post the eight images that weren’t some body horror fever dream.

    There’s a lot of human work that goes into (and has gone into) AI art generation. It’s just very obscured when all you see is the final product.

    Remember creepy people use AI. That’s also why a lot of AI stuff is or seems creepy.

    • @SCB@lemmy.world
      16 points · 2 years ago

      They’re also creating a lot of images of maid uniforms wearing human faces making ahegao faces while standing on massive erect penis legs.

      Finally there is porn for me

  • P03 Locke
    296 points · edited · 2 years ago

    There is so much wrong with just the title of this article:

    1. What marketplace? CivitAI is free. Unstable Diffusion Discord is free. Stable Diffusion is free. All of the models and LoRAs are free to download. The only cost is a video card (even a basic one) and some time to figure this shit out (see the sketch after this list).
    2. “Everyone is for sale”. No, that’s the current fucking situation, where human trafficking runs rampant throughout the sex and porn industry. AI porn is conflict-free. You don’t need to force an underage, kidnapped teenager to perform a sex act in front of a camera to create AI porn.
    3. “For Sale”. Again, where’s the sale? This shit is free.
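
    And for the skeptics, the sketch mentioned above: this is roughly all it takes to generate an image locally, for free, once the weights are downloaded (a minimal example using the diffusers library; the prompt and checkpoint name are just placeholders):

    ```python
    # Minimal sketch of free, local image generation with Stable Diffusion.
    # The only "cost" is the GPU and the one-time model download.
    import torch
    from diffusers import StableDiffusionPipeline

    pipe = StableDiffusionPipeline.from_pretrained(
        "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
    ).to("cuda")

    image = pipe("a watercolor painting of a lighthouse at dusk",
                 num_inference_steps=30).images[0]
    image.save("output.png")
    ```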

    A 404 Media investigation shows that recent developments

    Get the fuck outta here! This two-bit blog wants to call itself “a 404 Media investigation”? Maybe don’t tackle subjects you have no knowledge or expertise in.

    The Product

    Repeat: FOR FREE! No product!

    In one user’s feed, I saw eight images of the cartoon character from the children’s show Ben 10, Gwen Tennyson, in a revealing maid’s uniform. Then, nine images of her making the “ahegao” face in front of an erect penis. Then more than a dozen images of her in bed, in pajamas, with very large breasts. Earlier the same day, that user generated dozens of innocuous images of various female celebrities in the style of red carpet or fashion magazine photos. Scrolling down further, I can see the user fixate on specific celebrities and fictional characters, Disney princesses, anime characters, and actresses, each rotated through a series of images posing them in lingerie, schoolgirl uniforms, and hardcore pornography.

    Have you seen Danbooru? Or F95 Zone? This shit is out there, everywhere. Rule 34 has existed for decades. So has the literal site called “Rule 34”. You remember that whole Tifa porn video that showed up in an Italian court room? Somebody had to animate that. 3D porn artists take their donations through Patreon. Are you going to go after Patreon, too?

    These dumbasses are describing things like they’ve been living under a rock for the past 25 years, watching cable TV with no Internet access, just NOW discovered AI porn as their first vice, and decided to write an article about it to get rid of the undeserved guilt of what they found.

    What a shitty, pathetic attempt at creating some sort of moral panic.

    • @Schneemensch@programming.dev
      17 points · 2 years ago

      Just because something is free does not mean that there is no marketplace or product. Social media is generally free, but I would still call Facebook, TikTok or Instagram a product.

      Nowadays a lot of industries start out completely free, but move into paid subscription models later.

      • @Touching_Grass@lemmy.world
        1 point · 2 years ago

        You pay by giving up your free time, which they sell. Technically we’re just working for free, and the product is our attention.

        • P03 Locke
          1 point · 2 years ago

          Well, fuck, I better log off of Lemmy because it costs me too much damn money.

        • @JuxtaposedJaguar@lemmy.ml
          -2 points · 2 years ago

          People buy and sell paintings despite the fact that you could also make paintings pretty easily. You’re paying for the time they spent creating it and the expertise it required. Just because some people scan and upload their paintings for free, doesn’t mean that all paintings are not products. I don’t see why the same couldn’t be true for AI porn.

    • rhabarbaOP
      -60 points · 2 years ago

      Repeat: FOR FREE! No product!

      If it’s free, chances are you’re the product. I assume that there is a market for user-generated “prompts” somewhere.

      • P03 Locke
        97 points · 2 years ago

        No, that’s not how open source or open-source philosophies work. They share their work because they were able to download other people’s work, and sometimes people improve upon their own work.

        These aren’t corporations. You don’t need to immediately jump to capitalistic conclusions. Just jump on Unstable Diffusion Discord or CivitAI yourself. It’s all free.

        • rhabarbaOP
          -18 points · 2 years ago

          These aren’t corporations.

          I know, I know: “but the website is free” (for now). However, Civit AI, Inc. is not a loose community. There must be something that pays their bills. I wonder what it is.

            • @jeremyparker@programming.dev
              9 points · 2 years ago

              I feel like you’re implying people should look into things before making accusations. Like, find out if what they’re saying is true before they say it. And that’s why no one asked you to the prom.

          • @infamousta@sh.itjust.works
            0 points · 2 years ago

            They’re probably losing money now and just trying to build a user base as a first-mover. They accept donations and subscriptions with fairly minor benefits, but I imagine hosting and serving sizable AI models is not cheap.

            They’ll probably have to transition to paid access at some point, but I don’t see it as particularly unethical as they have bills to pay and do attempt to moderate content on the site.

            I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made. I don’t think there should be open avenues for sharing that kind of stuff online, and their rules should be better enforced.

            • Joshua Casey
              1 point · 2 years ago

              I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.

              wholeheartedly disagree. “real porn” is literally made by consenting adult performers. Hence, it’s ethical. Generating adult content of real people is (typically) done without the consent of the people involved, thereby making it unethical.

              • @infamousta@sh.itjust.works
                2 points · 2 years ago

                If you don’t think anything unethical happens in the production of porn I’m not sure what to tell you. It’s getting better but exploitation, sex trafficking, revenge porn, etc. have been a thing since pornography was invented.

                AI porn at least does not necessarily need to consider consent. Plenty of AI porn involves animated figures or photorealistic humans that don’t represent any identifiable person.

                The only hang up I have is producing images of actual people without their consent, and I don’t think it’s a new problem as photoshop has existed for a while.

                • Joshua Casey
                  -1 points · 2 years ago

                  i’m sorry to tell you but you have swallowed the propaganda from anti-porn/anti-sex work organizations like Exodus Cry and Morality in Media (who now go by the name NCOSE).

            • @aesthelete@lemmy.world
              -8 points · edited · 2 years ago

              I tend to agree generating adult content of real people is unethical, but probably less so than how a lot of real porn is made.

              Well, even if that were the case, the “real porn” is still required to train the model in the first place.

              So, it’s unethical shit on top of what you think was even more unethical.

              • @infamousta@sh.itjust.works
                1 point · 2 years ago

                Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat. But it’s a better alternative. Porn is not going anywhere. If generative AI means less real people get exploited that’s a win in my book.

                • @aesthelete@lemmy.world
                  1 point · edited · 2 years ago

                  Sure, and “impossible” meat wouldn’t have existed if people weren’t already eating actual meat

                  This comparison only holds water if impossible meat were composed of bits of rearranged animal meat… Which it isn’t.

                  If generative AI means less real people get exploited that’s a win in my book.

                  That’s not necessarily a win for everyone. Some people actually like working in the porn industry. Besides that, their likenesses are being stolen and used to produce reproductions and derivative works without consent or compensation.

                  Also, I think you and your buddies here are missing the plot. Generated porn and generated porn of real people are related but different things. I think that’s pretty commonly understood which is why these sites have policies in the first place.

        • @Sethayy@sh.itjust.works
          4 points · 2 years ago

          Maybe there are commissions for specific people/poses, ’cause I certainly couldn’t keep a hard-on long enough to generate a spank-worthy image.

    • @jeremyparker@programming.dev
      30 points · edited · 2 years ago

      The danbooru aspect of the “AI” moral panic is what annoys me.

      So many of my friends - many of whom are amateur artists - hate computer generated images because the copyright of the artists was violated, and they weren’t even asked. And I agree that does kinda suck - but - how did that happen?

      Danbooru.

      The art had already been “stolen” and was available online for free. Where was their morality then? For the last decade or whatever that danbooru has been up? Danbooru is who violated the copyright, not stable diffusion or whatever.

      At least computer generated imagery is different: the stuff it was trained on was exactly their art, while this stuff, though it might look like theirs, is unique. (And often with a unique number of fingers.)

      And, if “copyright” is their real concern, then surely they understand that copyright only protects against someone making a profit off their work, right? Surely they’ll have looked into it, and they already know that “art” made by models that used copyrighted content for training is prevented from being copyrighted itself, right? And that you can only buy/sell content made from models that are in the copyright clear, surely they know all this?

      No, of course not. They don’t give a shit about copyright, they just got the ickies from new tech.

      • @adrian783@lemmy.world
        -2 points · 2 years ago

        no one is moral panicking over ai. people just want control over their creation, whether it’s profit sharing or not being used to train models.

        you really can’t see how an imageboard has completely different considerations over image generating models?

        or that people are going after ai because there is only like a couple of models that everyone uses vs uncountable image hosts?

        both danbooru and stable diffusion could violate copyright, not one or the other.

        why would someone want training models to ingest their creation just to spit out free forgeries that they cannot claim the copyright to?

        • P03 Locke
          2 points · edited · 2 years ago

          no one is moral panicking over ai.

          This is one of the most inaccurate statements I’ve seen in 2023.

          Everybody is morally panicking over AI.

          stable diffusion could violate copyright, not one or the other.

          Or they don’t, because Stable Diffusion is a 4GB file of weights and numbers that have little to do with the content it was trained on. And, you can’t copyright a style.
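
          You can check that yourself: the checkpoint is nothing but named tensors (a quick sketch using the safetensors library; the file path is a placeholder for whichever checkpoint you downloaded):

          ```python
          # Quick sketch: open a Stable Diffusion checkpoint and total up the raw numbers.
          # There are no images inside, just tensor names, shapes, and weights.
          from safetensors import safe_open

          path = "v1-5-pruned-emaonly.safetensors"  # placeholder path
          total_params = 0
          total_bytes = 0
          with safe_open(path, framework="pt") as f:
              for name in f.keys():
                  t = f.get_tensor(name)
                  total_params += t.numel()
                  total_bytes += t.numel() * t.element_size()

          print(f"{total_params / 1e9:.2f}B parameters, {total_bytes / 1e9:.2f} GB of weights")
          ```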

        • @TwilightVulpine@lemmy.world
          4 points · 2 years ago

          Yeah. It’s pretty iffy to go “well, these other guys violated copyright so they might as well take it” as if once violated it’s all over and nobody else is liable.

    • @Send_me_nude_girls@feddit.de
      12 points · 2 years ago

      I just wanted to say I love your comment. You’re totally correct and I enjoyed the passion in your words. That’s how we’ve got to deal with shit articles more often. Thx

    • @solstice@lemmy.world
      2 points · 2 years ago

      I mean that’s kind of worse though isn’t it? The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now. Whether you gotta pay or not is beside the point. Maybe I’m misunderstanding the situation and your point though?

      • Echo Dot
        4 points · 2 years ago

        The point I got from this is that people can make porn of celebs, exes, colleagues, whoever, super easy now.

        So I can, but I could also do that without AI. People have photoshopped celebrities heads onto porn actors bodies for decades. It doesn’t happen as much now because there’s no point.

        Realistically, what has really changed except for the tools?

        • @solstice@lemmy.world
          1 point · 2 years ago

          Simplicity, barriers to entry, skill requirements? It’s kinda different to just enter a prompt like “such and such actress choking on a dildo” than to photoshop it, isn’t it? I for one don’t know how to do one but could probably figure out the other.

          Again I’m just speculating, I don’t really know.

          • @Krauerking@lemy.lol
            2 points · 2 years ago

            This is absolutely accurate. Basically, humanity is constantly reducing the cost and skill barriers for tasks and jobs. It’s weird that we’re now aggressively doing it to creative work, but that’s what has happened, and it’s producing a mess of garbage media and porn that could have existed before, just in much higher quantities and with less oversight/input from multiple people.

    • drfuzzyness
      15 points · 2 years ago

      I’m guessing that the “marketplace” and “sale” refer to sites like “Mage Space” which charge money per image generated or offer subscriptions. The article mentions that the model trainers also received a percentage of earnings off of the paid renderings using their models.

      Obviously you could run these models on your own, but my guess is that the crux of the article is about monetizing the work, rather than just training your own models and sharing the checkpoints.

      The article is somewhat interesting as it covers the topic from an outsider’s perspective more geared towards how monetization infests open sharing, but yeah the headline is kinda clickbait.

      • P03 Locke
        11 points · 2 years ago

        “Mage Space” which charge money per image generated

        Well, instead of bitching about the AI porn aspect, perhaps they should spend more time talking about how much of a scam it is to charge for AI-generated images.

          • P03 Locke
            1 point · 2 years ago

            I get no malware or shady ads when I generate AI images with Stable Diffusion. I don’t know what kind of sites or tools you’re using where you’re getting shady ads, but you’re getting ripped off.

            • @darth_helmet@sh.itjust.works
              1 point · 2 years ago

              Sure, if you have hardware and/or time to generate it client side. I’m just saying that if you run a web service and decide to charge for it, that’s better than most of the alternative monetization strategies.

          • @JuxtaposedJaguar@lemmy.ml
            6 points · edited · 2 years ago

            Also buying and eventually replacing expensive hardware. Running AI at scale requires hundreds of thousands of dollars of infrastructure.

  • @db2@sopuli.xyz
    61 points · 2 years ago

    This is not a troll: zoom in on the feet of the yellow dress image. It’s hilariously bad.

    Oh no, the realism, it’s just too much! 🤡

    • @Hamartiogonic@sopuli.xyz
      1 point · 2 years ago

      People who are into mutant porn are going to love this. No matter what your prompt is, you’re nearly guaranteed to get some horrendous mutant abomination that could be from The Thing.

    • FaceDeer
      6 points · 2 years ago

      Indeed, there is surely no demand for unrealistic porn.

    • @ThetaDev@lemm.ee
      1 point · 2 years ago

      Oh yeah. At least they got the total number of toes correct.

      I did try out one of those image generators. Wanted a picture of two girls making out in the bathroom. The index finger of one girl was fused with the collarbone of the other one.

    • Roundcat
      12 points · 2 years ago

      Click on comments hoping to find conversations on the ethics of AI porn. Instead find a 20+ comment chain scrutinizing the details of the feet and other features on the thumbnail.

      • rhabarbaOP
        8 points · 2 years ago

        Lemmy becoming Reddit went faster than I had thought.

        • Rhaedas
          7 points · 2 years ago

          People are people. Changing infrastructure isn’t going to make it different, only the ability to perhaps filter it better.

    • Bizarroland
      13 points · 2 years ago

      What’s wrong with having six toes on one foot, four toes on the other foot and your feet on backwards?

    • @krayj@sh.itjust.works
      2 points · 2 years ago

      Her left hand is looking kind of messed up also. Only 3 fingers… or maybe more than 3 fingers but only 3 knuckles.

          • Bizarroland
            5 points · 2 years ago

            You need to check yourself into some fucking rehab or something you filthy degenerate

            • @AllonzeeLV@lemmy.world
              8 points · 2 years ago

              About these rehabs, would the staff at them happen to have short sleeve shirts that showcase their elbows?

        • @rapscallion@lemmy.world
          9 points · 2 years ago

          For a couple of years after Google Autocomplete rolled out it suggested adding “feet” to the end of any search I’d make on a famous woman’s name. I honestly didn’t get it at first. I’d never searched for feet in any context, so it wasn’t a personalized thing. I chalked it up either to other women wanting to see a pair of shoes she’d worn or to some weird Autocomplete bug. I’m not prudish, but the idea that so many people were into feet that they perved Google never crossed my mind.

          • tal
            4 points · 2 years ago

            Maybe Google uses geographic location as an input, and it was just some other correlating factor, like people in your area, rather than a global trend.

        • Zima
          2 points · 2 years ago

          Thank you. I think it’s really weird. Nothing wrong with it, but it makes me uncomfortable.

          • gabe [he/him]
            17 points · 2 years ago

            How many names has stonetoss gone by at this point like what the hell

      • @db2@sopuli.xyz
        11 points · 2 years ago

        Because the hands were reasonably normal. AI always fucks up one or the other.

        • Nepenthe
          4 points · 2 years ago

          Sorry, she appears to have only three fingers and the index is kinda shaped like a thumb?

  • ivanafterall
    46 points · 2 years ago

    So I checked and nobody has put AI porn of me up for sale, yet. What the fuck, guys? Am I not desirable enough for you!?