• @RainfallSonata@lemmy.world
    link
    fedilink
    2006 months ago

    I never understood how they were useful in the first place. But that’s kind of beside the point. I assume this is referencing AI, but since you’ve only posted one photo out of apparently four, I don’t really have any idea what you’re posting about.

    • Hildegarde
      link
      fedilink
      2486 months ago

      The point of verification photos is to ensure that nsfw subreddits only host posts made with consent. Many posts were just random nudes someone found, where the subject was not okay with having them posted.

      The verification photos show an intention to upload to the sub. A former partner wanting to upload revenge porn would not have access to a verification photo. They often require the paper be crumpled to make it infeasible to photoshop.

      If an AI can generate a photorealistic verification picture, it cannot be used to verify anything.

      • @RainfallSonata@lemmy.world
        link
        fedilink
        68
        edit-2
        6 months ago

        I didn’t realize they originated with verifying nsfw content. I’d only ever seen them in otherwise text-based contexts. It seemed to me the person in the photo didn’t necessarily represent the account owner just because they were holding up a piece of paper showing the username. But if you’re matching the verification against other photos, that makes more sense.

        • @RedditWanderer@lemmy.world
          link
          fedilink
          696 months ago

          It’s been used way before the nsfw stuff and the advent of AI.

          Back in the day, if you were doing an AMA with a celeb, the picture was proof from the celeb that this is the account they’re using. It didn’t need to be their account, and it was only useful for people with an identifiable face. If you were doing an AMA because you were some specialist or professional, giving your face and username doesn’t prove anything; you need to provide paperwork to the mods.

          This is a poor way to police fake nudes though, I wouldn’t have trusted it even before AI.

      • oce 🐆
        link
        fedilink
        296 months ago

        Was it really that hard to Photoshop well enough to get past mods who aren’t experts in photo forensics?

        • @AnneBonny@lemmy.dbzer0.com
          link
          fedilink
          English
          316 months ago

          I think it takes a considerable amount of work to photoshop something written on a sheet of paper that has been crumpled up and flattened back out.

        • @psmgx@lemmy.world
          link
          fedilink
          296 months ago

          It’s mostly about filtering the low-hanging fruit, aka the low effort trolls, repost bots, and random idiots posting revenge porn.

        • @xor@sh.itjust.works
          link
          fedilink
          26 months ago

          there are a lot of tools to check whether something was photoshopped… you don’t need to be an expert to use them

          • oce 🐆
            link
            fedilink
            4
            edit-2
            6 months ago

            I tried some one day and didn’t find any that were actually easy for a noob. I remember having to check resolution, contrast, spatial frequency disruption, etc., and none of it looked easy to spot without proper training.
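
            For what it’s worth, one of the simpler checks, error level analysis, is only a few lines with Pillow, though reading the output still takes judgment; the quality setting and brightness factor here are arbitrary picks, not anything any sub actually mandates:

            ```python
            # Rough error level analysis (ELA) sketch: re-save the image as JPEG and
            # amplify the per-pixel difference. Edited regions often re-compress
            # differently, but interpreting the result still takes a trained eye.
            from PIL import Image, ImageChops, ImageEnhance

            def error_level_analysis(path, quality=90, scale=20):
                original = Image.open(path).convert("RGB")
                original.save("resaved.jpg", "JPEG", quality=quality)  # known re-compression
                resaved = Image.open("resaved.jpg")
                diff = ImageChops.difference(original, resaved)        # per-pixel error
                return ImageEnhance.Brightness(diff).enhance(scale)    # amplify for viewing

            error_level_analysis("verification.jpg").save("ela.png")
            ```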

            • @bradorsomething@ttrpg.network
              link
              fedilink
              36 months ago

              You can check whether the resolution changes across a video or photo. This can be defeated by downscaling everything to match the lowest-resolution element in the picture, but most people go with what looks best instead of a flat level.

      • DominusOfMegadeus
        link
        fedilink
        136 months ago

        On a side note, they are also used all the time for online selling and trading, as a means to verify that the seller is a real person who is in fact in possession of the object they wish to sell.

      • @trolololol@lemmy.world
        link
        fedilink
        16 months ago

        How does traditional, as in pre-AI, photo verification know the image was not manipulated? In this post the paper is super flat, and I’ve seen many others like it.

        • Hildegarde
          link
          fedilink
          106 months ago

          From reading the verification rules from /r/gonewild they require the same paper card to be photographed from different angles while being bent slightly.

          Photoshopping a card convincingly may be easy. Photoshopping a bent card held at different angles that reads as the same in every image is much more difficult.

          • stebo
            link
            fedilink
            76 months ago

            That last thing will still be difficult with AI. You can generate one image that looks convincing, but generating multiple images that are consistent? I doubt it.

              • I feel like you could do this right now by hand (if you have experience with 3d modelling) once you’ve generated an image. 3d modelling often includes creating a model from references, be they drawn or photographs.

                Plus, I just remembered that creating 3d models of everyday objects/people from photos taken at multiple angles has been a thing for a long time. You can make a setup that uses just your phone and some software to make 3d-printable models of real objects. Nothing stops someone from using a series of AI generated images instead of photos they took, so long as they can generate a consistent enough series to get a base model; any touch-up to fix what the software messed up can be done by hand. I remember a famous lady in the 3d printing space who I think used this sort of process to make a complete 3d model of her (naked) body, and then sold copies of it on her Patreon or something.

        • @KneeTitts@lemmy.world
          link
          fedilink
          English
          16 months ago

          Just ask for multiple photos of the person in the same place. AI has a hard time with temporal coherence, so in each picture the room items will change, the face will change a bit (maybe a lot), hairstyles will change… etc.
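
          A hypothetical version of that check, using the face_recognition library to compare face embeddings across the submitted photos; the filenames are made up and the 0.6 tolerance is just that library’s default, not any verification standard:

          ```python
          # Hypothetical consistency check: do all submitted photos contain the same face?
          # Plenty of real submissions would trip this too; it's a sketch, not a policy.
          import face_recognition

          def same_person(paths, tolerance=0.6):
              encodings = []
              for path in paths:
                  faces = face_recognition.face_encodings(face_recognition.load_image_file(path))
                  if len(faces) != 1:
                      return False  # zero or multiple faces is already suspicious
                  encodings.append(faces[0])
              reference = encodings[0]
              return all(face_recognition.compare_faces([reference], other, tolerance)[0]
                         for other in encodings[1:])

          print(same_person(["verify_1.jpg", "verify_2.jpg", "verify_3.jpg"]))
          ```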

    • The Picard ManeuverOP
      link
      196 months ago

      I found this singular screenshot floating around elsewhere, but yes r/stablediffusion is for AI images.

    • I had some trouble figuring out what exactly was going on as well, but the Stable Diffusion subreddit gave away that it was at least AI related, as that’s one of the popular AI programs. It wasn’t until I saw the tag, though, that I really understood: Workflow Included. Meaning the person included the steps they used to create the photo in question, which means the person in the photo was created with the AI program and is fake.

      The implications of this sort of stuff are massive too. How long until people are using AI to generate incriminating evidence to get people arrested on false charges, or the opposite: creating false evidence to get away with murder?

    • @ysjet@lemmy.world
      link
      fedilink
      English
      -176 months ago

      Pretty sure it started because nsfw subreddit mods realized they could demand naked pictures of women that nobody else had access to, and it made their little mod become a big mod.

      • chaogomu
        link
        fedilink
        346 months ago

        Verification posts go back further than Reddit.

        They were used extensively on 4chan, because they were the only way to prove that a person posting was in fact that person. And yes, it was mostly people posting nudes, but it was more that they wanted credit.

        The reason it carried on to Reddit was because people were using the accounts to advertise patreon and onlyfans, and mods mostly wanted the people making money off the pictures to be the people who took those pictures.

        Also it was useful for AMA posts and other such where a celebrity was involved.

        • @ysjet@lemmy.world
          link
          fedilink
          English
          16 months ago

          4chan was a bit different in that it was anonymous to begin with- and more to the point, it was self-volunteered verification, not a mod-driven requirement.

          As for reddit, mods were requiring private verification photos LONG before patreon and onlyfans even existed in the first place.

          AMAs, agreed.

      • @hansl@lemmy.world
        link
        fedilink
        26 months ago

        “No no it’s not about consent it’s about someone being horny” is such a bad take… and bad taste.

        • @ysjet@lemmy.world
          link
          fedilink
          English
          1
          edit-2
          6 months ago

          I hate to break this to you, but there were in fact subreddits that publicly stated they required you to privately DM mods full-body, full-face nudes in poses of the mods’ choice for verification.

          That ain’t me being in bad taste, it’s just basic observation. For some subreddits it was about verification, yes. For some it was about consent. For some it was about the mods being horny. And for most of them, it was some combination of the three.

          To pretend that it didn’t happen is… well, casual erasure of sexual misconduct of the mods, frankly.

  • Margot Robbie
    link
    fedilink
    1256 months ago

    Due to having so many people trying to impersonate me on the internet, I’ve become somewhat of an expert on verification pictures.

    You can still easily tell that this is fake because if you look closely, the details, especially the background clutter, are utterly nonsensical.

    1. The object over her right shoulder (your left), for example, looks like if someone blended a webcam with a TV with a nightstand.
    2. Over her left shoulder (your right), her chair is only on that one side and it blends into the counter in the background.
    3. Is it a table lamp or a wall mounted light?
    4. The doorframe in background behind her head is not even aligned.
    5. Her clavicles are asymmetrical, never seen that on a real person.
    6. Her wispy hair strands. Real hair doesn’t appear out of thin air in loops.
    • @Honytawk@lemmy.zip
      link
      fedilink
      446 months ago

      The point isn’t that you can spot it.

      The point is that the automated system can’t spot it.

      Or are you telling me there’s a person looking at every verification photo, and that even if there were, they’d thoroughly scan each one for imperfections?

      • Margot Robbie
        link
        fedilink
        206 months ago

        The idea of using a picture upload for automated verification is completely unviable. A much more commonly used system would be something like telling you to perform a random gesture on camera on the spot, like “turn your head slowly” or “open your mouth slowly” which would be trivial for a human to perform but near impossible for AI generators.
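
        A minimal sketch of that idea, assuming a mod bot that picks a random gesture and only accepts an upload inside a short window; the gesture list and timeout are invented for illustration:

        ```python
        # Minimal challenge-response sketch: pick a random gesture the submitter must
        # perform on camera, and reject anything uploaded after the window closes.
        import random
        import time

        GESTURES = [  # invented examples; any unpredictable instruction works
            "turn your head slowly to the left",
            "open your mouth slowly",
            "hold up three fingers next to your chin",
        ]

        def issue_challenge(window_seconds=120):
            return {"gesture": random.choice(GESTURES),
                    "expires_at": time.time() + window_seconds}

        def accept_upload(challenge, uploaded_at):
            return uploaded_at <= challenge["expires_at"]

        challenge = issue_challenge()
        print(challenge["gesture"])
        print(accept_upload(challenge, time.time()))  # True only if submitted in time
        ```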

        • @curiousPJ@lemmy.world
          link
          fedilink
          English
          236 months ago

          but near impossible for AI generators.

          …I feel like this isn’t the first time I heard that statement before.

          • Margot Robbie
            link
            fedilink
            26 months ago

            It’s not that difficult to identify if you have a good understanding of photography principles. The lighting in this image is the biggest tell for me personally, since I can’t visualize any lighting setup that could cast shadows in the directions shown in this picture; it just instinctively looks wrong to me at first sight because of the impossible light sources.

            That’s the reason the picture looks WRONG, even if you can’t identify the reason why it looks wrong.

            I only focused on the nonsensical background clutter because I think it’s easier for people who don’t work around cameras all day.

            • @ObsidianBlk@lemmy.world
              link
              fedilink
              106 months ago

              This is what makes this technology anxiety inducing at best…

              So, for yourself, you have no issues seeing the artificiality of the image due to your extensive exposure to and knowledge of photographic principles. This is fair… that said, I have read your earlier comment about the various issues with the photo as well as this one about light sources, and I keep going back to scrutinize those elements, and… for the life of me… I cannot pick out anything in the image that, to me, absolutely screams artificial.

              I’m fairly sure most people who look at these verification photos would be in a similar boat to me. Unless there’s something glaringly obvious (malformed hands, eyes in the wrong place, a sudden cthulhu-esque eldritch thing unnaturally prowling the background holding a stuffed teddy bear) I feel most people would accept an image like this at face value. Alternatively, you’ll get those same people so paranoid about AI generated fakes they’ll falsely flag a real image as fake because of one or two elements they can’t see clearly or have never seen before.

              And this is only the infancy of AI generated art. Every year it gets better. In a decade, unless there are some heavy limitations on how the AI is trained (which only public models would ever really have, since private models would be trained on whatever their developers saw fit, artists and copyright be damned), there will probably be no real way to tell a real image from a fake at all… photographic principles and all.

              Interesting times :D

        • @iegod@lemm.ee
          link
          fedilink
          12
          edit-2
          6 months ago

          near impossible for AI generators

          That’s not really the case, and moreover the gap is closing at a blistering pace. Approximately two years ago this stuff was in the distant future. One year ago the lid was blown open. Today we’re seeing real-time frame generation. This rallying against the tech is misguided. It needs to be embraced and understood. Trying to do otherwise is great folly, as everything will fall even further behind and lead to even larger misunderstandings. This isn’t theoretical. It’s already here. We can’t bury our heads in the sand.

        • @EmergMemeHologram
          link
          66 months ago

          If you look at Gaussian splatting and diffusion morphs/videos, this is merely in the space of “not broadly on Hugging Face yet”, not impossible, or even difficult depending on the gesture.

          We’re months away from fully posable and animatable 3d models of these AI images. It already exists in demos and on arXiv; it runs on consumer hardware but not in real time, so a video upload would work but a live stream would require renting a cloud GPU ($$$).

        • @smooth_tea@lemmy.world
          link
          fedilink
          56 months ago

          Having an AI act out random gestures is really not that different from generating an image based on a prompt if you think about it. The temporal element has already been done, the biggest factor right now is probably that it’s too computationally heavy to do in real time, but I can’t see that being a problem for more than a year.

      • @nova_ad_vitum@lemmy.ca
        link
        fedilink
        66 months ago

        More than that - these systems will eventually figure out how not to botch the background so obviously. Then what? As others have said, we could switch to verification videos. That buys an extra year or two.

        • @EmergMemeHologram
          link
          56 months ago

          The system doesn’t even need to get better at backgrounds, you just generate more images until one looks good.

      • oce 🐆
        link
        fedilink
        5
        edit-2
        6 months ago

        I think so. I don’t think there would be more than a few dozen verifications to do every day; with a dozen mods, that seems doable in this context. It’s not like millions of users are asking for verification every day.

      • Margot Robbie
        link
        fedilink
        27
        edit-2
        6 months ago

        That’s esteemed Academy Award nominated verification picture expert/character actress Margot Robbie to you!

        Now watch me win my Golden Globe tonight. (Still no best actress… sigh)

    • @Aganim@lemmy.world
      link
      fedilink
      23
      edit-2
      6 months ago

      Her clavicles are asymmetrical, never seen that on a real person.

      Shit, are you telling me that every time I see myself in the mirror I’m actually looking at a string of AI generated images, generated in real-time? The matrix is real. 😱

      It’s either that, or my clavicles are actually very asymmetric. ☹️

      • Margot Robbie
        link
        fedilink
        86 months ago

        What I meant is that her right clavicle (your left) is about an inch higher than her left.

        I could be wrong, of course, but I imagine if that condition actually exists, then it would be extremely painful.

        • @Gutless2615@ttrpg.network
          link
          fedilink
          English
          186 months ago

          You’re reaching. I don’t think this is “easy” to tell as you’re making it at all. You’re benefiting enormously from knowing the results before you begin extrapolating.

        • I was agreeing with everything you’ve said, but I was in a pretty nasty bike accident years back which dislocated my clavicle, and it now sits about half an inch higher, mainly on the neck side. I was freaked out at first, but the doctor said to just live with it, so it can happen.

        • @Aganim@lemmy.world
          link
          fedilink
          2
          edit-2
          6 months ago

          Yeah, I see what you mean, but my shoulders look almost exactly like that. Doesn’t hurt at all, just very annoying when carrying a backpack as the straps will always tend to slide off from my ‘drooping’ shoulder.

          But I agree with your comments about the background, that looks like a fever dream. And of course my situation isn’t the norm, so the shoulders/clavicles can be treated as a red flag; it’s just not definite proof, and care should be taken to realise some people might actually just be built weird.

    • @phoenixz@lemmy.ca
      link
      fedilink
      196 months ago

      Due to so many people trying to impersonate me on the Internet

      Yeah see, now I am not really sure if you’re the real Margot Robbie.

      Could you send me a verification picture?

      • Margot Robbie
        link
        fedilink
        156 months ago

        But then how will I astroturf (I mean, organically market) my current and future movies, like Golden Globe winning summer blockbuster, Barbie, now available on Blu-Ray and select streaming services, here if I get verified?

      • nickwitha_k (he/him)
        link
        fedilink
        16 months ago

        There’s already an AI generated one in this post (you didn’t specify that it be her or legitimate).

      • @MystikIncarnate@lemmy.ca
        link
        fedilink
        English
        36 months ago

        Me neither. There’s clearly more pictures that aren’t included here, so maybe on one of those?

        The odd thing about the hair in that picture to me is that on the left side of the photo, there’s one piece that seems to go on a nearly 90 degree bend for seemingly no reason, mid air. I don’t generally see hair get… Kinked like that. I suppose it’s not outside the realm of possibility, but it’s odd at least.

        The rest of the hair seems fine to me, but I’m no expert.

        I will note however that the object(s) in the background on the left side of the photo look like a gigantic (novelty sized) point and shoot camera from the 90’s. The box on top is the viewfinder and there’s the impression of a circle below that which would be the lens.

        Just makes me giggle at the thought of such a large disposable camera.

        • TheHarpyEagle
          link
          fedilink
          46 months ago

          Curly hair can look like that when it’s curling tightly towards/away from you. It looks fairly natural to me.

    • @problematicPanther@lemmy.world
      link
      fedilink
      176 months ago

      Every time I’d seen this photo, I only focused on the subject in the foreground. If I were the one verifying that the person in the photo is real, I’d have fallen for it. To me, the subject is entirely convincing. The issues you mentioned about the clavicles and hair, I think, kind of make it a bit more convincing. Nobody is completely symmetrical, for one, so seeing something like that, while not common, wouldn’t necessarily be uncommon. The hair, to me, just looks like normal person hair. Sometimes hair do be like that.

      • @Riven@lemmy.dbzer0.com
        link
        fedilink
        56 months ago

        Dude same, before I even read anything I was thinking ‘that’s a cute girl I didn’t know they started doing verifications on lemmy’ then I read and saw the whole hullabaloo.

    • lad
      link
      fedilink
      116 months ago

      Didn’t get the 5^th point, there’s only one clavicle visible, am I missing something?

      • @Bohurt@lemm.ee
        link
        fedilink
        116 months ago

        Even so clavicles can be asymmetrical due to previous injury. We are pretty asymmetrical overall if you look closely enough.

        • @TheFinn@discuss.tchncs.de
          link
          fedilink
          26 months ago

          Well she’s Margot Robbie, so her clavicles are symmetrical af. She probably just assumes the rest of us are like that too 😓

    • @Riven@lemmy.dbzer0.com
      link
      fedilink
      76 months ago

      An easy ‘solution’ to fix the background is to just use a mild blurring tool. They’re verifying you, not your house; it wouldn’t be sus to just have a mild, messy blur around you.

      • Margot Robbie
        link
        fedilink
        66 months ago

        The bokeh effect is surprisingly hard to fake, actually, because it has to do with the physical properties of the camera lens. I think with a light Gaussian blur it would be even less convincing.
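
        A quick way to see the difference is to compare a Gaussian blur with a crude disc-kernel blur in OpenCV: real lens bokeh turns out-of-focus highlights into hard-edged discs, which a Gaussian just smears away (the kernel sizes here are arbitrary):

        ```python
        # Compare a Gaussian blur with a crude disc-kernel "bokeh" blur.
        import cv2
        import numpy as np

        img = cv2.imread("background.jpg")

        gaussian = cv2.GaussianBlur(img, (31, 31), 0)  # smears highlights away

        radius = 15
        disc = np.zeros((2 * radius + 1, 2 * radius + 1), np.float32)
        cv2.circle(disc, (radius, radius), radius, 1, -1)  # filled circle = lens-like kernel
        disc /= disc.sum()
        bokehish = cv2.filter2D(img, -1, disc)             # keeps disc-shaped highlights

        cv2.imwrite("gaussian.jpg", gaussian)
        cv2.imwrite("bokehish.jpg", bokehish)
        ```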

    • @ManOMorphos@lemmy.world
      link
      fedilink
      36 months ago

      The “holes” on her cheeks are easy to miss but seriously unsettling close up. They’re not like freckles or blackheads but more like what termite tunnels look like in wood.

      • cheesepotatoes
        link
        fedilink
        7
        edit-2
        6 months ago

        Nah. They just look like big pores. There are a few giveaways here that it’s AI generated, but the pores aren’t one of them.

        Source: have big pores. Also, google images.

    • /home/pineapplelover
      link
      fedilink
      56 months ago

      I think AI can already do videos with people in them. Not in a way that looks completely natural, though, so there will be some discrepancies.

      • @dustyData@lemmy.world
        link
        fedilink
        English
        86 months ago

        AI video still looks like fever dreams. The AI can’t keep details consistent, especially in the background, from frame to frame. There are always parts that morph and look like they were conjured up by Van Gogh during a manic delirium. Maybe in a couple of years, and with some human grooming in the middle.

      • @Coasting0942@reddthat.com
        link
        fedilink
        16 months ago

        Its purpose is also to reduce the chances.

        If somebody is going to go to all the trouble of fooling a human, they probably aren’t going to just start spamming random pictures on the community for an instant moderator ban.

  • @HiddenLayer5@lemmy.ml
    link
    fedilink
    English
    93
    edit-2
    6 months ago

    At some point the only way to verify someone will be to do what the Klingons did to rule out changelings: Cut them and see if they bleed.

        • @SpaceCowboy@lemmy.ca
          link
          fedilink
          46 months ago

          Two seasons of stupid garbage followed by Season 3 which just ignored the garbage and made things right.

          • @SpaceCowboy@lemmy.ca
            link
            fedilink
            26 months ago

            Yeah but isn’t it just the changelings that Starfleet did all the fucked up experiments on that bleed?

            • @Im_old@lemmy.world
              link
              fedilink
              16 months ago

              technically yes, but IIRC they were pretty much the only changelings remaining since the others were wiped out in the war? I missed a few Star Trek shows/alternate timelines etc so I might be wrong.

              • @SpaceCowboy@lemmy.ca
                link
                fedilink
                16 months ago

                In the ending of DS9, Odo went to the changeling homeworld and rejoined the great link and cured them of their disease. And he kinda made them more mellowed out or something like that. In Picard S3, Worf mentions something about how his friend (obviously Odo) sent word it wasn’t those changelings involved in the plot. So Odo and all the Gamma quadrant changelings are fine, it’s just the ones that were experimented on by Starfleet that we see in Picard S3.

          • @HiddenLayer5@lemmy.ml
            link
            fedilink
            English
            16 months ago

            So they’ve gotten better at mimicking other life forms? Is that the canonical reason in the show or is Picard just going against established lore?

            • @CaptPretentious@lemmy.world
              link
              fedilink
              -26 months ago

              I only watched a little of Picard before I gave up. But if I had to guess, it’s that the writers never watched any Star Trek, so they get a lot wrong.

  • @yamanii@lemmy.world
    link
    fedilink
    876 months ago

    Can confirm, I made some random Korean dude on DALL-E to send to Instagram after it threatened to close my fake account, and it passed.

        • In the dark future, an underground market has formed to preserve the anonymity and privacy of the average person using holographic disguises of anthropomorphic figures that were in the distant past sometimes known as “furries.”

          • AceCephalon
            link
            fedilink
            76 months ago

            Ah yes, even in the dark future, furries are making super advanced and useful technologies to be more furry.

      • @ThePinkUnicorn@lemdro.id
        link
        fedilink
        English
        196 months ago

        There are projects that already exist with this sort of purpose; one I came across a while ago was DeepPrivacy, which uses deepfakes to replace your face and body in an image with AI-generated ones.

    • @pythonoob@programming.dev
      link
      fedilink
      126 months ago

      I’ve had an AI-generated mix between my face and an actor’s as my Facebook profile pic for a little over a year now, I think, or close to it, and only my sister has called me out on it.

    • Ook the Librarian
      link
      fedilink
      76 months ago

      I’m in the same boat. I basically want to wear an ai mask. I don’t like cartoon face trackers or similar. I don’t have the hardware to render a video though, and I’m not going to buy server time.

    • Rikudou_Sage
      link
      fedilink
      26 months ago

      Smart! Just today I took a very funny photo, but don’t want to break the anonymity of anyone in there.

    • Turun
      link
      fedilink
      26 months ago

      Google AUTOMATIC1111; it’s the program to run if you want to generate AI images. You can put in the original photo, use the built-in editor, and request the face of a pretty man/woman/elephant (for all I care), and it’ll generate a face and merge it with the surrounding image perfectly.

      Requires a graphics card with a few gigabytes of VRAM though, so there is a certain hardware requirement if you want to do this locally.
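
      If you’d rather script it than use the web UI, roughly the same inpainting workflow can be sketched with Hugging Face’s diffusers library; the model name, file names, and prompt below are only examples:

      ```python
      # Rough inpainting sketch with diffusers: mask the face region and let the
      # model repaint it from a prompt, blended with the surrounding image.
      import torch
      from PIL import Image
      from diffusers import StableDiffusionInpaintPipeline

      pipe = StableDiffusionInpaintPipeline.from_pretrained(
          "runwayml/stable-diffusion-inpainting", torch_dtype=torch.float16
      ).to("cuda")  # needs a GPU with a few gigabytes of VRAM

      image = Image.open("original.jpg").convert("RGB").resize((512, 512))
      mask = Image.open("face_mask.png").convert("L").resize((512, 512))  # white = repaint

      result = pipe(prompt="photo of a smiling man, realistic face",
                    image=image, mask_image=mask).images[0]
      result.save("swapped.jpg")
      ```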

    • @AA5B@lemmy.world
      link
      fedilink
      16 months ago

      I really like “Bitmoji” on my iPhone as an interesting start in that direction. I can create my avatar, whether as similar to me or not, and use it as a filter on FaceTime where it follows a lot of my actual movement and expressions

        • Riskable
          link
          fedilink
          English
          86 months ago

          Example: When famous people do an AMA. E.g. Obama did an AMA on Reddit and he was verified with a photo that would be very easy to fake today using AI.

        • @EnderMB@lemmy.world
          link
          fedilink
          36 months ago

          There are plenty of people making Instagram and OF accounts of fully AI girls. They are hilariously fake, but looking at some of the thirsty comments their posts get, I’m inclined to say that subreddits like /r/AmIUgly and /r/rateme are likely to end up with lots of verification posts that result in lots of scams.

          Although, as already pointed out, verification posts have always been easy for people to scam with Photoshop.

    • Nora
      link
      fedilink
      10
      edit-2
      6 months ago

      Micro communities based on connections from before the post-truth era? Only allowing people into the community who can be confirmed by others.

      I’ve been thinking of starting a Matrix community to get away from Discord and its inevitable botting.

  • Striker
    link
    fedilink
    336 months ago

    Isn’t there a trick where you can ask someone to do a specific hand gesture to get photos verified? That’ll still work, especially because AI makes fingers look wonky.

    • @fidodo@lemmy.world
      link
      fedilink
      676 months ago

      AI has been able to do fingers for months now. It’s moving very rapidly so it’s hard to keep up. It doesn’t do them perfectly 100% of the time, but that doesn’t matter since you can just regenerate it until it gets it right.

      • @YoorWeb@lemmy.world
        link
        fedilink
        15
        edit-2
        6 months ago

        “For your verification please close left eye and run two fingers through your hair while eating a cauliflower with whipped cream. Attach a paperclip to your left ear and write your username on your forehead using an orange marker.”

        • lad
          link
          fedilink
          106 months ago

          And then if the user actually does everything requested, the photo must be AI generated, because nobody in their right mind would do all of that 😂

      • @Paradachshund@lemmy.today
        link
        fedilink
        146 months ago

        You could probably just set up a time for the person to send a photo, and then give them a keyword to write on the paper, and they must send it within a very short time. Combine that with a weird gesture and it’s going to be hard to get a convincing AI replica. Add another layer of difficulty and require photos from multiple angles doing the same things.
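
        A sketch of the keyword-plus-deadline part, assuming the mod side signs the keyword and expiry so it can’t be guessed in advance or reused later; the secret, usernames, and time window are invented:

        ```python
        # Sketch of a keyword-and-deadline challenge: the keyword is random, tied to a
        # deadline, and signed, so it can't be prepared ahead of time or reused.
        import hashlib
        import hmac
        import secrets
        import time

        SECRET = b"mod-team-secret"  # invented; would live in the mod bot's config

        def new_challenge(username, window_seconds=300):
            keyword = secrets.token_hex(4)              # e.g. "9f3ab21c", written on the paper
            deadline = int(time.time()) + window_seconds
            msg = f"{username}:{keyword}:{deadline}".encode()
            sig = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            return keyword, deadline, sig

        def check_submission(username, keyword, deadline, sig, submitted_at):
            msg = f"{username}:{keyword}:{deadline}".encode()
            expected = hmac.new(SECRET, msg, hashlib.sha256).hexdigest()
            return hmac.compare_digest(sig, expected) and submitted_at <= deadline
        ```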

        • @Vampiric_Luma@lemmy.ca
          link
          fedilink
          96 months ago

          LoRAs can be supplied to the AI. These are small add-on models trained on specific ideas like certain hand gestures, lighting levels, whatever style you need; you can fine-tune the general model with LoRAs (see the sketch at the end of this comment).

          I have the minimum requirements to produce art and HQ output takes 2 minutes. Low-quality only takes seconds. I can fine-tune my art on a LQ level, then use the AI to upscale it back to HQ. This is me being desperate, too, using only local software and my own hardware.

          Do this through a service or a gpu farm and you can spit it out much quicker. The services I’ve used are easy to figure out and do great work for free* in a lot of cases.

          I think these suggestions will certainly be barriers, and I can think of some more stop-gaps, but they won’t stop everyone from slipping through the cracks, especially as passionate individuals who hyper-focus on this technology (which the rest of us only think about in passing) keep working on it.
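
          For what it’s worth, here is roughly what applying a LoRA looks like with the diffusers library instead of a local UI; the model ID is just an example and the LoRA file name is made up:

          ```python
          # Sketch: apply a LoRA (e.g. one trained on a specific hand gesture or style)
          # on top of a base Stable Diffusion model. The LoRA file name is invented.
          import torch
          from diffusers import StableDiffusionPipeline

          pipe = StableDiffusionPipeline.from_pretrained(
              "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
          ).to("cuda")
          pipe.load_lora_weights("./loras", weight_name="holding_sign_gesture.safetensors")

          image = pipe("webcam photo of a person holding a handwritten sign").images[0]
          image.save("lora_test.png")
          ```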

        • @ExperimentalGuy@programming.dev
          link
          fedilink
          2
          edit-2
          5 months ago

          I feel like there’s a way to get around that… Like, if you really wanted, some sort of system to Photoshop the keyword onto the piece of paper. This would allow you to generate the image but also not have to worry about the AI generating the keyword itself.

          Edit: also, does anyone remember that one paper about a new AI architecture where you could put in some sort of negative image to additionally prompt an AI for a specific shape, output, or position?

          • @Unkn8wn69@monero.town
            link
            fedilink
            26 months ago

            Just write on paper and overlay it via Photoshop. Photopea has a literal one-button-click function for that; very easy to do. Just a blank paper and a picture with enough light. Very easy.

    • Bob
      link
      fedilink
      -226 months ago

      Some AI models have already nailed the fingers, so this won’t do anything. We need something that we can verify without having to trust the other person. I hate to say it, but the blockchain might be one of the best ways to authenticate users to avoid bots.

  • qaz
    link
    fedilink
    276 months ago

    That’s why you need a video with movement. AI still can’t do video right.

  • 🇰 🔵 🇱 🇦 🇳 🇦 🇰 ℹ️
    link
    fedilink
    English
    25
    edit-2
    6 months ago

    My discord friends had some easy ways to defeat this.

    You could require multiple photos; it’s pretty hard to get AI to consistently generate photos that are 100% perfect. There are bound to be things wrong when trying to get AI to generate multiple photos of the same (non-celeb) person, which would make it obvious it’s fake.

    Another idea was to make it a short video instead of a still photo. For now, at least, AI absolutely sucks balls at making video.

    • Jvrava9
      link
      fedilink
      76 months ago

      Just wait a few months for video. As for consistency in photos, just use LoRAs and seeds.
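
      The seed half of that is just fixing the generator, e.g. with diffusers: the same seed and prompt reproduce the same image, which is the starting point people use when chasing a “consistent” set (the model and seed here are arbitrary):

      ```python
      # Fixed-seed sketch: the same seed + prompt reproduces the same image,
      # which is what people lean on when chasing consistency across a set.
      import torch
      from diffusers import StableDiffusionPipeline

      pipe = StableDiffusionPipeline.from_pretrained(
          "runwayml/stable-diffusion-v1-5", torch_dtype=torch.float16
      ).to("cuda")

      generator = torch.Generator("cuda").manual_seed(1234)  # arbitrary but fixed seed
      image = pipe("webcam photo of a woman holding a paper sign",
                   generator=generator).images[0]
      image.save("seeded.png")
      ```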