After frenzied speculation that a recent photo of Kate Middleton and her children was AI-generated, the Princess of Wales herself had to address the controversy.
“Like many amateur photographers, I do occasionally experiment with editing,” Middleton wrote in a statement. “I wanted to express my apologies for any confusion the family photograph we shared yesterday caused.”
There’s something odd about the photo, possibly that you have four people staring directly into your soul and smiling at you. But then there are the hands. It’s the right number of hands (eight for four people), but they all look a bit off. We can see Kate’s hands, but her arms are hidden behind two of the children; one hand looks slightly blurry, and the other is missing a wedding ring. Charlotte’s wrist looks like it blends into a second wrist that isn’t there, and the sleeve of her cardigan merges into another grey cardigan that doesn’t seem to be in the photo. And Louis is doing something strange with one of his hands that probably has more to do with kids being weird than with editing. But the longer you look at the photo, the creepier it gets.
To make things more suspicious, the Royal Family’s fans have been speculating about the Princess’s recent absence from the public eye. The Royal Family announced in January that Middleton was undergoing planned abdominal surgery in London; after a few weeks in the hospital, she was discharged, and the family reported that she was still recovering but doing well. But Middleton has been home for more than a month and still hasn’t made any public appearances; for a member of the Royal Family, that’s not normal.
As someone who had Instagram in high school, I get it. There’s a world of temptation out there, from VSCO to Facetune to Canva, and it’s so easy to just erase away an inconveniently placed zit… but you might get caught. Princesses: they’re just like us! Back then, AI felt like science fiction, and you still had to use Photoshop to remove the background of an image. Royal commentators and fans probably would have pointed out the weirdness of the children’s fingers in the photo, or the area near Charlotte’s elbow where something seems to have gone wrong with a content-aware fill. But we wouldn’t have spun up a conspiracy theory that the entire image was a synthetic psyop created by Buckingham Palace.
Rumors about Middleton’s absence have sparked increasingly dubious explanations. Page Six reported on unfounded speculation that the Princess got a Brazilian Butt Lift, while others joked that Middleton had something to do with that Willy Wonka pop-up gone wrong. One tweet even joked that Middleton might be Banksy. So when official Royal Family accounts published the suspicious-looking photo of Kate and her children, the internet had a field day. The discourse around the photo got so out of hand that the Associated Press issued a photo “kill,” asking news outlets to take the picture down.
It’s not clear what tools the Princess used to edit the photo. An app like Facetune can remove blemishes or adjust a photo’s brightness, but it won’t create a phantom sleeve beneath Charlotte’s elbow. Some retouching tools, like Photoshop’s content-aware fill or a clone brush, do use elements of the photo to create something that wasn’t originally there. But those aren’t the kinds of photo editing tools people use when they’re trying to make themselves look Instagram-ready; they’re what you use to edit out a random guy in the background of your beach photo.
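For the curious, here is a minimal sketch of that “fill from the surrounding pixels” style of retouching, using OpenCV’s inpainting as a rough stand-in for content-aware fill (the file names and mask coordinates are made up for illustration):

```python
import cv2
import numpy as np

# Load the photo and mark the region to erase (say, a stray beachgoer in the
# background). White pixels in the mask are the area the algorithm will rebuild.
photo = cv2.imread("beach_photo.jpg")
mask = np.zeros(photo.shape[:2], dtype=np.uint8)
cv2.rectangle(mask, (400, 150), (480, 320), 255, -1)

# Inpainting synthesizes the masked region from neighboring pixels -- the same
# class of guesswork that can leave smeared edges or phantom fabric behind
# when it fills in something that was never really there.
result = cv2.inpaint(photo, mask, 3, cv2.INPAINT_TELEA)
cv2.imwrite("beach_photo_retouched.jpg", result)
```

It’s a crude approximation of what Photoshop does, but it gives a sense of how “editing” a photo can quietly shade into generating parts of it.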
Even British celebrities like Piers Morgan have weighed in, raising the question of why the Royal Family won’t quash the conspiracy theories by just releasing the unedited photo.
As AI-powered image generation becomes mainstream, we’re losing our grip on reality. In a time when any image can be fake, how can we know what’s actually real? There are some tell-tale signs, like an abnormal number of fingers, or an earring on one ear but not the other (though that could also be a style choice; you know it when you see it). But as AI gets better and more widespread, those tells become less reliable. A recent study from the Center for Countering Digital Hate found that deepfake images about elections have been rising on X (formerly Twitter) by an average of 130% per month. Speculation about a missing Princess isn’t going to sway an election, but this incident shows that people are finding it harder and harder to distinguish fact from fiction.
It’s a good thing the public was so skeptical of Middleton’s sketchy photo, since she admitted it was edited. Family photos are always awkward, but at least ours probably won’t spark an international debate.