Kate Middleton’s Doctored Photo Is A Sign Of Something Much More Serious

Trust in photos is eroding fast, and tech is to blame.

Doctored image of Kate Middleton and her children for Mother’s Day | Prince and Princess of Wales/Instagram; Lunarts Studio, Billion Photos via Canva

Kate Middleton, the Princess of Wales, recently caused a stir in media circles when she published a UK Mother’s Day photo taken by Prince William. The photo shows Kate Middleton with her children in a happy, smiling moment. Several news agencies picked up the photo and distributed it widely.

The problem? It turns out the photo was faked — or at least heavily edited.

Kate Middleton eventually addressed the controversy, apologizing and stating, “Like many amateur photographers, I do occasionally experiment with editing.”

The whole saga is a bit embarrassing for the royal family. But it’s also an indicator of a much bigger problem in the world of photography — and one that AI is about to make exponentially worse.

RELATED: An Astrologer Analyzes Kate Middleton's Birth Chart & Theorizes Where She Really Is

High standards

News photographers like me have always taken the veracity of images extremely seriously. Professional organizations for the trade, such as the National Press Photographers Association (NPPA), maintain stringent editorial guidelines that ensure news photos are truthful. Likewise, big agencies like Getty Images and the Associated Press have similar standards of their own.

Most people don’t realize how strict these standards are. They essentially prohibit any editing of an image beyond simple steps like correcting color balance or making small crops.

Selectively editing a photo, removing anything from the frame, or even cropping out an essential part of the image is strictly forbidden. Working within these guidelines can be a challenge, but it’s part of what makes photojournalistic images trustworthy.

Detecting manipulated images using technological means is difficult or impossible. So instead, our industry has focused on ensuring trust by creating rigorous codes of ethics and only accepting images from trusted photographers who abide by them consistently over years or decades.

Editing the truth

Those high professional standards, though, don’t apply to amateur photographers. And as Princess Kate suggests, many amateurs edit their photos enthusiastically.

More concerning still, the emergence of new AI tools means amateur photographers can easily doctor their images without even realizing they’re doing it.

A decade ago, making selective edits to a photo required knowledge of a tool like Photoshop or Adobe Lightroom. It was a very deliberate act: you had to open the photo in one of these programs, decide what you wanted to change, and go in at the pixel level to make your alterations.

With the rise of generative AI, however, amateur photographers can make powerful edits with the click of a button.

Google’s popular Pixel phones, for example, include a feature called Magic Eraser. The idea is that if you see something in the background of your photo that you don’t like, you can simply tap it and have it instantly disappear.

Commercials for this capability show people cheerfully deleting photobombers from the background of their vacation shots or removing unsightly objects from selfies.
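
What makes these tools feel effortless is that the hard part is automated. As a rough illustration of the underlying idea, the hedged sketch below uses OpenCV’s classical inpainting in Python to fill in a masked region of a photo. This is not how Google’s generative Magic Eraser works internally, but it shows how a one-tap removal boils down to two steps: pick the pixels to erase, then synthesize plausible content to fill the hole. The file names and mask coordinates are placeholders.

```python
# Illustrative sketch only: classical inpainting with OpenCV, not the
# generative models used by phone features like Magic Eraser.
import cv2
import numpy as np

photo = cv2.imread("vacation.jpg")                 # original photo (BGR)
mask = np.zeros(photo.shape[:2], dtype=np.uint8)   # single-channel mask

# Mark the pixels to "erase" -- e.g., a photobomber in the background.
# In a phone app, this mask would come from a tap or a segmentation model.
mask[200:420, 650:780] = 255

# Fill the masked region with plausible content from surrounding pixels.
cleaned = cv2.inpaint(photo, mask, 5, cv2.INPAINT_TELEA)

cv2.imwrite("vacation_edited.jpg", cleaned)        # saved with no obvious trace
```

Notice that the saved file carries no obvious trace of the removal, which is exactly the kind of quiet alteration that professional standards are designed to prevent.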

Many camera apps now include filters that make similar edits for the user before the photo is even taken. This might include touching up the appearance of the subject, changing the lighting, or even swapping a dreary overcast sky for a sunny one filled with cheerful white clouds.

The casual nature of these edits would make any experienced photojournalist cringe. What users might see as improving a photo would be considered, in the journalism industry, an inexcusable alteration of the truth of a scene. The kind of edits that you can quickly do on your phone with a couple of button presses could easily get a news photographer blacklisted from a publication.

RELATED: How To Tell If A Picture Is Photoshopped & Save Your Self-Esteem

The root of the problem

Kate Middleton’s high-profile snafu is a perfect example of how casual photo manipulation can become destructive. Middleton probably didn’t intend to deceive anyone by making small edits to her photo. But as her example shows, we don’t always know where our photos will end up.

A photo that feels like a casual snapshot today could turn out to be interpreted through a historical lens tomorrow. My own agency’s archives are filled with examples of vernacular family photos that — decades later — help us to determine valuable historical information. Visual clues in amateur photos can reveal important information about architecture, clothing, foodways, and more.

We can trust the veracity of these photos in part because, at the time, heavy photo editing was challenging. It certainly did happen — we’ve seen news photos from the 1930s where a photographer literally cut and pasted a subject from one photo into another with scissors and glue! But for the most part, family snapshots and historical news photos weren’t subject to heavy editing.

No longer. Especially as AI continues to advance and is integrated natively into more and more apps and devices, it will be nearly impossible to verify that a photo hasn’t been manipulated.

Middleton’s edits were obvious in part because she did a sloppy job. Had she used today’s more advanced AI editing apps, it’s unlikely anyone would have known the photo was manipulated.

The undetectability of AI-powered edits threatens to render any photo taken by any photographer untrustworthy.

RELATED: Veteran Publicist Breaks Down How The Royal Family's Statements Are 'Evidence' That 'Something Happened' To Kate Middleton

Embracing verifiability

There are a couple of ways to combat this. One is to digitally sign a photo the instant that it’s taken.

Several high-end camera manufacturers are working with coalitions of publishers and other stakeholders to build digital signing technology into their cameras. With this tech, as soon as a photographer takes a picture, a digital fingerprint is captured along with the image. Later, editors and other stakeholders can look back at that fingerprint to verify that no changes have been made to the photo or to see the nature of any changes made in participating software, like Adobe Photoshop.
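
To make that concrete, here is a minimal, hypothetical sketch of capture-time signing in Python, assuming an Ed25519 key pair held by the camera and using the widely available cryptography library. Real provenance systems, such as the industry’s Content Credentials (C2PA) approach, embed a richer signed manifest in the file itself and track edit history, but the underlying cryptographic check is similar.

```python
# Minimal sketch of capture-time signing and later verification, assuming
# an Ed25519 key pair held by the camera. Real provenance systems embed a
# standardized, signed manifest in the image file; this shows only the
# core idea. The file name is a placeholder.
from cryptography.hazmat.primitives.asymmetric.ed25519 import Ed25519PrivateKey
from cryptography.exceptions import InvalidSignature

# At capture time (conceptually, inside the camera firmware):
camera_key = Ed25519PrivateKey.generate()
image_bytes = open("photo.jpg", "rb").read()
signature = camera_key.sign(image_bytes)        # stored alongside the image

# Later, an editor checks the file against the camera's public key:
public_key = camera_key.public_key()
try:
    public_key.verify(signature, open("photo.jpg", "rb").read())
    print("Image matches what the camera signed.")
except InvalidSignature:
    print("Image has been altered since capture (or was signed by another key).")
```

In a sketch like this, changing even a single pixel invalidates the signature outright; the participating editing software described above instead appends its own signed record of each change, which is what lets editors see the nature of any edits rather than just a pass/fail result.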

Another — perhaps more extreme — solution is to return to analog technologies that are harder to alter. I still shoot a lot of my professional work on film, in part because that way, I always have a physical negative that can be used to verify the authenticity of the image.

For amateur photographers like the Princess, though, the lesson is a bit simpler. Just because you can use AI or other means to edit a photo, that doesn’t mean you should.

The veracity of photos is incredibly important. Especially as nefarious deepfakes become more common, ensuring that day-to-day images are truthful and unaltered will become ever more crucial. Edits that seem innocent in the moment can irrevocably change the nature of an image. And no one knows which photos will be historical in the future.

The casual snapshot of your college friend will take on a new character in the future if they become a powerful CEO or political leader. You might be capturing history on a daily basis, without necessarily knowing that you’re doing it.

Ultimately, Middleton’s photo is more a source of amusement (and perhaps fodder for royal conspiracy theories) than anything darker. But with many photos, that won’t be the case.

When changing the nature of photos becomes casual and routine, it opens the door for truly nefarious photos to slip under people’s radar. Casual alteration also makes it far easier for people to simply doubt the veracity of every photo they see, which deprives the world of an important way to capture history, document atrocities, and tell truthful visual stories.

Sure, your vacation snapshots probably look better if you remove the random surfer from the background, or maybe throw on some digital makeup. But if you resist the urge to manipulate and instead embrace the realism in your images, you can take a small but powerful stand against the kind of deepfakes and alterations that genuinely do cause lasting harm.

RELATED: Videos Throughout Prince William's Life Show Alleged Anger Issues That Reportedly Concerned His Family

Thomas Smith is a professional photographer and editor of the Bay Area Telegraph, a publication focusing on San Francisco Bay Area food and culture.