You can’t use AI to ‘zoom and enhance’ that grainy photo of Kate Middleton



The internet’s latest obsession is Kate Middleton, or more specifically, her whereabouts following an unexpected surgery in January. Despite the initial announcement that the princess would not resume her duties until Easter, the world couldn’t stop speculating and theorizing about Kate’s health and the state of her marriage to Prince William. It didn’t help that the only photos of the princess published since then were, let’s say, not definitive: grainy shots taken from a distance, and, of course, the infamous family photo that was later discovered to have been manipulated. (A post on X, formerly Twitter, attributed to Kate later apologized for the edited photo.)

Finally, on Monday, The Sun published a video of Kate and William walking through a farm shop, seemingly putting the matter to bed. However, the video did little to satisfy the most ardent conspiracy theorists, who believe the footage is of too low a quality to confirm whether or not the woman walking is indeed the princess.

In fact, some go so far as to suggest that the woman we can see is not Kate Middleton at all. To prove it, they have turned to AI photo-enhancement software to sharpen the video’s pixelated frames and discover once and for all who has been walking alongside the future King of England:

There you have it, folks: that woman is no Kate Middleton. She’s this woman instead. Case closed! Or wait, the woman in the video is actually this woman:

Hmm, probably not. Jeez, these results aren’t consistent at all.

That’s because these AI “enhancement” programs aren’t doing what users think they’re doing. None of the results prove that the woman in the video is not Kate Middleton. They just prove that AI can’t tell you what a pixelated person actually looks like.

I don’t necessarily blame anyone who thinks AI has this power. After all, we’ve seen AI image and video generators do extraordinary things over the past year: if something like Midjourney can render a realistic scene in seconds, or OpenAI’s Sora can create a realistic video of a non-existent dog playing in the snow, why can’t a program sharpen a blurry image and show us who’s behind those pixels?

AI is only as good as the information it has

See, when you ask an AI program to “enhance” a blurry image, or, for that matter, to generate additional parts of an image, you’re really asking it to add more information to the picture. Digital images, after all, are just ones and zeros, and showing more detail in someone’s face requires more information. But AI can’t look at a blurred face and, through sheer computing power, “know” who’s really there. All it can do is take the information it has and guess what should really be there.
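If you want to see that limit for yourself, here’s a minimal sketch in Python using the Pillow imaging library (the file name face.jpg is just a hypothetical stand-in for any detailed photo). Shrinking an image throws pixel values away, and no amount of conventional upscaling gets them back; the best any resampling method can do is smoothly interpolate between the pixels that survived.

```python
# Minimal sketch: once pixels are discarded, ordinary "enhancement" can only
# interpolate between what's left; the original detail is gone for good.
from PIL import Image

# "face.jpg" is a hypothetical stand-in for any reasonably detailed photo.
original = Image.open("face.jpg")

# Simulate a low-quality, pixelated crop: shrink the image to 32x32 pixels.
tiny = original.resize((32, 32), resample=Image.Resampling.NEAREST)

# "Enhance" it back to the original size. Bicubic interpolation just smooths
# between the 1,024 pixel values we kept; everything discarded stays lost.
upscaled = tiny.resize(original.size, resample=Image.Resampling.BICUBIC)

upscaled.save("enhanced.jpg")  # still a blur: the detail no longer exists
```

An AI “enhancer” faced with that same 32x32 thumbnail doesn’t just interpolate, though; it invents plausible pixels to fill the gap, which is where the guessing comes in.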

So, in the case of this video, the AI programs run through the pixels of the woman in question and, based on their training sets, add detail to the image according to what they think should be there, not what actually is there. That’s why you get such different results every run, and often terrible ones to boot. It’s just guessing.
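To make the guessing concrete, here’s a hedged sketch of what running a generative upscaler twice might look like. It assumes you have Hugging Face’s diffusers library, PyTorch with a CUDA GPU, and the public stabilityai/stable-diffusion-x4-upscaler checkpoint; blurry_face.png is a hypothetical input, and any generative upscaler behaves similarly. The point isn’t this particular model: it’s that the same blurry input, run with two different random seeds, produces two different invented faces.

```python
# Sketch of generative "enhancement" as guessing: same low-res input,
# different random seeds, different made-up faces.
import torch
from PIL import Image
from diffusers import StableDiffusionUpscalePipeline

pipe = StableDiffusionUpscalePipeline.from_pretrained(
    "stabilityai/stable-diffusion-x4-upscaler",  # assumed public checkpoint
    torch_dtype=torch.float16,
).to("cuda")

low_res = Image.open("blurry_face.png").convert("RGB")  # hypothetical input

# Two runs, two seeds, two different "enhanced" people.
for seed in (0, 1):
    generator = torch.Generator("cuda").manual_seed(seed)
    result = pipe(
        prompt="a photo of a woman",  # the model is steered by text, not truth
        image=low_res,
        generator=generator,
    ).images[0]
    result.save(f"guess_{seed}.png")
```

Neither output is evidence of anything; they’re just two samples of what the model thinks a face like that could look like.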

Jason Koebler of 404 Media offers a great demonstration of how these tools fall short. Not only did Koebler try programs like Fotor and Remini on The Sun’s video, with results as terrible as everyone else’s, he also ran them on a blurry photo of himself. The results, as you can guess, looked nothing like him. So, clearly, Jason Koebler is missing, and an imposter has taken over his role at 404 Media. #Koeblergate.

Now, some AI programs are better at this than others, but usually only in specific use cases. Again, these programs are filling in data based on what they think should be there, so they work well when the answer is obvious. Take Samsung’s “Space Zoom,” which the company advertised as being able to take high-quality photos of the moon: it was using AI to fill in that missing data. Your Galaxy phone snaps a picture of a blurry moon, and the AI fills in the gaps with details from images of the actual moon.

But the moon is one thing; a specific face is another. Sure, if you had a program like “Kate AI” trained entirely on photos of Kate Middleton, it could probably turn a pixelated female face into Kate Middleton, but only because it was trained to do exactly that. It still wouldn’t tell you whether the person in the photo actually was Kate Middleton. As it stands, there is no AI program that can “zoom and enhance” to reveal who a pixelated face really belongs to. If there isn’t enough data in the image to tell who’s really there, then there isn’t enough data for the AI, either.
