After Meta's "Made with AI" labels began appearing where they shouldn't have, a number of users voiced concerns.
Meta's AI labeling project may be making it harder to distinguish AI-generated images from original photographs, at a time when doing so is already increasingly difficult.
According to a TechCrunch report, some users have accused the tech giant of mistakenly labeling their posts as "Made with AI" even though the images were not created using AI.
For instance, a basketball image posted on Instagram by former White House photographer Pete Souza was labeled as "Made with AI." "In contrast to my earlier posts, I processed this film a few days following the match. Why Instagram shows 'Made with AI' on my post is beyond me. My photos don't use artificial intelligence," Souza emphasized in the description.
One Instagram user said that an image of their grandparents carried the AI label, while another claimed the tag appeared on a picture of herself wearing a costume. Nearer to home, several images on the official Kolkata Knight Riders account have been tagged as "Made with AI," one of them a photo of the cricket team winning the Indian Premier League (IPL) in 2024.
Meta announced that it would be labeling AI-generated content and manipulated media on Facebook, Instagram, and Threads ahead of the Lok Sabha elections. Still, it seems the Mark Zuckerberg-led company's project has a long way to go. The posts that were purportedly misidentified as AI-generated appear that way only in mobile view. Furthermore, the label seems to have been applied inconsistently across platforms: the identical IPL photo carried no tag on Facebook.
In addition to mislabeling original photos as AI-generated, Meta is also appending labels to images that may have been altered with AI tools such as Adobe's Generative Fill. According to reports, Meta can flag photos as AI-generated even when they have been only marginally altered with AI features.
"Our goal has always been to make it easier for consumers to identify material created with artificial intelligence," a Meta representative told TechCrunch, adding, "We are considering recent feedback and continue to evaluate our approach so that our labels reflect the amount of AI used in an image."
How does Meta recognize and label AI-generated material?
Meta currently uses two methods to recognize and label content created by artificial intelligence. First, it asks users to disclose AI-generated or AI-modified photorealistic video and realistic-sounding audio. This could be a reel with a convincing AI-generated voiceover, or an AI-generated video depicting people strolling through an outdoor market. Meta warns users that they may face penalties if they fail to label such content.
However, users who publish non-photorealistic AI-generated content, such as a cartoon-style video of an outdoor scene, are exempt from these labeling rules.
Meta stated that it will not only allow users to self-disclose AI-generated content but also detect AI usage automatically. "Any work created with AI will bear the label 'Made with AI,' provided it carries industry-standard signals indicating its AI generation," the blog post stated. "This includes content that is created or edited using third-party AI tools."
If the mislabeled images were merely edited with AI tools, this could imply that Meta's detection tools are analyzing the metadata embedded in pictures to flag them as AI creations.
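Meta has not published its detection logic, but the industry-standard signals it references typically live in an image's embedded metadata. The IPTC Photo Metadata Standard, for example, defines a DigitalSourceType vocabulary with values such as "trainedAlgorithmicMedia" (fully AI-generated) and "compositeWithTrainedAlgorithmicMedia" (AI-assisted edits, the kind Adobe's Generative Fill writes). As a purely illustrative sketch, assuming a platform simply scans the XMP packet for these markers, a detector might look like this:

```python
# Hypothetical sketch of metadata-based AI detection. The marker values
# below are real IPTC DigitalSourceType terms; the scanning approach and
# function names are assumptions, not Meta's actual implementation.

AI_SOURCE_MARKERS = (
    "trainedAlgorithmicMedia",               # fully generated by AI
    "compositeWithTrainedAlgorithmicMedia",  # partially edited with AI
)

def looks_ai_touched(xmp_packet: str) -> bool:
    """Return True if the XMP metadata carries an AI-provenance marker."""
    return any(marker in xmp_packet for marker in AI_SOURCE_MARKERS)

# Abridged, illustrative XMP fragments
edited = ('<xmp:DigitalSourceType>http://cv.iptc.org/newscodes/'
          'digitalsourcetype/compositeWithTrainedAlgorithmicMedia'
          '</xmp:DigitalSourceType>')
camera = ('<xmp:DigitalSourceType>http://cv.iptc.org/newscodes/'
          'digitalsourcetype/digitalCapture</xmp:DigitalSourceType>')

print(looks_ai_touched(edited))  # True: a Generative Fill-style edit is flagged
print(looks_ai_touched(camera))  # False: straight-from-camera photo passes
```

A scheme like this would also explain the reported workarounds: screenshotting an image or re-exporting it from a blank project produces a new file with none of the original provenance metadata.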
Meta's mislabeling of original content as AI-generated may have repercussions. If the "Made with AI" badge is over-applied, users may start ignoring it even when it is accurate. Users have also pointed out that the badge cannot be unchecked or removed. Consequently, many people are finding ways to sidestep Meta's AI labeling initiative, such as pasting the image into a blank Photoshop project or sharing a screenshot of the image rather than the original file.