Meta is instituting a new policy to promote transparency around AI-altered and digitally manipulated content in political and social issue advertising.
Starting in the new year, the company will require advertisers to disclose any digital creation or alteration of images, video, or audio in such ads on Facebook and Instagram.
This move aims to combat the spread of misinformation by ensuring users can identify content that has been manipulated to depict events or statements that did not occur.
The new regulation mandates that advertisements containing photorealistic representations of nonexistent people or events, or altered footage of real events, carry a disclosure.
However, disclosures are not required for minor edits like color correction or image resizing, provided they do not materially affect the ad’s message.
When content is identified as digitally created or altered, Meta will add a notice both on the ad itself and in its Ad Library.
Advertisers who fail to comply will face ad rejection and potential penalties.
The disclosure requirement sits alongside Meta's existing rules: content that violates its standards remains prohibited, and independent fact-checkers continue to vet viral misinformation.
The implementation of this policy reflects a growing concern over the integrity of information disseminated through online platforms.