"This policy does not extend to content that is parody or satire, or video that has been edited solely to omit or change the order of words"
Facebook Inc (FB.O) said it will remove deepfakes and other misleadingly manipulated videos from its platform, though not content that is parody or satire, in a move aimed at curbing misinformation ahead of the US presidential election.
It said it would also remove misleading media produced by technologies like AI that "merges, replaces or superimposes content onto a video, making it appear to be authentic", the California-based company said in a blog post.
"This policy does not extend to content that is parody or satire, or video that has been edited solely to omit or change the order of words," Facebook said.
The social media giant told Reuters that under its new policy it will not remove a heavily edited video that attempted to make US House Speaker Nancy Pelosi seem incoherent by slurring her speech and making it appear as if she was repeatedly stumbling over her words.
"The doctored video of Speaker Pelosi does not meet the standards of this policy and would not be removed. Only videos generated by artificial intelligence to depict people saying fictional things will be taken down," Facebook said in a statement.
"Once the video of Speaker Pelosi was rated by a third-party fact-checker we reduced its distribution, and critically, people who saw it, tried to share it or already had received warnings that it was false".
Facebook has been criticized over its content policies by politicians from across the spectrum. Democrats have blasted the company for refusing to fact-check political advertisements, while Republicans have accused it of discriminating against conservative views, a charge it has denied.
In the run-up to the US presidential election in November 2020, social platforms have been under pressure to tackle the threat of deepfakes, which use artificial intelligence to create hyper-realistic videos in which a person appears to say or do something they did not.