YouTube: No ‘deepfakes’ or ‘birther’ videos in 2020 election
YouTube is making clear there will be no “birtherism” on its platform during this year's U.S. presidential election — a belated response to a type of conspiracy theory more prevalent in the 2012 race.
The Google-owned video service is also reiterating that it won't allow election-related “deepfake” videos and anything that aims to mislead viewers about voting procedures and how to participate in the 2020 census.
YouTube clarified its rules ahead of the Iowa caucuses Monday. The company is mostly reiterating content guidelines that it has been putting in place since the last presidential election in 2016.
Its ban on technically manipulated videos of political figures was evident last year when YouTube became the first major platform to remove a doctored video of House Speaker Nancy Pelosi. But the announcement Monday further clarifies that it will take down any election-related videos that are technically altered to mislead people in a way that goes beyond simply taking clips of speech out of context. The company also said it would remove doctored videos that could cause a “serious risk of egregious harm,” such as making it appear that a government official is dead.
Facebook, which last year resisted early calls to yank the Pelosi video, said in January that it was banning “deepfake” videos, the false but realistic clips created with artificial intelligence and sophisticated tools. Such videos are still fairly rare compared with simpler “cheap fake” manipulations, like the one that altered Pelosi's speech to make it seem she was slurring her words.
YouTube also said Monday that it will remove any videos that advance false claims about whether political candidates and elected officials are eligible to serve in office. That had been policy before but hadn't been made explicit.
The company's announcement comes about nine years after celebrity businessman Donald Trump began drawing attention for claiming that Barack Obama, the nation's first African American president, was not born in the United States.
Trump repeatedly voiced doubts about Obama's citizenship even after Obama produced his long-form birth certificate, and he fully backed off the claim only in the final stages of his 2016 presidential campaign.
YouTube said it will also crack down on attempts to artificially inflate the number of views, likes and comments on videos. Last year, the company changed its recommendation systems in a push to curb harmful misinformation. Twitter and Pinterest last week also outlined their efforts to reduce election misinformation on their platforms.