
When it comes to social media and President Trump, one company’s actions have stood out: YouTube.

On Wednesday, Jan. 6, President Trump gave a speech that some followers took as a call to violent action, sparking an insurrection at the U.S. Capitol. The next day, Facebook announced it would take the unprecedented step of blocking Trump from posting at least through the end of his term on Jan. 20, and perhaps for longer.

Snapchat followed shortly after with a temporary suspension, which it later made permanent. On Friday, Twitter took more dramatic action, banning Trump’s account permanently.

Not until the following Tuesday did Google-owned YouTube say it would temporarily suspend Trump for a week -- and not because of a new rule, but because he had violated its policy against inciting violence, earning the first strike under the company’s three-strike rule.

Trump’s account remains online, but he cannot post new content at least until Tuesday, Jan. 19 -- one day before Joe Biden’s inauguration as president.

Trump’s YouTube home page, meanwhile, still automatically plays a 46-minute video rife with false allegations of voter fraud.

It has been up for a month and had drawn nearly 6 million views as of Friday. (YouTube said it has left the video up because it was uploaded before the safe harbour deadline, and that it is displayed alongside an election results information panel.)

“YouTube is kind of an outlier because right now they’re standing out beyond the rest of the social networks making aggressive calls,” said Irina Raicu, internet ethics program director at Santa Clara University. 

Not a new approach

YouTube’s measured approach is not new. Numerous reports show how sluggish YouTube has been in controlling misinformation in the aftermath of the 2020 election.

In October, Facebook banned all accounts related to QAnon, the false conspiracy theory whose adherents had spread voter misinformation and discussed plans for Wednesday’s events months beforehand.

In response, YouTube issued a carefully worded policy that effectively banned some QAnon content but stopped short of banning it outright, citing grey areas it categorizes as “borderline content.”

Some videos that spread misinformation and called for violence after Election Day continued to display ads, meaning their creators were earning money through the site, sometimes until a reporter notified the company.

A month after the election, YouTube said it would start removing content that falsely alleged widespread fraud in the 2020 presidential election, citing the passing of the election’s safe harbour deadline and the fact that several states had already certified their results.

It’s not clear why YouTube moves in a slower and more measured way than its competitors when it comes to violations.

One possibility may be that it’s simply harder for YouTube and outsiders -- like researchers and journalists -- to search through video content to find violations. In addition, while most social media networks are primarily accountable to advertisers, YouTube also has a strong partnership with creators: the company says the number of creators earning more than $100,000 a year has grown 40% in the last year, and that it has paid out more than $2 billion to owners of copyrighted content over the last five years. Being too quick to take down material might alienate these creators and create different kinds of publicity headaches.

Is consistency the right move?

Alphabet CEO Sundar Pichai defended the company’s actions on Thursday when Reuters editor-in-chief Stephen J. Adler asked whether its moves to restrict Trump’s account were too little, too late.

“When we find content violative, there’s a warning and there’s a three-strike process and it depends on the timing in which it applies,” Pichai responded. “We make these decisions to be consistent and clear and transparent about how we do it.”

Some experts praised the company’s ability to stick to its policies, while others said they saw a need for more aggressive actions.

“It is interesting to hear them talk about strikes and regular rules when the other companies acknowledged these are unprecedented times and they need to do something more aggressive given the violence unraveling,” Raicu said. “I think YouTube would argue they would be more fair but fairness also requires treating people who are similarly situated and we are not in that situation,” Raicu added.

Joan Donovan, research director at Harvard Kennedy School’s Shorenstein Center on Media, Politics and Public Policy, said on Twitter that YouTube’s action was an example of “half measures.”

John Redgrave, CEO of the abuse detection software company Sentropy, said he viewed YouTube’s actions as a way to avoid allegations of bias. “I think with more aggressive remediation action comes a lot of people questioning ‘if this is your response, why not take down others doing this?’”

But he still thinks YouTube’s approach is too lax, citing a responsibility to user safety. “You need something in proportion to the results -- and triage things when a person has a million more followers. Three strikes until a ban is too many for something like this.”

Harvard law lecturer Evelyn Douek, who’s been a vocal critic of YouTube, took a contrary point of view, saying the company’s adherence to its policy should count for something, as outright bans may lead to their own problems.

“Hold on to your hats, but I think YouTube has — so far, at least — handled the Great Deplatforming well,” Douek tweeted earlier this week. “It removed a video that violated a clearly (if belatedly) stated rule against allegations of voter fraud and hasn’t removed the entire channel just coz everyone else is doing it.”

The announcement underlines “how this decision isn’t at all about how it’s perceived and just a normal application of the rules,” Douek added.

YouTube defended its policies by noting that it enforces them consistently and does not make exceptions for world leaders or anyone else. 
