Travel News

YouTube suspends Trump’s channel for at least seven days

OAKLAND, California – YouTube said on Tuesday that it had suspended President Trump’s channel over the “potential for continued violence,” the latest move by a big tech company to limit the president’s presence online.

In a tweet from the official YouTube account, the Google-owned video site said it had suspended Mr. Trump’s account after one of his recent videos violated its policy banning content that spreads misinformation alleging widespread voter fraud. YouTube said Mr. Trump would not be able to upload new content to his channel, which had about 2.8 million subscribers, for at least seven days. YouTube also said it was turning off comments on his videos indefinitely.

It was not immediately clear which video had triggered the suspension of his account.

Many tech companies have moved to restrict Mr. Trump online since a violent mob of his supporters, urged on by the president, stormed the Capitol last week. Facebook suspended the president from its main social network, as well as from Instagram, at least until the end of his term. Twitter followed by permanently blocking Mr. Trump’s account on its service, depriving him of his favorite social media platform, where he had 88 million followers. Other sites, such as Snapchat, Reddit, and Twitch, have also restricted Mr. Trump.

The measures won praise from liberals and others, who said the actions were long overdue because Mr. Trump had used social media to spread lies and incite violence. But critics said the broad crackdown raised questions about the power tech companies hold over online discourse.

Big tech companies have also moved against other sites that have hosted right-wing content. Parler, a social networking site that had become popular with Trump supporters for its hands-off approach to free speech, went dark on Monday after Amazon pulled its web-hosting services. Apple and Google had previously removed Parler from their app stores. Parler said it was looking for a way to get back online.

This is a developing story and will be updated.


What to expect from Facebook, Twitter and YouTube on election day

The Facebook app will also look different on Tuesday. To prevent candidates from prematurely and inaccurately declaring victory, the company plans to add a notification at the top of news feeds letting people know that no winner has been declared until election results are verified by news outlets like Reuters and The Associated Press.


Facebook is also planning to deploy special tools it has used in “at-risk countries” like Myanmar, where election-related violence was possible. The tools, which Facebook has not publicly described, are designed to slow the spread of inflammatory messages.

After the polls are closed, Facebook plans to suspend the dissemination of all political ads on the social network and its photo-sharing site, Instagram, in order to reduce misinformation about the election result. Facebook has told advertisers they can expect the ban to last for a week, although the timeline is not set and the company has not publicly made a long-term commitment.

“We have spent years working to make elections safer and more secure on our platform,” said Kevin McAlister, a Facebook spokesperson. “We applied lessons from previous elections, built new teams with experience in different areas, and created new products and policies to prepare for various scenarios before, during and after Election Day.”

Twitter has also been fighting disinformation since 2016, sometimes going much further than Facebook. Last year, for example, it banned political advertising altogether, arguing that the reach of political messages “should be earned, not bought.”

At the same time, Twitter started tagging politicians’ tweets if they spread inaccurate information or glorified violence. In May, it added several fact-checking tags to President Trump’s tweets about the Black Lives Matter protests and mail-in voting, and limited the ability of people to share those messages.

In October, Twitter began experimenting with additional techniques to slow the spread of disinformation. The company added context to trending topics and limited the ability of users to quickly retweet content. The changes are temporary, although Twitter has not said when they will end.