YouTube follows Facebook in banning QAnon, but with caveats


By: Bloomberg | Updated: October 15, 2020 10:06:33 pm

An attendee holds a QAnon flag before a campaign rally for U.S. President Donald Trump in Winston-Salem, North Carolina, U.S., on Tuesday, Sept. 8, 2020.

YouTube will ban videos that promote QAnon and other conspiracy theories, but only if they target specific people or groups, seeking to crack down on potentially dangerous misinformation after criticism that the service helped these fringe movements expand.

The decision comes a week after Facebook Inc. said it would remove accounts associated with QAnon, a far-right movement that the FBI has reportedly labeled a domestic terrorism threat.

YouTube’s ban is an attempt to stamp out the conspiracy without hindering the massive volume of news and political commentary on its service. Rather than a blanket prohibition of QAnon videos or accounts, YouTube is expanding its hate and harassment policies to include conspiracies that “justify real-world violence,” the company said on Thursday.

“Context matters, so news coverage on these issues or content discussing them without targeting individuals or protected groups, may stay up,” YouTube, a unit of Alphabet Inc.’s Google, wrote in a blog post.


Technology platforms have released a blitz of new rules to curb misinformation after mounting momentum for movements such as QAnon. Twitter Inc. recently said it would make it harder for people to find tweets supporting QAnon, while Etsy Inc. removed QAnon-related merchandise from its online marketplace.

Pressure for these companies to act has been building for months. YouTube already instituted a policy similar to Twitter’s, although it did not publicize it. Starting last year, the service began to treat QAnon videos as “borderline content,” meaning the clips are recommended and shown in search results less often. Views from recommendations on “prominent” QAnon videos have dropped 80% since then, the company said.

YouTube was a key driver of QAnon’s early popularity, according to Angelo Carusone, president and chief executive officer of Media Matters for America, a non-profit group that analyzes conservative misinformation.

A QAnon evangelist called PrayingMedic has almost 400,000 subscribers on his YouTube channel, for instance. And even after YouTube’s borderline-content move last year, QAnon videos spread from the Google service to other sites. YouTube broadcasts about the conspiracy theory featured regularly in Facebook groups and pages until Facebook’s recent ban, and YouTube QAnon clips continued to be shared on niche services such as Parler.

Still, Carusone said YouTube’s efforts to slow the spread of the conspiracy theory have been relatively effective in recent months.

The tech platforms and QAnon supporters will now likely enter into a game of cat and mouse, where users come up with new hashtags and different claims to evade automated filters. QAnon followers have proven particularly adept at this, according to Carusone.

“There has never been a community where their participants are as adaptable,” he said.

A significant unanswered question is how well YouTube can identify videos designed to be less obvious upon initial inspection, Carusone added.

“It is very easy for them to identify explicitly identified QAnon content and accounts,” he said. “What they have not articulated is how well that can be applied to less-explicit accounts.”
