
YouTube Removes More Than 8 Million Videos For Content Policy Violations

YouTube said it still needed an in-house team of humans to verify automated findings on an additional 1.6 million videos that were removed only after some users watched the clips.


Updated: April 25, 2018, 5:41 PM IST

YouTube, owned by Alphabet's Google, deleted about 8 million videos from its platform for content policy violations in last year's fourth quarter before any viewers saw them, it said in a new report that highlighted its response to pressure to better police its online community.

YouTube has been criticized by governments that say it does not do enough to remove extremist content, and by advertisers, such as Procter & Gamble Co and Under Armour, that briefly boycotted the service when they unwittingly ran ads alongside videos the companies deemed inappropriate.

YouTube said in the report on Monday that automating enforcement through software "is paying off" in quicker removals. The company said it did not have comparable data from prior quarters.


YouTube said it still needed an in-house team of humans to verify automated findings on an additional 1.6 million videos that were removed only after some users watched the clips. The automated system did not identify another 1.6 million videos that YouTube took down once they were reported to it by users, activist organizations and governments.

"They still have lots of work to do, but they should be praised in the interim," said Paul Barrett, who has followed YouTube as deputy director of the New York University Stern Center for Business and Human Rights.

Facebook also said on Monday it had removed or put a warning label on 1.9 million pieces of extremist content related to ISIS or al-Qaeda in the first three months of the year, about double the amount from the previous quarter. Corralling problematic videos, whether through humans or machines, could help YouTube, a major driver of Google's revenue, stave off regulation and a sales hit. For now, analysts say demand for YouTube ads remains robust.

The following are steps that YouTube has taken:


YouTube officials say the company removes videos that contain hate speech or incite violence. It issues "a strike" to the uploader in each instance and bans uploaders with three strikes in a three-month period. Also banned are government-identified "terrorist organizations" and materials such groups would upload if they could. YouTube shares the digital fingerprints of removed videos with a consortium of tech companies.

Borderline videos get stamped "graphic" and stripped of features that would give them prominence. YouTube added options for advertisers to avoid sponsoring these videos last year. YouTube's automated scans have sped up takedowns of videos tied to ISIS or al-Qaeda. But it has struggled to draw a line on views espoused by white right-wing extremists, who tend to know the rules well and stop short of overt hate speech.


YouTube said it would be difficult to enforce a "truth" policy, leaving the company to look for other policy violations to remove videos with misleading information. For instance, YouTube could delete a fabricated news report by finding it harasses its subject. Since autumn, it has promoted "authoritative sources" such as CNN and NBC News in search results to push down problematic material. YouTube also plans to display Wikipedia descriptions alongside videos to counter hoaxes.

But YouTube still is cited as slow to identify misinformation amid major global breaking news events when video bloggers quickly upload commentary. The company preserves other challenged clips that have public interest value or come from politicians.



YouTube last year began removing videos and issuing strikes when the filming may have put a child in danger or when a cartoon character is used inappropriately. YouTube does not alert law enforcement or intellectual property owners about these videos because it says it cannot easily identify uploaders and rightsholders. Copyright owners who believe a video violates guidelines or infringes their copyright or trademark can report it to YouTube.

The company last year began stepping up moderation of comments that inappropriately reference children.



