YouTube deletes 5m videos for content violations
April 25, 2018

SAN FRANCISCO: YouTube, owned by Alphabet Inc’s Google, deleted about 5 million videos from its platform for content policy violations in last year’s fourth quarter before any viewers saw them, it said in a new report that highlighted its response to pressure to better police its online community.

YouTube has been criticised by governments that say it does not do enough to remove extremist content, and by advertisers, such as Procter & Gamble Co and Under Armour Inc, that briefly boycotted the service after their ads unwittingly ran alongside videos the companies deemed inappropriate. YouTube said in the report on Monday that automating enforcement through software “is paying off” in quicker removals. The company said it did not have comparable data from prior quarters.

YouTube said it still needed an in-house team of human reviewers to verify the automated system’s findings; an additional 1.6 million videos were removed only after some users had watched them.

The automated system did not identify another 1.6 million videos that YouTube took down once they were reported to it by users, activist organisations and governments.

“They still have lots of work to do but they should be praised in the interim,” Paul Barrett, who has followed YouTube as deputy director at the New York University Stern Center for Business and Human Rights, said.

Facebook Inc also said on Monday it had removed or put a warning label on 1.9 million pieces of extremist content related to Daesh or Al Qaeda in the first three months of the year, or about double the amount from the previous quarter.

Corralling problematic videos, whether through humans or machines, could help YouTube, a major driver of Google’s revenue, stave off regulation and a sales hit. For now, analysts say demand for YouTube ads remains robust.

YouTube has taken the following steps.

YouTube officials say the company removes videos that contain hate speech or incite violence.

It issues “a strike” to the uploader in each instance and bans uploaders with three strikes in a three-month period.

