8.28 million videos removed by YouTube in final quarter of 2017
YouTube has been heavily criticized in the past for issues with video and comment moderation. Such problems have cost the company dearly in advertising losses, so it seems the Google-owned company now wants to prove to the world (and its advertisers) that it is taking steps to improve the site's content moderation. In the final quarter of last year, 8,284,039 videos were removed from the site for breaking community guidelines. YouTube's first quarterly Community Guidelines enforcement report offers up some interesting statistics:
- India is the top flagging nation, followed by the United States and Brazil.
- Human flaggers cited sexual content as the number one reason for flagging, accounting for 30.1% of flags.
- One for the conspiracy theorists: 73 of the 8.28 million videos flagged were reported by government agencies.
- 75.9% of videos subject to automated flagging were removed before receiving any views.
The final figure suggests that YouTube's machine-learning flagging system, introduced in June 2017, has been reasonably effective at catching videos that break the site's community guidelines, but there is still room for improvement. Thousands of trusted flaggers and regular users assist with the platform's moderation, and with over 400 hours of video uploaded every minute, enforcing strict guidelines remains an uphill task for YouTube's staff and AI systems.