Meet the teams keeping our corner of the internet safer

February 5, 2019

I joined Google a year ago to lead its Trust and Safety organisation and to work with the thousands of people dedicated to protecting our users and making our products, from Gmail to Maps, safer.

Deciding what content is allowed on our platforms, while preserving people’s right to express themselves freely at the colossal scale we operate at, is a big responsibility. It means developing rules that we can enforce consistently, even where the lines are much debated. It means balancing respect for diverse viewpoints and a platform for marginalised voices against thoughtful policies that tackle egregious content. These values can often be in tension, and the calls we make can be controversial. We feel the weight of our responsibility here, and the impact of our decisions, keenly.

Our teams tackle a huge spectrum of online abuse, from petty scams, like the email from a “relative” stranded abroad needing a bank transfer to get home safely, to the utterly abhorrent, including child sexual abuse material (CSAM) online. We work across products like Search, which connects people to information hosted on the web, as well as across products we host, like Photos. Understanding the different parameters of the products we serve is vital to our work and policy development. Given that breadth, our team is diverse, comprising product specialists, engineers, lawyers, data scientists, ex-law enforcement officials and others. They work hand-in-hand around the world and with a global network of safety and subject matter experts.

Our goal in the Trust and Safety team is to achieve both accuracy and scale in our work. That’s why we have people and technology working together, and we invest heavily in both. More and more, we use smart technology to detect problematic content hosted on our platforms, which is driving progress. Take violent extremism online. Where once we relied heavily on users to flag this content to us, today the majority of terrorist content we remove from Google products is first identified by our machines. We can then send this content to our language and subject matter experts, who review it swiftly and accurately and remove what violates our policies. We’ve also built systems that allow us to work in partnership with NGOs, other tech companies, and government Internet Referral Units, like Europol’s, to alert us to potentially problematic content.

Other issues, like combating hate speech, require a different approach. I’m proud of the strong progress we’re making to tackle online hate, including through the European Commission’s Code of Conduct on hate speech. We’ve improved the speed and accuracy of our reviews by creating a dedicated team of language specialists in the EU. But there are many distinct challenges here. Standards for what constitutes hate speech vary between countries, as do the language and slang used. Making meaningful progress through automatic detection will take time, but we’re putting our best technology and people to the task.

To give a sense of the scale of our efforts, in 2017, our team pulled down 3.2 billion ads that broke our policies; they also blocked 79 million ads designed to trick you into clicking on malware-laden sites. Between July and September 2018, YouTube removed over 7 million videos that broke its rules and blocked 224 million comments. Across other products like Drive, Photos and Blogger, in the past year, we took down over 38,000 pieces of hate speech and 160,000 pieces of violent extremism content. We also support tools like SafeSearch, which help you avoid explicit Search results.

None of this work can be done in isolation, and our partnerships are essential. Nor can our policies be static; we must be responsive to the world around us and take the guidance of experts. That’s why I’m in Brussels this week to share insights from our work in content moderation and to listen and learn from others. The message I’ll bring from Google is that we will be more transparent, accountable and frank about where we can improve. I have no doubt that 2019 will bring more challenges, but rest assured that we will dedicate all the resources necessary to do our part. We’ll do all that we can, through technology and people, to meet and overcome the many challenges we face online, and to think beyond our corner of the internet.

Editor’s note: Kristie Canegallo is speaking at CoMo Brussels, a conference about content moderation held at the European Parliament. Kristie’s background is in government, where she worked under Presidents Bush and Obama in a range of national security and domestic policy roles, including as President Obama’s Deputy Chief of Staff.