As our lives become more digital, they’ve also become more searchable. Today, lots of people can find out personal information about you—phone number, physical address—with a simple Google search. This is not always ideal, or, frankly, comfortable. Google knows this, and it’s giving ordinary people the chance to scrub this information from its all-knowing search engine.
On Wednesday, the search giant launched “Results About You,” a new tool that allows users to request the removal of their physical address, phone number, and email address with just a few clicks. In addition, beginning next year, users will be able to set alerts on their personal information in Results About You, which will enable them to ask Google to remove it faster. The feature is available in the Google app and also in your browser via individual search results.
Danny Sullivan, Google’s public liaison for Search, told Gizmodo in a video chat interview this week that, over time, people have become more sensitive to their personal information appearing in search results. They’re less comfortable with it now than they used to be, and many would prefer their information not be so readily accessible, he explained.
“Our response is just to step up to meeting those desires and [asking] how can we do this in a balanced and careful way so that we’re not just removing information that might have a public [interest],” Sullivan said.
If you’ve ever tried to get personal information off of Google, you’ll know that it’s not exactly easy. The search giant has half a dozen removal policies related to different content, from involuntary fake porn and sites with exploitative removal practices to images of minors and doxxing content. It’s enough to make your head spin.
The company promises that Results About You will simplify that process, at least when it comes to removing certain types of information. The feature will also serve as a hub for Google’s other content removal policies. Although it will only provide a quick and easy removal process for physical addresses, phone numbers, and email addresses, it will inform folks about content that falls under Google’s other policies and direct them to the process to petition its takedown. It’s kind of like a compass when you’re lost in the mountains. In this case, the mountains are Google’s mammoth-sized website and sea of policies.
Gizmodo spoke to Sullivan this week about Results About You and other aspects of Google Search, such as whether the new alerts feature is basically a repackaged version of Google Alerts, what person or algorithm reviews content removal requests, and what Google’s doing to ensure the public understands its miles-long content removal policies. Oh, and we also asked him what a public liaison for Search actually is.
You can check out our Q&A below. It has been edited for length and clarity.
Gizmodo: Before we start, Danny, can you tell us a bit more about what a public liaison for Search does? If possible, can you describe it in one sentence?
DS: It’s to better offer two-way communication between the search team and people outside of Google. So, in particular, the role [focuses on the fact] that people have questions about how search works. It’s really important in everybody’s lives and sometimes they don’t always understand why a search result may appear the way it does or what’s going on, and they may raise concerns. It’s usually questions. My role is to go out there and try to explain and share more insight about how search operates.
Gizmodo: How would you describe the problem of personal identifiable information on Google Search and the steps Google has been taking to address this problem?
DS: I think people are more sensitive to it over time because there’s been more information that’s out there and more people continue to search. The problem or the concern is just that they’re not comfortable with it and they’d really prefer not to have it so readily accessible.
Our response is just to step up to meeting those desires and [asking] how can we do this in a balanced and careful way so that we’re not just removing information that might have a public [interest]. There are a few cases where there might be a public aspect, but we can also address some of the concerns people have, especially I think because they’ve seen some of this content go into third party websites and you’re like, “I don’t know why it’s there.”
Gizmodo: Before Results About You, was there a process or way for folks to get personally identifiable info removed?
DS: Typically, there wasn’t. In the EU, there are the Right to Be Forgotten mechanisms that you can use in some cases. But I think what we’re doing might be a bit broader in some other cases as well. Beyond that, there really wasn’t. It wasn’t something that was within our policies. The way we work when it comes to web searches, we have very limited policies for removal.
Basically, if it’s not spam and it’s not something illegal—like we have laws we have to react to, like with child sexual abuse content—we tend to leave it there and trust in our ranking system to try to show the most helpful stuff that we can, so that we’re not stepping in and then somehow taking information out that other people might find useful.
There might be researchers that are trying to discover information for very good reasons. But this was a case where we said there’s enough interest in having this type of thing and we think that those concerns can be met without impacting the search results in a way that’s not making them less helpful to people.
Gizmodo: In recent months and years, there have been very high-profile investigations by The New York Times and other media outlets about people whose personal identifiable information has been on Google Search and has falsely identified them as pedophiles, for instance. Did instances such as those inspire you all in any way to roll out this feature?
DS: So, I think those cases are actually different policies. This policy is just more [aimed at the] ordinary person [who] just really would prefer not to have [their] address and phone number be available. That really hasn’t been the focus of those kinds of things, not to take away from those concerns, but those concerns, when they have come up, have already been covered by other kinds of policies we have, especially the doxxing policy.
Prior to when we made the change, if you wanted personal information removed, you had to show doxxing. So that was dealing with that kind of thing that you would talk about [from] a newspaper report. But the ordinary person’s like, ‘Well, I wasn’t doxxed. Do I have to be doxxed to just not have the stuff that I prefer not to be there?’ And our solution is, ‘No, you don’t.’ Everybody can just have it removed. It really is broadening it toward people who don’t necessarily have some concern of harm that they’re actually having to show, it’s just that I would want to be more comfortable, right. I just would feel more comfortable if it’s just not there for whatever reason.
Similarly, I think another thing you said is that sometimes there’s harassing content. There’s other things that we do. We have a different policy about that, which doesn’t necessarily tie in to showing a physical thing, right. It can just tie into whether or not information is being posted. And usually sometimes someone says, ‘Well, I’ll remove it but you have to pay a fee.’ And we’re like, [no]. We’ve already had that policy for like two, three years already where it’s like, ‘Yeah, you don’t have to pay a fee. You can have that removed. Just fill out our form.’ It’s the kind of thing that we do.
[There’s] also another thing too, and this is a bit further, but you know, we recently made it so that if you’re under 18, you can remove images for any reason. And that wasn’t because there was a big investigation. It was because we understand that, especially if you’re a minor, you might post images, or friends might do it, [and realize] ‘Oh, I really don’t want those showing up in search, even though they’re out on the open web.’ So, it’s just designed to make it easier for people, just ordinary people.
Gizmodo: You all have so many policies around this, and I understand why because each is specific to a certain situation. But I think that may be a bit confusing for the average person, who might be asking, ‘Was there something that spurred Google to offer this tool at this time?’
DS: I totally agree we have lots of different policies, and we have a whole page if you [type] ‘remove information from Google.’ It will list things like, ‘Is it doxxing?’ or ‘Is it nonconsensual imagery?’ etc. And so the idea here is, and I’m sure we’ll continue to build on this, you don’t have to necessarily find the right help page and then try to read it all. You’re in the moment looking at the search results saying, ‘Oh, I’m concerned about this,’ and you click on it and you can see [whether the information you’re concerned about] matches to [a Google policy for removal], then put in a request and be guided through the process without having to necessarily read all the details. It doesn’t mean you [don’t] have to follow it, you still do need to, but it’s just to make it easier for people.
Gizmodo: What was Google’s thinking behind the new alert feature? How is it different from Google Alerts for specific topics or keywords? For instance, I have a Google Alert for my name for this very reason, because I don’t want my address or phone number out there.
DS: It’s still being developed, so I don’t have a whole lot of things to say, but I think you can envision that it would be kind of like an alert where you have something watching for you, which is what Google Alerts does. I doubt it’s going to be an every-day-you-get-a-report [type of thing] because you probably don’t have stuff every day, but it’s probably going to be linked into the kind of information you find sensitive [and] want to know if it’s [out] there. If you’ve said, ‘I don’t want my address to be on pages,’ then it will probably be tied to finding things with your name and your address out there. You could kind of do that with Google Alerts now, but it would be more integrated into this kind of system so that you would know it and could act upon it when it came out. All the details are still being worked out, but hopefully that gives you some sense of it now.
Gizmodo: As far as reviewing the removal requests that go into Results About You, are there human reviewers who look at and weigh these requests? If so, can you tell me a bit more about how they’re trained and how big the team is? And if they’re not human reviewers, is it automated?
DS: We definitely have human reviewers that are involved. We use a variety of human and automated systems. I don’t have the numbers of the teams. I do know that it’s all working well enough that we feel comfortable saying that we process these things within a few days. So, you know, that’s pretty good when you figure the scale of things that we deal with. But it’s definitely a combination of things, both to make it quick but also to make it assured and reviewed. There are definitely review mechanisms built into all this.
Gizmodo: So, what is the average wait time to get an answer on whether a request was approved? I know you mentioned earlier that it’s a few days, but is there an average of how long it takes to review each request, or does it depend?
DS: I think it just depends, and it can also vary. Like initially when we launched, we had a big influx of requests, so that could have slowed things down as we got through a bunch of them. And then maybe you have a day where there’s just not a lot of requests, so you’re getting through more. It can just vary. I don’t think we have an average time like that.
Gizmodo: And if the request is denied, is there some sort of appeals process people can go through to have you all reconsider a request?
DS: Yes, if I recall, the tool will take you through it and we even go through and provide more information that might be helpful. Sometimes people just need to share a bit more information and we understand the context a bit more. That can help with that from there.
Gizmodo: I was going through Google’s help pages and I found quite a few policies on content removal. I counted six: nonconsensual images, involuntary fake porn, content on sites with exploitative practices, PII or doxxing content, images of minors, and irrelevant pornography under my name.
What are you all doing to make sure that the public understands these policies and all of the recourses available to them? That’s quite a lot of policies and you have to click on each of them to understand them.
DS: Well, I think for a lot of people, most of those policies aren’t an issue for them. I think most people are probably not thinking, ‘Gosh, I had irrelevant pornography associated with my name.’ That’s like a really weird situation where someone has created a porn site and then they scrape a bunch of names that have nothing to do with it and they just generate this stuff. And typically our systems aren’t going to show that stuff. Like, if you do a search, you wouldn’t even see it to begin with. But [let’s say] you have an unusual name and there’s not a lot of information about you. And so, then suddenly there’s not a lot of pages and maybe that makes it to [search results]. That doesn’t impact that many people, so it’s probably not to the degree that we need to build it into the tool just yet.
In contrast, the things that we’re [addressing] with this kind of tool really are the things that impact people a lot and things that we think will be helpful to a lot of people. And then I think what we can do more is we’ll probably continue to develop the tool to have more of that stuff that’s out there so you can understand it. So like, if there is an image, then maybe this gets integrated [into the tool] down the line where you can remove this if you’re under 18 and [the tool can] kind of guide you through that sort of process.
Totally understand there’s a lot of things that are out there. I think part of the challenge is, first of all, you build up the policies, because that’s already a big [part] of how do we deal with some of these issues and decide if we have removals and what mechanisms and criteria. And then, it gets to the point of, we have the policies now to the degree that we can say, ‘Well, how do we make it even easier for people to act on these policies and become aware of them and [build] it into the app and [build] it into the system that’s right there?’ Because I think also when people really think about ‘I want to deal with something,’ it’s when they’ve done a search and then they’re in the moment and they realize, ‘Oh, I don’t like this in relation to me, how do I deal with it?’ And now, for the first time in ages that I can think of, you can interact with it right from the search results and know how to go with it.
Gizmodo: I’ve seen that you’re on Twitter. I’m not sure if you’re familiar with it, but Twitter released a new reporting system that is similar to what you’re describing. It matches the content being reported to an individual Twitter policy on the matter. For instance, is the content you’re reporting abusive behavior? Does it target someone because of their religion? And then depending on people’s answers, the system matches the content with Twitter policies to help users make more effective reports.
Can you imagine something like that for Google in the future where there is like a hub that helps people match the content they want removed in search results with the Google policy on it?
DS: I think that’s what this will do. When you click on a thing next to a search result, it’s going to tell you things like, do you want to remove this? When you click on it, it’s going to say things like, ‘It shows my personal contact info,’ which ties to the policy. ‘It shows my contact info with an intention to harm me. It shows something else that’s personal that I don’t like.’ So that’s already going to lead you into the policies that we have. ‘It’s showing something illegal, like this is copyright infringement, or I think this is child abuse.’ So, it really is exactly as you’re talking about: how do we match the policies up in this sort of a tool and guide people better to those sorts of solutions.
Gizmodo: Got it. Are all the policies in the tool? Like, is there an option to report something for content removal because it’s “irrelevant pornography,” for example?
DS: With some of the [policies], you’ll be able to go through to do the removal process completely through the [tool], like this is my personal info and I want it removed. OK, [the tool will] take you through the process and go with it. And if it’s like “this is illegal info and it’s involving child abuse,” [the tool will tell you to] click on this page. Now you can report it through the form, because we haven’t hooked that [specific content removal process] into the tool system, but you’ll still be able to get to the right place and learn more about it and have a reporting mechanism. So that is the overall goal.
The main takeaway we’re saying from this story is it’s going to make it easier for you to report and process your personal removals, but it’s actually much more robust and getting to what you’re talking about, which is we do have these different policies that are out there and the tool itself is designed to better guide you to understand how to make any of these kinds of removals and reports. In some cases, you may have to go to a webpage, but in other cases, you can do it right within the tool itself.
Gizmodo: Switching gears. You mentioned the EU’s Right to Be Forgotten at the beginning and I wanted to ask you about that. How does Google’s approach to removing personal identifiable information in the U.S. differ from what it has in the EU, where you all do offer a specific Right to Be Forgotten process?
DS: Right to Be Forgotten is only for the EU; it only operates within the EU. These things that we’re talking about work globally. So, you could still make use of the Right to Be Forgotten for certain things if you want, and there are things [our content removal policies] won’t cover that the Right to Be Forgotten might cover. As I understand it, you might not like an article that was written about a crime that maybe you were convicted of, but now it’s old and you’re like, ‘I just don’t want that showing up anymore.’ We don’t have removal policies for that outside the EU. But in the EU you have a right to request a review of maybe having that removed. So you can do that sort of thing there, but you can’t do that through this process. On the flip side, when it comes to your personal info, there’s nothing [like that there]. It’s broader in some senses and different in others.
Gizmodo: Will Results About You be available in the EU as well?
DS: The tool right now is only US English, but the policies are global in nature so people can already use them around the world and then we’ll bring that out as well. It’s not like the thing that the tool does is only for people in the US searching in English, it’s just that we only have the tool process for it. We expect it’s going to come to other languages in other countries, but the underlying policies and the ability to do those kinds of removals, those are global in nature.
Gizmodo: What was the biggest challenge for the team when developing Results About You?
DS: So, I wasn’t involved in the design process of it... But I know one of the challenges was, ‘How do we communicate all these different policies quickly, concisely, [and] in a way that’s helpful to people as they go through this tool form?’ And from what I’ve seen, I look at it and I’m like, ‘Wow, this is actually good,’ [from] my perspective, where I’m always trying to think about how we can explain things as clearly as we can. So hopefully that will come across well for people, and if not, we’ll take the feedback and continue to refine it from there.
Gizmodo: Last question, what would you want the public to know about this tool?
DS: If there’s something you are uncomfortable with about yourself in search, you have a new way of reporting it and maybe getting it removed in the right circumstances.