Primary Topic
This episode discusses the ethical and privacy concerns associated with powerful facial recognition search engines that can identify individuals from a single photograph.
Episode Summary
Main Takeaways
- Facial recognition search engines can significantly compromise personal privacy.
- Clearview AI and similar technologies have set a precedent for what many see as an overreach in surveillance capabilities.
- There are real-world implications and ethical dilemmas associated with the deployment of such technology, especially concerning consent and privacy.
- Legal and regulatory frameworks are currently inadequate to manage the privacy issues these technologies present.
- The technology's potential for abuse is high, as demonstrated by its deployment among various stakeholders, including law enforcement and private enterprises.
Episode Chapters
1: Introduction to the Topic
PJ Vogt introduces the subject of facial recognition technology, describing a search engine that can find personal information through a simple photograph. "PJ Vogt: The line between general Internet curiosity and stalking felt too close."
2: Kashmir Hill's Investigation
Technology reporter Kashmir Hill shares her journey uncovering the operations of Clearview AI, a company that has influenced numerous other facial recognition technologies. "Kashmir Hill: Clearview AI has scraped billions of photos from the public web."
3: Ethical Dilemmas and Legal Concerns
Discussion on the legal ambiguities and ethical concerns surrounding facial recognition technologies, with insights into their potential for misuse. "Kashmir Hill: There's no federal law yet that gives us the right to opt out of these databases."
4: Real-world Implications
Real-life examples of how facial recognition technology is being used and abused in various sectors, from law enforcement to private disputes. "Kashmir Hill: It's hard to argue against technologies that help solve serious crimes, but the potential for misuse is vast."
5: Looking Forward
Speculations on the future of facial recognition technology, including potential regulatory responses and societal impacts. "Kashmir Hill: We might be moving towards a world where facial recognition technology is as regulated as wiretapping."
Actionable Advice
- Be cautious about where your images are posted online.
- Regularly check settings on social platforms to manage privacy.
- Advocate for clear laws on facial recognition technology.
- Stay informed about technological advances and their implications.
- Consider using more secure methods for identity verification.
About This Episode
After stumbling on a new kind of search engine for faces, we called privacy journalist Kashmir Hill. She’s been reporting on the very sudden and unregulated rise of these facial search engines. Here’s the story of the very first one, the mysterious person who made it, and the copycats it helped spawn.
People
Kashmir Hill, PJ Vogt
Companies
Clearview AI
Books
"Your Face Belongs to Us" by Kashmir Hill
Guest Name(s):
Kashmir Hill
Content Warnings:
None
Transcript
PJ Vogt
Search Engine is brought to you by Rosetta Stone. I only speak English, and honestly, not that well. I'm about to go to England to report a story where I'm at least going to get to pretend like I'm understanding people speaking a different language. But what I should do, what you should do, is try Rosetta Stone, the most trusted language learning program available on desktop or as an app, one that truly immerses you in the language you want to learn. Rosetta Stone has been a trusted expert for 30 years, with millions of users and 25 languages offered: Spanish, French, Italian, German, Korean, Chinese, Japanese, Dutch, Arabic, and Polish.
And it's an amazing value. Lifetime membership has all 25 languages for any and all trips and language needs in life. Don't put off learning that language. There's no better time than right now to get started. For a very limited time, Search Engine listeners can get Rosetta Stone's lifetime membership for 50% off.
Visit rosettastone.com/searchengine. That's 50% off unlimited access to 25 language courses for the rest of your life. Redeem your 50% off at rosettastone.com/searchengine today.
Not all search engines are good. Some are morally dubious. A couple months ago, I found myself using a morally dubious search engine. This search engine lives on a website. I'm not going to tell you the name of it for reasons that'll become clear, but I am going to describe to you how it works.
So you open the site, you upload a photo of a person's face. I can upload a photo of myself right now, wait about 30 seconds for the search to run, and then I get taken to this page, which includes all these different photos of my face from all these different places on the Internet. I can click on any of the photos and it'll take me to the site where the photo lives. This is Shazam for faces. If I put in a stranger's face, I'll almost always get their real name, because it'll take me to one of their social media profiles.
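As a technical aside: a "Shazam for faces" generally works by converting each photographed face into a numeric vector (an embedding) and returning the indexed photos whose vectors sit closest to the query's. The episode doesn't describe this site's internals, so what follows is only a minimal sketch of that lookup step, with toy three-number vectors and invented URLs standing in for real embeddings and real pages:

```python
import math

def cosine_similarity(a, b):
    # Cosine similarity between two equal-length vectors: 1.0 means
    # identical direction, values near 0 mean unrelated.
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

def search_faces(query_embedding, index, threshold=0.9):
    # Return (url, score) pairs for indexed faces similar enough to
    # the query, best match first. `index` maps a page URL to the
    # embedding of the face found on that page.
    scored = [
        (url, cosine_similarity(query_embedding, emb))
        for url, emb in index.items()
    ]
    hits = [(url, s) for url, s in scored if s >= threshold]
    return sorted(hits, key=lambda pair: pair[1], reverse=True)

# Toy 3-dimensional "embeddings"; a real model produces hundreds of
# dimensions, and the index holds billions of entries.
index = {
    "https://example.com/profile/alice": [0.9, 0.1, 0.2],
    "https://example.com/photo/someone-else": [0.1, 0.9, 0.3],
}
results = search_faces([0.88, 0.12, 0.21], index)
```

A production system swaps in a trained neural network to produce the embeddings and an approximate nearest-neighbor index to search them at scale, but the matching step is essentially this comparison.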
From there, I can typically use a different search engine to find a physical address, often a phone number on the site itself. I also usually see stills from any videos the person has appeared in. When I first learned about this site, I did what you do when you get Google for the first time. I looked myself up and then I started looking at my friends and it took about 30 seconds before I saw things that made me pretty uncomfortable. I was just seeing stuff I should not be seeing.
I don't know the most delicate way to say this except people I knew had compromising stuff on the Internet, stuff they had put there, but not under their real names. And I don't think they knew. I certainly hadn't known that the technology to search someone's face was right around the corner.
I decided to stop using the search engine. The line between general Internet curiosity and stalking: this felt like the wrong side of it. It felt seedy. But now, even just knowing this tool existed changed how I thought.
In the real world, I found myself trying to reach for it the way any digital tool that works begins to feel like another limb. I found a driver's license on the floor of a dance club. The person had a name too common to Google, like Jane Smith. But I realized I could just find their face with the search engine. Another night, two people at a restaurant were talking.
One of them, the guy, was telling what sounded like a very personal story about the vice president of America. Who was this guy? I realized if I snapped a photo of him, I now had the ability to know. We take for granted the idea that we have a degree of privacy in public, that we are mostly anonymous to the strangers we pass. I realized this just wasn't true anymore.
Right now there are a lot of discussions about AI chat bots, about the ethics and problems of a very powerful new technology. I feel like we should also be talking about this technology, these search engines, because my feeling using one was, we are not at all ready for this, this thing that is already here. And I wanted to know, is it too late? Is there a way to stop these tools or limit them? And I especially wanted to know who unleashed this on us.
So I called the person you call when you have questions like this, can you introduce yourself? Sure. I'm Kashmir Hill. I am a technology reporter at the New York Times and I've been writing about privacy for more than a decade now. Kashmir is one of the best privacy and technology reporters in America.
She published a book a few months ago about these search engines and about the very strange story of how she discovered that they even existed. It's called, appropriately, Your Face Belongs to Us. Her reporting follows a company called Clearview AI, which is not the search engine I was referencing before. Clearview AI is actually much more powerful and not available to the public. But in many ways, Clearview created the blueprint for copycats like the one I'd found.
Kashmir told me the story of when she learned of Clearview AI's existence, back when the company was still in deep stealth mode. So I heard about Clearview AI. It was November 2019. I was in Switzerland doing this kind of fellowship there, and I got an email from a guy named Freddie Martinez, who worked for a nonprofit called Open the Government. And he does public records research, and he's obsessed with privacy and security, as I am.
Kashmir Hill
And I'd known him for years. And he sent me this email saying, I found out about this company that's crossed the Rubicon on facial recognition. That's how he put it. He said he'd gotten this public records response from the Atlanta Police Department describing this facial recognition technology they were using. And he said, it's not like anything I've seen before.
They're selling our Facebook photos to the cops. And he had attached the PDF he got from the Atlanta Police Department. It was 26 pages. And when I opened it up, the first page was labeled privileged and confidential. And it was this memo written by Paul Clement, whose name I recognize because he's kind of a big-deal lawyer: solicitor general under George W. Bush, now in private practice. And he was talking about the legal implications of Clearview AI.
And he's describing it as this company that has scraped billions of photos from the public web, including social media sites, in order to produce this facial recognition app, where you take a photo of somebody, and it returns all the other places on the Internet where their photo appears. And he said, we've used it. Our firm, it returns fast and reliable results. It works with something like 99% accuracy. There's hundreds of law enforcement agencies that are already using it.
And he had written this memo to reassure any police who wanted to use it that they wouldn't be breaking federal or state privacy laws by doing so. And then there was a brochure for Clearview that said, stop searching, start solving, and that it was a Google for faces. And as I'm reading it, I'm just like, wow, how have I never heard of this company before? Why is this company doing this and not Google or Facebook? And does this actually work?
Is this real? Because it is violating things that I have been hearing from the tech industry for years now about what should be done about facial recognition technology. I flashback to this workshop I'd gone to in DC, organized by the Federal Trade Commission, which is kind of our de facto privacy regulator in the United States. And they had a bunch of what we call stakeholders there. Google was there.
Facebook was there. Little startups, privacy advocates, civil society organizations, academics. Good morning, and I want to welcome all of you both here in Washington, DC, and those watching online, to today's workshop on facial recognition technology. This workshop that Kashmir remembers. It happened in 2011.
PJ Vogt
It was called Face Facts, or maybe face facts. The video of the workshop on the FTC's website shows a string of speakers presenting at a podium in front of a limp-looking American flag. We will focus on the commercial use, that is, on the possibilities that these technologies open up for consumers, as well as their potential threats to privacy. Most of you know this, but the mission of the FTC. They're talking about the nitty-gritty of facial recognition technology. What safeguards need to be put in place around this technology that's rapidly becoming more powerful?
Kashmir Hill
And everyone in the room had different ideas about what we should be doing. You know, Google and Facebook at that point were just tagging friends in photos. And there are some people there saying, we need to kind of ban this. But there was one thing that everybody in the room agreed on, and that was that nobody should build a facial recognition app that you could use on strangers and identify them. Since day one, we asked ourselves, how do we avoid the one use case that everybody fears, which is to de-anonymize people?
PJ Vogt
That's the CEO of a facial recognition company that would soon be acquired by Facebook. He was saying they had to prevent the use case no one wanted: Shazam for faces. So the input into our system is both the photos and the people that you want to have identified that will give you back the answer.
So, in fact, you can never identify people you do not know. That's our mantra, right? Is this one thing that we wanted to make sure that doesn't happen? And so now I'm looking at this memo that says that has happened. Right?
Kashmir Hill
And so, yeah, I was very shocked. And I told Freddie, I'm definitely going to look into this as soon as I fly back to the United States. And that's what I did.
PJ Vogt
So at this point in late 2019, here's what Kashmir knows about Clearview AI. It's supposedly a very powerful technology that has scraped billions of photos from the public web, and it's being used by the Atlanta Police Department. She doesn't know who's behind the company, but she has ideas about how to find them. She starts calling their clients.
Kashmir Hill
And so I reached out to the Atlanta Police Department. They never responded. Other FOIAs were starting to come in that showed other departments using Clearview. And I just did a kind of Google dorking thing where I searched for Clearview and then site:.gov to see if it showed up on budgets. Oh, that's really smart.
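That "Google dorking" trick is just a search query that pairs a keyword with operators like site:, which restricts results to a domain. Here's a small sketch of how such a query string is assembled; the exact phrasing and the filetype refinement are my illustration, not Kashmir's:

```python
def dork_query(term, site=".gov", filetype=None):
    # Exact-phrase match on the term, then operators narrowing the
    # search to one domain suffix and, optionally, one file type
    # (handy for finding line items in budget PDFs).
    parts = [f'"{term}"', f"site:{site}"]
    if filetype:
        parts.append(f"filetype:{filetype}")
    return " ".join(parts)

query = dork_query("Clearview AI", filetype="pdf")
# '"Clearview AI" site:.gov filetype:pdf'
```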
Yeah. And so I started seeing Clearview, and it was really tiny amounts, like $2,000, $6,000. But it was appearing on budgets around the country. And so I would reach out to those police departments and say, hey, I'm, like, looking at Clearview AI. I saw that you're paying for it.
Would you talk to me? And eventually, the first people to call me back were the Gainesville Police Department, a detective there named Nick Ferrara. He's a financial crimes detective, and he calls me up on my phone. He said, oh, hey, I heard that you're working on a story about Clearview AI. I'd be happy to talk to you about it.
It's a great tool. It's amazing. And he said he would happily be a spokesperson for the company. He told me they loved him there. He's just like, this is great.
He loved it. He said he had a stack of unsolved cases on his desk where he had a photo of the person he was looking for, like a fraudster. And he had run it through the state facial recognition system, not gotten anything. And he said he ran it through Clearview AI, and he got hit after hit. And he just said it was this really powerful tool.
It worked like no facial recognition he'd used before. The person could be wearing a hat, glasses, looking away from the camera, and he was still getting these results. And this is sort of the positive case for any of this, which is that if a dangerous person who, like, has committed violent crimes is out in the world and there's some photo of them where maybe they were, like, robbing a bank and their mouth was covered and there was a hat low over their head. And if a cop can take that surveillance still, plug it into a big machine and find this person's name, we live in a safer world. Right.
This is the ideal use case. Solving crimes, finding people who committed crimes, bringing them to justice. Yeah. So Nick Ferrara, this detective, said, yeah, it works incredibly well. And I said, well, I'd love to see what the results look like.
I've never kind of seen a search like this before. And he said, well, I can't send you something from one of my investigations, but why don't you send me your photo, and I'll run you through Clearview, and I'll send you the results. So I do that. I send some photos of myself. How do you pick the photos?
I tried to choose hard photos, so I had one where my eyes were closed, one where I was wearing a hat and sunglasses, and another that was kind of like an easy photo in case those other two didn't work. And then I waited to hear how it went and see for myself how well this software works. And Nick Ferrara ghosts me. He just totally disappears. Disappears, won't pick up when I call him, doesn't respond to my email.
PJ Vogt
Kashmir says she tried this again with a different police officer in a different department, and the same thing happened. They were friendly at first. Kashmir asked them to run a search on her face. They agreed, and then they were gone. And so eventually, I kind of recruited a detective in Texas, a police detective who was kind of a friend of a friend at the Times, and said, oh, you're looking into this company. I'm happy to download the tool, tell you what it's like.
Kashmir Hill
And so he requests a trial of Clearview. And at this point, Clearview was just giving out free trials to any police officer as long as they had an email address associated with the department. That's what Facebook did when they first opened, but with college campuses. Yeah, exactly. It was exclusive just for government workers.
And so he goes to their website, where he can request a trial within 30 minutes. He's got Clearview on his phone, and he starts testing it, running it on some suspects whose identity he knows, and it works. He tried it on himself, and he kind of had purposefully not put a lot of photos of himself online because he was worried about exposure and people coming after him, who he had been involved in catching, sending to jail. And it worked for him. It found this photo of him on Twitter where he was in the background of someone else's photo, and he had been on patrol, so it actually had his name tag on it.
So it would have been a way to get from his face to his name. And he immediately thought, wow, this is so powerful for investigators. But it's gonna be a huge problem for undercover officers. If they have any photos online, it's gonna be a way to figure out who they are.
PJ Vogt
Yeah. And so I told him about my experience with other officers running my photo, and he ran my photo, and there weren't any results, which was weird. 'Cause I have a lot of photos online. Like, it just came up, like, nothing, nothing. And then within minutes, he gets a call from an unknown number.
Kashmir Hill
And when he picks up, the person says, this is Marco with Clearview AI tech support, and we have some questions about a search that you just did. Oh, my God. And he says, why are you running photos of this lady from the New York Times? And the detective kind of plays it cool, and he's like, oh, I'm just testing out the software. How would I know somebody in New York?
I'm in Texas. And anyways, his account gets suspended. Oh, wow. And this is how I realized that even though Clearview's not talking to me, they have put an alert on my face. And every time an officer has run my photo, they've gotten a call from Clearview telling them not to talk to me.
PJ Vogt
Just to spell out what Kashmir believed was going on here. These police officers may have thought they were using a normal search engine like Google, but what they hadn't counted on was that someone on the other end of that search engine seemed to be watching their searches, surveilling the cops who were using the surveillance technology. It was a moment where Kashmir saw clearly how this mysterious company, by being the first to build this tool no one else would, had granted itself immense power to monitor Kashmir, to monitor these cops. This company, whose product would reduce the average American's privacy, was keeping quite a lot of privacy for itself. Of course, Kashmir is, fortunately for us, a nosy reporter.
So all this cloak and dagger behavior just made her more curious. She tries to crack into the company a bunch of different ways. She's reaching out to anybody online who might have links to the company. She finds an address listed on Clearview AI's website. It's in Manhattan, but when she goes there in person, there's no such address.
The building itself does not exist. It's a real Harry Potter moment. Finally, she tries something that does work. On the website PitchBook, she can see two of Clearview AI's investors. Peter Thiel.
No luck there, but also an investment firm based in New York.
Kashmir Hill
They're north of the city, and they weren't responding to emails or phone calls. So I got on the Metro-North and went up to their office to see if they had a real office. And it was kind of an adventure being there. The office was empty. All their neighbors said they never came in.
I kind of hung out in the hallway for about an hour. A Fedex guy came. He dropped off a box. He says, oh, they're never here. And I thought, oh, my gosh, this is a waste of a trip.
But then I'm walking out of the building, and it was on the second floor, and I'm coming down the stairs, and these two guys walk in, and they were wearing, like, lavender and pink, and they just looked moneyed. They stood out. And I said, oh, are you with Kirenaga Partners, which is the name of this investment firm? And they look up and they smile at me and they say, yeah, we are. Who are you?
And I said, I'm Kashmir Hill. I'm the New York Times reporter who's been trying to get in touch with you. And their faces just fall. I said, I want to talk to you about Clearview AI. And they said, well, Clearview AI's lawyer said that we're not supposed to talk to you.
And I was around seven months pregnant at this time. And so I kind of, like, opened my jacket and just clearly displayed my enormous belly. And I was like, oh, I've come so far. It was cold, it was raining out. And David Scalzo, who's the main guy, the main investor in Clearview at Kirenaga, he says, okay.
PJ Vogt
So Kashmir and the two investors go inside the office. Kashmir tells them that all this not talking is making Clearview AI look pretty nefarious. She has a point. And so one of them agrees to go on the record and starts talking about his vision for the company that he has invested in.
Kashmir Hill
David Scalzo said, right now, they're just selling this to kind of like retailers and police departments. But our hope is that one day everybody has access to Clearview. And the same way that you Google someone, you'll Clearview their face and be able to see all the photos of them online. He says, yeah, we think this company is going to be huge. And now they give Kashmir the information that she'd really been looking for: the names of the people who are actually responsible for this tool.
And they said, oh, yeah, we're really excited about the founders. And they say it's this guy, Richard Schwartz, who's kind of a media politics guy, worked for Rudy Giuliani when he was mayor. And then there's this tech genius, real mastermind, young guy, and his name's Hoan Ton-That. And we were in a conference room. So I'm like, can you write that up on a whiteboard for me?
How do you spell Hoan Ton-That? And so he writes it out, and this is the first time I figure out who the people are behind this.
PJ Vogt
After the break, the story of how Hoan Ton-That and his engineers got your face and my face and 3 billion photos' worth of faces into this powerful new search engine.
Search Engine is brought to you by Seed probiotics. Small actions can have big benefits, like how taking care of your gut can support whole-body health. Seed's DS-01 Daily Synbiotic benefits your gut, skin, and heart health in just two little capsules a day. My relationship with my body is a bit of a nightmare. Probiotics can help with things that are important to me, like digestion and skin health.
Your body is an ecosystem, and great health starts in the gut. Your gut is a central hub for various pathways through the body, and a healthy gut microbiome means benefits for digestion, skin health, heart health, your immune system, and more. Probiotics and prebiotics work best when used consistently, like other routine health habits. Seed's subscription service easily builds DS-01 into your routine, with no refrigeration needed. Trust your gut with Seed's DS-01 Daily Synbiotic.
Go to seed.com/search and use code 25SEARCH to get 25% off your first month. That's 25% off your first month of Seed's DS-01 Daily Synbiotic at seed.com/search, code 25SEARCH. Search Engine is brought to you by Chilipad. Are you tired of sleeping hotter than hell? We are too. The Chilipad bed cooling system is your new bedtime solution.
It lets you customize your sleeping environment to your optimal temperature, ensuring you fall asleep, stay asleep, and wake up refreshed. Chilipad works with your existing mattress. It's a water-based mattress topper that continuously controls your bed temperature from 55 to 115 degrees, allowing your body to rest and recover. I do not sleep well, ever, mainly because of temperature stuff. I feel like my whole life is just spent with one foot out of a duvet cover and one foot in. So I find this new technology very exciting.
If you'd like to check it out, visit www.sleep.me/search to get your Chilipad and save up to $315 with code SEARCH. This offer is available exclusively for Search Engine listeners and only for a limited time. Order it today with free shipping and try it out for 30 days. You can return it free if you don't like it.
With their sleep trial, visit www.sleep.me/search, because you're not just investing in better sleep, you're creating a better life.
Now that Kashmir had a name to search, Hoan Ton-That, she learned Hoan had an Internet trail. This guitar you're hearing is part of that trail. Hoan had a habit, in one chapter of his life, of posting YouTube videos of himself pretty capably playing guitar solos. In the videos he doesn't speak, but you see him, tall and slender, with long black hair, fashionable. Hoan has Vietnamese roots.
Raised in Australia, he moved to San Francisco in 2007. His Internet breadcrumbs suggest a strange collage of a person. A Bay Area tech guy who presents in a slightly gender-fluid way, has photos from Burning Man, but then also seems like a bit of a troll. In a Twitter bio, he claims to be a, quote, anarcho-transsexual Afro-Chicano American feminist studies major. What is clear is that Hoan had come to America with big dreams of getting rich on the Internet.
He started in the FarmVille era of Facebook apps, when you could make money building the right stupid thing online. Nothing he tried really took off, though. Not apps like friend quiz or romantic GIFs, not later efforts, like an app that took an image of you and photoshopped Trump's hair on your head. In 2016, Hoan would move to New York. At some point, he'd delete most of his social media, but Kashmir found an old artifact of who he was on the Internet back then.
Kashmir Hill
I found an archived page of his from Twitter on the Wayback Machine, and it was mostly him kind of retweeting, like, Breitbart reporters and kind of saying, why are all the big cities so liberal? Yeah, he doesn't have a Twitter account. He doesn't have a Facebook account. It seemed like, wow, this is weird. Like, this guy is in his twenties, I think, but he doesn't have a social media presence beyond a Spotify account with some songs that he apparently had done.
It was a strange portrait, but I came away thinking, wow, this person is a character, and I really want to meet him. At this point, it seemed like the company understood that Kashmir Hill was not going to go away. A crisis communications consultant reached out and eventually offered to arrange an interview with Hoan Ton-That. When I met him, he was not what I expected him to be, which was he still had the long black hair, but now he had these glasses that felt very like office-worker glasses. And he was wearing this blue suit, and he just looked like a security startup CEO.
PJ Vogt
Okay. Which just, again, wasn't what I expected based on everything else I saw about him. We met at a WeWork because they didn't have an office. I would find out that he mostly kind of worked remotely, did a lot of the building of Clearview AI. At the time, he lived in the East Village, and he kind of just did it in cafes, like places with free wifi.
Kashmir Hill
So they booked a room at WeWork for our meeting. The crisis communications consultant was there. She'd brought cookies. What type of cookies? Chocolate chip.
And I feel like they were Nantucket or Sausalito cookies. I can't remember the brand. Okay. But, yeah. And we had lattes at the WeWork cafe, and we sat down and I just started asking my questions, and for the most part, he's answering them.
And we had a couple of hours to talk, and he really was telling me a lot. And so it was this complete 180 in person. He's very charismatic, very open, and would be evasive about some things. Wouldn't describe anyone else involved with the company besides Richard Schwartz, his co founder. But, yeah, I mean, he was open.
And I was like, you have built this astounding technology. Like, how did you do this? How did you go from what you're telling me about Facebook apps and iPhone games to building this? He said, well, I was standing on the shoulders of giants. And he said, there's been this real revolution in AI and neural networks and a lot of research that kind of the most brilliant minds in the world have done.
They've open-sourced it. Oh, they've put it on the Internet. Hoan told Kashmir that in 2016, in the early days of building what would become the Clearview AI facial search engine, he taught himself the rudiments of AI-assisted facial recognition by just essentially googling them. He'd gone on GitHub and typed in face recognition. He'd read papers by experts in the field.
PJ Vogt
He told her, quote, it's gonna sound like I googled flying car and then found instructions on it. Which wasn't too far off. Until pretty recently, facial recognition existed but was somewhat crude. What Hoan was learning on the Internet was that machine-learning neural networks had just changed all that. Now computers could teach themselves to recognize a face, even at an odd angle, even with a beard, provided that the computer was given enough images of faces, training data to learn on. We reached out to Clearview AI for this story.
We didn't get a response, but in the years since his interview with Kashmir, Hoan has done plenty of interviews with the press. One thing I do respect is the fact that you decided to come here live for an interview. So I appreciate you for taking the time. And thanks, Pat, for having me on.
Here's one with the YouTube show Valuetainment. Hoan's dressed as he was with Kashmir, in a suit, looking again like a standard tech exec, just with unusually long hair. Here he describes what his research process for this search engine was like. I was looking at the advances in AI, so I saw ImageNet, which is a competition for recognizing things in images. Is this a computer?
Hoan Ton-That
Is this a plant? Is this a dog or a cat? And the results got really good. And then I looked at facial recognition and I would read papers. So Google, Facebook, both had deep learning papers on facial recognition.
I said, hey, can I get this working on my computer? And we ended up getting it working. And what we realized was getting more data to train the algorithm to make it accurate across all ethnicities. Iranian people, black people, white people, brown people. That was really key to improving performance.
PJ Vogt
This would be Hoan's real innovation, a somewhat dark one. His advantage was how he would find the training data he needed. He built a scraper, a tool that would take, without asking, photos of human faces pretty much anywhere on the public Internet they could be nabbed. He also hired people who built their own scrapers to hoover up even more photos. He said, part of our magic here is that we collected so many photos, and he built this scraper.
Kashmir Hill
He hired people from around the world to help him collect photos. And so it's similar to when people talk about large language models right now, and companies like OpenAI. Some of what they're doing is tuning their neural networks, but a lot of what they're doing is feeding their neural networks. It's like they have to find every text that's ever been published in every library, and then they run out of all the library texts, and they have to find transcripts of YouTube videos, which maybe they shouldn't be loading in there. Part of how he got his product ahead of the others: he was not a genius at making the underlying AI, which was mostly open source.
PJ Vogt
He was passionate about finding faces on the Internet to put into it. Yes. And so where was he looking? Oh, man. So the first place he got faces was Venmo.
Kashmir Hill
This is funny to me, because as a privacy journalist, I remembered people being upset at Venmo's approach to privacy, which at the time was, if you signed up for Venmo, your account was public by default. Yeah. And your profile photo was public by default. And so he built this scraper, this bot that would just visit venmo.com every few seconds. And Venmo, at the time, had a real time feed of all the transactions that were happening on the network.
And so he would just hit Venmo every few seconds and download the profile photo, the link to the Venmo profile. And he got just millions of photos this way from Venmo alone. And this is essentially what he was doing. But with, I mean, thousands, millions of sites on the Internet, Facebook, LinkedIn, Instagram, employment sites. Yeah.
Just anywhere they could think of where there might be photos.
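The polling pattern Kashmir describes, repeatedly hitting a public feed and saving any profile photo you haven't seen before, can be sketched in a few lines. This is purely an illustrative sketch of that kind of scraper, not Clearview's actual code: the feed function, URLs, and data shapes below are invented for the example, with the network request stubbed out so the snippet is self-contained.

```python
import time
from dataclasses import dataclass

@dataclass
class Transaction:
    profile_url: str
    photo_url: str

def fetch_public_feed():
    # Stand-in for an HTTP GET of a public, real-time transaction feed.
    # A real scraper would request the feed endpoint and parse its JSON;
    # here we return canned data so the sketch runs anywhere.
    return [
        Transaction("https://example.com/u/alice", "https://example.com/u/alice.jpg"),
        Transaction("https://example.com/u/bob", "https://example.com/u/bob.jpg"),
    ]

def poll_feed(iterations, interval_s=0.0):
    """Repeatedly hit the feed, collecting photo URLs not seen before."""
    seen = set()
    collected = []
    for _ in range(iterations):
        for tx in fetch_public_feed():
            if tx.photo_url not in seen:  # dedupe across polling passes
                seen.add(tx.photo_url)
                collected.append((tx.profile_url, tx.photo_url))
        time.sleep(interval_s)  # the scraper described paused a few seconds per pass
    return collected
```

Because the feed here is static, repeated passes add nothing new; against a live feed, each pass would pick up whatever transactions had appeared since the last one.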
PJ Vogt
I just want to butt in here to say all of this is completely astonishing. I know people at the dawn of social media who just didn't want to join Facebook or didn't understand why you would voluntarily offer your private life to the public. But I don't think anyone, or at least anyone I knew, had an imagination sufficiently tuned to dystopia to know that if you had the brazenness to upload your photo to Venmo or to LinkedIn, you could one day be feeding your face into a machine. A machine that today, if you go to a protest, is capable of using a photo of your face to find your name, your email, your employer, even your physical address. Who knew this was the future we were fumbling our way towards?
I asked Kashmir about all this. I used Venmo, I use Facebook. I'm fairly sure that when I signed up, I agreed to terms of service. I did not read it carefully. I don't think there was a section in there okaying that my face could be used in photo-scraping software.
Is what he did legal? So Venmo and Facebook both sent Clearview AI cease and desist letters saying, stop scraping our sites and erase the photos that you took, delete the photos that you took. But they never sued. So this hasn't been tested in court, whether it's illegal or not, so it's still a bit of a gray area. And it hasn't been tested with Clearview because none of these companies have sued them.
In one interview, shot just a month after his conversation with Kashmir, Hoan sat down with a CNN reporter who asked about this, the legality of his project. Is everything you're doing legal? Yes, it is. So we've gone through and have some of the best legal counsel, from Paul Clement, who used to be the solicitor general of the United States.
Hoan Ton-That
He's done over 100 cases in front of the Supreme Court. And he did an independent study saying that the way it's used is in compliance with the Fourth Amendment. All the information we get is publicly available. And we have a First Amendment right to have public information on the Internet. And you have to understand what it's also being used for.
We're not just taking your information and selling ads with it or trying to get more views. We're actually helping solve crimes with this. So your counsel is making the argument that there's a First Amendment right to information that is publicly on the Internet? Yes.
And so if you take something like Google: Google, you know, crawls the Internet, collects all these web pages, and you search it with keyword terms. We're the same thing. We take all this information on public web pages, but search it with the face. Hoan's making a pretty radical argument here, even though his tone doesn't suggest it. He's saying that someone being able to take your face and use it to make a search that will pull up your name, possibly your address and more, is nothing new. It's just like Google.
PJ Vogt
His point is that Google collects every instance of your name on the Internet; Clearview AI is just doing that, but with your face and, you know, attaching it to your name. Whether you agree with this idea or not, it has happened, and it has fundamentally changed how privacy works. Kashmir says that most of us are just not prepared for this brave new world. I just don't think that most people anticipated that the whole Internet was going to be reorganized around your face.
Kashmir Hill
And so a lot of people haven't been that careful about the kind of photos they're in, or the kind of photos they've put up of themselves, or the kind of photos they've allowed to be put up of themselves. And Hoan actually did a Clearview search of me there, and I said, oh, well, last time this happened, there were no results for me. And he said, oh, there must have been some kind of bug. Sure. He wouldn't admit that they had put this alert on my face or that they had changed my results, but he ran my face and there were just tons of photos, lots of photos I knew about. But in one case, there was a photo of me at an event with a source, and I was like, wow, I didn't realize, I hadn't thought that through.
Now, if I'm out in public with a sensitive source and somebody takes a photo and posts that on the Internet, that could be a way of exposing who my sources are. And it was really stunning how powerful it was. I mean, for me, there were dozens if not hundreds of photos, me kind of in the background of other people's photos. I remember, I used to live in Washington DC, and there were photos of me at the Black Cat, which is a concert venue, just in the crowd at a show. It was incredibly powerful. And I remember asking him, I was like, you've taken this software somewhere no one has before.
You've created this really powerful tool that can identify anybody, find all these photos of them, you're just selling it to law enforcement. But now that you've built this and you've described to me the accessibility of building this, there's going to be copycats, and this is going to change anonymity, privacy as we know it. What do you think about that? And I remember he kind of was silent for a little bit and he said, that's a really good question. I'll have to think about it.
And it was just this stunning moment of seeing in action people that are making these really powerful technologies, who really just are not thinking about the implications, who are just thinking, how do I build this? How do I sell this? How do I make this a success?
PJ Vogt
Since Hoan's interview with Kashmir, it seems like maybe he's had more time to think through better answers to hard questions. We've watched a lot of these subsequent interviews. What you notice is that now he'll say that as long as he's CEO, he'll make sure his tool is only ever in the hands of law enforcement and, in some cases, banks. And he'll point again and again to the one strong reason why Clearview AI does need to exist. Without Clearview AI, there are so many cases of child molesters that would have never been caught, or children who wouldn't have been saved.
Hoan Ton-That
A child predator will be extorting your children online. You don't even know about it. Sextortion, child abuse, child abusers, child crimes, crimes against children, dark web troves and troves of children's faces. These are kids that wouldn't have been identified. This is why our customers are very passionate about keeping the technology and making sure it's used properly.
PJ Vogt
It's hard to take the other side of that argument, but of course, Clearview AI is not just being marketed as an anti-child-predator tool. A Clearview AI investor told Kashmir he hoped one day it would be in the hands of regular people, and potential investors in the company were given the tool to use on their phones, like, just to use as a party trick. Will Clearview AI actually ultimately roll this tool out for wide use? Well, it sort of doesn't matter whether they do or not, because, remember, the copycats already have. After the break: how people are using and abusing this technology right now.
Search Engine is brought to you by June's Journey. June's Journey is a mobile game you can download on your smartphone. Everybody loves a good family mystery, especially one with as many twists and turns as June's Journey. Step into the role of June Parker and search for hidden clues to uncover the mystery of her sister's murder. Engage your observation skills to quickly uncover key pieces of information that lead to chapters of mystery, danger and romance. Where will each new chapter take you?
June's Journey is a hidden object mystery game with a captivating detective story, taking you back to the glamour of the 1920s with a diverse cast of characters. Each new scene takes you further through a thrilling murder mystery story that sets the main protagonist, June Parker, on a quest to solve the murder of her sister and uncover her family's many secrets. I have never met an iPhone game that I don't like to play. Instead of thinking, try June's Journey. You can download it for free on iOS or Android.
Discover your inner detective when you download June's Journey for free today on iOS and Android. Search Engine is brought to you by NetSuite. Okay, quick math. The less your business spends on operations, on multiple systems, on delivering your product or service, the more margin you have and the more money you keep. Obvious. But with higher expenses on materials, employees, distribution, and borrowing, everything costs more.
So to reduce costs and headaches, smart businesses are graduating to NetSuite by Oracle. NetSuite is the number one cloud financial system, bringing accounting, financial management, inventory, and HR into one platform and one source of truth. With NetSuite, you reduce IT costs because NetSuite lives in the cloud, with no hardware required, accessed from anywhere. You cut the cost of maintaining multiple systems because you've got one unified business management suite, and you're improving efficiency by bringing all your major business processes into one platform, slashing manual tasks and errors. Over 37,000 companies have already made the move.
So do the math. See how you'll profit with NetSuite. Backed by popular demand, NetSuite has extended its one-of-a-kind flexible financing program for a few more weeks. Head to netsuite.com/pj. So you leave that conversation at that point.
Do you write your story? Yeah, I think the story came out about a week after that interview. And what was the response to the story and the size of the thing? Yeah, so it was a front page Sunday story. They call it a bulldog story.
Kashmir Hill
Why is it a bulldog story? Yeah, a bulldog, because you open the paper and then it was the whole spread on the inside as well. And it was a big deal. I remember it landed and my phone was just blowing up because I was being tagged. This was back when Twitter was still a healthy space for conversation.
So my Twitter's blowing up. I'm getting all these emails. People want to book me to talk about it on TV, on the radio. It was just this huge deal. People were stunned that it existed, that it was using their photos without consent, just the way Clearview had gone about it. The fact that they had surveilled me as a journalist and tried to prevent the reporting on the story.
It was a huge deal. I thought, this is going to be one of the biggest stories of the year. This was January 2020.
PJ Vogt
I see. Shit. So the pandemic happens, and does it just kind of, like... Yeah, like I was hearing that there were gonna be hearings in DC. They start getting cease and desist letters. Lawsuits happen.
Kashmir Hill
But then March 2020, COVID hits. And it just instantly changed the conversation in us and around the world to health concerns, safety concerns. And then I started seeing people talking about, can we use facial recognition technology to fight COVID? Can we start tracking people's faces, see where people were with other people? If there's a known exposure, can we track people?
And there was this talk about, yeah, using facial recognition technology. So it's like we almost skipped the scared outrage phase of the technology, because the needle kind of juddered on everything with COVID. Yeah, we did a little bit. I mean, for certain groups, like European privacy regulators: they all launched investigations into Clearview, and they essentially kicked Clearview out of their countries. They said it was illegal. I mean, there were repercussions for Clearview.
But I feel like the bigger conversation, what do we do about this right now, just got pushed aside by that larger concern around COVID. Somehow our debate about these search engines was just one of the infinite strange casualties of COVID, a conversation we never quite got to have. In the meantime, Clearview AI's copycats have continued to go further than the company itself, offering their search engines online for the public to use at a small cost. None of these search engines is as powerful as Clearview, but all of them are powerful enough to do what privacy advocates were worried about back in 2011.
PJ Vogt
The tool I would end up finding online was one of those copycats. I have been noticing more and more people using them, mainly to settle scores with strangers on social media. I asked Kashmir where she has noticed people using these search engines in the wild since she published her book. I've seen news organizations using it. One of the more controversial uses was a news organization that used it after October 7 to try to identify the people who were involved in the attacks on Israel.
Oh, wow. And I was a little surprised to see it used that way. Why were you surprised? I was surprised just because it's still controversial whether we should be using facial recognition this way. And the same site that was using it had published stories about how controversial it is.
Kashmir Hill
Oh, interesting, that these search engines have scraped the public web and that they invade privacy. Yeah. So I think it's still complicated. It was a news outlet that had done the maybe-we-shouldn't-have-this stories, and then they were also using the tech.
Yeah, like, maybe this technology shouldn't exist, but also it's there, so we're gonna use it. Which sort of reveals, like, the story of every piece of technology we've ever had a stomach ache about, which is: we say we don't want it to exist, and then some contingent circumstance arises in which at least some of us feel like, well, it's okay for me to use this here, even if I don't think it should exist. Right. Case by case basis.
PJ Vogt
I did ask Kashmir whether she'd seen these search engines used in a clearly positive way. I've heard of people using it on dating sites to figure out if the person they're talking to is legit. Make sure they're not being catfished. Make sure this person is who they say they are. I've heard about it being used by people who have compromising material on the Internet, say they have an OnlyFans or just something they don't want to exist on the Internet.
Kashmir Hill
And they've used these search engines to figure out how exposed they are. And some of the search engines do let you remove results. And so they've done that. They've gotten rid of the links to stuff they don't want the world to know about. I've talked to parents who have used these search engines to figure out if there's photos of their kids on the Internet that they don't want to be out there.
PJ Vogt
Oh, wow. So one woman I talked to, she's an influencer. She gets a lot of attention and she didn't want it kind of blowing back on her kids, so she stopped featuring them in any of her videos. And she searched for them with one of these search engines and found out that there was a news photo of one of her kids. A summer camp, I think, that one of the kids had gone to had posted photos publicly, and so she asked them to take it down. But, yeah, I mean, there are some positive use cases for these engines.
So what Kashmir is saying is that the most positive use case for these search engines might just be finding compromising content on the Internet about yourself before someone else using one of these search engines does, which seems like a questionable upside. Kashmir has also seen facial search engines used in a way that I have to say was just breathtaking in its pettiness. She recently reported on how the owner of Madison Square Garden was using facial recognition and surveillance to ban from the venue anyone who worked at a law firm his venue was in litigation with. Kashmir even tested this.
She tried to go see a hockey game with a personal injury lawyer, something one used to be able to do freely. So I bought tickets to a Rangers game and brought along this personal injury attorney whose firm was on the ban list. Just 'cause I wanted to see this for myself.
Kashmir Hill
And. Yeah, so I met her. I was meeting her for the first time that night. We stood in line. Thank you.
Is this the ticket? Yeah. I just need to see the seat. There we go. We were walking in, we put our bags down on the conveyor belt and just thousands of people streaming into Madison Square Garden.
But by the time we picked them up, a security guard walked over to us. He said, I need to see some id from her. She shows her driver's license. And he said, you're gonna have to wait here. Just give me one moment.
So I just have to ask you to stand by. Management just has someone to speak with you. And we appreciate your patience, just hanging out for me. Okay. A couple minutes here, we'll get someone.
Down to talk to you. And Amanda came over and gave her this note and told her she wasn't allowed to come in. Wow. She had to leave. But were you aware that your firm is involved with a legal action against the Garden, and your firm is not allowed in.
It was insane to see just how well this works on people, just in the real world, walking around. Yeah, it was so fast. God, that's crazy. When Kashmir reported this story, she actually heard from facial recognition companies who said they were upset that Madison Square Garden was doing this.
PJ Vogt
It was making their tools look bad. It was not how they said they were supposed to be used. But misuse of any technology is almost a given. And facial recognition is being misused not just by corporations, but also by individuals. So there was a TikTok account where the person who ran it, if somebody kind of went a little viral on TikTok, he would find out who they were and expose them.
Kashmir Hill
The one video that really struck me is during the Canadian wildfires, when New York City kind of turned orange. Yeah. Somebody had done a TikTok that just showed Brooklyn and what it looked like, that it looked like something from Blade Runner. Yeah. And this guy walks by and he became the hot Brooklyn dad.
And so the TikTok account found out who hot Brooklyn dad was and then found out who his son was and said, if you want to date somebody who's going to look like hot Brooklyn dad one day, here's his son's Instagram account. That is wildly bad behavior. That's crazy. Cause that person didn't even consent to being in the first video. But I'm sure people sent it to him and were like, hey, the Internet thinks you're hot.
PJ Vogt
Don't worry, they don't know who you are. And they not only invaded his privacy further, but invaded his kid's privacy. Yeah, just for fun. And so that account was doing a lot of that, and 404 media wrote about it, and eventually TikTok took the account down.
I mean, the thing that sort of hovers around all of this is that prior to the invention of these things, it was like the Internet had taken a lot of everyone's privacy. But the one thing we had was the idea that if people didn't know your name or if you did something under a pseudonym, there's a degree of privacy. And now it's like your face follows you in a way it wasn't supposed to. Or the Internet follows your face, is how I think about it. It feels like there's a world in which technology like this would just be like fingerprint databases, where law enforcement would have it, the general public wouldn't have access to it.
Isn't that one way this could be going, instead of the way it's happening? Yeah, that is definitely a possible future outcome where we decide, okay, facial recognition technology is incredibly invasive, kind of in the same way that wiretapping is. So let's only let the government and law enforcement have access to this technology legally. They're allowed to get a court order or a warrant and run a face search in the same way that they can tap your phone line with judicial approval, and the rest of us shouldn't have the right to do it. I think that's one way this could go that seems preferable to me.
Kashmir Hill
It seems good. But then you also think about governments that can abuse that power. So, recently, here in the US, Gaza has been such a controversial issue, and you have people out doing protests. And there was a lot of talk about, well, if you are on this side of it, then you're aligned with terrorists and you are not going to get a job. We're going to rescind job offers to college students who are on this side of the issue, and it's very easy to act on that information.
Now, you can take a photo of that crowd of protesters and you can identify every single person involved in a protest, and then you can take their job away. Or if you're police and there's a Black Lives Matter protest against police brutality, you can take a photo and you can know who all those people are. But I think you notice now when you see photos from the protests, all these students are wearing masks. They're wearing COVID masks or they're wearing something covering their face. And it's because they're worried about this.
They're aware of how easily they can be identified. And the thing is, it might work. But I have tested some of these search engines and if the image is high resolution enough, even wearing a mask, somebody can be identified. Really? So just from like nose and eyes and forehead?
Yes. I did this consensually with a photo of my colleague Cecilia Kang, who covers tech and policy in DC. She sent me a photo of herself with a medical mask on. I ran it through one of the search engines and it found a bunch of photos of her.
PJ Vogt
There's a world, you can imagine it, where someone passes a law and these tools are no longer offered to the public. They become like wiretaps, something only the police are allowed to use. We would get some of our privacy back. But, and this might not come as a surprise, there have been problems when the police use these tools as well. These search engines sometimes surface doppelgangers, images of people who look like you but who are not you, which can have real consequences. Kashmir reported the story of a man who was arrested for a crime he was completely innocent of.
The crime had taken place in Louisiana. The man lives in Atlanta. The police department had a $25,000 contract with Clearview AI, though the cops wouldn't confirm or deny that they'd used Clearview AI to misidentify him.
How do these search engines deal with errors? Do they, like, correct things if they make a mistake? Is there a process? So in the minds of the creators of these systems, they don't make mistakes.
Kashmir Hill
They aren't definitively identifying somebody. They are ranking candidates in order of confidence. And so when Clearview AI talks about their technology, they don't say they're identifying anyone. They say that they are surfacing candidates and that ultimately it's a human being who's deciding which is a match. It's the human making the mistake, not the system.
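Mechanically, "ranking candidates in order of confidence" usually means comparing a face embedding against a gallery of embeddings and sorting by similarity. The sketch below is a generic, hypothetical version of that idea, not Clearview's system; the toy two-dimensional embeddings and names are invented for the example.

```python
import math

def cosine(a, b):
    """Cosine similarity between two embedding vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

def rank_candidates(probe, gallery):
    """Return gallery entries sorted by similarity to the probe embedding.

    gallery is a list of (name, embedding) pairs. Note that no threshold
    is applied and nothing is declared a "match": the system only emits a
    ranked list, and a human reviewer decides whether any entry matches,
    which is exactly where the misidentification risk lives.
    """
    scored = [(name, cosine(probe, emb)) for name, emb in gallery]
    return sorted(scored, key=lambda t: t[1], reverse=True)
```

A doppelganger is simply a gallery entry whose embedding happens to score nearly as high as the true person's, so it lands at or near the top of the ranked list.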
PJ Vogt
So if I were running for local office somewhere and there was a video of someone who looks like me doing something compromising, and someone wrote a news story being like, hey, we put his face in the thing and this is what we found. And I went, hey, you're smearing me. They'd be like, we're not smearing you. We're just pointing out that you look like this guy doing something he's not supposed to do in a video. Right.
Kashmir Hill
It's the news service that covered it that smeared you, not the facial recognition engine. But for the person in jail, they know that they would not have been in jail if this technology didn't exist. Yes, exactly.
So there's this power of the government, right? Power of corporations, and then just as individuals. I think about this basically every time I'm at dinner now at a restaurant and there's people sitting around me, and I start having a juicy conversation, whether it's personal or about work. And I think, wow, I really need to be careful here because anybody sitting around me could, if they got interested in the conversation, snap a photo of your face, and with these kinds of tools, find out who you are. That's what I always think about.
PJ Vogt
I was at a restaurant recently, and it was outdoor dining, and I was with a friend. And in the next closed booth, there was this person. They took a phone call and they were like, one sec. This is Kamala Harris. And I think they were joking, but I could hear them, and I was like, oh, I could just zam their face.
I could kind of figure this out. I might be able to find out privileged stuff about a conversation with a very high ranking member of the US government. Like, this is. I felt real nausea. I felt nausea at the possibilities.
Kashmir Hill
Yeah. I mean, I think there's just so many moments in our daily lives where we just rely on the fact that we're anonymous. Like, you know, you're at dinner, you're having this private conversation, and then creepy PJ is gonna be sitting there and looking up your connection to the vice president. Has it made you more... Are you different in public now?
Yeah, I mean, I just think that is the risk of facial recognition technology. The same way that we feel this concern about what we put on the Internet. Like the tweets you write, the emails you write, the texts you send, just thinking, am I okay with this existing and possibly being tied back to me, being seen in a different context? That is going to be our real world experience. You have to think all the time is something that I'm saying right now that could be overheard by a stranger, something that could get me in trouble or something that I would regret.
And, I don't know, that just terrifies me. I don't want to be on the record all the time, every minute, anywhere I am in public. You just kind of assume that these things that you're doing aren't going to haunt you for the rest of your life, or follow you for the rest of your life, or be tied to you, unless you're a celebrity with a very famous face. And it's been funny because I've talked with various people who do have famous faces, and I talk about this dystopia where it's like everything you do in public will come back to haunt you. And usually after the interview they'll say, that's my life.
I'm like, yes. What this technology does is it makes us all like celebrities. Like famous people. Minus the upsides. Minus the upsides.
PJ Vogt
What do you do if you don't want to be in these databases? Don't have photos of yourself on the public Internet? It's hard not to get into these databases. These companies are scraping the public web so we can't get out of Clearview's database. And there's no federal law yet that gives us the right to do that.
Kashmir Hill
European privacy regulators have said that what Clearview AI did was illegal and that Clearview needed to delete their citizens' photos. And Clearview basically said, we can't tell who lives in Italy or who lives in the UK or who lives in Greece, so there's not really much we can do. It's funny, though, because I'm not a technology CEO, and if you asked me to actually fix that problem, I actually could fix that problem. Like, you could say, anybody can email us and ask to be taken out if they prove that they live in Greece.
PJ Vogt
You would think they could actually do something about it. Yeah. This is where it gets so complicated. For a while, Clearview AI was honoring requests from Europeans who wanted to be deleted from the database. But then at some point, they just stopped and said, actually, we don't feel like we need to comply with european privacy laws because we don't do business in Europe anymore.
God. Yeah. They're like ungovernable. Yeah. In some jurisdictions, you can get the company to delete you.
Kashmir Hill
In the US, there are a few states that have laws that say you have the right to access and delete information that a company has about you. California is one of those states. If you live in California, you can go to Clearview AI and give them your driver's license and a photo of you, and they'll show you what they have of you in the database. And if you don't like it, you can say, delete me. But there are only a few states that have such a law.
For most of us, like here in New York, we don't have that protection, so we can't get out of Clearview's database.
Facial recognition is hard because these companies are based in places that don't have great privacy laws, like the United States, and they're making people around the world searchable. It really is a hard problem. And in a larger sense, as a country, society, world, if we were like, we just don't want this technology to exist. I know this is kind of a child's question, but what would it look like to put the genie back in the bottle? I mean, make it illegal.
Force all companies to delete the algorithms. And you have to decide, are we talking about all facial recognition, your iPhone opening when you look at it, right? Or are we talking about just these big databases that are searching for your face among millions or billions of other faces? I don't think that's gonna happen. I don't think it's going away.
But I do think we have this kind of central question about facial recognition. Should these companies have the right to gather all these faces from the public Internet and make them searchable? And I think that is something that could be shut down if we wanted it to be.
PJ Vogt
Kashmir Hill. She's a reporter at the New York Times and author of the very excellent book Your Face Belongs to Us. Go check it out.
Stick around. After the break, we have some show news.
D
T-Mobile has invested billions to light up America's largest 5G network, from big cities to small towns, including right here in yours. And great coverage is just the beginning. Right now, families and small businesses can save up to 20% versus AT&T and Verizon when they switch. Visit your local T-Mobile store today.
PJ Vogt
Plan savings with three lines of T-Mobile Essentials versus comparable available plans. Plan features and taxes and fees may vary.
Welcome back. So quickly, before we go: this week we are heading towards the end of season one of Search Engine. Is there going to be a season two of Search Engine? How has season one gone?
Great questions. We will be answering them, all of them, and whatever other questions you have about search engines present and future in questionably transparent detail at our upcoming board meeting. The date is Friday, May 31. We will be sending out the details with the time and a zoom link to join. This is only for our paid subscribers, people who are members of Incognito mode.
If you are not signed up but you want to join this meeting, you've got to sign up. You can do so at searchengine.show. You get a lot of other stuff too. You can read about all the benefits on the website. Again, that URL is searchengine.show.
If you're a paid subscriber, look out for an email from us next week, and mark your calendar: May 31, 2024. Search Engine is a presentation of Odyssey and Jigsaw Productions. Search Engine was created by me, PJ Vogt, and Sruthi Pinnamaneni, and it is produced by Garrett Graham and Noah John. Fact checking this week by Holly Patton. Theme, original composition and mixing by Armen Bazarian. Our executive producers are Jenna Weiss-Berman and Leah Reis-Dennis.
Thanks to the team at Jigsaw: Alex Gibney, Rich Perello and John Schmidt, and to the team at Odyssey: JD Crowley, Rob Mirandi, Craig Cox, Eric Donnelly, Kate Hutchinson, Matt Casey, Maura Curran, Josephina Francis, Kurt Courtney and Hilary Schaff. Our agent is Oren Rosenbaum at UTA. Follow and listen to Search Engine with PJ Vogt now for free on the Odyssey app or wherever you get your podcasts. Thank you for listening.
We will see you in two weeks when we'll have a double episode for you. It's our version of the Wall.