Here’s the File Clearview AI Has Been Keeping on Me, and Probably on You Too

Credit to Author: Anna Merlan | Date: Fri, 28 Feb 2020 20:22:24 +0000

After a recent, extensive, and rather withering bout of bad press, the facial recognition company Clearview AI has changed its homepage, which now touts all the things it says its technology can do, and a few things it can’t. Clearview’s system, the company says, is “an after-the-fact research tool. Clearview is not a surveillance system and is not built like one. For example, analysts upload images from crime scenes and compare them to publicly available images.” In doing so, it says, it has the power to help its clients—which include police departments, ICE, Macy’s, Walmart, the FBI, and thousands of others, according to a recent BuzzFeed report—stop criminals: “Clearview helps to identify child molesters, murderers, suspected terrorists, and other dangerous people quickly, accurately, and reliably to keep our families and communities safe.”

What goes unsaid here is that Clearview claims to do these things by building an extremely large database of photos of ordinary U.S. citizens, who are not accused of any wrongdoing, and making that database searchable for the thousands of clients to whom it has already sold the technology. I am in that database, and you probably are too.

If you live in California, under the rules of the newly enacted California Consumer Privacy Act, you can see what Clearview has gathered on you, and request that the company delete it and stop collecting your data.

Do you work at Clearview or one of its clients? We'd love to talk to you. Contact Anna Merlan from a non-work device at anna.merlan@vice.com or via VICE's SecureDrop.

I recently did just that. In mid-January, I emailed privacy-requests@clearview.ai and requested information on any of my personal data that Clearview obtained, the method by which they obtained it, and how it was used. (You can read the guidelines they claim to follow under the CCPA here.) I also asked that all said data be deleted after it was given to me and opted out of Clearview's data collection systems in the future. In response, 11 days later, Clearview emailed me back asking for “a clear photo” of myself and a government-issued ID.

“Clearview does not maintain any sort of information other than photos,” the company wrote. “To find your information, we cannot search by name or any method other than image. Additionally, we need to confirm your identity to guard against fraudulent access requests. Finally, we need your name to maintain a record of removal requests as required by law.”

After a moment of irritation and a passing desire not to give these people any more of my information, I emailed Clearview a photo of my work ID badge and a redacted copy of my passport. About a month went by, and then I got a PDF, containing an extremely curious collection of images and an explanation that my request for data deletion and opt-out had been processed. “Images of you, to the extent the [sic] we are able to identify them using the image that you have shared to facilitate your request, will no longer appear in Clearview search results,” the “Clearview Privacy Team” wrote.

The images themselves are indeed all photos of me, ones that I or friends have put on social media, and they are exceedingly odd. (The source of them is odd, not my face, although, that too.)

The images seen here range from around 2004 to 2019; some are from my MySpace profile (RIP) and some from Instagram, Twitter, and Facebook. What’s curious is that, according to Clearview, many of them weren't scraped from social media directly, but from a collection of utterly bizarre and seemingly random websites.

The “Image Index” lists where the photos were obtained; the sites include Insta Stalker—one of dozens of sketchy Instagram scrapers available online—an enraged post someone wrote accusing me of yellow journalism, and the website of an extremely marginal conspiracy theorist who has written about me a handful of times.

Nicholas Weaver, a senior researcher at the International Computer Science Institute at UC Berkeley, said that the response "gives you an insight into the various sources being scraped." He noted that Clearview is not just obtaining images from social media sites like Instagram themselves, but also from other sites that have already scraped Instagram, like Insta Stalker.

The data presented here don’t necessarily confirm that Clearview is able to accurately do what it claims to: allow someone to upload a photo of a subject and return publicly available photos of that person. But I do know, thanks to the CCPA, who Clearview plans to share photos of my face with: “Clearview users: law enforcement, security, and anti-human trafficking professionals,” as they write in their explanation of how they intend to comply with the CCPA.

There’s also this baffling addendum, which seems to suggest that Clearview is going through a security penetration test at the moment: “Occasionally and for limited purposes and durations, third party service providers can use Clearview’s search tools to assess their accuracy and verify our cybersecurity performance.”

What is clear is that this information is available to far more people than Clearview likes to acknowledge, and that the company has future, as-yet-unannounced plans for its photos of your face. Reporters at Gizmodo were recently able to download a version of Clearview’s app, which they report finding “on an Amazon server that is publicly accessible.”

“Other bits of code appear to hint at features under development,” the Gizmodo reporters wrote, “such as references to a voice search option; an in-app feature that would allow police to take photos of people to run through Clearview’s database; and a ‘private search mode,’ no further descriptions of which are available through surface-level access.”

Adam Schwartz, senior staff attorney at the Electronic Frontier Foundation (EFF), wrote in an email to VICE: "EFF is disturbed that Clearview AI has made faceprints of people without their consent, and is selling personal information based on these faceprints to police departments and others. This is why we need privacy laws like the Illinois Biometric Information Privacy Act, which requires consent before collection of biometrics, and the California Consumer Privacy Act, which empowers consumers to access the personal information a business has collected about them."

Clearview has still declined to release a full list of the agencies who use their product. It’s also claimed that the app has been “tested for accuracy and evaluated for legal compliance by nationally recognized authorities,” without citing who those authorities are. And it, of course, represents a breach of privacy more extreme than anything any technology company has ever produced. But at least, if you live in California, you can see what they’ve got on you, and take their word for it that they’ll stop.

Clearview's lawyer and PR spokesperson did not immediately respond to questions asking how many requests the company has received, or how many records it has deleted under applicable laws.

Additional reporting by Joseph Cox.

This article originally appeared on VICE US.