
Exposing.AI: Project reveals abuse of Flickr photos for facial recognition

Researchers have developed a search engine that can be used to check whether Flickr images have been misused for training facial recognition systems.

For years, companies and scientists have been compiling training datasets for biometric facial recognition systems, often “in the wild” and without the consent of those affected. Adam Harvey and Jules LaPlace, two researchers and artists based in Berlin, are now making this activity in ethical and legal gray areas at least a little more transparent. With “Exposing.AI” they have developed a search engine that users can use to check whether their Flickr images have been misused for such purposes.

The project, which recently went online, allows searches using Flickr identifiers such as the user name, the NSID assigned by the image platform, or a photo ID. Results are displayed only if an exact match is found in the datasets covered. The operators say they neither store search queries nor pass them on; the photos shown are loaded directly from Flickr.com, and no copies are kept.
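In essence, the lookup the project describes amounts to an exact-match query against an index of dataset records. The following Python sketch is purely illustrative; the article does not describe Exposing.AI's actual implementation, and all names and data here are invented:

```python
# Minimal, hypothetical sketch of an exact-match lookup of the kind the
# project describes. None of these names or records come from Exposing.AI;
# they are invented for illustration.

from typing import Iterator

# The index maps each Flickr identifier (user name, NSID, or photo ID)
# to the dataset records that reference it, so a query is a single,
# exact-match dictionary access with no fuzzy matching.
INDEX: dict[str, list[dict]] = {
    "12345678@N00": [{"dataset": "MegaFace", "photo_id": "987654321"}],
}

def find_matches(query: str) -> Iterator[dict]:
    """Yield dataset records whose identifier exactly equals the query."""
    yield from INDEX.get(query, [])

for hit in find_matches("12345678@N00"):
    # Only metadata is shown; the photo itself would be loaded from
    # Flickr.com at display time rather than stored by the service.
    print(hit["dataset"], hit["photo_id"])
```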

Those interested can also use a hashtag to search for photos of themselves that third parties have taken and posted on Flickr; a tag for events or private celebrations such as “#mybirthdayparty” is one possibility. The makers point out, however, that this type of search takes a little longer: “Each photo can contain dozens of tags, which leads to millions of additional data records for the search.”
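That caveat is easy to quantify with a back-of-the-envelope calculation; the figures below are illustrative assumptions, not numbers from the project:

```python
# Hypothetical illustration of why tag searches are slower: with dozens
# of tags per photo, a tag index holds far more entries than an index
# over photo IDs. Both figures here are assumptions.

photos = 3_500_000          # order of magnitude of one large dataset
tags_per_photo = 30         # "dozens of tags" per photo, per the makers

id_entries = photos                    # one entry per photo ID
tag_entries = photos * tags_per_photo  # one entry per (photo, tag) pair

print(f"Photo-ID index entries: {id_entries:,}")   # 3,500,000
print(f"Tag index entries:      {tag_entries:,}")  # 105,000,000
```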

High potential for abuse and damage

“People need to realize that some of their most intimate moments have been weaponized,” Liz O’Sullivan, technology director at the civil rights organization Surveillance Technology Oversight Project (STOP), told the New York Times. The activist helped design Exposing.AI. According to her, the original plan was to use automated facial recognition in the search engine as well, but the team ultimately decided against it: the potential for abuse and harm was too high.

At a recent conference, Harvey drew on findings from his predecessor project “Megapixels” to describe how those hunting for facial images go about driving down the technology’s sometimes still high error rates. Microsoft, for example, simply scraped pictures of celebrities and lesser-known people from the web for its MS-Celeb database, while Duke University photographed students with telephoto lenses from an institute window for the multi-target, multi-camera tracking dataset DukeMTMC. For “Brainwash”, the creators even tapped the image data of a live video stream from a café in San Francisco.

Most of these databases have now been officially shut down, the artist notes: “But you can’t really get them off the net.” The content continues to circulate in “academic torrents” on peer-to-peer networks around the world. Parts of it have demonstrably been taken over by the Chinese army and are now also being used to suppress the Muslim minority in the autonomous region of Xinjiang. Companies involved, such as Megvii, as well as universities should be held liable, the activist demanded.

Deleting photos

In addition to the datasets already mentioned, the project, which is supported by the “Artificial Intelligence and Media Philosophy” group at the Karlsruhe University of Arts and Design and by the Weizenbaum Institute, also enables searches in MegaFace with over 3.5 million photos, DiveFace with over 115,000 Flickr photos, VGG Face, Pipa, IJB-C, FaceScrub, TownCentre, UCCS and Wildtrack. Although Exposing.AI already searches millions of records this way, the creators say there are “countless other training records for facial recognition that are continuously being compiled from social media, news and entertainment sites”. Future versions of the project may be expanded accordingly.

Subsequent deletion of images from copies of datasets already in circulation is not possible, according to the website. For training databases that are still being maintained, the team is working on a function that uses the search results to ask operators to remove one’s own images. Photos that have been removed from Flickr no longer appear on Exposing.AI. (olb)

Christopher Patillo
Christopher Patillo is an accomplished writer and editor with a passion for exploring the intersections of technology, society, and culture. With a Master's degree in Journalism, Patillo has contributed to various publications. His writing focuses on emerging trends in artificial intelligence, digital privacy, and the ethical implications of technology in everyday life. He is also involved in community outreach programs aimed at promoting media literacy among youth.
