NIST shared dataset of tattoos that’s been used to identify prisoners – Naked Security

In 2017, the Electronic Frontier Foundation (EFF) filed a Freedom of Information Act (FOIA) lawsuit looking to force the FBI and the National Institute of Standards and Technology (NIST) to cough up info about Tatt-C (also known as the Tattoo Recognition Challenge): a tattoo recognition program that involves creating an “open tattoo database” to use in training software to automatically recognize tattoos.

For years, the EFF has been saying that developing algorithms that the FBI and law enforcement can use to identify similar tattoos from images – similar to how automated facial recognition systems work – raises significant First Amendment questions. The thinking goes like this: you can strip out names and other personally identifiable information (PII) from the tattoo images, but the images themselves often contain PII, such as when they depict loved ones’ faces, names, birthdates or anniversary dates, for example.

As part of the Tatt-C challenge, participating institutions received a CD-ROM full of images with which to test their tattoo recognition software. The dataset contains 15,000 images, most of them collected from prisoners, who had no say in whether their biometrics were collected and who were unaware of what those images would be used for.

Since 2017, when the EFF used a FOIA lawsuit to get at the names of the participating institutions, it’s been trying to find out whether the entities realize that there’s been no ethical review of the image collection procedure, which is generally required when conducting research with human subjects.

On Tuesday, the EFF presented a scorecard with those institutions’ responses.

The results: nearly all of the entities that responded confirmed that they'd deleted the data. However, 15 institutions either never responded to a letter the EFF sent in January or replied along the lines of "You can count us as a non-response to this inquiry."

In that letter, the EFF requested that the entities destroy the dataset; conduct an internal review of all research generated using the Tatt-C dataset; and review their policies for training biometric recognition algorithms using images or other biometric data collected from individuals who neither consented to being photographed, nor to the images being used to train algorithms.

But at least one university was still conducting research with the dataset five years later: the University of Campinas (UNICAMP) School of Electrical and Computer Engineering in Brazil. The university sent a letter saying that its researchers are only required to seek ethics review for human data collected within Brazil. Thus, its researcher would keep working on the tattoo images through the end of the year and then delete them.

UNICAMP also refused to acknowledge that the images contained personal information, the EFF says. The group’s take on the matter:

Tattoos are also incredibly personal and often contain specific information and identifiers that could be used to track down a person even if their face and identity have been obscured. For example, even though the names of the inmates were removed from the Tatt-C metadata, the tattoos themselves sometimes contained personal information, such as life-like depictions of loved ones, names, and birth dates that all remain viewable to researchers.

UNICAMP also said that its researcher – Prof. Léo Pini Magalhães – is adding to the dataset by grabbing images of tattoos from the web: a practice that the EFF noted has increasingly come under fire from Congress in light of the Clearview AI face recognition scandal.
