Professor Headed Project Photographing Unsuspecting Students And Faculty

In 2012, a professor at the University of Colorado at Colorado Springs headed a project, partially funded by U.S. intelligence and military agencies, that photographed unsuspecting students and faculty members to aid in developing facial recognition technology.
The students, faculty members and others targeted were walking across campus when the photographs were taken.
According to The Denver Post, “The photographs were posted online as a dataset that could be publicly downloaded from 2016 until this past April.” The story was first reported by the Colorado Independent.
Professor Terrance Boult and his team worked on the project; Boult stated, “The study is trying to make facial recognition better, especially at long range or surveillance applications. We wanted to collect a dataset of people acting naturally in public because that’s the way people are trying to use facial recognition.”
University of Denver law professor Bernard Chao told The Denver Post, “It’s yet another area where we’re seeing privacy intrusions that disturb us.”
In order to photograph his targets, Boult used a long-range surveillance camera positioned 150 meters from the West Lawn of the Colorado Springs campus. The camera took over 16,000 images; Boult asserted that he waited five years to release the photos in order to protect the subjects’ privacy.
Jared Verner, a CU Colorado Springs spokesman, stated, “The research protocol was analyzed by the UCCS Institutional Review Board, which assures the protection of the rights and welfare of human subjects in research. No personal information was collected or distributed in this specific study. The photographs were collected in public areas and made available to researchers after five years when most students would have graduated.”
Chao countered, “There’s creeping concern that maybe he has all this data and all these photos, and what other use could be used for that?”
Boult insisted, “As long as the systems are bad, their potential misuse is consistent. If police use them and they match the wrong person, that’s not good. Our job as researchers is to balance the privacy needs with the research value this provides society, and we went above and beyond what was required.”
Chao concluded, “He may be helping them do something that’s not right in the first place. I’m not sure I want to be in a state where every place I walk, my picture is being taken and automatically uploaded into facial-recognition software. I actually know I would not like that. I think the response is, ‘Maybe we just shouldn’t be doing this, period.’”
The Colorado Independent noted:
The city of San Francisco rocketed facial recognition technology into the news in mid-May by passing a ban on its use — a shocker for the tech-happy metropolis that made international headlines. The Board of Supervisors passed the law on an 8-1 vote, making San Francisco the first major city to block the technology, which is increasingly used by police to target criminal suspects.
The Colorado Independent added:
A recent report from Georgetown Law’s Center on Privacy & Technology found Detroit and Chicago were using real-time facial recognition technology on vast surveillance camera systems. “Detroit’s system was designed to be able to operate on the city’s ‘Project Green Light’ network of over 500 cameras, including cameras outside places of worship, women’s reproductive clinics, and youth centers,” Georgetown’s website notes.