
Lamya Robinson, a Black teenager, was misidentified by a facial recognition system at the Riverside Arena in Livonia, Michigan. (Photo: Fox 2/screenshot)

Skating Rink Boots Black Teen Misidentified by Facial Recognition System

"When we say this is a civil rights issue it goes beyond false arrests, it's about who gets to access public spaces in a world of machine-driven segregation."

In what civil rights advocates on Thursday framed as yet another example of the need for reforms by companies and governments alike, a roller skating rink in Michigan turned away a Black teenager after its facial recognition system misidentified her as someone no longer allowed in the building.

Lamya Robinson had her face scanned at the Riverside Arena in Livonia, Michigan, and was mistaken for someone who had been involved in an altercation there earlier this year. She took a photo of the scan, which her mother, Juliea Robinson, later posted to Facebook.

"So, i had the worst experience with Riverside Skating Rink," she wrote in a post that has since been made private, Newsweekreported Thursday. "They scanned my daughters face and stated that she's banned per this picture below in the corner that is not my child!!"

"To me, it's basically racial profiling," Juliea Robinson told the Detroit TV news station Fox 2. "You're just saying every young Black, brown girl with glasses fits the profile and that's not right."

"I was like, that is not me. Who is that?" Lamya Robinson told the station about the rink's facial scan. "I was so confused because I've never been there."

Her parents are now considering legal action. The teen's father, Derrick Robinson, suggested the rink's move endangered his daughter's safety: "You all put my daughter out of the establishment by herself, never knowing what could have happened."

The Riverside Arena said in a statement:

One of our managers asked [Lamya's mother] Ms. Robinson to call back sometime during the week. He explained to her, this [is] our usual process, as sometimes the line is quite long and it's a hard look into things when the system is running.

The software had her daughter at a 97% match. This is what we looked at, not the thumbnail photos Ms. Robinson took a picture of... If there was a mistake, we apologize for that.

In the segment, Tawana Petty of Data for Black Lives--which is among dozens of groups urging businesses to refuse to use facial recognition systems--pointed out that research has shown such software disproportionately misidentifies people of color.

"Facial recognition does not accurately recognize darker skin tones," Petty said. "So, I don't want to go to Walmart and be tackled by an officer or security guard, because they misidentified me for something I didn't do."

The digital rights group Fight for the Future, in a tweet about the incident, said that "this is why we need to #banfacialrecognition in public places--it's racist and dangerous."

As Common Dreams reported Wednesday, a coalition led by Fight for the Future is calling on retailers to reject facial recognition technology in their stores "to protect the privacy and rights of shoppers and staff."

A scorecard from the campaign lists Apple, Ace Hardware, Albertsons, H.E.B. Grocery, Lowe's, and Macy's as already using the technology, while other retailers have said they will not use it or have left the possibility open. Walmart was among the retailers that told the campaign it won't use such tools in its stores.

The coalition's new call directed at retailers came a day after a U.S. House subcommittee hearing on facial recognition that featured testimony from Robert Williams, a Black man in Michigan who was misidentified by facial recognition software as a shoplifting suspect.

Williams, an ACLU client, wrote for the civil liberties group's blog on Thursday:

I never thought I would be a cautionary tale. More than that, I never thought I'd have to explain to my daughters why their daddy got arrested in front of them on our front lawn. How does one explain to two little girls that a computer got it wrong, but the police listened to it anyway--even if that meant arresting me for a crime I didn't commit?

...I keep thinking about how lucky I was to have spent only one night in jail, as traumatizing as it was. Many Black people won't be so lucky. My family and I don't want to live with that fear. I don't want anyone to live with that fear. Congress should do something to make sure no one else has to.

During his congressional testimony, Williams highlighted the recently reintroduced Facial Recognition and Biometric Technology Moratorium Act, which would prohibit federal entities from using such technologies and use federal grant funding to pressure state and local entities to do the same.

As Fight for the Future director Evan Greer tweeted Thursday, in response to Lamya Robinson's experience: "We need to ban this Jim Crow As A Service software yesterday."

Our work is licensed under Creative Commons (CC BY-NC-ND 3.0). Feel free to republish and share widely.