Fed up with facial recognition cameras monitoring your every move? Italian fashion may have the answer | CNN Business



Tel Aviv (CNN) —

The red-headed man wearing what looks like the ultimate Christmas sweater walks up to the camera. A yellow bounding box appears around him. Facial recognition software immediately identifies the man as … a giraffe?

This case of mistaken identity is no accident — it’s literally by design. The sweater is part of the debut Manifesto collection by Italian startup Cap_able. As well as tops, it includes hoodies, pants, t-shirts and dresses. Each one sports a pattern, known as an “adversarial patch,” designed by artificial intelligence algorithms to confuse facial recognition software: either the cameras fail to identify the wearer, or they think they’re a giraffe, a zebra, a dog, or one of the other animals embedded into the pattern.

“When I’m in front of a camera, I don’t have a choice of whether I give it my data or not,” says co-founder and CEO Rachele Didero. “So we’re creating garments that can give you the possibility of making this choice. We’re not trying to be subversive.”

Didero, 29, who’s studying for a PhD in “Textile and Machine Learning for Privacy” at Milan’s Politecnico — with a stint at MIT’s Media Lab — says the idea for Cap_able came to her when she was on a master’s exchange at the Fashion Institute of Technology in New York. While there, she read about how tenants in Brooklyn had fought back against their landlord’s plans to install a facial recognition entry system for their building.

“This was the first time I heard about facial recognition,” she says. “One of my friends was a computer science engineer, so together we said, ‘This is a problem and maybe we can merge fashion design and computer science to create something you can wear every day to protect your data.’”

Coming up with the idea was the easy part. To turn it into reality, they first had to find — and later design — the right “adversarial algorithms” to help them create images that would fool facial recognition software. Either they would create an image — of our giraffe, say — and then use the algorithm to adjust it, or they would set the colors, size and form they wanted the image or pattern to take and have the algorithm generate it.
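To make that first route concrete, here is a minimal sketch of a generic adversarial-patch loop. It is not Cap_able’s patented method, which has not been published; it uses PyTorch with torchvision’s pretrained Faster R-CNN as a stand-in detector, and the file names, patch size and placement are all hypothetical.

```python
import torch
import torchvision

# Stand-in detector: Faster R-CNN pretrained on COCO ("person" is label 1).
model = torchvision.models.detection.fasterrcnn_resnet50_fpn(weights="DEFAULT")
model.eval()
for p in model.parameters():
    p.requires_grad_(False)  # only the patch is optimized, never the detector

# Hypothetical inputs: a photo of a wearer and a starting image (the giraffe).
img = torchvision.io.read_image("wearer.jpg").float() / 255.0
patch = torchvision.io.read_image("giraffe.jpg").float() / 255.0
patch = torch.nn.functional.interpolate(patch.unsqueeze(0), size=(200, 200)).squeeze(0)
patch.requires_grad_(True)

optimizer = torch.optim.Adam([patch], lr=0.01)

for step in range(200):
    attacked = img.clone()
    attacked[:, 100:300, 100:300] = patch.clamp(0, 1)  # paste onto a torso-sized region
    detections = model([attacked])[0]
    person_scores = detections["scores"][detections["labels"] == 1]
    loss = person_scores.sum()  # push the detector's "person" confidence toward zero
    optimizer.zero_grad()
    loss.backward()
    optimizer.step()
```

A real physical pattern also has to survive knitting, fabric folds, lighting and viewing angles, which is presumably where much of the engineering effort Didero describes goes.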

“You need a mindset in between engineering and fashion,” explains Didero.

Whichever route they took, they had to test the images against YOLO, a well-known object detection system that is among the most widely used algorithms in real-time computer vision and a common building block in surveillance software that detects people.
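As a rough illustration of that kind of test, the snippet below runs a photo through an off-the-shelf YOLO model and prints what it detects. It assumes the open-source Ultralytics YOLOv8 package and a hypothetical image file; the article does not say which YOLO version or weights Cap_able tests against.

```python
# Check what a pretrained YOLO model "sees" in a photo of the garment being worn.
from ultralytics import YOLO

model = YOLO("yolov8n.pt")  # small COCO-pretrained model; weights download on first use

results = model("garment_photo.jpg")  # hypothetical test image

for box in results[0].boxes:
    label = model.names[int(box.cls)]
    print(f"{label}: {float(box.conf):.2f}")

# A successful pattern means "person" is missing or low-confidence,
# while classes such as "giraffe", "zebra" or "dog" may show up instead.
```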

In a now-patented process, they would then create a physical version of the pattern using a computerized knitting machine, which looks like a cross between a loom and a giant barbecue. A few tweaks here and there to attain the desired look, size and position of the images on the garment, and they could then create their range, all made in Italy from Egyptian cotton.

Didero says the current clothing items work 60% to 90% of the time when tested with YOLO. Cap_able’s adversarial algorithms will improve, but the software it’s trying to fool could also get better, perhaps even faster.
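A figure like that can be estimated in spirit by running a batch of test photos through the detector and counting how often no person is reported. A minimal sketch, again assuming the Ultralytics package and a hypothetical folder of test images:

```python
from pathlib import Path
from ultralytics import YOLO

model = YOLO("yolov8n.pt")
images = sorted(Path("test_photos").glob("*.jpg"))  # hypothetical evaluation set

evaded = 0
for path in images:
    result = model(str(path))[0]
    labels = {model.names[int(c)] for c in result.boxes.cls}
    if "person" not in labels:  # the wearer slipped past the detector
        evaded += 1

print(f"evasion rate: {evaded / len(images):.0%}")
```

A serious evaluation would also vary distance, lighting and camera angle, since physical adversarial patterns tend to be sensitive to all three.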

“It’s an arms race,” says Brent Mittelstadt, director of research and associate professor at the Oxford Internet Institute. He likens it to the battle between software that produces deepfakes and the software designed to detect them. Except clothing can’t download updates.

“It may be that you purchase it, and then it’s only good for a year, or two years or five years, or however long it’s going to take to actually improve the system to such a degree where it would ignore the approach being used to fool them in the first place,” he said.

And with prices starting at $300, he notes, these clothes may end up being merely a niche product.

Yet their impact may go beyond preserving the privacy of whoever buys and wears them.

“One of the key advantages is it helps create a stigma around surveillance, which is really important to encourage lawmakers to create meaningful rules, so the public can more intuitively resist really corrosive and dangerous kinds of surveillance,” said Woodrow Hartzog, a professor at Boston University School of Law.

Cap_able isn’t the first initiative to meld privacy protection and design. At the recent World Cup in Qatar, creative agency Virtue Worldwide came up with flag-themed face paint for fans seeking to fool the emirate’s legion of facial recognition cameras.

Adam Harvey, a Berlin-based artist focused on data, privacy, surveillance, and computer vision, has designed makeup, clothing and apps aimed at enhancing privacy. In 2016, he created Hyperface, a textile incorporating “false-face computer vision camouflage patterns” that might qualify as an artistic forerunner of what Cap_able is now trying to do commercially.

“It’s a fight, and the most important aspect is that this fight is not over,” says Shira Rivnai Bahir, a lecturer in the Data, Government and Democracy program at Israel’s Reichman University. “When we go to protests on the street, even if it doesn’t fully protect us, it gives us more confidence, or a way of thinking that we are not fully giving ourselves to the cameras.”

Rivnai Bahir, who’s about to submit her PhD thesis exploring the role of anonymity and secrecy practices in digital activism, cites the Hong Kong protesters’ use of umbrellas, masks and lasers as some of the more analog ways people have fought back against the rise of the machines. But these are easily spotted — and confiscated — by the authorities. Doing the same on the basis of someone’s sweater pattern may prove trickier.

Cap_able launched a Kickstarter campaign late last year, raising €5,000. The company now plans to join the Politecnico’s accelerator program to refine its business model before pitching investors later in the year.

When Didero’s worn the garments, she says people comment on her “cool” clothes, before admitting: “Maybe that’s because I live in Milan or New York, where it’s not the craziest thing!”

Fortunately, subtler ranges are in the offing, with patterns that are less conspicuous to the human eye but can still befuddle the cameras. Flying under the radar may also help Cap_able-clad wearers avoid sanction from the authorities in places like China, where facial recognition was a key part of efforts to identify Uyghurs in the northwestern region of Xinjiang, or Iran, which is reportedly planning to use it to identify women not wearing the hijab on the metro.

Big Brother’s eyes may become ever more pervasive, but perhaps in the future he’ll see giraffes and zebras instead of you.


