Called “Fawkes”—an homage to the Guy Fawkes mask that’s become somewhat synonymous with the aptly named online collective Anonymous—the Chicago team initially started working on the system at the tail end of last year as a way to thwart companies like Clearview AI that compile their face-filled databases by scraping public posts. “It is our belief that Clearview.ai is likely only the (rather large) tip of the iceberg,” the team wrote. “If we can reduce the accuracy of these models to make them untrustworthy, or force the model’s owners to pay significant per-person costs to maintain accuracy, then we would have largely succeeded.”
See, when a facial recognition model like Clearview’s is trained to recognize a given person’s appearance, that recognition happens by connecting one picture of a face (e.g., from a Facebook profile) to another picture of that same face (e.g., from a passport photo), and finding similarities between the two photos.
According to the Chicago team, those similarities don’t only mean matching facial geometry, matching hair color, or matching moles; they also include invisible relationships between the pixels that make up a computer-generated picture of that face.
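To give a rough sense of how that photo-to-photo matching works, here’s a minimal sketch: real systems run each photo through a deep neural network to produce an “embedding” vector, then compare vectors. The embedding values, the `same_person` function, and the 0.9 threshold below are all hypothetical placeholders, not anything from Fawkes or Clearview.

```python
import math

def cosine_similarity(a, b):
    """Compare two embedding vectors; 1.0 means they point the same way."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def same_person(emb1, emb2, threshold=0.9):
    """Declare a match when the two embeddings are close enough."""
    return cosine_similarity(emb1, emb2) >= threshold

# Hypothetical embeddings: two photos of the same face produce
# nearly identical vectors...
profile_photo = [0.12, 0.87, 0.45, 0.33]
passport_photo = [0.11, 0.89, 0.44, 0.35]
# ...while a different face lands somewhere else entirely.
stranger_photo = [0.90, 0.10, 0.05, 0.70]

print(same_person(profile_photo, passport_photo))  # True — a match
print(same_person(profile_photo, stranger_photo))  # False — no match
```

A cloaking tool like Fawkes works by perturbing the pixels so the photo still looks normal to a human but its embedding drifts away from the person’s other photos, dragging that similarity score below the matching threshold.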
Fawkes isn’t meant to make facial recognition impossible; rather, it’s supposed to be a pain in the ass for the companies involved. “Fawkes is designed to significantly raise the costs of building and maintaining accurate models for large-scale facial recognition,” they write, pointing out that any of us would be more capable of “identifying [a] target person in equal or less time” using our own two eyes instead of using facial recognition software.