Signal’s Meredith Whittaker: AI is fundamentally ‘a surveillance technology’ | TechCrunch
Why is it that so many companies that rely on monetizing the data of their users seem to be extremely hot on AI? If you ask Signal president Meredith Whittaker (and I did), she’ll tell you it’s simply because “AI is a surveillance technology.”
Onstage at TechCrunch Disrupt 2023, Whittaker explained her perspective that AI is largely inseparable from the big data and targeting industry perpetuated by the likes of Google and Meta, as well as less consumer-focused but equally prominent enterprise and defense companies. (Her remarks have been lightly edited for clarity.)
“It requires the surveillance business model; it’s an exacerbation of what we’ve seen since the late ’90s and the development of surveillance advertising. AI is a way, I think, to entrench and expand the surveillance business model,” she said. “The Venn diagram is a circle.”
“And the use of AI is also surveillant, right?” she continued. “You know, you walk past a facial recognition camera that’s instrumented with pseudo-scientific emotion recognition, and it produces data about you, right or wrong, that says ‘you are happy, you are sad, you have a bad character, you’re a liar, whatever.’ These are ultimately surveillance systems that are being marketed to those who have power over us generally: our employers, governments, border control, etc., to make determinations and predictions that will shape our access to resources and opportunities.”
Ironically, she pointed out, the data that underlies these systems is frequently organized and annotated (a necessary step in the AI dataset assembly process) by the very workers at whom it can be aimed.
“There’s no way to make these systems without human labor at the level of informing the ground truth of the data — reinforcement learning with human feedback, which again is just kind of tech-washing precarious human labor. It’s thousands and thousands of workers paid very little, though en masse it’s very expensive, and there’s no other way to create these systems, full stop,” she explained. “In some ways what we’re seeing is a kind of Wizard of Oz phenomenon, when we pull back the curtain there’s not that much that’s intelligent.”
Not all AI and machine learning systems are equally exploitative, though. When I asked if Signal uses any AI tools or processes in its app or development work, she confirmed that the app has a “small on-device model that we didn’t develop, we use it off the shelf, as part of the face blur feature in our media editing toolset. It’s not actually that good… but it helps detect faces in crowd photos and blur them, so that when you share them on social media you’re not revealing people’s intimate biometric data to, say, Clearview.”
“But here’s the thing. Like… yeah, that’s a great use of AI, and doesn’t that just disabuse us of all this negativity I’ve been throwing out onstage,” she added. “Sure, if that were the only market for facial recognition… but let’s be clear. The economic incentives that drive the very expensive process of developing and deploying facial recognition technology would never let that be the only use.”