Who is Meredith Whittaker, the uncompromising champion of privacy tech who leads Signal

by Time News

Since 2022 she has led the foundation behind the app which, after Telegram’s about-turn, remains the last bastion of privacy absolutism. A former Googler, she left the company controversially in 2019, and says of her app: «Signal works within the ecosystem of the technology industry, against it»

Our Lady of Privacy has a degree in Rhetoric and English Literature from Berkeley and a white quiff in her curly black hair. Since Meredith Whittaker became president of the foundation behind the Signal app in September 2022, she has become a beacon, the icon of another vision of technology, one in which ravenous access to user data is no longer a mantra but something to be repudiated, or at least handled with a thousand precautions. Now that even Pavel Durov, the founder of Telegram, seems to have succumbed to pressure, changing his app’s policies and granting the authorities access (in cases linked to crimes), Signal remains the last free port.

Signal is relatively little known among instant messaging apps: it has between 100 and 150 million users (there are no updated public statistics), compared with Telegram’s 1 billion and WhatsApp’s 2 billion. Its extremely rigorous approach to privacy, however, has made it popular among technologists, journalists, politicians, and entertainment and sports stars. It also appeals to those who want discretion while organizing protests, from Black Lives Matter to the paramilitary militias of the Oath Keepers.
«Signal, first of all, is an app created by a non-profit organization, the Signal Foundation. It is an open source project, so its source code, the instructions with which it is programmed, is visible to everyone, and anyone with programming skills can contribute to its development. This is why Signal cannot resort to "tricks": it is a bit as if every detail of its structure were written on a noticeboard visible to all», explains Riccardo Meggiato, an expert in cybersecurity and computer forensics. Then he adds: «Beyond that, the cryptographic technology used by Signal guarantees one of the highest levels of security: it is end-to-end, so only the participants in a message exchange hold the keys to encrypt and decrypt the contents they share». There is more: «Signal also adopts a "security code" system: each conversation is marked with a distinct security code, visible to the participants, which guarantees its integrity. If the code changes, it is worth checking whether one of the participants has reinstalled the app or changed phones. If not, it is best to be careful. Then there is the option of communicating via username only, so as not to share any telephone numbers». Because of these characteristics, Signal has long been the app of choice for those who need to communicate in a truly secure way, to the point that it has also become a channel for illegal content and communications. «In some ways», concludes Meggiato, «Signal is the "serious" version of what Telegram promised to be and, ultimately, never was».
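
To make Meggiato’s description concrete, here is a minimal sketch, in Python, of the two ideas he mentions: end-to-end encryption, where only the participants ever hold the keys, and a «security code» that changes when someone’s keys change. Everything below is illustrative only and relies on the third-party cryptography package; the actual Signal Protocol is far more sophisticated (X3DH key agreement plus the Double Ratchet, which renews keys continuously), and Signal’s real safety numbers are computed differently.

    # Minimal sketch of end-to-end encryption; NOT the actual Signal Protocol.
    # Requires the third-party "cryptography" package (pip install cryptography).
    import hashlib
    import os

    from cryptography.hazmat.primitives import hashes, serialization
    from cryptography.hazmat.primitives.asymmetric.x25519 import X25519PrivateKey
    from cryptography.hazmat.primitives.ciphers.aead import AESGCM
    from cryptography.hazmat.primitives.kdf.hkdf import HKDF

    def raw(public_key):
        # Serialize an X25519 public key to its raw 32 bytes.
        return public_key.public_bytes(
            encoding=serialization.Encoding.Raw,
            format=serialization.PublicFormat.Raw,
        )

    # Each participant generates a key pair; private keys never leave the device.
    alice_priv = X25519PrivateKey.generate()
    bob_priv = X25519PrivateKey.generate()

    # Diffie-Hellman: each side computes the same shared secret from the
    # other's PUBLIC key, so any server in the middle never sees it.
    alice_secret = alice_priv.exchange(bob_priv.public_key())
    bob_secret = bob_priv.exchange(alice_priv.public_key())
    assert alice_secret == bob_secret

    # Derive a symmetric message key from the shared secret.
    key = HKDF(algorithm=hashes.SHA256(), length=32, salt=None,
               info=b"sketch-e2e").derive(alice_secret)

    # Alice encrypts; only someone holding the derived key can decrypt.
    nonce = os.urandom(12)
    ciphertext = AESGCM(key).encrypt(nonce, b"see you at 8", None)
    assert AESGCM(key).decrypt(nonce, ciphertext, None) == b"see you at 8"

    # A simplified analogue of the "security code": a short fingerprint of
    # both public keys. If either party reinstalls the app or changes keys,
    # the digits change, which is the cue Meggiato says to watch for.
    digest = hashlib.sha256(raw(alice_priv.public_key()) +
                            raw(bob_priv.public_key())).hexdigest()
    print("security code:", digest[:20])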

Born in 2014 from the merger of two open source projects, RedPhone and TextSecure, created by Moxie Marlinspike, an American hacker and cryptographer, Signal is today managed by a non-profit foundation, the Signal Technology Foundation, which is financed exclusively through donations and grants. It is a model that allows it to maintain total independence and to avoid the pure profit logic that often conflicts with the protection of privacy. While WhatsApp, despite using Signal’s own encryption protocol, is owned by Meta (formerly Facebook) and collects metadata on user interactions, Signal adopts a “zero-knowledge” policy: data is encrypted end-to-end and not even Signal’s servers can access it. Telegram, on the other hand, does not apply end-to-end encryption by default, and its server code is not open source, making it impossible to verify its security independently.
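
The contrast this paragraph draws is easiest to see from the server’s side. The sketch below, plain illustrative Python with invented names, shows what a relay handles under end-to-end encryption: the message body is opaque ciphertext, while routing metadata (who, to whom, when) is in principle visible. That metadata is exactly what a «zero-knowledge» posture tries to minimize; Signal, for example, hides the sender field with its “sealed sender” mechanism.

    # Illustrative only: what a relay server "sees" under end-to-end encryption.
    import time
    from dataclasses import dataclass
    from typing import Optional

    @dataclass
    class Envelope:
        recipient: str            # needed to route the message at all
        sender: Optional[str]     # metadata; a "sealed sender" design omits it
        sent_at: float            # metadata: when the exchange happened
        ciphertext: bytes         # opaque: the server holds no decryption key

    metadata_log = []  # what a data-hungry operator could retain

    def relay(env: Envelope) -> None:
        # Even without reading a single message, a server can log who talks
        # to whom and when; that is the "direct flow of data" Whittaker
        # wants to interrupt.
        metadata_log.append((env.sender, env.recipient, env.sent_at))
        # Delivery itself only requires the recipient and the ciphertext.
        print(f"deliver {len(env.ciphertext)} opaque bytes to {env.recipient}")

    relay(Envelope("bob", "alice", time.time(), b"\x93\x1f\x07"))
    relay(Envelope("bob", None, time.time(), b"\x5c\xa1\x40"))  # sender sealed
    print(metadata_log)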

Whittaker’s career

Whittaker’s career path is as interesting as the app she leads. She worked at Google for 13 years, experiencing from the inside the tech giant’s evolution and its shift toward artificial intelligence (AI), well before the sector exploded with ChatGPT. The experience affected her profoundly, prompting her to lead internal protests against alleged sexual harassment in the workplace and against Google’s contracts with the United States Department of Defense for the development of military AI. In 2019 Whittaker left the American giant controversially, to focus on fighting «for a responsible technology industry», saying: «It is clear that Google is not a place where I can continue this work». She has since been true to her word, and her mission seems to have become warning about the risks of artificial intelligence and the dangerous concentration of power in the hands of a few big tech companies. She co-founded the AI Now Institute, with which she tirelessly investigates the ethical and social implications of AI for our future, starting from rights and collective well-being. Whittaker does not believe in dystopian scenarios of AI exterminating us. On the contrary, she invites us to demystify it, reminding us that it is not magic or an immaterial entity, but a technology built on enormous quantities of data concentrated in the hands of a few actors.

And when regulations such as the AI Act adopted by the EU come up, as in an interview with Corriere LOGIN last year, she always pushes the point further: «It’s not enough to simply contain the risks: we need a deeper rethinking of the assumptions on which the development of AI is based. We need to stop for a moment and think: what is artificial intelligence? It is not something intangible that we have brought into this world: it is a technology that rests on an enormous amount of data concentrated in the hands of a small circle of subjects». Hence no sci-fi flights into Terminator-style futures (which Elon Musk, among others, likes so much), but a concrete cry of alarm: «The real risk is that AI begins to present a misogynistic and racist version of reality as natural, deluding us that a machine’s answers are objective. These problems can be avoided, at least in part, by ensuring that everyone is correctly represented in the datasets that power AI».
This is why Signal, with its end-to-end encryption and its non-profit vocation, represents an alternative model for Whittaker: «Signal is working within the tech industry ecosystem, against it. It is trying to create something that interrupts the direct flow of data».

September 28, 2024 (modified September 28, 2024 | 7:58 pm)
