Privacy in the age of AI is possible, says Proton’s CEO, but one thing keeps him up at night.

ZDNET's key takeaways
- AI and Big Tech are eroding personal privacy.
- Proton's encrypted services are becoming more and more attractive.
- Proton CEO Andy Yen worries about a future filled with rogue AI agents.
As the popularity of AI continues to grow, so have the privacy and security concerns surrounding the technology, especially over the past year.
AI is now a common tool for cybercriminals, making it much easier for bad actors to steal your data. The technology also takes mass surveillance to a new level. AI agents such as OpenClaw have continued to misbehave, leaking or exfiltrating sensitive data, even as technology giants such as Nvidia and Meta adopt them.
Also: Proton just launched an alternative to Google Workspace – and it’s fully encrypted
Earlier this month, I attended the Semafor World Economy summit in DC, where 500 CEOs joined government leaders to discuss the state of global business, including the impact of AI on security and privacy. Andy Yen, CEO of VPN and private digital service provider Proton, spoke on the topic; I sat down with Yen after his panel to discuss how privacy can coexist with AI, what that future looks like, and why he thinks Proton is well-positioned for success.
Privacy in public
AI and data go hand-in-hand: the thinking goes that the more data AI tools have access to, the more capable they are, whether for business or individual applications. That directly pits utility and effectiveness against privacy risk. Even so, AI adoption has surged over the past two years, including in sensitive use cases such as healthcare.
Also: How to review what ChatGPT knows about you – and reclaim your data privacy
Since Proton was founded in 2014, long before the use of AI exploded among everyday consumers, the company has offered users privacy-first alternatives to tools from Big Tech companies like Google, Microsoft, and Meta. Yet Yen does not think the proliferation of AI tools is the root of the public's data privacy problem. In his view, the problem is the mismatch between privacy awareness and technology adoption.
“There are a lot of people who really care about privacy, but they don’t have the technical knowledge and don’t know how to protect themselves,” he said. “And then there’s the middle generation — we’re actually worse off because we didn’t grow up with the privacy instincts of our parents, yet we’re taking on all this technology. So we’re clueless and we’re more exposed.”
That said, Yen hopes that education will solve that.
Also: 5 things you should never tell your chatbot (and how to fix past mistakes)
“The best way to protect someone is to just educate them about the risks,” he said. “If the education piece is done right, then everything else will follow naturally.”
Beyond that solution, however, he believes closing the public's knowledge gap is only a matter of time.
“I think we have to take this from a long-term perspective,” he said. “When we started Proton in 2014, maybe one in ten [people] understood the business model of Google and Facebook. Today, maybe 4 out of 10, and if OpenAI starts running ads and its revenue incentives show, many more people will see it — maybe 7 out of 10.”
For now, Yen believes that the next generation is better prepared for the world created by AI, despite what appears to be apathy.
“Young people are the most aware — they know how Google makes money, how ads work, about algorithms, but they don’t seem to care,” he said. “Given the choice between not knowing and not caring, I choose an aware and indifferent audience, because you can get them to care.”
Also: This privacy-first chatbot is taking off – here’s why and how to try it
Duck.ai, a chatbot from independent browser company DuckDuckGo, saw a surge in web traffic earlier this year. While it doesn't rival industry leaders like ChatGPT and Claude, the spike is in line with a trend Yen said he’s seeing at Proton, and it reassures him that more people will eventually turn to privacy-first options.
“Lumo is the fastest growing product within Proton today,” Yen said of the company’s encrypted chatbot. “That kind of shows that people need AI; they use it every day, it’s a big part of life today, but basically, no one trusts it. The ability to get the benefits of AI, but have the guarantee that your conversation remains private in the future, that’s very powerful. As time goes on, more people will want that.”
The biggest threat of AI
But the protections offered by Proton have their limitations. When I asked Yen what he believed he and Proton were unprepared for when it came to AI, he quickly answered: Agents.
“You can have the strongest encryption in the world, but if as a user you give your agent access to Proton Mail on your device, and that agent goes rogue and sends all your information over the internet somewhere, Proton encryption won’t save you,” he said. “That’s the natural limit to what we’re capable of.” Theoretically, he said, Proton could develop its own agent to guard against this weakness, but it has not yet done so.
Also: The permissions behind your AI Chrome extensions deserve a close look – they may be checking you out
Yen sees local AI as one of the best ways to deal with privacy issues. (Proton’s Scribe AI writing assistant gives users the option to use it locally.) Right now, it’s hard to scale computing on personal devices, but he thinks local AI will become more effective in the next few years.
“If you look at the modern iPhone and compare it to the first smartphones from 10 years ago, the amount of computing, of storage, is orders of magnitude higher, and that trend will continue,” said Yen. “And LLMs aren’t only getting bigger. In fact, we’ll have smaller and more efficient models as time goes on.”
Early intervention
One way to protect future generations from data privacy risks is to keep them out of the Big Tech ecosystem altogether. Yen said he is particularly focused on protecting children, because that is where he believes Proton can have the most impact. Last month, the company introduced an option for parents to reserve their children’s first email address with Proton, even before they are born.
Also: Worried about AI privacy? This new tool from the creator of Signal adds end-to-end encryption to your conversations
“For many people, the time they start caring is when they have children,” he said. “You have a choice: are you going to sign them up for the Google ecosystem, with all the downsides and pitfalls that entails, and lock them in to a lifetime of being a commodity exploited by Big Tech? Or are you going to take another path and set them up for a different start to life?”
For Yen, timing is of the essence in that decision.
“If I’m providing an alternative to someone at the age of 40, after they've been exploited for twenty years by Google, yes, it’s better late than never, but I think it’s best if we can give the next generation a good start from the very beginning,” he said.
Can privacy-first AI compete?
A privacy-first, AI-powered future is perhaps only meaningful if it is implemented at scale. Companies like Proton face the challenge of getting individual consumers and business customers to care enough about privacy to abandon legacy systems and the attractive features they offer. Personalization, for example, is one of the most attractive aspects of AI, and it is only possible with tons of data. Does that limit what encrypted AI can do, or how effectively it can grow?
Yen noted that it is possible to compute successfully on encrypted data, but the biggest difference between a young privacy-first AI company and the leading labs is cost.
“There’s a Google Workspace and a Proton Workspace, and they look the same,” Yen said of his newly released enterprise platform. “But in fact, our work is 10 times more difficult, because we have encryption on top of all that. So it will be more expensive, and it will take longer. But in the end, it will bring a better product to many users, because it will actually protect the data.”
Privacy may produce a better product, but who pays for those extra costs? Proton’s own Workspace announcement says it’s competitively priced: $12 per month (billed annually) or $15 (billed monthly) for the Standard tier, and $20 per month (billed annually) or $25 (billed monthly) for the Premium tier. Proton also says it does not raise prices year over year, even for existing customers. To clarify, a Proton spokesperson told ZDNET that operating an “efficient store” keeps prices low for customers despite the higher costs Yen mentioned.
“I don’t really see any technical barriers to achieving the same performance,” Yen added. “It’s just going to take time.” On the bigger picture of the company’s business model, he said Proton’s premium offerings have proven to be worth the money so far.
“The fact that we don’t have VC investors shows that, in fact, this model is probably less risky than most people think.”



