Many people talk about hatred: how it is destroying our society and politics. But I think there is an even more powerful emotion responsible for our downfall: fear. Why is it more powerful? Because people won't give up a freedom out of hatred, but they will if they are afraid others will take something that belongs to them or harm someone they care about.
Fortunately, we can use rationality to face fear. Unless we are dealing with a phobia, usually it's enough to show someone that the fear is unfounded. That is exactly what I would like to do with Google's justification for restricting what users can install on their phones.
As of September 2026, every app will have to come from a verified developer¹. Are you a privacy-conscious developer? Goodbye. Are you a hobbyist developer or a student? We are preparing a special account for you (i.e., you will be able to install your app on a maximum of N devices if we like you). Google's claim is that this will better protect its customers from malware.
What I think is behind this decision is a fear of a software boom. By this I mean the following: Google is primarily a software company. Some of its products are so complex (Android and Chrome, for example) that they don't face much competition. Although I believe a solo developer won't be able to create any of these products from scratch in the near future (even with the help of the best AI assistant), I'm not so sure about a medium‑sized company or a collective of individuals in an open‑source project.
That's why I'm afraid Google will take more and more monopolistic measures to protect its market share (deals with device manufacturers, copyright lawsuits, restrictions on the Play Store, and so on).
I see some people a little upset because AI has empowered many creators to do what only a few could until now. I don't view AI so pessimistically. It has increased everyone's capacity to handle complexity. Amateurs can now build simple apps. Those who used to code these apps can move on to more challenging work. Small companies can try to make things that only the big ones could before. And Big Tech can push the edge of technology. To be frank, I do think Big Tech will try its hand at many futuristic enterprises. The problem is that they will also fight tooth and nail to keep their share in what is now less impressive technology.
The future is open source
I have noticed that nowadays it's very easy to check the quality of code. GitHub does this automatically, and there are a lot of plugins for VS Code too. Scanning thousands of lines of code for bugs or maliciously injected code is so tedious that, in my opinion, it is the single most useful application of AI.
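To make the idea concrete, here is a minimal sketch of the kind of static scan such tools automate, written with only Python's standard-library `ast` module. This is not how GitHub's or any vendor's scanner actually works; the `SUSPICIOUS_CALLS` set and the `flag_suspicious_calls` function are illustrative names for this example only:

```python
import ast

# Names we flag as risky; real scanners check far more patterns than this.
SUSPICIOUS_CALLS = {"eval", "exec"}

def flag_suspicious_calls(source: str) -> list[int]:
    """Return the line numbers of direct calls to names in SUSPICIOUS_CALLS."""
    tree = ast.parse(source)
    hits = []
    for node in ast.walk(tree):
        # Only direct calls like eval(...), not attribute calls like m.eval(...).
        if isinstance(node, ast.Call) and isinstance(node.func, ast.Name):
            if node.func.id in SUSPICIOUS_CALLS:
                hits.append(node.lineno)
    return hits

sample = "x = 1\ny = eval(input())\n"
print(flag_suspicious_calls(sample))  # → [2]
```

The point of the sketch is that this kind of tedious pattern-hunting over thousands of lines is exactly what machines, with or without AI on top, are good at.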
Not so long ago, people complained about how hard it was to audit the vast number of open-source repositories. Now I've begun to realize that the safety and privacy of open-source software may soon be indisputably greater than those of proprietary software. However trustworthy the brands behind proprietary software may seem, they are subject to pressure from shareholders, governments, and other actors.
I was recently following a Hacker News discussion about code being cheap. I think not only is code cheap now, but so are text and video: people are a little tired of AI‑generated content. But if code is cheap, code that can't be audited is even cheaper. The same applies to code that violates your privacy or locks you into a specific platform.
Which would you choose: code written in a weekend that’s a black box, or code that does the same thing but has its source available for you to inspect for safety, privacy, and interoperability? I have a feeling these features will increasingly be highly valued by users, much like artistic content produced by real people today².
This brings us back to the initial question: how in the world can submitting a black box to be tested for things like usability and adherence to arbitrary UI standards be safer than making your app open source?
If Google doesn't allow FOSS to be installed on Android, it will be clear that user security was just an excuse.
¹ Initially in a few countries and later everywhere. See: https://developer.android.com/developer-verification

² Okay, maybe there's a bit of wishful thinking here.