What kind of phone do you have?
Do you know where it comes from? Do you know where the parts come from? Do you know how the parts work? For many of us, the answer to that last question is no. The modern cell phone has evolved greatly in a short span of time. It is a highly complex device that promises many forms of security, both physical and digital: protections that keep our phones from being broken into or stolen, and protections that safeguard our communications and information. There are many phones to choose from and many different systems behind them.
Do you trust your phone?
Most of us do things on our phones that we would prefer to keep private. For example, some of us use banking applications, send sensitive emails or messages, express beliefs, or use our payment information to make purchases. For these reasons and more, it is important to most users that their phones be secure. For users in countries with oppressive governments or with censorship and information restriction, the privacy and security of these devices are even more important. Luckily, research communities are putting more effort into gathering information to determine which phones are trustworthy, especially for at-risk users. The site this blog is posted on is one such effort, supported by the Open Technology Fund’s Information Controls Fellowship!
What makes a phone trustworthy?
How do you decide whether or not to trust a phone? Phones are incredibly complex systems, and the basis of trust is understanding. Making a phone involves balancing a myriad of hardware and software designs. Within these designs, a few areas are of special interest when deciding whether one should trust their phone:
Cryptographic keys
You might have heard of cryptography or end-to-end encryption, and you’d be right to understand it as technology capable of securing both the communications between two users and the data stored locally on your device. Cryptography uses large numbers called keys, together with some special math, to create a system in which data is encrypted or decrypted with one’s keys. Different cryptographic models use keys differently, but private keys are almost always treated as secrets and shouldn’t be known by any untrusted parties. What’s more, on the modern web almost every communication needs encryption to stay secure; otherwise, an attacker need only be on the same network as you to access any information you share, including credentials for a website such as your bank’s. To store and manage such keys, many phones come with special purpose-built hardware designs that keep your keys secure from anyone who might try to install malware (viruses) on your phone, and even keep them secret from someone who has physically stolen your device.
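To make the idea of keys concrete, here is a deliberately simplified sketch in Python. It is a toy XOR cipher, not real-world cryptography (real systems use vetted algorithms like AES), but it shows the core property: the same secret key both scrambles and unscrambles the data, so whoever holds the key controls the information.

```python
import secrets

def xor_cipher(data: bytes, key: bytes) -> bytes:
    # XOR each byte of the data with the matching key byte.
    # Applying the same operation twice with the same key restores the original.
    return bytes(b ^ k for b, k in zip(data, key))

message = b"my bank password"
key = secrets.token_bytes(len(message))  # the secret key: random bytes

ciphertext = xor_cipher(message, key)
print(ciphertext != message)        # True: the ciphertext is scrambled
print(xor_cipher(ciphertext, key))  # b'my bank password': the right key decrypts
```

Without the key, an attacker watching the network sees only the scrambled bytes; with it, decryption is trivial. That is why keeping private keys secret, and storing them in dedicated hardware, matters so much.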
Trusted software
Most people understand that their phone is running an operating system, or OS for short, and that this operating system handles lots of behind-the-scenes work. A fun fact: modern phones often contain multiple operating systems, usually at least two! At least one of these is reserved for running secure software. This secure or trusted software is chosen by the manufacturer of the device and can handle any number of use-cases. Phones coordinate communication between these operating systems so that certain activities, such as storing or using cryptographic keys, can be carried out with less risk. Since this software has a huge security impact, varies from phone to phone, and isn’t usually accessible by the user, it is of special interest for research efforts looking to understand which phones are trustworthy. If an untrustworthy party were to have full control of the trusted software on your phone, they could implement any number of privacy-violating, security-compromising, or information-controlling capabilities.
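The key property of trusted software is that secrets stay inside it: the main OS can ask it to perform an operation, but never sees the key itself. The Python sketch below is a rough analogy (a real trusted environment is enforced by hardware, not by a class boundary), modelling a keystore that will sign messages but never export its key.

```python
import hashlib
import hmac
import secrets

class TrustedKeystore:
    """Toy model of trusted software: the key never leaves this object."""

    def __init__(self):
        self._key = secrets.token_bytes(32)  # generated inside; never exported

    def sign(self, message: bytes) -> bytes:
        # The main OS hands in a message and gets back only a signature.
        return hmac.new(self._key, message, hashlib.sha256).digest()

    def verify(self, message: bytes, signature: bytes) -> bool:
        return hmac.compare_digest(self.sign(message), signature)

keystore = TrustedKeystore()
sig = keystore.sign(b"transfer $10")
print(keystore.verify(b"transfer $10", sig))    # True
print(keystore.verify(b"transfer $9999", sig))  # False: a tampered message fails
```

Even malware that fully compromises the main OS could, in this model, only request signatures; it could not copy the key out. That is the benefit when the trusted software serves the user, and the danger when it serves someone else.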
Main operating system
Every modern cellphone user is probably familiar with the main OS their phone runs, the most popular being Android and iOS. In modern cellphones, the manufacturer often provides alterations to the original OS, such as design changes or the set of privileged apps that come with the phone. The operating system is complex because it must interact with and utilize most of the other components that, together, make up a phone. It ultimately controls how every request an application makes, and every action the user performs, is handled. This means an operating system is capable of preventing privacy-preserving apps from being available to install, or of leaking the information inside an app. It can also explicitly enforce censorship or aid an app in thwarting privacy or security.
Hardware restrictions and information controls
This blog is about how something called hardware restrictions might give manufacturers or other interested parties the ability to introduce information controls. But what are information controls, and what are hardware restrictions?
Previously, we saw examples of how several components in the complex machines we call phones can give a manufacturer, or another entity in control of the design process such as a parts supplier or a government, the opportunity to implement capabilities that control or monitor the information our phones send, receive, and store. We call these capabilities “information controls”. Information controls are usually implemented on networks or through import/export laws in a country, but with the increasing level of control built into modern phones, the cost of implementing controls on the phones themselves may be dropping sharply.
If your phone were capable of enacting information controls, why couldn’t you just turn them off in settings? Not only could you not turn them off, it is unlikely that anyone could avoid them through software alone. Unfortunately, phone makers have historically not exposed the full capability of the software to the user. This is done for various reasons: often the full capability is cited as too confusing, or letting a non-technical user modify the behavior of the device is said to pose security risks. But if one owns their device and knows the password, what is actually stopping one from disabling these information controls?
In the past, users were able to modify the behavior or code of the applications on their devices. They could download and use other people’s applications directly from the source, and could easily build applications that accessed whatever information they wanted, then distribute them however they wished for anyone to download and use. The user could take control of any application on their device, which meant an application was subject to the user’s intentions. The user could create or download their own operating system and run it on their hardware. The user was in charge of either devising a method to keep their device secure or picking a party they trusted (usually an operating system vendor) to decide the strategy. With modern phones and hardware-based restrictions on the user’s capability, this agency or freedom is often not only no longer the default shipped on devices; it is not available at all.
When we mentioned the three concepts from before (“cryptographic keys”, “trusted software”, and “main operating system”), we talked briefly about how they were designed with security in mind. In most cases the user’s security is the driving force behind these design patterns, but some security decisions are not purely for the user’s benefit. The pursuit of security in modern devices has sometimes meant sacrificing the user’s agency. This sacrifice can provide obscurity for the device manufacturer’s security efforts and can stop certain unwanted parties from accessing the device, such as hackers who have physical access to it or hackers who have compromised an application on it. It can also help stop social engineering attacks, a hacking strategy in which the attacker tries to convince the user to compromise their own system; since users no longer have the ability to completely compromise their own system (only partially), they cannot be talked into doing so by strangers. However, these designs also make room for capabilities that seem motivated by the profit of the manufacturer or other interested parties. They enable security models like Digital Rights Management (DRM), in which the main security objective is to protect data from the user so that digital content such as songs, movies, and games can be delivered to a device without risk of piracy or even fair-use alterations. Another such capability is locking users into certain application distributors or “app stores”, which can deny the user access to certain information or content, but in exchange gives the manufacturer the ability to take a cut of app sales and to ensure that only apps meeting certain security and usability standards exist on the device.
Other capabilities exist in the form of anti-cheat for games, region-locking devices, locking devices to certain carriers, ‘jailbreak’ or ‘root’ detection, and more.
Hardware restrictions are not usually a documented part of a system that can be easily looked up; they are often implemented as a feature of, or a part of, a component in the system’s overall design. This means their parts are integrated at every level of the device, with varying uses and security models (their goals). For example, during the process of powering on, or booting, a device, hardware restrictions are often used to refuse to load any code, and in many cases any operating system or trusted software, that does not exactly match the version the manufacturer chose. The risk of locking the user into the manufacturer’s chosen main operating system is that, once locked in, not only is the degree of freedom for information avenues like web browsers, app stores, and messaging apps decided by the manufacturer; the way that operating system utilizes trusted software, cryptographic keys, and hardware is locked in too. In an environment where the manufacturer has this level of control, there are tons of ways they could implement information controls! This doesn’t mean that a user who can control the main operating system is immune to information controls, but controlling this system is a huge advantage to any interested party trying to implement hardware restrictions. And this is just one example of a hardware restriction and the ways it could make information controls more effective; in reality, there are several ways a manufacturer designing the hardware and software of a phone could restrict the user’s operating capacity through hardware means, with the end goal of controlling information on the device.
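The boot-time check described above can be sketched as a tiny Python model. This is a simplification: real verified boot uses cryptographic signatures checked by a chain of bootloaders, not a single hash comparison, and the names here are invented for illustration. But the decision it makes is the same: code that does not match what the manufacturer approved simply does not run.

```python
import hashlib

# In this toy model, the manufacturer "burns" the hash of its approved
# OS image into the device at the factory.
APPROVED_OS_HASH = hashlib.sha256(b"manufacturer OS v1.0").hexdigest()

def boot(os_image: bytes) -> str:
    # A verified-boot chain refuses to run code it does not recognize.
    if hashlib.sha256(os_image).hexdigest() == APPROVED_OS_HASH:
        return "booting"
    return "refusing to boot: unapproved operating system"

print(boot(b"manufacturer OS v1.0"))  # booting
print(boot(b"user-built custom OS"))  # refusing to boot: unapproved operating system
```

Changing even one byte of the image changes its hash entirely, so the user cannot swap in their own operating system; whoever controls the approved hash (or, in real devices, the signing key) controls what the phone will ever run.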
The research we have planned
It is my (the author’s) personal nightmare that we may find ourselves in a future where countries with the means to develop information technology have the inclination to deploy, sell, or build devices or infrastructure that enacts that country’s own information controls, or provides an authoritarian government the means to set up theirs. I believe properly handling these risks is paramount to human rights, especially when you consider the at-risk users in these circumstances. Authoritarian governments and foreign nations have been using their technological developments and resources to manipulate under-developed nations and minority populations for much of modern history; this control of information must not be allowed to continue. Since hardware restrictions can lock the user into the manufacturer’s choices and obscure the details of how a device works, they can make it hard to trust a device in a world where the cost of implementing information controls on a device is decreasing rapidly, especially if the device originates from a company or nation that exercises its own information controls, has an authoritarian government, or has an incentive to control the flow of information to a population.
As countries and manufacturers continue to influence and develop their own main operating systems, trusted software, and especially hardware, the opportunity to introduce novel hardware restrictions and information controls presents itself as all too easy.
To help mitigate the risks of this situation, and to discover to what degree this harmful use of information controls and hardware restrictions exists today, the Open Technology Fund’s Information Controls Fellowship has greenlit a project dedicated to exploring hardware restrictions and information controls, specifically: which exist, which are possible with today’s hardware, and which risks are greatest in modern phones from authoritarian countries with information restrictions. This project is important because public researchers usually have no more access to these devices than regular users do; as a result, there is only a scattered history of public analysis and awareness of what these designs are capable of. The project will start with this non-technical blog post and continue over the year with the analysis of several target devices. At the end of the project, the findings will be presented in a non-technical manner, and the technical resources gathered will be made publicly available to assist other research.
What can you do?
If you have information you would like to contribute, or need to make contact, you can reach out to email@example.com. In the meantime, the organization funding this research, the Open Technology Fund, has funded excellent projects that provide resources on information controls and open technology. Subscribe to our RSS feed or sign up for notifications below to get updates on the project!