How Android Apps Reveal Our Secrets Without Us Knowing

Digital contact tracing was one of the short-lived successes of the Covid pandemic. In Spain, an app called Radar Covid, designed to help users tell if they had been near an infected person, quickly crashed and burned. The idea never quite worked, but what really scuppered it was the fact that, despite all the initial promises to protect user security and privacy, a Google glitch caused data to leak from Android mobile devices through an unexpected place: the app’s logs.

Now, new research has revealed that Android users’ private data continues to escape through this gap, giving businesses more access than they should have.

“This research uncovers a very important gap that is not well regulated or studied,” says Carmela Troncoso, a researcher at the École polytechnique fédérale de Lausanne (EPFL), a public research university in Switzerland, where she heads the SPRING Lab, which works to understand and mitigate the impact of technology on society. Troncoso, who in 2020 led a European group tasked with developing Covid-tracking apps, warns of the technology’s shortcomings. “It’s a general issue with tracking apps and everything else. The bottom line is that you can’t build something genuinely private on a platform like Android, which is flawed by definition.”

Logs are like a long, detailed diary that records everything that happens in an app. Their original, accepted use is to detect bugs (errors in code) before apps are released to the public. But in reality, that is not all that happens. Google advises app developers to remove logging before apps are released, since logs may contain sensitive information. Yet recent research shows that the logs are still there, and that they contain records of everything.
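To picture how this happens, here is a minimal, hypothetical Kotlin sketch (the tag, function, and pharmacy-search scenario are invented for illustration): a developer adds a call to Android’s standard `android.util.Log` API while debugging and never removes it, so the user’s search ends up in the shared system log.

```kotlin
import android.util.Log

// Hypothetical example: a debug statement left behind in a released pharmacy app.
// Anything passed to Log.d() is written to the device-wide system log (logcat),
// where privileged apps on the device can later read it.
fun onSearchSubmitted(query: String, filterCategory: String) {
    // Leftover from debugging: ships the user's search term and filter to the log.
    Log.d("PharmacySearch", "query=$query category=$filterCategory")
    // ... perform the actual search ...
}
```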

“We have found that logs do not contain purely technical information, but can also contain, whether accidentally or intentionally, personal data or information revealing the user’s activity,” says Juan Tapiador, professor at Carlos III University in Madrid and co-author of the study. “Examples are Microsoft Teams, Discord, and the pharmacy apps CVS and Drug Mart, whose logs reveal a lot about user activity. In the case of Teams, it is possible to know the exact time the user made a call. Among other things, CVS and Drug Mart record the product categories used to filter search results.” In practice, that reveals the type of drug someone is searching for, from contraceptives to cholesterol pills.

Permission to access this vast amount of private personal information on Android is restricted to Google, the device manufacturers, and the apps those manufacturers preinstall. Among them are advertising companies. The amount of data they can reach inside Android mobile devices is hard to quantify: every app runs there, and the logs can contain anything from our location to our interests to our love lives.

Android is based on an open source project maintained by Google. But it’s not a closed ecosystem like the Apple iPhone. “Any phone manufacturer can make changes to the operating system and add apps from other organizations with which it has commercial agreements, including apps from companies that are part of the industry that commercializes personal data and advertising,” says Narseo Vallina-Rodríguez, a researcher at Imdea Networks and co-founder of AppCensus, which analyzes app privacy. “The big problem is that these preinstalled apps are part of the operating system and can have privileged access to sensitive data and resources that a regular app cannot access. This has been the case with the system logs since Android version 4.1.”
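As a rough sketch of what that privileged access looks like (the function name here is invented for illustration): since Android 4.1, the READ_LOGS permission is only granted to system or preinstalled apps, and an app that holds it can simply dump the device-wide log that every other app writes to.

```kotlin
// In AndroidManifest.xml (only honored for system/preinstalled apps since Android 4.1):
// <uses-permission android:name="android.permission.READ_LOGS" />

import java.io.BufferedReader
import java.io.InputStreamReader

// Hypothetical sketch: a preinstalled app holding READ_LOGS can read every
// app's log output, not just its own.
fun dumpSystemLog(): List<String> {
    val process = Runtime.getRuntime().exec("logcat -d")  // -d: dump the log and exit
    return BufferedReader(InputStreamReader(process.inputStream)).readLines()
}
```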

A jungle of hardware and software

It’s a complex balance in a chaotic landscape. Android devices live in a jungle-like environment where dozens of companies are trying to extract data and monetize it without it being obvious. “Security and data protection risks arising from the supply chain are complex to solve. Many parties are involved in the manufacture of a product and all the software it contains, sometimes with complex relationships between them, so the risks of one entity can easily be transferred to others,” says Juan Tapiador.

When asked by EL PAÍS, a Google spokeswoman replied that the company is trying to do both things at once: protect users while giving app developers more options. “User security and privacy are top priorities for Android. We really value the research the community is doing to help keep Android safe. We’re constantly improving Android features to ensure user data is safe and private, while empowering developers to build the best apps possible,” the spokesperson said.

Google said that all apps with access to a device’s data are apps authorized by the device manufacturer, shifting some of the responsibility for potential intrusions to other actors.

“I was really surprised at how much sensitive data is logged by hardware device manufacturers,” said Serge Egelman, a researcher at UC Berkeley and co-founder of AppCensus. “If these devices are certified by Google as official Android devices, then it really needs to be verified that they’re following Google’s policies and basic best practices.”

But nobody monitors or ensures that this information stays out of reach of actors who could misuse it. Bart Preneel, professor at the Catholic University of Leuven and technical lead of the Belgian digital tracking app Coronalert, describes the problem in three points: “Firstly, it has become easy for developers to record a lot of information, and much of it contains sensitive data, particularly when logs from multiple applications are combined. Secondly, this information is useful to Google and the manufacturers, but many other applications authorized by them also have access, so the risk of misuse is very high: it allows user profiles to be created by a large number of parties. And third, Google warns developers against logging too much, but developers do it anyway and nobody monitors it,” says Preneel.

An inadequate remedy from Google

In this case, the information showing up in the logs should not be there at all. For years, Android has advised against recording private activity in these logs. But this is not controlled or monitored, and app developers and device makers are not directly affected by the problem. It is a clear example of how the worst consequences fall on users, who have no idea what is happening: the software is already in place when the device reaches their hands.

After learning about the investigation, Google introduced a warning for users in version 13 of Android: “The mechanisms that Google introduced in Android 13 to improve transparency and inform users about access to logs by preinstalled apps are a good step,” says Vallina-Rodríguez. “They will allow users to control when and who can access this information. However, improving the permissions system only mitigates this specific problem and cannot solve the general problems associated with the lack of control over the supply chain of digital products,” he adds. It is not a sufficient remedy, says Preneel: “It’s just a patch; most users have neither the time nor the inclination to manage such settings.”

Google obviously does not bear full responsibility. App developers should be more careful about what information they allow to appear in logs, and remember that they are not the only ones with access to it: “App makers could log less data,” says Joel Reardon, a researcher at the University of Calgary and co-founder of AppCensus. “Many apps use services like Crashlytics to collect error logs, which allows them to debug an app that is already deployed. Previously, users of such software were called beta testers and participation was voluntary. If app makers don’t intend to look at the logs, there’s far less reason to log as much data as we’ve found.”
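As a hedged illustration of Reardon’s point (the function name, field names, and failure scenario are invented), an app using Firebase Crashlytics can attach useful debugging context to an error report without shipping the raw user input:

```kotlin
import com.google.firebase.crashlytics.FirebaseCrashlytics

// Hypothetical sketch: record that a search failed without logging what
// the user actually searched for.
fun reportSearchFailure(query: String, error: Throwable) {
    val crashlytics = FirebaseCrashlytics.getInstance()
    // Log only coarse, non-identifying context instead of the raw query.
    crashlytics.log("search failed, query length=${query.length}")
    crashlytics.recordException(error)
}
```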
