
A security leak involving platform signing keys from major Android device manufacturers (such as LG and Samsung) has created a route for malware apps to get onto user devices under the guise of legitimate updates.
These malware apps can give an attacker full access to an Android device, because the operating system grants any app signed with one of these keys full system-level access. The attack would not necessarily require the end user to download a new app; it could be injected as an update to an existing app on the device. It doesn’t matter whether the app was originally installed via the Play Store, a manufacturer-specific storefront such as the Galaxy Store, or sideloaded onto the device independently.
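To illustrate why that trust is so sweeping, Android treats any package signed with the same certificate as the core “android” framework package as part of the platform itself. The Kotlin sketch below (an illustration only, using the long-standing but now-deprecated PackageManager.checkSignatures API and a hypothetical package name) shows the kind of signer comparison involved:

```kotlin
import android.content.Context
import android.content.pm.PackageManager

/**
 * Illustrative sketch only: returns true if the given package is signed with
 * the same certificate as the "android" framework package, i.e. the platform
 * signing key. Any app for which this holds can share the system UID and
 * request signature-level permissions, which is what makes a leaked platform
 * key so dangerous. checkSignatures() is deprecated on recent API levels but
 * still illustrates the comparison the system itself performs.
 */
fun isPlatformSigned(context: Context, packageName: String): Boolean {
    val result = context.packageManager.checkSignatures(packageName, "android")
    return result == PackageManager.SIGNATURE_MATCH
}

// Hypothetical usage: the package name is a placeholder, not a known malicious app.
// val suspicious = isPlatformSigned(context, "com.example.suspect.update")
```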
The security leak was disclosed by Google, which did not name the manufacturers involved. However, independent researchers were able to identify some of the companies whose keys were stolen from subsequent VirusTotal listings: Samsung, LG, MediaTek (one of the world’s largest chip designers), RevoView (a maker of networking equipment and cameras), and SZROCO (maker of the Walmart-exclusive Onn budget tablet line).
Although Google only recently revealed the security leak to the public, it says that Samsung, LG and all other known affected companies fixed the issue by May 2022. However, malware samples signed with Samsung’s keys were uploaded to VirusTotal only recently. Another worrying aspect is that some of the signed malware samples on VirusTotal date back to 2016.
Google says Android offers multiple layers of protection against these malware apps, including active scanning by the Google Play Protect service and “mitigations” implemented independently by each device manufacturer. Potential harm is also minimized if manufacturers regularly rotate the keys they use, although there’s no real way for the general public to know if this is being done. Google also says it hasn’t detected any of these signed malware apps available on the Google Play Store. Given all of this, the biggest risk seems to come from sideloaded apps downloaded from an independent site like APKMirror.
The manufacturers involved say they have fixed the issue in their own environments, but it’s impossible to know whether more manufacturers have been affected (and what their current status is). Ivan Wallis, Global Architect at Venafi, notes that it is crucial for any manufacturer holding these signing keys to take immediate action: “This is a great example that demonstrates the lack of proper security controls for code signing certificates, particularly the signing keys for the Android platform. These certificate leaks relate to exactly that: the vendor certificates made it into the wild, opening the door to abuse and the possibility of signing malicious Android applications masquerading as specific ‘vendors’. Attackers can gain essentially the same privileges as the core service. The lack of who/what/where/when surrounding the code signing makes it difficult to see the impact of a breach, as this private key can reside anywhere. At this point it must be considered a full compromise of the code signing environment, and key/cert rotation must be immediate.”
Although these security keys are sometimes used to sign manufacturer apps, their main purpose is to verify the legitimacy of the Android build running on a device. Because the system extends that same trust to apps, malware signed with one of these keys gains full administrative access to the device it is installed on. Manufacturers should be careful about securing their certificates, but as Samsung’s recent cybersecurity issues show, this is hardly a foolproof system.
A malware app usually has to trick the user into granting it elevated privileges in some way, by getting them to click on a file, a link, or a prompt. This security leak is more serious than usual because simply downloading a malicious app update could completely compromise the device. Google recommends that manufacturers limit the use of these keys in their apps as much as possible, but some (like Samsung) use them in hundreds of apps (including highly sensitive ones such as Samsung Pay and Samsung Account). An attacker would have to have access to the manufacturer’s internal network or its app store to get malware into such an app, making it very unlikely for malicious updates to slip out that way, but users may want to stick to direct downloads or updates from Samsung and LG until those affected companies can confirm that they have fixed the issue.
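For readers who do sideload, one way to vet an installed update is to inspect its signing certificates directly. The Kotlin sketch below (illustrative only, requiring API 28+; any list of leaked-certificate hashes to compare against would have to come from the public VirusTotal listings and is not reproduced here) computes the SHA-256 digests of a package’s signers:

```kotlin
import android.content.Context
import android.content.pm.PackageManager
import java.security.MessageDigest

/**
 * Illustrative sketch only (API 28+): computes the SHA-256 digests of the
 * signing certificates of an installed package. These digests could be
 * compared against certificate hashes published in the VirusTotal listings
 * for the leaked platform keys; no such hashes are hard-coded here.
 */
fun signingCertDigests(context: Context, packageName: String): List<String> {
    val info = context.packageManager.getPackageInfo(
        packageName, PackageManager.GET_SIGNING_CERTIFICATES
    )
    val signers = info.signingInfo?.apkContentsSigners ?: return emptyList()
    return signers.map { signer ->
        MessageDigest.getInstance("SHA-256")
            .digest(signer.toByteArray())
            .joinToString(":") { byte -> "%02X".format(byte) }
    }
}
```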
The security leak could grow to involve more vendor keys; the best source of information going forward is probably VirusTotal, as Google appears to have opted not to name the parties involved. However, Android devices that exclusively use the Play Store for software are likely insulated against these malware apps. Manufacturers can negate the exploit by rotating the signing keys used by their apps, but this is harder under the older “V2” APK signature scheme, where the new key must be delivered as part of a device security update; the newer “V3” scheme supports key rotation directly. This means that newer devices that still receive security updates are better protected against this exploit.
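Recent Android versions also expose whether a package’s signing key has ever been rotated. The following Kotlin sketch (again illustrative only, API 28+) reads the signing-certificate lineage that the V3 scheme records:

```kotlin
import android.content.Context
import android.content.pm.PackageManager

/**
 * Illustrative sketch only (API 28+): reports whether an installed package
 * carries a signing-certificate lineage, i.e. whether its key has been
 * rotated under the v3 APK signature scheme. A recorded history means the
 * current certificate superseded an older one.
 */
fun describeKeyRotation(context: Context, packageName: String): String {
    val info = context.packageManager.getPackageInfo(
        packageName, PackageManager.GET_SIGNING_CERTIFICATES
    )
    val signingInfo = info.signingInfo ?: return "no signing info available"
    return if (signingInfo.hasPastSigningCertificates()) {
        val lineageSize = signingInfo.signingCertificateHistory.size
        "key rotated; lineage contains $lineageSize certificate(s)"
    } else {
        "no key rotation recorded for this package"
    }
}
```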
Tony Hadfield, Sr. Director of Solutions Architects at Venafi, suggests that better internal documentation of how signing keys are handled is needed to prevent future security leaks of this nature: “This is a great example of what happens when organizations create code without a plan to manage code signing keys. If the keys fall into the hands of an attacker, it can lead to catastrophic security breaches. The only way to avoid this kind of problem is to have an auditable “who/what/where” solution: how do you control signing keys, where are they stored, who has access to them, and what kind of code is being signed? You need this information to protect your keys and also respond quickly to a breach by rotating your public and private keys.”