Compromised Sky Apps and the Security Aftermath
After our earlier article warning users to uninstall the Sky apps from their devices, it’s time to take a look at the technical significance of this attack. The attackers have managed to do two key things here, each of which should be prohibitively difficult if the Play Store update system is to be considered secure:
- Gained access to the Play Store Developer Console of Sky, presumably through gaining access to the associated Google Account
- Obtained access to, or managed to otherwise generate or reproduce, the private RSA keys used to sign the Sky Android app packages
The former is obviously important to security: without access to the Developer Console, it is not possible to push out an update to an existing app. A malicious user could of course publish his own app, but he would not be able to push an update to an app already installed via the Play Store unless he can do so using the account of the developer who originally published it.
The latter is an equally (if not more) important security measure: even if an attacker gains access to your Developer Console, they cannot push an update to an existing app unless it is signed using the same keys as before. This check is also enforced on each individual Android device, meaning that even if there were a bug in the Play Store’s implementation of this check, your own device would reject the update! All bets are off, though, if the private signing key is compromised or accessed.
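The device-side check described above can be thought of as comparing the certificate that signed the update against the certificate of the app already installed. Here is a minimal sketch of that logic in Python; the function names and the toy byte strings are purely illustrative, not Android’s actual implementation:

```python
import hashlib


def cert_fingerprint(cert_der: bytes) -> str:
    """SHA-256 fingerprint of a (DER-encoded) signing certificate."""
    return hashlib.sha256(cert_der).hexdigest()


def update_allowed(installed_cert: bytes, update_cert: bytes) -> bool:
    """An update is accepted only if it is signed with the same
    certificate as the app already installed on the device."""
    return cert_fingerprint(installed_cert) == cert_fingerprint(update_cert)


# Toy byte strings standing in for real certificate blobs.
original = b"sky-original-cert"
attacker = b"attacker-cert"

print(update_allowed(original, original))  # True: same key, update accepted
print(update_allowed(original, attacker))  # False: different key, rejected
```

The crucial point is that the comparison happens on the device itself, which is exactly why a stolen signing key defeats it: an attacker holding the real key produces updates that pass this check legitimately.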
In the case of the current Sky attack, it seems likely that both the Developer Account and the private signing keys were compromised. At this point, the safest option would arguably be for Google to use its remote uninstall trigger on these packages, if there is any indication the packages themselves were compromised. Sky will no doubt resist this, as they would not want to see their apps disappear from users’ devices. Unfortunately for them, though, it is too late for that. These users need to uninstall the app now, as Sky can no longer safely continue to use these keys.
And herein lies the ultimate problem in the Android security chain (and indeed in most certificate-based security systems). There is no system for effective, wide-reaching key revocation. If your Android signing keys are compromised, the trust chain ends. There is no way for you to revoke your compromised key so that clients will no longer trust an app update signed with it. There is also (less important from a security standpoint, but more important from the developer’s perspective) no way to securely supersede these keys with new ones, while ensuring an attacker cannot replace the keys with his own.
What does this mean for developers?
Take all reasonable steps to protect your Google Play Store Developer account:
- Use two-factor authentication on the account.
- Enable a password or PIN lock on any phone which contains the authenticator app. If you enter a backup phone number, consider the risk of an attacker socially engineering your telephone provider, or compromising your account with them, to have a new SIM issued on your account (allowing them to obtain a one-time login token).
- Use a very, very, very, very, very secure password for your Developer account, and do not use this password anywhere else. Do not use this Google account for anything else. Log into it only from your own computer, only over WPA-2 (or better) encrypted WiFi which you control. If you can type this password in under a minute, it’s not secure enough. It should also be random. Random is not your partner’s name followed by the year they were born in. Unless their name is “r$kGmn9d4Fl9&*sEm.Xs2Fl0_3fGjdk” and they were born in year “hJfMn?32VwmndkD2lsk34Rojks83”.
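If you want a password that actually meets that bar, don’t invent one in your head: generate it. A short sketch using Python’s standard-library CSPRNG (the 32-character length is an illustrative choice, not a mandated minimum):

```python
import secrets
import string


def random_password(length: int = 32) -> str:
    """Generate a password from a cryptographically secure RNG --
    never from memorable words or keyboard patterns."""
    alphabet = string.ascii_letters + string.digits + string.punctuation
    return "".join(secrets.choice(alphabet) for _ in range(length))


# 94 possible characters per position gives roughly 6.5 bits each,
# so 32 characters is on the order of 200 bits of entropy.
print(random_password())
```

Store the result in a reputable password manager rather than trying to memorise it; the point is that no human-memorable string survives a determined offline attack.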
Take immense care of your private code signing key as well:
- It is called private for a reason. Users who install your applications are trusting you to keep this key safe. Do that, and do so with your life. You should plan on being dead for many years before anyone can gain access to this. Many years would likely be a number greater than the life of the universe. Or at least the expiry date on the certificate.
- Do not decide to “just backup the key to Dropbox for safekeeping” (or indeed any other cloud or remote storage system).
- Don’t give anyone else access to the key. If you work in a team, can you perhaps operate a system whereby only one person signs the final updates? If it’s really necessary, then share the key, and ensure the other person takes equal levels of precautions.
- Attackers will always attack the weakest point in your system, and will password-reset your Dropbox account if it gets them access to the shared folder where you stored your private key (and password to decrypt the key in the corresponding readme file, which you knew to never do, but did anyway), or snoop on your emails if you ever attach the key.
- Protect your key with a very strong and random password (remember the lecture before about what random means?) – do not use this password anywhere else. Do not store this password somewhere an attacker can gain access to. Do not store it with or near your key.
- Store your private key on an encrypted USB flash drive, disconnect the drive from your computer when not in use, and keep the drive in a safe. Store a second copy of the key in a safe deposit box at your bank. This copy too will obviously be heavily encrypted, using a long, secure, random key.
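One practical habit worth adding to the above: before a backup copy goes into the safe, verify that it is byte-for-byte identical to the original, so you never archive a corrupt keystore. A small sketch using only the Python standard library (the keystore file names are hypothetical; the demo uses throwaway files):

```python
import hashlib
import tempfile
from pathlib import Path


def sha256_of(path: Path) -> str:
    """Stream a file through SHA-256, so large keystores are handled too."""
    digest = hashlib.sha256()
    with path.open("rb") as f:
        for chunk in iter(lambda: f.read(8192), b""):
            digest.update(chunk)
    return digest.hexdigest()


def backups_match(primary: Path, backup: Path) -> bool:
    """Compare the working key file against the copy destined for the safe."""
    return sha256_of(primary) == sha256_of(backup)


# Demo with throwaway files standing in for the real keystore paths.
with tempfile.TemporaryDirectory() as tmp:
    primary = Path(tmp, "release-key.jks")
    backup = Path(tmp, "release-key-backup.jks")
    primary.write_bytes(b"keystore-bytes")
    backup.write_bytes(b"keystore-bytes")
    print(backups_match(primary, backup))  # True: safe to archive this copy
```

A mismatch should stop you in your tracks: either the copy failed, or something has tampered with one of the files.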
Finally though, what lessons should we all learn (and perhaps Google start to ponder)?
- There is currently no way to revoke a compromised application signing key. Arguably this is because anyone can sign an app using their own key and install it on Android (meaning that revocation would be of limited use), but that argument weakens as Google pushes forward with trying to force automated updates upon unsuspecting users. Automated updates are a huge security risk until there is a rapid and effective key revocation system available to developers.
- There is no way for developers to recover from a compromised signing key. Perhaps Google ought to review the signing system so that developers create a CA key, which is then used to cross-sign other keys (such as their application signing key). If an application key is compromised, they could then generate a new one and sign it with their CA key. (This relies upon the developer understanding the scheme, and realising he must guard the CA key with both his and his family’s lives, and guard the application signing key “simply” with his own life.)
- The Sky apps in question had fairly generous permissions on the devices they were installed upon. Perhaps developers should stop making their apps attractive targets, and stop bestowing themselves with such wide-reaching permissions on our devices. There is no reason for the majority of apps to make use of any permissions whatsoever (perhaps aside from being able to access the SD card, a permission I argue should be sandboxed to an app-specific area anyway).
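The CA cross-signing idea suggested above can be sketched in a few lines. This is a toy model only: HMAC over a shared secret stands in for real public-key signatures (in an actual design, clients would verify endorsements with the CA’s public key, not hold any secret), and all the key names are hypothetical. It exists purely to illustrate the chain of trust and the recovery path:

```python
import hashlib
import hmac
import secrets


def sign(key: bytes, data: bytes) -> bytes:
    """Toy 'signature': HMAC-SHA256 standing in for a real signature scheme."""
    return hmac.new(key, data, hashlib.sha256).digest()


def verify(key: bytes, data: bytes, sig: bytes) -> bool:
    """Constant-time check of a toy signature."""
    return hmac.compare_digest(sign(key, data), sig)


# The developer's long-term CA key, guarded even more carefully
# than the day-to-day application signing key.
ca_key = secrets.token_bytes(32)

# Application key v1, cross-signed (endorsed) by the CA key.
app_key_v1 = secrets.token_bytes(32)
endorsement_v1 = sign(ca_key, app_key_v1)

# v1 is compromised: mint a replacement key and endorse it with the CA key.
app_key_v2 = secrets.token_bytes(32)
endorsement_v2 = sign(ca_key, app_key_v2)

# A client that trusts the CA can accept the replacement key...
print(verify(ca_key, app_key_v2, endorsement_v2))  # True
# ...while an attacker's self-minted key carries no valid endorsement.
attacker_key = secrets.token_bytes(32)
print(verify(ca_key, attacker_key, endorsement_1 := endorsement_v1))  # False
```

The design trade-off is exactly the one noted in the bullet above: the scheme only helps if the CA key is kept offline and is dramatically harder to steal than the application key it endorses.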
Hopefully, Google will make use of their remote uninstall ability to remove the app from user devices, and also let them know somehow (e.g. an email to their Gmail accounts) what happened, and that Sky did not look after their key properly. This is a major embarrassment to Sky, and they should hang their heads firmly in shame for not taking sufficient precautions to ensure that everyone with access to their signing key was suitably competent to prevent it falling into the wrong hands.
At the end of the day, it is the end user whose security is affected by the failings of the developer. And for this reason, security lapses such as these are unforgivable.