8

Android and Java provide a crypto API that is relatively easy to use for crypto non-experts.

But since we know that no code can really be protected from reverse engineering, especially string constants used as seeds or shared secrets, I am wondering: What is the point of going through the ordeal of encrypting and decrypting in Android applications?

Am I missing something?

Trying to make my question clearer and more concrete: Suppose I have an application in which certain strings used by the code itself (i.e. not user data) need to be secret. One approach is to store them in encrypted form in the compiled .apk and decrypt them at runtime, using an obfuscated hard-coded password. Another approach is to store them in encrypted form on a remote server, fetch them over the Internet, and decrypt them at runtime using a shared password.
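
For concreteness, here is a minimal sketch of the first approach in Java (the class name, key bytes and Base64 handling are purely illustrative placeholders): whoever decompiles the .apk sees the key right next to the code that uses it.

    import android.util.Base64;
    import javax.crypto.Cipher;
    import javax.crypto.spec.SecretKeySpec;

    public class EmbeddedSecret {
        // The hard-coded key is exactly the weakness described above.
        private static final byte[] KEY = "0123456789abcdef".getBytes(); // 16 bytes = AES-128

        // Decrypts a string constant that ships inside the .apk in Base64 form.
        public static String reveal(String base64Ciphertext) throws Exception {
            Cipher cipher = Cipher.getInstance("AES/ECB/PKCS5Padding");
            cipher.init(Cipher.DECRYPT_MODE, new SecretKeySpec(KEY, "AES"));
            byte[] plain = cipher.doFinal(Base64.decode(base64Ciphertext, Base64.DEFAULT));
            return new String(plain, "UTF-8");
        }
    }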

I don't see much difference between the two, since both require the "secret key" being present in the (reverse-engineer-able) code.

Is there a solution to this problem?

If there isn't a solution, why encrypt at all?

Cœur
  • 37,241
  • 25
  • 195
  • 267
ateiob
  • 9,016
  • 10
  • 44
  • 55
  • 8
    The same reason that you lock your house or your car. It won't keep someone out that really wants to get in, but it will encourage some of them to move on to something that is unlocked. – Wonko the Sane Aug 18 '11 at 20:36
  • 1
    Are you talking about encrypting user data, or encrypting the code of an application? – Nayuki Aug 18 '11 at 20:37
  • 9
    You're missing the fact that you *shouldn't store secret keys in your code*. – dlev Aug 18 '11 at 20:37
  • @dlev OK, but then how do you address the problem described in the update to my post? – ateiob Aug 18 '11 at 20:57
  • @Nayuki Minase No, I am not talking about encrypting user data. I am mainly interested in encrypting **application** data. I just updated my question to make this clearer. – ateiob Aug 18 '11 at 20:59

5 Answers

5

This is not strictly a problem with Android or Java. Anything can be reversed; it's just harder if it's native code. And bear in mind that an attacker doesn't even have to reverse your code: you eventually have to decrypt the data in memory in order to manipulate it, and at that point the attacker can simply take a memory dump and get your data. If they have physical access to the device, and you are manipulating the data in software, there is really nothing you can do to stop them.

The solution for this is to use a dedicated hardware security module (HSM) that is tamper-resistant, or at least tamper-evident (if someone messes with it, it either deletes all data or at least keeps a log of the event). Those come in different shapes and sizes, ranging from smart cards to network-connected devices that cost a lot. Nothing like that is currently available for Android, but maybe it will eventually get something similar to a TPM, so you can store your keys securely and do crypto operations in hardware.

So consider just how secret your data needs to be and decide on an adequate protection level.

You might want to have the data downloaded over SSL (that would protect it in transit), making sure you authenticate both the server (so you know you're getting the right data from a trusted place) and the client (so you can be sure you are only giving the data to the right person). You can use SSL client authentication for this, and it will be much more secure than any custom encryption/key-exchange scheme you (or anyone who is not a cryptography expert) might come up with.
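
A minimal sketch of such a mutually authenticated SSL connection in Java follows; the keystore file names, passwords and URL are placeholder assumptions, not a fixed recipe.

    import java.io.FileInputStream;
    import java.net.URL;
    import java.security.KeyStore;
    import javax.net.ssl.HttpsURLConnection;
    import javax.net.ssl.KeyManagerFactory;
    import javax.net.ssl.SSLContext;
    import javax.net.ssl.TrustManagerFactory;

    public class MutualSslFetch {
        public static HttpsURLConnection open() throws Exception {
            // Client certificate: authenticates the app/user to the server.
            KeyStore clientStore = KeyStore.getInstance("PKCS12");
            clientStore.load(new FileInputStream("client.p12"), "client-pass".toCharArray());
            KeyManagerFactory kmf =
                    KeyManagerFactory.getInstance(KeyManagerFactory.getDefaultAlgorithm());
            kmf.init(clientStore, "client-pass".toCharArray());

            // Trust store holding the server's certificate: authenticates the server to the app.
            KeyStore trustStore = KeyStore.getInstance("BKS");
            trustStore.load(new FileInputStream("truststore.bks"), "trust-pass".toCharArray());
            TrustManagerFactory tmf =
                    TrustManagerFactory.getInstance(TrustManagerFactory.getDefaultAlgorithm());
            tmf.init(trustStore);

            SSLContext ctx = SSLContext.getInstance("TLS");
            ctx.init(kmf.getKeyManagers(), tmf.getTrustManagers(), null);

            HttpsURLConnection conn =
                    (HttpsURLConnection) new URL("https://example.com/secret").openConnection();
            conn.setSSLSocketFactory(ctx.getSocketFactory());
            return conn;
        }
    }

In a real Android app the keystores would more likely be shipped as raw resources and opened with Resources.openRawResource() rather than read from arbitrary file paths.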

Nikolay Elenkov
  • 52,576
  • 10
  • 84
  • 84
  • Excellent analysis & answer. I agree that this is not strictly a problem with Android or Java, but Java bytecode is so much more easily decompiled back into Java than x86 machine code is into C++, for example. – ateiob Aug 19 '11 at 13:05
3

The shared secret in the crypto API is not something that you would store in the app (as you say, that would be vulnerable to reverse-engineering -- though perhaps not as vulnerable as you would expect; obfuscation is pretty easy).

Imagine instead you wanted to create/read encrypted files on your phone (for your secret grocery list).

After creating one, you save it using a master password (that is immediately discarded by the program). Then when you want to read it, you have to re-enter your master password. That's the shared secret the API refers to, and it is completely tangential to reverse-engineering.
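
A minimal sketch of that flow in Java, assuming PBKDF2 for turning the master password into a key (the salt, IV handling and iteration count are illustrative):

    import javax.crypto.Cipher;
    import javax.crypto.SecretKeyFactory;
    import javax.crypto.spec.IvParameterSpec;
    import javax.crypto.spec.PBEKeySpec;
    import javax.crypto.spec.SecretKeySpec;

    public class MasterPasswordCrypto {
        // Derives a key from the password, encrypts, and lets the caller discard the password.
        public static byte[] encrypt(char[] masterPassword, byte[] salt, byte[] iv,
                                     byte[] plaintext) throws Exception {
            SecretKeyFactory factory = SecretKeyFactory.getInstance("PBKDF2WithHmacSHA1");
            byte[] keyBytes = factory
                    .generateSecret(new PBEKeySpec(masterPassword, salt, 10000, 256))
                    .getEncoded();
            Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
            cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(keyBytes, "AES"),
                    new IvParameterSpec(iv));
            return cipher.doFinal(plaintext); // only the salt, IV and ciphertext are stored
        }
    }

Decryption is the mirror image: re-derive the key from the re-entered master password and run the cipher in DECRYPT_MODE.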

Alex Churchill
  • 4,887
  • 5
  • 30
  • 42
  • Thanks +1 for answering but manually re-entering the master password isn't very practical in certain cases. Is there a way to do this automatically? – ateiob Aug 18 '11 at 21:01
  • Sure; as long as the password isn't stored in the code it's still all good. For instance, if I want to securely store encrypted data on a server (maybe using the Dropbox API, for instance), I can enter my master password and save it as a pref, then every time I upload, it goes to disk, looks for the password (prompting me if it doesn't find it), encrypts and uploads. It does the same for download. The key is that you should store no secrets in your code (your code should be equally secure even if it were open-source). – Alex Churchill Aug 18 '11 at 21:05
  • Don't forget you can use filesystem permissions to store keys on the device, as the next best thing to pulling them from the internet or having the user enter a master password. Store it in a flat file with MODE_PRIVATE. You can't protect the user from themselves (or other people who steal their phone) that way, but at least other applications can't read it. – Prime Aug 18 '11 at 21:06
  • @alex c Storing the password in the prefs is not an option because it's not the user's password, it's the application's password. I hope this is clearer. How do I get out of this catch 22 conundrum? :) – ateiob Aug 18 '11 at 21:08
  • Ah, so do you want a way to verify the application is yours, say to communicate with a server? – Alex Churchill Aug 18 '11 at 21:11
  • 1
    Edit: based on your new post, I'd say it would be impossible to manually do crypto to do what you want. Because the strings have to be decrypted at some point, it must be reverse-engineerable (if nothing else, by inspecting memory at the correct time). Maybe I'm misunderstanding what you're asking though. I'll give it some thought. – Alex Churchill Aug 18 '11 at 21:16
  • @alex c You understood correctly. The problem I described seems to be unsolvable. – ateiob Aug 18 '11 at 21:51
  • After giving it more thought, what you're asking can't really be done without a degree of trust in the hardware (which you simply don't have at present). You will need to wait until TPMs are built into many Android devices and Android offers APIs for them (which may or may not ever happen). As @rossum suggests, you should try to remove as much of this secret logic from the app as possible (offload it to the server if you can). Sorry it's not a better solution; good luck! – Alex Churchill Aug 18 '11 at 22:15
2

The problem you are describing is somewhat similar to the problem of storing a master password for a password manager.

In that case, the solution offered is to use a salt for the password hashes.
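
For illustration, a minimal salted-hash sketch in Java (in practice a deliberately slow derivation such as PBKDF2 is preferable to a single digest; the names and sizes here are placeholders):

    import java.security.MessageDigest;
    import java.security.SecureRandom;

    public class SaltedHash {
        // A fresh random salt per password defeats precomputed (rainbow-table) attacks.
        public static byte[] newSalt() {
            byte[] salt = new byte[16];
            new SecureRandom().nextBytes(salt);
            return salt;
        }

        public static byte[] hash(byte[] salt, String password) throws Exception {
            MessageDigest md = MessageDigest.getInstance("SHA-256");
            md.update(salt); // same password + different salt => different hash
            return md.digest(password.getBytes("UTF-8"));
        }
    }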

Community
  • 1
  • 1
uTubeFan
  • 6,664
  • 12
  • 41
  • 65
1

@ateiob Any time you store the master password in the app, you are really just making it a bit harder for unauthorized users to access the encrypted data.

First we can agree that encrypting data with a "master key" embedded in an application and storing that data on the phone is open to having the "master key" reverse engineered and the data decrypted.

Second, I think we can agree that encrypting data with a secret password and then deleting the secret password should be reasonably safe, given strong encryption, 256-bit keys and strong passwords. Both techniques apply to programming on mobile devices. In fact, iOS supports both needs out of the box.

// Objective-C: store the secret in an iOS keychain item instead of in the app binary
[keychainData setObject:@"password" forKey:(id)kSecValueData];

Perhaps a real world example may help.

Say that, on low memory, a temporary data field must be persisted and protected: it can be encrypted with the master password and cleared when the user clears the temporary data field. The temporary data field is never stored as plain text.

So there are two passwords: a master password, embedded in the app for temporary, short-term encryption, and a secret password, usually entered by the user, for longer-term persisted encrypted data.

Finally, if you are encrypting files, consider adding another level of indirection, so that the current secret password encrypts a random key, which in turn encrypts all the user's files. This allows the user to change the secret password without having to decrypt and re-encrypt all the encrypted files.
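
A minimal sketch of that indirection in Java (algorithm and mode choices here are illustrative):

    import javax.crypto.Cipher;
    import javax.crypto.KeyGenerator;
    import javax.crypto.SecretKey;
    import javax.crypto.spec.IvParameterSpec;
    import javax.crypto.spec.SecretKeySpec;

    public class KeyIndirection {
        // One-time: generate the random key that actually encrypts the user's files.
        public static SecretKey newFileKey() throws Exception {
            KeyGenerator generator = KeyGenerator.getInstance("AES");
            generator.init(256);
            return generator.generateKey();
        }

        // Encrypt ("wrap") the file key under the password-derived key.
        // On a password change, only this small blob is re-encrypted, never the files.
        public static byte[] wrapFileKey(SecretKey fileKey, byte[] passwordDerivedKey,
                                         byte[] iv) throws Exception {
            Cipher cipher = Cipher.getInstance("AES/CBC/PKCS5Padding");
            cipher.init(Cipher.ENCRYPT_MODE, new SecretKeySpec(passwordDerivedKey, "AES"),
                    new IvParameterSpec(iv));
            return cipher.doFinal(fileKey.getEncoded());
        }
    }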

JAL
  • 3,319
  • 2
  • 20
  • 17
-1

The attacker is assumed to have a copy of your code. The secrecy of your data should depend entirely on the key. See Kerckhoffs's Principle.

To keep your key secret you must separate it from your code. Remember it. Keep it on a piece of paper in your wallet. Store it on a USB stick that you usually keep in a safe. Use a program like PasswordSafe. There are many possibilities.

It is of course possible to make any attacker work her way through many layers of keys to get to the key she actually needs. PasswordSafe and similar programs are one such option. You will notice that such programs do not give you an option to "remember your password" for you.

rossum
  • 15,344
  • 1
  • 24
  • 38
  • 1
    He is talking about having the key embedded in the code. Not about hiding a password. – alexanderblom Aug 18 '11 at 22:52
  • To quote the question, "I don't see much difference between the two, since both require the "secret key" being present in the (reverse-engineer-able) code." I suggested ways in which the key was **not** present in the code. – rossum Aug 18 '11 at 23:41
  • @alexanderblom understood my question correctly. I am not asking about encryption of user data. I was specifically asking about encrypting certain data used by the **code** itself. "The two" refers to storing the encrypted strings locally vs. on a remote server. – ateiob Aug 19 '11 at 00:07
  • And I was offering some third options. You are right that putting the key in code is dangerous. For example, insert the USB for a few seconds every morning to let the server derive the day's keys from the master key on the USB. That can be done with the server disconnected from the network. Then at most one day's keys can be compromised. – rossum Aug 19 '11 at 00:41