
Recently we developed and published a mobile banking app on the App Store for a large banking organization. The bank hired a security firm to perform ethical hacking on the app, to see whether it compromises confidential data in any way.

We recently received the hacking report from the firm, which, despite stating that no serious security issues are present, contains a list of all the project's class files, method names, and assembly code.

Now the client insists that we fix these security loopholes and republish the app. However, we have no idea how they managed to extract all these details from the application's IPA. I searched for this on SO and found a particular post mentioning this link, which states that you can't save your app from being hacked.

Please help me fix these security vulnerabilities, or, if that is not possible, help me convince the client.

Edit: I recently came across this page. It seems EnsureIT from Arxan can prevent app IPAs from being reverse engineered. Has anyone had experience with it?

Vin
    You should check this question: http://stackoverflow.com/questions/5556849/iphone-ipad-app-code-obfuscation-is-it-possible-worth-it – andreamazz Aug 04 '11 at 09:49
  • Thanks for your answer, andreamazz. However, the accepted answer in that post says that obfuscation in Objective-C is not possible, and the manual method described would be too tedious and time-consuming to incorporate into an already completed app. – Vin Aug 04 '11 at 10:03
  • ethical hacking? what if someone tries unethical hacking? – peko Aug 04 '11 at 11:53
  • @peko, that is what the client is afraid of. That's why they want us to fix the app so that nothing is visible to a hacker in any way. – Vin Aug 04 '11 at 12:04
  • But that's not possible, and it shouldn't be a threat: there are many projects that have their code public and are still considered secure, like TrueCrypt, sshd, and many others. – peko Aug 04 '11 at 12:18
  • Did you guys remove all the symbol information from your app? (I mean the debug symbols...) Of course there are far better things you can do to secure your app, but maybe you let that simple thing slip? – Goles Aug 04 '11 at 13:10
  • @peko and Gando, we have not left any debug symbols in. The security firm said that there were no serious security threats; the only thing they could find was the names of all class files and methods (at least, that is what is mentioned in their report). But the client is appalled, because this implies any serious hacker could break into the app and get at the business logic (web service URLs, etc.). – Vin Aug 04 '11 at 14:20
  • If knowing the URLs is a problem for your client, it sounds like the client should secure their servers first. – lilbyrdie Aug 04 '11 at 15:50
  • Another problem with URLs is that even if you could hide them in the application through some kind of obfuscation, an attacker could sniff network traffic and get them anyway. In other words: you can't hide any URL requests. – peko Aug 04 '11 at 16:01
  • What kind of solution did you end up with? Did you go with the Arxan setup? – Sander Mar 05 '12 at 11:54
  • Yes, we are using Arxan setup now. – Vin Mar 05 '12 at 14:05
  • After 3 years of iOS evolution, do you have any new knowledge to share with us? – Yi Jiang Dec 05 '14 at 02:36
  • I'm in the same situation: the bank brought in a security firm and they were able to obtain the Objective-C code. So I searched, jailbroke a device, and reverse engineered the app myself, and I also obtained the code. Now I need to figure out how to prevent reverse engineering of Objective-C code... Can you please tell me what you did in your case? – Peter Haddad Mar 19 '19 at 10:19
  • @PeterHaddad There are numerous security products available on the market now, Arxan and iXGuard to name a few. – Vin Mar 19 '19 at 10:35
  • @Vin yes, I have seen iXGuard, but the company doesn't want to pay lol – Peter Haddad Mar 19 '19 at 11:28

2 Answers


There's always a risk involved. Even if you don't introduce vulnerabilities yourself, the platform may allow for exploits which, in the end, may offer an entry point to a malicious attacker.

As to your question: it is not safe to assume that a hardcoded URL, even if obfuscated beyond belief, can't be peeled out of your product. Always design your apps such that the safety of user data is guaranteed (as far as possible) even if built-in resources get compromised. If knowledge of that URL alone poses a security threat, then your whole approach, and your client's API, is inherently insecure. Remember that such information could also be captured by a man-in-the-middle attack (among other modes of attack).
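To illustrate why obfuscating a URL buys you so little: a minimal sketch (the URL, key, and scheme are made up; real obfuscators are fancier, but the principle is the same):

```swift
import Foundation

// Hypothetical example: a URL "hidden" in the binary by XOR-ing each byte
// with a fixed key, which is roughly what naive string obfuscation does.
let key: UInt8 = 0x5A
let obfuscated: [UInt8] = "https://api.example-bank.com/v1".utf8.map { $0 ^ key }

// An attacker who extracts both the byte array and the key from the binary
// (e.g. with a disassembler) recovers the plaintext in one line:
let recovered = String(bytes: obfuscated.map { $0 ^ key }, encoding: .utf8)!
print(recovered)  // prints "https://api.example-bank.com/v1"
```

And even a perfectly hidden URL still appears in network traffic, so the API must be secure regardless of who knows its endpoints.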

Avoid security by obscurity. Store sensitive data on disk only if it is necessary. As a rule, don't allow PIN / TAN storage.

Some thoughts which may (or may not) convince your client that your app is as safe as it can be:

  • As long as the app runs on a non-jailbroken device, it is unlikely that an attacker, even with knowledge of your app's internals, will be able to get at any user data, because the iPhone normally doesn't offer opportunities to interfere with your app.
  • If the attacker is able to get at your users' data, and provided you have been protecting that data with all means available under iOS (-> keychain -> crypto chip -> ...), then it's not your fault. It means the device is either jailbroken or there are vulnerabilities in the system itself that have been exploited; you can't do anything about either possibility.
  • It is impossible to prevent reverse engineering of your app. Even if you had put more effort into obfuscation, a strongly motivated attacker would still be able to get what he wants. Your client needs to accept this, as it's a fact.
  • Other platforms suffer from similar vulnerabilities, yet on the iPhone at least you have a somewhat closed environment and a reduced risk of being attacked with trojans and the like.
  • Governments and security firms get hacked on a regular basis, although they should know how to protect themselves. This means life is inherently insecure; cope with it.
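One small hurdle sometimes layered on top of the points above is a jailbreak heuristic. A minimal sketch (the file paths are commonly cited examples and are trivial for a determined attacker to bypass, so treat this as a speed bump, never a guarantee):

```swift
import Foundation

// Heuristic jailbreak check: look for files that only exist on jailbroken
// devices. This is easily defeated (e.g. by hooking FileManager), so it
// only raises the bar slightly; it does not make reverse engineering
// or tampering impossible.
func looksJailbroken() -> Bool {
    let suspectPaths = [
        "/Applications/Cydia.app",
        "/Library/MobileSubstrate/MobileSubstrate.dylib"
    ]
    return suspectPaths.contains { FileManager.default.fileExists(atPath: $0) }
}

// On a stock device this returns false; the app might then, for example,
// refuse to cache credentials when it returns true.
print(looksJailbroken())
```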
Toastor
  • Your point is kind of like, "hey, even governments get hacked, so why shouldn't I?" Seriously, these are no arguments. You can't assume that the device is not jailbroken, etc. But as you said, there is NO security through obscurity. – peko Aug 05 '11 at 10:20
  • I made that point only to put things into perspective. It's not meant as an excuse for not caring about security at all. Rather, it should explain that, even with the most sophisticated security measures, you can never be 100% sure your product is safe. – Toastor Aug 05 '11 at 10:25
  • 5
    Of course I cannot assume the device is not jailbroken. But I can assume that the user with that jailbroken device knew what he or she was doing when he was jailbreaking and that he o she might compromise the security of his or her device. Thus, if due to the jailbreak the user looses critical, personal data to an attacker, that's within his or her own responsibility. – Toastor Aug 05 '11 at 10:30

I was recently researching this and found this article helpful, especially the part quoted below:

The code for a native app is stored in the form of a binary executable file, which is further encrypted; its decryption is performed only when the executable file is loaded by the processor into the random access memory and the whole decryption process happens at the hardware level. That is why it is very hard to create offline decryption tools. The only way to decrypt encrypted binary data is on a jailbroken device with a few special tools installed.

Security in iOS: Protecting .ipa File Content by Stoyan Stoyanov

Asfand Shabbir