Vulnerability Epidemic in Financial Mobile Apps - Episode 3 [Video]
Misconceptions surrounding "app security"
So another aspect of this is, I think there's a pretty loudly repeated misconception that the device itself provides security. There's a sandbox around the app. There's a vetting process in the app store, which does scanning. Why are things like this still occurring when-- isn't the device supposed to protect us? Aren't the companies that are providing these apps supposed to protect us?
Yeah, like is iOS more secure than Android?
I wasn't going directly there, but sure, as a follow up that's a great question.
I can tell you that in my research I didn't do any testing on the iOS side, just for the sake of brevity and the time allocated to this research. But certainly, one vulnerability on one platform will of course cross-pollinate to vulnerabilities on the [INAUDIBLE]. It's all the same plumbing, right?
I'm trying to be vague on purpose. But I can tell you that one of the FI sectors I tested was stock trading. So we tested three of the major trading platforms, and one of the things that I was very surprised to find was that there was no sandboxing being done on them. When you look at robo-trading, or just stock trading in general, security matters-- imagine the theft of stock trading algorithms from a broker. That would be devastating, right?
So one of the things that I really want to do is focus a lot of my research in this area because I do stock trading in mobile apps on my device-- which I'm not going to be doing anymore until these get fixed, or refer them to Arxan.
But one of the things I found was they were writing directly to the file system. There was no sandboxing being done. There was this sort of risk transference to the device: hey, you know, we did our part, let's rely on the security of the mobile equipment. If you look at the developers' response, it's almost as if they're transferring that risk to the device. And I was like, oh, if this gets compromised, well, the problem is 4G and 3G and the lack of, you know...
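The kind of finding described here-- data written outside the app's private storage-- is something you can hunt for statically in a decompiled APK. Below is a minimal sketch of that idea in Python; the pattern list, function names, and file layout are illustrative assumptions, not the actual tooling or criteria from the report.

```python
import re
from pathlib import Path

# Call sites that suggest data is written outside the app's private sandbox.
# Illustrative list only -- not the report's actual detection criteria.
RISKY_STORAGE_PATTERNS = [
    r"getExternalStorageDirectory",  # shared external storage, readable by other apps
    r"getExternalFilesDir",          # app-specific external dir; exposed on older Android
    r"MODE_WORLD_READABLE",          # deprecated world-readable file mode
    r"MODE_WORLD_WRITEABLE",         # deprecated world-writable file mode
]

def find_risky_storage(source_text: str) -> list[str]:
    """Return the risky-storage patterns that appear in one file's source text."""
    return [p for p in RISKY_STORAGE_PATTERNS if re.search(p, source_text)]

def scan_tree(root: str) -> dict[str, list[str]]:
    """Scan decompiled .java/.smali files under root for unsandboxed writes."""
    hits: dict[str, list[str]] = {}
    for path in Path(root).rglob("*"):
        if path.suffix in {".java", ".smali"}:
            found = find_risky_storage(path.read_text(errors="ignore"))
            if found:
                hits[str(path)] = found
    return hits
```

A hit on something like `getExternalStorageDirectory` is only a lead, not proof of a vulnerability; the point is that "relying on the device" shows up as concrete, greppable API choices in the code.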
I can tell you-- prior to this I was doing connected car penetration testing in Germany, and the first step was setting up a rogue base station, a rogue cell tower, using a bladeRF. And every cell phone in the area automatically associated to me without notifying or warning the user, because they're designed not to. So it's really easy to fire up a rogue BTS, and all of a sudden, OK, well, you're transferring that risk to the cellular network. It is possible to sniff cellular. It is possible to sniff what you're talking about. But like you said, to your point, most people's devices automatically associate to wireless access points, and that can be abused.
One of the interesting things that I think we've found in some of the research that we do internally is that you have an entity which has an iOS app and an Android app, and a lot of times there's this tendency to trust Android less. The problem is that maybe you protect some things on Android and other things on iOS. A lot of times we engage in a sort of differential reverse engineering, where we get some of the answers from the iOS app and some of the answers from the Android app. Because by and large, all of these applications are talking to the same API. They've got the same underlying credentialing system, and so the ability to compare and contrast can actually be another vector for attack...
Yeah, so you know, a lot of that's being propelled by the media. We just saw that news report where, what, 500 apps were pulled from the Google Play Store because they were rogue apps. There's definitely this component where that perception is very much being promulgated by the media, where it's really being seen as the Wild West, where you can easily publish a rogue app, or repackage an app and upload it.
Were any of the techniques that you used during the research of this report Android-only? Or, given the amount of time and the tools that you used, could you have done the exact same thing with the iOS apps?
Sure. Yeah, no, they weren't Android-specific. The methodology that I was using is just analyzing the source code, analyzing-- Like you look at private keys being hard-coded into the source code. That is stuff that you can see on any platform. At the end of the day, you're just repackaging it for a specific platform to execute, but it's still the same plumbing. It's still the same guts. And so insecure coding on one platform is going to be insecure coding on the next platform.