Apr 03, 2019

Vulnerability Epidemic in Financial Mobile Apps - Episode 1 [Video]

Summary of research and findings


Well, we're here today to talk about the state of application security. And our guest, Alissa Knight, is here with some interesting research to share.

I'm Alissa Knight. I'm a security analyst with Aite Group, and I'm a recovering hacker with 19 years in the industry.

Great. Aaron?

I'm Aaron Lint. I'm Arxan's VP of research and chief scientist.

And I'm Rusty Carter. I'm the vice president of product management for Arxan. So Alissa, can you tell us a little bit about your research and how you went about it?

Sure. So I started this research project into application security, specifically around application shielding and how systemically companies within the financial services industry were failing to shield their apps. I worked with my research director to identify three apps in each financial services sector we were going to cover, which included wealth management, retail banking, and basically all of the individual verticals within that horizontal sector.

And the results were very shocking. I knew that they would be bad, but I didn't know that they would be that bad. Out of all of the apps, only one company had implemented obfuscation of the source code and application shielding, and that was not a US company. So the problem is very systemic across the financial services industry.

And the methodology I used was, after downloading the application to the Android device--I used an Android tablet running Android 7--and using a tool to extract it off the device, I loaded it onto my local system for what's called static code analysis. I used tools like Apktool to unpack the APK file, and then loaded it into a decompiler to decompile it.
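
The extraction step she describes can be scripted. Below is a minimal sketch, assuming adb and Apktool are installed and on the PATH, a test device is attached with USB debugging enabled, and a hypothetical package name; apps shipped as split APKs would need the additional split paths pulled as well.

```python
#!/usr/bin/env python3
"""Minimal sketch: pull an installed APK off an attached Android device with
adb and decode it with Apktool for static review. Assumes adb and apktool
are on the PATH; the package name is a hypothetical placeholder."""
import subprocess

PACKAGE = "com.example.bankapp"  # hypothetical target package

# Ask the device's package manager where the APK lives.
out = subprocess.run(
    ["adb", "shell", "pm", "path", PACKAGE],
    capture_output=True, text=True, check=True,
).stdout
apk_path = out.splitlines()[0].split(":", 1)[1]  # "package:/data/app/.../base.apk"

# Copy the APK to the analysis machine.
subprocess.run(["adb", "pull", apk_path, "target.apk"], check=True)

# Decode resources and disassemble the bytecode to smali for inspection.
subprocess.run(["apktool", "d", "target.apk", "-o", "target_decoded", "-f"], check=True)
print("Decoded app written to ./target_decoded")
```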

And these are all the applications that are just right off of the Google Play Store?

Correct. So I didn't go to any third-party app stores. Everything was the official app distribution from the financial institution--or the FI, as I'll call them--within the Google Play Store.

So these are the things that we've been reading about that are supposed to be secure and scanned and protected.

Right. You would think especially within the financial services industry that these would be highly protected, that you wouldn't even be able to decompile it, that it would just be garbage. But it was shocking.

The results I found when performing this analysis were private keys being stored inside the app, some of them in subdirectories with no encryption and no passphrase protection. I could just load them into Keychain on the Mac without being prompted for a password. And we're not talking only about car insurance apps--some of these were wealth management companies, where security should be a top priority.
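
A first pass for the embedded-key finding she mentions can be as simple as scanning the decoded app for PEM headers. This is a sketch only, assuming the target_decoded directory produced above; a real review would also look at keystore files and binary assets.

```python
#!/usr/bin/env python3
"""Sketch: walk an Apktool output directory and flag files that appear to
contain embedded private key material. The directory name is a placeholder
carried over from the decoding step."""
import os

MARKERS = (b"BEGIN RSA PRIVATE KEY", b"BEGIN PRIVATE KEY", b"BEGIN EC PRIVATE KEY")

for root, _dirs, files in os.walk("target_decoded"):
    for name in files:
        path = os.path.join(root, name)
        try:
            with open(path, "rb") as fh:
                data = fh.read()
        except OSError:
            continue
        if any(marker in data for marker in MARKERS):
            print(f"Possible embedded private key: {path}")
```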

Some of the other findings included what looked to be debug logging: they were logging absolutely everything, including user input, to the app's log files. The only way I can describe it is that it looked like debug output left enabled, where absolutely everything was going to the log files.
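
To see whether user input actually lands in the device log, one can watch the app's log stream while exercising it. A sketch, assuming an attached device running Android 7 or later (where logcat accepts --pid), the app already running, and the same hypothetical package name.

```python
#!/usr/bin/env python3
"""Sketch: stream the device log for a single running app to check whether
user input ends up in log output. The package name is a hypothetical
placeholder; `logcat --pid` requires Android 7+."""
import subprocess

PACKAGE = "com.example.bankapp"  # hypothetical target package

# Resolve the app's process id on the device.
pid = subprocess.run(
    ["adb", "shell", "pidof", "-s", PACKAGE],
    capture_output=True, text=True,
).stdout.strip()
if not pid:
    raise SystemExit("App is not running on the attached device")

# Stream that process's log lines; exercise the app and watch for form input,
# credentials, or other user-supplied data appearing verbatim.
subprocess.run(["adb", "logcat", "--pid", pid])
```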

If you took your skills and applied them to this research, using common tools that are freely available, were there any really acute issues you found that someone without your level of skill could discover and that would put these businesses at risk?

I would say that the most critical findings were in the P2P payment money transfer apps. Each app was categorized, and I tried to anonymize the data as much as possible, especially in the report that will be published--I tried to use percentages. I can tell you that the highest number of critical findings--where you could potentially have account takeover, harvest the user's account credentials, or even intercept and replay the traffic--were in the P2P money transfer apps.

The second worst category of findings were definitely in the retail banking apps.

And then everything else followed from there. We did include healthcare providers--three major health insurance companies were selected--and those findings were just about as devastating as the money transfer apps.

With the healthcare apps, the findings were different, but it's almost as if the developers who wrote the code didn't realize you could actually access the Android file system to see what they were storing in clear-text files on the device. It's almost a security-through-obscurity approach. And this is healthcare information--PHI that in any other circumstance, on a file server or a shared folder, would have been encrypted at rest. Here it was just being written to log files or text files on the Android OS.
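
One way to confirm the clear-text storage she describes is to look directly at the app's private data directory. The sketch below assumes a debuggable build (which is what run-as requires); on a rooted test device you would use su instead. The package name and file path are purely illustrative.

```python
#!/usr/bin/env python3
"""Sketch: list and fetch files an app keeps in its private data directory
to see what is written to disk in the clear. `run-as` works only against
debuggable builds; the package name and file path are illustrative."""
import subprocess

PACKAGE = "com.example.healthapp"  # hypothetical target package

# Enumerate the app's private files.
listing = subprocess.run(
    ["adb", "shell", "run-as", PACKAGE, "ls", "-R", f"/data/data/{PACKAGE}"],
    capture_output=True, text=True,
).stdout
print(listing)

# Dump one file for inspection (illustrative path, not a real finding).
subprocess.run(["adb", "shell", "run-as", PACKAGE, "cat",
                f"/data/data/{PACKAGE}/files/cache.txt"])
```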

I averaged roughly 8 and 1/2 minutes per app to get to what I would call a point of staging, where the app was unpacked and ready for me to actually analyze and look at the code. Then I spent the rest of the week, a period of five days, just looking at source code and log files. My dynamic analysis was limited to network interdiction, where I set up what's called Burp Suite to intercept the traffic going out, and I was just as alarmed by the findings on the network side.
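
The Burp Suite setup she refers to usually means pointing the device's HTTP proxy at the Burp listener. A sketch, assuming the listener address below; intercepting HTTPS additionally requires installing Burp's CA certificate on the device, and apps that pin certificates may still refuse the proxied connection.

```python
#!/usr/bin/env python3
"""Sketch: point an attached Android device's global HTTP proxy at a local
Burp Suite listener so outbound app traffic can be intercepted. The proxy
address is an assumption; pass `clear` as the first argument to undo it."""
import subprocess
import sys

PROXY = "192.168.1.50:8080"  # assumed address of the Burp listener

if len(sys.argv) > 1 and sys.argv[1] == "clear":
    # ":0" removes the global proxy setting.
    subprocess.run(["adb", "shell", "settings", "put", "global", "http_proxy", ":0"],
                   check=True)
    print("Device proxy cleared")
else:
    subprocess.run(["adb", "shell", "settings", "put", "global", "http_proxy", PROXY],
                   check=True)
    print(f"Device traffic now routed through {PROXY}")
```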

But aren't all the apps using HTTPS or SSL to connect?

You would think so, especially for the financial services apps. In my findings, there were quite a few URLs and quite a bit of traffic being passed over plain HTTP. Some of the other findings included QA and dev URLs in the code. I actually connected to those APIs, and they were live and would respond to requests. A simple flip of a switch in the code would basically redirect that traffic to a QA or dev server.
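
Findings like plain-HTTP endpoints and leftover QA or dev URLs tend to fall out of a simple string scan over the decoded app. A sketch, assuming the target_decoded directory from earlier and a hand-picked list of suspicious tokens; everything it flags still needs manual confirmation.

```python
#!/usr/bin/env python3
"""Sketch: grep an Apktool output tree for plain-HTTP URLs and for endpoints
that look like QA, dev, or staging environments. The directory name and the
keyword list are assumptions; output is only a list of candidates."""
import os
import re

URL_RE = re.compile(rb"https?://[\w.\-:/]+")
SUSPECT = (b"http://", b"qa.", b"dev.", b"staging.", b"-qa", b"-dev")

for root, _dirs, files in os.walk("target_decoded"):
    for name in files:
        path = os.path.join(root, name)
        try:
            with open(path, "rb") as fh:
                data = fh.read()
        except OSError:
            continue
        for url in URL_RE.findall(data):
            if any(token in url.lower() for token in SUSPECT):
                print(f"{path}: {url.decode(errors='replace')}")
```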

Did you see similar findings with certificate pinning?

Yes. The PKI side of the findings was very surprising as well. Like I said, something as simple as failing to protect the private keys on the file system, or hard-coding them in the app, showed up again and again. It wasn't just one sector--it seemed to be systemic across all of the individual verticals.
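
A quick way to gauge whether an app pins certificates at all is to look for pinning indicators in the decoded output, such as a <pin-set> entry in the Network Security Config or references to OkHttp's CertificatePinner. A sketch only; the absence of these markers is a hint, not proof, that pinning is missing, and the directory name is the same placeholder as above.

```python
#!/usr/bin/env python3
"""Sketch: check a decoded app for common certificate-pinning indicators.
Looks for Network Security Config pin sets, OkHttp's CertificatePinner,
and custom TrustManager references in resources and smali."""
import os

INDICATORS = (b"<pin-set", b"CertificatePinner", b"TrustManager")

hits = []
for root, _dirs, files in os.walk("target_decoded"):
    for name in files:
        path = os.path.join(root, name)
        try:
            with open(path, "rb") as fh:
                data = fh.read()
        except OSError:
            continue
        for marker in INDICATORS:
            if marker in data:
                hits.append((path, marker.decode()))

if hits:
    for path, marker in hits:
        print(f"{marker} found in {path}")
else:
    print("No pinning indicators found in the decoded app")
```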

Rusty Carter

Rusty Carter is a security software executive with over 20 years of experience, and the current Vice President of Product Management at Arxan Technologies, an application security company that provides application shielding and protection against reverse engineering and tampering to the world's largest companies. Prior to Arxan, Mr. Carter led product management at Symantec, McAfee, and Pulse Secure (formerly Juniper), and was responsible for the introduction and growth of multiple new products and lines of business. Mr. Carter holds international patents in the fields of information security, AI / machine learning, telecommunications, mobile devices, and user interaction. His background includes system and software architecture and engineering, and he holds a bachelor's degree in Psychology from the University of Arizona.

The Vulnerability Epidemic in Financial Mobile Apps

Join the webinar with Aaron Lint, Arxan chief scientist, and Alissa Knight, Aite Group senior cybersecurity analyst, on Tuesday, April 23, 2019.