Mobile Vulnerabilities Exposed: Change What's Possible in Mobile App Security Testing

If you’re connected to security research in any way, you’re aware of the many challenges that come with mobile app security testing. Between the shortcomings of device simulators and emulators and the tedium of managing physical devices, mobile app security testing can consume extensive time, energy, and resources. In addition, to fully test apps, security researchers and pen testers need to perform dynamic testing and reverse engineering alongside static testing, but physical devices and emulators don’t provide the data access that requires.

The Corellium team recently conducted the Change What’s Possible Webinar Series, during which our Chief Evangelist Brian Robison and Corellium Researcher Steven Smiley discussed the many challenges and types of mobile app security testing, as well as the value of mobile AppSec with Arm virtualization.

Let’s explore the key topics Brian and Steven discussed and dive into ways to navigate the obstacles of mobile AppSec testing. 

The Challenges of Mobile App Security Testing 


One of the primary obstacles to mobile AppSec testing is the time and resources it takes to source specific devices, running specific operating systems, that can also be jailbroken. Obtaining new devices is not difficult, but security researchers often need older devices to test a variety of operating systems. Older mobile devices are available on the used and refurbished market, but those retailers and their products are unreliable, and researchers can never be completely sure they’ll get what they need. There are also physical device labs where researchers can gain access to certain devices, but these are limited in both scale and scope: they are either not conveniently located or lack the devices needed for testing.

Security research also often requires a physical connection to the device. Researchers may be able to set up that connection themselves; if not, they have to request access from a third party. Third-party access can be sufficient for QA testing, but it falls short for mobile app security testing and pen testing.

Operating Systems 

Mobile app security researchers need access to both Android and iOS operating systems for testing. Many services offer emulated Android devices, and others offer access to physical iOS devices, but very few offer both on the same platform using the same APIs. In addition, what they can offer is typically only a few operating system versions, rather than the entire suite that needs to be tested.

Comprehensive mobile app pen testing also requires rooted or jailbroken devices. Without full access to the file system, it’s difficult — if not impossible — for researchers to perform “data at rest” dynamic testing.

Time and Focus

Managing all of these devices and services takes up a lot of time, energy, and resources for mobile AppSec researchers. Managing and maintaining hardware requires:

  • Keeping the devices live

  • Keeping the devices updated (or choosing which devices to update and not update)

  • Shipping physical devices to team members around the world

Security research is an endless cycle of try, fail, and reset/restore. Physical devices require rebooting and re-jailbreaking time and again, which only adds time to an already tedious process. Spending so much time and energy managing devices pulls focus away from the task mobile AppSec researchers are actually meant to be doing: testing the security of the app.

Device Virtualization

To perform the most thorough mobile AppSec testing, researchers need to be able to test both data at rest and data in transit, as well as perform reverse engineering. However, doing so effectively is nearly impossible with the tools many researchers currently use. Arm virtualization is changing what’s possible for mobile app security testing. It empowers mobile AppSec researchers to perform all relevant tests on all forms of data, with virtual access to every device and every operating system update.

Device virtualization offers nearly instant access to any iPhone, from the iPhone 6 through the latest iPhone 14 Pro Max, with any iOS version that shipped on those devices, plus instant and permanent jailbreaks that don’t rely on a vulnerability to exploit. Corellium virtualization delivers the time savings we’ve all enjoyed for the past 20 years with x86 virtualization: VM (phone) creation, boot, shutdown, and, most importantly, virtual phone snapshots and the cloning of snapshots to other users. Restoring a device from a snapshot takes seconds, versus the significant time needed to reflash and rebuild a physical device.

“We’re not talking about emulation. Emulation approaches are lacking and flawed. Corellium’s virtualization approach is truly running Android and iOS directly on Arm code and hardware. We’re talking about literally building a model of a phone and putting it on virtualized hardware so we can gain access to the root file system without having a public vulnerability.”
— Brian Robison, Chief Evangelist, Corellium

Testing: Data at Rest

Data at rest is data in a state of immobility, simply remaining stored in one static location. When conducting mobile app security testing on data at rest, the most critical place to start is understanding where and how data is stored and what vulnerabilities might exist in those places. But the storage locations and exploitability of mobile application data vary from one device and one operating system to another. Let’s discuss where and how to identify gaps in mobile app security in both iOS and Android storage locations.

iOS Devices 


iOS Keychain Storage

iOS Keychain Storage can be accessed with the iOS Keychain Dumper or through the Objection security framework. The information stored in the Keychain most commonly includes usernames, passwords, and other credentials. A breach of this data can be significant, as it can give attackers access to any data behind a password that does not require two-factor authentication. Keychain data can be secured by configuring data protection with the kSecAttrAccessible attribute or by encrypting the Keychain data before it is stored.


NSUserDefaults

NSUserDefaults is an iOS system that allows an application to customize its behavior to match a user’s preferences. These preferences can include settings like themes, units of measurement, application preferences, and so on. If not set up securely, sensitive data can end up stored within these preferences. NSUserDefaults data can be accessed through the Objection security framework or via the preference file within the local data directory.

iOS Application Databases

iOS databases are stored within the local data directory, which is accessible only via the application unless the device is jailbroken. Any application database information stored in plain text is vulnerable to anyone with access to the device.

Corellium’s Arm virtualization provides root file system access on every iOS version, so researchers can review an app’s databases by following these steps:

  • Identify the path to the application data directory

  • Search the local directory for database content

  • Review the databases for sensitive information
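As a rough sketch of the last two steps, the script below walks every table in a SQLite database (assumed to have already been pulled from the device’s application data directory) and flags text values that look like credentials, email addresses, or card numbers. The patterns are illustrative assumptions, not a complete detector:

```python
import re
import sqlite3

# Illustrative "sensitive-looking" patterns: card-length digit runs,
# email addresses, and credential keywords. These are assumptions.
SENSITIVE = re.compile(r"\b\d{13,19}\b|[\w.+-]+@[\w-]+\.\w+|(?i:password|token|secret)")

def scan_database(path):
    """Return (table, column, value) triples whose stored text looks sensitive."""
    findings = []
    con = sqlite3.connect(path)
    tables = [r[0] for r in con.execute(
        "SELECT name FROM sqlite_master WHERE type='table'")]
    for table in tables:
        # Column names come from PRAGMA table_info so findings can be labeled.
        cols = [c[1] for c in con.execute(f'PRAGMA table_info("{table}")')]
        for row in con.execute(f'SELECT * FROM "{table}"'):
            for col, val in zip(cols, row):
                if isinstance(val, str) and SENSITIVE.search(val):
                    findings.append((table, col, val))
    con.close()
    return findings
```

Running this over each .db or .sqlite file found in the data directory gives a quick first pass before manual review.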

PLIST Files (Property Files)

A PLIST file is a settings file for iOS that typically contains critical information about the application’s configuration. Used incorrectly, these files can expose sensitive data, including API keys, usernames, passwords, and more. They can be found, and accessed, in two different locations: within the IPA structure (by unzipping the IPA file) or within the local data directory.
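For illustration, once a PLIST file has been extracted, a short script can flag keys whose names suggest stored credentials. The key-name list below is an assumption for the sketch, and audit_plist is a hypothetical helper, not an iOS or Corellium API:

```python
import plistlib

# Key-name fragments assumed to indicate credentials; purely illustrative.
SUSPECT_KEYS = ("password", "apikey", "api_key", "secret", "token", "username")

def audit_plist(path):
    """Return {dotted.key.path: value} for plist keys with suspicious names."""
    with open(path, "rb") as f:
        data = plistlib.load(f)
    flagged = {}

    def walk(node, prefix=""):
        # Recurse through nested dictionaries, building dotted key paths.
        if isinstance(node, dict):
            for key, value in node.items():
                full = prefix + key
                if any(s in key.lower() for s in SUSPECT_KEYS):
                    flagged[full] = value
                walk(value, full + ".")

    walk(data)
    return flagged
```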

Android Devices

Shared Preferences

Shared Preferences are XML files that store private primitive data in key-value pairs. It is generally fine to store nonsensitive preference or configuration data in Shared Preferences. However, sensitive information like location data, credentials, or credit card data should never be stored in this location — but often is.

Shared Preferences can be world-readable, meaning any information stored within them is accessible from any application and readable by anyone with access to the device. When used incorrectly, sensitive data ends up in these XML files instead of the generic preference data they are meant to hold. To test an app’s Shared Preferences, look in the app’s storage directory; if you don’t know the app’s package name, you can find it on the “Apps” tab in Corellium.
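As a sketch, once a shared_prefs XML file has been pulled from /data/data/&lt;package&gt;/shared_prefs/ on a rooted device, flagging suspicious entries is straightforward. The name fragments treated as sensitive below are illustrative assumptions:

```python
import xml.etree.ElementTree as ET

# Name fragments assumed to indicate sensitive entries; purely illustrative.
SUSPECT = ("password", "token", "secret", "card", "credential")

def audit_shared_prefs(xml_text):
    """Return {name: value} for <string> entries whose names look sensitive."""
    root = ET.fromstring(xml_text)
    return {
        el.get("name"): el.text
        for el in root.iter("string")
        if el.get("name") and any(s in el.get("name").lower() for s in SUSPECT)
    }
```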

External Storage

Every Android device contains shared external storage. This could be an SD card or even added internal device storage. Similar to Shared Preferences, all files within external storage are world-readable. Unlike the Android data directory, however, these files do not get deleted when the app is deleted. As a result, it’s easy for users to think the files are no longer accessible after deleting the app. For app security testing purposes, external storage can be accessed through the SD card or through the virtualized additional internal storage.

Android Application Databases

Unencrypted application databases are stored within the local data directory and are easily exploitable. Encrypted databases are certainly better protected, but they can still be vulnerable depending on how the encryption is implemented. For instance, if the password or key is easily found, the database encryption is irrelevant; if it is stored in plain text, it will eventually be exploited by attackers.

For both iOS and Android mobile applications, many engineers and developers will rely on device security alone to ensure the protection of user data within the app. As app pen testers, however, we understand that device security is not a surefire method for data protection. This is why it’s critical for mobile AppSec researchers to have the tools necessary to thoroughly test apps on every device and operating system.  

Testing: Data in Transit

The goal of mobile app security testing for data in transit is to protect data being transmitted between the mobile client and the backend server. Just as data at rest has its own unique set of risks, so does data in transit: data being moved from one location to another is particularly vulnerable and is a common target for attackers. Let’s talk about the primary vulnerabilities for data in transit, as well as best practices for protecting data in this state.

Mobile Data Leakage

Mobile data leakage occurs when internal or sensitive data is made accessible to users who are not authorized to see it. This can lead to greater harms such as identity theft, reputational damage, and more sophisticated attacks. For example, mobile apps commonly fail to mask credit card numbers correctly, which lets unauthorized users sell that data to attackers who can use it to access financial accounts and steal identities. Likewise, if a user submits a malformed request and the server responds with framework data or backend system information, attackers can use that information to craft attacks that bypass the infrastructure. To discover how your application is protecting data:

  1. Review the traffic being sent from the application

  2. Look through the traffic being sent to third parties

  3. Check for data encryption 

  4. Use session tokens

  5. Inspect URL query strings and eliminate personal data

  6. Analyze security controls implemented to protect user data
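Two of the steps above (reviewing traffic for unmasked card numbers, and inspecting URL query strings for personal data) can be sketched with simple pattern checks. The card regex and parameter names are illustrative assumptions, not a complete PII detector:

```python
import re
from urllib.parse import parse_qs, urlparse

# Assumed patterns: a "card number" is any 13-16 digit run, and these
# parameter names are treated as personal data. Both are illustrative.
CARD = re.compile(r"\b\d{13,16}\b")
PII_PARAMS = {"email", "ssn", "phone", "card"}

def find_unmasked_cards(body):
    """Return digit runs that look like full card numbers; a properly
    masked value such as '************1111' does not match."""
    return CARD.findall(body)

def pii_in_query(url):
    """Return query parameters whose names suggest personal data."""
    query = parse_qs(urlparse(url).query)
    return {k: v for k, v in query.items() if k.lower() in PII_PARAMS}
```

Running checks like these over captured request and response bodies gives a fast signal for which endpoints deserve a closer manual review.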

Data in Transit Best Practices

When it comes to protecting data in transit within mobile applications, there is a set of baseline measures that must be in place, starting with the choice of transport protocol (HTTP versus HTTPS). But many mobile AppSec researchers believe these protocols alone are not enough to protect data on the move, and they turn to additional methods to ensure its security.

Network Protocol Usage

HTTP is the foundation of all data exchange on the web, and many mobile applications still use plaintext HTTP, which is vulnerable to interception. HTTPS, however, uses TLS to encrypt exchanged information and prevent unintended exposure. Many browsers warn users before they enter a website served over HTTP instead of HTTPS, alerting them to the potential vulnerability. Apps should follow the same precautions, favoring the security of HTTPS for the transmission of any sensitive or private data.
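A tiny check in that spirit: given a list of endpoint URLs observed in an app’s traffic, flag any that still use plaintext HTTP. The URLs in the test are made up:

```python
def insecure_endpoints(urls):
    """Return the URLs that use plaintext HTTP instead of HTTPS."""
    return [u for u in urls if u.startswith("http://")]
```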

Alternate Data in Transit Security Methods 

Is HTTPS enough protection? There is no complete answer — it really depends on the application, the data, and the level of risk. If there are concerns about the security with HTTPS alone, there are other options to protect data in transit.

  • Certificate Validation: This validates the contents of the certificate being used when making a network connection. Without proper validation, the app can be vulnerable to man-in-the-middle (MITM) attacks.

  • Certificate Pinning: This is the process of associating a host with its expected certificate or public key. The certificate is “pinned,” allowing only trusted certificates to be accepted; all other connections are dropped.

  • Certificate Transparency: When a Certificate Authority issues a certificate, a signed certificate timestamp is added to it, and the certificate is uploaded to a public, distributed network of log servers.

  • App Transport Security: Only available for iOS, ATS ensures all HTTP connections are forced to use HTTPS unless an exception is made in the info.plist.

  • Android Network Security Configuration: Similar to ATS on iOS, the Android Network Security Configuration is an XML file developers use to customize network security settings for Android, including disallowing cleartext traffic, defining trust anchors that determine whether an application accepts system certificates, and certificate pinning.
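The comparison at the heart of certificate pinning can be sketched in a few lines: hash the certificate the server presents and compare it against the pins shipped in the app. Real implementations usually pin the SubjectPublicKeyInfo rather than the whole certificate; this simplified sketch hashes raw DER bytes:

```python
import base64
import hashlib

def pin_for(der_bytes):
    """Return the base64-encoded SHA-256 pin for certificate bytes."""
    return base64.b64encode(hashlib.sha256(der_bytes).digest()).decode()

def connection_allowed(presented_der, pinned_set):
    """Accept the connection only if the presented certificate matches a pin."""
    return pin_for(presented_der) in pinned_set
```

In a real client, pinned_set would be baked into the app at build time, which is exactly why pen testers need a way to bypass pinning when inspecting traffic.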

Mobile App Pen Testing

If the above best-practice security controls are in place, how can mobile app security researchers perform effective pen testing? Using the Corellium Network Monitor, researchers have access to communication in which SSL/TLS is stripped and certificate pinning is already bypassed, allowing them to quickly review traffic and conduct efficient pen testing for mobile apps.

Testing: Reverse Engineering

Reverse engineering is the process of disassembling an app to reveal its code, internal logic, components, and more. Researchers do this to gain a better understanding of the application and identify paths to exploitation to be addressed. For mobile app security, this involves deconstructing, analyzing, and observing compiled apps to understand underlying functions. Reverse engineering helps mobile AppSec researchers to:

  • Find hardcoded values stored within the application

  • Identify paths to further exploitation of the app

  • Gain insight into the application to build additional scripts


There are two different kinds of reverse engineering. The first is decompilation, which is primarily used for Android apps. A decompiler translates a binary’s low-level code into human-readable high-level code. Depending on the obfuscation in the app, nearly all of the source code can be recovered using a decompiler.


The second type of reverse engineering is disassembly, used for iOS apps. Machine language code is converted into human-readable assembly code. Assembly code requires much more “best guess” work on the part of mobile app security researchers, making iOS app reverse engineering more challenging than decompilation with Android apps.


There are several popular tools for reverse engineering in mobile app security testing, such as Ghidra, IDA Pro, otool, and Class-dump. Each tool has its strengths, and researchers may choose to utilize different tools depending on the data, device, and reverse engineering tasks they are trying to accomplish. Leveraging Corellium with these and other reverse engineering tools can streamline the process by integrating directly with a virtualized device in its native architecture.

Mobile AppSec Testing with Corellium’s Arm Virtualization

At Corellium, we recognize the obstacles that mobile app security researchers face when it comes to managing physical devices and using insufficient device emulation. So we changed what’s possible for researchers to achieve with our Arm-native model that facilitates malware research, app pen testing, and OS vulnerability research.

Virtual Devices   

No more requesting questionable third-party virtual access to devices or spending hours managing and jailbreaking physical devices. Our device virtualization allows you to access Arm-powered devices with any OS/model combination to be sure you can test every possible scenario. 

Arm Server

Virtualization on our Arm-based platform is much more than emulation. These are real devices, available in a streamlined virtual interface, that change how you can test devices on their native architecture.

Powerful Console

“It’s a single platform that lets you run iOS and Android devices as virtual machines and helps you take advantage of the time savings.”
— Brian Robison, Chief Evangelist, Corellium

  • Root access: Root or jailbreak devices instantly — no need to add code or apply security vulnerabilities

  • Control: Configure device buttons, sensors, location, environment, battery, device IDs, ports, cameras, and mics

  • X-ray vision: Powerful app, file, system call, and console access control

  • Forensics: Advanced OS, kernel, and boot control and tooling

  • Network analysis: HTTP/S, traffic inspection, tracing, and logging

  • Replication: Snapshot, clone, restore, and audit device states

  • Teaming: Easy project workspace and team access management

  • Tooling: Simplified connection of additional IDE, debug, and security testing tools and scripts

  • Automation: Corellium’s extensive API allows teams to automate as much security testing as possible

Want to see how it’s done? Watch Brian and Steven perform mobile AppSec testing on a fictional application with many vulnerabilities, including insecure data storage, insecure communication, and common misconfigurations.