For some time, there's been a widespread assumption that iOS security is virtually impenetrable. Thanks to the closed-source nature of the operating system, app encryption, and tight sandboxing, developers often assume that their iOS apps are secure by design.
Of course, anybody who knows anything about security will tell you there’s no such thing as 'impenetrable security'. In fact, we’ve been debunking this misconception for some time now.
Unsurprisingly, Apple is reluctant to discuss their security policies and defenses in detail. This means misconceptions about their security approach often extend as far as developers and security professionals. And when it comes to mobile app encryption, it turns out the widespread assumption was even further off the mark than we thought.
I recently discovered that the encryption applied to an iOS app was far less extensive than expected. In short, this means your iOS apps are probably less secure than you think. Here’s what you need to know about the discovery:
A big misconception is that iOS app encryption protects your application, and I’d say that’s not correct. It’s mainly there to prevent the arbitrary passing around of apps: it’s about protecting your business model, not the source code or business logic of your apps. It’s there to stop end users from downloading an app and then distributing pirated or cracked versions of it.
It’s commonly assumed that when iOS apps are published to the App Store, the entire app is encrypted as part of Apple’s wider suite of built-in security tools. The reality is less clear-cut.
On the surface, it would seem that all iOS apps are encrypted by design whenever they’re uploaded to the App Store. But while there is certainly some level of encryption, Apple is generally pretty quiet on how extensive it is and what it involves.
This means there are a lot of misconceptions around what iOS app encryption is and what it’s designed to achieve.
So how far does iOS encryption really go?
I took an old jailbroken iPhone that ran iOS 16 and analyzed a well-known streaming app. This is virtually the same process that a hacker would use to analyze/reverse engineer an app. The results were a shock to both me and the wider Promon security team. The truth is, iOS app encryption is a lot less extensive than we thought.
Most people operate under the assumption that the entire main binary is encrypted from start to finish. But, at least on these older devices, that’s not necessarily true. In fact, only a couple of pages of memory were actually encrypted, roughly 16 KB of a multi-megabyte binary file. There was no encryption in the libraries or any other files, and very little of the binary itself was encrypted: little more than a series of headers and a few sections of code.
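If you want to check this on your own app, the relevant data lives in the binary’s LC_ENCRYPTION_INFO_64 load command, which records where the FairPlay-encrypted region starts and how many bytes it covers. Below is a minimal sketch in Swift, assuming a 64-bit main executable; the function name and hard-coded constant are mine, and on a jailbroken device the same information can just as easily be read with static tooling.

```swift
import MachO

// Hypothetical helper, not production code: walk the main executable's load
// commands and return the FairPlay encryption info, if present.
// Field names come from <mach-o/loader.h>; figures vary per app and iOS version.
let LC_ENCRYPTION_INFO_64_CMD: UInt32 = 0x2C  // LC_ENCRYPTION_INFO_64

func fairPlayEncryptionInfo() -> (cryptid: UInt32, cryptoff: UInt32, cryptsize: UInt32)? {
    // Image 0 is the app's main executable.
    guard let header = _dyld_get_image_header(0) else { return nil }
    let ncmds = UnsafeRawPointer(header)
        .assumingMemoryBound(to: mach_header_64.self).pointee.ncmds
    var cursor = UnsafeRawPointer(header) + MemoryLayout<mach_header_64>.size

    for _ in 0..<ncmds {
        let command = cursor.assumingMemoryBound(to: load_command.self).pointee
        if command.cmd == LC_ENCRYPTION_INFO_64_CMD {
            let info = cursor.assumingMemoryBound(to: encryption_info_command_64.self).pointee
            // cryptid == 0 means the region is not (or no longer) encrypted;
            // cryptsize is the size in bytes of the region FairPlay covers.
            return (info.cryptid, info.cryptoff, info.cryptsize)
        }
        cursor += Int(command.cmdsize)
    }
    return nil
}
```

Comparing the reported cryptsize against the size of the binary on disk makes the gap described above easy to see for yourself.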
In all likelihood, this is a deliberate trade-off to improve app performance. Decrypting code before it can be executed takes time, meaning more encryption would likely slow down the app’s startup process.
Nonetheless, this still means the built-in protections in iOS apps are far less extensive than we thought.
So why isn’t this common knowledge? The truth is, few developers are analyzing apps in this way, because they’re focused on creating apps and shipping them quickly to market, not security. This explains why the misconceptions around iOS security are so widespread.
Of course, this only applies to the iOS 16 device I used to analyze the app. More recent iOS versions may encrypt the entire text section of the main binary. But since many users are still running apps on older devices and OS versions, this still leaves open a key vulnerability.
Read more: Securing streaming apps: How app shielding protects your intellectual property
All this means that a hacker with a jailbroken device has access to much more of the source code and business logic than we originally thought was the case.
Of course, this will differ depending on how the app itself is architected. If most of the business logic sits in libraries rather than the main binary, an attacker will be able to access virtually all of it, since the libraries aren’t encrypted at all.
But as we’ve discovered, even the main binary isn’t fully encrypted, at least not on the iOS 16 device we tested. This creates several key risks, all of which stem from one fundamental issue: a hacker with a jailbroken device can access the app’s code, logic, and sensitive information.
As long as jailbreaking remains possible, this will always be an issue. And since there is no 100% effective method of detecting jailbroken devices, we need to turn to other solutions to help keep our apps safe.
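For context, the checks that apps typically use to spot jailbroken devices are simple heuristics, and every one of them can be bypassed by a determined attacker. The sketch below shows a few common examples in Swift; the specific paths and the probe file are illustrative, not an exhaustive or reliable detection method.

```swift
import Foundation

// A minimal sketch of common jailbreak heuristics. Each check is easy for an
// attacker to defeat, which is why detection alone is never enough.
func looksJailbroken() -> Bool {
    // Files and directories that typically only exist on jailbroken devices.
    let suspiciousPaths = [
        "/Applications/Cydia.app",
        "/Library/MobileSubstrate/MobileSubstrate.dylib",
        "/bin/bash",
        "/usr/sbin/sshd",
        "/etc/apt",
    ]
    if suspiciousPaths.contains(where: { FileManager.default.fileExists(atPath: $0) }) {
        return true
    }

    // A sandboxed app should never be able to write outside its own container.
    let probePath = "/private/jailbreak_probe.txt"
    do {
        try "probe".write(toFile: probePath, atomically: true, encoding: .utf8)
        try? FileManager.default.removeItem(atPath: probePath)
        return true
    } catch {
        return false
    }
}
```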
If there’s one thing you should take away from this article, it’s this: you can’t rely on out-of-the-box Apple security to keep your apps safe from either security threats or piracy.
Luckily, there are other options. A range of tools can help protect your code and business logic from both runtime and static analysis, including runtime application self-protection (RASP) and code obfuscation.
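As a small illustration of what runtime application self-protection builds on, here is one widely used check, written in Swift, that asks the kernel whether a debugger is attached to the process. Commercial protection products layer many such detections with integrity checks and obfuscation; this is only a sketch of the principle, not a complete defense.

```swift
import Darwin

// Ask the kernel for this process's info and check the P_TRACED flag,
// which is set while a debugger (such as lldb) is attached.
func debuggerIsAttached() -> Bool {
    var info = kinfo_proc()
    var size = MemoryLayout<kinfo_proc>.stride
    var mib: [Int32] = [CTL_KERN, KERN_PROC, KERN_PROC_PID, getpid()]
    guard sysctl(&mib, UInt32(mib.count), &info, &size, nil, 0) == 0 else {
        return false
    }
    return (info.kp_proc.p_flag & P_TRACED) != 0
}
```

A check like this only raises the bar: an attacker who controls a jailbroken device can patch it out, which is exactly why layered defenses and obfuscation matter.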