diff --git a/Document/0x04b-Mobile-App-Security-Testing.md b/Document/0x04b-Mobile-App-Security-Testing.md index 8fcb9c8d7f..20a3a8dcb5 100644 --- a/Document/0x04b-Mobile-App-Security-Testing.md +++ b/Document/0x04b-Mobile-App-Security-Testing.md @@ -366,13 +366,3 @@ The security of an application developed with DevOps must be considered during o - [paul] - M. Paul. Official (ISC)2 Guide to the CSSLP CBK, Second Edition ((ISC)2 Press), 2014 - [mcgraw] - G McGraw. Software Security: Building Security In, 2006 - -### OWASP MASVS - -- MSTG-ARCH-1: "All app components are identified and known to be needed." -- MSTG-ARCH-3: "A high-level architecture for the mobile app and all connected remote services has been defined and security has been addressed in that architecture." -- MSTG-ARCH-4: "Data considered sensitive in the context of the mobile app is clearly identified." -- MSTG-ARCH-5: "All app components are defined in terms of the business functions and/or security functions they provide." -- MSTG-ARCH-6: "A threat model for the mobile app and the associated remote services has been produced that identifies potential threats and countermeasures." -- MSTG-ARCH-7: "All security controls have a centralized implementation." -- MSTG-ARCH-10: "Security is addressed within all parts of the software development lifecycle." diff --git a/Document/0x04i-Testing-User-Privacy-Protection.md b/Document/0x04i-Testing-User-Privacy-Protection.md new file mode 100644 index 0000000000..e448ece853 --- /dev/null +++ b/Document/0x04i-Testing-User-Privacy-Protection.md @@ -0,0 +1,143 @@ +# Mobile App User Privacy Protection + +**IMPORTANT DISCLAIMER:** The MSTG is not a legal handbook. Therefore, we will not deep dive into the GDPR or other potentially relevant legislation here. This chapter is meant to introduce you to the topics and provide you with essential references that you can use to continue researching on your own. We'll also do our best to provide you with tests or guidelines for testing the privacy-related requirements listed in the OWASP MASVS. + +## Overview + +### The Main Problem + +Mobile apps handle all kinds of sensitive user data, from identification and banking information to health data. There is an understandable concern about how this data is handled and where it ends up. We can also frame this as the trade-off between the benefits users get from using the apps and the real price they are paying for them (usually, and unfortunately, without even being aware of it). + +### The Solution (pre 2020) + +To ensure that users are properly protected, legislation such as the [General Data Protection Regulation (GDPR)](https://gdpr-info.eu/ "GDPR") in Europe has been developed and deployed (applicable since May 25, 2018), forcing developers to be more transparent regarding the handling of sensitive user data. This has mainly been implemented via privacy policies. + +### The Challenge + +There are two main dimensions to consider here: + +- **Developer Compliance**: Developers need to comply with legal privacy principles since they are enforced by law. Developers need to better comprehend the legal principles in order to know exactly what they need to implement to remain compliant. Ideally, at least, the following must be fulfilled: + - **Privacy-by-Design** approach (Art. 25 GDPR, "Data protection by design and by default").
- **Principle of Least Privilege** ("Every program and every user of the system should operate using the least set of privileges necessary to complete the job.") +- **User Education**: Users need to be educated about their sensitive data and informed about how to use the application properly (to ensure secure handling and processing of their information). + +> Note: More often than not, apps will claim to handle certain data, but in reality that's not the case. The IEEE article ["Engineering Privacy in Smartphone Apps: A Technical Guideline Catalog for App Developers" by Majid Hatamian](https://www.researchgate.net/publication/339349349_Engineering_Privacy_in_Smartphone_Apps_A_Technical_Guideline_Catalog_for_App_Developers) gives a very nice introduction to this topic. + +### Protection Goals for Data Protection + +When an app needs personal information from a user for its business process, the user needs to be informed about what happens with the data and why the app needs it. If there is a third party doing the actual processing of the data, the app should inform the user about that too. + +Surely you're already familiar with the classic triad of security protection goals: confidentiality, integrity, and availability. However, you might not be aware of the three protection goals that have been proposed to focus on data protection: + +- **Unlinkability**: + - Users' privacy-relevant data must be unlinkable to any other set of privacy-relevant data outside of the domain. + - Includes: data minimization, anonymization, pseudonymization, etc. +- **Transparency**: + - Users should be able to request all information that the application has on them, and receive instructions on how to request this information. + - Includes: privacy policies, user education, proper logging and auditing mechanisms, etc. +- **Intervenability**: + - Users should be able to correct their personal information, request its deletion, withdraw any given consent at any time, and receive instructions on how to do so. + - Includes: privacy settings directly in the app, single points of contact for individuals’ intervention requests (e.g. in-app chat, telephone number, e-mail), etc. + +> See Section 5.1.1 "Introduction to data protection goals" in ENISA's ["Privacy and data protection in mobile applications"](https://www.enisa.europa.eu/publications/privacy-and-data-protection-in-mobile-applications/at_download/fullReport) for more detailed descriptions. + +Addressing both security and privacy protection goals at the same time is a very challenging task (if not impossible in many cases). There is an interesting visualization in IEEE's publication [Protection Goals for Privacy Engineering](https://ieeexplore.ieee.org/document/7163220) called ["The Three Axes"](https://ieeexplore.ieee.org/document/7163220#sec2e) representing the impossibility of ensuring 100% of each of the six goals simultaneously. + +Most of the processes derived from the protection goals are traditionally covered in a privacy policy. However, this approach is not always optimal: + +- developers are not legal experts but still need to be compliant. +- users would be required to read the usually long and wordy policies.
### The New Approach (Google's and Apple's take on this) + +In order to address these challenges and help users easily understand how their data is being collected, handled and shared, Google and Apple introduced new privacy labeling systems (very much along the lines of NIST's proposal for [Consumer Software Cybersecurity Labeling](https://www.nist.gov/system/files/documents/2021/11/01/Draft%20Consumer%20Software%20Labeling.pdf)): + +- the App Store [Nutrition Labels](https://www.apple.com/privacy/labels/) (since 2020). +- the Google Play [Data Safety Labels](https://android-developers.googleblog.com/2021/05/new-safety-section-in-google-play-will.html) (since 2021). + +As a new requirement on both platforms, it's vital that these labels are accurate in order to provide user assurance and mitigate abuse. + +### How this Relates to Testing Other MASVS Categories + +The following is a non-exhaustive list of [common privacy violations](https://support.google.com/googleplay/android-developer/answer/10144311?hl=en-GB#1&2&3&4&5&6&7&87&9&zippy=%2Cexamples-of-common-violations) that you, as a security tester, should report: + +- Example 1: An app that accesses a user's inventory of installed apps and doesn't treat this data as personal or sensitive, for example by sending it over the network (violating MSTG-STORAGE-4) or to another app via IPC mechanisms (violating MSTG-STORAGE-6); a tester-side sketch for spotting the network case is included at the end of this section. +- Example 2: An app that displays sensitive data such as credit card details or user passwords without requiring user authorization, e.g. via biometrics (violating MSTG-AUTH-10). +- Example 3: An app that accesses a user's phone or contact book data and doesn't treat this data as personal or sensitive data, additionally sending it over an unsecured network connection (violating MSTG-NETWORK-1). +- Example 4: An app that collects device location (which is apparently not required for its proper functioning) and does not have a prominent disclosure explaining which feature uses this data (violating MSTG-PLATFORM-1). + +> You can find more common violations in [Google Play Console Help (Policy Centre -> Privacy, deception and device abuse -> User data)](https://support.google.com/googleplay/android-developer/answer/10144311?hl=en-GB#1&2&3&4&5&6&7&87&9&zippy=%2Cexamples-of-common-violations). + +As you can see, this is deeply related to other testing categories. When you're testing them you're often indirectly testing for User Privacy Protection. Keep this in mind since it will help you provide better and more comprehensive reports. Often you'll also be able to reuse evidence from other tests in order to test for User Privacy Protection (see an example of this in ["Testing User Education"](#testing-user-education-mstg-storage-12)).
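For instance, for Example 1 on Android you can cross-check the device's installed app inventory against your captured network traffic. The following is a minimal tester-side sketch, assuming you have adb access to the test device and have already exported the intercepted traffic to a plain-text file; the file name `captured_traffic.txt` is a hypothetical placeholder for whatever export your interception proxy produces.

```python
import subprocess

# Assumption: a plain-text export of the intercepted traffic (e.g. flows dumped from
# your interception proxy) exists locally. The file name is a placeholder.
TRAFFIC_DUMP = "captured_traffic.txt"


def installed_packages():
    """Return the package names installed on the connected Android device via adb."""
    out = subprocess.run(
        ["adb", "shell", "pm", "list", "packages"],
        capture_output=True, text=True, check=True,
    ).stdout
    # Each output line looks like "package:com.example.app"
    return [
        line.split(":", 1)[1].strip()
        for line in out.splitlines()
        if line.startswith("package:")
    ]


def leaked_packages(traffic_file=TRAFFIC_DUMP):
    """Return installed package names that literally appear in the captured traffic."""
    with open(traffic_file, encoding="utf-8", errors="ignore") as f:
        traffic = f.read()
    return [pkg for pkg in installed_packages() if pkg in traffic]


if __name__ == "__main__":
    hits = leaked_packages()
    if hits:
        print("Installed app inventory possibly sent over the network:")
        for pkg in hits:
            print(f"- {pkg}")
    else:
        print("No installed package names found in the captured traffic.")
```

Any hit is only an indicator: confirm in the intercepted flows which app actually sent the data and in what context before reporting a violation.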
### Learn More + +You can learn more about this and other privacy-related topics here: + +- [iOS App Privacy Policy](https://developer.apple.com/documentation/healthkit/protecting_user_privacy#3705073) +- [iOS Privacy Details Section on the App Store](https://developer.apple.com/app-store/app-privacy-details/) +- [iOS Privacy Best Practices](https://developer.apple.com/documentation/uikit/protecting_the_user_s_privacy) +- [Android App Privacy Policy](https://support.google.com/googleplay/android-developer/answer/9859455#privacy_policy) +- [Android Data Safety Section on Google Play](https://support.google.com/googleplay/android-developer/answer/10787469) +- [Android Privacy Best Practices](https://developer.android.com/privacy/best-practices) + +## Testing User Education (MSTG-STORAGE-12) + +### Testing User Education on Data Privacy on the App Marketplace + +At this point, we're only interested in knowing which privacy-related information is being disclosed by the developers and in evaluating whether it seems reasonable (similar to what you'd do when testing for permissions). + +> It's possible that the developers are not declaring certain information that is indeed being collected and/or shared, but that's a topic for a different test extending this one. As part of this test, you are not expected to provide assurance that no privacy violations exist. + +### Static Analysis + +You can follow these steps: + +1. Search for the app in the corresponding app marketplace (e.g. Google Play, App Store). +2. Go to the section ["Privacy Details"](https://developer.apple.com/app-store/app-privacy-details/) (App Store) or ["Safety Section"](https://android-developers.googleblog.com/2021/05/new-safety-section-in-google-play-will.html) (Google Play). +3. Verify whether there's any information available at all. + +The test passes if the developer has complied with the app marketplace guidelines and included the required labels and explanations. Store and provide the information you got from the app marketplace as evidence, so that you can later use it to evaluate potential violations of privacy or data protection. + +### Dynamic Analysis + +As an optional step, you can also provide some kind of evidence as part of this test. For instance, if you're testing an iOS app you can easily enable app activity recording and export a [Privacy Report](https://developer.apple.com/documentation/network/privacy_management/inspecting_app_activity_data) containing detailed app accesses to different resources such as photos, contacts, camera, microphone, network connections, etc. + +Doing this actually has many advantages for testing other MASVS categories. It provides very useful information that you can use to [test network communication](0x06g-Testing-Network-Communication.md) in MASVS-NETWORK or when [testing app permissions](0x06h-Testing-Platform-Interaction.md#testing-app-permissions-mstg-platform-1) in MASVS-PLATFORM. While testing these other categories you might have taken similar measurements using other testing tools. You can also provide this as evidence for this test. + +> Ideally, the information available should be compared against what the app is actually meant to do. However, that's far from a trivial task and could take from several days to weeks to complete, depending on your resources and support from automated tooling. It also heavily depends on the app functionality and context and should ideally be performed on a whitebox setup, working very closely with the app developers.
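Once you've exported the Privacy Report from the test device, you can post-process it to get a quick overview of resource accesses and contacted domains. The following is a minimal sketch assuming the report was exported as newline-delimited JSON; the file name and the field names used below (`type`, `category`, `domain`) are assumptions for illustration and may differ between iOS versions, so inspect a few raw entries of your own export first and adjust the keys accordingly.

```python
import json
from collections import Counter

# Assumptions: the Privacy Report was exported from the test device as newline-delimited
# JSON, and the field names below ("type", "category", "domain") match your export.
# Both the file name and the keys are placeholders; verify them against your own report.
REPORT_FILE = "App_Privacy_Report.ndjson"


def summarize(report_file=REPORT_FILE):
    resource_access = Counter()   # e.g. photos, contacts, camera, microphone
    network_domains = Counter()   # domains contacted while the app was being used

    with open(report_file, encoding="utf-8") as f:
        for line in f:
            line = line.strip()
            if not line:
                continue
            entry = json.loads(line)
            if entry.get("type") == "access":
                resource_access[entry.get("category", "unknown")] += 1
            elif entry.get("type") == "networkActivity":
                network_domains[entry.get("domain", "unknown")] += 1

    print("Resource accesses:")
    for category, count in resource_access.most_common():
        print(f"- {category}: {count}")

    print("Network activity:")
    for domain, count in network_domains.most_common():
        print(f"- {domain}: {count}")


if __name__ == "__main__":
    summarize()
```

You can attach both the raw report and this kind of summary as evidence and reuse it when testing the MASVS-NETWORK and MASVS-PLATFORM categories.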
### Testing User Education on Security Best Practices + +Testing this might be especially challenging if you intend to automate it. We recommend using the app extensively and trying to answer the following questions whenever applicable: + +- **Fingerprint usage**: when fingerprints are used for authentication providing access to high-risk transactions/information, + + _does the app inform the user about potential issues when having multiple fingerprints of other people registered to the device as well?_ + +- **Rooting/Jailbreaking**: when root or jailbreak detection is implemented, + + _does the app inform the user of the fact that certain high-risk actions will carry additional risk due to the jailbroken/rooted status of the device?_ + +- **Specific credentials**: when a user gets a recovery code, a password or a PIN from the application (or sets one), + + _does the app instruct the user to never share this with anyone else and to expect that only the app will request it?_ + +- **Application distribution**: in case of a high-risk application and in order to prevent users from downloading compromised versions of the application, + + _does the app manufacturer properly communicate the official way of distributing the app (e.g. from Google Play or the App Store)?_ + +- **Prominent Disclosure**: in any case, + + _does the app display prominent disclosure of data access, collection, use, and sharing? e.g. does the app use the [App Tracking Transparency Framework](https://developer.apple.com/documentation/apptrackingtransparency) to ask for permission on iOS?_ + +## References + +- Open-Source Licenses and Android - <https://www.bignerdranch.com/blog/open-source-licenses-and-android/> +- Software Licenses in Plain English - <https://tldrlegal.com/> +- Apple Human Interface Guidelines - <https://developer.apple.com/design/human-interface-guidelines/ios/app-architecture/requesting-permission/> +- Android App permissions best practices - <https://developer.android.com/training/permissions/requesting.html#explain> + +### OWASP MASVS + +- MSTG-STORAGE-12: "The app educates the user about the types of personally identifiable information processed, as well as security best practices the user should follow in using the app." diff --git a/Document/0x04i-Testing-user-interaction.md b/Document/0x04i-Testing-user-interaction.md deleted file mode 100644 index ed0b1b85df..0000000000 --- a/Document/0x04i-Testing-user-interaction.md +++ /dev/null @@ -1,66 +0,0 @@ -# Mobile App User Interaction - -## Testing User Education (MSTG-STORAGE-12) - -A lot has happened lately in terms of responsibilities that developers have to educate users on what they need to know. -This has shifted especially with the introduction of the [General Data Protection Regulation (GDPR)](https://gdpr-info.eu/ "GDPR") in Europe. Ever since then, it is best to educate users on what is happening with their private data and why. -Additionally, it is a good practice to inform the user about how to use the application properly. This should ensure a secure handling and processing of the user's information. -Next, a user should be informed on what type of device data the app will access, whether that is PII or not. -Last, you need to share OSS related information with the user. -All four items will be covered here. - -> Please note that this is the MSTG project and not a legal handbook. Therefore, we will not cover the GDPR and other possibly relevant laws here. - -### Informing users on their private information - -When you need personal information from a user for your business process, the user needs to be informed on what you do with the data and why you need it. If there is a third party doing the actual processing of the data, you should inform the user about that too.
Lastly, there are three processes you need to support: - -- **The right to be forgotten**: Users need to be able to request the deletion of their data, and be explained how to do so. - -- **The right to correct data**: Users should be able to correct their personal information at any time, and be explained how to do so. - -- **The right to access user data**: Users should be able to request all information that the application has on them, and be explained how to request this information. - -Most of this can be covered in a privacy policy, but make sure that it is understandable by the user. - -When additional data needs to be processed, you should ask the user for consent again. During that consent request it needs to be made clear how the user can revert from sharing the additional data. Similarly, when existing datasets of a user need to be linked, you should ask the user's consent about it. - -### Informing the user on the best security practices - -Here is a list of best practices where a user could be informed of: - -- **Fingerprint usage**: When an app uses a fingerprint for authentication and it provides access to high risk transactions/information, inform the user about the issues there can be when having multiple fingerprints of other people registered to the device as well. - -- **Rooting/Jailbreaking**: When an app detects a rooted or jailbroken device, inform the user of the fact that certain high-risk actions will carry additional risk due to the jailbroken/rooted status of the device. - -- **Specific credentials**: When a user gets a recovery code, a password or a pin from the application (or sets one), instruct the user to never share this with anyone else and that only the app will request it. - -- **Application distribution**: In case of a high-risk application it is recommended to communicate what the official way of distributing the app is. Otherwise, users might use other channels in which they download a compromised version of the application. - -### Access to Device Data - -Although partially covered by the Google Play Store and the Apple App Store, you still need to explain to the user which services your app consumes and why. For instance: - -- Does your app require access to the contact list? - -- Does your app need access to location services of the device? - -- Does your app use device identifiers to identify the device? - -Explain the user why your app needs to do this kind of things. More information on this subject can be found at the [Apple Human Interface Guidelines](https://developer.apple.com/design/human-interface-guidelines/ios/app-architecture/requesting-permission/ "Apple Human Interface Guidelines") and the [Android App permissions best practices](https://developer.android.com/training/permissions/requesting.html#explain "Android App permissions best practices"). - -### Other Information You Have to Share (OSS Information) - -Given copyright laws, you must make sure you inform the user on any third party libraries that are used in the app. For each third party library you should consult the license to see if certain information (such as copyright, modifications, original author, ...) should be presented to the user. For this, it is best to request legal advice from a specialist. An example can be found at [a blog post from Big Nerd Ranch](https://www.bignerdranch.com/blog/open-source-licenses-and-android/ "Example on license overview").
Additionally, the website [TL;DR - Legal](https://tldrlegal.com/ "TL;DR - Legal") can help you in figuring out what is necessary for each license. - -## References - -### OWASP MASVS - -- MSTG-STORAGE-12: "The app educates the user about the types of personally identifiable information processed, as well as security best practices the user should follow in using the app." - -### Example for open source license mentioning - -- - -### Website to Help with Understanding Licenses - -- - -### Guidance on Permission Requesting - -- Apple Human Interface Guidelines - -- Android App permissions best practices - diff --git a/Document/SUMMARY.md b/Document/SUMMARY.md index 06bdcd22ed..a9d67d0a57 100644 --- a/Document/SUMMARY.md +++ b/Document/SUMMARY.md @@ -16,7 +16,7 @@ - [Cryptography in Mobile Apps](0x04g-Testing-Cryptography.md) - [Testing Code Quality](0x04h-Testing-Code-Quality.md) - [Tampering and Reverse Engineering](0x04c-Tampering-and-Reverse-Engineering.md) -- [Testing User Education](0x04i-Testing-user-interaction.md) +- [Testing User Privacy Protection](0x04i-Testing-User-Privacy-Protection.md) ## Android Testing Guide diff --git a/README.md b/README.md index 1721dcc0d9..ac71641017 100644 --- a/README.md +++ b/README.md @@ -32,7 +32,7 @@ The MSTG and the MASVS are being adopted by many companies, standards, and vario - [Cryptography in Mobile Apps](Document/0x04g-Testing-Cryptography.md) - [Testing Code Quality](Document/0x04h-Testing-Code-Quality.md) - [Tampering and Reverse Engineering](Document/0x04c-Tampering-and-Reverse-Engineering.md) -- [Testing User Education](Document/0x04i-Testing-user-interaction.md) +- [Testing User Privacy Protection](Document/0x04i-Testing-User-Privacy-Protection.md)