By Simon Rice, Group Manager for Technology.
How long could people get by without using mobile apps? Maybe as long as a day without checking the weather or a bank balance. Probably only a few hours without checking emails. And maybe even less than that for Facebook, Twitter or WhatsApp.
A study last summer suggested that smartphone users spent 89% of their mobile media time using mobile apps. And that usage equals money. App development is big business, with the European app economy supporting more than one and a half million jobs.
But as organisations look at where apps can improve the service they offer (and often the profits they target), we’re keen to make sure that privacy isn’t forgotten.
In 2013, we published our guidance for app developers, which covers the areas of the Data Protection Act developers need to consider, to ensure their organisations aren’t at risk of a data breach that could damage their reputation and leave them open to a £500,000 fine.
There’s certainly room for improvement in the sector. Last year we carried out a detailed review of 21 popular apps. We installed and used each app, and analysed its network traffic. Where encrypted connections were seen, we used a fake digital certificate to try and perform a ‘man-in-the-middle’ attack and decrypt the traffic*. We also looked for evidence of cookies being set and whether consent was sought, and noted any other privacy or security issues.
The results weren’t panic-inducing, and as yet we’ve found nothing as flagrant as the torch app the US Federal Trade Commission took enforcement action against, for instance. However, there was plenty to be concerned about, even from our small first sample size of 21.
One important problem, found in three of the apps, was the use of unencrypted connections to transmit personal data. While not all of the data was sensitive, we did discover two login methods that used unencrypted (plain HTTP) connections. This meant that an attacker could easily snoop on the communication and grab usernames and passwords, if they could get into a position to do so. That might sound difficult, but being on the same wifi hotspot as the victim can often be enough – and with widespread open wifi in public areas such as cafes, hotels and restaurants, that’s a serious concern.
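To see why plain HTTP logins are so exposed, it helps to look at what actually travels over the network. The sketch below (the host, endpoint and field names are hypothetical, not taken from the apps we tested) assembles the raw bytes of an HTTP POST login; over an unencrypted connection, these exact bytes are visible to anyone able to capture the traffic.

```python
# Sketch: what a passive observer on a shared wifi network sees when a
# login form is submitted over plain HTTP. All names are hypothetical.
from urllib.parse import urlencode


def build_plain_http_login(host: str, username: str, password: str) -> bytes:
    """Assemble the raw bytes an HTTP POST login puts on the wire."""
    body = urlencode({"user": username, "pass": password})
    request = (
        f"POST /login HTTP/1.1\r\n"
        f"Host: {host}\r\n"
        f"Content-Type: application/x-www-form-urlencoded\r\n"
        f"Content-Length: {len(body)}\r\n"
        f"\r\n"
        f"{body}"
    )
    return request.encode()


wire_bytes = build_plain_http_login("api.example-app.com", "alice", "hunter2")
# Without TLS there is no encryption step: the password appears verbatim
# in the captured bytes.
assert b"hunter2" in wire_bytes
```

With HTTPS, by contrast, everything after the TCP handshake is encrypted, so a snooper on the same hotspot sees only ciphertext.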
Probably the most concerning finding, from a purely technical point of view, was that three of the apps using encrypted connections did not check digital certificates adequately. HTTPS remains an effective method for keeping data confidential in transit, but only when it’s set up and used properly (and recent vulnerabilities such as DROWN demonstrate how important this is). While encryption on its own guards against casual snooping, it doesn’t stop an attacker from impersonating a server. Proper certificate checking allows an app to be sure that it’s communicating with the intended server.
We discovered these certificate-checking flaws because our man-in-the-middle attacks worked – the three apps in question accepted our fake certificate and we were therefore able to intercept what should have been secure transmissions. Interestingly, one of these three apps was checking certificates properly for the most part, with only one instance of improper checking discovered. This demonstrates that even when developers know how to code securely, they still need to ensure they do so throughout their code.
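The difference between proper and broken certificate checking can be made concrete with Python’s standard `ssl` module. This is an illustrative sketch, not the code from any of the apps we reviewed: the default context verifies the certificate chain and the hostname, while the deliberately unverified context accepts any certificate at all, which is effectively what the flawed apps did.

```python
import ssl

# A properly configured TLS client context: the server's certificate
# chain is verified against trusted roots, and the certificate's name
# must match the host being contacted.
secure_ctx = ssl.create_default_context()
assert secure_ctx.verify_mode == ssl.CERT_REQUIRED
assert secure_ctx.check_hostname is True

# What inadequate certificate checking amounts to: the channel is still
# encrypted, but any certificate - including a man-in-the-middle's fake
# one - is accepted without question.
insecure_ctx = ssl._create_unverified_context()  # deliberately insecure
assert insecure_ctx.verify_mode == ssl.CERT_NONE
assert insecure_ctx.check_hostname is False
```

An app built on the second context would happily complete a TLS handshake with our interception proxy, which is exactly how the flaws surfaced in testing.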
Other issues we found during our work included:
- default passwords and weak password requirements;
- setting of cookies without consent;
- transmission of passwords in the URL;
- unexplained usage of tracking ID numbers;
- misleading interface design; and
- just plain annoyance (e.g. advertising in the notification bar).
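One item from that list is worth a brief illustration: transmitting a password in the URL. Even over HTTPS, full URLs are routinely written to server access logs, proxy logs and analytics, so credentials placed in a query string leak at rest. The URL below is a hypothetical example, not one we observed.

```python
# Sketch of why credentials in a URL are a problem, using a made-up
# endpoint. Anything in the query string ends up in logs and histories.
from urllib.parse import urlsplit, parse_qs

bad_url = "https://api.example-app.com/login?user=alice&pass=hunter2"

# The password is part of the URL itself, so every system that records
# the URL - web server logs, proxies, referrer headers - records it too:
query = parse_qs(urlsplit(bad_url).query)
assert query["pass"] == ["hunter2"]
```

The safer pattern is to send credentials in the body of a POST request over HTTPS, where they are encrypted in transit and excluded from routine URL logging.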
Where appropriate, we wrote to the developers about the issues we found, and we’re happy that the necessary changes to those apps have now been made.
We’re now moving on to further investigations into mobile apps. This week we’ve started looking at apps in the finance and wellbeing areas. Clearly these apps can be processing some very sensitive information, and there’s no excuse for failing to process that data properly. We’ll publish what we find later this year.
In the meantime, developers need to keep thinking about data protection issues. As people rely more and more on mobile apps, the risk of an app getting it wrong becomes greater. Now is a good time to read those 24 pages of ICO guidance.
* We’d like to credit the mitmproxy development team, whose software is used in our testing set-up. We’d also like to thank CERT for making available their Tapioca VM, which demonstrates how easily a man-in-the-middle attack can be mounted in practice.
Simon Rice is the Group Manager for the Technology team, which provides technical expertise to all ICO departments in order to support the broad range of activities undertaken by the ICO.