Questions Raised on Apple’s Use of ‘Differential Privacy’ in iOS 10


Consumer information, specifically what users do while using apps or websites, is one of the most valuable things a technology company can collect, earning it millions, even billions, of dollars in revenue each month. All major Internet companies, including Google, Facebook, and Microsoft, employ a range of tactics to collect their users' browsing habits, not only to provide a better user experience but also to profit from it.

Apple is also jumping on this bandwagon, and company CEO Tim Cook revealed at WWDC (the Worldwide Developers Conference) that Apple's next iteration of iOS, iOS 10, will employ something the company calls differential privacy (DP) to collect more relevant data about its users. Cook illustrated the company's intentions with rather harmless-looking examples: how many users are using a certain emoji, and where people frequently travel.

In iOS 10, DP is designed to work with Apple's search features, Spotlight and Lookup Hints, which the company says will improve context sensitivity and recommendations for apps, music, restaurants, and more, based on what you are doing.

Craig Federighi, Apple's SVP of software engineering, explained at WWDC that the technology behind DP works by adding "incorrect information", statistical noise, to the data that Cupertino collects. Apple has designed the technology so that its algorithms can still extract useful aggregate insights while making it very difficult for anyone to link the data back to an individual user.
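Apple has not published the exact mechanism it uses, but a classic local differential privacy technique that illustrates the "incorrect information" idea is randomized response: each user sometimes reports a random answer instead of the true one, yet the aggregate statistic can still be recovered. A minimal sketch (the flip probability here is illustrative, not Apple's parameter):

```python
import random

def randomized_response(truth: bool, p_flip: float = 0.25) -> bool:
    """With probability p_flip, report a fair coin flip instead of
    the truth. No single report reliably reveals the user's answer."""
    if random.random() < p_flip:
        return random.random() < 0.5
    return truth

def estimate_true_rate(reports, p_flip: float = 0.25) -> float:
    """Recover the population rate from noisy reports.
    Observed rate = (1 - p_flip) * true_rate + p_flip * 0.5,
    so invert that relationship."""
    observed = sum(reports) / len(reports)
    return (observed - p_flip * 0.5) / (1 - p_flip)
```

Over many users, the estimator converges to the true proportion (say, of people who used a given emoji), even though any individual's report is deniable.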

If we compare DP with the tactics employed by Google and Facebook, both of which use personally identifiable information (PII) to deliver a richer user experience, Apple's DP claims to mask PII. While all of this sounds 'helpful', many industry experts say there is a darker side.

The first of the many points raised by industry experts is that anonymizing consumer data is much harder than it sounds, and chances are that this level of anonymization can't actually be achieved.

There are those who believe that while DP may be effective within individual databases, once data is lumped together over time it is inevitable that PII will be exposed.

Some are also of the opinion that this is all a marketing gimmick, and that at its crux it is artful deception intended to mask the actual 'spying' on user activities.

Getting into the technology behind DP, some point out that there is a fundamental trade-off between accuracy and privacy, which can be a big problem when training complex machine learning models. Once data has been leaked, it's gone. Once you've leaked as much data as your calculations tell you is safe, you can't keep going, at least not without risking your users' privacy.
