Apple’s Hidden Privacy Agenda

Is Apple being reckless?

One observation that some Apple pundits like to throw around is that Apple tends to add features with a broader future application in mind. For example, Apple initially added Touch ID only for unlocking your phone. Then, after a year or two, they added Apple Pay.

Although I think it would be wrong to expect Apple to be doing this for every feature, I do consider it very helpful to keep this in mind. That is, do not dismiss their actions unless you have thoroughly considered the possibility of a hidden agenda that will only reveal itself a few years into the future.

Apple’s stance on privacy is one of these actions.

  1. Many commentators have noted that Apple’s focus on privacy will strongly hinder, maybe even cripple, its artificial intelligence efforts. This is very dangerous for Apple’s future, because artificial intelligence is predicted to be a huge part of future personal computing.
  2. The plus side of a privacy focus is that it becomes a selling point for their products. However, we also know that today’s consumers do not care too much about privacy; at least, they seem to be happy to post photos on Facebook and search on Google.

Taking the two points above together, it would seem reckless for any tech company to take the privacy position that Apple holds today. The drawbacks are huge while the benefits look meagre. It looks like a totally irrational move for Apple, one that may be enforced only because of Tim Cook’s personal beliefs in human rights. It does not make any sense; unless, that is, Apple has a larger agenda for the future, an agenda in which privacy plays an essential role.

Looking at Apple’s future markets

As I have mentioned previously, Apple cannot grow significantly larger than it is today without expanding into markets outside of tech. The market that tech can directly address, the market to which Apple can sell its current devices, is limited by the size of the economies of the countries it sells to, and by the amount of money each household is willing to spend on communications and entertainment. Apple has to move into different buckets of household spending. Furthermore, these buckets have to be large enough to drive revenue that can contribute significantly to Apple’s huge earnings.

Looking at what households actually spend their money on, one obvious contender is health. US households spend a huge proportion of their income on health, and in countries with an adequate healthcare system in place, health is a huge proportion of government expenditure. There is a lot of money in health, and as populations in both developed and developing countries age, it is only going to get larger.

Apple is already actively involved in health. Not only does Apple have HealthKit, it also has ResearchKit, which allows researchers to easily conduct large studies on patients, and CareKit, which allows patients to track and manage their own medical conditions. Importantly, the privacy of health information is taken very seriously (unlike web history or location tracking data), and although I am no expert, there are rules and laws governing it even in the USA (HIPAA being the obvious example).

For any company that seriously wants to get into health, data privacy is a hugely important issue. In particular, IT giants like Google or Apple will be held to higher standards, and will be expected to develop the necessary technologies where they do not yet exist. They will be scrutinised not only by the authorities, but also by the regular press. If Apple wants to go further into health, prove the value of its services, and extract revenue from this huge market, then it has to get the privacy issues sorted out first, and apply leading-edge technology to protect patient privacy. This is the prerequisite.

This is where I find Apple’s hidden privacy agenda. Apple does not need strict privacy to compete in the tech world against Google and Amazon. In fact, its privacy stance is detrimental to cutting-edge artificial intelligence, since server hardware will always be much more powerful than tiny smartphones for machine learning, and differential privacy will always limit what patterns can be observed. However, in some key non-tech markets that Apple needs to venture into, privacy will be important and essential. Apple’s stance on privacy should be judged not by the markets it sells to now, but by the markets it intends to sell to in the future.

Doing The Hard Things In Tech

Looking at all the mega-hits that Apple has brought to market over the past 40 years, there is one consistent theme: Apple tries to do the things that are considered hard, or even impossible, at the time.

With the original Mac, they created a GUI-only computer with a mere 128K bytes of memory. With the iPod, they synced 1,000 tunes (5GB’s worth) from your PC in an age when the predominant I/O (USB 1) was woefully inadequate (and tiny hard drives had only just become available). With the iPhone, they shrunk a full-blown PC into the size of a chocolate bar. With Mac OS X, they implemented a radically new graphical rendering system (Quartz Compositor) that taxed memory and CPU power and was unbearably slow on the hardware of the time; it only became truly usable years later with powerful new GPUs (Mac OS X 10.2).

In all these cases, Apple was not shy about doing something that most people at the time considered very difficult, if not impossible. Sometimes even Apple failed to do it well enough, and suffered the consequences of an inadequate product (low early Mac sales; the super-slow Mac OS X 10.0 and 10.1). But in the end, that is precisely why they managed to differentiate: others had not even started.

Apple’s approach to privacy can be seen in the same way. Whereas the common narrative was that you needed huge servers and massive data sets for good photo recognition, Apple has implemented machine learning on a smartphone that fits into your pocket. Of course they may be taking shortcuts, but so did the Mac 128K. What is important is that they took on the challenge while everybody else was doing machine learning the old way (on powerful servers, with less regard for privacy). Similarly, Apple has implemented a differential privacy approach which still has no guarantee of success. Even experts in the field are split: some say that the trade-off between privacy and machine-learning effectiveness might result in a product that simply won’t work. Apple made the bet nonetheless. Apple chose to take the hard, possibly impossible way, hobbling itself with the self-imposed shackle that is a privacy focus. They have thought different.

The simple reason why Apple’s approach has ever worked is Moore’s law. Moore’s law is the central source of rapid technical progress and disruption, and it turns what is impossible today into something easy to achieve tomorrow.

No one who has seen the progress of silicon would doubt that Moore’s law will eventually make the processing tasks done exclusively on high power servers today, possible on the smartphones of tomorrow. We should also consider that the amount of data collected from smart devices must be growing even faster than Moore’s law (thanks to the shrinking size and ubiquity made possible by Moore’s law in the first place). Tomorrow, we will have many times more data than we collect today, and it is totally possible that the sheer vastness of data will make it possible to infer meaningful conclusions from differential privacy data, even when anonymised under very stringent noise levels.
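The claim that sheer volume of data can overcome stringent noise can be made concrete with randomized response, the textbook differential-privacy mechanism. The sketch below is a simplified Python illustration; the noise parameter and population size are my own illustrative choices, not anything Apple has disclosed.

```python
import random

def randomized_response(truth: bool, p_honest: float = 0.75) -> bool:
    """Report the true answer with probability p_honest,
    otherwise report a coin flip. No single report is trustworthy."""
    if random.random() < p_honest:
        return truth
    return random.random() < 0.5

def estimate_true_rate(reports, p_honest: float = 0.75) -> float:
    """Invert the noise: E[reported] = p * true_rate + (1 - p) * 0.5."""
    observed = sum(reports) / len(reports)
    return (observed - (1 - p_honest) * 0.5) / p_honest

# Simulate a population in which 30% truly have some attribute.
random.seed(42)
population = [random.random() < 0.30 for _ in range(200_000)]
reports = [randomized_response(x) for x in population]
estimate = estimate_true_rate(reports)
```

Any individual report is plausibly deniable, yet with a large enough population the aggregate estimate comes out close to the true 30%, which is exactly why ever-growing data volumes make differential privacy more viable over time.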

Therefore, I predict that even though Apple’s approach to privacy may lead to a worse experience for the next couple of years, as Moore’s law kicks in, the difference will end up being negligible. By the time the general public becomes acutely aware of the need for privacy, Apple will have a powerful solution that, in terms of user experience, is just as good as Google’s.

The boldness to go all-in on a technology that just barely works, based on the hope that Moore’s law will save them in the next couple of years, is a defining feature of Apple’s hugely successful innovations. This is a formula that has worked for them time and time again.

This is what I see in Apple’s current privacy approach, and this is why I find it so typically, so lovably Apple.

Opening Up iOS And Implications

In the 2016 WWDC Keynote, Apple showed how it was going to open up Siri, Messages and Maps. It also showed how it was going to allow VoIP apps to show incoming calls just as the default Phone app does: using the full screen.

Now if this were just Messages, we might think it was a response to the popularity of messaging apps like WeChat, which work as platforms. However, if you listen to the State of the Union presentation after the Keynote, you learn that even Xcode has been opened up. It then becomes apparent that this is not a simple response to WeChat, but a deliberate iOS-wide, even Apple-ecosystem-wide direction that Apple is coordinating through its extensions system.

This extension system is not new. In fact, it is an extremely old idea, more often referred to as a “plug-in”. It is the idea that allowed browsers to provide rich multimedia experiences before the advent of HTML5, and the idea that allows programming editors like Eclipse to become very rich tools for a huge number of programming languages. This mechanism has repeatedly proven that it lets programs be used in ways never envisioned by their original creators, and that it can be very useful and effective. Although it does tend to add a layer of complexity for the end user, it is undoubtedly a feature that can have widespread impact.
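The plug-in idea can be sketched in a few lines: the host application defines a narrow extension point, third parties register handlers against it, and the host stays in control of when they run. This is a generic Python sketch of the pattern; the names are illustrative and do not correspond to any real Apple API.

```python
from typing import Callable, Dict

class ExtensionPoint:
    """A minimal plug-in mechanism: the host defines a narrow
    interface, and third parties register handlers for it."""
    def __init__(self, name: str):
        self.name = name
        self._handlers: Dict[str, Callable[[str], str]] = {}

    def register(self, vendor: str, handler: Callable[[str], str]) -> None:
        self._handlers[vendor] = handler

    def dispatch(self, query: str) -> Dict[str, str]:
        # The host decides when and how each extension is invoked.
        return {vendor: handler(query)
                for vendor, handler in self._handlers.items()}

# Hypothetical example: a Maps-like host exposing a recommendations point.
recommendations = ExtensionPoint("recommendations")
recommendations.register("yelp-like", lambda q: f"Top-rated spots near {q}")
recommendations.register("transit-like", lambda q: f"Transit options near {q}")
results = recommendations.dispatch("Shibuya")
```

The key design property is that the host never needs to anticipate what the extensions do; it only guarantees the interface, which is why plug-in systems so often outgrow their creators’ intentions.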

Given that the extensions are likely to be very popular, it is worthwhile trying to predict how they will advantage Apple and/or disadvantage its competitors.

  1. Let’s ponder whether Google would open up Maps, for example. Would they let third-party apps provide the restaurant and shop recommendations layered onto Google Maps? What would be the implications for a business model that depends on showing sponsored recommendations more prominently?
  2. Would wireless carriers be happy with VoIP apps that integrate into iOS to behave in just the same way as the default Phone app?
  3. Would Amazon open up its store so that random online stores can integrate themselves in the categorical listings and search results?

Many of Apple’s competitors provide the app layer for free and monetise at the extension layer. Google Maps plans to monetise by providing advertisements relevant to your location, but Apple Maps extensions will allow third parties to provide this layer instead of Apple. Similarly, Amazon provides an online store website with good search, recommendations and reviews; it monetises when people actually make purchases, which is the layer that Apple’s extensions live in.

What we see here is that Apple has created a powerful, ecosystem-wide extensions mechanism that is almost guaranteed to be popular, and which may conflict with the business models of Google, Amazon and many other competitors.

The implications will be interesting to watch.

Thoughts on WWDC 2016

Here I want to jot down some of my key thoughts after viewing Apple’s WWDC 2016 keynote.

Core Apps as platforms

We saw a lot of the core apps being opened up to developers. We saw this for Siri, Maps, Messages and even the regular Phone app. Developers can now write code that directly extends the functionality of these core apps. This makes each app its own platform.

  1. This provides a path through which Apple Maps may become much better than Google Maps for many parts of the world. Third parties can innovate on how to provide better shop recommendations/information and transit information, rather than replicating core functionality.
  2. The same can be said of VoIP apps. I have never had a VoIP app that had nearly as nice a UI as the iOS default Phone app. Now VoIP apps can simply focus on providing good connection and voice quality.
  3. Ditto for Siri and Messages.
  4. In some cases, this approach is only possible because Apple’s business model does not rely on advertising. For example, Google Maps could have trouble integrating information from Yelp, because this would conflict with its business model of profiting from the recommendations.

Differential Privacy

This is still a bold experiment. It has not yet been proven that this will allow sufficiently advanced artificial intelligence; in the coming months, it will be put to the test. Differential privacy may prove to be just as useful as the lax privacy that companies like Google employ.

More important, in my view, is that differential privacy may allow Apple to get the most valuable data.

Privacy of health data is considered to be very important, especially for genomic data. In genomic experiments using human-derived samples, great care is demanded to protect the privacy of the donor. Google’s approach would probably be considered too relaxed to be entrusted with such data, whereas Apple’s differential privacy may be sufficient. As a result, people might be very hesitant to give Google their DNA sequence information, but not so for Apple (it might even become an FDA recommendation).

If this becomes the case, then Apple will have a huge advantage, not because it has better AI algorithms or more data, but because it has the most valuable data.

The same may occur with many other types of data. If this becomes the case, then Apple may gain preferential access to the more valuable and important data (data that is not readily available by spying on your interactions with your phone). This will benefit Apple in the kinds of conclusions that its AI will be able to draw.

Google’s Justification

At Google I/O 2016, CEO Sundar Pichai showed a future filled with their artificial intelligence (AI). It is all very interesting, but I do have some questions.

How much data does Google’s AI need?

Google’s AI is backed by enormous amounts of data about us: data collected from photos that are publicly posted on the Internet, and from photos that we upload to Google’s cloud services from our mobile phones; data from our messages on Gmail and our events on Google Calendar; data from the GPS on our Android phones, which tells Google where we are every hour of the day; data from our browsers, which tells Google (often without our knowing it) which websites we have been visiting. No other company has access to similar amounts of private information.

However, what has not been answered is how much data Google’s AI actually needs.

Can effective AI be created without too much data?

A recent article by Steve Kovach on Apple’s next generation AI system is very interesting.

Siri brings in 1 billion queries per week from users to help it get better. But VocalIQ was able to learn with just a few thousand queries and still beat Siri.

This suggests that it is possible to construct an advanced AI system with data sets that are orders of magnitude smaller; data sets that do not have to be aggregates of private user information, but can simply be collated from a relatively small number of people who were paid for the work.

Of course we need to see the results to be sure. At the same time, I find it interesting that IBM Watson was able to win Jeopardy without tapping into huge data sets like those that Google uses.

Does an intelligent assistant mean you have to give up your privacy?

Apple tries hard not to see your private data. Apple believes that your private data belongs to you alone, and that you should be the only one who holds the keys. Many people have questioned this approach, based on the assumption that widespread access to private information from millions of people at the server level is the only way to create a sufficiently good AI system.

Apple’s approach does not preclude the storage and analysis of personal data, as long as it happens in a way that Apple itself cannot see. One way to do this is to perform the analysis on the smartphone itself. This is what the NSDataDetector class in the Mac/iOS API does. It’s actually pretty neat, and Apple has a patent on it. Similar but more advanced approaches could easily be implemented in iOS, given the performance of today’s CPUs.
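NSDataDetector itself is a Foundation API, but the principle, scanning text locally so nothing leaves the device, can be illustrated in any language. Below is a toy Python analogue; these regexes are far cruder than what the real class does, and are purely illustrative.

```python
import re

# Toy stand-in for on-device data detection: everything happens
# locally, and the raw text never leaves the machine.
DATE_RE = re.compile(r"\b\d{4}-\d{2}-\d{2}\b")
URL_RE = re.compile(r"https?://[^\s]+")

def detect(text: str) -> dict:
    """Return detected dates and links, computed entirely on-device."""
    return {
        "dates": DATE_RE.findall(text),
        "links": URL_RE.findall(text),
    }

message = "Lunch on 2016-06-17? Details at https://example.com/wwdc"
found = detect(message)
```

The point is architectural rather than algorithmic: the detector runs where the data lives, so a server operator never has to be trusted with the message contents at all.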

The question is, is this approach sufficient? Will analysing your private data on your device always be much less powerful than analysing it on the server? Furthermore, will there be a significant benefit in collating the private data from strangers to analyse your own? If so, then Google’s approach (which sacrifices your privacy) will remain significantly superior. If not, then Apple’s approach will suffice. That is, you will not necessarily have to give up your privacy to benefit from intelligent assistants.

Does Google need the data for other purposes?

Let us assume that there existed a technology that allowed you to create an effective intelligent assistant, but that did not require that you give up your personal data. Would Google still collect your personal data?

The answer to this question is quite obviously YES. Google ultimately needs your private information for ad targeting purposes.

Could Google be using the big data/AI argument to justify the collection of huge amounts of private data for ad targeting purposes? I think, very possibly YES.