Doing The Hard Things In Tech

When observing all the mega-hits that Apple has brought to market over the past 40 years, there is one consistent theme: Apple tries to do the things that are considered hard, or even impossible, at the time.

With the original Mac, they created a GUI-only computer with a mere 128KB of memory. With the iPod, they synced 1,000 songs (5GB’s worth) from your computer in an age when the predominant I/O (USB 1) was woefully inadequate (Apple used FireWire instead) and tiny hard drives had just become available. With the iPhone, they shrank a full-blown PC to the size of a chocolate bar. With Mac OS X, they implemented a radically new graphical rendering system (Quartz Compositor) that taxed memory and CPU power and was unbearably slow on the hardware of the time; it only became fluid years later, when Mac OS X 10.2’s Quartz Extreme offloaded compositing to newly powerful GPUs.

In all these cases, Apple was not shy about doing something that most people at the time considered very difficult, if not impossible. Sometimes even Apple failed to do it well enough, and suffered the consequences of an inadequate product (weak early Mac sales; the painfully slow Mac OS X 10.0 and 10.1). But in the end, that is why they managed to differentiate: others had not even started.

Apple’s approach to privacy can be seen in the same way. Whereas the common narrative was that you needed huge servers and massive data sets for good photo recognition, Apple has implemented machine learning on a smartphone that fits into your pocket. Of course they may be taking shortcuts, but so did the Mac 128K. What is important is that they took on the challenge while everybody else was doing machine learning the old way (on powerful servers, with less regard for privacy). Similarly, Apple has implemented a differential privacy approach that still has no guarantee of success. Even experts in the field are split, and some say that the trade-off between privacy and machine learning effectiveness might result in a product that won’t work. Apple made the bet nonetheless. Apple chose to take the hard, possibly impossible way, hobbling itself with the self-imposed shackle that is a privacy focus. They have thought different.

The simple reason why Apple’s approach has worked even once is Moore’s law. Moore’s law is the central source of rapid technical progress and disruption, and it turns what is impossible today into something easy to achieve tomorrow.

No one who has seen the progress of silicon would doubt that Moore’s law will eventually make the processing tasks done exclusively on high-powered servers today possible on the smartphones of tomorrow. We should also consider that the amount of data collected from smart devices must be growing even faster than Moore’s law (thanks to the shrinking size and ubiquity made possible by Moore’s law in the first place). Tomorrow, we will have many times more data than we collect today, and it is entirely plausible that the sheer vastness of that data will make it possible to infer meaningful conclusions from differentially private data, even when it is anonymised under very stringent noise levels.

Therefore, I predict that even though Apple’s approach to privacy may lead to a worse experience for the next couple of years, as Moore’s law kicks in, the difference will end up being negligible. By the time the general public becomes acutely aware of the need for privacy, Apple will have a powerful solution whose user experience is just as good as Google’s.

The boldness to go all-in on a technology that just barely works, based on the hope that Moore’s law will save them in the next couple of years, is a defining feature of Apple’s hugely successful innovations. This is a formula that has worked for them time and time again.

This is what I see in Apple’s current privacy approach, and this is why I find it so typically, so endearingly Apple.

  • obarthelemy

    I think there’s a strong selection bias here.

    1- I learned about statistical anonymizing techniques (including the very “flip a coin, then answer or flip another” protocol I saw used as an example; see the sketch at the end of this comment) 25 years ago in Marketing Research class. “Doing it with a computer” is nice, but not earth-shattering either. I’m sure there’s more to it (and that it works better with a computer; I tried using it with Real People once, and it’s impractical and not credible), but that’s the gist of it.

    2- There’s more important stuff Apple isn’t doing, or is recycling from others. Democratizing IT. Melding Mobile and xtop. Transitioning IT to an ad-funded model like other media. Voice recog (they bought it). Mapping the whole world (they joined much later). Indexing all human knowledge, and then exploiting it in smart assistants…

    I’m not sure applying standard statistical anonymizing techniques to computer-generated data ranks up there with *these* world-changing achievements. I’m convinced anyone could do it, and users probably don’t care enough for it to be worth the bother. I’m anonymous enough in my insignificance. I’m intrinsically insignificant, no need to obfuscate me.
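    For what it’s worth, here is a minimal sketch of that coin-flip protocol in Python (the 30% rate and the sample size are invented purely for illustration): each respondent answers truthfully on heads, and otherwise reports a second coin flip; the aggregator then inverts the known bias to recover the population rate.

    ```python
    import random

    def randomized_response(truth: bool) -> bool:
        """Flip a coin: heads -> answer truthfully; tails -> flip again
        and report that second coin instead of the real answer."""
        if random.random() < 0.5:
            return truth
        return random.random() < 0.5

    def estimate_true_rate(responses: list[bool]) -> float:
        """P(yes) = 0.5 * p + 0.25, so invert: p = 2 * P(yes) - 0.5."""
        observed = sum(responses) / len(responses)
        return 2 * observed - 0.5

    # Invented population: 30% hold the sensitive trait.
    population = [random.random() < 0.3 for _ in range(100_000)]
    responses = [randomized_response(t) for t in population]
    print(f"Estimated rate: {estimate_true_rate(responses):.3f}")  # ~0.30
    ```

    No single answer is incriminating (every “yes” has plausible deniability), yet the aggregate survives; that inversion step is also why this works so much better at computer scale than with Real People.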

    • The key point in my mind is that despite Apple’s announcements at WWDC, many if not most pundits still think that Apple’s privacy focus will significantly, if not tragically, hobble their AI. They think that Google, Amazon and every other company in AI are justified in de-prioritising privacy and anonymity, because there is a huge trade-off. They do not think that Apple can have both good privacy and good AI features, even if it poured billions into R&D.

      That’s what I call hard.

      And Tim Cook has committed to it in a way that would make it very, very embarrassing to back down.

      • obarthelemy

        They won’t back down from the PR, that’s for sure, but this is the “Holding it wrong” and “Designed for your hand… and now… designed for this year’s new larger hands” company. They’re good at selling crazy PR.

        They also won’t let anyone audit that what they’re doing is actually working (nor *what* they’re actually doing). That’s the “security-focused” company that got hacked to the point its own store distributed malware, and on another occasion told its tech support to neither confirm nor deny a (very real and quite big) malware attack, leaving their own users in the lurch. That obscurity and bad faith help with staying on message.

        In this case, they’re adding random noise to the data. What gets hairy is that data is linked and correlated (and the value of the data is mostly in the correlations, so I assume Apple is keeping those), so the risk is that users can be identified (say, anonymized data shows that someone took a 5-person Uber ride to the airport at 6AM, and non-anonymized data shows that a family of 5 took a 9AM plane to Paris -> that bundle of data is that family) and the noise weeded out (they’re supposed to have ordered Domino’s at home while they were in Paris -> that’s noise).
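        A hypothetical sketch of that linkage attack (every record, name and time below is invented): two datasets that are each harmless on their own identify a group once joined on shared attributes like party size, time window and destination.

        ```python
        # Invented "anonymized" ride log and non-anonymized flight manifest.
        anonymized_rides = [
            {"ride_id": "r1", "passengers": 5, "hour": 6, "dropoff": "airport"},
            {"ride_id": "r2", "passengers": 1, "hour": 7, "dropoff": "airport"},
        ]
        flight_manifest = [
            {"family": "Martin", "party_size": 5, "departure_hour": 9, "dest": "Paris"},
        ]

        # A 5-person ride to the airport a few hours before a 5-person
        # party's departure is almost certainly the same group.
        for ride in anonymized_rides:
            for booking in flight_manifest:
                if (ride["passengers"] == booking["party_size"]
                        and ride["dropoff"] == "airport"
                        and 0 < booking["departure_hour"] - ride["hour"] <= 4):
                    print(f"{ride['ride_id']} is likely the {booking['family']} "
                          f"family flying to {booking['dest']}")
        ```

        And once the ride is linked, contradictory records (the Domino’s order at home while they were in Paris) stand out as noise and can be discarded.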

        I don’t think we can assume Apple will pull it off, that it won’t have consequences for their AI quality, or that they’ll be frank about either issue. But they have already won: people are already praising them based on their PR.

        • I would not rule out the possibility of an audit. In fact, to make their privacy stance tangibly beneficial beyond just PR, they will have to accept audits. For example, if they want to use healthcare data in their personalised AI efforts, then they probably can’t escape an audit.

          That’s where this whole discussion around privacy would get really interesting, and no longer be just an ideological argument. (Of course, if Trump becomes president, public awareness around privacy will probably escalate too.)

          • obarthelemy

            Trump couldn’t do much worse than what Julian Assange, Edward Snowden and Chelsea Manning have already told us is already being done. That hasn’t shaken the apathy.

            Apple have been making oft-disproved claims about encryption and security for years now, and not only has there never been an outcry for an audit, people are still taking the PR at face value in spite of all the disproof. Don’t see that changing for anonymity.

          • I agree that an audit is a very good idea, and necessary. If Apple, Google or Amazon start to go into real healthcare and health records, that’s when an audit will be required. The only question in my mind is *when* Apple will venture into this area, and how others will respond.