Apple is the company that, more than any other I can think of, has made privacy one of its core values. CEO Tim Cook has repeatedly stated that privacy is an essential human right. Privacy has been a huge part of the company's marketing, and, until now, Apple has largely lived up to that reputation.
That's especially true in comparison to its tech brethren, many of whom take a different approach to managing user data and don't provide the same degree of protection with tools like encryption. There are many reasons to be critical of Apple, but privacy has never been one of them.
For example, many people rightly praised Apple when it resisted the FBI's efforts to gain access to the devices of suspected mass shooters and terrorists. That wasn't because those people were on the side of terrorists, but because if there's a backdoor into a bad guy's device, there's a backdoor into everyone's device.
Now, a lot of people are upset with Apple's Child Safety initiatives, specifically a feature that will detect known child sexual abuse material (CSAM) uploaded to iCloud Photos. It isn't that anyone supports uploading this type of content.
There are, however, concerns that the technology represents a major shift in Apple's stance on privacy. The fact that the detection runs on the device might actually make it more privacy-protective in practice, but that was not the perception at all.
The problem is the perception of that shift, and this is where things get complicated. When it comes to trust, perception really is everything.
Ironically, Apple's recent announcements will not have any bearing on how you use your device. If you hadn't read this, or one of the hundred or so other articles on the subject, you would never know anything was different, unless, of course, you planned to store CSAM in iCloud Photos. That isn't entirely true, I'll grant you.
It does matter, though, because it's important to know what's going on with your device. The reason Facebook gets away with collecting so much information and monetizing it through personalized ads is that people don't really understand, or think about, the tradeoff they make when they use the free service. Understanding what's happening on your device matters.
Essentially, Apple has devised a way to detect this type of material without compromising user privacy. If you're uncomfortable with the possibility that Apple might detect content on your phone, you can turn off iCloud Photos; Apple has confirmed that, in that case, the technology does not run on your device.
Apple's biggest problem is expectations. Apple sets those expectations by telling customers that it operates differently from its competition, and this announcement was not what Apple customers expected. Even though Apple insists the technology respects user privacy, it has raised plenty of questions.
That's mostly because the rollout was pretty bad. The information got out before Apple made its announcement to the media, and Apple was unable to explain its position clearly without looking like it was playing games with words. Words have meaning, but the problem is that words don't mean exactly the same thing to everyone you're talking to.
It's up to you to make sure your audience gets your point; if they don't understand, that's on you. All of this confusion led people to doubt Apple's commitment to privacy.
If you set expectations and people start to believe you're doing something else, trust is broken. Trust is earned over time, built by keeping your word time after time. It is incredibly difficult to win and surprisingly easy to lose, and even a small breach can destroy a brand's reputation.
What you do has to be consistent with what you say. If it isn't, you're already losing the argument, and trust is lost.
Perhaps it isn’t so complicated after all.
Published at Wed 11 August 2021, 12:01:35 +0000