Apple's most recent iOS release and updates to its terms and conditions may represent just the latest perverse disincentive for users to update -- purporting to keep users more secure while actively contributing to poor security culture.
On September 18 -- one day after Apple released iOS 12 (along with tvOS 12 and watchOS 5) -- VentureBeat broke the story that with those updates came an insidious-looking update to Apple's iTunes Store conditions and privacy agreement. Specifically, a new clause states that Apple will "compute... trust score[s]" for each device based on the number of phone calls and emails users send and receive should they attempt an iTunes purchase.
The terms go on to note that these "scores" will be "stored for a fixed time on [Apple's] servers." Apple defends the move as an anti-fraud measure new in iOS 12, and further asserts that the trust scores are guarded against reverse engineering -- in other words, that it would not be possible to take the numerical score and work backwards to figure out call or email data. Nonetheless, privacy wardens' hackles have been duly raised by the quietly slipped-in paragraph -- particularly as the popular punditry wonders in print exactly how some of these data points can possibly make users safer against fraud.
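Apple has not published how its trust score is derived, so the following is purely a hypothetical sketch -- with an invented `trust_score` function and made-up parameters -- of why an aggregated, one-way score of the kind Apple describes would resist reverse engineering: many different activity profiles collapse into the same small score, so the score alone cannot be worked backwards into call or email counts.

```python
import hashlib

def trust_score(num_calls: int, num_emails: int, device_salt: str) -> int:
    """Hypothetical one-way 'trust score': a salted hash of activity
    counts, reduced to one of 100 buckets. This illustrates the general
    technique only; it does not reflect Apple's actual (unpublished)
    algorithm."""
    payload = f"{device_salt}:{num_calls}:{num_emails}".encode()
    digest = hashlib.sha256(payload).digest()
    return digest[0] % 100  # collapse 256 bits of hash down to 0-99

# Many distinct (calls, emails) profiles collide into each score bucket,
# so a score reveals essentially nothing about the underlying counts.
collisions: dict[int, list[tuple[int, int]]] = {}
for calls in range(50):
    for emails in range(50):
        score = trust_score(calls, emails, "device-123")
        collisions.setdefault(score, []).append((calls, emails))

# 2,500 input profiles map onto at most 100 scores: heavy collision.
print(len(collisions), "distinct scores for", 50 * 50, "activity profiles")
```

The design point is the one Apple's terms gesture at: a lossy aggregate (here, a salted hash truncated to a small range) is a many-to-one mapping, so even an attacker who obtains the stored score cannot recover the sensitive inputs.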
Moreover, iOS 12 will keep close tabs on user habits, and not only in the name of anti-fraud handwaving. Two new features of iOS 12 -- "Screen Time" and "App Limits" -- aim, in theory, to curb the addictive behavior of those glued to their devices: the former measures and reports how much time a user spends in particular apps and on particular websites, while the latter locks users out of certain apps once they exceed a set time limit. The dark side of all of this is that the data gathered by Screen Time promises to be a goldmine for successful intruders -- while App Limits could potentially be abused by attackers as a denial-of-service vector, locking legitimate users out of their own apps.
Apologists might point to iOS 12's numerous new security features, including (inter alia) custom-length numerical passcodes, a password manager, and USB restrictions. The sad part is, however, that even these features may not be enough to convince users to upgrade -- particularly if they are put off by the significant UI changes.
Meanwhile, even those who do upgrade may find themselves regretting it.
It wouldn't be the first time. UI-wise, iOS 11 represented a giant middle-finger emoji to those who upgraded. In most default apps in iOS 11, sizable swaths of precious screen real estate became co-opted by giant headers (apparently on the assumption that iPhone users are too stupid to know that they're looking at their text messages without the word "Messages" in giant bold font at the top of the screen). For privacy-conscious users who had location data turned off, Siri stopped working entirely for voice-directed calls to local businesses listed in users' contacts. And Netflix's original series American Vandal (SPOILER ALERT) gave iOS 11's infamous typing glitch newfound notoriety by making it a plot-turning clue.
And all of this is to say nothing of the performance degradations (purposeful, on Apple's part), software crashes, and double-time-plus battery draining through which iOS 11 users have suffered.
Unsurprisingly, iOS 11's reputation likewise suffered; professional reviewers and regretful consumers alike warned those still on iOS 10 to skip this particular update.
Then came the DevSecOps know-it-alls, prodding and coercing users to update their iOS anyway. It was the InfoSec community, not the Cupertino megacorp, that added insult to injury with its messaging and behavior -- harshly chastising those who pointed out iOS 11's flaws. Their argument against pointing out the emperor's nakedness? That global InfoSec posture would suffer as lay users became convinced not to update their software to the newest (and least tested) version.
Even setting aside iOS 11's own plentiful security problems, the view is grossly elitist and anti-user. Holier-than-thou security wonks can preach all they want, but they fool themselves when they fail to recognize that, on the whole, users will always gravitate toward whatever works best -- period. (See Uber Loses Customer Data: Customers Yawn & Keep Riding.)
Apple historically has a bad reputation for latent problems in its major iOS releases. The company has thus trained its users to take a wait-and-see approach to every major iOS release. The potential for security and privacy compromises in iOS 12's features only compounds the problem.
Consequently, the fault for any security breaches and other problems that arise from users' natural trepidation about updating falls squarely on Apple's shoulders. Users don't like to be burned.
— Joe Stanganelli, principal of Beacon Hill Law, is a Boston-based attorney, corporate-communications and data-privacy consultant, writer, and speaker. Follow him on Twitter at @JoeStanganelli.