I am far from an Apple "hater" - I applaud a great many of the company's decisions around its products and the more fully-baked state in which it tends to execute them (except keyboards, apparently). But yesterday, amid the onslaught of WWDC announcements, Apple quietly put the lid back on a boiling pot of controversy, and in doing so seriously soured the company's privacy messaging for me.

Last year, after Apple introduced built-in screen time parental controls in iOS, it began to slowly cull similar - but not functionally identical - apps from the App Store. It did so under what, to me, is a perfectly fair piece of reasoning: these apps were all using VPNs or iOS's MDM (mobile device management, a tool meant for administering enterprise device fleets) explicitly for the purpose of monitoring a person's activity on their iOS device. Some of that monitoring is genuinely innocent, such as legitimate screen time apps merely meant to enforce parents' rules about phone usage for their children. Others, though, do exactly what you'd expect of an app indistinguishable from one meant to maliciously and silently spy: full-time location tracking, geo-fencing, explicit image and text content alerts, web browsing history (and filtering), and more.

One of the more terrifying solutions is Bark, which brands itself as a modern, privacy-friendly parenting tool, but actually uses AI (and probably, on some level, humans) to alert parents whenever a text, photo, or website with potentially "harmful" content is detected. Looking at all the praise the app has earned from the media and parents, you might be inclined to think this is OK - after all, Bark doesn't let parents read full chat logs, just the ones that raise red flags. But the fact that an app like Bark can even exist while Apple contends that privacy is priority one on iOS is simply ridiculous. An app that can scrape images, private texts, emails, web history, call logs, and location data, send it all up to an untrusted third party's cloud for analysis, and then return the results for parents to peruse without their child's knowledge is consistent with that ethos? In reality, Bark is merely Silicon Valley spin and polish on what is a gross model: Spying on your kids - now easier and faster with AI! (Bark suggests telling your kids you're spying on them, for what that's worth.)

And while the narrative from all of these app developers is that they are merely interested in ensuring children have a safe online experience without parents having to break into their kids' phones in the middle of the night, it's a two-faced narrative at best. These apps can just as easily be used to monitor and track a spouse, and I have absolutely no doubt they're used that way. Bark, for example, detects explicit images or content - exactly the kind of functionality a jealous, paranoid spouse would die to have. Some apps, like mSpy (which remains banned from the App Store to date, while Bark was reinstated), don't even try to hide behind a veneer of respecting privacy: they straight-up advertise being able to read texts. Others, like SafeLagoon, offer all the stock imagery of a fun, friendly, safety-oriented service, but are already advertising the return of full-time chat and SMS monitoring to their iOS app as "coming soon," presumably on the heels of Apple's policy change.

Apps like SafeLagoon initially advertise themselves as screen time and safety tools - but the reality is they're powerful spying platforms.

I want to be clear: I am not telling anyone how to parent their children. If you want to spy on your kids, that's your business. My problem is that these apps also enable abusers and people seeking to control and track others for nefarious ends. It's not hard to see how an app like mSpy or SafeLagoon could be useful to someone in the human trafficking trade, but I won't get into all the dark, awful ways these services could be abused in this post - using your imagination is probably sufficient.

Given this, it's impossible to reconcile these apps' existence with the idea that Apple truly cares about the privacy of every user and is doing everything it can to ensure their data remains private. But why did Apple cave? What led it to make such a radical switch in its stance on these parental spying apps? After all, they're clearly no threat to Apple in any financial sense. The New York Times thinks antitrust concerns motivated the reversal, but it's really impossible to know - Apple isn't talking.

Competitive disadvantage could also have played a role. Families often choose ecosystems together: everyone is on iOS, or everyone is on Android (in the US, typically Samsung). That gives parents' smartphone decisions an outsized effect, especially since children are likely to go on to use the ecosystem they grew up with. Momentum is an extremely powerful force that keeps people inside product ecosystems, and platform lock-in is a tool Apple has been particularly effective at leveraging to make sure they stay. Stoking a narrative that Samsung phones (or any other Android device) provide far superior "parental monitoring" utilities really could prove decisive in some households. While the number of customers lost would be small relative to Apple's overall business, at a time when the company is desperate to pivot to services as iPhone sales stagnate, any loss of momentum could be damaging.

This is all, of course, just a theory - Apple isn't talking about why it reversed the policy or, oddly, why it didn't just implement a screen time API, as many of the affected companies proposed. But Apple has now made clear that its all-privacy-all-the-time model is subject to a massive exception, one which puts a potentially huge amount of data in the hands of companies we know little about. Bark claims to have scanned billions of text messages on millions of phones. How is this OK when Apple itself goes out of its way to tell us just how secure and private iMessage is? The company even recently launched a video ad about it. The dissonance is incredible.

Apple isn't the only one at fault, mind you: Google allows these apps to flourish on the Play Store without issue, and I'd certainly like to see the company take a stand against them as well. Wholesale device monitoring shouldn't be as easy as covertly installing an app on someone's unlocked phone, regardless of your parenting philosophy. Whatever legitimate parenting use cases these apps enable don't, in my mind, outweigh the potential for abuse or data mishandling, and I think both Apple and Google should take a long, hard look at what "protecting" children's privacy really means.