I'm a little angry. No, scratch that, I'm very angry. Whenever Apple introduces improvements to Spotlight on iPhones and iPads, I get a vivid bout of PPSSD (post-personal-search stress disorder) and remember the glorious years when we had a decent on-device search solution on Android. Then it was gone, and Google, the search company to end all search companies, pretended nothing had happened, that the feature was never there, and that nobody had really wanted a central search solution on their phone anyway.
In-app search, as the feature was called when it launched in 2016, was a Google search app and widget addition that found content in the various apps you've installed on your phone. First-party Google apps were the first to join the fray, so you could scour for contacts, calendar events, messages, emails, places, bookmarks, your browsing history, music, and movies from a single place. Better yet, you could write the name of a person and see their contact card, your previous communications with them, calendar events that involve them, and other relevant info.
Google made the API public too, so third-party developers could integrate their results into it. Not many jumped on the opportunity, but the ones that did simplified the experience for their users. Todoist and Zomato were the two examples that I saw, and trust me, it was so much easier to tap the Google widget that already exists on my home screen and look for a restaurant or a task instead of finding the app I need, opening it, spotting the search button or bar, then writing my query.
Above: This is what Android's system-wide search was capable of in 2018. Below: This is it now.
With better support from Google and developers, this could've made the Google widget a central place for all searches on Android, be it for system-wide data stored privately on the phone or for public web queries. Easily accessible, universal, and fast: what better experience could we have wished for?
But instead of improving, things started going backwards. First, contacts disappeared from the autofill results, then everything related to Personal Search was gone. Whether it was people mistakenly thinking Google was publishing their personal details online or the advent of Google Assistant as an idyllic be-all and end-all search experience, I'm not sure, but the end result is that you can now only find and launch apps with it, which is next to useless.
This is everything you can see when you search for a contact in iOS 15.
Contrast that with what Apple has been doing with Spotlight on iOS and iPadOS. The universal search capabilities were introduced with iOS 9 and improved over the years with third-party developer APIs. A year ago, the feature made it to the iPad, and it's getting another significant upgrade with iOS 15. So not only can you find nearly everything we had with Android and more, but now there are smart actions and tighter integrations. You can find your photos straight from it, and when you look for a person, you see not only their contact details and links to reach out to them across various apps, but also their location with Find My, links they shared with you, photos of them, and more.
It seems very powerful and, based on my experience with my iPad, it can be very handy. You stop thinking about apps first and start thinking about what exactly you're trying to achieve. A file name is more important than the fact that it's a file, a person's name more crucial than the fact that they're saved in your address book, and so on. It's an entirely new way of interacting with your phone that takes apps out of their silos and removes the virtual barriers erected around them. The difference is similar (but not identical) to searching for a product on Google instead of Amazon: You know you'll likely find it on Amazon, so you could go and look for it there and only there, but if you wanted more data about it, you'd search for it on Google, knowing Amazon would be one of the top results. Each approach has its own benefits, and it's great to be able to use both on iOS/iPadOS.
Search is the bread and butter of Google's business, and the company has somehow figured out a way to index the near entirety of the internet, yet it won't index the tiny amount of data stored on our Android phones. Want to find a contact? Open the Contacts app and look for it. A place? Open Maps. A pic? Open Photos. And so on. The lack of a central place and unified experience is baffling, even more so when you know the history of the feature and are aware that we had it, we really did, but it disappeared for some reason. And I'm sorry, but Assistant doesn't count. It's not available in all countries and languages, and it's a clumsy and slow way to do something that could be handled with a few taps. Besides, it offers fewer features than what we had with Personal Search.
There's a small light at the end of the tunnel, though. It looks like Google could add more universal search features with Android 12, but the details about that are quite vague. Until I see it in action, I'm not getting my hopes up.