The author’s views are entirely his or her own (excluding the unlikely event of hypnosis) and may not always reflect the views of Moz.
Happy Friday, Moz fans! In today’s Whiteboard Friday episode, Tom digs into his research on Apple’s moves in search, specifically their recent launch of what he believes is a search engine, how it works, and how they could possibly hope to compete with Google in the future.
Click on the whiteboard image above to open a high resolution version in a new tab!
Howdy, Moz fans. We’re here today to talk about an Apple search engine. There have been rumors circulating for a couple of years now that Apple are building a search engine. I’m here to convince you not only that they are building a search engine, but that they’ve already launched it.
“Apple won’t be building a search engine”
But before we get into what that looks like, let’s start off by talking about the common reasons people say, “Apple won’t be building a search engine.” The first is that we know that Google pay Apple something in the order of $18 billion to $20 billion a year to be the default search engine on Apple devices. That’s a lot of cash.

But if you consider that Apple makes more than a billion dollars a day, it’s probably not as big a deal as we think it is. More importantly, in 2020, the Department of Justice in the U.S. said that they were going to sue Google for monopoly practices. One of the four bullet points they listed as their reasons was that they don’t want Apple being paid by Google to be the default search engine. They believe that’s a monopoly practice.

They want it to stop. So even if Apple do care about the cash, it might be going away anyway. It was then not necessarily a surprise when, in 2018, we saw the head of search at Google leave and go to Apple. Apple hired the head of search. That’s a very suspicious thing to do if you’re not building a search engine. At the same time, we also saw them start posting job listings for hundreds and hundreds of search engineers, which all adds up to building something search related.
Apple has launched a search engine
So rumors circulated: oh, Apple are going to build a search engine. It flew under the radar a little bit, but Apple already launched a web search engine. In iOS 14 (we’re on iOS 16 now; iOS 14 launched in September 2020), Apple made a change where, if you do a search in the iOS default search screen, the Siri suggested websites, which used to be powered by Google’s web index, are now powered by Apple’s own web index.
On their Applebot page, Apple talk about their search ranking factors: things like links and web page design characteristics, which sound a lot like PageRank and Core Web Vitals. So the search ranking factors look very similar to Google’s. They’ve also got a crawler very similar to Google’s. On the same page, they explain that if you’ve got no Applebot-specific rules in your robots.txt file, they’ll follow your Googlebot rules. So you can see the direction of travel.
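To illustrate that fallback behavior (the directives below are standard robots.txt syntax; the fallback itself is the behavior Apple describe): if a site has no Applebot group, Applebot honors the Googlebot group, and adding an explicit Applebot group overrides it.

```
# No Applebot group here, so Applebot follows the Googlebot rules:
User-agent: Googlebot
Disallow: /private/

# An explicit Applebot group would override that fallback:
User-agent: Applebot
Disallow: /private/
Disallow: /drafts/
```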
They’re trying to crawl the web in a similar shape to Google. But I think that’s a distraction. I think there are several things that they could do differently if they wanted to build a bigger search engine, and we’re going to talk about the three differences I think they could have that would allow them to build a search engine in a very novel way that would compete with Google in a way that we haven’t necessarily fully understood.
So let’s talk about that. So the first thing is federated search. Consider how you do a search on Google. You type in “Madrid,” and it goes and it looks in Google’s one giant database. That database has a web index and a knowledge index and all of that stuff. But it’s essentially looking in Google’s own singular database. With Apple, it’s tempting to think, okay, well, they’ve got a much smaller web index than Google.
They can just go and look in that. But what I actually think is they’ll use a federated search approach. Federated search is when you delegate a search to multiple different providers and then aggregate the results. In Apple’s case, I think they can delegate the search to all the apps on your device. Basically, you do a search for “Madrid,” and rather than just looking in the web index, it might say, well, this Madrid search, the intent might be to book a hotel, so I’m going to look in the Hotels.com app.
The intent might be to book a flight, so I’m going to look in the Skyscanner or the Lufthansa app. It might be to learn a language, so Duolingo. It might be to get a travel guide, so it looks at the Lonely Planet app. The point being that they would ask all of these apps and then blend the results from multiple different sources together. But how would they know what apps are on your device and what intents those apps cater to?
In 2020, they rolled out a change to the operating system that allowed apps installed on your device to register a list of intents that they can cater to, which is very interesting. It seems like 2020 was the year where they started launching a lot of these things, setting up for this potential search engine. So you would get a search that went to their web index, but also searched through the apps on your device and blended those results together.
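As a minimal sketch of that federated flow (all the names and scores here are hypothetical, not Apple’s actual API): installed apps register the intents they can handle, a query fans out to every matching provider plus the web index, and everything comes back as one relevance-sorted list.

```python
# Hypothetical sketch of federated search: apps register intents,
# a query is delegated to every matching provider, and the results
# are blended into a single list sorted by relevance.

# A result is (relevance_score, source, title).
PROVIDERS = {
    "book_hotel":   lambda q: [(0.91, "Hotels.com", f"Hotels in {q}")],
    "book_flight":  lambda q: [(0.84, "Skyscanner", f"Flights to {q}")],
    "travel_guide": lambda q: [(0.88, "Lonely Planet", f"{q} guide")],
}

def classify_intents(query):
    # A real system would use an intent classifier; for illustration,
    # every registered intent matches here.
    return list(PROVIDERS)

def federated_search(query, web_index):
    results = list(web_index(query))  # the search engine's own web index
    for intent in classify_intents(query):
        results.extend(PROVIDERS[intent](query))  # delegate to each app
    # One blended list, sorted by relevance, regardless of source.
    return sorted(results, reverse=True)

web = lambda q: [(0.80, "web", f"{q} - official site")]
for score, source, title in federated_search("Madrid", web):
    print(f"{score:.2f} {source:14} {title}")
```

The key design point is the last line of `federated_search`: results are ranked purely by relevance, so a hotel app result can outrank a web result in the same list.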
As further evidence that they will blend these results: people have managed to reverse engineer the API Apple use for their web index, and when you do searches, you get one list of results. You don’t get web results and knowledge results. You get one big list of results, sorted by relevancy, and it might be news, web, web, maps, whatever.
The point being it shows Apple’s thinking. They’re concerned more with relevancy than where the results came from, and they’re happy to blend them together. Okay, so they’ve got this federated search approach. So let’s talk about how these searches can also have custom UIs. So when they rolled out the change to allow an app to cater to specific intents, that app can also tell the phone, oh, when you’re catering to this specific intent, here’s a snippet of user interface you can use outside of the app to cater to that in the best possible way.
As an example, if you ask Siri to split a check, it understands that the calculator app on your phone can cater to that intent. But it doesn’t send you to the calculator app. Instead it shows a little custom UI snippet right there, in the search results, where it shows you how to split that check. So apps that have been installed on your phone can say, “I can deal with this sort of intent, and this is the best way to present that result to users.”
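To make that concrete, here’s a hypothetical sketch (this is not Apple’s SiriKit API; the names are invented for illustration) of an app declaring an intent it handles together with a snippet of UI the OS can render right in the search results, without opening the app:

```python
from dataclasses import dataclass

# Hypothetical sketch: an app registers an intent handler that returns
# a small UI snippet the OS can show inline in the search results.

@dataclass
class UISnippet:
    title: str
    body: str

def split_check(total, people, tip_pct=0.0):
    # The "calculator" handler: computes each person's share of the check.
    share = total * (1 + tip_pct / 100) / people
    return UISnippet(
        title="Split check",
        body=f"${share:.2f} each ({people} people)",
    )

# The app declares which intents it handles and how to render them.
INTENT_REGISTRY = {"split_check": split_check}

snippet = INTENT_REGISTRY["split_check"](total=90.0, people=3, tip_pct=20)
print(snippet.body)  # prints "$36.00 each (3 people)"
```

The point of the registry is that the OS never needs to launch the app: it looks up the intent, calls the handler, and renders the returned snippet in place.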
It goes one step further. At the same time, they also launched App Clips. App Clips are what Apple describe as small parts of an app. What’s important about App Clips is that they get streamed to your device. So you can scan a QR code or click a link and it will go straight into the App Clip. There’s no need to install a full application on your device.
App Clips are designed to do one small part of what a full app would do, but they’re basically frictionless to load up. So, an example: imagine you arrive in a city you’ve not been to before. You’re running late. You do a search for “scooter,” and rather than just searching the apps on your device, Apple have got this database of App Clips, drawn from the almost 2.5 million apps in their App Store, so they could also serve that intent.
So you search for “scooter,” Apple say, “Yes, here’s an App Clip that can help you with that.” You press the button and you go straight into the app. It directs you to a nearby scooter. You pay for it with Apple Pay. You get on the scooter and off you go. This is a mobile native experience. You didn’t go to any website. You didn’t install an app. Everything happened right inside the search results. So Google have been moving to this mobile first paradigm, but it’s still web first. With Apple, there’s a potential for a mobile native experience. This is something that Google could only dream of. What is also true is that this federated search approach would allow Apple to completely bypass Google’s strengths.
Google’s strengths are the fact that they’ve got this huge web index and they’ve been building that for 25 years or whatever. With a federated search approach, Apple can cater to many of the same intents without needing to replicate Google’s web index and their rankings and all of that magic. So if you’re going to build a search engine, you probably also want to be able to personalize results.
How are Apple going to do that, especially considering that they position themselves as the privacy-centric tech company? So first, quickly consider the difference between Google Photos and Apple Photos. With Google Photos, they do machine learning on all of your photos on their servers in the cloud, allowing you to say, “Oh, show me my photos of animals,” for example.
You can also do “show me my photos of animals” with Apple Photos, but all of the machine learning happens on your device. That shows the two different paradigms. So when you do a search, Google will serve you some personalized search results based on your history or whatever it might be, but it’s all happening in the cloud. What I think Apple will do is serve you a list of potential results from their web index, and then User A and User B would see a different subset of those, chosen by their phone using on-device machine learning that understands your preferences, your history, what apps you’ve got installed, and so on.
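A minimal sketch of that split, with all the names and affinity numbers invented for illustration: the server sends every user the same candidate set, and each device picks and orders its own subset using a preference model that never leaves the phone.

```python
# Hypothetical sketch: the server sends everyone identical candidates;
# personalization happens on-device with a local preference model.

CANDIDATES = [  # from the web index, the same for every user
    {"title": "Madrid hotels", "topic": "travel"},
    {"title": "Real Madrid news", "topic": "sport"},
    {"title": "Madrid history", "topic": "reference"},
]

def rerank_on_device(candidates, local_prefs, k=2):
    # local_prefs: topic -> affinity learned on the device itself.
    scored = sorted(candidates,
                    key=lambda c: local_prefs.get(c["topic"], 0.0),
                    reverse=True)
    return scored[:k]

user_a = {"sport": 0.9, "travel": 0.2}      # follows football
user_b = {"travel": 0.8, "reference": 0.5}  # planning a trip

print([c["title"] for c in rerank_on_device(CANDIDATES, user_a)])
print([c["title"] for c in rerank_on_device(CANDIDATES, user_b)])
```

Two users issuing the same query see different orderings, yet the server only ever saw the query, not the preferences.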
The final thing is that, in terms of personalization, Apple have a potential advantage here too because with the federated search approach, Apple can do what I call fully authenticated personalization. If you do a recipe search for a schnitzel recipe on Google, you’ll get some personalized results, but it will be from the web index. If you do a schnitzel recipe search on your device, this federated search approach will allow Apple to look into your “private” databases.
If you’ve got the Recipe Keeper app, it will see, oh, yes, you’ve got a schnitzel recipe in the app. I’m going to pull that out. It might see that your mom sent you a message on Facebook about the best way to make schnitzel, so it will pull that out. This is stuff that Google can’t do. Google can’t look behind the curtain into your “private” databases. Over the last few years, I think Apple have been playing a slow game, laying the pieces to move towards this mobile native sort of experience.
So when I first started talking about this research in mid-2022, at the SearchLove Philadelphia conference, I predicted that Apple would move the search field onto the iOS home screen in order to try to change user behavior and push them towards doing this mobile native sort of search rather than having users go into a web browser to do a search.
Then a couple of weeks later, Apple announced they were doing exactly that, and it rolled out in iOS 16 in September 2022: the search field is right there on your home screen now, trying to push people towards doing this sort of search. What that means for us as SEOs is yet to be seen, but all the pieces seem to line up.
If you’ve got an app, this is definitely something that you should be thinking about. Otherwise, we should all be thinking about what this means for us over the next couple of years as Apple try to shift the user behavior. What that means is still to be seen. Thank you so much.