SEO and the Multi-layered Search Experience

22 March 2010 BY Barry Adams

There is a pervasive mindset in the internet community that search engines such as Google are an entirely different class of website. Instead of destinations in and of themselves, search engines are seen as gateways to real content. The search experience, as many search engine optimisers define it, is limited to what users do in Google, Bing, and the like.

However, I’d argue that this is a limited perspective that no longer accurately reflects how users find content on the web. Instead I would call Google and its rivals first tier search engines – the first step in a multi-layered search experience.

This is generally what we envisage a search experience to be: the user loads up a web browser, goes to Google or another search engine, types in a query, reviews the results, clicks on a top result, and goes on to consume the content. We can expand this simplified model with multiple search queries and a little bit of results sampling, and that, as far as many SEOs are concerned, is that.

This model of a search experience is, in my opinion, grossly inaccurate. Search engines like Google no longer serve up the final destinations that users desire. Instead, search engines are a first step, a first query space, towards the final goal.

Instead of finding the actual content they desire, users utilise Google to find other, specialised search engines. An example: a user searches for david coulthard belfast. The Google UK SERPs are dominated by sites that, in their own specialised way, are search engines as well as content platforms.

That first page of results includes a local news site, sports news sites, and video sites. Each one of these sites is a content delivery platform, but each of them is also a specialised search engine – for local news, for sports news, and for video content.

For each of these sites, the search experience is an integral part of the website. Because of the enormous amount of content they host, they need a strong internal search experience to enable users to quickly find the content they want to see. As such, each of these sites is a search engine of its own content, just as Google is a search engine of (nearly) all web content.

We see these second tier search engines everywhere. They’re so common that we expect every website to have a solid search function. If it doesn’t, we miss it and we complain.

It goes one step further – the most popular websites, those that gather the most visits, are those that have made the search experience a central focus of their website. From Facebook and Twitter (social search) to financial comparison websites (insurance search), from YouTube (video search) to Wikipedia (facts search), all of these sites have robust internal search engines that form a core aspect of their functionality.

When users search for jobs, property, financial products, books, used cars, flights, restaurants, etc., they often start with Google, and then proceed to a second tier search engine before finding the actual content they desire. Combine this with a growing complexity of search queries, with longer keyphrases gaining traction, and we can see the outlines of a much more specialised search space.

Google understands this. That’s why they are expanding into these specialised vertical search query spaces: image search, video search, and news search, as well as financial products, books, knowledge, and so on.

As SEOs we cannot ignore this. If we continue to focus primarily on the first tier search engine experience, we will miss the opportunities for dominance in the second tier search space.

I believe that, as optimisers, first and foremost we exist to deliver relevant traffic to our client websites. And that means we have to look beyond Google and Bing, and embrace specialised search engines. Our craft as SEOs has evolved from day one, and we may soon see the biggest evolution of our industry to date.

AUTHORED BY:

Barry Adams is one of the editors of State of Digital and is an award-winning SEO consultant based in Belfast, delivering specialised SEO services to clients across Europe.
