‘Write for people, not robots’ has been the clarion call to copywriters for at least six years (since the Panda update), but is that still the case? The answer, like that to any question posed by a History channel documentary, is both yes and no.
When the Panda update was first released into the verdant bamboo forests of the 2011 internet, it found much with which to keep itself occupied: keyword stuffing, and scraped and thin content galore. It gorged itself on this fibrous feast of malpractice, dispensing penalties and punishments hither and yon.
While Panda has since reached senescence and, unlike its namesake, reproduced quite happily in the intervening years, there has been a blossoming of technology that has somewhat muddied the waters of its original task.
Searches for ‘keyword density’ in steady decline since 2011 – Google Trends
Beginning with the introduction of Hummingbird in 2013 – which broadened the Google algorithm’s lexicon – and continuing with gradual improvements to the search engine’s capacity for latent semantic analysis, there has been a series of developments that have seen the algorithm’s ability to parse context and meaning grow considerably.
The culmination of this advancement, to date, has been RankBrain – an artificial intelligence system designed to enhance the relevance of SERPs, and one which Google has referred to as the third most important ranking factor overall (though there is some question over the weighting of each factor across queries).
To clarify, the call to put readers before robots was good and timely advice – for too long, in an effort to game the SERPs, web copy was restricted by the insistence of many agencies and in-house departments on keyword density targets and strict word count limits.
Panda takes its toll – searches for ‘safe keyword density’ grow as Panda updates roll out – Google Trends
However, as various advancements emerge, it’s necessary for marketers to realise that robots are an increasingly important audience for our work – and one we must often please before it reaches our audience’s human component. Happily, the Venn diagram of the content needs of people and robots overlaps more and more over time.
Last month, Google released five guides outlining what it considers to be high quality content. This month, Search Engine Land saved a lot of people a lot of time by publishing some key takeaways from these guides – my only qualm with the piece being that it states these guidelines are meant for ‘technical content creators’, implying that there could be another variety in modern marketing.
Realistically, all content creators must now be technical to some extent to ensure the success of their content. While there will always be outliers that succeed seemingly without any attention paid to rankings or audience, they are, and will remain, outliers.
Beside the fact that my own writing falls foul of a number of Search Engine Land’s top tips (I’ll let you guess and we’ll compare notes somewhere along the line), what is noticeable is that along with some standard writing tips, there are a number which are clearly aimed at simplifying the meaning of content for machines.
Rough overlap of human and robot requirements for information parsing.
Using <strong> to delineate importance and <b> for visual emphasis, for example, will make zero difference to the average human reader – both simply appear bold to the eye – but it will help a machine to better understand your writing.
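As a minimal illustration (the sentences are my own, invented for the example), both of these lines render identically in a browser, but only the first tells a parser that the bolded words actually matter:

```html
<!-- Semantic importance: a machine can infer that the deadline is the key fact -->
<p>Submissions close on <strong>Friday 31 March</strong>.</p>

<!-- Purely visual emphasis: bold as a stylistic choice, carrying no added meaning -->
<p>Key terms in this glossary appear in <b>bold</b> type.</p>
```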
In addition, and what I’m considering a personal affront, there are tips against long sentences and overblown metaphors alongside those against slang, technical jargon and exclamation points – all of which, after all, are very difficult for machines (and, if critics of my writing are to be believed, for humans) to understand.
Slang and jargon can have meanings peculiar to regions and to industries, for example, while exclamation points offer no insight into where the emphasis of the exclamation should be placed (unlike, say, <strong>), and extended metaphorical conceits – like a panda on the London Underground – can be difficult to contextualise.
Essentially, what we’re seeing in the guidelines at the moment is a simplified lesson plan for the artificial intelligences that will one day rule the SERPs and, while we have to ensure that content is written for a human audience, so long as there are limitations to what the Google algorithm can understand, we need to help it on its way. While the machine is learning, we have to make sure that we’re not pitching above its head.
For those in digital marketing, this means learning to juggle the needs of both our human and mechanical masters more efficiently. This does not mean disregarding your human audience, but it does require us to signpost meaning, wherever possible, behind the scenes by rendering our content in a manner that is easily converted to data – through proper deployment of JSON-LD, HTML and any other relevant markup.
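A sketch of what that behind-the-scenes signposting can look like, using a Schema.org Article block in JSON-LD – note that the headline, author and dates here are placeholders I’ve invented for illustration, not values from any real page:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Article",
  "headline": "Writing for People and Robots",
  "author": { "@type": "Person", "name": "Jane Doe" },
  "datePublished": "2017-05-01",
  "description": "How content creators can serve human readers and search algorithms at once."
}
</script>
```

The human reader never sees this block, but it hands the machine an unambiguous, pre-parsed summary of exactly the things it would otherwise have to infer from the prose.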
As voice search becomes conversational and seeps into everyday life (as it will – who thought talking on phones in public would take off? I certainly didn’t, and I’m an excellent barometer of how people don’t think), and as Google answers more queries directly in SERPs than ever, page one – and, more specifically, the top of page one – is going to be vital for certain query types.
The more Google’s AI is able to understand, the more it is going to render visits to pages irrelevant (for information, at least, though there are plenty of people arguing that it may be the same for transactions).
Interest over time in AI, VR and AR shows a steady increase in interest around augmentation – Google Trends
The future of marketing, in my opinion at least, lies in augmented reality and voice interaction with whichever digital assistant you commit to, with the assistant plucking sections of various websites to display or read to you in response to your queries (or whichever variety of advertising you’ve opted into as part of the privilege of ownership).
That means content creators are going to need to get to grips with the technical side of SEO (or take advantage of the content writer’s new best friend, Google Tag Manager) – because AR is likely to become a big thing before machines are able to read like humans, and it’s going to be easier to avoid being caught out by the coming changes if we begin speaking their language now.