In the search world we have all become accustomed to Google's continual algorithm updates: trying to make sense of them, being proactive and reactive (Mobilegeddon being one example) and, in some cases, being agile enough to change where needed so our pages and domains don't lose their rankings and traffic.
At the end of April and into early May we saw another update, one which Google would not acknowledge at the time of release – Phantom 2. So what was it, who did it affect and what do you need to change to future-proof against it?
Phantom 1 was discovered and named by Glenn Gabe back in May 2013. While most SEOs were focused on the upcoming Penguin 2.0 update, Gabe was inundated with emails from brands seeing alarming drops in Google organic traffic, some reporting 60%+. He set about delving into the huge amounts of data he had to hand across a number of large domains, and named the update Phantom because it "flew under the radar" of other releases we had been waiting for or knew about.
Fast forward two years, almost to the day, and it seemed Phantom had been updated and re-released: reports of huge decreases (and some increases) were coming in, and Gabe did some more digging.
I am pretty sure that any agency, SEO or internal marketing team looking at the screengrab below would run from the building screaming – it's scary stuff to see that kind of decrease in organic traffic.
Image Source: HTMWeb
The screengrab is obviously from a large site (source unknown), but the release didn't just affect large publishers; it hit sites across the board, and in most instances at page level rather than domain level. Barry Schwartz of Search Engine Land finally had confirmation from Google that they had released an update to their core ranking algorithm, and that it was based around content quality and user signals, not spammy links or backlink profiles.
What did the Phantom 2 update penalise?
From what I have seen it certainly wasn't part of the mobile update that rolled out on 21st April (Glenn Gabe and others have confirmed this), as the changes were seen across both mobile and desktop. So far Phantom 2 seems to:
- Penalise low-ranking pages across domains – although it's not focused at domain level, affected pages have a knock-on effect on the domain's authority
- Be rolled out over a number of weeks – although some sites saw the changes within a few days, others are still reporting changes
- Penalise poor onsite user experience – including too many ads, pop-up ads, stacked videos, thin content, duplicate content, poor design and navigation, and content tag/tag cloud pages
- Target a large category of "how to" websites – HubPages being one of the biggest sites reporting traffic losses, although Paul Edmondson's post seems to convey a different analysis of which quality pages were affected
Who was affected by the update?
Although the update didn't seem to target a particular class of site (content farms, etc.), with Google reporting it was a change to the algorithm itself, a look at the Searchmetrics winners and losers below suggests that a lot of the "big" losers were built around "how to" style content. And as with all updates, it isn't just the big sites that get penalised – it's all sites.
Interestingly, Pinterest and LinkedIn both appear in the losers list, which could suggest duplicate content on both sites was the reason for the downward shift.
What changes you need to consider
There seems to be a trend in the reporting (beyond the organic traffic changes alone) that other factors have been taken into account: the user experience and the user signals that Google monitors around a website's build, content and engagement.
Here are a few of the issues the "big data" sites are reporting that could see you penalised by Phantom 2 (the list is not exhaustive).
- Thin content – create longer, unique content for your pages – think about why or what the user is looking for
- Minimise tag pages – where it's a thin list of more tags, or just headlines from the content it contains
- Dwell time and pages viewed onsite are user signals that Google is taking into account around engagement with content and sites – the longer users spend on site, the better
- Increase your word counts, optimise images and/or videos and add more comprehensive and relevant related wording (for search queries, not just keywords)
- Don’t stack videos on a page – where a page has multiple videos with little copy content around them
- Auto-playing videos are also viewed negatively, as they disrupt the user experience
- Limit ads above the fold and minimise pop-up ads – they disrupt the user journey (and possibly increase bounce rates)
- Duplicate content – an issue that's been around for a long time – make sure all your content is unique (and add a canonical tag where needed); RottenTomatoes and HubPages seemed to suffer badly from it
- User-generated comments – pages that carry a huge amount of irrelevant or old comments, or that haven't paginated them, seem to be targeted by the update – clear out any really old or irrelevant comments
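A few of the checks above can be automated as a first-pass audit of your own pages. Below is a minimal sketch, using only Python's standard-library `html.parser`, that flags thin content, a missing canonical tag, and heavy video embedding. The word-count and video thresholds are my own illustrative assumptions – Google has never published such figures – so treat the output as a prompt for manual review, not a verdict.

```python
from html.parser import HTMLParser

class PageAuditor(HTMLParser):
    """Collects simple on-page quality signals from raw HTML."""
    def __init__(self):
        super().__init__()
        self.words = 0           # visible word count
        self.has_canonical = False
        self.videos = 0          # <video> and <iframe> embeds
        self._skip = 0           # depth inside <script>/<style>

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "link" and attrs.get("rel") == "canonical":
            self.has_canonical = True
        if tag in ("video", "iframe"):
            self.videos += 1
        if tag in ("script", "style"):
            self._skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1

    def handle_data(self, data):
        if not self._skip:
            self.words += len(data.split())

def audit(html, thin_threshold=300, max_videos=2):
    """Return a list of issue strings for a page's HTML.
    Thresholds are illustrative assumptions, not published Google values."""
    p = PageAuditor()
    p.feed(html)
    issues = []
    if p.words < thin_threshold:
        issues.append(f"thin content ({p.words} words)")
    if not p.has_canonical:
        issues.append("no canonical tag")
    if p.videos > max_videos:
        issues.append(f"{p.videos} embedded videos - check copy around them")
    return issues

sample = "<html><head><title>Demo</title></head><body><p>Short page.</p></body></html>"
print(audit(sample))  # flags thin content and the missing canonical tag
```

Running this over a sitemap's worth of fetched pages would give you a shortlist of URLs to review by hand against the bullet points above.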
How can you tell if you have been penalised?
Take a good look through your analytics and view organic traffic to your site from 29th April onwards. How does it stack up against a similar period? If it's about the same as expected, then great. If you have seen a downturn in organic traffic, delve deeper into which pages have seen the drop-off, find the engagement problems and tackle the issues on those pages as described above, with a focus on the user experience.
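To make the before/after comparison concrete, here is a minimal sketch assuming you have exported daily organic session counts from your analytics tool for two equal periods (say, the fortnight before and after 29th April). The 20% alert threshold and the example figures are my own assumptions for illustration.

```python
def organic_change(baseline_sessions, current_sessions, alert_pct=20.0):
    """Compare average daily organic sessions across two equal periods.

    baseline_sessions / current_sessions: lists of daily session counts,
    e.g. exported from your analytics package.
    Returns the percentage change; negative means a drop.
    alert_pct is an illustrative threshold, not a Google figure.
    """
    base = sum(baseline_sessions) / len(baseline_sessions)
    curr = sum(current_sessions) / len(current_sessions)
    change = (curr - base) / base * 100
    if change <= -alert_pct:
        print(f"Investigate: organic traffic down {abs(change):.1f}% vs baseline")
    return change

# Hypothetical example: ~1000 daily sessions falling to ~600 after the update
before = [1000, 980, 1020, 1010, 990]
after = [620, 600, 590, 610, 580]
print(round(organic_change(before, after), 1))  # -40.0
```

If the change trips your threshold, the next step is the page-level digging described above, rather than a site-wide panic.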
It's still about the user
I know, I know – we have long talked about building sites and content for users, driving the optimum user experience and so on. But with the Phantom 2 update it's more apparent that Google is leaning on the signals fed back to it about these pages: you need to ask whether your site, and "all" of its pages, really add to the user experience.
With this in our quiver, we really do need to formulate a strong content strategy based around the different research and buying stages a customer may be in – being there when they need the brand, rather than pushing product or, as we have seen, "how to" pieces of content. Secondly, we need to look at the user experience as a whole, making sure the content, navigation, pages and ads all deliver something of value to the end user.
I understand that Google is trying to achieve the optimum result for the user searching for information and, of course, to penalise those sites that "might" be trying to manipulate the algorithm and rankings for search terms. But it does appear that yet another update (this time to the core algorithm, rather than a filter like Panda or Penguin) has come along and "hit" a lot of sites without warning. Wouldn't it be great to have an understanding of what the next big update will bring from Google themselves – or am I just dreaming of a day when Google and brands can collaborate?