Good morning from SMX London, folks! Today we’ve got Dave Sottimano of Distilled and John Straw from Linkdex talking us through Search Analytics and Competitive Intelligence. I had a sneak peek at Dave’s presentation and know he’s got a fun tool to give away, and am also very excited to hear what John has to say!
Without further ado….
Top Tips from David
1. You need to make competitor analysis a process, and one that can be broken down into actions.
2. Your competitors offline are not always the same as your competitors in the SERPs. You need to figure out who you are competing against online irrespective of what you know about them.
3. Dave built a tool to help you find competitors by keyword, which is available here (a really handy XPath scraper built in Google Docs; definitely make use of this one!)
4. Use the sheet to find out who is coming into the space (if you run the tool every few weeks). You can also spot potential link targets if you see someone ranking there who isn’t actually selling your product.
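The "run it every few weeks" idea boils down to diffing snapshots of who is ranking. A minimal sketch, using made-up domain lists rather than real data:

```python
# Two illustrative snapshots of which domains appear for your keyword set.
last_month = {"competitor-a.com", "competitor-b.com", "retailer-c.com"}
this_month = {"competitor-a.com", "retailer-c.com", "newcomer-d.com", "blog-e.com"}

new_entrants = this_month - last_month  # domains that appeared since the last run
dropped_out = last_month - this_month   # domains that fell out of the rankings

print(sorted(new_entrants))  # ['blog-e.com', 'newcomer-d.com']
print(sorted(dropped_out))   # ['competitor-b.com']
```

Anyone in `new_entrants` who isn’t actually selling your product is a candidate link target rather than a competitor.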
5. Use a valid metric. Dave uses rankings because he doesn’t trust any of the available traffic data for competitors.
6. What you need: your keyword list (actual and hopeful). Find rankings for you and your top competitors, and find the individual URLs ranking for each keyword. Work out which section of each website is ranking and map out the landscape: what advantages do you have, and which sections of other sites are ranking that you might want?
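Mapping ranking URLs back to site sections can be as simple as grouping by the first path segment. A sketch with hypothetical keyword and URL data:

```python
from urllib.parse import urlparse
from collections import defaultdict

# Hypothetical data: each keyword mapped to the URL that ranks for it.
rankings = {
    "red widgets": "https://example.com/shop/red-widgets",
    "blue widgets": "https://example.com/shop/blue-widgets",
    "widget guide": "https://example.com/blog/widget-guide",
}

# Group keywords by the top-level site section of the ranking URL.
sections = defaultdict(list)
for keyword, url in rankings.items():
    path = urlparse(url).path.strip("/")
    section = path.split("/")[0] if path else "(root)"
    sections[section].append(keyword)

for section, keywords in sorted(sections.items()):
    print(section, "->", keywords)
```

Run the same grouping over each competitor’s ranking URLs and you can see at a glance which sections of their site (blog, shop, guides) are doing the ranking work.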
7. Weight their rankings (Dave suggests using the Cornell eye-tracking study or the leaked AOL click-through data, though I would also suggest checking out the latest data from Optify and adding that in as well).
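Weighting rankings just means multiplying each position by an estimated click-through rate (and, optionally, search volume) to get a visibility score. A sketch with illustrative CTR values; the numbers and data are mine, not from the talk, so substitute whichever CTR curve you trust:

```python
# Illustrative CTR-by-position values (not the actual AOL or Cornell figures).
CTR = {1: 0.42, 2: 0.12, 3: 0.08, 4: 0.06, 5: 0.05,
       6: 0.04, 7: 0.03, 8: 0.03, 9: 0.03, 10: 0.03}

def visibility(rankings, searches):
    """Sum of (monthly searches * estimated CTR) across all ranked keywords."""
    return sum(searches[kw] * CTR.get(pos, 0.0)
               for kw, pos in rankings.items())

# Hypothetical data: keyword -> SERP position, keyword -> monthly searches.
our_rankings = {"red widgets": 3, "widget guide": 1}
volumes = {"red widgets": 1000, "widget guide": 500}

print(visibility(our_rankings, volumes))  # 1000*0.08 + 500*0.42 = 290.0
```

Compute the same score for each competitor’s rankings and you have a single comparable number instead of a wall of positions.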
8. Deep dive – use link analysis tools like SEOmoz, Majestic and Y! Site Explorer as well as Screaming Frog.
Top Tips from John
1. Get intimate with your competitors on a meta-scale: get information on a regular basis, get information that is actionable, and then action that data. John believes each of these steps gets progressively harder, and that we are dealing with an industry-wide problem.
2. Open Site Explorer is free(ish) but not very fresh, has nothing like as large an index as Google’s, and only reports link data. Majestic is free, a bit fresher, and nearly as large as Google, but again only reports link data. No one is reporting what’s on the page; there is a real need to get into a competitor’s site and see what is actually going on there.
3. Hire mathematicians to build algorithms that look at what is on a page and classify pages into different types (perhaps not an option for everyone, but very impressive nonetheless). Their team has created a tool that achieves nearly 70% correlation with PageRank.
4. Use the Linkdex and Yoast WordPress plugin integration to help make some sense of this.
5. When working out what type of page you are looking at, use on-page features such as comments, an RSS feed, and share options as indicators, and use them to break pages up into content types.
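The indicator idea above can be sketched as a crude rule-based classifier. The HTML markers here are my own illustrative guesses, not Linkdex’s actual algorithm:

```python
import re

# Hypothetical on-page markers for each page type; a real classifier
# would use many more signals than these two-per-type examples.
INDICATORS = {
    "blog post": [r'class="comment', r'type="application/rss\+xml"'],
    "product page": [r'add-to-(cart|basket)', r'itemprop="price"'],
}

def classify(html):
    """Score each page type by how many of its markers appear in the HTML."""
    scores = {page_type: sum(bool(re.search(p, html, re.I)) for p in patterns)
              for page_type, patterns in INDICATORS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

html = '<div class="comments"><link type="application/rss+xml"></div>'
print(classify(html))  # blog post
```

Even a heuristic this rough lets you split a crawl of a competitor’s site into "editorial" versus "commercial" pages before doing any deeper analysis.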
6. John has said to get in touch if you have a crawler as he is happy to give the algorithms away.
It seems John’s point is the need to build your own tools, as, to a large degree, is Dave’s. I tend to agree on both fronts, but it’s not always going to be possible for everyone at present. I would say, however, that as this is an advanced conference, it should be a strong indicator that any agency not trying to build any of its own tools is probably not making the best use of its time.