# SMX London: Hardcore SEO Power Tools
Estimated reading time: 7 minutes, 6 seconds
SEO has grown rapidly over the last year as it has become more connected to all the other Internet Marketing disciplines. SEO grows more complex, and the sites we work on continue to grow in size.
It’s more important than ever to have a set of high-quality tools that give us both a competitive edge and save our increasingly precious time.
During this session at SMX London, we had Steve Lock of Linkdex, Pete Wailes of SEOGadget, Stacey Cavanagh of Tecmark and Dixon Jones of Majestic SEO showcasing their toolbox of hardcore SEO power tools.
Steve Lock, Linkdex
First sharing his toolbox was Steve Lock.
Steve said that the best tools are those which help him automate repetitive tasks and make his work easier, so he can concentrate on what really matters.
Trello, as Steve said, is a powerful collaboration tool that integrates smoothly with Google Drive, Harvest, Dropbox and many other tools.
It is extendable with many awesome plugins, such as calendars, agile-marketing tools and export plugins. Moreover, we can first build our spreadsheets in Google Drive and bulk-upload them to Trello via its API. Finally, Trello has a solid mobile app and, best of all, it is totally free.
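The bulk-upload idea can be sketched roughly like this in Python, assuming a CSV export of the Google Drive spreadsheet and Trello's card-creation endpoint (`POST https://api.trello.com/1/cards`). The sheet contents, list ID, key and token below are all placeholders, not real credentials:

```python
import csv
import io
import urllib.parse

# Hypothetical CSV export from a Google Drive spreadsheet: one Trello card per row.
SHEET_CSV = """name,desc
Audit homepage,Check titles and canonicals
Outreach list,Collect 20 blogger contacts
"""

def build_card_requests(csv_text, list_id, key, token):
    """Turn spreadsheet rows into Trello 'create card' request URLs
    (each would be POSTed to https://api.trello.com/1/cards)."""
    requests_out = []
    for row in csv.DictReader(io.StringIO(csv_text)):
        params = {
            "idList": list_id,   # the target Trello list
            "name": row["name"],
            "desc": row["desc"],
            "key": key,          # your Trello API key
            "token": token,      # your Trello API token
        }
        requests_out.append(
            "https://api.trello.com/1/cards?" + urllib.parse.urlencode(params)
        )
    return requests_out

# Printed here instead of sent; a real run would POST each URL with urllib.request.
for url in build_card_requests(SHEET_CSV, "LIST_ID", "KEY", "TOKEN"):
    print(url)
```

Keeping the request-building separate from the sending makes the batch easy to inspect before anything hits the API.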
But one of the best tools, which Steve and many other people forget about, is our own browser. He uses Google Chrome as his preferred browser, so the perfect browser setup he proposed was Chrome based:
- Use different Chrome profiles to switch between general, auditing, link building, etc.
- SEObook extensions are great too if we use Firefox.
Then Steve showed us how he uses RSS to better organise his work.
Steve uses RSSOwl, hacks the Twitter Search API to create advanced RSS/Atom feeds, hooks them up with feeds from Google, Topsy and Social Mention, then filters all the information and creates keyword-based alerts.
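The keyword-alert step can be sketched in a few lines of Python. This is a minimal illustration, not Steve's actual setup: the inline feed and keyword list are made up, and a real version would fetch live feeds rather than parse a string:

```python
import xml.etree.ElementTree as ET

# A tiny inline RSS feed standing in for output from Topsy, Social Mention, etc.
FEED = """<rss version="2.0"><channel>
  <item><title>Google updates Penguin</title><link>http://a.example</link></item>
  <item><title>Weekend recipes</title><link>http://b.example</link></item>
  <item><title>New link building study</title><link>http://c.example</link></item>
</channel></rss>"""

ALERT_KEYWORDS = {"google", "link building"}  # keywords we want alerts for

def keyword_alerts(feed_xml, keywords):
    """Return (title, link) for every feed item whose title matches a keyword."""
    hits = []
    for item in ET.fromstring(feed_xml).iter("item"):
        title = item.findtext("title", "")
        if any(kw in title.lower() for kw in keywords):
            hits.append((title, item.findtext("link", "")))
    return hits

for title, link in keyword_alerts(FEED, ALERT_KEYWORDS):
    print(f"ALERT: {title} -> {link}")
```

The same filter works unchanged whether the XML comes from Twitter search, Google or Social Mention, which is the point of funnelling everything through RSS.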
Building Your Own Tools
Coming to the vast universe of “Do It Yourself” tools, Steve suggested that we all start learning how to efficiently create tools ourselves, using guides. Alternatively, we can always use tools that have been made by others, like:
Less known tools Steve suggested are:
- Rafflecopter to build relationships with bloggers
- Gephi/NodeXL to visualize social networks and link profiles
- RapidMiner for DIY crawling and data mining
- RSSOwl (again)
- SEOgadget toolset
- Social Mention
- Linkdex Network
Pete Wailes – SEOgadget
Next up, we had Pete Wailes from SEOgadget taking to the stage. At SEOgadget, building tools is a strategic asset that lets them provide a more effective service to their clients.
Pete presented us a classic situation:
We have 9,000 links and we need to get all their metrics
Doing that manually would mean literally days of work, even if we were very fast. There are plenty of tools that offer useful data and metrics, but every tool is independent: to complete the task above, we would need to use many different tools.
When we have to deal with data reporting, which normally is a compilation of boring and repetitive tasks, we must find other ways of doing this than the manual way.
The first option is using our most used tool: Excel.
First of all we should create a wireframe based on our needs, then create an Excel template or use an existing one, such as SEO Tools for Excel by Bosma or the one SEOgadget created and offers to everybody, and finally we must test it.
Just by using Excel we can save up to 96.4% of the time previously allotted to manual data reporting, reducing it to a more digestible 2.5 hours.
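Taking Pete's figures at face value, the implied manual baseline is easy to back out: if 2.5 hours represents a 96.4% saving, the original job took roughly 69 hours, which squares with "days of work":

```python
# Back-of-the-envelope check of Pete's figures (assuming the 96.4% saving
# and the 2.5-hour result are measured against the same manual baseline).
saving = 0.964
excel_hours = 2.5

manual_hours = excel_hours / (1 - saving)  # implied manual baseline
print(round(manual_hours, 1))  # roughly 69.4 hours, i.e. days of work
```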
The second option is creating your own tools.
For that you first build an MVP in Excel, then find a developer (PHP, Python or Ruby) and finally rebuild everything as a proper tool.
Doing so you can decrease the time needed for data collecting to just 15 minutes!
The third option is scaling tools by parallelising the collection of data.
The result? 10 seconds needed instead of days!
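A minimal sketch of that third option in Python, using a thread pool to fan out the per-URL lookups; `fetch_metrics` here is a hypothetical stand-in for whatever metrics API you actually call, so the numbers it returns are meaningless:

```python
from concurrent.futures import ThreadPoolExecutor

def fetch_metrics(url):
    """Stand-in for a real API call (Majestic, Mozscape, ...); a production
    version would perform an HTTP request here and return real metrics."""
    return {"url": url, "links": len(url)}  # dummy metric

urls = [f"http://example.com/page{i}" for i in range(9000)]

# Fan the 9,000 lookups out over a pool of worker threads. Because the real
# work is network-bound, wall-clock time collapses instead of adding up.
with ThreadPoolExecutor(max_workers=32) as pool:
    results = list(pool.map(fetch_metrics, urls))

print(len(results))  # one row of metrics per link
```

Threads suit this job because each lookup spends almost all its time waiting on the network; for CPU-bound work a process pool would be the analogous choice.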
What tools for developing tools?
As for languages, Pete suggested using the most common ones:
Remember to always create the API service first, which will give life to your tools, and then start building the front end(s) of the tools themselves.
This is something SEOgadget learnt from failures in creating tools that looked awesome but whose core was less than efficient.
When creating a tool, it is always better to build a demo than to write a specifications doc, because doing it fast always beats doing it slow.
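The API-first idea can be illustrated with a toy sketch: one core function holds all the logic, and any number of thin front ends (a JSON endpoint, a text report, a UI) call it. The metric names and values here are invented for illustration:

```python
import json

# --- API layer: all the logic lives here, so front ends stay thin ---
def link_metrics(url):
    """Core service. In production this would aggregate data from crawlers
    and third-party APIs; stubbed with dummy values for illustration."""
    return {"url": url, "domain_authority": 42, "backlinks": 1337}

# --- Front end 1: JSON response (what another tool would consume) ---
def api_response(url):
    return json.dumps(link_metrics(url))

# --- Front end 2: human-readable report built on the same API ---
def text_report(url):
    m = link_metrics(url)
    return f"{m['url']}: DA {m['domain_authority']}, {m['backlinks']} backlinks"

print(api_response("http://example.com"))
print(text_report("http://example.com"))
```

Because both front ends share `link_metrics`, fixing or improving the core improves every tool built on top of it, which is the payoff of building the service before the interface.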
Stacey Cavanagh – Tecmark
Third on stage was Stacey Cavanagh of Tecmark. She is absolutely not a coder, so what she looks for in tools is help in working smarter with less energy, which she can then spend on the other things she has to do.
Do you need an infographic, but have no (or a very small) budget?
Or do you need data, but the client is not giving you any?
Do you need to write something, but lack inspiration?
Did you create awesome content, but don’t know who to show it to?
Have you pitched to bloggers and journalists, but not had any response?
- Don’t use tools!
- USE THE PHONE!
Did you publish great content but not had anyone looking at it?
Do you have pretty pictures, but you don’t know who’s using them?
- Reverse Google Image Search
- Image Raider
Is keyword research giving you migraines?
Do your reports suck?
And what about tools directly in our own browser? Supercharge your Chrome browser with extensions like SEO for Chrome or SEO SERP.
Finally, Stacey presented some tools that can make our SEO job easier when we are on the move:
- SEO Rankings
- Found SEO Tool
Dixon Jones – Majestic SEO
Fourth on stage was Dixon Jones from Majestic SEO, who started with a question: what differentiates an average tool from a hardcore one?
Dixon explained that there is a “Maslow’s hierarchy of tools”.
At the base we have Site Scrapers, in the middle Big Data Tools, and at the peak of the triangle we find the Meta Tools, which rely on the two tiers beneath them.
Going back to the scrapers, we find they have a problem when it comes to Google: Google doesn’t like them at all.
Scrapers break Google’s terms of service, they are short term, they access non-proprietary information and, ultimately, their business model may be forfeit.
What are the best site scrapers?
Dixon moved on to Big Data tools, which have a problem of their own: they distort the maths, but they are ultimately needed.
Knowing the traffic to every site puts our own site’s traffic into context. Knowing the link graph of every site reveals weaknesses in our own relationships. And seeing the whole web, not just our own site, helps us see the wood for the trees.
When it comes to Big Data tools, Dixon segments them in Link Analysis tools, Content Analysis tools and Tracking Analysis tools.
- In the first segment we can rely on Majestic SEO (obviously), Ahrefs and OpenSiteExplorer;
- In the second segment we can use Blekko, Bing WMT and Google WMT;
- In the third segment, we have SEMrush, Alexa and Compete.
Finally we have the Meta Tools, which also have a problem: their dependency on the other two sets of tools (Scraping and Big Data tools).
Some of the tools that we can put under this category are:
- Market Samurai
Of all these, Dixon suggested taking a closer look at LinkRisk and Linkdex.