Boost your SEO strategy with these 5 Google Tag Manager hacks: classify your blog posts, implement structured data, add rel=canonical tags, and more.
When a site migration goes wrong and a site loses traffic, it’s usually because of one (or more) of 5 common reasons. Daniel Bianchini explores these reasons and offers SEO remedies for each one.
Google Search Console’s Crawl Errors report has a special section for Google News. Read about the causes and solutions for some common News crawl errors.
According to Pete Campbell, the future of SEO will revolve around advising brands on their open data strategy, delivered via apps that leverage that data through APIs.
Google, Microsoft, Mozilla and Apple are keen for every brand to make its website compatible with the relatively new HTTP/2 protocol. But how?
The latest tool from Screaming Frog is a log file analyser that allows you to combine log file data with any other set of URL data for meaningful analysis.
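To illustrate the kind of analysis log file data makes possible, here is a minimal Python sketch (not Screaming Frog’s tool; the log line and URL below are invented) that counts Googlebot requests per URL from access-log lines in the common combined log format:

```python
import re
from collections import Counter

# Regex for the Apache/Nginx combined log format.
LOG_PATTERN = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"'
)

def googlebot_hits(log_lines):
    """Count Googlebot requests per URL from raw access-log lines."""
    hits = Counter()
    for line in log_lines:
        match = LOG_PATTERN.match(line)
        if match and "Googlebot" in match.group("agent"):
            hits[match.group("url")] += 1
    return hits

# Invented sample line for illustration:
sample = ('66.249.66.1 - - [10/Mar/2016:10:00:00 +0000] '
          '"GET /blog/seo-tips HTTP/1.1" 200 512 "-" '
          '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"')
print(googlebot_hits([sample]))  # Counter({'/blog/seo-tips': 1})
```

Joining counts like these against crawl or analytics data for the same URLs is what turns raw logs into meaningful analysis.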
Speed is one of the most underestimated aspects of a website, and something many (including ourselves) struggle with. What to do?
With DeepCrawl, sites can be crawled in a similar way to search engine bots. Fili Wiese dives deep into how to use it to optimise search signals and make a site visible in Google’s organic search.
Crawl optimisation is about making sure search engines can crawl all the right content without wasting time on the wrong pages. Barry Adams talks about common crawl optimisation issues and provides fixes to ensure Google doesn’t waste its crawl budget on non-indexable pages.
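One basic crawl-optimisation check can be scripted: Python’s standard-library robots.txt parser will tell you whether a given URL is open to Googlebot. The rules and URLs below are made up for illustration:

```python
import urllib.robotparser

# Hypothetical robots.txt blocking internal search result pages,
# which are a classic source of wasted crawl activity.
rules = """
User-agent: *
Disallow: /search
"""

parser = urllib.robotparser.RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))   # True
print(parser.can_fetch("Googlebot", "https://example.com/search?q=a"))  # False
```

Note that robots.txt only stops crawling, not indexing; pages you want kept out of the index entirely still need a noindex directive or similar.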
Diagnosing technical SEO issues on a website can be quite challenging. In this post, Daniel Bianchini provides 17 of his favourite tools to help with identifying and fixing technical SEO problems.