Google Search Console’s Crawl Errors report has a special section for Google News. Read about the causes and solutions for some common News crawl errors.
According to Pete Campbell, the future of SEO will revolve around advising brands on their open data strategy: apps that leverage open data through APIs.
Google, Microsoft, Mozilla and Apple are keen for all brands to ensure their websites are compatible with the relatively new HTTP/2 protocol. But how?
The latest tool from Screaming Frog is a log file analyser that allows you to combine log file data with any other set of URL data for meaningful analysis.
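Screaming Frog's tool does this in a GUI, but the core idea (aggregating search engine bot hits per URL from raw access logs) can be sketched in a few lines of Python. This is a minimal illustration, assuming the common Apache/nginx combined log format; field positions vary by server configuration, so adjust the regex accordingly.

```python
import re
from collections import Counter

# Minimal sketch: count Googlebot requests per URL from an access log
# in combined log format. The field layout here is an assumption;
# adapt the regex to your server's actual log configuration.
LOG_LINE = re.compile(
    r'^(?P<ip>\S+) \S+ \S+ \[(?P<ts>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<url>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "[^"]*" "(?P<agent>[^"]*)"$'
)

def googlebot_hits(lines):
    """Return a Counter of URLs requested by user agents claiming to be Googlebot."""
    hits = Counter()
    for line in lines:
        m = LOG_LINE.match(line)
        if m and "Googlebot" in m.group("agent"):
            hits[m.group("url")] += 1
    return hits

sample = [
    '66.249.66.1 - - [10/Mar/2016:10:00:00 +0000] "GET /page-a HTTP/1.1" '
    '200 1234 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '203.0.113.5 - - [10/Mar/2016:10:00:01 +0000] "GET /page-a HTTP/1.1" '
    '200 1234 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]
print(googlebot_hits(sample))  # Counter({'/page-a': 1})
```

Note that user agent strings can be spoofed; a production analysis would also verify Googlebot's IP via reverse DNS, which the full tool set in the article covers more thoroughly.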
Speed is one of the most underestimated aspects of running a website, and something many (ourselves included) struggle with. What to do?
With DeepCrawl, sites can be crawled in a similar way to search engine bots. Fili Wiese dives deep into how to use it to optimise search signals and make a site visible in Google’s organic search.
Crawl optimisation is about making sure search engines can crawl all the right content and don’t waste time on your site. Barry Adams talks about common crawl optimisation issues and provides fixes to ensure Google doesn’t waste time crawling non-indexable pages.
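A typical first step in crawl optimisation is confirming which URLs your robots.txt actually allows or blocks for Googlebot. A small sketch using Python's standard library (the rules below are illustrative, not taken from the article; note that `urllib.robotparser` handles plain prefix rules but not wildcard patterns):

```python
from urllib.robotparser import RobotFileParser

# Illustrative robots.txt blocking an internal search section that
# would otherwise waste crawl budget on non-indexable pages.
robots_txt = """\
User-agent: *
Disallow: /search
"""

rp = RobotFileParser()
rp.parse(robots_txt.splitlines())

for url in ("/products", "/search?q=shoes", "/category/shoes"):
    print(url, rp.can_fetch("Googlebot", url))
# /products True
# /search?q=shoes False
# /category/shoes True
```

Checking a sample of URLs this way before and after a robots.txt change is a cheap safeguard against accidentally blocking content you do want crawled.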
Diagnosing technical SEO issues on a website can be quite challenging. In this post, Daniel Bianchini shares 17 of his favourite tools for identifying and fixing technical SEO problems.
It’s hard to overstate the importance of proper technical SEO as the foundation of your website’s organic performance. In this post, Daniel Bianchini looks at four technical SEO issues that often go unnoticed.
A lot can go wrong when transitioning to SSL/HTTPS. Make sure to plan and execute every required SEO-related step in detail to avoid a negative impact. Jan-Willem shows you how on State of Digital.
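One step common to any HTTPS migration is a site-wide 301 redirect from HTTP to HTTPS so that link equity is passed to the secure URLs. A minimal nginx sketch (the domain is a placeholder, and a real migration also involves updating canonicals, sitemaps and internal links, as the full checklist describes):

```nginx
# Illustrative only: site-wide permanent redirect to HTTPS,
# preserving the requested path and query string.
server {
    listen 80;
    server_name example.com www.example.com;
    return 301 https://example.com$request_uri;
}
```

Using a 301 (rather than a 302) signals to search engines that the move is permanent, which is why redirect type is one of the details worth double-checking during the migration.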