A case study shows why frequent SEO audits help a large website avoid penalties

Search Engine Watch recently posted an interesting case study: an in-depth analysis of a large website hit by Google’s Panda algorithm update. The site lost about 30 percent of its Google organic traffic, as shown here:


SEW’s post is a great read, and we strongly recommend it. From WebDNA’s point of view, the most important part is the blog’s recommendation on how to avoid similar situations in the future. The main advice is to “continually audit the website”. We fully agree with this conclusion:

Initial audits can be extremely powerful, but once they are completed, they can’t help you analyze future releases. To combat this situation, I recommend scheduling audits throughout the year. That includes a mixture of crawl audits, manual analysis, and auditing webmaster tools reporting.

Continually auditing the site can help nip problems in the bud versus uncovering them after the damage has been done.
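To make the idea of a recurring crawl audit concrete, here is a minimal sketch in Python. It is a hypothetical helper, not SEW’s or WebDNA’s actual tooling; it assumes crawl results have already been collected into simple records (URL, HTTP status, word count, page title) and flags a few common Panda-style risks: error pages, thin content, and duplicate titles.

```python
# Hypothetical sketch of a recurring crawl-audit check. It does not crawl
# itself; it inspects already-collected crawl results for common risk signals.
from collections import Counter

def audit_pages(pages, min_words=150):
    """Flag likely problem pages in a crawl.

    pages: list of dicts with keys 'url', 'status', 'word_count', 'title'.
    Returns a list of (url, issue) tuples.
    """
    issues = []
    # Titles shared by more than one page often signal duplicate content.
    title_counts = Counter(p["title"] for p in pages)
    for p in pages:
        if p["status"] >= 400:
            issues.append((p["url"], "error status %d" % p["status"]))
        if p["word_count"] < min_words:
            issues.append((p["url"], "thin content"))
        if title_counts[p["title"]] > 1:
            issues.append((p["url"], "duplicate title"))
    return issues

crawl = [
    {"url": "/a", "status": 200, "word_count": 500, "title": "A"},
    {"url": "/b", "status": 404, "word_count": 20, "title": "A"},
]
for url, issue in audit_pages(crawl):
    print(url, "-", issue)
```

Running a check like this on a schedule (say, after every release) is one way to catch problems early instead of after a ranking drop.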

We couldn’t agree more. Having had a similar experience myself, I decided to create a tool that runs a quick audit and points out potential ban and penalty threats. The goal of WebDNA.io is to understand the search engines’ algorithms and provide you with specific advice.

Feel free to join our beta testers here. Organizations that participate gain a chance to win a special feature of our app dedicated to their needs.