Be positive! Check and monitor your backlinks!

WebDNA is a comprehensive tool for managing and controlling links. It allows you to audit your existing historical links and to monitor and assess the quality of new inbound ones.

While building the app we used machine learning and the latest technologies to make the detection and assessment of suspicious links highly effective.

Automatic link assessment in the cloud

One of the undeniable advantages of WebDNA is that it runs in the cloud, so you don’t have to use your own machine to analyse all the websites linking to you. It also guarantees that exploring even very large data sets won’t be too demanding.

The system is fully automated, so it’s easy to analyse and automatically monitor even 200,000-300,000 backlinks without running the program on your local resources.

You can use one of three modes of analysis:
– automatic: the app assesses the newest links found by three external providers;
– file upload: you can upload a file with backlinks from external sources, e.g. Google Webmaster Tools, Ahrefs, Moz, Majestic, etc.;
– the form: copy links from a text file or another source and paste them into the special form.
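Whichever mode you use, pasted or uploaded link lists usually need cleaning before analysis. A minimal sketch of that preprocessing step (the function name and rules are illustrative assumptions, not WebDNA’s actual code):

```python
from urllib.parse import urlparse

def normalize_backlinks(raw_text: str) -> list[str]:
    """Deduplicate and validate pasted backlink URLs, keeping input order."""
    seen = set()
    cleaned = []
    for line in raw_text.splitlines():
        url = line.strip()
        if not url:
            continue  # skip blank lines
        parsed = urlparse(url)
        if parsed.scheme not in ("http", "https") or not parsed.netloc:
            continue  # keep only absolute http(s) URLs
        if url not in seen:
            seen.add(url)
            cleaned.append(url)
    return cleaned
```

Duplicates and malformed lines are common in exports from multiple providers, so deduplicating first also keeps the analysis quota from being wasted.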


When you set up the analysis, WebDNA goes through each domain and URL using a crawler and scores it with our original data-analysis mechanism. After collecting all the parameters needed for scoring, the app decides on the quality of each website linking to you.

An effective tool for link assessment

Thanks to the Sybilla algorithm, link assessment is highly effective. We currently measure thirty-something parameters for each URL; they form the basis of the quality assessment that is then sent to you.

Sybilla divides websites into three categories: negative, suspicious, positive.



A website gets a negative status if it’s harmful. A link to your page on such a website can cause you trouble and lower your position in Google; in extreme cases it can even get you a ban. That’s why the link should be reported as unwanted via the Google Disavow Tool.

Suspicious sites are exactly what they’re called, suspicious, so they should be checked manually by the user. We suggest paying extra attention to those websites and analysing them very carefully. If a given URL can harm your website’s link profile, you can change its status to “negative”. If you decide that the link isn’t harmful, you can update it to “positive”.

Positive status is given to websites that have enough features to rank as trustworthy and of good quality. Positive websites will help you climb the rankings.

After analysing the links, WebDNA recommends one of the three statuses, but you can always change them manually, e.g. from suspicious to negative.

The simplest tool for backlink assessment

The app lets you generate a disavow file ready to be uploaded to Google. The file is prepared based on all the URLs marked as negative, no matter whether a website got this status automatically (the system ranked it as such) or manually (you changed the status yourself). All the links marked as negative are included in the disavow list, which is available when you click “remove toxic links”.
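To give an idea of what generating such a file involves, here is a minimal sketch. The function name and the threshold rule (disavow a whole domain once it contributed several toxic links) are illustrative assumptions, not WebDNA’s actual logic; the `domain:` prefix and `#` comments are Google’s documented disavow syntax.

```python
from collections import Counter
from urllib.parse import urlparse

def build_disavow_file(negative_urls: list[str], domain_threshold: int = 3) -> str:
    """Emit disavow entries: a whole domain when a host contributed
    `domain_threshold` or more toxic links, individual URLs otherwise."""
    hosts = Counter(urlparse(u).netloc for u in negative_urls)
    disavowed_domains = {h for h, n in hosts.items() if n >= domain_threshold}
    lines = ["# generated disavow list"]
    lines += sorted(f"domain:{h}" for h in disavowed_domains)
    lines += sorted(u for u in negative_urls
                    if urlparse(u).netloc not in disavowed_domains)
    return "\n".join(lines) + "\n"
```

Collapsing repeat offenders to `domain:` entries keeps the file short and catches future toxic pages on the same host.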


Sybilla is an original complex algorithm which takes into account the following parameters when assessing any HTML document:
– meta tag quality, for example the length of the title tag
– the structure of the HTML document and its quality
– the number and type of links on a given website
– the number of images in the document
– the distribution of links in each part of the website
– the HTM2TXT (HTML-to-text) ratio
– the amount of text found on the website
– loading time and the status code of the website
– domain authority
– the analysis of the website for malware and viruses
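Sybilla itself is proprietary, but several of the signals listed above can be extracted with nothing more than a standard HTML parser. A sketch collecting four of them (title length, link count, image count, and a crude HTML-to-text ratio); the class and function names are illustrative, not WebDNA’s code:

```python
from html.parser import HTMLParser

class PageStats(HTMLParser):
    """Collect a few of the page-level signals listed above."""
    def __init__(self):
        super().__init__()
        self.links = 0
        self.images = 0
        self.text_chars = 0
        self.title_len = 0
        self._in_title = False

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self.links += 1
        elif tag == "img":
            self.images += 1
        elif tag == "title":
            self._in_title = True

    def handle_endtag(self, tag):
        if tag == "title":
            self._in_title = False

    def handle_data(self, data):
        if self._in_title:
            self.title_len += len(data.strip())
        else:
            self.text_chars += len(data.strip())

def page_stats(html: str) -> dict:
    """Parse an HTML document and return a small signal dictionary."""
    p = PageStats()
    p.feed(html)
    ratio = p.text_chars / max(len(html), 1)  # visible text vs raw HTML size
    return {"title_len": p.title_len, "links": p.links,
            "images": p.images, "text_to_html": round(ratio, 3)}
```

Signals like these are cheap to compute per page, which matters when a crawl covers hundreds of thousands of URLs.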

The system also collects and calculates some additional parameters needed for correct analysis (a secret know-how of the creators). This data is assessed and helps to:
– evaluate the profile and the quality of links;
– decide whether it’s worth keeping the links on a given website;
– evaluate the links of your competition;
– search for the quality websites where your links could be placed.

We wanted to give you something light and easy to use. In our fast-paced world we need tools that do what they’re meant to do without being too complex. That’s why WebDNA is probably the simplest link-monitoring tool in the whole world (unlike some other available solutions)!

WebDNA limits the amount of data that you need to analyse without adding more difficult parameters.

WebDNA is made to make webmasters’ work easier by automating previously manual work of searching and visiting websites of low quality.


So if you want to improve your visibility in Google or get your website out of the filter – WebDNA will be the solution just for you.



Voytek Blazalek
Twitter: @blazalek
LinkedIn: blazalek

Fully functional WebDNA beta. It’s free for now, so try it!

We are proud to announce that we have released a fully functional beta. Sign up, log in, and check as many sites as you want for toxic links.

We hope you’ll like it as much as we do. It’s clean, it’s simple. Here’s how it works.

Once you log in, you can add the website you want to analyze.


You will get three options for running your analysis. The first is the easiest, and you can run it on any site, not only one you control. Choosing “Get sample backlinks from” will give you an instant analysis of 50 inbound links.


However, if you want to check your own site, we recommend the second option: upload a backlinks file. It’s the file you can get by logging into Google Webmaster Tools. Here’s a short tutorial if you are not familiar with it.

Once you have the file, just drag it here:


If you have a list of inbound links, or any URLs you’d like to verify, you can just copy and paste them using the third option.


Now, our analysis takes time. The most important thing for us is to provide the best results, that is, to identify toxic links, which is not an easy task. That is why you sometimes have to wait several minutes, sometimes longer, for the results. To make it convenient, we will notify you via e-mail once the results are ready.

While the analysis is in progress, you will see a notification in the panel.


Once your analysis is ready, you get an e-mail with confirmation. Now you can check out the results. It’s very simple. You get:

  • The number of backlinks analyzed (blue)
  • The number of backlinks flagged as positive (green)
  • The number of backlinks flagged as negative (red)

The last one is the essence of our tool. WebDNA is there to help you get rid of toxic links. Our algorithm is still learning, which is why we kindly ask you to run through the results, and if you see any familiar URLs that have been marked red, please change the minus icon to a plus. We need your feedback in order to improve our results. That is one of the reasons why, as a beta tester, you get our tool for free.


Once you have double-checked our automatic analysis, click the “Remove toxic links” button. It lets you download a “disavow file”: a text file with the URLs flagged as negative. You can upload it to Google Webmaster Tools using these instructions.

Once you do, Google will tell its bots to ignore the listed backlinks when ranking your website. Do not hesitate to do it; it always helps to remove spam pages pointing to your site. Whenever you change your mind, you can upload a new disavow file.

Register for the free beta version and get a 15% discount

Teach Google to ignore spam: How to upload a disavow file with Webmasters Tools

In order to use the WebDNA tool, you need to get familiar with some webmaster basics. This tutorial is for beginners. We have already written about how to:

The final result of our analysis is a “disavow file”. It’s a text file with all the inbound links you would like Google to ignore, because they are harming your search engine visibility. They are either spam or simply pages that no longer work.
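For reference, the format Google expects is plain text: one entry per line, either a full URL or a `domain:`-prefixed host to disavow an entire site, with `#` lines treated as comments. The hostnames below are placeholders:

```
# two individual URLs
http://spam.example.com/stuff/comments.html
http://spam.example.com/stuff/paid-links.html
# one whole domain
domain:shadyseo.example.com
```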

Once you download your disavow file, you need to upload it using Google Webmasters Tools.

First, go to the disavow links tool page and choose the site you want to remove the toxic links from.


Google will warn you twice that it’s an advanced feature. But since you are using WebDNA, it means you know what you are doing.

Upload the file you downloaded from the WebDNA dashboard. You can also see the history of uploaded disavow files here. As you can see, you can always remove the file, letting Google’s bots once again take into consideration all the spammy links we have found for you.


Give us a try and see if our analysis can help your search engine visibility.

How to be more visible in search engines? Try these 7 backlink strategies

Inbound links are the core of your site’s exposure in search engine result pages, especially in the most important one: Google. Of course, if you enrich the internet with quality content, chances are links will come naturally. However, there are some backlink strategies that are fair and safe, and that your competition is probably using right now. That is why you should take a brief look at the list below and see if you have considered all of them.

1. Follow the competition

Check out the keywords you are interested in. Find competitors that rank higher on the result pages, then examine their sites, looking for their external links. There are several tools for this. One of them is Bing Link Explorer, where, once you register your site, you are given a large sample of inbound links to any site (yes, competition is good, since Google does not provide a similar tool). You will find the places you missed during your research.

2. Get free press, interviews, reports, PR news

Inspire journalists and bloggers to write about your product and site. Any reason will do: new products in your store, new features, a weekly or monthly market report. Use the resources and market knowledge you have to inspire bloggers and professional writers.

On the other hand, create ready-to-publish content. Editors working for internet portals, sites that have huge traffic and earn their living on pageviews, will be delighted to publish any valuable free content and will usually leave the external links in the text body.

Bloggers would rather have you, or someone from your crew, guest posting. It’s the most organic of backlink sources.

3. Use social media

But do it wisely and thoughtfully. Don’t spam. Publish tutorials on your products and services. Use your team’s knowledge to answer questions on quality forums such as Quora.

4. Donations and social responsibility activities

Charity shouldn’t be motivated by self-interest, but in business it usually is. Find a cause that is somehow linked to your activity and that has pages listing donors with active links. Try theatres, schools, universities, galleries. Being a donor to a good cause is a great thing to do, especially if you can build your brand on it.

5. Search for unlinked content

Not just text and data: use image search to find your graphics used without attribution. When you find any unauthorized usage, just reach out to the site owners and kindly ask them to link to your site.

6. Broken link building

It’s an advanced but very effective strategy. The key is to find non-functioning or abandoned content, such as dead links on other sites, and replace them with new ones pointing to your site. Here’s a useful tutorial for anyone wanting to try it out.

7. Feel free to improvise

Just make sure the pages you are targeting are of high quality. Avoid cheap catalogues that may have been made by an SEO agency and can be a threat to your site’s reputation.

Avoid posting your links in forum comments, unless it comes naturally. Google’s algorithms are trained to find such activities and treat them as spam.

Article directories and blog comments should be off your list. Comments are supposed to be made spontaneously by real people, not brands pretending to be ordinary internet users. There is no efficient long-term strategy involving such activities.

Good luck, if you have any other ideas or experience, feel free to put them in comments!

image: CC BY-SA 2.5, Art by Felipe Micaroni Lalli

How to write an efficient title tag


Times they are a-changin’ in SEO. Google has recently introduced a change to its search result page layout. Seemingly it’s just a cosmetic adjustment, but since it’s such an important page, every detail can affect many businesses.

Until recently, the optimal title tag length was between 65 and 70 characters, but now it has been reduced to 48-62 characters. That makes editors’ and copywriters’ jobs a little more difficult.
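A quick length check is easy to automate against the 48-62 character window mentioned above. A minimal sketch (the function name and messages are illustrative assumptions):

```python
def check_title_length(title: str, low: int = 48, high: int = 62) -> str:
    """Flag a title tag against the 48-62 character window."""
    n = len(title)
    if n < low:
        return f"too short ({n} chars): room for more keywords"
    if n > high:
        return f"too long ({n} chars): Google may truncate it"
    return f"ok ({n} chars)"
```

Running it over every page title in a sitemap is a cheap way to spot titles that will get truncated in the new layout.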

Here are some useful tips:

  1. Create different pages dedicated to each user group

    The best way to do it is to analyze the keywords people use to reach your site, and then try to divide them into categories based on action, location, and user profile. Remember, though, not to create too many different pages, since Google’s Panda algorithm can punish you for deteriorating user experience for purely SEO purposes.

  2. Each title tag should be unique; don’t use it on more than one page

    According to “Search Engine Watch”, it’s one of the most common mistakes made by webmasters. When you do it, the search engine will most likely classify your page as duplicate content. Get creative, hire a copywriter, and do your best to avoid it.

  3. Keep the recommended length

    As I noted in the introduction, at the moment the most efficient number of characters seems to be between 48 and 62.

  4. Make it practical for the user

    That should be the key to success, notwithstanding any possible changes in the search engine’s algorithm. The title tag should lead with the content description (what information is on the site), then add some more general keywords, and finally answer the question of who runs it, that is, the name of the company, organization or department.

If you have any questions or comments on title tag writing, feel free to post them below.

A case study proves frequent SEO audits are needed to keep a large website from being penalized

Search Engine Watch has recently posted an interesting case study: an in-depth analysis of a large website penalized by Google’s Panda algorithm update. The site lost about 30 percent of its Google organic traffic, as shown here:


SEW’s post is a great read, so we strongly recommend it. From WebDNA’s point of view, the most important part is the blog’s recommendations on how to avoid similar situations in the future. The main hint is to “continually audit the website”. We fully agree with this conclusion:

Initial audits can be extremely powerful, but once they are completed, they can’t help you analyze future releases. To combat this situation, I recommend scheduling audits throughout the year. That includes a mixture of crawl audits, manual analysis, and auditing webmaster tools reporting.

Continually auditing the site can help nip problems in the bud versus uncovering them after the damage has been done.

We couldn’t have agreed more. Having had a similar experience myself, I decided to create a tool that can run a quick audit pointing to potential ban and penalty threats. The goal is to learn to understand the search engines’ algorithms and provide you with specific advice.

Feel free to join our beta testers here. Participating organizations get a chance to have a special feature of our app dedicated to their needs.

Tips on backlink analysis that will help your site’s positioning

An in-depth backlink analysis can do more than help you optimize search engine traffic; it can tell you much more interesting things about your site.

John Ball from “Search Engine Journal” points to four main uses of backlink analysis:

1. Competitor Analysis

Competitor analysis can provide you with a wide range of useful information. You will learn which keywords are most useful, which pages of a competitor’s site are most linked to, which content proves most effective, and which sites are linking to it.

2. Link prospecting

It’s a very important element of the link-building process. First, start with the queries most important to your clients. Then filter the results further, deciding which of them are most useful and of the highest quality. Once you’ve done that, start building relationships with the website owners (bloggers, publishers) by contacting them. Here are some useful tips on link prospecting.

3. Backlink Audit

This is a very important part of link building. Toxic links can be so harmful to your search engine visibility and business that they overshadow the whole link-building process. Google may decide to penalize your site, or even ban it, because of toxic links. Sites that used to be considered high quality may be treated as spammers by new search engine algorithms.

Some of the linking pages may not respond. Some of the links may point to redesigned or removed parts of your site.
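Checking which linking pages still respond is easy to automate with a HEAD request per URL. A minimal sketch using only the standard library (the function names, verdict strings, and user-agent are illustrative assumptions):

```python
from urllib.request import Request, urlopen
from urllib.error import URLError, HTTPError

def classify_status(code) -> str:
    """Map an HTTP status code (or None for no response) to an audit verdict."""
    if code is None:
        return "not responding"
    if 200 <= code < 300:
        return "alive"
    if code in (404, 410):
        return "gone"
    return f"check manually ({code})"

def probe(url: str, timeout: float = 5.0) -> str:
    """HEAD-request a linking page and classify the outcome."""
    try:
        req = Request(url, method="HEAD",
                      headers={"User-Agent": "backlink-audit-sketch"})
        with urlopen(req, timeout=timeout) as resp:
            return classify_status(resp.status)
    except HTTPError as e:
        return classify_status(e.code)  # 4xx/5xx raise instead of returning
    except URLError:
        return classify_status(None)    # DNS failure, refused connection, etc.
```

Pages classified as "gone" or "not responding" are candidates for link reclamation or removal from your audit list.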

It’s also not recommended to over-optimize your backlink network. While auditing your backlinks, go through all the points we’ve mentioned here.

4. Link Reclamation

The final result of the analysis is link reclamation. You may contact the linking site’s owner to fix a wrong URL or inform them that their page is not responding. If you decide it’s not possible, or simply not worth it, you can disavow the link using Google’s dedicated tool.

Remember, according to new search engine standards, it’s often better if some sites aren’t linking to yours.

image by mterraza via

Register for the free beta version now! Get a discount on future plans

Why has Google banned my website?

It’s a webmaster’s nightmare. First, you notice a sudden drop in traffic. You log into Analytics only to find out that visits from Google have dropped to zero. Your site has been banned.


Why? For how long? What should I do to bring it back?

For its part, Google will only provide you with general information:

Google may temporarily or permanently remove sites from its index and search results if:

  • it believes it is obligated to do so by law
  • if the sites do not meet Google’s quality guidelines
  • or for other reasons, such as if the sites detract from users’ ability to locate relevant information.

Google’s hints are very vague. The main quality guidelines are:

  • Make pages primarily for users, not for search engines.
  • Don’t deceive your users.
  • Avoid tricks intended to improve search engine rankings
  • Think about what makes your website unique, valuable, or engaging. Make your website stand out from others in your field.

You can spend hours wondering if you are breaching any of them and come up with no clear answer. And even if you are sure you comply with all the guidelines, there are always “other reasons”.

As a general rule Google will not provide you with any individual clues:

We cannot comment on the individual reasons a page may be removed. However, certain actions such as cloaking, writing text in such a way that it can be seen by search engines but not by users, or setting up pages/links with the sole purpose of fooling search engines may result in removal from our index.

How can Google tell whether suspicious links were set up by you? It can’t. That is why so many sites are blacklisted by mistake, or as an effect of toxic external links fabricated by the competition.

A Google ban may happen to anyone, and it’s always a great challenge. It is even more difficult since the search engine won’t provide you with practical steps to follow.

The most tedious task in trying to lift a ban is inbound link analysis. You need to go through each link (sometimes even hundreds of thousands of them) manually and decide, using your own judgement, whether to disavow it. And you will never know for sure if your decision was right. After removing part of the incoming links, you just have to implement the change and wait for the result. If the ban stays, keep on experimenting.

It all takes time, and website owners watching their traffic melt down don’t have it.

We went through this ordeal. That is why we decided to create a tool that will help you identify the reason for a Google (or any search engine) ban. WebDNA is a self-learning tool that collects data about links from different sites and quickly provides you with advice on what to do to remove the ban. Try it now, or come back if you ever face a Google ban, which is not such a big problem if you’re using our tool.