Search Engine Optimisation

Stop wasting link authority on low-quality pages

I recently noticed that most of my tag pages have fallen into Google's supplementary results, which basically tells me that either those pages don't have enough inbound links for Google to trust them, or they are just crap (or duplicate content). It is often wrongly assumed that the more pages you have, the better your chances of being found. To an extent that's true; however, if 80% to 90% of your pages are rubbish, you can kiss your Google rankings goodbye.

Now assume your website has gained trust within your niche community and has, in turn, attracted a high number of inbound links. You decide to generate even more relevant content as well as more pages. You keep creating a lot of great content and start keyword-fluffing your website with a tag cloud and individual tag pages. Google crawls your website over a period of time, and when you check how your website is indexed, you realise that a large portion of it sits in the supplementary results.

The reason behind this is very simple: by producing a lot of low-quality, low-relevancy pages and then linking to them, you share your link value with those pages. Because Google devalues pages with little to no unique content (tag pages, basically), the link juice you passed to them is wasted; it could have been given to pages with higher-value content and more relevance for search engines.
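To see why this dilutes link value, here is a minimal sketch using a simplified PageRank-style model (an illustrative assumption, not Google's actual algorithm): a page's link value is split evenly among its outlinks, so every tag-page link reduces what each content page receives.

```python
# Simplified PageRank-style model (assumption for illustration only):
# a page passes page_value * damping, split evenly across its outlinks.
DAMPING = 0.85  # classic PageRank damping factor


def value_passed_per_link(page_value: float, outlink_count: int) -> float:
    """Link value each linked page receives from this page."""
    return page_value * DAMPING / outlink_count


# A post linking only to 5 high-value articles:
print(value_passed_per_link(1.0, 5))   # 0.17 per article

# The same post after adding a 45-entry tag cloud (50 outlinks total):
print(value_passed_per_link(1.0, 50))  # 0.017 per link, a 10x dilution
```

Under this toy model, adding the tag cloud cuts the value flowing to each real article by a factor of ten, with most of it going to pages Google won't rank anyway.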

This is exactly what happened to me when I got too happy creating more pages. After a bit of research, it turns out that having more pages does not necessarily mean you will be found more often. If anything, it can have the exact opposite effect: if the majority of your pages are low value, in this case the tag pages, you are spreading your link value thin across pages that are of little use to end users.

As you can probably guess, search engines hate indexing pages with no unique content, hence the growing number of pages in the supplementary results. Fortunately for us, the solution is relatively simple: add a "noindex,follow" meta robots tag to the low-quality pages. This lets crawlers continue to follow the links on those pages while preventing the pages themselves from being indexed. By doing this you effectively stop spreading your link juice thin and keep useless pages out of the index.
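Concretely, the tag looks like this; it goes in the `<head>` of each low-quality page (the tag pages, in my case):

```html
<!-- Crawlers may follow the links on this page but won't index the page. -->
<meta name="robots" content="noindex,follow">
```

Most blog platforms and SEO plugins let you apply this to tag and archive pages without editing templates by hand.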

Founder of UnicornGO, Visugu and Pixelsquare. I am an Aussie with a passion for building sustainable and scalable businesses servicing mid-tier to enterprise clients. Have an idea that needs funding? Reach out to me and we can have a chat.
