How New Sites Are Penalized In The SERPs
This website, like thousands of other new sites, is stuck in Google's version of Purgatory - The Supplemental Results!
What are the supplemental results? When Google adds a site to its index, it rates the usefulness of your pages. In simple terms, it looks at the text content, who links to you and who you link to. It also considers the age of the site, how regularly it gets updated, whether the content is original and, importantly, how quickly the site attracts inbound links and how useful and relevant those linking sites are.
With newish sites (though this does happen to older sites too), Google doesn't have much data when it first indexes them. There aren't many inbound links (a good gauge of how useful other webmasters think a site is), and Google won't yet know how often the site gets updated or added to. So the big G puts the pages into its Supplemental index, which sits below all the other search engine results.
What does this mean for a site? Well, if someone uses a search term that is found on your site then, all other things being equal, your pages will appear below the other sites containing that term, or your site won't appear at all, even though it has been indexed. To see your site the searcher has to click the "see similar results" link at the bottom of the list, so you miss out on many, many visitors.
As I write this article (August 06) this site is well and truly stuck in the supplementals. If you do a site search using the search box at the top of the page for something like tomtom or sat nav, notice that next to any results it says "supplemental result". (If it doesn't, I'm out - hurray!!!) The search box sends your query to Google and instructs it to search just this site, and Google adds its own ads to the results.
How do I get out of the Supplemental Results? Good question. There has been a lot of debate on webmaster and SEO forums, and the consensus is not to panic, to keep adding original, relevant content, and to slowly build appropriate inbound links through reciprocal and one-way programmes.
Google's advice would be to attract more visitors by paying for an AdWords campaign, bidding on keywords to appear higher up the listings until your site is indexed in the main search engine results. This is a good idea if you are running a business, but not if your site derives most of its income from pay-per-click schemes: you'll spend all your profits on ads. It is very difficult to get the equation right, where you bid low on AdWords and send the clicks to AdSense pages on which any likely click will earn you more than the original AdWords click cost. I know, I've tried it. What usually happens is that, in order to reach your cheap ad on Google, visitors have probably already passed the more expensive one you're trying to serve them. But hey, if you can make it work, good for you.
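The arbitrage equation above is simple enough to sketch. Here's a minimal Python illustration of the break-even arithmetic; the function name and all the figures are my own invention, not anything Google publishes:

```python
def arbitrage_profit(visitors, cpc_paid, click_through, epc_earned):
    """Net profit from buying AdWords traffic and serving AdSense ads.

    visitors       -- clicks bought on AdWords
    cpc_paid       -- cost per AdWords click
    click_through  -- fraction of visitors who click an AdSense ad
    epc_earned     -- earnings per AdSense click
    All figures are hypothetical.
    """
    cost = visitors * cpc_paid
    revenue = visitors * click_through * epc_earned
    return revenue - cost
```

With, say, 1000 bought clicks at $0.05 each, a 10% ad click-through and $0.30 earned per AdSense click, you take $30 but spend $50 - a $20 loss, which is exactly the trap described above: only a high click-through or a big cost gap between the keyword you buy and the ads you serve makes the sum positive.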
What will I be doing? Carrying on with my original plan for this site, which is to make content my main priority for the first year. My target is 500 original, relevant web pages. I have a low-intensity reciprocal link programme bubbling along in the background, which only generates a handful of appropriate links a month. This is fine, because I know Google doesn't like sites that suddenly generate too many inbound links.
I do use Google Sitemaps, updating it about once a month when I have added a few more pages, and I'm not afraid to use non-reciprocal one-way links to other sites I like, again where the content matches.
Hopefully by Christmas 06 most of my indexed pages will be out of the Supplemental results and I will see an increase in visitors, if not, I'll just carry on building and linking.
To check my progress use the search box above again for a sat nav related term, and that will give you an idea of how successful I have been!
Update: Sep 2006. 103 pages indexed by Google, of which 70 are supplemental.
Update: Oct 2006. 106 pages indexed by Google, of which 103 are supplemental. Oh dear.
Ok, something is obviously wrong here. Google is indexing my pages, but deeming them fit only for the supplemental index, and I need to find out why.
A quick search of the Webmaster World forums and I've got a few avenues to research. First, the canonical issue, or www.mysite.com vs http://mysite.com. Apparently this leads to duplicate (and supplemental) listings, so you should use 301 redirects so that only the www. URLs get indexed. However, I'm on a Windows server and can't use redirects. I've also told Google Sitemaps that I prefer www., and only www. results come up in searches for my site, so I don't think that's the problem.
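The logic a canonical redirect performs is straightforward even if I can't run one on this host. Here's a minimal Python sketch of the rewrite a 301 rule would do; the domain is a placeholder, not this site's actual address:

```python
from urllib.parse import urlsplit, urlunsplit

# Hypothetical preferred host - a real rule would use your own domain.
PREFERRED_HOST = "www.mysite.com"

def canonical(url):
    """Rewrite a non-www URL onto the preferred www host.

    A server-side 301 redirect would send this rewritten value
    in the Location header; here we just return it.
    """
    parts = urlsplit(url)
    if parts.netloc == PREFERRED_HOST:
        return url  # already canonical, leave untouched
    return urlunsplit((parts.scheme, PREFERRED_HOST, parts.path,
                       parts.query, parts.fragment))
```

On Apache this would be one RewriteRule in .htaccess; the point is only that every non-www request should map deterministically onto exactly one www URL, so Google never sees two copies of the same page.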
The next thing is meta tags - title and description. Apparently it's bad to have them the same, which lots of mine are, and looking at them they're not very descriptive either, so I'll be changing them.
Lastly is something I overlooked before: duplicate content. I have written almost every page on this site, with a few original articles submitted by others, so I thought I couldn't be the victim of a duplicate content penalty. I don't think my site has been scraped, so that couldn't be the reason either. Then I read a post about looking at your pages through the eyes of a spider or bot - i.e. looking at the HTML text. What I found is, I think, the root of my problem. Take a look at this text file to see what I mean. It's my review of the TomTom 500, but it's difficult to tell by looking at the file!
So I think Google sees all of my pages as roughly the same, with a little bit of content at the bottom, and I bet the bot has already made up its mind about the site before it even gets to the good stuff!
So what am I going to do? Remove all the rubbish above the content. It'll mean sacrificing some fancy roll-over link effects, but if it leads to the site getting out of Supplemental hell, it's worth it!
Ok, done. The new TomTom review is here. See the difference? There's still a lot of HTML above the real content, but a lot less than before. Checking the stats in MS Word, the original page had 11887 characters over 394 lines; the new page has 8448 characters over 294 lines - a big reduction. Just by looking at the text files you can see the difference.
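Character counts in Word only tell half the story; what matters is how much of the page a bot sees as visible text rather than markup. Here's a rough Python sketch of that ratio, using only the standard library - a crude measure I'm using for illustration, not anything Google is known to compute:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collect only the visible text, skipping script and style blocks."""
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip_depth = 0

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip_depth += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip_depth:
            self._skip_depth -= 1

    def handle_data(self, data):
        if not self._skip_depth:
            self.chunks.append(data)

def text_ratio(html):
    """Fraction of the raw source that is visible text (0.0 - 1.0)."""
    if not html:
        return 0.0
    parser = TextExtractor()
    parser.feed(html)
    return len("".join(parser.chunks).strip()) / len(html)
```

Run on the old and new review pages, the new one should score noticeably higher - the same review text divided by far fewer characters of menu script.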
The pages now load faster, and I've got rid of an annoying error message that used to pop up in the Internet Explorer message bar, that I think was related to moving the mouse before the menu had initialised.
I've still got to change all the title and description meta tags, but hopefully, by getting rid of the fancy drop-down JavaScript menus, Google will start to look at my site in a kinder light. We'll see at the beginning of November.