Google Search Listing Problem?



Google "Listing Position" and duplication of web content on multiple domains

Is poor "Listing Position" related to the replication of identical content on multiple (similar) domain names?

Our Google Listing Position seems unreasonably low (Jan 2006): it actually takes some effort to get our pages to appear in a Google search results listing at all, yet they appear reasonably in the MSN, Ask Jeeves (Teoma?) and Yahoo listings. Note that our pages are not simply listed after all the pages with higher Google PageRank; other similarly ranked pages appear far higher than ours (click here for a tool that displays the Google Search Engine Results Page (SERP) together with PageRank). The problem resembles the "sandbox" effect discussed in connection with new sites indexed by Google but, real or imaginary, we don't think that much-described bias against new sites applies to us, since we are not a new site. True, we have hardly any web sites pointing at us, hence the 0 PageRank (see here for one description of PageRank; there are others), but that obviously doesn't bother the other engines. And in any case, Google says "Make sure that other sites link to yours", yet, given that ours is a commercial site, Google also says "Don't feel obligated to purchase a search optimization service", which seems to leave a bit of a grey area.
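To illustrate the PageRank idea mentioned above, here is a hedged sketch of the power-iteration method usually used to describe it, run over an invented four-page link graph with the commonly cited 0.85 damping factor. The page names and links are made up for illustration only.

```python
# Minimal PageRank sketch via power iteration.
# The link graph below is invented; "A" links to "B" and "C", etc.
links = {
    "A": ["B", "C"],
    "B": ["C"],
    "C": ["A"],
    "D": ["C"],
}
pages = list(links)
d = 0.85  # damping factor from the original PageRank description

# Start with a uniform rank distribution.
rank = {p: 1.0 / len(pages) for p in pages}

for _ in range(50):
    # Every page gets a base share, plus a share of the rank of each
    # page that links to it, split across that page's outgoing links.
    new = {p: (1 - d) / len(pages) for p in pages}
    for p, outs in links.items():
        for q in outs:
            new[q] += d * rank[p] / len(outs)
    rank = new

print({p: round(r, 3) for p, r in sorted(rank.items())})
```

In this toy graph, "D" has no inbound links and ends up with the minimum possible rank, which is roughly the position a site with "hardly any web sites pointing at us" is in.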

Although various discussions can be found on the web about how Google's algorithm for returning search results might, especially since late 2003 (e.g. see "Been Gazumped by Google? Trying to Make Sense of the "Florida" Update!" by Ben Lloyd), have been tweaked so heavily to outwit deliberate manipulation that it now also rejects "proper" sites, we have found nothing specific on Google's Help pages to substantiate this; see their Guidelines. Looking at what is on their pages, the only guideline we seem to be breaking is as follows...

Having registered several similar domain names, mainly to stop anybody else taking them, we simply used a facility called "Server Side Pointers" (SSP), provided by our web site hosting company, to link these "secondary" domains to our "real" website. Essentially, rather than providing a 301 redirect for the "secondary" domains, the SSP creates a duplicate "virtual" identical website, so that any page on the real site is served up as if it also existed on the secondary domains, and the user sees the "secondary" URL in their browser address bar. This duplication behaviour is confirmed by the log file for the "real" website: as well as the "GET" lines associated with the "real" domain URL, it also shows GET requests for the secondary domain URLs, and those pages are delivered with the normal 200 completion code.
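The key difference is what the server answers: the SSP arrangement returns the full page with a 200 status under every hostname, whereas a 301 tells the client (and a crawler) the canonical location instead. The following self-contained sketch, using a throwaway local server and an entirely hypothetical canonical host, shows what a 301 response looks like when the client inspects it rather than following it.

```python
# Sketch: a local server that answers every GET with a 301 pointing at a
# canonical host (hypothetical), instead of serving a duplicate page.
import http.server
import threading
import urllib.error
import urllib.request

CANONICAL = "http://www.example.com"  # hypothetical "real" domain

class RedirectHandler(http.server.BaseHTTPRequestHandler):
    def do_GET(self):
        self.send_response(301)  # permanent redirect, not a 200 duplicate
        self.send_header("Location", CANONICAL + self.path)
        self.end_headers()

    def log_message(self, *args):
        pass  # keep the demo quiet

server = http.server.HTTPServer(("127.0.0.1", 0), RedirectHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()
port = server.server_address[1]

class NoRedirect(urllib.request.HTTPRedirectHandler):
    def redirect_request(self, *args, **kwargs):
        return None  # surface the 301 instead of silently following it

opener = urllib.request.build_opener(NoRedirect)
status, location = None, None
try:
    opener.open(f"http://127.0.0.1:{port}/index.html")
except urllib.error.HTTPError as e:
    status, location = e.code, e.headers["Location"]
server.shutdown()

print(status, location)
```

With the SSP in place, the same request would instead complete with a 200 and the page body, which is exactly the duplication the log file records.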

Searching around the Google web site also reveals:

"Don't create multiple pages, subdomains, or domains with substantially duplicate content." (from Google's webmaster guidelines)

This content duplication would seem to apply in our case (on the secondary web sites), but how much effect would it have? Other discussions of this topic describe many copies of the same material, after which the original is lost from the listing, much to the annoyance of its creator; maybe our low copy count means this criterion is not relevant. If anyone knows, please tell us. [This must be a staggeringly large comparison problem for Google to solve - in theory comparing the content of every page with every other page - no matter how much you mentally simplify the problem it still looks like a lot of data to crunch.]
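One standard way the comparison problem shrinks from "every page against every other page" to something tractable is fingerprinting: hash each page body once, then only pages whose digests collide need grouping. The sketch below uses invented URLs and page bodies; it detects exact duplicates only (near-duplicates need fancier techniques such as shingling).

```python
# Sketch: find exact-duplicate pages by grouping on a content hash,
# avoiding pairwise comparison. URLs and bodies are invented.
import hashlib
from collections import defaultdict

pages = {
    "http://www.example.com/index.html": "<html>Welcome to our site</html>",
    "http://example.com/index.html": "<html>Welcome to our site</html>",
    "http://www.example.com/about.html": "<html>About us</html>",
}

groups = defaultdict(list)
for url, body in pages.items():
    digest = hashlib.sha1(body.encode("utf-8")).hexdigest()
    groups[digest].append(url)

# Any digest with more than one URL is a set of identical pages.
duplicates = [urls for urls in groups.values() if len(urls) > 1]
print(duplicates)
```

Hashing makes the work roughly linear in the number of pages rather than quadratic, which is presumably why duplicate detection at search-engine scale is feasible at all.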

There is also an oblique reference to this issue in the Google help entry "Why does my site have two different listings in Google?":

"If your site is appearing as two different listings in our search results, we suggest consolidating these listings so we can more accurately determine your site's PageRank. The easiest way to do so is to redirect your http URL to your www URL using a 301 redirect. This should resolve the situation after our crawler discovers the change. For more information about 301 HTTP redirects, please see ...

Please note: using a robots.txt file and our automatic URL removal system to remove just the http or www version of your site will not work. This will remove both versions for 180 days, and we cannot manually add your site back to our index."

This "non-www" listing duplication also applies to us, because our provider has automatically added wildcard A records to the DNS and serves up the same pages for the non-www website as for the www one. Again, the log file shows GET requests for the non-www URL being delivered as if they were normal pages. What this Google reference does not directly answer is the question of listing position (which, as Google explains, is not solely related to PageRank).
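The kind of evidence the log file provides can be sketched as follows. This is a hedged illustration: the log lines and hostnames are invented, and it assumes a hypothetical log format whose final quoted field records the request's Host header (real logs vary, and the referer/user-agent often occupy the quoted fields instead).

```python
# Sketch: spot duplication in an access log by collecting, for each
# requested path, the set of hostnames it was served under with a 200.
# Log lines and format are invented for illustration.
import re
from collections import defaultdict

log_lines = [
    '1.2.3.4 - - [10/Jan/2006:10:00:00 +0000] "GET /index.html HTTP/1.1" 200 1043 "www.example.com"',
    '1.2.3.4 - - [10/Jan/2006:10:01:00 +0000] "GET /index.html HTTP/1.1" 200 1043 "example.com"',
    '5.6.7.8 - - [10/Jan/2006:10:02:00 +0000] "GET /index.html HTTP/1.1" 200 1043 "www.example-alt.com"',
]

pattern = re.compile(
    r'"GET (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) \d+ "(?P<host>[^"]+)"'
)

hosts_by_path = defaultdict(set)
for line in log_lines:
    m = pattern.search(line)
    if m and m.group("status") == "200":
        hosts_by_path[m.group("path")].add(m.group("host"))

# Paths served with a 200 under more than one hostname are duplicates.
print({p: sorted(h) for p, h in hosts_by_path.items() if len(h) > 1})
```

Seeing the same path answered with a 200 under the www host, the non-www host and a secondary domain is precisely the duplication described above.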

From our point of view it would be preferable simply to have the same content on all these web sites, and for Google to consolidate the PageRanks / listing positions. A question about the duplication of content was submitted to Google asking for advice; they say that we aren't penalised (the reply took a week), and they did refer to the non-www version of the domain. But although we are "not currently penalised, and it is included in our search results", we still don't really know how much effect this has on the ranking - after all, "we suggest consolidating these listings so we can more accurately determine your site's PageRank" implies that any duplication can have an effect.

Meanwhile the "secondary" web sites have been "disconnected", so they no longer deliver identical content; each has been replaced with a single page that redirects to the "real" website - but maybe that in itself (the link) will count against us. Hopefully we'll see a beneficial effect sometime; does anyone know how long penalisation lasts?


© 2006 Camel Services Ltd.