  1. #1
    Newbie
    Join Date
    January 18th, 2005
    Posts
    23
    I was wondering if anyone has the scoop on an SE seniority "ranking" system (if one exists)? For example, say person A develops a web site to sell car parts in 2003. In 2004, person B develops a car parts site using the same affiliate vendor and the same site-development style (person B's site has no more and no fewer products/info than person A's).

    How do G, Y!, or MSN rank (or SERP-rank) these two identical sites?

    Thx for any light you can shed.
    -KH

  2. #2
    ABW Ambassador
    Join Date
    January 18th, 2005
    Location
    England
    Posts
    4,327
    If they are indeed identical, then one, or both, would be penalized.

    The two sites would have to be different.

  3. #3
    Newbie
    Join Date
    January 18th, 2005
    Posts
    23
    Julian wrote:
    "If they are indeed identical, then one, or both, would be penalized. The two sites would have to be different."

    This is a good point, but all else being equal, I think only the second site would be penalized.

    I guess a more generic question would be: how would, say, two datafeed-driven sites built with WebMerge differ in rankings (SERP or PR)? A datafeed has specific fields -- SKU, NAME, etc. -- that may be used verbatim by several AMers.
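
    To make that concrete, here's a minimal Python sketch of the problem (the field names and both templates are invented for illustration -- this is not WebMerge's actual merge syntax): two AMers merging the same feed row end up with pages whose indexable text is almost entirely shared.

        # Hypothetical illustration: two affiliates merge the same datafeed
        # row into their own page templates. Field names (SKU, NAME,
        # DESCRIPTION) are invented; WebMerge's real merge syntax differs.
        feed_row = {
            "SKU": "BRK-1042",
            "NAME": "Ceramic Brake Pads",
            "DESCRIPTION": "Low-dust ceramic pads for most compact cars.",
        }

        site_a = "<h1>{NAME}</h1><p>{DESCRIPTION}</p><p>Part #{SKU}</p>"
        site_b = "<h1>{NAME}</h1><p>{DESCRIPTION}</p><p>Item {SKU}</p>"

        # Apart from a word or two of boilerplate ("Part #" vs. "Item"),
        # every word a spider indexes comes straight from the shared feed.
        print(site_a.format(**feed_row))
        print(site_b.format(**feed_row))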

    Of course, most good AMers know that they should create sensibly unique sites. Going back to the topic of a possible SE seniority algo, does this mean that the new kids on the block will have to be not just different, but an order of magnitude better than the first-comers?

    -KH

  4. #4
    Newbie
    Join Date
    January 18th, 2005
    Posts
    2
    From what I understand, Google is smart enough to tell the difference between an original and a copy, so duplicate content is not going to hurt you in G if you have the original content. In Yahoo, I think everyone will get banned - whoever made the original plus all the duplicates. In addition, a duplicate content filter will trigger when sites are nearly identical, such as one site being 95% similar to another.

    Here's an interesting paper on the future of search engines & duplicate content, Block-Level Link Analysis:
    http://research.microsoft.com/resear...0Report&id=754
    Search engines in the future will look at an HTML page, slice it up into pieces, then compare those pieces against pieces on other sites. Scary stuff if you use product catalogs.
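
    For the curious, here's a rough Python sketch of how such a near-duplicate check could work -- my own illustration using word shingles and Jaccard similarity, not the actual filter any engine runs; the shingle size and cutoff are arbitrary.

        import re

        def shingles(text, k=5):
            # Overlapping k-word shingles from a page's visible text.
            words = re.findall(r"[a-z0-9]+", text.lower())
            return {" ".join(words[i:i + k]) for i in range(len(words) - k + 1)}

        def similarity(page_a, page_b):
            # Jaccard similarity of the shingle sets: 1.0 means identical text.
            sa, sb = shingles(page_a), shingles(page_b)
            return len(sa & sb) / len(sa | sb) if sa and sb else 0.0

        # A block-level variant, as in the paper above, would first split
        # each page on structural boundaries (tables, divs, paragraphs) and
        # compare block by block, so a shared template and navigation don't
        # mask whatever text is genuinely unique.
        feed_text = "low-dust ceramic brake pads engineered for most compact cars"
        page_a = "AutoParts Depot " + feed_text + " order today free shipping"
        page_b = "CarBits Online " + feed_text + " in stock ships same day"

        print(similarity(page_a, page_a))            # 1.0: identical pages
        print(round(similarity(page_a, page_b), 2))  # all matches come from the shared feed text

    The longer the shared feed text is relative to each site's unique copy, the closer that score climbs to 1.0 - which is exactly why product catalogs are the scary case.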
