  1. #1
    Google indexed pages question
    I went to uptimebot and entered my site and found that Google had indexed only a couple hundred of my pages. The site is a database of private schools with almost 30,000 pages, and it has been up for almost five years. Can anyone tell me why the entire contents haven't been indexed after so long?

    The message above says not to post a URL, but it should be in my profile.

  2. #2
    Full Member
    Join Date
    January 18th, 2005
    Location
    Melbourne, Oz
    Posts
    257
    Try this Google search:
    site:www.yoursitename.com esearchforit

    It shows 36,500 results for the word esearchforit on your site.

    Rob
    Last edited by enginez; December 29th, 2005 at 07:07 PM. Reason: removed site URL
    [URL=http://ProxyGrader.com]ProxyGrader.com[/URL]
    How anonymous is your proxy?

  3. #3
    notary sojac Herb ԿԬ
    Join Date
    January 18th, 2005
    Location
    Central/Western NY State
    Posts
    7,741
    Those results apparently depend on which Google index file is being used at the datacenter you're going through. It has been in a constant state of change for months.

  4. #4
    Rob,

    I did the search and got 36,300 using esearchforit, but only 400 or so for eschoolsearch.com. I wonder why there's such a difference.

  5. #5
    Full Member
    Join Date
    January 18th, 2005
    Location
    Melbourne, Oz
    Posts
    257
    Google searches for words on the page. I chose esearchforit because I figured it appears on every page.

    Rob

  6. #6
    ABW Ambassador
    Join Date
    January 18th, 2005
    Location
    Los Angeles
    Posts
    4,053
    The search for site:eschoolsearch.com returns 36,200 pages, but almost all of them are in the Supplemental Index, so they may as well not be indexed. They won't be re-crawled.

    Two of the problems are the duplication and the dynamic structure of the site. Most of them aren't really pages; they're search results with only minor differences between them.

  7. #7
    I think I should add content to every page. I could add a short description of each city (population, location, etc.). Do you think Google would re-crawl the pages if I updated them?

  8. #8
    I asked Google this same question, and this is the answer I got. Great customer service.


    Thank you for writing to us. Due to the tremendous number of requests we receive, we're unable to personally respond to your email at this time. We're always working to provide comprehensive, up-to-date online assistance and encourage you to consult our Help Center at XXXXX. We also encourage you to check out our general support discussion group, where many webmasters and Google users share their questions and expertise. Our general support discussion group is located at xxxxx

    We hope you find these resources helpful, and we appreciate your taking the time to write.

    Regards,
    The Google Team

  9. #9
    ABW Ambassador
    Join Date
    January 18th, 2005
    Location
    Los Angeles
    Posts
    4,053
    Do you think Google would re-crawl the pages if I updated them?
    Again, those aren't "pages" as such; they're dynamically generated search results. Web pages need to be reachable somewhere in a site's navigation, and those aren't. A CGI result isn't a link in the normal sense of the word.

    Great customer service
    Actually, you (and other webmasters) aren't their customers; they're not in the SEO consulting business and don't provide consulting services.

    Your site needs a complete overhaul of its basic structure and architecture before its pages can be fully indexed, and you won't find any department or staff at Google designated and paid to provide that service to webmasters. That's the job of the site owner and whoever is helping with the site.

    Do some searches and find businesses that operate across multiple states and cities and are ranking, with their pages fully crawled and included in the index. That could be vacation accommodations, car rentals, certain types of repair services, or even nationally based ticket brokers: basically, any business whose site covers multiple locations.

    See how they're constructed, including the navigation and what the URLs look like, and you'll get an idea of what you'll need to do.

    Added:

    Also, take a look at what MSN Search has for the site:

    http://search.msn.com/results.aspx?q....com&FORM=QBHP
    Last edited by webworker; December 31st, 2005 at 05:15 PM.

  10. #10
    Again, those aren't "pages" as such; they're dynamically generated search results. Web pages need to be reachable somewhere in a site's navigation, and those aren't. A CGI result isn't a link in the normal sense of the word.
    I did create a site map page where they are listed.

  11. #11
    notary sojac Herb ԿԬ
    Join Date
    January 18th, 2005
    Location
    Central/Western NY State
    Posts
    7,741
    Quote Originally Posted by travisbickle100
    I did create a site map page where they are listed.
    Which kind of site map? One for your visitors done in HTML, or a special sitemap.txt file for Google to consider?
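
    For anyone following along, the Google sitemap Herb is asking about is separate from an HTML site map page: it's a file you submit through Google Sitemaps that simply lists the URLs you want crawled. A rough sketch of a minimal one (the URLs below are placeholders, not the actual site's):

        <?xml version="1.0" encoding="UTF-8"?>
        <urlset xmlns="http://www.google.com/schemas/sitemap/0.84">
          <url>
            <loc>http://www.example.com/schools/new-york/albany.html</loc>
            <lastmod>2005-12-01</lastmod>
          </url>
          <url>
            <loc>http://www.example.com/schools/new-york/buffalo.html</loc>
          </url>
        </urlset>

    A plain text file with one URL per line can also be submitted. Either way, it only tells Google which URLs exist; it doesn't make them rank.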

  12. #12
    ABW Ambassador
    Join Date
    January 18th, 2005
    Location
    Los Angeles
    Posts
    4,053
    A sitemap alone won't do it. It helps get pages into the index, and these already are; but they're in the Supplemental Index, so sitemap or not they won't be re-crawled and they're not likely to rank.

    From Google Help

    The other issue is that, judging by what MSN Search shows, the same meta description is used across the entire site.

    Again, if you look at the results returned for multi-location searches, you can see that those sites have regular pages with regular URLs and regular site navigation pointing to them. No doubt they're dynamic sites, but they're either using mod_rewrite or have URL rewriting built right into the code.
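
    As a rough illustration of the mod_rewrite approach (this assumes Apache, and the script name and parameters below are made up, not the site's actual ones):

        # .htaccess sketch: expose the dynamic search script under clean, crawlable URLs.
        # e.g. /schools/new-york/albany.html -> /cgi-bin/search.cgi?state=new-york&city=albany
        RewriteEngine On
        RewriteRule ^schools/([a-z-]+)/([a-z-]+)\.html$ /cgi-bin/search.cgi?state=$1&city=$2 [L]

    Each city then gets a stable, static-looking URL that regular navigation can link to, even though the page is still generated dynamically.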

    Short of a solution on the technical end, the best route would be to create flat HTML pages with enough original content to be worth indexing, with unique page titles and descriptions, linked from regular, crawlable site navigation. New pages like that are likely to make it into the regular index and stand half a chance.
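
    To make the flat-page idea concrete, here's a rough sketch of what one city page could look like (the title, text and URLs are invented for illustration, not taken from the site):

        <html>
        <head>
          <title>Private Schools in Albany, NY - eSchoolSearch</title>
          <meta name="description" content="Directory of private schools in Albany, New York, with grade levels, enrollment and contact details.">
        </head>
        <body>
          <h1>Private Schools in Albany, NY</h1>
          <p>Albany, the state capital, has roughly a dozen private schools serving grades K-12...</p>
          <!-- plain, crawlable links to the individual school listings -->
          <a href="/schools/new-york/albany/sample-academy.html">Sample Academy</a>
        </body>
        </html>

    The point is that each page has its own title, its own description and its own body text, and is linked from regular navigation instead of being reachable only through a search form.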
    Last edited by webworker; January 1st, 2006 at 02:58 PM.

  13. #13
    The sitemap is an HTML one.

  14. #14
    Comfortably Numb John Powell
    Join Date
    October 17th, 2005
    Location
    Bayou Country, LA
    Posts
    3,432
    I have a site that's about three weeks old with about 20,000 pages. Google indexed 500 pages right off and traffic got up to 100 visits a day. Since then Google has crawled over 8000 pages without any addition to the visible index, and traffic is down to around 8 visits per day.

    Was the early traffic due to the PR not being set yet? I'm thinking that while the 500 pages were visible, Google determined that with PR 0 they would rank low in the SERPs.

    I'm also wondering how often the Google index is updated. I know I've read that somewhere but can't remember.

