  1. #1
    Newbie
    Join Date
    July 15th, 2011
    Location
    London
    Posts
    24
    Question: Have I messed up? Google traffic nosedived.
    I have a site based upon WordPress which aggregates a large number of RSS feeds, which I then use as the index for a custom search engine. Essentially, I'm piggybacking on the WP post database as a search engine index. All fine, but of course I have several thousand posts which are duplicates of content from other sites.

    I do have some unique, decent content, but only a few dozen pages - tiny in comparison to the aggregated content. So, based on advice here and what I've read, I removed everything that's aggregated from the Google index, so it wouldn't negatively affect me. Used a mixture of robots.txt and meta noindex tags, and removed it from the index with G Webmaster Tools. All fine and dandy.
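    (A minimal sketch, in Python, of how you could check which mechanism actually applies to a given URL - robots.txt blocking vs. a meta noindex tag. One thing worth keeping in mind: if robots.txt blocks a crawler from a page, the crawler never fetches it, so it never sees the meta noindex at all. The URLs below are placeholders, not the actual site.)

    # Sketch: report whether a URL is blocked by robots.txt and/or carries a
    # robots/googlebot meta noindex tag. Placeholder URLs - adjust for the real site.
    import re
    import urllib.request
    import urllib.robotparser
    from urllib.parse import urljoin, urlparse

    def check_url(url, user_agent="Googlebot"):
        root = "{0.scheme}://{0.netloc}/".format(urlparse(url))
        rp = urllib.robotparser.RobotFileParser(urljoin(root, "robots.txt"))
        rp.read()
        blocked = not rp.can_fetch(user_agent, url)

        noindex = False
        if not blocked:
            # Only fetch the page if a crawler could - a blocked page's meta tags
            # are never seen by the crawler anyway.
            html = urllib.request.urlopen(url, timeout=10).read().decode("utf-8", "ignore")
            for tag in re.finditer(r'<meta[^>]+name=["\'](robots|googlebot)["\'][^>]*>', html, re.I):
                if "noindex" in tag.group(0).lower():
                    noindex = True
        return blocked, noindex

    for u in ["http://www.example.com/aggregated-post/",
              "http://www.example.com/unique-article/"]:
        blocked, noindex = check_url(u)
        print(u, "| blocked by robots.txt:", blocked, "| meta noindex:", noindex)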

    I've done a Google site:www.mydomain.com search and checked that all my decent unique content is still indexed and fine - it is. No aggregated content - good. Tested my robots.txt with the G Webmaster Tools, all happy.

    Searching for my site title still brings it up top on Google - fine. Searched the keywords I'm aiming for: the ranking hasn't moved, still pretty far down on page 4. Oh well.

    However, checking my Analytics since removing the aggregated content, my Google organic traffic has nosedived - down from ~80 daily visits to 6-ish. Can't work out what's happened.

    Guessing there were other search phrases that worked well because of the aggregated content, but I can't find any likely candidates in Webmaster Tools etc. All the top search phrases listed still give me very similar rankings to before. Of the top landing pages, those that are now noindexed only brought in a small number of 'entrance' hits - very little compared to, say, the home page, which of course is still indexed.

    Now I'm hesitant to revert, as surely all the duplicated content is not a good thing to have indexed. Should I just give it time (it's been 3-4 days so far)? Are there other, more reliable ways to find out how people used to find me on Google, and with what search phrases? Any ideas or tips?
    Last edited by BurgerBoy; September 12th, 2011 at 06:18 AM. Reason: unlinked

  2. #2
    ABW Ambassador superCool's Avatar
    Join Date
    April 23rd, 2008
    Location
    Texas
    Posts
    1,268
    Not sure if this is related, but superCool recently had something similar happen. Had some duplicate content from datafeeds, and some of it was very poorly written trash, so superCool put in noindex for googlebot. Seems like maybe Goog doesn't like that, since all the other pages on the site took a nose-dive. Have not had time to go back and try to remedy the situation. Perhaps Google doesn't like the content even though you've told it not to look at it. After Panda some people suggested noindexing poor pages. Maybe we need to remove the pages completely to please our master... big G.

    Or maybe the drop in page count (and internal links) for the site had an effect?

    Who knows? Traffic: here today, gone tomorrow.

  3. #3
    ABW Ambassador daiarian's Avatar
    Join Date
    April 4th, 2011
    Location
    Beautiful Wales
    Posts
    602
    "so superCool put in noindex for googlebot"

    How is this done?

    Regards

  4. #4
    ABW Ambassador superCool's Avatar
    Join Date
    April 23rd, 2008
    Location
    Texas
    Posts
    1,268
    superCool put this in the head of the bad pages

    <meta name="googlebot" content="noindex, follow">

    superCool is not recommending this - please research before doing it

  6. #5
    Newbie
    Join Date
    July 15th, 2011
    Location
    London
    Posts
    24
    I used the SEO Ultimate WordPress plugin, which lets me tailor which bits of the site have noindex/nofollow etc. It inserts this into the header on mine:

    <meta name="robots" content="noindex,follow" />

    G Webmaster Tools shows up loads of pages that have been (intentionally) blocked by robots.txt. Not sure what is linking to them.
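    (For the "not sure what is linking to them" part, here is a rough sketch, assuming the pages are publicly fetchable, that crawls a few of your own still-indexed pages and lists any internal links pointing at URLs your robots.txt disallows. The start URLs are placeholders.)

    # Sketch: starting from a few known-indexed pages, list internal links that
    # point at URLs disallowed by robots.txt - i.e. what is linking to the
    # blocked pages. Start URLs are placeholders.
    import urllib.request
    import urllib.robotparser
    from html.parser import HTMLParser
    from urllib.parse import urljoin, urlparse

    START_PAGES = ["http://www.example.com/", "http://www.example.com/about/"]

    class LinkCollector(HTMLParser):
        def __init__(self):
            super().__init__()
            self.links = []

        def handle_starttag(self, tag, attrs):
            if tag == "a":
                for name, value in attrs:
                    if name == "href" and value:
                        self.links.append(value)

    site = urlparse(START_PAGES[0]).netloc
    rp = urllib.robotparser.RobotFileParser("http://" + site + "/robots.txt")
    rp.read()

    for page in START_PAGES:
        html = urllib.request.urlopen(page, timeout=10).read().decode("utf-8", "ignore")
        collector = LinkCollector()
        collector.feed(html)
        for href in collector.links:
            target = urljoin(page, href)
            # Internal links only, and only those robots.txt blocks for Googlebot
            if urlparse(target).netloc == site and not rp.can_fetch("Googlebot", target):
                print(page, "links to blocked URL:", target)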

    Thing is, I can't find any search queries I've drastically dropped in ranking for. Not that I was very high to begin with. That's the mystery - I can't put my finger on which queries were bringing me the Google traffic before...

  7. #6
    Moderator
    Join Date
    April 6th, 2006
    Posts
    2,689
    Are you sure the drop came as soon as you added the "noindex"..?

    Having said that, there are a few missing pieces - in Webmaster Tools, there is a lag of 2-3 days from when you make changes.

    Have you seen the "pages crawled per day" drop like a stone, as I would expect if you told G not to index thousands of pages...?

    Think about it - if you tell search engines that thousands of your pages shouldn't be indexed... what's left of your site? The internal linking structure will be wiped out as well.

    I would put the aggregate pages back.. and wait a few days. Perhaps there was some element of uniqueness to your approach - it's hard to know if your traffic will automatically bounce back to previous levels, but I would put the site "back" until you can figure out what the next step will be.

    One more question: Have you reviewed the most popular pages in the past 30-60 days..? If they are pages no longer indexed, you have your answer (even if you can't identify the referral).
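    (One rough way to run that check, sketched in Python: export the last 30-60 days of top landing pages from Analytics as a CSV and flag which of them fall inside the now-noindexed section. The column names and URL prefixes below are assumptions for illustration, not taken from this thread.)

    # Sketch: flag which top landing pages (from an Analytics CSV export) sit in the
    # section that was noindexed. "Landing Page" / "Entrances" column names and the
    # URL prefixes are hypothetical - adjust to your own export and site structure.
    import csv

    NOINDEXED_PREFIXES = ("/aggregated/", "/feed-items/")  # hypothetical paths

    with open("landing_pages_last_60_days.csv", newline="") as f:
        rows = list(csv.DictReader(f))

    rows.sort(key=lambda r: int(r["Entrances"].replace(",", "")), reverse=True)

    lost = 0
    for row in rows[:50]:  # top 50 landing pages
        page = row["Landing Page"]
        entrances = int(row["Entrances"].replace(",", ""))
        if page.startswith(NOINDEXED_PREFIXES):
            lost += entrances
            print("now noindexed:", page, "-", entrances, "entrances")

    print("entrances that came through now-noindexed pages:", lost)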
    Last edited by teezone; September 12th, 2011 at 12:11 PM.

  8. #7
    Newbie
    Join Date
    July 15th, 2011
    Location
    London
    Posts
    24
    Yes, crawled pages have dropped since my change, but there was a massive spike on the day I removed shedloads of content...

    Most of the 'blocked by robots' crawl errors refer to a zillion WordPress archive pages - per-month archives, pages 1-300, etc. I have no links on the site to the archives now. All internal links to non-indexed stuff have nofollow on them (did that some time ago, actually). And a lot of the entries are WordPress posts referenced by ID, even though I have permalinks enabled, so no link is ever shown in that form. Besides, all the aggregated content has permalinks back to the source... although my RSS feed, I believe, references things that way.

    It's as if Google has twigged it's a WordPress site, then blitzed its way around the whole site, second-guessing everything based on the standard WordPress URL structure - which in itself leads to duplicate content, the same pages presented in numerous ways around the site with different URLs. The WordPress Codex actually recommends blocking a lot of this with robots.txt, just in general, to minimise the duplicate content shown to Google:
    https://codex.wordpress.org/Search_E...t_Optimization

  9. #8
    Moderator
    Join Date
    April 6th, 2006
    Posts
    2,689
    Wordpress aside.. so you noindexed the pages after you saw a drop in Webmaster Tools..?

    If you were making changes because G had dropped your site ranking, you may have a long road in front of you. Your original content may generate some traffic.. but if the most popular pages were aggregated pages, I can't see how you would get that back.

    Google used to favor Wordpress sites, but that has changed.. platform aside, thousands of pages of duplicate content will eventually be flagged by the search engines.

    You also need to start adding original content daily, while you continue to figure this out.. and what about the "popular" pages (ie. the most visits)... were they original, or duplicate..?

  10. #9
    Newbie
    Join Date
    July 15th, 2011
    Location
    London
    Posts
    24
    Nope, I wanted to hide the dup content to get better traction in Google. The drop occurred after I noindexed stuff.

    I understand the need for unique content (I have some, and I'm continually building it up); I just didn't want the dup content holding me back. However, I require it in my DB for indexing, for user searching and for presenting as results. I also have a section where users can browse the latest additions.

    I've also noticed my AdSense ads are blank on my noindexed pages. I'm guessing that because the page isn't indexed, it has no idea what the content is and so can't decide which ads to show. Is this correct?

  11. #10
    Moderator
    Join Date
    April 6th, 2006
    Posts
    2,689
    The drop occurred after I noindexed stuff.
    Put everything back.. and then reassess the site structure. I just don't think it's a good idea to tell search engines to ignore thousands of pages at the drop of a hat.

    As I said before, if you weren't being penalized for the duplicate content, don't pull the plug on those pages.

  12. #11
    Newbie
    Join Date
    July 15th, 2011
    Location
    London
    Posts
    24
    Yeah, looks like that is going to be the best plan. I was hoping for a boost (at some point) in Google rankings from losing the dup content, but yeah, maybe it wasn't causing me as much trouble as I first thought...

    You definitely don't think it's worth waiting a while, just to see how things pan out?

  13. #12
    Newbie
    Join Date
    July 15th, 2011
    Location
    London
    Posts
    24
    I'm still not sure this really is bad. Looking more closely at my Analytics: yes, visits are down since the big noindex purge, but since then time on site, pages/visit and new visits have all increased - looking at visits from Google organic in isolation. Don't know if this is skewed somewhat by the small number of visits.

    Considering that, and that my top search queries still give me similar rankings to before, I'm still uncertain which way to go.

  14. #13
    Moderator
    Join Date
    April 6th, 2006
    Posts
    2,689
    However since then time on site, pages/visit and new visits have increased
    Well, now you're looking at things differently, which is a good thing... for affiliate sites, higher traffic doesn't mean higher sales. If you have dropped non-converting traffic, that's not a great loss..

    There is no right or wrong answer, but if you start building original content daily, you will start to see increased numbers. They may not bounce back to previous levels for a while, but I think you will be better off in the long run.

    It might be a shock to see the drop now, but I don't think the duplicate content will ever really pay off. Maybe it's better to have control over it now, rather than taking the hit later.

    At any rate, 2-3 days is too short a period to really assess the damage (if any!).

  15. #14
    Newbie
    Join Date
    July 15th, 2011
    Location
    London
    Posts
    24
    Exactly, I'm beginning to wonder if I was drawing in a pile of untargeted traffic that wasn't helping me. I'm going to sit on it for a week and see how it goes. 2-3 days isn't long, I could just be panicking and jumping the gun.

  16. #15
    Moderator
    Join Date
    April 6th, 2006
    Posts
    2,689
    You have the right attitude.. and you're not the first to be faced with these decisions; if you go back a few years, you will find some of my own panicked threads.

    When Google rolled out the first wave of Panda, I posted some suggestions, and continue to tweak my own sites accordingly:

    http://www.abestweb.com/forums/searc...te-142384.html

    It's hard to be patient, but taking a step back usually helps.

  17. #16
    Moderator
    Join Date
    October 16th, 2007
    Location
    Neenah, WI
    Posts
    682
    The big G sure is getting more and more picky with their algorithm. My biggest fear is that they will eventually categorize and devalue "affiliate" sites no matter how valuable the content.

    Not sure if you are losing traffic from a drop in the SERPs or from other sources of traffic?

    G likes to look at your site overall and prioritize which keywords your site supports and the strength of the supporting semantics. Sometimes page "A" supports a keyword phrase on page "B" by further establishing the relevance of that topic. When you start removing pages from being indexed and/or followed, you can remove that support for another page. I'm not talking about flowing PageRank, just semantic or direct keyword support for the relevance of your site and how it relates to that page.

    Removing content from G can damage your site's authority for the subject/topic and affect your SERPs.

    Content is king, and looking at your site's content as a whole is just as important as looking at any individual page.

    Just throwing that out there as it seems relevant(ish) here. Hope it sheds some light on this community's SEO thought process.

  19. #17
    Newbie
    Join Date
    July 15th, 2011
    Location
    London
    Posts
    24
    Think I get what you're saying.
    Other sources of traffic have been fine. Rank in the SERPs remains the same as before the purge.

    The keywords coming up in Analytics and Webmaster Tools aren't what I expect or want. It seems the duplicate content was pushing misleading keywords to the forefront - things like the names of the duplicated sources, etc. - which isn't helping me. I'm hoping that now the intended keywords from my unique content will start to show through.

  20. #18
    Newbie
    Join Date
    August 5th, 2011
    Location
    kokkedal denmark
    Posts
    1
    New content website
    Write unique content as well, to set your website apart.
    Add things like videos and pictures to the page.
    Update pages regularly.

  21. #19
    ABW Ambassador superCool's Avatar
    Join Date
    April 23rd, 2008
    Location
    Texas
    Posts
    1,268
    Arfa, just noticed the paragraph below in Google's duplicate content guidelines. Don't know if "no longer recommends" really means "will kill your search traffic", but it's probably best not to do things G does not "recommend".

    Google no longer recommends blocking crawler access to duplicate content on your website, whether with a robots.txt file or other methods. If search engines can't crawl pages with duplicate content, they can't automatically detect that these URLs point to the same content and will therefore effectively have to treat them as separate, unique pages. A better solution is to allow search engines to crawl these URLs, but mark them as duplicates by using the rel="canonical" link element, the URL parameter handling tool, or 301 redirects. In cases where duplicate content leads to us crawling too much of your website, you can also adjust the crawl rate setting in Webmaster Tools.
    Duplicate content - Webmaster Tools Help
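    (In that spirit, a small sketch of how you could spot-check that aggregated posts carry a rel="canonical" pointing back at the original source instead of being blocked outright. The mapping of local URLs to source URLs is made up for illustration, and the regex assumes the usual attribute order.)

    # Sketch: verify that each aggregated page's rel="canonical" points at the
    # original source article, per the guideline quoted above. The URL mapping
    # is illustrative only.
    import re
    import urllib.request

    EXPECTED = {
        "http://www.example.com/aggregated-post-1/": "http://source-site-a.com/article-1/",
        "http://www.example.com/aggregated-post-2/": "http://source-site-b.com/story-2/",
    }

    for page, source in EXPECTED.items():
        html = urllib.request.urlopen(page, timeout=10).read().decode("utf-8", "ignore")
        match = re.search(r'<link[^>]+rel=["\']canonical["\'][^>]+href=["\']([^"\']+)["\']',
                          html, re.I)
        canonical = match.group(1) if match else None
        print("OK" if canonical == source else "MISMATCH", page, "->", canonical)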

  22. #20
    Moderator BurgerBoy's Avatar
    Join Date
    January 18th, 2005
    Location
    jacked by sylon www.sylonddos.weebly.com
    Posts
    9,618
    I don't ban anything from G and my sites do really well. They are all datafeed sites and I haven't had any problems from G at all. I've been doing this since 1998.

    I do submit sitemaps to G and resubmit whenever I add more merchants.

    Vietnam Veteran 1966-1970 USASA
    ABW Forum Rules - Advertise At ABW
