July 21st, 2011, 01:02 PM #1
WordPress with aggregated content
I have a WordPress-based site which aggregates lots of RSS feeds and then lets the user search them (as well as merging in search results from elsewhere). Essentially I'm using the WordPress post database as an index for my own custom search engine. Once the content passes a certain age, I ditch it. For visitors this makes a good specialised search engine and a nice aggregated feed of the latest posts.
Of course, aggregated and duplicate content is seriously going to pull down my Google ranking. I am bolstering the site with good original content to back things up, but I'm guessing it's going to be an uphill struggle to work against the aggregated content I'm using.
I was thinking: can I tell spiders not to index the aggregated content and only index the rest, using rel="nofollow", robots.txt, noindex meta tags, etc.? Is this likely to remedy the situation?
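For reference, the kind of directives I have in mind would look something like this (the /aggregated/ path and the exact meta values are just a hypothetical sketch, not how my site is actually laid out):

```
# robots.txt — block crawling of the aggregated section (hypothetical path)
User-agent: *
Disallow: /aggregated/
```

```html
<!-- Or, in the <head> of each aggregated post's page:
     keep the page out of the index but still let link equity flow -->
<meta name="robots" content="noindex,follow">
```
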
Are there any other strategies I should pursue?
July 21st, 2011, 05:36 PM #2
Google also obeys googleon/googleoff tags. Search for those if you're not familiar with them.
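For anyone reading along, a sketch of how those tags wrap a block you don't want indexed. They're HTML comments placed around the content; the surrounding markup here is just a made-up example, and note they're documented by Google for the Google Search Appliance:

```html
<p>Original editorial content, indexed normally.</p>

<!--googleoff: index-->
<div class="aggregated-items">
  <!-- aggregated feed items the crawler should skip -->
</div>
<!--googleon: index-->
```
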
Also, if you can modify the header meta tags on those pages, put in noarchive tags. Google will still archive pages that lack the noarchive tag, even if you have noindex tags or block them in robots.txt.
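Since the site is WordPress, one way to emit that tag only on the aggregated posts. This is a rough sketch, assuming the aggregated items are filed under a hypothetical category slug of "aggregated" and that the function name is yours to choose:

```php
<?php
// In your theme's functions.php or a small plugin (sketch, untested):
function myprefix_noindex_aggregated() {
    // 'aggregated' is a hypothetical category slug for the imported feed items.
    if ( is_singular( 'post' ) && in_category( 'aggregated' ) ) {
        echo '<meta name="robots" content="noindex,noarchive">' . "\n";
    }
}
add_action( 'wp_head', 'myprefix_noindex_aggregated' );
```

That hooks into wp_head, so the tag lands in the page's head section without touching the theme templates.
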
Cheeky bastards, eh!?!