  1. #1
    Newbie Sir Isaac Lime's Avatar
    Join Date
    January 25th, 2006
    Posts
    8
    Using other people's articles - Duplicate content penalties?
    While I am indeed writing articles of my own, I've thought about using some I found on a few article sites. But I do have a few concerns about doing so.

    I've heard of a "duplicate content" issue at Google: according to some, Google will penalize a site's PageRank (and possibly backlink count) for using content that's already widespread. Now, the content in particular I'll be using isn't exactly widespread, since my market isn't overrun by affiliates yet, but I'm still cautious about using it.

    Assuming this is all true, it leads me to the idea of using both my own articles and others I find on article sites. If I do so, what might be a good ratio, just to play it safe? Secondly, what success or failure have others had using other people's articles for content?

  2. #2
    Antisocial Media Expert ProWebAddict's Avatar
    Join Date
    March 25th, 2006
    Location
    Go Daddy
    Posts
    1,109
    The truth is it all depends on who you ask.

    Interestingly enough, I was just reading Alice Seba's blog about this. http://www.aliceseba.com/2006/03/hmm...duplicate.html

    Personally, I have not suffered from using content found on article directories, nor have I suffered when other people have used my articles. However, for every person who says it's worked fine for them, there will be ten others who say they have been dropped by Google for such-and-such, and vice versa.

  3. #3
    All Around Web Guy Cursal's Avatar
    Join Date
    January 18th, 2005
    Posts
    829
    A few very carefully chosen articles (5-8) for a niche site have worked OK.
    Mass usage of articles, or template article sites, has done little to nothing, even though all pages were SEO'd for Google.

    If you have the time to write your own content, that is your best bet.

    I have noticed better-targeted traffic and click-throughs when I have written a short piece about a product or service and then had the link right there with it, both text links and image links.

    I've never been banned or dropped for dup content; it just doesn't get picked up by the bots and spiders.
    Oregon Publishing: Web Development, Graphic Design, Domains & Marketing
    Deluxe Banners Bartender's Guide Cooking Jobs

  4. #4
    Super Sh!t Stirrer SSanf's Avatar
    Join Date
    January 18th, 2005
    Posts
    9,944
    I think a lot depends on what else is on your page other than the article, since the SEs apparently go by some percentage of repeated content, not just the fact that there is some. Otherwise, everyone who used "Welcome to my site!" would have a dup content penalty.

    This is my own solution: I may use the same datafeed over numerous sites, but all my sites have 7 to 8 SSI includes on each page. By the time I add the includes, significant portions of every page are different from every other page, if you see what I mean. The pages don't read the same to the SEs at all.

    A percentage of the page is made up of the includes, and that makes the page carrying the article significantly different from any other page on the internet. And the article or the products change a significant portion of each page, so that it is different from every other page on the same site.

    Every site has its own unique set of includes. The SEs read the includes as if they were actually in the HTML of each page. Sometimes I even use a blank txt file for an include because it isn't needed at the moment, but I can quickly put something in that spot if it's needed, such as an announcement, and instantly have it run site-wide.

    I can also "update" every page on my sites by simply changing the includes, so that all the pages have "fresh" content, meaning there is verbiage on each page that wasn't there the last time the SEs looked at it, so each individual page reads a little differently than it did before.

    Using virtual includes, you can change every page on the site at once. If you wish, though, you can use regular includes to change pages in single categories only. You have to decide which way you want to do this before you set up the site, or you will be doing a lot of rebuilding.

    I am lazy so I do it site wide. Just thought I would mention the other possibility.
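    A minimal sketch of the kind of page template described above (filenames and markup are made up for illustration; this assumes Apache with mod_include enabled and SSI processing turned on for the page):

    ```html
    <!-- Hypothetical article page. Each include can start life as an
         empty .txt file; drop text into it later and every page that
         references it updates site-wide on the next request. -->
    <body>
      <!--#include virtual="/includes/announcement.txt" -->
      <h1>Article Title</h1>
      <p>Article body goes here...</p>
      <!--#include virtual="/includes/sidebar-links.txt" -->
      <!--#include virtual="/includes/footer-blurb.txt" -->
    </body>
    ```

    Because the server splices the include files in before the page is served, a crawler sees the combined text exactly as if it had been written into the HTML by hand.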

    I think the real trick is to make the totality of the page unique enough to avoid dup content. SEs work from mathematical equations. I would venture to guess that the longer the article, the more verbiage from other sources should be on the page.
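    To illustrate that "mathematical equations" point: one technique commonly believed to underlie near-duplicate detection is shingle overlap (Jaccard similarity over word n-grams). Google's actual filter is not public, so this Python sketch is purely illustrative of the percentage idea:

    ```python
    def shingles(text, n=4):
        """Break text into the set of overlapping n-word 'shingles'."""
        words = text.lower().split()
        return {" ".join(words[i:i + n]) for i in range(len(words) - n + 1)}

    def similarity(a, b, n=4):
        """Jaccard similarity of two texts' shingle sets (0.0 to 1.0)."""
        sa, sb = shingles(a, n), shingles(b, n)
        if not sa or not sb:
            return 0.0
        return len(sa & sb) / len(sa | sb)

    article = "the quick brown fox jumps over the lazy dog again and again"
    page_a = "welcome to my site " + article  # article plus a little boilerplate
    page_b = "totally different words about cooking recipes and kitchen tools here"

    print(similarity(page_a, page_a))  # identical pages -> 1.0
    print(similarity(page_a, page_b))  # unrelated pages -> 0.0
    ```

    Padding a duplicated article with enough unique surrounding text (includes, product blurbs, and so on) lowers this kind of overlap score, which is exactly the effect the post describes.
    
    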
    Last edited by SSanf; March 27th, 2006 at 08:16 AM.
    Comments are opinion unless otherwise noted. Remember, pillage first. Then burn. Half of all people in the world have IQs under 100. You best learn to trust ol' SSanf!

  5. #5
    Member
    Join Date
    December 12th, 2005
    Posts
    87
    But it is so easy to solve the dup content problem: you can put an RSS newsblock right into the article body. The RSS news items are pulled for your keyword, so the content will be relevant. And readability is not affected (unlike article scramblers that shuffle paragraphs and words, which give a nice vomiting effect).

    Plus, even if you have thousands of articles in your article directory, it is very easy to automate putting a relevant RSS newsblock on all the pages with just one macro. I built this into one of my Internet marketing tools, and it helps. Plus, the clients are satisfied.

    You can use this experience too. It really helps.
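    The poster's tool isn't shown, but the general idea, pulling a few keyword-relevant headlines server-side and rendering them next to the article, can be sketched with Python's standard library (the feed content, URLs, and markup here are stand-ins; in practice you would fetch a real feed over HTTP):

    ```python
    import xml.etree.ElementTree as ET

    # A stub feed stands in for whatever keyword-matched RSS source you pull.
    SAMPLE_RSS = """<?xml version="1.0"?>
    <rss version="2.0"><channel>
      <item><title>Widget prices fall</title><link>http://example.com/1</link></item>
      <item><title>New widget standard announced</title><link>http://example.com/2</link></item>
    </channel></rss>"""

    def newsblock_html(rss_xml, max_items=5):
        """Render the first few feed items as a simple HTML list."""
        root = ET.fromstring(rss_xml)
        items = root.findall(".//item")[:max_items]
        lines = ["<ul class='newsblock'>"]
        for item in items:
            title = item.findtext("title", default="")
            link = item.findtext("link", default="#")
            lines.append(f"  <li><a href='{link}'>{title}</a></li>")
        lines.append("</ul>")
        return "\n".join(lines)

    print(newsblock_html(SAMPLE_RSS))
    ```

    Because the HTML is generated on the server, the headlines end up in the page source where crawlers can read them, unlike client-side JavaScript widgets.
    
    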
    [LEFT][B]Good partners = top positions[/B]
    see screenshots with [url=www.autolinkexchanger.com]PROOFS[/url]
    [url=www.autolinkexchanger.com]Automatic Reciprocal Links Exchange Script[/url]
    [url=www.profitlinking.com]Link exchange service[/url] - best price in the industry + [url=www.profitlinking.com/free-service-contest.htm]free contest[/url][/LEFT]

  6. #6
    The Google-hatesme! Guy
    2006 Winner Lazarus award
    PM Me....
    aahh's Avatar
    Join Date
    May 26th, 2006
    Posts
    36
    There are a couple things to consider. Most people agree that the duplicate content filter is not a "penalty" as such. It's simply a filter that Google uses to avoid having the same content show up dozens of times in one search.

    The problem arises when Google has to determine who "owns" the content. If your content is halfway successful in the search engines, it will be copied by hundreds of spam mirroring sites almost immediately. So who gets credit for the original content?

    The answer seems to have to do with who has the inbound links -- in other words, "PageRank". It's popular nowadays to dismiss PageRank as yesterday's SEO goal -- but when it comes to avoiding the duplicate content filter, PageRank still seems to matter.

    Of course, that makes the answer to your question complicated: if you have a high-PageRank site, you can put other people's content on it and get the other copies filtered out -- while if your rank is too low, your content can be filtered out in favor of the spam sites mirroring you.

    Yes, it sucks to be the little guy.

    My advice: if you're a little guy, make sure to have good interlinking on your site and then build inbound links to each of your content pages to make sure you get "credit" for your own content. You can do this by creating content RSS feeds linking directly back to each content page. It's a bit of work initially but it's worth the time investment to protect your own work.
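    The RSS suggestion above can be sketched with Python's standard library: generate a small RSS 2.0 feed whose items each link straight back to one of your content pages (the site name and URLs here are hypothetical):

    ```python
    import xml.etree.ElementTree as ET

    def build_feed(site_title, site_url, pages):
        """Build an RSS 2.0 feed with one <item> per content page,
        each linking directly back to that page."""
        rss = ET.Element("rss", version="2.0")
        channel = ET.SubElement(rss, "channel")
        ET.SubElement(channel, "title").text = site_title
        ET.SubElement(channel, "link").text = site_url
        for title, url in pages:
            item = ET.SubElement(channel, "item")
            ET.SubElement(item, "title").text = title
            ET.SubElement(item, "link").text = url
        return ET.tostring(rss, encoding="unicode")

    # Hypothetical content pages on a hypothetical site.
    feed = build_feed("Example Widgets", "http://example.com/", [
        ("Choosing a Widget", "http://example.com/choosing-a-widget.html"),
        ("Widget Care Guide", "http://example.com/widget-care.html"),
    ])
    print(feed)
    ```

    Anywhere the feed gets syndicated, each item contributes an inbound link pointing at the individual content page rather than just the home page, which is the "credit" effect the post describes.
    
    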

    Hope this helps,
    Chad

  7. #7
    ABW Founder Haiko de Poel, Jr.'s Avatar
    Join Date
    January 18th, 2005
    Location
    New York
    Posts
    21,609
    Chad,

    [Bit off topic]
    Have you ever seen [after any major Google update] a correlation between inbound and outbound link percentages affecting a site's PR?
    [/Bit off topic]
    Continued Success,

    Haiko
    The secret of success is constancy of purpose ~ Disraeli

  8. #8
    The Google-hatesme! Guy
    2006 Winner Lazarus award
    PM Me....
    aahh's Avatar
    Join Date
    May 26th, 2006
    Posts
    36
    Lightbulb Link age and PR
    First, remember that the PageRank as reported by the Google Toolbar does not necessarily represent actual "PageRank". The main purpose of the PageRank toolbar nowadays seems to be to confuse SEO experts.

    (very controversial) I've never discerned ANY correlation between outbound links and PR or SERPs. Outbound links (sadly) can even be dangerous because they open you up to Google's array of "invisible" penalties and also Google bowling attacks by your competitors.

    As far as inbound link age: I'm not really a confirmed believer in the "link sandbox". Google seems to give a boost to new content for a while and then those pages sink down to their correct level. After that, it's all about inbound link quantity and quality.

    Older sites do tend to do much, much better in the SERPs -- but then they also tend to have lots of links from "trusted" sites. Case in point: I have about a hundred domains and have actually worked to build links to only a small handful of them. The ones with many good links have risen in the SERPs, and the others have remained in the "sandbox" of obscurity. Most people who give "sandbox" examples do not seem to appreciate the importance of inbound links. In my opinion, it's all about the number and quality of inbound links -- not age.

    One useful tip for link building: try to focus each page of content on just one keyword phrase. It's already a struggle to win a position in the search results without "diluting" the page's reputation with multiple keywords. Trying to 'focus' a page on multiple keywords is ... well, not focused. With proper keyword focus, it's pretty easy to capture a low-competition keyword phrase.

    Wash, rinse, and repeat with 500 pages focused on low-competition keywords, and suddenly you're doing far better through a "long-tail" approach than you could ever do fighting it out for one of the top-ten keywords.

    Regards,
    Chad

  9. #9
    Newbie
    Join Date
    October 12th, 2005
    Posts
    6
    I have heard it is best to post your article on your site a few days before you submit it, and let the engines have a chance to find it. I don't know if that helps, but I use other articles, and some of the pages I use them on do very well in the search engines, so I don't think it is a big problem right now. Who knows what the future holds, though.

    Clint Pollard

  10. #10
    The Google-hatesme! Guy
    2006 Winner Lazarus award
    PM Me....
    aahh's Avatar
    Join Date
    May 26th, 2006
    Posts
    36
    Hi Clint,

    Actually, it is probably best not to submit it at all. Submitting pages (at least to Google) doesn't actually seem to do anything. If your page has enough links to compete in Google, it will be found without submitting.

    On the other hand, one little-known trick is to link to your page or site several months before you actually create the page. I know that sounds odd, but it's useful if you want to launch a project and don't want the PR to start at zero (which looks bad). Just link to it for a few months first, and when the site or page is created, it will ALREADY HAVE PAGERANK. Hehe... This isn't really an SEO trick; it's just a side effect of the way PageRank is calculated.

    - Chad

  11. #11
    What's the word? Rhia7's Avatar
    Join Date
    January 13th, 2006
    Posts
    9,578
    Quote Originally Posted by aahh
    On the other hand, one little-known trick is to link to your page or site several months before you actually create the page.

    - Chad
    Wouldn't the page get a 404 not found?
    ~Rhia7 -- Remember the 7
    Twitter me

  12. #12
    The Google-hatesme! Guy
    2006 Winner Lazarus award
    PM Me....
    aahh's Avatar
    Join Date
    May 26th, 2006
    Posts
    36
    Quote Originally Posted by Rhia7
    Wouldn't the page get a 404 not found?
    Sure, until you put something there. But this has no effect on PageRank. PR would still be assigned, so as soon as you place a page there, it "magically" has rank already.

    Remember, this is not a trick to get your page into search results; it's mainly useful if you plan to launch something and don't want your page to look amateur by having a PR0.

    - Chad

  13. #13
    Member begabloomers's Avatar
    Join Date
    April 23rd, 2006
    Location
    New South Wales, Australia
    Posts
    128
    "Sometimes I even use a blank txt file for an include because it isn't needed at the moment, but I can quickly put something in that spot if it's needed, such as an announcement, and instantly have it run site-wide.

    I can also "update" every page on the sites by simply changing the includes so that all the pages have "fresh" content meaning that there is verbiage on each page that wasn't there the last time the SEs looked at it so each individual page reads a little differently than it did before.

    Using virtual includes..."

    I am still working on my site (a very slow process) and didn't want to have to manually change each page every few weeks. If possible, could you let me know how to set up a "blank include", and how to update that include once and have the change appear on every page?

    Thanks Susan
    :Todd:

  14. #14
    Prince of Content Vinny O'Hare's Avatar
    Join Date
    January 18th, 2005
    Posts
    3,126
    I stopped using other people's articles; it gives peace of mind to write your own. But with all the scraper sites stealing my stuff, I often wonder if I am getting credit for being the first one into the search engine.
    Vinny O'Hare - OPM - Contact Info email: vinny at teamloxly.com ~ 702-582-6742 Twitter

  15. #15
    Newbie
    Join Date
    October 12th, 2005
    Posts
    6
    Getting Credit
    nyfalcon, from what I understand, if the link in the resource box points back to your site, you will get the credit for the article.

    Clint Pollard

  16. #16
    Newbie
    Join Date
    December 18th, 2006
    Posts
    1
    I have an answer
    Hi all,

    I have done a lot of reading on this topic in the past. First off, a lot of you are totally right about the use of duplicate content.

    But let's face it, a lot of websites use duplicate content. I'm sure millions of sites use it and still get high PR and traffic from the search engines.

    It's about using the right combination of old and newer content on your sites and blogs to drive traffic. And yes, by simply adding RSS feeds around your older articles, they will not be penalized for being duplicate content.

    Remember that RSS feeds should complement your content. Overusing them by placing 10 headline feeds all over your page will have the opposite effect. Everything in moderation. You are better off using 1 or 2 RSS feeds with 3-5 headlines per page.

    But if you add the RSS via JavaScript, the spiders can't read it, so it's all for naught. Also, snippets throughout your articles will help you achieve good results.

    It's almost impossible to build large sites with all-original content. And why even try, unless you have extra time on your hands? A good mix of new and old will bring traffic, and your visitors will like the fact that you have a lot of info, as opposed to 6 or 7 pages before they're off to another site.

    PLR articles can help you add a lot of content in a shorter amount of time, but you will have to rewrite about 30%-35% for the articles to be considered unique. Far less work than writing them all yourself.

    As for duplicate penalties: as long as you consistently update with a mix of old and new articles, use RSS feeds (not in JavaScript), actively exchange links, and submit articles with your links in the author's resource box, you will drive traffic to your websites at a steady pace.

    jaambees
    (edited out manually added sig/link)
    Last edited by Trust; December 19th, 2006 at 07:06 PM. Reason: edited out manually added sig/link

  17. #17
    ABW Ambassador
    Join Date
    January 4th, 2006
    Location
    USA
    Posts
    2,477
    It is interesting that jaambees dug out this old thread...welcome jaambees.

    I was thinking about asking the same question the other day. I'm more concerned about the legitimacy issue than about page ranking when it comes to duplicate content.

    When you use other people's articles, do you state the source clearly on the page, or simply quote the whole article or part of it?
