Debunking the Myth That Search Engines Hate Duplicate Content
I’ve been asked if I agree with the idea that Google dislikes duplicate content and that professionals should reduce, or refrain from, syndicating (distributing) their content on the Web.
I think there’s a huge misunderstanding out there about how to distribute your content without hurting your Web site in the eyes of search engines.
If you publish content, you should have the first, original content on your Web site. No doubt. However, you can “repurpose” it and strategically distribute it on the Web, and this will not hurt your search engine/Google SEO efforts.
This is actually something I have used to help my clients quantifiably increase their traffic rankings, site visits, leads and sales.
I call it SONAR. This method of repurposing and synchronizing content (whether text, audio or video) distribution into various targeted channels allows companies, publishers, entrepreneurs—basically anyone with content on their Web site—to ultimately turn traffic into sales.
The method represents the following online distribution platforms:
- S Syndicate partners, content syndication networks, and user-generated content sites;
- O Online press releases;
- N Network (social) communities;
- A Article directories;
- R Relevant posts to blogs, forums, and bulletin boards.
Case study: The technique helped increase traffic ranking and visits to an alternative health Web site by 3,160 percent and 81.5 percent, respectively, in three months. And in four months, visits to an investment Web site increased by nearly 80 percent, while its traffic ranking improved by nearly 150 percent. Plus, the traffic to this investment site was monetized for an ROI of 221 percent.
So not only were these Web sites’ search marketing efforts not hurt by content syndication, they actually improved dramatically in rank, visits and sales.
The key is to repurpose the content. Tweak, or reposition, an original article on your Web site with minor changes (for example, to the headline, intro paragraph, closing, etc.) and then syndicate it on other Web sites.
Google looks for the best version of a piece of content: where it originated. That’s usually your Web site, and it’s typically that site that gets the “search engine credit,” so to speak, via the higher ranking in the organic results listing.
In Google’s view, duplicate content is more hurtful for a site if it’s posted in more than one place on that same domain (not via syndication on other Web sites, such as through online press releases, article directories, etc.).
For example, you may have two pages on your site with virtually the same content. In most cases, Google notices this and ignores one of those Web pages. You will not get both of those listings in the search engine organic results—just one of them.
Keep in mind, Google is especially likely to treat your pages as duplicates if their page titles and meta descriptions are the same. So make sure each Web page on your site has unique, relevant tags with targeted keywords.
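As a simple illustration of the point above, here is what unique tags on two similar pages might look like. The page names, titles and descriptions are hypothetical placeholders, not taken from any real site:

```html
<!-- article-one.html: its own title and description -->
<head>
  <title>Natural Sleep Remedies: 5 Herbs That Work</title>
  <meta name="description"
        content="A look at five herbal remedies readers use to fall asleep faster.">
</head>

<!-- article-two.html: covers similar ground, but tags stay unique -->
<head>
  <title>Better Sleep Without Pills: A Beginner's Guide</title>
  <meta name="description"
        content="Practical lifestyle changes for improving sleep quality naturally.">
</head>
```

Even if the body copy overlaps, giving each page its own title and meta description helps the engines see them as distinct, keyword-targeted pages.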
Now if you’re concerned about a Web page on your site and its “printer-friendly” version hurting your Web site’s SEO, simply block the search engines from spidering the printer-friendly version.
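One common way to do that blocking is a robots.txt file at the root of your site. This is a minimal sketch, assuming the printer-friendly copies live under a hypothetical /print/ directory; adjust the path to match your own site’s structure:

```text
# robots.txt — keep crawlers out of the printer-friendly duplicates
User-agent: *
Disallow: /print/
```

Alternatively, if the printer-friendly pages don’t sit in one directory, a robots meta tag in each page’s head (`<meta name="robots" content="noindex">`) tells the engines not to index that individual copy.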
When you think about it, syndicating content is the SEO model of social sites (like Digg, DropJack, StumbleUpon, etc.) and article directories.
So make sure you understand the power of your content and how to syndicate it the right way before you hit the brakes on any search marketing efforts, because doing nothing and not leveraging your content will actually hurt your site.
Wendy Montes de Oca is president of West Palm Beach, Fla.-based online marketing services consultancy Precision Marketing and Media. She can be reached at email@example.com.