This post is authored by Joel Don, ISA's community manager and a social media marketing & PR consultant.
The assembly line created a revolution in manufacturing, enabling companies to make quantum leaps in productivity. But should you apply a similar approach to your industrial digital marketing program, republishing the same content to a wider readership to drive more traffic to your landing pages? In other words, is content "cloning" safe for your industrial marketing program?
Republishing seems like a good idea, especially for industrial businesses that find themselves hard-pressed to enlist employee subject matter experts to contribute to the company blog or guest author for other sites. Although most digital marketing experts regard blogging and guest authoring as vital to optimizing a company’s search value, reports suggest few industrial companies list blogging in job descriptions or MBOs. The general rule of thumb: If it’s not a job requirement, it doesn’t happen.
If you don’t have the budget to pay freelancers or marketing agencies to ghostwrite a constant stream of new posts for your company (yes, it’s more common than you think), then your marketing-savvy competition will probably always rank higher in search. You’re simply not producing enough content or leveraging the power of guest blogging to generate traffic and backlinks to your landing pages.
In other words, you have to continually demonstrate to search engines that your site and its content matter. You can’t hack your way to a top ranking with SEO tricks; companies caught gaming the system have learned this lesson the hard way, because Google rightfully can’t afford to have search results turned into a display of late-night infomercials. Read the New York Times story on the well-documented J.C. Penney SEO fiasco before you buy into that too-good-to-be-true marketing pitch that just landed in your inbox.
The secret to ranking well in search, delivering traffic and creating conversions is generating great content, and lots of it. The challenge: If you are not refreshing site content, guest authoring or otherwise driving legitimate (i.e. not shill or gamed) backlink traffic, search engines basically reduce your PageRank score, dropping your site from the coveted first page of a SERP (search engine results page) to the oblivion of the “next” pages.
So when a deadline for a guest blog post is looming, or you are looking for ways to get extra marketing mileage out of a white paper or case study, why not simply republish existing, high-value content? Here’s why: Back in 2011 Google released its Panda algorithm update, which, among other things, suggested sites posting duplicate content would be punished. Panda created a semi-panic among site owners and an avalanche of arguments and debates in the SEO community.
I have been following the continuing discussion on the duplicate content issue since Panda, and, frankly, for every post that says there is no risk to duplicate content, you can find an equal number of stories sounding the alarm about significantly decreased rankings for re-posting content.
Enter the respected Matt Cutts of Google, head of the web spam team and the public face for all things SEO. Matt posted a one-minute video in 2013 that some felt settled the matter that has caused a collective global anxiety about duplicate content. Basically, he said, don’t worry about it. Here's the video:
https://www.youtube.com/watch?v=Vi-wkEeOKxM
But the concerns continued to simmer. In recent months, SEO specialists such as Neil Patel have said you still have plenty to worry about: duplicate posting can hurt your page rankings. Other SEO marketing firms seem to have confirmed that you may still need to lose sleep at night if you care about your rankings – evidence suggests SERP results do indeed suffer from cloned blog posts. To keep your head spinning, there are plenty of cogent arguments with the opposite take on the issue. Respected digital marketing specialist Andy Crestodina dismisses the hysteria over duplicate posting as much ado about nothing. Andy doesn't re-post blogs he writes as a guest author, but simply for style reasons – he prefers his own company website to offer only original content. That's a powerful marketing statement.
So should you be concerned if you are allowing your content to be published on more than one site? My sense is there is enough hard data to be watchful of your SERP results. It doesn’t hurt to test your content regularly for any impact on rankings. Some marketing and public relations pros have taken a casual attitude about the duplicate posting issue. I asked one very well-known “Top 100” social media leader about a guest post at one site that was then re-posted on the author’s own blog a few weeks later, with a small notation that the piece was originally published elsewhere. The response was slightly cavalier: What, me worry? Clearly, globetrotting social media "gurus" do not always agree.
Let’s take the search engine’s point of view. You write a blog post and place it at four or five sites. How does search figure out how to rank each copy when someone uses keywords related to the post? Should the search engine give more rank to better-known and popular sites, or look for indicators such as backlinks and referrals from social media networks? We’re talking about the carefully guarded secret recipe that is the search algorithm. Plus, just because there are no apparent penalties today for your duplicate posting, there’s no guarantee a future algorithm update won’t disrupt content rankings. And how does the search engine distinguish the primary, or original, source posting from the copies? Luckily, there is a technical way you can clarify your republishing practices with search engines on a post-by-post basis. More about that shortly.
Here’s the issue that nobody talks about when it comes to republishing posts: courtesy and digital etiquette. If you are going to republish an article from your site elsewhere, you owe the other site owners the courtesy of letting them know your content is not original. Steps can then be taken to inform readers that the article was previously published, and site owners can minimize the risk of punitive action by search engines on the lookout for certain types of questionable or suspicious link-building tactics.
Some authors and websites think it’s good enough to add a simple disclosure line with or without a backlink indicating that the article is a duplicate posting. The problem I have always had with that approach is we’re trusting that search engines have tuned their algorithms to interpret such "disclosure" text as a digital version of the “Get Out of Jail Free” card. Yet there’s no proof that adding such disclosure text insulates a duplicate post from down-ranking, since there’s no industry standard for search engine-recognized wording in such disclaimers. So at best it serves as feel-good value for site owners and authors, and a courtesy to readers. That's OK, but offers as much protection as an umbrella in a hurricane.
Now we get to your DIY point-and-click technical options. If your site runs on the WordPress platform, you can easily take simple steps to protect your site from down-ranking as a result of republished content. The same actions can be employed on other blog platforms, as well as HTML sites. Most WordPress site owners use an SEO plugin, and among the most popular are Yoast SEO and All in One SEO Pack. Both plugins enable you to tell the search engines to exclude a post from search indexing and identify the source or original posting of an article or blog. With the plugins, you simply make selections from several choices or enter the URL of the source article:
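Under the hood, both plugins simply add standard markup to the page’s head section. A minimal sketch of the two options as they typically appear in the rendered HTML (the URL below is a hypothetical example, and exact plugin output may vary):

```html
<!-- Option 1: tell search engines not to index this cloned post at all -->
<meta name="robots" content="noindex,follow">

<!-- Option 2: identify the original source article for this post -->
<link rel="canonical" href="https://www.example.com/original-post/">
```

On an HTML site without a plugin, you can paste the appropriate tag into the page’s head by hand and accomplish the same thing.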
Now the question: should you use one or both of these metatags for a cloned post? The answer: it’s probably best to pick just one. Each should accomplish the same goal, though by different means. My preference is to use the canonical URL option, even though a cloned page that points its canonical tag elsewhere won’t earn search ranking of its own. My thinking is the noindex metatag says “ignore me completely,” while the rel=canonical instruction performs the courtesy of recognizing the original source. Who knows – if future search engine algorithms give some credit to sites that identify copies, you might benefit.
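If you republish regularly, it’s worth spot-checking that your pages actually carry the tag you intended. Here’s a minimal sketch of such a check using only the Python standard library; the function name and the sample page are my own illustration, not part of any plugin:

```python
from html.parser import HTMLParser

class RepublishTagParser(HTMLParser):
    """Collects robots meta directives and canonical link targets from a page."""
    def __init__(self):
        super().__init__()
        self.robots = []      # contents of any <meta name="robots"> tags
        self.canonical = []   # hrefs of any <link rel="canonical"> tags

    def handle_starttag(self, tag, attrs):
        a = dict(attrs)
        if tag == "meta" and a.get("name", "").lower() == "robots":
            self.robots.append(a.get("content", ""))
        if tag == "link" and a.get("rel", "").lower() == "canonical":
            self.canonical.append(a.get("href", ""))

def check_republished_page(html_text):
    """Return the robots and canonical declarations found in the page."""
    parser = RepublishTagParser()
    parser.feed(html_text)
    return {"robots": parser.robots, "canonical": parser.canonical}

# Example: a republished post that points back to its original source
sample = """
<html><head>
<link rel="canonical" href="https://www.example.com/original-post/">
</head><body>...</body></html>
"""
print(check_republished_page(sample))
```

In practice you would fetch the live page’s HTML and feed it to the same function; a page with neither tag (and no plugin doing the work) is one search engines must sort out on their own.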
Update: A new post suggests there may be some risks associated with the noindex and rel=canonical tags. The problem appears to affect sites with very high volumes of pages; effectively, Google crawlers "choke" on processing large quantities of them, resulting in missed SEO opportunities. Definitely read the findings reported by Eric Enge of Stone Temple Consulting and decide whether your site is affected.
A final word of warning. There are some sites that republish content without permission, commonly known as scraper sites. More than likely the scraping is not being done in your best interest, and there are legitimate concerns that your SERP results can be negatively impacted by these sites since many are designed to game search. Be watchful for such copyright infringement, and take steps to protect your search engine rankings.
What's your view on republishing content? Do you think it's safe and beneficial for driving more traffic to your landing pages, and do you protect sites and cloned content with noindex or canonical metatags?