Google is all about a high-quality user experience. When users search for a query, Google wants to deliver varied, useful information that is relevant to what they are looking for and comes from websites that can be trusted. This is the main reason it won't allow webmasters to use duplicate content to fool their way to higher rankings in the search results. As the world's most popular search engine, Google sets the rules SEOs need to follow if they want better visibility for their websites, and the only way up is through original, interesting content that gives readers all the information they need, so they will gladly share it with friends and family.
The archive and category pages on your website are considered duplicate content by Google. For example, your website's main content is your articles and blog posts, but those pages are also archived, and on top of that, the category pages contain roughly 200-word snippets from the same articles so users can quickly see what each article is about.
This means the same content repeats in more than one place on your website, and there is even a possibility that someone has stolen or scraped your articles, so they may appear at other URLs as well.
If you hire freelance writers, or otherwise outsource the writing of the content you use on your website, you should also be certain you can trust them to deliver articles that are completely unique, not copied or slightly modified content stolen from other websites. PlagTracker is a plagiarism-checking tool that can help you scan their writing and get links to the original sources if plagiarism is detected.

Google will also penalize webmasters who publish other authors' original articles, even with the authors' permission. This type of duplicate content may be legal, but it is bad for your website's search engine optimization.
So how can you protect yourself from being penalized by Google for duplicate content?
Use Meta Robots or Robots.txt
If you already have duplicate content on your website, tell the search engines which pages to index and which to de-index. This is done with a meta robots tag for individual pages, and with robots.txt for entire directories or the whole site. Knowing how to use them is especially important if you are building your website by hand in HTML or PHP.
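As a minimal sketch, a noindex meta robots tag goes in the head of each page you want removed from the index, while robots.txt sits at the site root; the paths below are hypothetical examples for a typical blog, so adjust them to your own site structure:

```html
<!-- Meta robots: place inside the <head> of an individual page,
     e.g. an archive or category page you want de-indexed.
     "follow" still lets crawlers follow the links on the page. -->
<meta name="robots" content="noindex, follow">
```

```
# robots.txt: served from the site root (e.g. https://example.com/robots.txt).
# Disallow blocks crawling of matching paths for the named user agents;
# the paths below are example archive and category sections.
User-agent: *
Disallow: /archives/
Disallow: /category/
```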
Use WordPress Plugins
If you are using WordPress as the platform for your site, there are plenty of plugins that can solve duplicate content issues. Look in particular for the ones that let you de-index your archive and category pages and automatically mark the canonical version of every page, as shown below.
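For reference, this is what a canonical tag looks like in a page's head; plugins of this kind generate it for you, and the URL here is only a placeholder:

```html
<!-- rel=canonical: tells search engines which URL is the original version
     of the content, so archive and category snippets don't compete with it.
     The href below is a hypothetical example. -->
<link rel="canonical" href="https://example.com/blog/original-article/">
```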
Scan Your Content with PlagSpotter
To make sure no one steals your original content, use the PlagSpotter online tool to check for and monitor duplicates across the Internet. The software scans the URLs you enter and reports whether copies of those pages were detected elsewhere. You can even display a 'Protected by PlagSpotter' badge to show black hat SEOs that you are keeping an eye on them.
About the Guest Author: Austin Rinehart is a senior writer at PlagSpotter. He is married and has two lovely adult daughters, and he looks for opportunities to publish on various topics such as internet trends, scientific research, and life-improvement strategies.