Earlier today, I was reading about how Stack Overflow was being outranked by sites that were scraping their content. This reminded me of a pretty cool use for the automated SERP tracking provided by AuthorityLabs.
Let’s say you’re in a similar situation to Stack Overflow. People may be scraping and reusing your content legitimately, so you can’t go filing DMCA complaints to have them removed from the SERPs. Maybe you have an ecommerce site with a data feed of product descriptions that is being given to affiliates (those descriptions should really differ from the ones on your own site, but that’s a discussion for another time). If you’re in this situation, you need to know both where you rank and where the sites reusing your content rank. It is important to make sure those sites do not outrank you for your own content.
Here are a few tips for tracking these sites in AuthorityLabs:
- Add the domain for each site your content is showing on. If there are a lot, target at least the ones that pose the biggest threat.
- Select the domains from the dashboard and group them together.
- Sync the keywords for all of the added domains with your main domain that contains the content you are monitoring.
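Once the domains and keywords are synced, the check you care about is simple: for each keyword, is any tracked domain sitting above yours? Here is a minimal sketch of that comparison in Python. The ranking data would come from your rank tracker (for example, a CSV export); the keywords, domain names, and positions below are made up for illustration.

```python
# Hypothetical ranking data, e.g. exported from a rank tracker as
# (keyword, domain, position). All values here are placeholders.
rankings = [
    ("blue widget reviews", "yoursite.com", 3),
    ("blue widget reviews", "scraper.example", 1),
    ("widget buying guide", "yoursite.com", 2),
    ("widget buying guide", "scraper.example", 8),
]

MY_DOMAIN = "yoursite.com"

def outranked_keywords(rankings, my_domain):
    """Return keywords where another tracked domain ranks above ours."""
    # Best (lowest) position seen per (keyword, domain) pair.
    best = {}
    for keyword, domain, position in rankings:
        key = (keyword, domain)
        best[key] = min(position, best.get(key, position))

    flagged = set()
    for (keyword, domain), position in best.items():
        mine = best.get((keyword, my_domain))
        # A lower position number means a higher ranking.
        if domain != my_domain and mine is not None and position < mine:
            flagged.add(keyword)
    return sorted(flagged)

print(outranked_keywords(rankings, MY_DOMAIN))
# → ['blue widget reviews']
```

Running something like this against each export gives you a short list of keywords to investigate, rather than eyeballing every row in the dashboard.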
Once you’re tracking the sites using your content on the keywords that matter to you, you need to understand how to respond if one of them outranks you. Whining to Google about being outranked isn’t likely to get you anywhere. There can be dozens of reasons for it, none of which have anything to do with a faulty algorithm. Some of these reasons include the following:
- Poor information architecture
- A lack of deep links to pages containing the content
- A lack of links to your site in general, which may be preventing your content from being indexed
- A lack of unique value being provided by your site versus the site using your content
- Missing basic on-page factors, such as the keywords appearing in the title and headings.
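The last point is the easiest one to check yourself. As a rough sketch, the snippet below uses Python's standard-library `html.parser` to report whether a keyword appears in a page's title and headings; the sample markup and keyword are invented for illustration, and a real audit would fetch and check your actual pages.

```python
from html.parser import HTMLParser

class OnPageChecker(HTMLParser):
    """Collect the text found inside <title> and heading tags."""

    TRACKED = ("title", "h1", "h2", "h3")

    def __init__(self):
        super().__init__()
        self._current = None
        self.texts = {}  # tag name -> concatenated text content

    def handle_starttag(self, tag, attrs):
        if tag in self.TRACKED:
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current:
            self.texts[self._current] = self.texts.get(self._current, "") + data

def keyword_coverage(html, keyword):
    """Report which tracked tags contain the keyword (case-insensitive)."""
    parser = OnPageChecker()
    parser.feed(html)
    kw = keyword.lower()
    return {tag: kw in text.lower() for tag, text in parser.texts.items()}

# Placeholder page markup for demonstration.
sample = (
    "<html><head><title>Widget Guide</title></head>"
    "<body><h1>All About Widgets</h1></body></html>"
)
print(keyword_coverage(sample, "widget"))
# → {'title': True, 'h1': True}
```

If the keyword is missing from the title or main heading of the page being copied, fixing that is usually a faster win than chasing the scraper.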