We are a team of marketing consultants in London at Artios and we combine the use of artificial intelligence and AuthorityLabs data to deliver SEO recommendations that are statistically proven to work.
Our data scientist, Andreas Voniatis, explains how Artios uses the AuthorityLabs API to identify search engine competitors, how to approach SEO data science studies, and why reliable data is important for making search engine rankings more predictable and quantifiable.
Assuming the keyword research has been performed adequately, so that the keywords under investigation yield useful traffic, ranking data can be used to identify competitors as follows:
- Take the full list of keywords
- Check the top 100 rankings for each keyword on a daily basis
- Repeat for a minimum of 10 days, preferably 30 days if time allows
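The collection loop above can be sketched in Python. The actual AuthorityLabs API call is abstracted into an injected `fetch_top100` function here, since the exact endpoint and parameters should be taken from their API documentation; the field names in the records are illustrative.

```python
from urllib.parse import urlparse

def collect_rankings(keywords, days, fetch_top100):
    """Collect daily top-100 rankings for each keyword.

    fetch_top100(keyword) -> ordered list of the top 100 ranking URLs.
    In practice this callable would wrap the AuthorityLabs API; it is
    injected here so the collection logic stays testable.
    """
    records = []
    for day in range(1, days + 1):          # e.g. 10-30 daily snapshots
        for kw in keywords:
            for rank, url in enumerate(fetch_top100(kw), start=1):
                records.append({
                    "day": day,
                    "keyword": kw,
                    "rank": rank,
                    "domain": urlparse(url).netloc,  # site-level grouping
                })
    return records
```

Grouping the records by domain then gives you the per-competitor ranking history needed for the averages and fluctuation measures discussed below.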
The above will allow you to get a sense of:
- Which sites and competitors are ranking higher and lower than you on average
- How consistent your (or your client’s) competitors’ rankings are for your targeted search phrases
Once the data is gathered, you should be able to create a plot like the following:
The graph above shows the average ranking trend in Google for agencies in the creative design market in the UK collected using AuthorityLabs over 10 days.
While there is some fluctuation, we can see that the rankings are fairly consistent, with little variation other than designbybridge.com, which seems to be getting worse.
The chart above shows the standard deviation, which in practice measures ranking fluctuation. It’s quite interesting that designbybridge.com has the second-highest standard deviation, which makes sense given that it partly reflects its deteriorating form.
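Both statistics can be computed directly from the collected ranking records. This is a minimal sketch using only the standard library; the record field names (`domain`, `rank`) are illustrative.

```python
from collections import defaultdict
from statistics import mean, pstdev

def rank_summary(records):
    """Average rank and rank volatility (standard deviation) per domain.

    records: list of dicts with "domain" and "rank" keys, one per
    (day, keyword) observation collected over the tracking period.
    """
    by_domain = defaultdict(list)
    for r in records:
        by_domain[r["domain"]].append(r["rank"])
    summary = {
        d: {"avg_rank": mean(ranks), "rank_std": pstdev(ranks)}
        for d, ranks in by_domain.items()
    }
    # Sort best (lowest) average rank first.
    return dict(sorted(summary.items(), key=lambda kv: kv[1]["avg_rank"]))
```

A high `rank_std` flags a competitor whose positions are unstable, which is exactly the designbybridge.com pattern described above.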
Competitor Research Tips
The biggest mistake that any SEO can make is to only look at what their leading competitors are doing. Why? Imagine the leading sites in Google UK all have landing copy averaging 500 words; looking only at the leaders, you would suppose that your 200-word landing page needs to be beefed up to 500 words, right?
Wrong! By checking the rankings (and the ranking factors) of the sites consistently ranking lower than you, you’d be able to test whether landing page word count is indeed a ranking factor, and to what degree.
We’ve covered this in excruciating detail here; the key takeaway is that you should check the ranking factors of the competitors consistently ranking below you, not just the leading group.
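One simple way to run such a test is to correlate a candidate factor (here, word count) against position across the whole top 100, lower-ranked sites included. This sketch uses a plain Pearson correlation as an illustrative first pass; a full study would use more robust methods.

```python
from statistics import mean

def pearson(xs, ys):
    """Pearson correlation coefficient between two equal-length series."""
    mx, my = mean(xs), mean(ys)
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

def word_count_signal(pages):
    """pages: list of (rank, word_count) pairs covering the WHOLE
    top 100, not just the leaders. A strong negative correlation
    (lower rank number = better position, more words) would support
    word count as a factor; a value near zero would not.
    """
    ranks = [r for r, _ in pages]
    words = [w for _, w in pages]
    return pearson(ranks, words)
```

Run against only the top few sites, the 500-word pattern looks decisive; run against the full SERP, the correlation often collapses, which is the point of the paragraph above.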
TIP: To start you need to evaluate, for each competitor:
- SEO strategies – full site, on page and off page
- Backlink analysis
- Content strategies
- Keyword utilization
There are many variables that can impact rankings. Try to determine which factors work for others and which you may have overlooked.
Google isn’t the only engine worth modelling: there is also Amazon A9 and the highly anticipated Facebook search engine. Amazon A9 has developed a rather lucrative ecosystem for marketers selling other manufacturers’ products (known as private labelling), and there are a lot of theories as to what makes the A9 algorithm tick.
AuthorityLabs doesn’t advertise the fact that their technology can be used for finding the ranking URLs of products in those engines, and there are tools specifically geared to tracking Amazon A9 product search results, such as Jungle Scout. Still, we use the AuthorityLabs API for the simple fact that you can get the Amazon URL of a product. The Amazon A9 tool market, from what we have seen, is quite embryonic: we haven’t seen ranking API tools, which is indicative of the relatively young Amazon reseller community.
We haven’t yet attempted Facebook, but to get the ranking URL for Amazon A9, all you need is the product’s ASIN (Amazon Standard Identification Number), which is unique to every product sold on Amazon. All you need to do is:
- Use Jungle Scout to get the top 100 ASINs for your target search phrase
- Use the AuthorityLabs API to retrieve the Google ranking URLs
The above will give you the top 100 ranking URLs to start performing ranking factor tests and predictive modelling.
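The ASIN-to-URL step can be sketched as follows. The `/dp/` path is Amazon’s standard product-detail route; the Jungle Scout export format and field names are assumptions for illustration.

```python
def amazon_url(asin, marketplace="www.amazon.com"):
    """Canonical product URL for an ASIN via Amazon's /dp/ path."""
    return f"https://{marketplace}/dp/{asin}"

def build_ranking_table(asins):
    """Pair each ASIN (e.g. the top 100 exported from Jungle Scout for
    a target search phrase, in rank order) with its rank and product
    URL, ready to be matched against the ranking URLs returned by the
    AuthorityLabs API.
    """
    return [
        {"rank": i, "asin": a, "url": amazon_url(a)}
        for i, a in enumerate(asins, start=1)
    ]
```

The resulting table is the input for the ranking factor tests and predictive modelling described below.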
SEO Studies Need Reliable Ranking Data
For any search engine research study to deliver reliable SEO insights, the results depend heavily on the quality of the ranking data. For modelling search engines, including Google, we look for:
- Accuracy – the ranking reported is most likely to be the site ranking that search engine users will see
- Scale – the data source must be able to cope with numerous requests on demand i.e. multiple sites simultaneously, multiple keywords
- API – the data must be convenient to collect and feed into our machine learning algorithms
While traffic and other metrics are important for evaluating the success of any SEO campaign, ranking data remains relevant because it is robust to:
- Seasonality – rankings are unlikely to be affected by the hour of the day, the day of the week, or the month of the year; search engines are unlikely to deploy a July algorithm!
- Economic cycles – for any given country, in any given year, the algorithm is highly unlikely to change because of economic circumstances.
The limit of your predictive modelling for SEO is the limit of your knowledge of the ranking factor tests you want to run in order to identify what works for Google, Amazon A9, and other search engines. As long as you have reliable data sources like AuthorityLabs, you can use artificial intelligence to help you identify patterns that lead to higher (and lower) rankings.