How to Format Your ‘Now Provided’ Report – Part 1 [VIDEO]


Many of you have tried out our new Now Provided report. If you haven't, the Now Provided report pulls together AuthorityLabs ranking data, Google Analytics data, and a few other data points to show you how many pages are receiving organic traffic and a list of the keywords that those pages rank for.

Now for the million-dollar question: after setting up the Now Provided report, how do I analyze this awesome data? I'm glad you asked! I made a dashboard that looks at the top landing pages and, with four different types of conditional formatting, depicts the data in an easily readable way.

If you need to format your data quickly, I also made a template with the formatting already in place.

Step 1 – Export

Go into your AuthorityLabs account and export the Now Provided report with the icon in the top right corner.


Your CSV should look like this. I turned it into a raw data tab and saved it off as an Excel workbook.


Step 2 – Count Your Results

I counted how many keyword results each landing page had, then grabbed the top five to analyze. Make sure to copy each of these top 5 tables into new sheets; we'll need them later.
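If you'd rather count with a formula than by hand, a simple COUNTIF does the trick. This is just a sketch; it assumes the landing page URL for each keyword row sits in column A of your raw data tab (called 'Raw Data' here) and that cell A2 of your summary sheet holds the landing page you're counting:

=COUNTIF('Raw Data'!A:A, A2)

Drag that down next to your list of landing pages and you'll have the keyword count for each one.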


I grabbed the 100 keyword results for the home page and plopped them into their own sheet.

Step 3 – No Baggage

I got rid of the metrics from landing pages like users, sessions, and bounce rate. We'll be analyzing those metrics in the next tutorial. For now, the metrics I'm focusing on are specific to the keywords. To do this quickly, click on Column B and drag the selection over to Column E, then hit the Delete key or right-click and select Delete.


 Step 4 – Format as a Table

I selected all the data, but all you have to do is click inside the data set and then go up to Home > Format as Table. I also turned off the gridlines via View > Gridlines. I know, shocking. :)


Step 5 – Start the Pretty Parade

Under Home > Conditional Formatting > Icon Sets > Stars. I also put them in Reverse Order for the position rankings.


Step 6 – Two Cells Run into a Bar

Select all of the columns that you want data bars for and then go into Home > Conditional Formatting > Data Bars. To change the color of the Data Bars go to More Rules and change the color and even add a gradient.


Step 7 – 80 Percent of Pretty Data is Formatting

I changed the formatting of columns H and I from General to Percentage.


Go ahead and change the CPC value to a monetary one.


Step 8 – Can’t Take the Heat

Another great conditional formatting option is the Color Scales. 


I changed the Lowest Value to green and the Highest Value to red. To do this, go to Manage Rules > Color Scales > Minimum: Green | Maximum: Red.


Step 9 – Raising the Bar

The last formatting option was for the Search Volume metric. It’s under Icon Sets too and it’s one of my favorites for an overview snapshot.


 Step 10 – Copying the Formatting Over

Now that we've finished the formatting for the Homepage table, let's share the love. To do this, navigate to one of the other top 5 landing pages. I put mine into different sheets to format them more easily. Then format the data set as a table.


Step 11 – Spread the Colors

Next, grab the first line of the formatted table and paste it onto the first line of your unformatted data series. When you paste the values, a small drop-down menu will appear with Paste Special options. Select Paste > Formatting Only.


 Step 12 – Complete the Rainbow, Taste the Rainbow

After that, select the small green + sign located at the bottom right of the first row and drag down to copy the row. Now, select the same option to Fill Formatting Only.


Step 13 – Spread the Cheer

Make sure to copy this formatting process over to the other top 5 landing pages.


Step 14 – Consolidate Your Winnings

After collecting 5 sheets of formatted tables, I put them all into one sheet and labeled it Dashboard. That way you can scroll easily and see how each of the keywords would benefit your site in specific ways.


Step 15 – Voila!

Grab a cocktail, sit back, and enjoy the rainbow of pretty data you just created. :)


 

 

Google Analytics Audit – Deployment and Settings

It has always surprised me that the most freely available and useful tool that Google provides to website owners, Google Analytics, is so often incorrectly deployed on my clients' sites. Here's a quick DIY Analytics Audit that you can run through for new client sites or your own small business website to make sure you're getting the most out of this free resource from the Big G.

Correct Code Deployment

Go to your site, right-click, and view the source. (If you're using Safari, go to Preferences and enable the Develop menu to get this option.) Do a search for "UA" to find the segment of code where your Google Analytics has been deployed; there's an example snippet after the checklist below.

  • Are there multiple copies of the same UA code? Then there's a possibility of recording duplicate traffic.
  • Are there multiple different UA codes? Then you're sending traffic to multiple profiles. This may have happened if a 3rd party set up Analytics before.
  • Are you in control of the profile? Again, if you had a 3rd party set this up, then a UA code with a higher trailing number indicates that a number of other profiles exist in that account, and it may belong under another agency. You should check the User section of your Analytics and make sure you're the primary administrator.
  • Does your entire site have Google Analytics code? You can check for GA code across your site with this tool, a custom setup in Screaming Frog, or in Raven Tools Site Auditor.
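For reference, here's roughly what you're searching for. This is an abbreviated sketch of a standard Universal Analytics snippet (the loader function is trimmed down, and UA-XXXXX-Y is a placeholder for your own property ID):

<script>
  (function(i,s,o,g,r,a,m){ /* Google's analytics.js loader goes here */ })
  (window,document,'script','//www.google-analytics.com/analytics.js','ga');

  ga('create', 'UA-XXXXX-Y', 'auto');  // your property ID lives here
  ga('send', 'pageview');
</script>

Two copies of this block with the same UA number usually means duplicate pageviews; two different UA numbers means traffic is flowing to two different properties.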

Legacy vs. Universal

Next, look at the code and see whether it's using analytics.js or ga.js (there's a quick side-by-side after Kane's quote below). If the code is using ga.js then it's an older implementation and you should consider updating to Universal Analytics. You can get cross-subdomain tracking abilities as well as other features. However, it's not something you should do without a gameplan. Kane Jamison of Content Harmony explains:

“So, this guide is a pretty clear rundown of the caveats I have for UA migration – if you have a vanilla account you’ll be ok, but if you have tons of custom tracking, you need a gameplan. For me the clues would be lots of event tracking, or any ecommerce setup, or other custom elements other than the basics like URL-matching goals.

Once you see any of those custom elements, there needs to be a migration gameplan in place. It’s still worth doing and you’ll have to do the UA upgrade eventually anyways, but you can’t just press the button and expect it to function the same.
On a sidenote, if you’ve planned everything well, this is also a good time to switch to Tag Manager. Rather than rebuilding tons of custom event firing tags and hard coding them into the page templates, you can set up id tags and other elements that can get tracked by GTM. It adds a layer of complexity for sure, but if you have an experienced analytics person doing the migration then they’ll probably want to do this as well for the convenience it offers.”
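If you're not sure which library you're on, the older asynchronous snippet looks noticeably different from the Universal one. The giveaways are the _gaq queue and the ga.js script reference (again, the UA number is a placeholder):

var _gaq = _gaq || [];
_gaq.push(['_setAccount', 'UA-XXXXX-X']);
_gaq.push(['_trackPageview']);
// ...followed by a loader that pulls in google-analytics.com/ga.js

If you see _gaq.push and ga.js, that's the legacy implementation; ga('create', ...) and analytics.js mean you're already on Universal.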

Demographics Data – Who Are These Visitors?


Ever wondered who wanted to visit your Geocities website shrine for Sailor Moon? Well, Google gives you some access to those demographic details.

  • Add a simple line of code to your deployment to get access to this data. (For Universal Analytics the code is: ga('require', 'displayfeatures');)
  • If you're using Yoast's WordPress Analytics, then click on the Advanced tab and check the Demographics box.

Enhanced Link Attribution: Who Clicked Which Link

If you have a link to the same destination multiple times on your page, then you might be confused when using In-Page Analytics in Google Analytics to understand your visitors' behavior, because it will show the same % of traffic for each instance. You can fix this with Enhanced Link Attribution.

  • ga('require', 'linkid', 'linkid.js');  <- Add that to your analytics code (the snippet below shows where both of these 'require' lines go)
  • If you're using Yoast, then click the Advanced tab and add it to the Custom code section. (Dear Yoast, please add this as a checkbox feature. Also, if anyone knows of a problem with this method, please let me know so I can update this section.)
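For what it's worth, here's a sketch of where both plugin lines sit in a Universal Analytics deployment: after the create call and before the pageview is sent (UA-XXXXX-Y is a placeholder):

ga('create', 'UA-XXXXX-Y', 'auto');
ga('require', 'displayfeatures');       // demographics & interests data
ga('require', 'linkid', 'linkid.js');   // enhanced link attribution
ga('send', 'pageview');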

Filter Out The Crap Traffic


Filter out the useless and let through the tasty goodness.

If your client is habitually visiting their site and clicking every single page, that will quickly skew your traffic stats if you do not have filters in place.

Bot Filtering in Google Analytics
  • Check the box to exclude hits from spiders and bots
  • Get IPs from the client (tell them to Google "What is my IP")
  • Add your own IP and that of anyone else you work with professionally who might visit the site.

Site Search – Who Looked For What?


Hey, wouldn’t it be useful to know if people were looking for that blog post you wrote about DIY Site Audits? Well, you’re not going to know unless you set up this awesome feature. You can then use your own visitors to determine what content would be worthwhile for you to create.

Integration is Great…ion


Hard to make a mix tape if you only have one album. Get all the data you can!

You can connect your Google Analytics to AdWords, Webmaster Tools and some 3rd party tools. I STRONGLY recommend you take advantage. You get extra layers of data, like treemaps and search queries, which you can connect to your conversion data.

Presenting Your Audit Findings

Often when you're doing an audit you can just flip these switches and toggles and get these things fixed right away. Other times there's a client to convince, an administrator to convince, or a developer to task. Here's a handy template for presenting your analytics audit findings.

I’ll be creating a follow-up post on Google analytics to review Events, Goals and Conversions if you were wondering why I didn’t include them here. Let me know if there are any other “must check” sections of Analytics in the comments!

Site Audit: Indexing Tips & Tricks with Screaming Frog [VIDEO]

Knowing how to do a site audit is an extremely useful skill for any marketer. But if you're like me, intimidation can creep in big time while starting these audits. I'm here to help with that! I pulled several key points from Annielytics' Site Audit Checklist and from Jeremy Rivera's Simple DIY Site Audit post. Thank you for the inspiration, guys!

I also use Screaming Frog to crawl the sites and sitemaps. If you haven’t downloaded Screaming Frog… do ett. If you don’t know how to use it, there’s a tutorial for that.

Now, let’s dive in! You can get the full tutorial here:

If you don’t have 10 mins to watch me talk, no problem. Here’s the step by step walk-through:

Audit Magic

Like I said above, this site audit checklist is a dreamboat. Annie Cushing gave it to the marketing industry as a gift and there are about 10 viewers on it at any given time. Craziness.


Yo’ Checklist

I grabbed a few checklist items to walk through and take the intimidation out of them. I compared a few different sites: Modcloth, Adored Vintage, and (#theoneandonly) Screaming Frog. And I color-coded them … duh.

  • Is there a robots.txt?
  • Is there a sitemap?
  • Are there errors in their robots.txt file?
  • Are their sitemaps clean?

Want to follow along? Download this audit checklist.


Now that we've got that all set up, let's get down to business.

Step 1 – Check for Robots.txt

Go to the domain's homepage and add /robots.txt to the end of the URL. Not sure what a robots.txt file is?

“A robots.txt file is a text file that stops web crawler software, such as Googlebot, from crawling certain pages of your site. The file is essentially a list of commands, such as Allow and Disallow, that tell web crawlers which URLs they can or cannot retrieve. So, if a URL is disallowed in your robots.txt, that URL and its contents won’t appear in Google Search results.” (Google Support)

The first site I looked at was Adored Vintage. They had a robots.txt, but most of it was copied and pasted straight from a generator ("If you use this as a sample file then you should remove all comments …"). This causes problems later.

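A clean robots.txt doesn't need any of those leftover generator comments. A minimal, hypothetical example looks something like this (the disallowed paths and sitemap URL are made up):

User-agent: *
Disallow: /checkout/
Disallow: /cart/

Sitemap: http://www.example.com/sitemap.xml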

Step 2 – Check for Errors in Your Robots.txt

There are a bunch of guidelines for how to set up the robots.txt file correctly. Now to learn all the rules and implement them… jk. Luckily, there's a tool for that. I ran this analysis for Modcloth and they had a few mistakes, like using an asterisk, that caused several errors in their robots.txt.

mc.errors

Now that you've cleaned up your robots.txt, let's move on to your sitemap.

Step 4 – Find your Sitemap

Unlike the robots.txt file, you can't just type in www.yourdomain.com/sitemap.xml. Sometimes it's located under different folders. So a quick way to find the sitemap is to search in Google: site:domain.com inurl:sitemap.xml. Here is the one I downloaded from Modcloth.


Your sitemap may (and should) be listed in the robots.txt. Modcloth does a wonderful job with this. They list specific links to their sitemaps, and the first one is the main sitemap.


Do you need a sitemap? Check out a sitemap generator like this one.

Step 5 – Sitemaps in ScreamingFrog

Many times companies will put up a sitemap and not update it for months, or even years. Check your sitemap to make sure you're not linking to any 404 (broken) pages.

To do this, save off your sitemap.xml file.

Then, change Mode > List. This will change Screaming Frog from an online scraper to a file scraper. Cool, right?


Select File > File type > .xml


Modcloth has a 404 page in their main sitemap. That's telling Google to check out a broken page, which is definitely bad for business.


Those are some quick tips to make sure your robots.txt and sitemaps are in tip-top shape. I went over a few other tips in my tutorial video, all of which can be found in Annielytics' Site Audit Checklist.

Thanks for tuning in! Now go forth and data!

Michelle Stinson Ross on Twitter Cards: The Basics Everyone Should Know

For some folks Twitter Cards are as confusing as can be, so we thought we would reach out to Michelle Stinson Ross, Social Marketing Manager at Offers.com, and get some clarity. Check out the Q&A below.

Also, Michelle will be speaking at SMX Social in the "Up Close With Twitter Cards" session, November 20th at 2:00pm. If you are attending, be sure to check out her session and get some great ideas from her.

What are Twitter Cards, how are they used and why should businesses care?

Basically, the short answer is, Twitter cards are a multimedia, value-added tweet. They can be used to build email lists, drive web traffic, or facilitate app downloads. Brands and businesses need to care because the cards aid with the heavy lifting of click-through.

Do businesses need a professional developer to set up Twitter Cards?

There are two different routes you can take with Twitter cards. One of them does require a little help from your webmaster to add the proper mark-ups to your web site. Before you get too deep into this option be sure to read up on the information Twitter has for developers. If you have a site that is very photo and multi-media rich, this is an option you need to seriously consider.

However, there is a DIY option within the Twitter Ads environment that doesn’t require any coding to use. These cards are great if you want to make your tweets stand out with a visual and not lose clicks to people simply expanding a photo that you shared with a tweet.

Can you explain the different kinds of cards and tell us which are your favorites?

Let’s focus on the DIY Twitter cards. Even though you have to go into the Twitter Ads platform to set these up, you don’t have to run an ad to use them.

There are 4 DIY card types currently available. I say currently, because when I first started using them, there were only 2 card types. I fully expect Twitter to continue to roll out new tools to encourage brands to use the Ad platform.  The current card types are a Lead Generation card, a Website card, and two versions of the App card, a basic App card and an Image App card.

The cards serve the functions that their names imply.

1). The Lead Gen card is set up to help you collect email sign-ups via a user's Twitter account. It's an easy one-click sign-up for the user and is very mobile friendly.

2). The Website card allows you to add an image, a call-to-action button, and a fully tagged URL to your tweets. If a user clicks or taps anywhere on that tweet they will be directed to the URL embedded in the card. HELLO increased CTR.

3). The two app cards facilitate the download of apps directly to users' phones.

Right now I’m all about the Website cards. My main directive is to drive traffic via social to our website and I LOVE this card for mixing up our tweet types and the improved CTR.

I also like the capabilities of the Lead Gen card. I personally haven’t used it much because when they first launched this card, it had to submit the emails collected directly to the CRM. Our proprietary CRM didn’t interface well with the system.

In reviewing this card for this interview, the new about page mentions that the list can be stored to a brand’s Twitter Ad account and downloaded at will. THIS is a far better option for my system, and I will circle back around very soon to start using this option.

Can you give us your top 5 Twitter Card Tips?

Most of these tips are specific to the Website cards, but here it goes:

1). ALWAYS use tagged URLs for your Twitter cards so that the traffic they drive can be properly attributed to your efforts (see the example after this list).

2). Get very familiar with the image size requirements for the cards. The website card must be at least 800 pixels wide and 400 high. So you will have to create compelling landscape images for these cards.

3). Depending on the way your campaigns are set up you can edit and keep cards up to date with fresh images, copy, and links without having to create new cards over and over.

4). I highly recommend using a card when you’re paying to promote a tweet. The website cards in particular only charge when the user clicks through. If you promoted a standard photo tweet, you could be charged for clicking a link or clicking to expand the image.

5). Test, test, and test again. As I mentioned, when it first rolled out the Lead Gen card was not a good fit for us. The updates make it worth giving this one a shot now.
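As an example of tip #1, a tagged URL for a Website card might look like this; the parameter values here are made up, so use whatever naming convention your analytics setup already follows:

http://www.example.com/landing-page/?utm_source=twitter&utm_medium=social&utm_campaign=website-card-november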

How can one measure the effectiveness of Twitter Card use?

All of the basic engagement metrics can be found in Twitter analytics both per tweet and per card. You can track CTR, favorites, RTs, and overall reach. As I said before, it’s also important to tag the URLs you’re sharing via these cards so that you can track what happens when the user clicks through and Twitter stops measuring.

Truly effective use of the cards will be determined by the goals you set before you launched into using them.

What are some creative uses of Twitter Cards that you have seen?

The really great creative cards are the more complicated developer level cards that require a little code on page to pull off. These cards pull in text and beautiful full-sized images from a blog post or serve as a media player for video and audio elements embedded on linked pages.

The advantage to these cards is that no matter who links to that page, Twitter will pull in the multimedia elements. The engagement and amplification factors for these cards are well worth the effort to mark up your best web pages.
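To give a sense of what that on-page code looks like, here is a rough sketch of summary card markup; the values are placeholders, and Twitter's developer pages list the full set of tags for each card type:

<meta name="twitter:card" content="summary_large_image">
<meta name="twitter:site" content="@yourhandle">
<meta name="twitter:title" content="Your Post Title">
<meta name="twitter:description" content="A one-sentence summary of the post.">
<meta name="twitter:image" content="http://www.example.com/images/post-image.jpg">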

What do you recommend people don’t do with Twitter Cards?

Right now, my best tip is don't overuse them. Just because you don't have to pay to use this option doesn't mean every tweet should be a card. Use them as an element of your overall Twitter content mix.

Do you have some resources that you recommend for those wanting to learn about Twitter Cards? 

Twitter has provided top-notch resources in both the Twitter for Business pages and the Developers pages. If your business has been assigned a Twitter Ads rep, they are excellent about walking you through EVERYTHING you need to know about any of the DIY cards. They may not update you on changes as they roll out, but will provide any answers to questions that will arise as you notice something changed.

Most of the articles I've read in industry publications outside of Twitter's own content tend to just provide an overview and don't delve into the depth of how to use them.

A Big Thanks to Michelle For Her Time

Michelle is a very busy lady and we really appreciate her taking the time to answer questions for us. If you have any questions or want to reach out to Michelle you can find her on Twitter @SocialMicheller.

A Simple DIY Website Audit

Recently, a friend of my wife's was wondering if I would take a look at her small business website and give her some insights about what she could focus on to improve. The catch is that she didn't have the budget to hire me or any other SEO professional. I also have run into several IT managers and marketing people in the past couple of months who know their company site is not working right and would love to be able to fix it, but don't know where to start to gather the information needed to get the bureaucracy to approve those fixes.

That's why I've decided to make a DIY site audit template for similar small businesses, for in-house IT managers who've been handed the "role of SEO," and for others who need to know where to start when doing a site audit.


Breaking down a site into all of its pieces can help you understand what's broken and how it should be fixed.

First of all, you should know that there is a lot of information you need to know about technical SEO and the implementation of best practices. There are real professionals in the SEO field who specialize in this specific part of the optimization process.

There are dozens of insights and opportunities they would spot that you won't if you follow this DIY process. So hire someone if possible.

That being said, if you’re in over your head, have no budget and have a website with problems that even you can see then I think I can help you.

Helpful Site Audit Templates

Chances are good that if you're doing this kind of audit then you're going to want to have it in a more permanent form than an email. You may even need to present it to the "powers that be" to get some of the potential issues resolved that will be revealed in an audit. That's why I've created a template for you in three different mediums (Google Documents, Spreadsheets, and Slides) so you can update it with your own site audit.

What You’ll Need To Do This DIY Audit


Here are some tools to help you take a closer look.

  • Access to Google Analytics
  • Access to Google Webmaster Tools
  • Copyscape
  • A lot of patience and a web browser
  • A victory dance for when you're finished, plus a shot of whiskey, a fine hard apple cider, or a glass of wine.
  • A good site crawler.

Before You Run, You Gotta Crawl

When you start your audit you're going to need some data that will be easier to access with additional programs instead of just a browser. The first is a "crawl" program that visits the site, accesses all of the pages, and behaves like a search engine spider to uncover flaws in your site design or architecture.

I personally love Beam Us Up's free SEO crawler, mostly because it will give you an Excel export of the "crawl" data with tabs showing filtered results for the most common issues, and it has no crawl limit on its free version (just be sure to update to the most recent Java). You can use other crawlers like these as well, though they have other limits or require licenses.

Let’s Get This Site Audit Started!


High Five! Let’s do this audit!

Domain Expiration


Seriously. If you let your domain expire underneath you, that’s a serious fail.

Letting your domain expire is the most preventable of SEO catastrophes. There’s a whole industry based on ransoming expired domains back to owners who allowed them to lapse. Unfortunately, I let my own domain footinmouthdisease.net lapse and now there’s a medical article site spamming up the interwebs.

Domain Name Redirection

The very first thing I always look at is whether or not they have properly set up their domain name.

  • Does the site show up at BOTH www and non-www without redirecting to one or the other?
  • Does the site resolve at /site.aspx, /default.aspx or /home/ instead of the bare root domain?
  • I almost always prefer to use non-www for my domains, partly because the internet is larger than the "world wide web" but also because I hate having to say "double-you, double-you, double-you." (If you need to force one version, see the .htaccess sketch after this list.)
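If your server runs Apache, forcing one version is usually a small .htaccess addition. This is only a sketch, and it assumes the non-www version is the one you want (swap in your own domain and test before relying on it):

# Redirect www to non-www with a 301
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.example\.com$ [NC]
RewriteRule ^(.*)$ http://example.com/$1 [R=301,L]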

Server setup – SSL

If you have heard that an SSL certificate and sending your traffic through HTTPS can possibly impact ranking, then you should know that it's easy to make errors in this process. Here's something to check, and it's related to redirects: if you've got a certificate for the root domain but not a "wildcard" certificate, then your visitors may see some scary warnings when they click a link that goes to the www version of your site (even if there's a redirect).

Robots.txt

The robots.txt file is located in the primary directory of your website. This file tells the search engines what pages they can and cannot crawl on your website. It also helps them know what sections of the site to spend their time in. Here’s what to check for:

  • Does it exist? If it doesn’t then make one. (Yoast SEO plugin for WordPress includes it.)
  • Does it block URLs that exist in the sitemap? Then either remove them from the sitemap or the robots file.
  • Are you disallowing crawling of your entire site? Yes, this happens WAY too often (see the example below).
  • Do you have a ridiculous amount of crawl rules?
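The "entire site" mistake is worth spelling out, because the difference between blocking everything and blocking nothing is a single character:

# This blocks your ENTIRE site from being crawled:
User-agent: *
Disallow: /

# This allows everything (the Disallow value is simply left empty):
User-agent: *
Disallow: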

XML Sitemap

The XML sitemap is also a file that should be located in the primary directory of your website. This file serves as a map for the search engine to quickly understand how the pages of your site work together. While the search engine will likely find all pages on your site eventually through its crawling, it's important to make that process easier for them by uploading this file and directing the crawler to the pages you would have them see as most important.

  • Does an XML sitemap exist?
  • Is it formatted correctly? (See the sketch after this list.)
  • Has it been submitted to Google Webmaster Tools?
  • Does the sitemap contain pages excluded in Robots.txt?
  • Does the sitemap include improperly formatted URLs?
  • Do the pages included in the sitemap deserve to be included? Don’t add empty eCommerce listings or stub pages.
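"Formatted correctly" mostly means following the standard sitemap protocol. Here's a bare-bones, hypothetical example with a single URL entry:

<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/category-keyword/primary-keyword</loc>
    <lastmod>2014-11-01</lastmod>
  </url>
</urlset>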

Malware


Hunch Hunch, What What?

If your site has been compromised by a PHP injection, or spammers have hijacked your comments or forums, then you're in for a bad time. Check periodically for warnings in Google Webmaster Tools, and also beware of sudden spikes in traffic to the user-generated portions of your site. I've helped several people recently who had to disavow a ton of links pointed to forum pages that were generated with links to pills, pron and gambling sites.

Site Load Speed – Page Timings

The page load speed of a site can often influence the way search engines look at a page or site. If it takes too long to load, they will assume that provides a worse user experience than if it were to load faster and they most certainly take that into consideration when ranking websites.

“Amazon finds a 100ms slowdown – one tenth of a second! – can mean a 1% decrease in revenue.”

According to surveys done by Akamai and Gomez.com, nearly half of web users expect a site to load in 2 seconds or less, and they tend to abandon a site that isn’t loaded within 3 seconds. 79% of web shoppers who have trouble with web site performance say they won’t return to the site to buy again and around 44% of them would tell a friend if they had a poor experience shopping online.

Site: Search

One of the best ways to understand how Google has indexed your site is to do what is called a "site search." This involves using a search modifier for Google along with the domain of your website. The search query would look like this: "site:basecampleasing.com." This search will return all pages that Google has indexed for your site.

  • Does the # of indexed pages match the real # of pages?
  • Are there duplicate pages being indexed?
  • Are there thin or stub pages being indexed?
  • Has your site been indexed at www, non-www or with /index.php or any other URL string?

URLs

Page URLs should be structured in a way that makes sense with the content on the site and that makes sense to the user if they were to simply read it. This usually involves including a topical category level in the URL if applicable and using targeted keywords in the URL where possible.

Below is an example of the way a very solid URL for SEO would be structured:

http://www.example.com/category-keyword/subcategory-keyword/primary-keyword

vs.

http://www.example.com/prod/cat/unit1234

You will need to carefully plan out any URL re-writing with proper redirects in your .htaccess file to fix these structural URL issues. Don't make any sudden changes that could cause pages that are indexed to suddenly return error messages.
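On Apache, an individual rewritten URL can be handled with a one-line 301 in that .htaccess file. This is just a sketch using the example URLs above:

Redirect 301 /prod/cat/unit1234 http://www.example.com/category-keyword/subcategory-keyword/primary-keyword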

Meta Titles

Meta titles are supposed to be coded into the head of the HTML on every page of your website. This element generates the title that you see in the top tab of your web browser, and more importantly, the title snippet seen on the SERP for pages on your website is pulled from that title tag. It is also one of the most important elements that Google looks at when trying to understand what a page on your site is about and its relevance.

Meta titles also should generally include targeted keywords that are relevant to the content on that specific page. The title should be unique to each page on the site to signify the difference in content across the site. They should include both a strong keyword phrase and the name of the company in order to maintain consistency across the site. A big issue is title length, with far too many people making them too long, which will cause Google to make changes to the title it shows in its SERP.

Meta Descriptions

Taking control and writing an effective meta description allows you to draw more searchers through to the site, increasing your click-through rate, which makes your organic reach more effective even if you're not changing your rankings directly.
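Both elements live in the head of the page. A hypothetical example with placeholder copy:

<head>
  <title>Primary Keyword Phrase | Company Name</title>
  <meta name="description" content="A compelling one- or two-sentence pitch for this page that earns the click.">
</head>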

Meta Keywords

We could exchange reciprocal links! Oh, I know… want to be part of my link wheel?

Created with due respect to Ryan North of Dino Comics from his template

Seriously? Sigh. Come on people. Just kill these tags if they exist.

OG tags, Twitter Cards and Rich Pins

These tags from Facebook help populate the correct image, title and description when your page is shared, and lead to increased click-through from that social media channel. Adding Twitter card markup has a similar effect, but also gives you relevant analytics data at analytics.twitter.com to show you your tweets' performance. Verifying your site with Pinterest allows you to post rich pins from your site, which improve click-through.
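A basic set of Open Graph tags looks something like this (the values are placeholders; Facebook's documentation covers the full list):

<meta property="og:title" content="Your Post Title">
<meta property="og:description" content="A one-sentence summary of the post.">
<meta property="og:image" content="http://www.example.com/images/post-image.jpg">
<meta property="og:url" content="http://www.example.com/your-post/">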

Headings

Don't think you're going to suddenly jump to the top of the rankings when you use header tags, but don't forget them either. You should naturally be using h1s for things like the blog post title, and h2s for sub-topics. It's helpful to remember that using headers is also a usability issue. I always use these headers as "tabs" to grab skimming page visitors looking for the piece of content they need.
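In practice that just means a sensible outline in your HTML, something like this sketch:

<h1>A Simple DIY Website Audit</h1>
<!-- intro copy -->
<h2>Robots.txt</h2>
<!-- that section's content -->
<h2>XML Sitemap</h2>
<!-- and so on -->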

Duplicate Content


Watch out for duplicates of your original content!

There are actually two ways to look at this issue. The first way is looking for content that's been duplicated on the site itself. This can happen really easily in big eCommerce sites where there are a lot of entries. It's also a common side effect of many CMS platforms. This should be detected in your crawl. Remove the duplicate, any links to the duplicate page, and add a 301 redirect. If you can't because your CMS has its limits, then add a rel=canonical tag to the one you'd prefer to be indexed.

The second thing you need to watch out for is off-site content duplication. Use a 3rd party tool like Copyscape to check to see if your content is duplicated on other sites. This could be your own site copying from others or a problem with scraping by 3rd party sites. If your site shares content with other sites, it could be stemming from use of a manufacturer's description for your eCommerce products, laziness, or plagiarism, and it should be hunted down and stopped immediately.

Duplicate Content

Watch out for duplicates of your original content!

There are actually two ways to look at this issue. The first way is looking for content that's been duplicated on the site itself. This can happen really easily in big eCommerce sites where there are a lot of entries. It's also a common side effect of many CMS platforms. This should be detected in your crawl. Remove the duplicate, any links to the duplicate page, and add a 301 redirect. If you can't because your CMS has its limits, then add a rel=canonical tag to the one you'd prefer to be indexed.

The second thing you need to watch out for is off-site content duplication. Use a 3rd party tool like Copyscape to check to see if your content is duplicated on other sites. This could be your own site copying from others or a problem with scraping by 3rd party sites. If your site shares content with other sites, it could be stemming from use of a manufacturer's description for your eCommerce products, laziness, or plagiarism, and it should be hunted down and stopped immediately.

[See what I did there?]
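Either way, the canonical tag itself is a one-liner in the head of the duplicate page, pointing at the version you want indexed (the URL here is a placeholder):

<link rel="canonical" href="http://www.example.com/the-original-product-page/">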

Image Optimization

There are a lot of things you could be doing wrong with your images. Here are the two most important ones (with an example after the list):

  • All images should have non-spammy but relevant descriptions added as alt text and optionally in the title element as well.
  • All images should have a filename that is relevant to the product, and not just a string of numbers or slider1432, because that also is a signal to search engines about the relevance of the image.
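Put together, a well-optimized product image might look like this hypothetical example:

<img src="red-floral-vintage-dress.jpg" alt="Red floral vintage dress with lace collar">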

Favicon

This small image in a .ico format is displayed in the browser and becomes the icon when the site is bookmarked. It’s the cherry-on-top. Don’t forget about it.
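Most browsers will request /favicon.ico on their own, but you can also point to it explicitly in the head:

<link rel="shortcut icon" type="image/x-icon" href="/favicon.ico">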

Broken Pages – 404 & 500

There are a lot of reasons why you could end up with a 404 that go beyond the page simply not existing. It’s key to map out how many pages are actually broken through your crawl, where those errors are linked from, how they are broken, and get them fixed.

The 500 error is a server error message and tends to be a programming or CMS-caused issue, but it should still be fixed with the same process below.

When you check your crawl to see how many of these errors have been created, don't just fix the problem that created them on the page. Be sure to add 301 redirects via your .htaccess file for those broken URLs to a relevant resource page. Once you've done that, check in Google Webmaster Tools to see if their Crawl Errors report has that page listed and mark it as fixed.

404 Page

It's going to happen eventually. Someone is going to type in the wrong address, click on a malformed link and end up on your 404 page. But that doesn't mean you should lose the value of that visitor. Make your 404 more than an apology; guide those visitors to your highest value pages and offerings, or at least entertain them with clever error messages.
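On Apache, you can point broken URLs at that custom page with one line in your .htaccess (the path to your 404 template is whatever you make it):

ErrorDocument 404 /404.html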

Gathering All Of The Data


Now don’t get too frustrated. This is going to help make it all better in the end.

Okay. This is where the rubber meets the road. I've provided the template of your choice in Google Documents, Spreadsheets and Slides for you; now go and see just how many of these mistakes your site has made.

Presenting Your Audit Findings


You don’t have to dance, but you are certainly entitled to do so.

While there are even more things to check in every area I listed, and tons more beyond that, you really should be feeling good when you complete this audit. It may take some time to actually get these things fixed, if they ever are, but you will know you've done everything you can to improve the situation. Good job!