With SEO responsibilities leaking into so many other job roles these days (social media, development, UX, design, video and so on), it’s amazing there are still traditional “SEO” vacancies going. The problem with SEO being spread across multiple roles is that it can get neglected if no one person is looking after it.
Not only that, SEO methods are constantly evolving, because search engines keep developing and user behaviour keeps changing. Keeping up to date with the latest algorithm updates can be tricky (as of right now, the latest buzz is around “Fred”).
So to help you out, we’ve put together the top 5 common mistakes we find when auditing client sites:
#1 Missing or duplicate meta descriptions
Meta descriptions don’t directly affect your rankings, but they help users scanning a Google results page figure out whether your content is the kind of content they want to read. Not having one means you’re likely to lose clicks to the websites above (or below!) you in the organic results.
Yes, we know that writing meta descriptions can be a bit of a chore, especially if you’re having to do a whole bunch in one go. But! If you get into the habit of writing one before you publish your page, you’ll thank yourself later when you have 0 missing meta descriptions. Think of it like doing the washing up after dinner, it’s got to get done at some point, so just do it now.
Meta descriptions should be kept to around 156 characters before Google truncates them, which in itself can be tricky. Think of it as a short summary of what that page’s content is about.
Quite often in our audits, we find that clients have meta descriptions – yay! – but they all say the same thing – boo. This is bad because you aren’t summarising what each page is about for the user browsing through search results on Google. A duplicate description gives searchers no reason to pick one page over another, so write a unique one for every page.
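If you want a quick way to spot these problems across a batch of pages, a small script can do the first pass for you. This is just an illustrative sketch (the `audit_meta_descriptions` helper and the sample pages are our own invention, not part of any SEO tool):

```python
from html.parser import HTMLParser

class MetaDescriptionParser(HTMLParser):
    """Pulls the content of <meta name="description"> out of an HTML page."""
    def __init__(self):
        super().__init__()
        self.description = None

    def handle_starttag(self, tag, attrs):
        attrs = dict(attrs)
        if tag == "meta" and attrs.get("name", "").lower() == "description":
            self.description = attrs.get("content", "")

def audit_meta_descriptions(pages):
    """pages: dict mapping URL -> raw HTML. Returns (missing, duplicates)."""
    seen = {}
    missing = []
    for url, html in pages.items():
        parser = MetaDescriptionParser()
        parser.feed(html)
        if not parser.description:
            missing.append(url)
        else:
            seen.setdefault(parser.description, []).append(url)
    # Any description used by more than one URL is a duplicate
    duplicates = {desc: urls for desc, urls in seen.items() if len(urls) > 1}
    return missing, duplicates
```

Feed it your pages’ HTML and you get back a list of URLs with no description at all, plus any descriptions shared by more than one URL.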
#2 Not having a keyword document
A keyword document is really important to have, not just so you have another fancy-looking document to wave around at people, but because it lets you build content around the keywords you want to target. Aim for a top 10 as your key focus, plus up to 30 secondary keywords for some variety.
We’d recommend reviewing your keywords every 6 to 12 months to make sure they’re all still relevant, and to add any more that are needed as your company grows its business.
You should also be using an SEO tool to track your rankings against those chosen keywords. One we use for our clients is MOZ Pro. It shows your SERP position, where your competitors sit for each keyword, and each keyword’s search volume.
#3 Keyword stuffing
So, do you know what keyword density is? It’s the percentage of a page’s total word count taken up by occurrences of a keyword or phrase. There’s a whole debate about “what’s the ideal keyword density”, but unfortunately there is no magic number. To give you an idea, though… 5% is probably too high.
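To make that definition concrete, here’s a rough sketch of the calculation in Python. The `keyword_density` helper is purely illustrative (real SEO tools count occurrences in more sophisticated ways), but it shows the percentage described above:

```python
import re

def keyword_density(text, phrase):
    """Percentage of the page's words accounted for by occurrences of `phrase`."""
    words = re.findall(r"[a-z0-9']+", text.lower())
    phrase_words = phrase.lower().split()
    if not words:
        return 0.0
    n = len(phrase_words)
    # Count every position where the phrase appears as a run of words
    hits = sum(1 for i in range(len(words) - n + 1)
               if words[i:i + n] == phrase_words)
    return 100.0 * hits * n / len(words)
```

On a real page of, say, 500 words, a single keyword appearing 25 times would already put you at that 5% danger zone.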
You may not be aware you’re doing it – or if you are, shame on you. The easiest way to check is to read back over the page and ask yourself, “does this read naturally?”.
Keyword stuffing can really harm your SEO performance. It falls under the Panda algorithm, which looks at content quality. Google also applies semantic analysis (often referred to as Latent Semantic Indexing) that recognises your content’s topic without the need for stuffing.
#4 Not submitting an XML sitemap to Search Console
So, this heading is two-fold. Firstly, do you have an XML sitemap? If the answer is yes, move on to paragraph two. If the answer is no… why the hell not? XML sitemaps are there to help your pages be found by Googlebot. A case study at Bruce Clay found that the proportion of pages indexed rose from 24% to 68%, all because of an XML sitemap. So, if you don’t have one, go create one. They’re free to create (providing your website isn’t over 500 pages).
Those of you that do have an XML sitemap, go you! But… is it loaded into Google’s Search Console? And is it up to date? You should be using a CMS that updates your XML sitemap automatically, so it’s rare for one to be out of date. The important part is having it loaded into Search Console. The reason you want to do this is so that Google is aware of it, which is obviously useful when you want pages to be indexed.
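If your CMS can’t generate a sitemap for you, a minimal one is simple enough to build yourself. A rough sketch using Python’s standard library (the `build_sitemap` helper is our own illustration; production sitemaps usually also carry `<lastmod>` entries):

```python
from xml.etree import ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap string from a list of page URLs."""
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for page in urls:
        url_el = ET.SubElement(urlset, "url")
        ET.SubElement(url_el, "loc").text = page
    return ET.tostring(urlset, encoding="unicode")
```

Save the output as sitemap.xml at your site root, then submit that URL in Search Console so Google knows where to find it.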
#5 Poor URL architecture
This is a tricky one to just pick up and fix like the other points in here, so we don’t recommend you do it in a rush.
What we mean by URL architecture is having a properly categorised folder structure within the URL, e.g. www.example.com/blog/seo/my-post.
As a rule of thumb, you don’t want to be going any deeper than three folders.
Ways to identify whether your URL structure is poor:
- Numbers and letters jumbled into the URL
- URLs with no folder structure, e.g. a blog post that doesn’t sit in a “blog” folder within the URL
- News or blog posts listed in multiple categories, which leads to duplicate content issues
- Folder URLs that produce a 404, e.g. www.example.com/news/ should go to a news homepage, but instead just shows a 404 page
Fixing any of these issues will involve some development help, but if you, the SEO expert, work out the hierarchy first, you can talk it through with your development team to implement.
Hierarchy is important because it helps search engines find your content more quickly, and it lets link juice flow around the site more efficiently.
If you’ve found that you have committed any of these common SEO mistakes, go and fix them right now! Or, a better solution, give us a call and we’ll come sort them out for you.
Posted 31 August 2017 by