Nathan Watkins

Unlock your Website's Potential with Technical SEO Audits

Updated: Feb 24, 2023

A good SEO strategy isn’t just picking the highest-ranking keywords and slapping them onto your website’s pages. There is so much more going on under the surface of your pages that can be tinkered with and optimised to bring you the best results possible. All of these things come under the umbrella term of technical SEO, and in this article, we’ll dissect what a typical technical SEO audit looks like, everything that’ll appear on your checklist, what these things mean, and how they can be optimised to maximise your website’s performance.


Technical SEO Defined

Technical SEO can be aptly summarised as everything SEO-related that goes on below the surface of your pages. When discussing technical SEO, we are referring to things such as how effectively search engines can crawl and index the pages of your site, your website’s speed, your general site health and hygiene in terms of error codes, and much more. Technical SEO is not concerned with your content, but rather with the structures and skeleton of your website in which the content sits; optimising these structures helps your website perform better than ever.



What is a Technical SEO Audit?

Put simply, a technical SEO audit is a complete analysis of the technical setup of your website, pointing out areas that can be improved so that search engines can crawl, index, and rank your pages competently. There are loads of marketing tools out there that give you a great deal of insight into your website’s technical capabilities, with some of the most popular being:


Google Tag Manager:

If you’re looking to use tags on your website to analyse performance metrics, Google Tag Manager is the only tool you’ll need. Your technical SEO audit could check whether these tags are firing as they should be.


GTmetrix:

GTmetrix helps you to analyse your website’s page speed by providing easy-to-understand grades for different aspects of your site. You can use this info to optimise your site and make it run much faster and smoother than before.


Screaming Frog:

A website crawler tool that serves as the bread and butter for any technical SEO audit you run. It highlights common SEO issues such as errors, redirects, canonicals, and more.


Search Console:

A web service that lets you check for issues, errors, and the index status of your website’s pages. Use it as part of your technical SEO audit to optimise the visibility of your website.


Siteliner:

Duplicate content can be a damning issue for your website and can stop it from ranking as highly as it should. Siteliner is a tool that assesses the amount of duplicate content on your website and can help you make your pages more distinct from one another.



Your Technical SEO Audit Checklist

Your technical SEO audit should start with an agenda: a checklist that covers everything you’re looking to analyse and optimise over the course of the project. The checklist below might not be identical to the ones used by other SEO specialists and agencies, but it covers all of the key areas of your website that should be evaluated, including:

  • Is your site speed as quick as it could be?

  • Are there any issues with your website’s hosting?

  • Have you checked for insecure content?

  • Are your campaign tracking measures functioning as they should be?

  • Does your site have any crawl errors or error status codes?

  • Have your redirects been set up correctly?

  • Have you looked into your Meta Robots directives and Robots.txt files?

  • Has canonicalization been put in place appropriately?

  • Are your Hreflang tags working correctly (if you use them)?

  • Have you checked your sitemap?

  • Is all of your Metadata adequately optimised?

  • Have you checked all of your image alt attributes?

  • Have you checked that there's no duplicate content across your website?

  • Have you evaluated your header tag structure?

  • Is your website mobile-friendly and usable on smaller screens?

  • Is any Schema Markup you use working effectively?



Site Speed

Site speed is probably the most common, most noticeable, and most annoying technical SEO issue around. Having a website that runs quickly is not only a valuable ranking factor for your SEO, but it also has an indirect influence on how visitors behave on your website. How many times have you clicked off a website entirely purely because it was taking ages to load? A slow website usually means a higher bounce rate, which means fewer people see what your website has to offer. You can gain an insight into how fast your website runs and the areas in which it performs poorly by using tools such as GTmetrix and Google PageSpeed Insights. These tools can shine a light on specific issues such as:


  • JavaScript: JavaScript is a great way to enhance the look of your website and make its pages pop, but it can also slow your pages down. You can choose to defer the parsing of JavaScript, which means the main content of your pages loads first and the scripts only run once the page itself has been parsed (see the sketch after this list).

  • Image Sizes: The images on your website’s pages can quite often be much bigger than needed without you realising, forcing the browser to download far more data than necessary. Simply put, larger images take longer to load. Optimising image sizes lightens the load and makes your pages run faster for the user.

  • Enable Browser Caching: If you enable browser caching, you allow files from your pages to be stored in the user’s browser. Doing so prevents those files from having to be downloaded again from scratch on repeat visits, making your website run much quicker than before.
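As a rough illustration of the first two points (a minimal sketch; the file names and sizes are placeholders), deferring a script and serving an image at the size it is actually displayed might look like this:

    <!-- The defer attribute lets the HTML finish parsing before the script runs -->
    <script src="/assets/main.js" defer></script>

    <!-- An image served at its display size, with explicit dimensions declared -->
    <img src="/images/hero-1200x600.jpg" width="1200" height="600" alt="Homepage hero image">

Browser caching, by contrast, is configured on the server (or through your hosting platform) rather than in the page’s HTML, so speak to your developers or host about enabling it.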



Hosting

The hosting you use for your website can be a big factor in its site speed and your overall SEO efforts. To check for any issues with your hosting, you’ll need to access your server directly; the most common problems are slow server response times and issues with your TLD (Top Level Domain). Your domain should be the most relevant one for your country; for example, a UK-based website should generally use the .co.uk TLD rather than .com. If your site is accessible on both domains, you need to redirect the secondary domain to the version you want to rank. Hosting issues are among the trickiest to investigate and deal with correctly, so you should always work closely with your developers to make sure you solve them properly.



Insecure Content

Insecure content is an issue concerned with HTTPS (HyperText Transfer Protocol Secure). In simple terms, HTTPS is a more secure version of the older HTTP protocol: a URL that begins with https:// is far safer than one that starts with http://. With Screaming Frog, you can see a list of every page on your website and its respective address, and you should look to redirect any pages still served over HTTP to their HTTPS addresses as soon as possible.
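As a simple illustration (the URLs are placeholders), mixed content happens when a secure page pulls in an asset over plain HTTP:

    <!-- Insecure: an HTTPS page loading an image over plain HTTP triggers mixed-content warnings -->
    <img src="http://www.example.co.uk/images/logo.png" alt="Company logo">

    <!-- Secure: the same asset requested over HTTPS -->
    <img src="https://www.example.co.uk/images/logo.png" alt="Company logo">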



Campaign Tracking

If you are using tags (such as with Google Tag Manager) to track your website’s performance, then your audit will need to check that these tags are firing as they should be. For Google Tag Manager, this is simply a matter of clicking around your website and seeing whether the right metrics populate correctly (e.g. if you visit your website, the Sessions metric should go up). You can check this even more easily by installing the Google Tag Assistant Chrome extension. If these metrics aren’t changing, then there may be issues with the code of your tags.
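If you want to confirm that a specific interaction is being picked up, one common pattern is to push a custom event into Google Tag Manager’s data layer and watch for it in the container’s preview mode. This is only a sketch, and the event name below is purely illustrative:

    <script>
      // Push a custom event that a Google Tag Manager trigger can listen for
      window.dataLayer = window.dataLayer || [];
      window.dataLayer.push({ "event": "contact_form_submit" });
    </script>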



Status Codes

Your technical SEO audit will be looking to identify and solve a variety of different errors, and the most obvious to find are pages returning error status codes. 4xx errors (most commonly 404) tell the user a page is missing or inaccessible, while 5xx errors cover server-side problems in accessing your pages. Other status codes don’t signify errors: every normal, functioning page on your site should return a 200 code (meaning the page is in good health), or a 3xx redirect code if you’re using redirects on those pages. Dealing with error codes is a simple matter of crawling your site to find the problem pages and redirecting the user from each one to a relevant, functioning page. Redirects can come with their own wrinkles, however.
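At the HTTP level, these codes are simply the first line of the server’s response to a request for a page; the examples below are purely illustrative:

    HTTP/1.1 200 OK                      <- a healthy, indexable page
    HTTP/1.1 404 Not Found               <- a missing page that should be redirected or restored
    HTTP/1.1 500 Internal Server Error   <- a server-side failure to raise with your developers or host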



Redirects

When users land on a URL that has moved or no longer exists, the process of automatically sending them to the most relevant page is known as a redirect. Several redirects in a row before users reach their final destination is known as a redirect chain; too long a chain can slow your website right down and make it very frustrating for your users to navigate. Using a tool like Screaming Frog, you can identify every instance of pages redirecting and optimise this process by eliminating redirect chains: simply have the first redirect take the user straight to their final target page.


Redirects also come in different forms depending on why you are using them in the first place. Your audit should check all of your redirects to make sure the right type is being used in the right situation. A 301 redirect is a permanent move, used when a page is never coming back to your site; a 302 redirect, on the other hand, is a temporary measure for pages that should eventually return after they are updated or fixed.
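How a redirect is actually implemented depends on your server or CMS, but at the HTTP level the only difference is the status code returned alongside the destination; the URLs below are placeholders:

    HTTP/1.1 301 Moved Permanently
    Location: https://www.example.co.uk/new-page/

    HTTP/1.1 302 Found
    Location: https://www.example.co.uk/temporary-page/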




Robots.txt & Meta Robots

Your robots.txt file and Meta Robots directives are two pieces of code you can use to tell search engines which parts of your site you would like to be crawled and indexed, and in what way. The most common use for both is to flag pages that you do not want indexed (for example, pages with no inherent search value, such as a cookies policy page). In your technical SEO audit, you can check both by using Screaming Frog’s SEO Spider; you need to ensure they are being applied correctly and are not accidentally blocking web crawlers from pages that you do want indexed.
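As a rough sketch of the two mechanisms (the paths and URL are placeholders), robots.txt sits at the root of your site and controls crawling, while a Meta Robots tag sits in the head of an individual page and controls indexing:

    # robots.txt - block crawlers from a low-value section and point them at the sitemap
    User-agent: *
    Disallow: /cookies-policy/
    Sitemap: https://www.example.co.uk/sitemap.xml

    <!-- Meta Robots directive placed in the <head> of an individual page -->
    <meta name="robots" content="noindex, follow">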



Canonicalization

Your canonical tags are code snippets that instruct web crawlers to index the right version of your pages. Many websites contain dozens of pages that are incredibly similar to one another; for example, an online store selling clothing may have several near-identical pages for each variation of a certain item (e.g. style, colours, etc.). Because these pages are so alike, only one version needs to be indexed by web crawlers, and you need to decide which page is the “canonical” version. Your audit should delve into whether or not each of your pages has the correct self-canonicalization tag applied, and you can check all of these at once using Screaming Frog’s SEO Spider tool.
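A self-referencing canonical tag is a single line in the page’s <head>; the URL here is just an example:

    <!-- Tells crawlers that this URL is the preferred version of the page -->
    <link rel="canonical" href="https://www.example.co.uk/jackets/blue-waxed-jacket/">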



Hreflangs

Hreflang tags aren’t used on every website, but if yours does use them then they are something that needs to be considered. They are little snippets of code that inform web crawlers about the different language variations a page has. Hreflangs tell crawlers not to treat these pages as duplicate content, and also help make sure the person viewing the website gets the right language for their location. Your hreflang tags will need to be checked with a tool like Screaming Frog to make sure all is running as it should be.
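A typical hreflang set, placed in the <head> of each language variation, looks something like this (the URLs are illustrative):

    <!-- Each variation lists itself, its alternatives, and a default fallback -->
    <link rel="alternate" hreflang="en-gb" href="https://www.example.co.uk/">
    <link rel="alternate" hreflang="fr-fr" href="https://www.example.fr/">
    <link rel="alternate" hreflang="x-default" href="https://www.example.com/">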



Sitemaps

A sitemap is, as hinted by the name, a map of your website. Web crawlers such as Googlebot use sitemaps to navigate your website easily and identify the pages that you want to be indexed. Your technical SEO audit should check that your sitemap is crawlable and contains no errors or broken links of any kind. Using Search Console, you can quickly check for errors after uploading the most recent version of your sitemap.
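An XML sitemap is usually just a list of the URLs you want indexed, hosted at an address like /sitemap.xml; a minimal, illustrative entry looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>https://www.example.co.uk/services/</loc>
        <lastmod>2023-02-24</lastmod>
      </url>
    </urlset>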




Metadata

Metadata is what we call the snippets of data that describe every page on your website, in the form of a Meta Title and a Meta Description. This information is most commonly seen in Search Engine Results Pages (SERPs). Metadata needs to be optimised to an appropriate character length: if titles or descriptions are too long, they may be truncated so the full text isn’t visible to the user; if they are too short, search engines may not end up displaying them at all. Meta Titles should be between 40 and 66 characters in length, while Meta Descriptions need to be between 120 and 155 characters long. You should also look to factor in some keyword optimisation to get these pages ranking as highly as possible!
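Both pieces of metadata live in the <head> of the page; the wording below is purely illustrative and sized to sit within the lengths suggested above:

    <title>Technical SEO Audits Explained | Example Agency</title>
    <meta name="description" content="Learn what a technical SEO audit covers, from site speed and redirects to sitemaps and schema, and how to fix the issues it uncovers.">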



Image Alt Attributes

Image alt attributes are pieces of code that describe images for visually impaired users and their screen readers, and they also give search engines context about what an image shows. You can use a tool like Screaming Frog’s SEO Spider to identify all the images on your site that are either missing these attributes or could be optimised further through keyword research.
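An alt attribute is simply part of the image tag itself; the file name and description below are illustrative:

    <!-- Descriptive alt text helps screen readers and gives crawlers context about the image -->
    <img src="/images/blue-waxed-jacket.jpg" alt="Men's blue waxed cotton jacket, front view">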



Duplicate Content

Duplicate content can be a big issue for your website’s SEO and ability to rank. The two main issues associated with duplicate content are when:

  1. Content is repeated across many pages of your own website.

  2. Content on your website already exists on another website entirely.


Search engines like Google hate plagiarised content and come down heavily on those that publish it, so whether you have accidentally copied someone else’s content a little too closely or vice versa, you will need to adapt your content to make it more unique. Websites with unique content are valued much more highly than those without it, so this is a very important issue to tackle if the content on your website is repeated too often or already exists elsewhere on the internet.



Header Tags

The way a page’s content is structured is an important ranking factor, and the lack of a clear header structure can be detrimental to achieving the best positions. Achieving a good header structure is very simple: you need one H1 heading tag at the top of your page, each new section on your page needs an H2 tag, and every subsection under your H2s needs an H3 tag. To visualise it a little crudely, it should look something like this:


Header 1

  • Header 2

    • Header 3

  • Header 2

    • Header 3
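In the HTML itself, that structure translates into heading tags; the headings below are placeholders:

    <h1>Technical SEO Audits</h1>
      <h2>Site Speed</h2>
        <h3>Deferring JavaScript</h3>
      <h2>Redirects</h2>
        <h3>301 vs 302 Redirects</h3>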



Mobile Usability

Back in the day, websites would be built purely for the average desktop user, but as of November 2021, around 54% of all website traffic comes through mobile. Your website needs to account for all your mobile users and be as mobile-friendly as it possibly can be. Use Google Search Console to receive a mobile usability report that highlights a variety of issues, from the size of content sections and text fonts to the positioning of your page’s clickable elements. You can also simply conduct a manual sense check of your website to see if it looks as it should from the perspective of a mobile user. Remember, over half of your site visitors will be looking at it through the much smaller screen of their smartphones!
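One common baseline check (a starting point rather than a complete fix) is that each page declares a responsive viewport in its <head>:

    <!-- Without this tag, mobile browsers render the page at desktop width and shrink it down -->
    <meta name="viewport" content="width=device-width, initial-scale=1">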



Schema

Schema is the way you, as the webmaster, can provide additional nuggets of information to search engines, in the form of code snippets that add context and extra detail to your pages. For example, a page with an FAQ section can have Schema Markup in its code that tells search engines the page contains an FAQ. The advantage is that this helps search engines better understand what your pages are, and there are Schema options for hundreds of different situations. As part of your technical SEO audit, you can check for any errors in your Schema Markup by using Search Console, which highlights what the issues are and where they occur. You could also use the Structured Data Testing Tool to achieve this.
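Sticking with the FAQ example, structured data is most commonly added as a JSON-LD snippet in the page; the question and answer below are purely illustrative:

    <script type="application/ld+json">
    {
      "@context": "https://schema.org",
      "@type": "FAQPage",
      "mainEntity": [{
        "@type": "Question",
        "name": "What is a technical SEO audit?",
        "acceptedAnswer": {
          "@type": "Answer",
          "text": "A complete analysis of the technical setup of your website."
        }
      }]
    }
    </script>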



