The first thing all our SEO clients want us to do is perform a technical SEO audit of their websites.
The audit tells you what problems are on the site as well as what needs to be done to improve it.
I feel that clients also want to gauge our SEO capabilities when they ask for this.
An audit also shows me, as an SEO, the quick wins that could bring in organic traffic, as well as the critical issues that could be blocking the site from search engines.
I like breaking my technical SEO audits into four parts:
- Technical problems on the site
- Content audit
- Competitor audit
- Backlinks audit
Fortunately, these days there are many free tools one can use to perform an SEO audit. They are, however, never fully accurate. Beyond using the tools, I like giving my audits a personal touch, where I go through the site and see it from my own point of view.
I also like putting myself in the target audience's shoes and thinking about which keywords they might be searching to land on the site.
Components of a Technical SEO Audit in Kenya
What is the overall user experience on the site?
Is it easy to navigate?
What is the mobile responsiveness of the site?
How good is the site speed?
Does the site have an SSL certificate?
Well, these are just bits of what a proper audit should cover.
I like going a step further and looking at the Google Search Console and Google Analytics data.
What do they tell me about the site?
Are there pages that are not indexed, and why are they not indexed?
Do we have sitemap.xml and robots.txt files uploaded to the root folder of the site?
Are there 404 or, say, 500 errors being displayed on the Google Search Console dashboard?
Has the site been put under a manual action or flagged for security issues, and have the webmasters addressed them?
I will also look for cases of keyword cannibalization or check whether the site has recently been hit by a Google update.
If the site is using schema markup, could there be issues with it?
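Schema is usually added as a JSON-LD block in the page's head. As an illustration only (the business details below are placeholders), a minimal LocalBusiness snippet might look like this; problems with it would surface in GSC's enhancement reports or the Rich Results Test:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Example SEO Agency",
  "url": "https://www.example.com/",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Nairobi",
    "addressCountry": "KE"
  }
}
```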
What is Google indexing on the domain? Is it what is needed? I have seen cases where a previously expired domain was showing hacked content on Google. This will, of course, affect the ranking of the site.
How to perform a technical SEO Audit
It is now easier than ever to perform a technical SEO audit.
All you need is Google Search Console access for the site you are auditing.
SEO Audit checklist
You can download our SEO technical audit checklist here
- How many pages are indexed?
- Do we have thin content?
- How are the pages rendered by search engines?
- Does the site have a sitemap.xml page and robots.txt file, and what are their contents?
- From Google Search Console, what is affecting indexing?
- Keyword cannibalization
- SSL certificate
- User experience and mobile responsiveness
- Site speed
- Core web vitals
- Keyword rankings
- Spammy backlinks
- HTTP status codes, e.g. 404, 500, etc.
Steps to performing a technical SEO audit in Kenya
Check for the presence of malware on the site.
If you do not have Google Search Console access, you can use free tools such as Sucuri SiteCheck to do this. If there is malware or signs of hacked content, you will need to remove it.
With GSC access, you can check for security issues as well as any manual actions on the site. The problem with this is that GSC only reports such cases after it has crawled the site and found the issue.
Does the site have a valid SSL certificate? If not, you need to install one. You could use a Positive SSL certificate for this.
Are all pages and assets served over the SSL certificate? You can use tools such as Geekflare to check for cases of mixed content.
One format of the site
Check that all formats of the site redirect to one, i.e. the http, https, www and non-www versions all point to one site.
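On an Apache server, for instance, this can be handled with rewrite rules in .htaccess. A sketch, assuming you want the https://www version to be the canonical one (nginx or your CMS would do this differently):

```apacheconf
RewriteEngine On
# Send any http:// or non-www request to the https://www version with a 301
RewriteCond %{HTTPS} off [OR]
RewriteCond %{HTTP_HOST} !^www\. [NC]
RewriteCond %{HTTP_HOST} ^(?:www\.)?(.+)$ [NC]
RewriteRule ^ https://www.%1%{REQUEST_URI} [R=301,L]
```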
You will also need to examine whether there are any indexing issues on the website.
–From GSC, check that the important pages are indexed. All your service and product pages should be indexed. If you are operating a multi-location business, you would want all your location landing pages to be indexed, right? Blog pages targeting keywords you want to rank for should also be indexed.
–Check that important pages are in sitemap.xml. The easiest way to ensure that all your important pages are frequently crawled and indexed is to have them in the sitemap.xml file. Any page that should not be indexed should be removed from this file. Also, are there any pages that should be indexed sitting in the Google Search Console removals tab?
–Check that they are allowed in robots.txt. Again, pages that should be crawled must not be blocked in the robots.txt file. There is no need to copy in each URL individually; an empty Disallow rule leaves all site pages crawlable. The sitemap should also be referenced in robots.txt.
–On WordPress sites, see that categories, archives and tags are not indexed, to avoid duplicate content issues.
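For the robots.txt step, a minimal permissive file that still points crawlers at the sitemap might look like this (example.com is a placeholder):

```
User-agent: *
Disallow:

Sitemap: https://www.example.com/sitemap.xml
```

An empty Disallow line allows everything; you would add Disallow rules only for paths you genuinely want kept out of the crawl.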
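The sitemap check above can be scripted: pull every loc URL out of sitemap.xml and compare the list against the pages you consider important. A sketch using only Python's standard library, with a small inline sample standing in for a real downloaded sitemap (the URLs are placeholders):

```python
import xml.etree.ElementTree as ET

# In practice you would fetch https://www.example.com/sitemap.xml;
# this inline sample stands in for the real file.
SITEMAP_XML = """<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/services/seo-audit</loc></url>
</urlset>"""

NS = {"sm": "http://www.sitemaps.org/schemas/sitemap/0.9"}

def sitemap_urls(xml_text):
    """Return the list of <loc> URLs found in a sitemap.xml document."""
    root = ET.fromstring(xml_text)
    return [loc.text.strip() for loc in root.findall(".//sm:loc", NS)]

# Compare against your own list of must-index pages.
print(sitemap_urls(SITEMAP_XML))
```

Any important page missing from the output is a candidate for adding to the sitemap; anything present that should not be indexed is a candidate for removal.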
Are there any thin pages indexed? For this, you might have to inspect each page manually. A thin page might have just a title and a single image, yet the page is indexed.
There are instances where you will find webpages with duplicate content that are nonetheless indexed. This happens very often on ecommerce sites, where one product has many facets. Check that no duplicate content exists on such pages. If it does, and it usually does, add canonical tags pointing to the main product page you want indexed.
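The canonical tag goes in the head of each duplicate or faceted page; a sketch with placeholder URLs:

```html
<!-- On the faceted/duplicate variant, point search engines to the main page -->
<link rel="canonical" href="https://www.example.com/product/blue-widget/" />
```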
Keyword cannibalization issues on GSC
In GSC, you might find that Google is trying to rank two pages for the same keyword. Examine this and set the main page as the canonical.
Mobile friendliness of the site
Use the Google Mobile-Friendly Test to confirm there are no issues such as text too small, content wider than the screen, or clickable elements too close together.
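A common cause of "content wider than screen" on otherwise fine sites is a missing viewport meta tag; the standard declaration in the page's head is:

```html
<meta name="viewport" content="width=device-width, initial-scale=1">
```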
HTTP status codes
Do we have HTTP status issues such as 404 errors, soft 404s, 500 internal server errors, etc.? These are displayed in GSC as well as in tools such as Ahrefs Site Audit. They need to be sorted out for a good user experience. You might need to involve your tech team here.
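Once you export a list of URLs and their status codes (from GSC's pages report or a crawler), a small helper can bucket them for triage. A sketch; the crawl data below is hypothetical:

```python
def classify_status(code):
    """Bucket an HTTP status code for audit triage."""
    if 200 <= code < 300:
        return "ok"
    if 300 <= code < 400:
        return "redirect"
    if 400 <= code < 500:
        return "client error"   # e.g. 404 Not Found
    if 500 <= code < 600:
        return "server error"   # e.g. 500 Internal Server Error
    return "other"

# Hypothetical crawl export: (URL, status code) pairs.
crawl = [
    ("https://example.com/", 200),
    ("https://example.com/old-page", 404),
    ("https://example.com/api", 500),
]

for url, code in crawl:
    print(code, classify_status(code), url)
```

Client errors usually mean fixing internal links or adding redirects; server errors are the ones to escalate to the tech team.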
Title tags and meta descriptions
Though these do not directly contribute to ranking, you would want them optimized for CTR.
Picture this, for example: someone is applying for a loan and sees 'Personal Loans ABC Bank' vs 'Get Your Low Interest Personal Loan Approved in Less than 24 Hrs'.
Which would you click? Which would persuade you more? The second option, right?
The same thing happens with the meta description.
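In the HTML, both live in the page's head; using the loan example above (the wording is illustrative):

```html
<title>Get Your Low Interest Personal Loan Approved in Less than 24 Hrs | ABC Bank</title>
<meta name="description"
      content="Apply online in minutes and get your personal loan approved in under 24 hours. Low interest rates, no hidden fees.">
```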
Multiple location site
If the site offers services in different locations, each area needs its own landing page. A Google Business Profile (GBP) should then be set up for each location, with each GBP using its location landing page as its website.