This guide to Site Audit using SEMRush will not be too long. Note that some of the numbers in the images have been edited.
By the way, SEMRush has a 7-day free trial, and you can run a Site Audit during that trial.
* You can find some FAQs about general site audits at the end of this article. They apply not only to the SEMRush site audit but to other site audit tools as well.
Semrush Site Audit Steps
Step 1 – Go to Site Audit Project
Here's where you'll find projects: a project bundles 12 tools for managing SEO, PPC campaigns, social media, and content marketing activities for one specific website. As mentioned, there are other SEO tools here, but we will focus on the SEO Site Audit.
You can find guides for the other tools on our website. If you don't find one, please contact us and we will happily try to write it. It also benefits us, because it tells us what demand there is for guides to these SEO tools.
Step 2 – Click Add New Project
By adding a new project, we can start using the tools provided by SEMRush. Provide the domain and the project name; giving it a proper name will keep you organized.
Step 3 – Site Audit Overview
Here you can see an overview of the errors, warnings, and notices for the site you are auditing. The overview helps you notice what's wrong with your website, so you can prioritize which SEO actions to take.
Step 4 – Issue Tab
Scroll down to see the other issue types: warnings and notices. The issues listed in this tab are clear enough that, if you have the authority, you can fix them right away. If you don't, you can export the issues as an Excel or CSV file and send them to your client, or to the software engineer in charge of these problems if you are auditing your own company's website.
Step 5 – Crawled Pages
This tab shows you the pages crawled by the SEMRush bot. It also shows each page's depth, which is the number of navigation links the bot has to follow to reach the page from the root (the homepage). A good website has no orphan pages, meaning pages listed in the sitemap but with no navigation path from the root. Pages listed in the sitemap should also have a depth of no more than 6 (this is Alexa's recommendation).
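SEMRush computes depth and finds orphan pages during its crawl, but conceptually the check boils down to a breadth-first search over the internal-link graph. Here is a minimal sketch; the link graph, sitemap, and URLs are made-up examples, not SEMRush's actual data:

```python
from collections import deque

def crawl_depths(links, root):
    """Breadth-first search over an internal-link graph.

    links: dict mapping each page URL to the URLs it links to.
    Returns a dict of page -> depth (clicks needed from the root).
    """
    depths = {root: 0}
    queue = deque([root])
    while queue:
        page = queue.popleft()
        for target in links.get(page, []):
            if target not in depths:       # first visit = shortest path
                depths[target] = depths[page] + 1
                queue.append(target)
    return depths

# Hypothetical site: /blog is two clicks deep; /orphan is in the
# sitemap but no page links to it.
links = {
    "/": ["/about", "/posts"],
    "/posts": ["/blog"],
}
sitemap = ["/", "/about", "/posts", "/blog", "/orphan"]

depths = crawl_depths(links, "/")
orphans = [url for url in sitemap if url not in depths]
too_deep = [url for url, d in depths.items() if d > 6]
print(orphans)   # sitemap pages unreachable from the root
print(too_deep)  # pages deeper than the recommended limit of 6
```

A page's depth is simply its shortest-path distance from the homepage, which is why a breadth-first traversal (rather than depth-first) is the right fit here.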
Step 6 – Statistics
These are some important statistics about your website, and they show where your SEO can improve. As you may know, no page should link to a 404 page, so if your website has many 4xx pages, you should fix them. Moreover, a 5xx status means something is wrong with the server; fix it, as it will affect your SEO score.
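SEMRush gathers these status codes while crawling; conceptually, the 4xx/5xx statistics are just HTTP status checks bucketed into a report. A minimal sketch, with hypothetical helper names and URLs:

```python
import urllib.request
import urllib.error

def status_of(url):
    """Fetch a URL and return its HTTP status code (hypothetical helper)."""
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            return resp.status
    except urllib.error.HTTPError as err:
        return err.code  # urlopen raises on 4xx/5xx responses

def classify(statuses):
    """Bucket status codes the way an audit report does.

    statuses: dict mapping URL -> HTTP status code.
    """
    report = {"ok": [], "4xx": [], "5xx": []}
    for url, code in statuses.items():
        if 400 <= code < 500:
            report["4xx"].append(url)
        elif code >= 500:
            report["5xx"].append(url)
        else:
            report["ok"].append(url)
    return report

# Example with canned codes (no network needed):
report = classify({"/": 200, "/old-page": 404, "/api": 503})
print(report["4xx"])  # → ['/old-page']
```

The 4xx bucket is what you'd hand to whoever maintains internal links; the 5xx bucket goes to whoever runs the server.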
Step 7 – Compare Crawls
You can compare the crawled pages with a previous site audit and see how many issues you have fixed. Comparing against the previous audit tells you how your website is doing, and also lets you spot whether a newly developed feature has introduced issues.
Step 8 – Progress
This tab shows the performance of your website over the history of your SEO audits, so you can analyze whether your SEO fixes were successful.
Step 9 – Re-run Campaign
If you or the software engineers have already fixed the issues, and you are confident they are resolved, you can "Re-Run the Campaign".
So, that is how the SEMRush Site Audit helps you determine your website's health. You can also cross-check the issues against those that appear in Google Search Console.
Prioritize the issues that appear in both tools. Why? Because issues reported by Google Search Console are genuinely important; if an issue also appears in the SEMRush site audit, it is a commonly recognized issue.
By the way, if you wonder why some issues appear in SEMRush but not in Google Search Console, there are several reasons:
- The issues are not technical.
- Google does not consider them issues, but common SEO best practice does.
By the way, I also use SEMRush for competitor research. You can check out SEMRush Competitive Research.
Finally, here are some FAQs that people commonly ask me.
Site Audit FAQ
Do I need an SEO Audit for my website?
Yes. Unresolved technical issues lead to bad SEO: they mean the search engine crawler bot cannot see your website's content correctly, which may cause indexing problems.
You can find the issues yourself, but that takes a lot of effort as your website grows. Most technical issues can be caught by scanning the site programmatically, and that's what SEO audit tools do.
What are tools to do Site Audit?
If we're talking about SEO, the leading SEO tools usually include one. I have tried the SEMRush, Alexa, and Ahrefs site audits. So far, SEMRush is the most reliable, as it has not only an issues section but also statistics.
Actually, Google Search Console also audits your site, but mostly for technical issues; some SEO practices are not scanned by it. If you ask why, it is because SEO best practices are assumptions made by SEO specialists trying to predict what the Google algorithm is.
Does an SEO audit need special access to my website?
No. An SEO audit mimics what a search engine crawler does, so you don't need to grant any special access. The crawler bots access the public pages that don't require credentials.
If you find an SEO site audit tool that asks for access, ask them what it is for.
Do I need to implement all SEO audit recommendations?
The issues that come out of an SEO audit are violations of SEO best practices. But some of those best practices are just predictions of the Google algorithm.
Moreover, you may see differences between reports from different tools. It is better to fix the issues, but with proper justification you can skip some of them.
The best example is the page title. Google doesn't state how long your title should be, but the Alexa site audit flags titles longer than 60 characters as an issue. If you cannot describe the page in only 60 characters, it is better to use a longer title, as it will be more descriptive for the search engine crawler.
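To make the heuristic concrete, this is roughly all a tool's title-length check does; the 60-character limit comes from Alexa's rule mentioned above, and the URLs and titles are invented examples:

```python
TITLE_LIMIT = 60  # Alexa's heuristic; Google publishes no hard limit

def flag_long_titles(titles):
    """Return the titles exceeding the heuristic limit.

    titles: dict mapping URL -> <title> text (hypothetical data).
    """
    return {url: title for url, title in titles.items()
            if len(title) > TITLE_LIMIT}

titles = {
    "/": "Acme Widgets",
    "/guide": "The Complete, Exhaustive, and Frequently Updated Guide "
              "to Choosing Industrial Widgets",
}
print(flag_long_titles(titles))  # only /guide trips the 60-character rule
```

Since it is a heuristic and not a Google rule, a flagged title is a prompt to review, not an order to shorten.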
You may also weigh the effort needed to fix the issues: for example, do the software engineers need to change many things? A well-architected website should not run into this problem, but who knows… (this is based on my own experience as a software engineer).