SEO Audit: The Ultimate Guide 
An SEO audit is a crucial step in optimizing your website.
It helps you understand why your pages are not ranking well in the search engines.
A complete SEO audit reveals every point to improve so that your site's foundations come as close as possible to perfection.
Without solid foundations, your site has only a very small chance of reaching the top of Google and the other search engines.
In this article, I will show you exactly how to conduct a complete SEO audit in 2018.
You will know all the SEO steps and tools to use to carry out your audit effectively.
Tools Needed for a Complete SEO Audit
Before diving into the SEO audit, you will need to prepare the following tools:
For keyword audit:
For competition audit:
For technical audit:
For on-page audit:
For content audit:
For User Experience (UX) audit:
For backlink audit:
There are of course other SEO tools, but those listed are the ones I use personally.
What is an SEO Audit?
A well-done SEO audit lets you identify every axis of improvement, whether purely technical or strategic.
The purpose of an SEO audit?
To improve the ranking of every page of your site on a variety of keywords, with the aim of generating more qualified traffic.
More Qualified Traffic = more Leads.
But above all:
More Leads = more customers.
Why do an SEO Audit?
A house needs solid foundations to last over time, do you agree?
Otherwise, it may crumble over the years, perhaps like a house of cards (I exaggerate a little).
And I think we all agree that when you notice a house has bad foundations, you immediately question the quality of the mason's work.
Where am I going with this?
For a website, it's the same.
If a site's foundations are sloppy, can we call it a poor site… no?
To us, at first sight, not really.
On the other hand:
For search engines, a site with many technical, structural, or semantic weaknesses that fail to meet the criteria imposed by their algorithms will be considered a site of poor quality.
Not correcting all of your site's weaknesses is tantamount to making a false start in your SEO campaign.
Why not show Google and others that your site is already well optimized and has a solid foundation?
Too many people focus solely on:
- writing blog articles galore
- creating backlinks
Because it's more fun than doing an SEO audit.
Their site does not even meet the basic requirements of search engines because they have never performed an SEO audit.
Do not make the same mistake …
If you want to get off to a good start, do a full SEO audit of your site.
When Should You Do an SEO Audit?
Everything moves very fast in the world of organic search.
What works well today may no longer be effective at all in 6 months.
Google updates its algorithms around a hundred times a year.
To stay abreast of Google’s different algorithms, take a look at this article.
It is updated regularly.
You will understand that it is necessary to perform SEO audits regularly.
This will ensure that your site stays on top (or as close as possible).
I recommend conducting an audit at least twice a year, and quarterly if possible:
To verify that no problem has been overlooked.
To identify any new weaknesses.
Now that we have laid the groundwork, let’s get right to the point.
SEO Audit In 7 Steps
A successful SEO campaign is the product of a hundred positive ranking factors.
For you, the purpose of an SEO audit can vary according to your needs of the moment.
Maybe you want to spy on your competitors to replicate some of their strategies?
Or maybe you need to analyze your own keywords to find new opportunities?
Some will especially want to correct all the technical weaknesses of their website.
There are many things you can do better with an SEO audit …
Follow these 7 steps and you and your website will be off to a great start.
Let’s move on to the first step of the SEO Audit.
Step 1: Keyword Audit
The first thing to do during an SEO audit is a keyword audit.
That is to say:
Check whether the keywords you selected during your initial keyword research are still ideal, or whether strategic changes would be preferable.
Find new opportunities for winning keywords.
How do you do it?
For this, you will need to re-analyze your current list of keywords.
The first thing you need to ask yourself is:
"Am I targeting the right keywords?"
It’s time to know if these keywords are:
- Accessible: not too competitive
- Relevant: closely related to what you sell
- Profitable: with the potential to quickly trigger sales
A good SEO audit will help you determine the accessibility, relevance, and conversion potential of each of your current keywords.
When a keyword is accessible, it means that your site is likely to be able to position itself on it.
What good is it to try to rank your pages on overly ambitious keywords?
The hard truth:
New websites cannot rank for overly competitive keywords.
- 1. Sites that rank on competitive keywords are older and have already earned the trust of Google.
- 2. These same sites have a domain authority (DA) much higher than that of your site because it’s been years that they acquire backlinks.
- 3. Because they rank on competitive keywords, it also means they have a much bigger budget than you. This allows them to purchase high-authority link placements to maintain their position.
- 4. Finally, sites that have existed for a long time are more likely to have a large subscriber list, which lets them collect a significant number of backlinks every time they publish a blog article…
Be realistic and rely on real data.
During an SEO audit, it is very common to redefine a much less competitive list of keywords.
Often these are long-tail keywords.
- 1. It is easier to rank on this type of keyword
- 2. Their purchase intent is often much higher, which is all to your benefit.
Do you know Hubspot?
It’s an inbound platform that helps companies increase their growth through marketing, sales, and CRM tools.
Their site generates tons of traffic.
Look at this :
Almost 5 million hits a month from search engines!
It makes you dizzy.
Ok, now let’s take a look at which keywords are driving the most traffic.
Their top keywords (translated):
- Download via Youtube (an article explaining how to download and save Youtube videos).
- Make Gifs (an article explaining how to create animated Gifs).
- CV templates (11 Word CV templates).
Oh boy … seriously?
In your opinion, what percentage of people who have typed these 3 keywords are interested in Hubspot Marketing tools?
Not many people.
However, let’s take the example this time of the Ahrefs site.
Almost 20 times less traffic than Hubspot …
But look at the 4 keywords that drive the most traffic to their site:
Their top keywords (translated):
- 1. Ahrefs
- 2. Website Traffic
- 3. Keyword Research
- 4. Submit sites to search engines
In your opinion, what percentage of people who typed these 4 keywords are interested in their SEO tool “Ahrefs”?
Their traffic is super qualified.
You understand the value of creating highly relevant content, which may drive less traffic but will undoubtedly please your banker much more…
Commercial intent is the ability of a keyword to convert visitors into customers.
The higher the commercial intent, the more potential the keyword has to generate cash.
There are several types of keywords, but here are the 3 we are interested in:
Transactional Keywords
These are keywords like:
- Cheap Adidas sneakers
- Buy shirts online
- Book plane tickets …
No need to explain that these keywords are typed by people who want to make a purchase very quickly.
They all have a high commercial intent.
Do you have to include it in your content?
Product and Service Keywords
As the name implies, this type of keyword describes a product or service sought by the user. It can be a product's brand name, the name of a service, a product, or a product category.
- camera comparison
- smartphone reviews
- best micro pc
- NIKE shoes
- iPhone 8
- site creation
- cheap computer
These keywords convert less well than transactional keywords but still carry high commercial intent.
Someone looking for smartphone reviews or comparing cameras usually intends to make a purchase shortly.
Do you have to include it in your content?
From time to time, when you have the opportunity.
Informational Keywords
Unlike the first two categories seen above, these make up the vast majority of search terms typed by Internet users.
These are all keywords or phrases used to search for information.
Some concrete examples:
- How to lose weight quickly
- Tips for generating traffic
- Best ways to learn English
- vegan cooking recipes
Tracking the performance of your keywords
Did you know that the positioning of a keyword varies according to:
- 1. the type of device you are using (computer or mobile)
- 2. where you are when you search.
That’s why it’s important to keep track of your keywords by device type and location.
It is now impossible to know the exact traffic generated by a given keyword.
On the other hand:
You can know how much traffic a page is able to generate and what keywords it is positioning on.
It is thus possible to know the “value” of a page, rather than that of a keyword.
To track the placement of your keywords, it’s best to focus on a handful of keywords.
Be sure to choose a few keywords by category of keywords:
- 5-10 keywords with high commercial intent
- 5-10 long-tail keywords
- 2-5 brand keywords (your company or brand)
To measure the performance of keywords you must first identify goals:
1 / Being able to attribute gains or losses of traffic:
- Global tracking,
- Tracking by geographic location,
- Tracking by device type,
- Tracking by page.
2 / Identify pages that need improvement and offer high traffic potential for little effort:
- Keywords for which pages of your site already rank on page 2,
- Keywords for which pages of your site already rank between positions 4 and 7,
- Untapped keywords,
- Long-tail keywords.
3 / Identify the keywords of current and future competitors
- Track the domain names that come up most often on pages 1 and 2 for all your keywords,
- Track each competitor for new opportunities for keywords and content.
To track the placement of keywords, I use SEMRush.
Here is a video made by the SEMRush USA team about positioning tracking in the tool:
Find New Keyword Opportunities Through The Google Search Console
Now, let me show you how you can find opportunities really easily.
For this, I will use the Google Keyword Planner and the Google Search Console.
1. Go to the Google Search Console and click on Status → Performance and select in the Date field the last 3 months.
2. Then click on the colored rectangles Total Number of Impressions and Average Position.
Total impressions: The number of times one of your pages appeared in Google results.
Average position: Average ranking on the Google results page.
3. Finally, sort the results by Position, from highest to lowest.
Take a look at the keywords that rank your pages at the bottom of the first page or on the second page of Google results.
In other words, positions from 6 to 20.
And see if some keywords get a lot of impressions.
In my case, I spotted the keyword “Google News”.
My SEO tutorial on Google News ranks only 14th, yet it gets more than 12,000 impressions.
My article appeared 12,642 times in Google's SERPs over the last 3 months.
On this keyword, I generate a lot of impressions even though I rank on the second page…
So imagine if I spend some time optimizing this blog article for this keyword …
Spend some time in your Google Search console, you will undoubtedly find nuggets.
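If you export the Performance report, the same "quick win" filter can be scripted. Here is a minimal sketch, assuming rows of (query, impressions, average position) tuples; the helper name and thresholds are my own, not part of any Search Console API:

```python
def quick_win_keywords(rows, min_impressions=1000):
    """Filter Search Console rows for "quick win" queries: those ranked
    between positions 6 and 20 that already earn plenty of impressions.

    rows: iterable of (query, impressions, avg_position) tuples,
    e.g. parsed from a Performance report export."""
    return [
        (query, imp, pos)
        for query, imp, pos in rows
        if 6 <= pos <= 20 and imp >= min_impressions
    ]

rows = [
    ("google news", 12642, 14.0),  # page 2, lots of impressions: a nugget
    ("seo audit", 300, 8.5),       # too few impressions to bother
    ("my brand", 5000, 1.2),       # already ranking well
]
print(quick_win_keywords(rows))   # [('google news', 12642, 14.0)]
```

Anything this filter returns is a page worth re-optimizing before chasing brand-new keywords.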
In addition to the initial keywords on which you want to position yourself, add the new keyword variations you’ve found during the analysis of your Google search Console account.
Step 2: Competition Audit
Analyzing the competition is essential at the beginning of an SEO campaign (and very useful on a regular basis).
Too many people skip this step.
They prefer to search for keywords directly, create and optimize their content and create backlinks.
But this step allows you to understand exactly:
- 1. Who your competitors are,
- 2. What their keyword strategy is,
- 3. What their netlinking strategy is,
- 4. Which content allows them to generate qualified traffic.
A real gold mine.
This allows you to identify new opportunities that you would not have thought of without this analysis.
Thanks to the competition audit you will know exactly which tactics are working in your sector or niche and what you will have to improve in your strategy to see your site climb up the rankings.
By seeing your competitors' weaknesses and strengths, you can gauge the difficulty and the effort your SEO campaign will require to outrank them.
To summarize, a good competition audit will allow you to:
- 1. See if a keyword is too competitive.
- 2. Understand what types of content work best for your competitors.
- 3. Analyze the link profile of your competitors, in order to discover new link opportunities.
Analyze the level of competition on your current keywords
This means analyzing the Page Authority (PA) and Domain Authority (DA) of the 10 sites on the first page of Google's results.
For this, you will need to install the Moz Toolbar.
Let’s say you want to position yourself on the keyword “content marketing”.
Then enter “content marketing” in Google and wait for the results:
We are looking here for sites with a DA less than 40.
We find that the second and third sites meet this criterion.
The DA (Domain Authority) is an excellent gauge for deciding whether or not a keyword deserves to be considered.
This step eliminates very quickly the first set of keywords from your list.
Keep in mind:
The measure of the level of competition is relative.
For example, it would be unthinkable for a newly created site to target the keyword “content marketing”.
A newly created site (with a new domain name) will have no authority in Google’s eyes and will be completely ignored at the beginning.
The DA of a new domain name is equal to 1/100.
On the other hand:
If your site was created several years ago and has a decent domain authority (ex: 38/100), you can try to position yourself without problems.
To go further, read this guide on Competitor Analysis of a Keyword.
Identify Your Competitors
The goal here is to list all the competitors who are able to position themselves on the first page of Google results on each of your keywords.
Do not hesitate to add any other direct competitors you can think of to the list.
To find some of these competitors, you can start with the SEMRush tool.
Once in the tool, enter the domain name of your site (without https://www).
Once the analysis has been launched, go down to the Main Organic Competitors section → Read the detailed report.
Then sort by common keywords by clicking on the small down arrow.
Browse the list and select only those that you think are relevant and have decent traffic.
I advise you to spend a few seconds or minutes on each site to validate the relevance.
By default, SEMRush shows you the organic competitors on the computer.
To see the competitors on mobile, click the Mobile button.
You will see that the results are quite different.
Gap Analysis on Keywords
Keyword gap analysis means spying on competitors to determine which keywords their sites rank for and which ones yours does not.
From there, you simply reverse-engineer why these competitors rank better than you and deduce solutions so that you can also rank on these keywords.
Here are some effective solutions:
- Rework your Meta tags,
- Improve the architecture of the site,
- Improve and optimize existing content,
- Create a new blog article specific to a theme,
- Build links pointing to the content that must be positioned on the desired keywords.
To perform this Gap Analysis in SEMRush, simply go to Opportunity Analysis → Keyword Opportunities.
Enter your domain name and that of your competitors.
You can compare up to 5 sites.
Then click on Validate.
You can then export the results to Excel or CSV formats and find new keyword ideas.
Gap Analysis on Backlinks
Would you like to have in front of you the list of all the backlinks that your site does not have but that your competitors managed to obtain?
For that, I will still use SEMRush.
This time click on Backlinks Options.
Then indicate the different domain names of your competitors.
You will have before your eyes all the referring domains pointing to your competitors' sites.
To see in detail all the pages and backlinks coming from a referring domain, click the blue link (as in the example below).
By examining each competitor’s backlinks, you’ll have a list of backlink opportunities at your fingertips …
Spy on Your Competitors to Steal Content Ideas
One of SEMRush's great strengths is that it instantly reveals all the keywords your competitors rank for and the content that earns them such good results.
Why spend hours hunting for keyword and content ideas that might bring you traffic, when you can simply lift the ideas from your competitors and produce better content than they do?
You are just a click away from knowing all the keywords and content that actually generate your competitors' organic traffic.
You just have to copy and improve.
Imagine the number of hours, days and even weeks that you will save …
Open SEMRush and type the domain name of the first competitor.
Then let the tool analyze the site.
Scroll down to the Best Organic Keywords section and click the Read Detailed Report button.
Then go to the Top pages section to see the pages that generate the most traffic from your competitor.
You can then take a look at each page's content for inspiration…
Do this with all your competitors and list all the keyword ideas and content that you can use for your own site or those of your customers.
Now let me show you how to do a technical analysis.
Step 3: Technical SEO Audit
Contrary to popular belief, a technical audit is not that complicated to carry out if you have the right methodology and use the right tools.
Keep in mind that your site probably has tens or even hundreds of technical problems.
It is always possible to improve your site from a technical point of view.
This SEMRush study conducted in 2017 shows that:
- More than 80% of the sites analyzed have broken links (404 errors)
- More than 65% of sites have duplicate content.
The technical audit allows you to discover many errors that can be corrected quickly and that will have an almost immediate beneficial effect for your site.
We all want to have a site that is better positioned, generates more traffic and brings more customers.
One of the key steps:
Correct all technical errors on your website.
Technical issues can really ruin the SEO performance of your site.
Let’s see how to perform a technical SEO audit.
1 / Crawl Your Entire Site
If you want to generate a visual technical audit report showing you very quickly the technical errors of your site or that of your client then there are several tools available on the market.
For my part, when I want to have a quick overview I use SEMRush.
Just use the technical audit tool.
For this, select SEO Toolkit.
Then click on Site Audit.
Then you will need to enter the URL of the site to analyze and start the analysis.
Once the scan is complete, you will be able to access the following dashboard:
To access the problem details, click the Problems tab.
For a more graphical view of the technical audit, you can click Statistics.
Remember one thing:
To get a global view of all your site's technical problems, this kind of SEO audit report generator is very useful…
For an in-depth technical audit, I recommend, without any hesitation, to crawl your site with the Screaming Frog SEO Spider tool.
This tool allows you to detect a lot of technical problems …
A free version exists, limited to crawling a maximum of 500 URLs.
If your site is bigger I advise you to invest in this tool.
If you have the paid version, you can configure the crawling by clicking Configuration → Spider.
Click OK and type the URL of your site.
Click on the Start button, and the crawl will launch.
You must now be asking yourself:
"Ok, but how do I use this data?"
Do not panic:
That's exactly what I'll explain a little further on…
But first check if the site is well indexed.
2 / Is Your Site Correctly Indexed?
Your website can only get organic traffic if your pages are indexed in Google.
That’s why it’s always a good idea to check if your entire site is properly indexed.
A good place to start is your robots.txt file.
Sometimes, by accident, the owners or developers of sites block the indexing of their site by the search engines.
That’s why you must audit your robots.txt file to make sure your site is indexed.
The directive you need to check in your robots.txt file is "Disallow".
If you configure your robots.txt file incorrectly, you are likely to prevent the search engines from indexing your site.
The specific line to look for is "Disallow: /": it tells robots not to index your site, or part of it.
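You can check the effect of your rules offline with Python's standard library. A minimal sketch, using a hypothetical robots.txt and domain to show how "Disallow: /" blocks the whole site:

```python
from urllib import robotparser

# A hypothetical robots.txt containing the dangerous directive:
rules = """User-agent: *
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# With "Disallow: /", Googlebot may not fetch any page of the site:
print(rp.can_fetch("Googlebot", "https://www.yoursite.com/any-page"))  # False
```

Paste your own site's robots.txt into `rules` and test the URLs you care about before assuming they are indexable.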
Your site must have a sitemap because it helps with indexing.
If your site is on WordPress, Yoast will automatically create one for you.
If you do not use Yoast, you can install an XML sitemap plugin.
For those with a non-WordPress site, you will have to check the old-fashioned way.
Go to Google and type "site:yoursite.com".
This will tell you whether your site is reasonably well indexed or not.
If your site does not show up first, it may mean that you have been penalized by Google.
Or maybe you are preventing search engines from indexing your website.
Let’s move on to further analysis with Screaming Frog.
3 / Is There a Cannibalization Of Keywords?
One of the most annoying problems to check during a technical SEO audit is keyword cannibalization.
"Keyword cannibalization" means that two pages are fighting to rank on the same keyword, and are therefore in direct competition.
This phenomenon can confuse Google and force it to make a decision that is not always the right one…
In this case, Google wonders:
"Which page is the 'best' for this query?"
So it's always better to guide Google instead of letting it make that decision.
You must get rid of any cannibalization of keywords.
There is a form of keyword cannibalization that is very common:
When you optimize the homepage AND an internal page for the same keyword.
This is more common during optimization for local SEO.
Take the example of an SEO consultant located in Paris.
The homepage would have in the title tag:
"SEO Consultant in Paris | SEO Agency"
And another internal page would be optimized this way:
"Best SEO Consultant in Paris | SEO Agency"
This must be avoided.
Choose a single page to optimize for the term "SEO Consultant in Paris" and de-optimize the competing page.
There is another kind of cannibalization of keywords that you need to check and this has something to do with your blog.
There is no problem if you write content about a topic several times.
But if it becomes excessive, it can cause confusion.
Google will struggle to understand which page is the most relevant for this same keyword.
More importantly:
Google wants you to write content that is:
- Well thought out,
It does not want short, thin articles that fail to explain a subject in depth.
There are of course exceptions to the rule. But in general, short content should be avoided for most businesses.
Keep in mind that quality, well-written content will perform much better in search engines and lead to better engagement with your audience.
On the other hand :
Short and poor quality content will probably lead to a cannibalization of keywords.
Google could interpret your actions as a simple manipulation of long-tail keywords.
If this happens, the Panda algorithm could hit your site with full force.
That being said, let me now show you how to quickly spot keyword cannibalizations:
Open Screaming Frog SEO Spider.
Enter the URL of your site and press Start :
Then go to Page Titles.
Enter one of your keywords in the search bar. (This will show you all the pages competing for this keyword).
Then look at all the pages containing this keyword in their title tag to identify the pages in direct competition.
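The same check can be scripted against a title export. A minimal sketch, with a hypothetical helper and made-up URLs (this is not a Screaming Frog API, just the logic):

```python
def cannibalizing_pages(titles, keyword):
    """Return the pages whose title tag contains the keyword.

    titles: {url: title_tag}. Two or more hits mean those pages are
    competing for the same query and should be consolidated, or one
    of them de-optimized."""
    kw = keyword.lower()
    return [url for url, title in titles.items() if kw in title.lower()]

titles = {
    "/":           "SEO Consultant in Paris | SEO Agency",
    "/services":   "Best SEO Consultant in Paris | SEO Agency",
    "/blog/audit": "How to Run an SEO Audit",
}
hits = cannibalizing_pages(titles, "SEO Consultant in Paris")
print(hits)  # ['/', '/services']
```

Here the homepage and the services page are fighting for the same term, exactly the local-SEO situation described above.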
4 / Are there any redirection problems?
There are four kinds of redirects that can penalize the SEO performance of your site:
- 1. 302 redirects
- 2. Chained redirects
- 3. The "non-www" version without a 301 redirect to the "www" version
- 4. The unsecured (HTTP) version of a domain without a 301 redirect to the secure (HTTPS) version.
Let’s start with 302 redirects.
302 redirects are "temporary" redirects and do not pass any authority.
They need to be changed to 301 redirects so that authority is passed through the links.
To check if you have 302 redirects, open Screaming Frog SEO Spider.
1. Enter the URL of the site to scan and click Start
2. Go to the "Response Codes" tab
3. Click the "Filter" drop-down menu and select "3xx Redirection"
4. Click "Export" to export all 302 redirects.
Chained redirects are a series of interconnected 301 redirects.
Breaking the chain transmits all the authority directly to the landing page (instead of partial authority).
Here's how to check whether you have chained redirects with Screaming Frog SEO Spider:
1. Go to Settings → Spider
2. Click Advanced, select Always Follow Redirect, and click OK
3. Enter the URL to scan and click Start
4. When the scan is complete, go to Reports → Redirect Chains
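If you have the redirects as a simple source-to-target mapping (for example from an export), spotting the chains is a few lines of code. A minimal sketch with a hypothetical helper and example paths:

```python
def redirect_chains(redirects):
    """Given {source_url: target_url} 301 mappings, return every chain
    longer than one hop, so each source can be re-pointed directly at
    its final destination."""
    chains = []
    for start in redirects:
        path = [start]
        seen = {start}
        url = start
        while url in redirects:
            url = redirects[url]
            if url in seen:      # guard against redirect loops
                break
            seen.add(url)
            path.append(url)
        if len(path) > 2:        # more than one hop = a chain to break
            chains.append(path)
    return chains

# /old reaches /new only via /mid: that chain should be collapsed.
hops = {"/old": "/mid", "/mid": "/new"}
print(redirect_chains(hops))   # [['/old', '/mid', '/new']]
```

Each reported chain tells you to redirect the first URL straight to the last one.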
Does the non-preferred version (without www) of the domain have a 301 redirection to the preferred version (www)?
Whichever version you choose to display, it will have no impact on SEO performance.
Google treats them identically, so it's a matter of taste.
Problems occur when you do not redirect the non-preferred version to the preferred version.
Suppose you want to use www.supersite.com.
In this case, the version with "www" becomes your preferred domain.
The non-www version becomes your non-preferred domain, and vice versa.
You must do a 301 redirect from your non-preferred domain to the preferred one.
Otherwise, you will end up with two duplicate sites AND you will let authority drain away… like a water leak.
I have noticed that many fully custom-coded websites suffer from this problem.
Developers generally underestimate the repercussions.
Very often they do not set up any 301 redirect from the non-preferred version to the preferred one.
This often results in two duplicate websites …
I use this tool to see if everything is clean in terms of redirection.
Does the unsecured version of the site have a 301 redirect to the secure version?
Let's say the transition to SSL did not happen very cleanly.
Many webmasters or site owners have made the decision to secure their site with a certificate.
But, many of them face difficulties in implementing the certificate.
They forget to perform a 301 redirection of the unsecured version (HTTP) of the site to the secure version (https).
This has an effect similar to not redirecting a domain without www to a domain with a www (vice versa).
Identifying the problem is simple:
1. Go to your website: https://www.yoursite.com/
2. In the address bar, delete the "s" from "https" and hit the Enter key.
This should automatically redirect to https://www.yoursite.com
Otherwise, we will have to fix it.
You can also use the tool mentioned above to check this point.
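Under the hood, a correct setup answers the plain-HTTP request with a 301 whose Location header points at the HTTPS version. A minimal sketch of that check on a recorded response (hypothetical helper and domain, not a real library API):

```python
def https_redirect_ok(status, location, host):
    """Check a recorded response for http://<host>/ : a clean setup
    answers with a 301 whose Location header points at the HTTPS
    version of the same host."""
    return status == 301 and bool(location) and \
        location.startswith("https://" + host)

# A correct setup: permanent redirect straight to the secure version.
print(https_redirect_ok(301, "https://www.yoursite.com/", "www.yoursite.com"))  # True
# A 302 is only "temporary" and should be changed to a 301:
print(https_redirect_ok(302, "https://www.yoursite.com/", "www.yoursite.com"))  # False
```

The same logic applies to the www / non-www check from the previous section: only the expected host changes.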
5 / Are there Duplicate Meta Tags?
Duplicated content can really hurt your website and possibly cause a Panda penalty.
E-commerce sites are more subject to duplicate content because they generally copy the product descriptions provided by the manufacturers.
To top it all:
They also use standardized Metadata for these pages.
This creates a real “Tsunami” of duplicate content.
Let me show you the problems encountered with the duplication of metadata, at first.
I will discuss the duplicated content aspect later in the “Content Audit” section.
Duplication of META Descriptions
Duplicated metadata is most common on e-commerce sites.
This is because many e-commerce sites contain lots of pages displaying similar products.
So site owners or webmasters simply copy the same META descriptions onto all the pages, out of sheer laziness.
This is not a good practice at all.
If your site has a lot of similar pages, you have to consolidate all that.
There is no reason to have multiple pages for each color or size variation for the same product.
Once you have fixed this problem, you will be able to write unique descriptions for each page.
Yes, I said each page.
You will have to do your best to have metadata and unique content on each page of your site.
It will take a lot of time and effort, but it’s well worth it.
You do not have to do it all in one day.
To find duplicate metadata, you can use Screaming Frog SEO Spider and the Google Search Console.
Let’s start with Screaming Frog:
Enter the URL of your site and click START
Go to Meta Description, in the Filter drop-down menu select Duplicate, then Export.
Another way is to use Google Search Console.
Go to Google Search Console, then Search Appearance, and finally HTML Improvements:
In this section you will find all the duplicate META descriptions and title tags.
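Whichever tool you export from, a few lines of Python can group the duplicates for you. A minimal sketch, assuming a list of (url, description) pairs; the helper name and example pages are hypothetical:

```python
from collections import defaultdict

def duplicate_metas(pages):
    """Group URLs that share the same META description.

    pages: iterable of (url, meta_description) pairs, e.g. parsed
    from a crawler export. Returns {description: [urls]} for every
    description used on more than one page."""
    groups = defaultdict(list)
    for url, meta in pages:
        groups[meta.strip().lower()].append(url)
    return {meta: urls for meta, urls in groups.items() if len(urls) > 1}

pages = [
    ("/blue-shirt", "Cheap shirts online"),
    ("/red-shirt",  "Cheap shirts online"),
    ("/about",      "Who we are"),
]
print(duplicate_metas(pages))  # {'cheap shirts online': ['/blue-shirt', '/red-shirt']}
```

Every group in the output is a set of pages that needs unique descriptions (or consolidation into one page).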
6 / Are There 404 Errors (With Link Juice)?
404 errors are not all equal.
First, let me clear up the myth that “all 404 errors are bad for SEO”.
This is wrong.
404 errors are an effective way to tell search engines that pages no longer exist.
When a search engine like Google detects a 404 error, it removes the page from its index.
So you can do that intentionally.
Think for a moment:
Would you like someone to find one of your pages with 404 error via Google?
Of course, no.
Google removes them because they are useless for users.
That being said:
There are pages with 404 errors that can negatively impact the performance of your site: those with backlinks.
This type of 404 page leaks authority away from your site.
What has to be done:
Redirect these 404 pages to pages on the same theme on your site.
That way you recover the authority flowing to those pages.
If you cannot find a page on the same theme, simply redirect them to your homepage.
To find 404 errors, I recommend using the Google Search Console again.
In the left-side menu, click Coverage.
You will see, on the dashboard, all the errors, including "Submitted URL not found (404)".
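Once you have cross-referenced the 404 list with backlink counts (from your backlink tool), you can prioritize which pages to redirect first. A minimal sketch with a hypothetical helper and made-up URLs:

```python
def redirect_candidates(errors_404):
    """From {url: backlink_count} for pages returning 404, keep only
    those with backlinks (they are the ones leaking authority) and
    sort them so the most-linked pages get redirected first."""
    return sorted(
        ((url, n) for url, n in errors_404.items() if n > 0),
        key=lambda pair: pair[1],
        reverse=True,
    )

errors = {"/old-guide": 12, "/typo-page": 0, "/dead-product": 3}
print(redirect_candidates(errors))  # [('/old-guide', 12), ('/dead-product', 3)]
```

Note that "/typo-page" drops out: a 404 with no backlinks can simply stay a 404, as explained above.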
7 / Is the architecture of your site optimized for SEO?
Many SEO audits do not include an analysis of the site’s architecture.
And you know what?
It’s a huge mistake.
The majority of websites are not designed with SEO in mind.
Which, in fact, is not a bad thing.
Indeed, many build their site thinking of what their future customers would like to see.
You should always focus on what users/visitors want from your site when you optimize your site.
You will still need to guide and satisfy the search engines.
Aim for an optimized site architecture that satisfies both users and search engines.
When analyzing the architecture of your site, you need to ask yourself the following questions:
- Is the navigation clean or scrambled?
- Do internal links use optimized anchor texts?
- Can I improve navigation to make it easier for users and search engines?
8 / Is the structure of URLs optimized for SEO?
I always analyze the URL structure during SEO audits to make sure the URLs are SEO-friendly.
But I am also very careful here.
You should not change the structure of a URL when the site is already performing well.
Because it requires a 301 redirection of the old URL to the new URL.
301 redirects do not always give the same results.
Some will not pass on the authority and trust of the old URL.
This can result in a loss of rankings for a period of time.
Changing a URL to a cleaner, more optimized version will probably help your site in the long run, but you have to accept losing some organic traffic in the short term.
If in doubt, do not change your URLs.
Now, if the site is not positioned for any keyword, I always advise you to improve the URL structure (if necessary).
In an attempt to fool Google, some people stuff keywords into their URLs.
Obviously, this is a very bad practice.
In fact, it may even penalize the performance of your site in terms of SEO.
Here is an example of keyword stuffing in a URL that we often encounter:
As you can see, ” men’s shirts ” appears three times in the URL.
Whether intentional or not, this will penalize the performance of the page.
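As a rough illustration of this check, here is a small sketch that counts repeated words in a URL path. The threshold of three repetitions and the example URL are my own assumptions, not a published rule:

```python
# Minimal sketch: flag keyword stuffing in a URL by counting how many
# times the same word repeats across its path segments.
import re
from urllib.parse import urlparse

def keyword_counts(url):
    path = urlparse(url).path.lower()
    words = re.findall(r"[a-z0-9]+", path)   # split path into words
    counts = {}
    for w in words:
        counts[w] = counts.get(w, 0) + 1
    return counts

def is_stuffed(url, threshold=3):
    """True if any word appears `threshold` times or more in the URL path."""
    return any(n >= threshold for n in keyword_counts(url).values())

# "shirts" appears three times in this hypothetical URL -> flagged
print(is_stuffed("https://example.com/shirts/mens-shirts/blue-mens-shirts"))  # True
```

Run this over your crawl export to shortlist URLs worth a manual look.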
9 / Are Internal Links Correctly Constructed?
An inefficient, non-strategic internal linking structure can confuse search engines.
Internal links should be as clear as possible and use exact keywords in their anchor text.
If you have a page about "jeans shirts", you should create links to it with the anchor text "jeans shirts".
This concept seems simple, doesn't it?
Unfortunately, I see this problem all the time when I audit sites.
Finding inefficient internal links is not an easy task …
You have to go page by page to identify and optimize them.
This is, in my opinion, one of the most time-consuming "on-page" improvements.
To prevent this from happening, always follow good internal linking practices.
The majority of anchor texts for your internal links should include exact or partial keywords.
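Rather than going page by page by hand, a crawler can extract every link and flag generic anchors. Below is a stdlib-only sketch; the list of "generic" anchor texts is my own assumption, and the sample HTML is invented:

```python
# Sketch of an internal-link anchor audit: extract <a href> anchors and
# flag generic anchor texts that carry no keyword signal.
from html.parser import HTMLParser

GENERIC_ANCHORS = {"click here", "read more", "here", "more"}  # assumption

class AnchorAuditor(HTMLParser):
    def __init__(self):
        super().__init__()
        self._href = None
        self._text = []
        self.links = []  # (href, anchor_text) pairs

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href", "")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:   # only collect text inside an <a>
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            self.links.append((self._href, "".join(self._text).strip()))
            self._href = None

def weak_anchors(html):
    auditor = AnchorAuditor()
    auditor.feed(html)
    return [(h, t) for h, t in auditor.links if t.lower() in GENERIC_ANCHORS]

html = '<a href="/jeans-shirts">jeans shirts</a> <a href="/promo">click here</a>'
print(weak_anchors(html))  # only the generic "click here" link is flagged
```

Feed it each page's HTML from a crawl and you get a worklist of anchors to rewrite.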
10 / Does Your Site Load Quickly?
The loading speed of your site directly impacts the user experience in a positive or negative way.
That’s why this criterion is very important.
Use Pingdom and Google PageSpeed Insights to perform this analysis.
If your site takes more than 3 seconds to load, you have significant room for improvement.
The ideal is a load time under 1 second, even though that is far from easy.
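The thresholds above can be wrapped in a tiny helper. The `measure_load_seconds` function below is illustrative only: it times the raw HTML response, not the full page render that Pingdom or PageSpeed measures, and the bands are the article's, not an official standard.

```python
# Rough sketch: classify a measured load time against the article's
# thresholds (< 1 s ideal, > 3 s needs work).
import time
from urllib.request import urlopen

def measure_load_seconds(url):
    start = time.perf_counter()
    with urlopen(url) as resp:   # downloads the HTML only, not sub-resources
        resp.read()
    return time.perf_counter() - start

def classify(seconds):
    if seconds < 1:
        return "ideal"
    if seconds <= 3:
        return "acceptable"
    return "needs work"

print(classify(0.8))  # "ideal"
print(classify(4.2))  # "needs work"
```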
11 / Is your site suitable for mobiles?
Today, Google uses a “mobile-first” index to determine the ranking of web pages.
Your site absolutely MUST be optimized for mobile.
Use Google’s mobile-friendly check to verify this point.
The solution to this problem is very simple:
If your site is not suitable for mobile devices, ask a web designer to make it mobile friendly.
12 / SEO Log File Analysis
To go further in the technical audit you can analyze the log files.
These files allow you to have 100% reliable data on how robots crawl your site.
This analysis is an integral part of the technical SEO audit and correcting the problems found in your log files will help you get better rankings, more traffic, and better conversions.
- When your site generates too many error codes (404 errors for example) this can force Google to reduce its crawling frequency of your pages
- It is necessary for you to know if the search engines crawl all your content, new and old, that you want to position in the SERPs.
- It is important to make sure that all your redirects pass SEO juice.
The tools that analyze your log files and present all the data in digestible dashboards are relatively expensive…
That may be why many SEOs skip this (nevertheless important) step.
To better understand what lies behind SEO log file analysis, I invite you to read this OnCrawl article on the subject.
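As a taste of what log analysis looks like without a paid tool, this sketch counts Googlebot hits per status code in an Apache combined-format access log. The sample lines are fabricated; a high share of 404s for Googlebot signals crawl budget being wasted.

```python
# Hedged sketch: count Googlebot hits per HTTP status code from an
# Apache combined-format access log.
import re
from collections import Counter

LOG_RE = re.compile(r'"(?:GET|POST|HEAD) (?P<path>\S+) [^"]*" (?P<status>\d{3})')

def googlebot_status_counts(lines):
    counts = Counter()
    for line in lines:
        if "Googlebot" not in line:   # keep only search-engine hits
            continue
        m = LOG_RE.search(line)
        if m:
            counts[m.group("status")] += 1
    return counts

sample = [
    '66.249.66.1 - - [10/Jan/2018:10:00:00 +0000] "GET /guide HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '66.249.66.1 - - [10/Jan/2018:10:00:01 +0000] "GET /old-page HTTP/1.1" 404 320 "-" "Mozilla/5.0 (compatible; Googlebot/2.1)"',
    '203.0.113.9 - - [10/Jan/2018:10:00:02 +0000] "GET /guide HTTP/1.1" 200 5120 "-" "Mozilla/5.0"',
]
print(googlebot_status_counts(sample))
```

In practice you would stream your real `access.log` into this function and investigate any status other than 200/301.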
Step 4: SEO Audit Pages
Any SEO audit must check the quality of the content and the optimization of each page.
Quality content without effective optimization will not perform well.
Low-quality content with excellent optimization will not get good performance either.
You need both quality content and effective optimization to drive traffic.
1 / Is the Content Unique?
The first thing I do is scan the target page with the Copyscape tool to check that the content is unique.
2 / Is the keyword in the Title tag?
Your main keyword for the page in question must be in the title tag.
This keyword should appear only once.
3 / Is the keyword in the meta description tag?
Take care to insert your keyword once (no more) in the meta description.
It will appear in bold in the search snippet and be quickly visible to users who typed that keyword.
4 / Is the keyword in the first 100 words?
Your keyword should appear once at the beginning of your content.
This sends a signal to Google that this keyword is important to the page.
5 / Is the URL Clear and Optimized for SEO?
The page must include the main keyword in its URL and it must be short.
6 / Does the ALT Parameter of the first image contain the main keyword?
All ALT parameters in your images should be filled in, but your main keyword should appear in the ALT parameter of the first image. For more information read the guide on referencing images on Google.
7 / Does the last sentence of your content include the keyword?
The last sentence or conclusion of your content helps solidify the relevance of the page for your keyword.
8 / Is There Between 2 and 5 Outgoing Links?
Be sure to mention 2 to 5 authoritative sites related to the theme of your blog post in order to add value to the content.
This has the effect of helping your readers and showing Google that your content is serious.
In addition, it has been shown that this can give a boost to the content's rankings.
It may also earn you backlinks, since the resources you mention might share or link to your article.
9 / Is There Between 2 and 5 Internal Links? If yes, are they in the right place?
As noted above, if you have internal links, be careful to use anchor texts with the exact keyword.
Be sure to insert between 2 and 5 internal links.
That's all you need to analyze your pages.
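The checks above (keyword in the title, in the meta description, in the first 100 words) can be scripted for a quick first pass. This is a hedged sketch using stdlib regexes; a real audit would use a proper HTML parser, and the sample page is invented.

```python
# Illustrative on-page checker: does the main keyword appear in the
# title tag, the meta description, and the first 100 words of text?
import re

def on_page_checks(html, keyword):
    kw = keyword.lower()
    title = re.search(r"<title[^>]*>(.*?)</title>", html, re.I | re.S)
    meta = re.search(
        r'<meta[^>]+name=["\']description["\'][^>]+content=["\']([^"\']*)',
        html, re.I)
    # Strip tags; note this crude pass keeps title text in the word count.
    body_text = re.sub(r"<[^>]+>", " ", html).lower()
    first_100_words = " ".join(body_text.split()[:100])
    return {
        "keyword_in_title": bool(title) and kw in title.group(1).lower(),
        "keyword_in_meta_description": bool(meta) and kw in meta.group(1).lower(),
        "keyword_in_first_100_words": kw in first_100_words,
    }

html = """<html><head><title>SEO Audit Guide</title>
<meta name="description" content="A complete SEO audit checklist."></head>
<body><p>This SEO audit guide walks you through every step.</p></body></html>"""
print(on_page_checks(html, "seo audit"))
```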
Now, let me show you how you should analyze your content.
Step 5: Content Audit
Content auditing is a bit like taking inventory of all the content on your site that search engines have indexed, then looking at a number of metrics to decide what action to take: keep, improve, or remove.
The analysis of your content must explore both your landing pages and all the publications made on your blog.
Analyzing content is the longest part of an SEO audit.
Simply because it is the most important part of the entire audit.
You may have mastered all the other parts of your SEO campaign, but with bad content, your good results will not last.
1 / Take Inventory Of All Your Content
A content audit always begins with an inventory of all the content you want to position on the search engines.
The goal is, subsequently, to analyze all the contents listed using various metrics.
The final result will be one of 3 choices:
- Keep as is
- Improve/rewrite
- Delete/de-index
To make an inventory of your indexable contents, I advise you to use several tools:
- 1. Crawl the indexable contents with Screaming Frog and export them
- 2. Export indexable content with Google Analytics
- 3. Do the same with the Google Search Console
Then consolidate everything into an Excel file and delete all duplicates.
If you use only Screaming Frog, you risk missing some indexable content, hence the combination of the 3.
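The merge-and-deduplicate step can be sketched like this. The normalization rule (lowercasing and trailing-slash stripping) is a naive assumption; adapt it to your site's URL conventions, and the sample exports are invented.

```python
# Minimal sketch of the consolidation step: merge URL lists exported
# from Screaming Frog, Google Analytics and Search Console, then drop
# duplicates while keeping a stable order.
def merge_url_inventories(*url_lists):
    seen = set()
    merged = []
    for urls in url_lists:
        for url in urls:
            u = url.strip().rstrip("/").lower()  # naive normalization
            if u not in seen:
                seen.add(u)
                merged.append(url.strip())       # keep first-seen spelling
    return merged

screaming_frog = ["https://example.com/guide", "https://example.com/blog/"]
analytics = ["https://example.com/blog", "https://example.com/contact"]
search_console = ["https://example.com/guide"]

inventory = merge_url_inventories(screaming_frog, analytics, search_console)
print(inventory)
# ['https://example.com/guide', 'https://example.com/blog/', 'https://example.com/contact']
```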
Once your Excel file is finalized, proceed to the next step, which is to check if any of your content is likely to trigger a Google penalty.
2 / Check If Some Of Your Contents Risk A Penalty
There are 3 major reasons why your content could (possibly) trigger a penalty:
- 1. Poor quality
- 2. Duplication
- 3. Irrelevance
Let’s see some mistakes to avoid …
Poor-Quality Content
Pay attention to spelling and grammar mistakes.
It cannot be said enough:
Use a spelling and grammar checker.
For example, the Grammalecte tool.
Also, avoid writing for search engines (and not for your readers) by stuffing a max of keywords. Your content will be useless and imprecise.
A few pieces of irrelevant content here and there will not hurt you much.
However:
Avoid filling your blog with content that has nothing to do with your theme.
This problem is recurrent.
Many blogs offer articles containing:
- Almost nothing but images (e.g., fashion blogs…)
- Very little text
Unfortunately, this can be very harmful when there are many such posts… Google Panda could well come knocking one day…
In addition, longer content tends to rank better.
A tip: aim for at least 2,000 words.
If you are not on WordPress, to check the number of words you can use a tool like Word Counter
Duplicated Content
This concerns blogs that copy other people's content without adding any value.
Misleading Optimization
Be careful not to optimize pages (Title tag and keywords) for queries that have nothing to do with the content … Visitors to the pages in question will not get the information they expect. Be careful if this is repeated too often.
Other harmful elements
- Indexable result pages from your site's internal search engine
- Large numbers of indexable tag or category pages
3 / Do You Have Duplicate Content on Other Sites?
Some bloggers frequently copy other people’s content on their own blog.
Do they risk a penalty?
No, except if the same content is copied hundreds of times…
Do you risk a penalty if someone copies your content?
No. However:
In some cases, a blog that copied your content may outrank you in the search engines…
For example, if this blog is much more authoritative than yours.
So check if your content has not been copied elsewhere.
There are several tools to achieve this. One of the best tools is simply Google Alerts.
Once on the tool, copy and paste a section of your content into Google Alerts and choose which types of sites Google Alerts should inspect.
Finally, enter the email address where the alerts should be sent.
You can create as many alerts as you want and be alerted daily, weekly or whenever the case arises.
Another alternative is to use the Copyscape tool.
Type the URL of the site to scan and click "Copyscape Search".
Report duplicate content without your permission
Many bloggers and webmasters put a contact page (or an email address) on their sites. First, contact them to request that the duplicated content be removed from their site.
Many will ignore your request, but some will listen to you.
For those who ignore your request, contact the company hosting their site directly.
To find the host you can use the tool Who is hosting this.
Hosts are generally quite receptive and can take the entire site of the person who illegally copied your content offline.
As a last resort, you can report it to Google, and they will remove the content from their index.
4 / Do You Have Duplicate Content on Your Site?
To find duplicate content, you can use Siteliner.
This tool will show you which pages share the same content.
Go to Siteliner.com and enter the URL of the site to analyze.
Then click on ” Duplicate Content ” and identify which pages have duplicate content.
Keep in mind that this tool is not always accurate.
It may not know that you have actually deindexed all your category pages.
It will therefore classify these pages as duplicate content.
The tool only does the groundwork for you…
It's up to you to draw your own conclusions.
I explain how to manage your duplicated content a little further down…
5. Is your content unique and original?
This sounds obvious, but the content on your site needs to be unique and original.
That means using your creativity to come up with great ideas and create original content.
6. Is your content useful and informative?
In addition to original content, you must make sure that it will be useful and informative for your readers.
This means that it has to educate them on the topic being addressed or solve a problem that your ideal client is experiencing.
You must always think about your ideal client when producing content.
The content on your site is there to help your potential future customers.
7. Is your content better than that of your competitors?
You do not need to create content if you do not think it will be better than anyone who is already on the first page of Google.
Each of your content must be created with the intention of beating your competitors.
Otherwise, you are wasting your time.
8. Is your content captivating?
Your readers need to feel that you are talking to them directly. "You" and "your" should become your two favorite words.
9. Is Your Information Accurate and Correct?
Never invent statistics or falsify information.
10. Do you moderate the comments on your blog?
Spammers love to inject junk links into your blog comments.
That’s why you need to check the comments.
To avoid any association with a questionable site, clean the ” comments ” section of your blog, if necessary.
These different points form the basis for determining whether your content strategy is working or not.
But the ultimate indicator of the performance of your content will be all the data relating to the experience of your users (their behavior).
11. Classify Your Duplicated Content
Once steps 3 and 4 have been completed (see above), you can sort the duplicated content (internal and external) into 3 categories of actions:
- 1. Rewrite/improve the page
- 2. Delete / Unindex page
- 3. Leave the page as it is.
Which pages need to be improved/rewritten?
- Important pages, such as category pages, homepages, product pages that have a good ROI.
- Pages with good social metrics (e.g., number of shares) and a good link profile.
- Pages with decent traffic.
Make sure to improve these pages by providing unique and useful content.
What are the pages to remove / de-index?
- Delete guest posts (present on your site) that have also been published on other sites.
- Delete all content that was plagiarized by your client (or by yourself)
- Delete content that does not deserve to be improved. Such as content with no backlink, few social shares, little traffic.
- De-index all deleted content. A simple method is to let the URLs of these pages return 404 or 410 response codes. Take care to delete all internal links pointing to these pages (sitemap included).
What are the pages to keep?
- Important pages whose content has been plagiarized.
12. Classify Your Non Duplicated Content
Once the duplicated content is sorted, you can sort the non-duplicated content into 4 categories of actions:
- 1. Rewrite / improve the page
- 2. Consolidate the page with other pages (merge)
- 3. Delete / Unindex page
- 4. Leave the page as it is.
Which pages need to be improved?
- Pages with many visits but a low conversion rate, low average time on page, few page views per session, etc.
- Key pages where you clearly see several points of improvement (the visual aspect, the length of the text, adding videos, etc.).
Which pages should be consolidated?
- When several pages deal with the same subject without offering real unique value individually, but together would form a superb resource. In this case, keep the best page, then delete the others and redirect them (301) to this canonical page. Take care to note which sections of the deleted pages should be consolidated into it, and which internal links to keep.
- Dated pages that could be merged into one evergreen page (forecasts for 2012, forecasts for 2013… -> forecasts for this year)
Which pages should be deleted?
- Pages with bad links, little traffic, few social signals and poor content. In this case you can delete and return a 404 or 410 response.
- Pages with irrelevant content. If these pages get good traffic and have some good backlinks you can do a 301 redirect and if not, delete.
- Pages with obsolete content and no interest to be consolidated. If these pages get good traffic and have some good backlinks you can do a 301 redirect and if not, delete.
Which pages should be kept?
- Pages with good traffic, good conversions, with a fairly high on-site time, and good content.
Step 6: User Experience Analysis (UX)
Thanks to the UX audit you can, among other things:
- Understand why your conversion rates are low,
- Improve how users are received on your site
- Discover why users do not stay
Auditing the user experience in order to understand the weaknesses is a key point, especially today.
The ultimate goal?
Satisfy the users as much as possible and empathize with your future customers.
Fortunately, there are tools to give you a big boost in your analysis.
1. Audit of UX with Google Analytics
The Bounce Rate
You're probably wondering: "What is a good bounce rate?"
Unfortunately, there is no clear answer on this subject.
The bounce rate varies depending on the type of website.
For example, a site of “funny photos” will have a high bounce rate.
Simply because visitors come to see the funny photo, have a good laugh, and leave the page.
Blogs like mine have much lower bounce rates because people want to read and learn what I have to share.
Although it is not possible to generalize what a good bounce rate should be, we can still say that:
- 50% and below: excellent
- Between 60% and 70%: normal
- Between 70% and 80%: bad
- Above 80%: very bad
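Those bands can be encoded in a small helper. Note the article does not specify the 50-60% band, so it is labeled "average" here purely by assumption:

```python
# The article's rough bounce-rate scale as a tiny classifier.
def bounce_rate_label(rate_pct):
    if rate_pct <= 50:
        return "excellent"
    if rate_pct < 60:
        return "average"   # assumption: band not covered by the article
    if rate_pct <= 70:
        return "normal"
    if rate_pct <= 80:
        return "bad"
    return "very bad"

print(bounce_rate_label(45))  # "excellent"
print(bounce_rate_label(85))  # "very bad"
```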
The Average Time Spent On The Site
The more your visitors stay on your site, the more likely you are to convert them.
Like the bounce rate, the time spent on a site depends on the type of site.
However, if the average time spent is less than 1 minute, it will be imperative that you seek to understand why.
In general, users will stay longer on a site that contains more content to consume.
For example, my readers stay on average 2:47 minutes on my Blog.
If my readers stayed less than 1 minute, I would be forced to start questioning the strategy of my content and my site in general.
If there is one thing that will repel your readers, it’s the lack of quality content
Local businesses are often struck by a very short time on site.
Because anyone looking for a "computer repairman in Montpellier" is probably interested in the rates.
These people will jump from one repair company to another to find the best prices.
The best way to combat this problem at the local level is to produce more useful content for readers.
You need to spend time educating your potential local customers.
Education and transparency inspire trust.
And once trust is established, it makes sales easier.
Concentrate on providing more value than your competitors.
This will improve your bounce rates and encourage your visitors to stay longer on your site.
The longer your readers stay on your site to digest your content, the more they will come to like you.
Goal Tracking
Tracking your goals is the most important metric in Google Analytics.
The only reason your business should have a website is to convert its visitors.
It does not matter if your bounce rate is low or if many people stay for hours on your site ….
If these people are not converted into prospects, customers or subscribers to your newsletter, then you are wasting your time.
The ultimate goal of improving a site's performance is to generate revenue.
Keep in mind that SEO is intended to make your site more visible, but it does not include conversion to customers.
So SEO alone will not help you make more money.
It is YOU who earn money selling your products and services.
You can have the best SEO on earth, but if you are not able to sell your own products and services, it will not work.
The word ” sell ” is interpreted differently by individuals.
But there is one thing that every company on the web has in common:
You must sell your products or services through the creation of articles or videos.
If you skip this step, no one will buy your products and no one will become a prospect.
That being said, as soon as I see zero goal completions, I immediately look at the sales strategy the client has put in place on their site.
- Is it easy for prospects to contact you?
- Is there enough information about your benefits?
- Do you sufficiently emphasize your good reputation (social proof)?
Identifying which pages are most often viewed last before your visitors leave your site is a first step in understanding the issues.
You must analyze the pages most frequently used as exit points by your visitors.
The question to ask is: ” Why do they leave my site from this page, more than others? “
Believe it or not, a high exit rate on a particular page is not always a bad thing.
Sometimes the content has given readers everything they came for, and they leave to put what they learned into action.
They are satisfied.
Do not think that users leave a page because they hate it.
If the contents of this page solved their problem, and they leave the site, you won.
There is one very important thing to consider when you analyze the exit rate in Google Analytics.
Do not focus on the total number of "Exits".
The total number of exits will always be higher on pages that receive more traffic.
The number to focus on is "% Exit".
Sort the percentage data from highest to lowest.
A high exit percentage is above 80%.
A "normal" exit percentage is around 50-65%.
The #1 reason visitors leave a page very frequently is simply that its content did not meet their expectations and therefore solved none of their problems.
There are other factors that drive people from a page like the design, however, content is almost always the cause.
Head to the page with the highest exit rate and ask yourself the following questions:
- Does this page solve a problem or answer a question in a truly complete way?
- Are there still some unanswered questions?
- Is the content nice to read?
- Are there too many blocks of indigestible text?
- Pictures too small?
- Broken image links?
- Does the page load very slowly?
- Are there any distracting/disturbing elements such as advertisements that would make visitors flee?
- Have you configured your external links on “open in a new tab” (if no, you should)?
These questions are sufficient to find and solve the problem.
Follow this process for all pages with a high output rate.
A high number of returning visitors is a very positive user-experience signal.
It means your site and content deserve to be seen and read again and again.
Having repeat visitors is a good point in terms of conversion.
Indeed, this gives you more opportunities to convert them into prospects or subscribers to your newsletter.
If you do not have a high percentage of returning visitors, it may mean your content is missing something.
Or your website may have technical or content issues (see my explanation above) that push your readers away.
2. Audit of UX with the Google Search Console
Search With Keywords Containing Your Brand
Just like returning visitors, searches using your brand or company name (example: Drujok) are a very positive indicator, since they mean these people are interested in your website or brand.
If you produce quality content and your site is built with the purpose of satisfying your readers, they will necessarily want to come back to visit your site.
This means that next time they will go on Google directly typing your brand or company name.
To check if some of your readers already like you, you need to use Google Search Console.
Go to “Search Traffic ” and click on “Search Analysis “.
Filter by ” Clicks ” in such a way as to display the search terms with the most clicks first.
Your company or brand name should appear at the top of the list.
3. Social Signals
Social signals alone are not very powerful.
BUT, if you combine them with all the other positive metrics for the user experience, this will have the effect of literally boosting the positive signals for your site’s ranking.
One of your company's priorities should be to earn real social signals.
The only way to get them is to create quality content and satisfy your readers.
You can also use plugins like Social Locker if you have a little trouble getting shares and likes.
4. Audit of UX with Hotjar
As you know, the forms on your site are crucial.
They allow your users to:
- Request more information
- Ask for a quote
- Subscribe to your Newsletter
It is therefore important to make them easy and quick to fill out.
Otherwise, you risk a high abandonment rate.
Hotjar lets you understand how users interact with your forms through its two tools: Forms and Recordings.
You can test it for 15 days for free.
To do so, you have to add a tracking code to your site (in the header file).
The Forms feature provides everything you need to know the performance of your forms.
It lets you know, among other things, which forms your users abandon and which fields are often left empty.
The recordings feature is extremely useful.
It allows you to watch how users (anonymous) interact with your site.
So you can watch them fill out your forms … for example.
You will see where people get stuck and which fields seem to put them off.
5. Audit of UX with Crazy Egg
Another way to know how your visitors navigate your site: the Heatmaps.
Crazy Egg offers this feature.
You can see which parts of your pages get the most engagement from visitors.
Red areas mean that there are a lot of clicks.
Blue areas, the opposite, little or no clicks.
Now it’s finally time to take a look at your link profile.
Step 7: Audit Backlinks
As you probably know, backlinks (or inbound links) can literally boost an SEO campaign.
That’s why a big part of your SEO audit should be analyzing the profile of your site’s links.
For this, I use the tool Ahrefs and Google Search Console to deeply analyze the links.
Now you must be asking yourself: "What should we look at?"
It is necessary to check different factors:
1. Relevance of Links
The relevance of links is a key factor when it comes to creating links.
It is almost always the starting point of the link audits I perform.
It is not necessary for 100% of your inbound links to be relevant, but a majority is required.
In order to quickly identify the relevance level of a link profile, it is sufficient to export the links from Ahrefs and use the bulk check function in the Majestic tool.
When you export from Ahrefs, be careful to include the referring domains as well:
Open Majestic and use the "Bulk Backlink Checker" tool to see the "Topical Trust Flow" of the referring domains you previously exported from Ahrefs.
Although the Topical Trust Flow metric is not perfect, it is the only scalable relevance metric available today.
Manually checking the relevance of every site linking to you would be a waste of time.
The goal here is to obtain an overall picture of the relevance of DOMAINS linked to the customer’s site.
For this, go to ” Tools “, ” Link Mapping Tools “, then ” Bulk Backlinks “.
In the Bulk Backlinks, use the referring domains previously exported from Ahrefs and export the results.
Sort your CSV file by Topical Trust Flow.
Identify which sources of links are completely irrelevant to the rest.
If you are a plumber and you have an incoming link from a site with the theme “sport”, this is not normal.
Then note each irrelevant incoming link.
This does not necessarily mean you will have to ask for their removal.
It's just a way for you to know they exist.
That way, if your site is ever hit by a Google penalty (hoping it never happens), you can revisit these sites and remove the irrelevant links.
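The sort-and-flag step can be sketched as follows. The column names, topics, and domains are all invented for the example; a real Majestic export uses its own headers, so adapt the keys accordingly.

```python
# Hypothetical sketch: sort a bulk-backlink export by Topical Trust Flow
# and flag referring domains whose topic does not match your site's.
import csv
import io

def irrelevant_domains(csv_text, my_topics, min_tf=0):
    rows = list(csv.DictReader(io.StringIO(csv_text)))
    rows.sort(key=lambda r: int(r["TrustFlow"]), reverse=True)  # strongest first
    return [r["Domain"] for r in rows
            if r["Topic"] not in my_topics and int(r["TrustFlow"]) >= min_tf]

export = """Domain,Topic,TrustFlow
plumbing-blog.example,Home/Plumbing,35
sports-news.example,Sports,28
diy-forum.example,Home/DIY,12
"""
print(irrelevant_domains(export, my_topics={"Home/Plumbing", "Home/DIY"}))
# ['sports-news.example']
```

For the plumber example above, a "Sports" referring domain is exactly what this flags.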
2. Authority of Links
After the relevance of the links, the authority of the links is another important criterion.
In fact, a strong authority can sometimes hide the irrelevance of a link.
I prefer to focus first on the relevance of the links (and then the authority) because I think it secures your site away from possible penalties distributed by Google’s algorithms, regularly updated.
There are several ways to know if a backlink is authoritative.
You can check this by launching a ” bulk check ” in Majestic or Ahrefs.
Ahrefs ” Domain Rating ” (DR) is a precise gauge of the authority of a site.
It is much more accurate than MOZ's PA (Page Authority) and DA (Domain Authority), simply because the metric is updated very frequently.
Data from Open Site Explorer, by contrast, is updated infrequently and is often inaccurate.
You can use Open Site Explorer to double-check with Ahrefs but do not rely on these metrics alone.
Another metric that is virtually impossible to game is the SEMrush traffic score.
The reason is that this metric is based on real rankings in search engines.
SEM Rush uses its own algorithm to determine the quality of your organic traffic.
It’s not perfect but it’s a metric that I rely on every day to determine the quality of link opportunities.
The use of all metrics at your disposal allows you to gauge the quality of your current backlinks or new links opportunities.
3. Diversity of Links
Diversifying your incoming links makes your link profile more “natural”.
In addition to the “type” of incoming link, it is also necessary to diversify the links with SEO juice (DoFollow) and without SEO juice (NoFollow).
At this stage of the analysis, just ask yourself the following question:
"Is my link profile diverse enough?"
4. Targeting Links
Another important factor to analyze is the ratio of links pointing to your homepage versus links pointing to deep pages.
If you follow a content-focused SEO approach, the majority of your inbound links should be towards the deeper pages of your site.
In general, it is necessary to distribute the incoming links equitably on your entire site.
This will build the overall authority of the site and improve your chances of seeing positive results quickly.
5. Diversification of Anchor Texts
Over-optimization of anchor text is very common, which is why you always have to check the ratios.
The first ratio to check is the percentage of anchor texts with an "exact match", that is, containing an exact keyword.
Then look at the percentage of anchor texts containing your brand or company name.
If the percentage of anchor texts with an exact keyword is greater than the percentage of anchor texts with a brand name, it is imperative to change the strategy.
As you may know, much of your anchor text profile must use your brand or company name.
Anchor texts with an exact keyword must be used at a very small rate because they are simply a very strong signal of SPAM in the eyes of Google.
If your site is suffering from over-optimized anchor text, there are several solutions:
- Build new inbound links with brand/company-name anchor texts in order to dilute the proportion of over-optimized anchors
- Change some exact-keyword anchor texts to brand/company-name anchors
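The ratio check itself is simple to script. This sketch uses invented anchors plus the "Drujok" brand from the article's earlier example, and the matching rules are deliberately naive (substring match for the brand, exact match for keywords):

```python
# Sketch of the anchor-text ratio check: share of exact-match vs
# branded anchors in a backlink export.
def anchor_ratios(anchors, brand, exact_keywords):
    total = len(anchors)
    branded = sum(1 for a in anchors if brand.lower() in a.lower())
    exact = sum(1 for a in anchors if a.lower().strip() in exact_keywords)
    return {"branded_pct": round(100 * branded / total, 1),
            "exact_pct": round(100 * exact / total, 1)}

anchors = ["Drujok", "drujok blog", "seo audit", "click here", "Drujok"]
ratios = anchor_ratios(anchors, brand="Drujok", exact_keywords={"seo audit"})
print(ratios)  # {'branded_pct': 60.0, 'exact_pct': 20.0}

# A higher exact-match share than branded share is the warning sign
# described above.
if ratios["exact_pct"] > ratios["branded_pct"]:
    print("Warning: over-optimized anchor profile")
```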
6. Total number of reference domains
The more a site has different referring domains related to it, the better.
The analysis we are doing here is a simple comparison with the highest ranked competitors.
For example, find out how many referring domains link to your site and compare that number with your competitors'.
The solution is simple:
Just get more quality and relevant inbound links from unique domains.
7. Propagation Speed of Incoming Links
Does the speed of propagation of your site’s inbound links remain constant over time?
Is it irregular?
Periods showing significant losses of links are suspect.
Inbound links from real sites are rarely removed.
Inbound links from artificial websites disappear when the link sellers fail to renew their hosting or domains…
Your goal is a curve of incoming links that grows steadily over time.
Ready To Do Your SEO Audit?
The 7 steps of the SEO audit described in this guide are tedious but crucial for getting better results in the search engines, so download the checklist and start the 57 tests of the SEO audit.