When creating a website's SEO strategy, it is important to carry out a detailed analysis of every single factor that could impact web optimization and positioning.
An SEO checklist may be the easiest way to perform this complete review. We suggest one that you can adapt to your project and that we structure in the following blocks:

Initial setup
First of all, we are going to configure or review the existing configuration of our website, both in analysis and monitoring platforms, as well as in our own server.

1. Google Analytics
Set up Google Analytics from the start so you can measure the complete evolution of visits to your website. Check that the tracking code is correctly inserted and that it reflects visits in real time.
In addition, you can implement the full system of goals and conversions to see the value of each visit to your website.
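As a reference, the standard Google Analytics 4 snippet goes in the page `<head>`; the measurement ID below is a placeholder you would replace with your own:

```html
<!-- Google Analytics (GA4) — replace G-XXXXXXXXXX with your own measurement ID -->
<script async src="https://www.googletagmanager.com/gtag/js?id=G-XXXXXXXXXX"></script>
<script>
  window.dataLayer = window.dataLayer || [];
  function gtag(){dataLayer.push(arguments);}
  gtag('js', new Date());
  gtag('config', 'G-XXXXXXXXXX');
</script>
```

If the tag is working, the Realtime report should register your own visit within seconds.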
2. Search Console
Search Console is the indispensable tool for every SEO, with which you will have access to a complete monitoring of your website and its status in the search engine.
But if you are passionate about data and want to get the most out of the information Search Console provides, don't stop there: with SEOcrawl you can go a step further and supercharge your analysis, reports and monitoring so that nothing escapes you.
3. Check different URL versions
A website can have different versions:
- http://www.miweb.com
- https://www.miweb.com
- https://miweb.com
None of them should produce errors, and all of them should point to the main version (the one you decide on) to avoid problems such as duplicate content or 404 errors. Confirm that every secondary URL sends the user to the main one by means of a 301 redirect.
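One way to enforce the redirects is at the server level. A minimal sketch for Apache, assuming mod_rewrite is available and https://www.miweb.com is the chosen main version:

```apache
# .htaccess sketch: send every variant to the main version with a 301
RewriteEngine On

# Force HTTPS
RewriteCond %{HTTPS} off
RewriteRule ^(.*)$ https://www.miweb.com/$1 [L,R=301]

# Force the www subdomain
RewriteCond %{HTTP_HOST} ^miweb\.com$ [NC]
RewriteRule ^(.*)$ https://www.miweb.com/$1 [L,R=301]
```

On nginx or other servers the syntax differs, but the idea is the same: one hop, one 301, straight to the main version.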
4. Keyword Research
Keyword analysis sets the building blocks of any SEO strategy. With the following tasks you will be able to build a strategy with high profitability potential and a realistic growth plan.
Thanks to detailed keyword research we will be able to identify:
- Core keywords for your business
- "Long tail" keywords to attack immediately
- Keywords your competitors are ranking for
- And much more!

5. Identify main competitors
There are two ways to analyze the competition on the Internet. On the one hand, the real competition, those companies that are part of the sector in your niche, that you probably know and whose current position in the search engine you must identify.
On the other hand, those who are in the top positions in the key searches you are interested in. They may not be sectorial competition, but they are organic, so they have to be included in your objectives.
You can search for competitors using different tools:
- Your knowledge of the sector.
- Direct searches in the search engine.
- Market analysis with specific tools and software, such as comScore, Semrush, Woorank or Ahrefs, among others.
6. Identify industry leaders
It is easy to identify the leaders in a business sector; they are often well-known and international brands.
It can be difficult to match them, but they can help you set a target, see what this sector is capable of offering both from a commercial and an SEO perspective and, incidentally, get some ideas that we can adapt to our own business.
7. Define the project's core keywords
The most difficult task of all is to identify our project's main keywords (core keywords) and, once they are determined, to optimize for them. This process is fundamental because the success, and certainly the revenue, of the business depends on them: no matter how many keywords you rank for, the flow of business and conversions almost always comes from a specific group of keywords, which we call "core".
On the other hand, we must also start from these keywords to detect new secondary keywords, which can also offer us profitability.
Choosing these core keywords must be done by taking advantage of all the resources at our disposal. These are some of the most useful ones:
- Again, your knowledge of the sector.
- Keyword tools such as Google Keyword Planner.
- Keyword comparison tools such as Google Trends.
- SEO keyword analysis tools, such as Ahrefs, Semrush, Sistrix, keywordtool.io, etc.
Once you have identified them, tracking them is very easy with SEOcrawl: within Tags, set the conditions for a keyword to be considered core business and then use the advanced filters in Top Keywords to save a smart view with those conditions.

From this moment on, you will only have to choose the Core option within your Smart Views list to analyze that set of keywords. The best part? From the same dashboard you can set the alerts you consider relevant so you don't have to keep an eye on possible ups and downs every day. Easy, isn't it?

8. Initial snapshot: your starting point
With all these parameters we can establish what is known as the "initial snapshot". It is like a photograph taken at the start of a race.
What is your company's position? Has it made any progress in optimization or positioning? And has it so far surpassed any rival in the sector? Establish your starting point and, from there, set your first SEO goals.
Note: If you connect your project to SEOcrawl, you can immediately see the detailed performance evolution of the last 12-16 months.

On-page SEO
On-page SEO is the optimization performed on everything we control within our own project. Therefore, analyzing it and improving it whenever possible is exclusively up to us.

9. Sitemap
Make your website's indexing as fast and complete as possible with a sitemap.xml. You can create it by hand, if you have programming skills, or take advantage of the various plugins that generate one in a click.
Once you confirm it is created and uploaded, you should submit it in Search Console. It is not mandatory, but it is a very useful way to monitor that communication between your website and the search robot is fluid and effective.
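For reference, a minimal sitemap.xml following the sitemaps.org protocol looks like this (URLs are illustrative):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://www.miweb.com/</loc>
    <lastmod>2023-10-10</lastmod>
  </url>
  <url>
    <loc>https://www.miweb.com/blog/travels/andalucia-2-days</loc>
    <lastmod>2023-10-05</lastmod>
  </url>
</urlset>
```

Only include URLs you actually want indexed; a sitemap full of redirects or noindexed pages sends mixed signals.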

10. Robots.txt
In most cases, the robots.txt file simply tells the search robot which parts of the website it may crawl. A correct configuration will help you avoid wasting crawl budget on pages with little SEO value, or even on potentially dangerous ones.
Whether your case is simple or your project requires a more complex robots.txt, it is essential to have one and to upload it correctly.
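A simple robots.txt sketch; the disallowed paths below are typical examples (a WordPress admin area, internal search results) that you would adapt to your own project:

```text
# robots.txt — adapt the rules to your own project
User-agent: *
Disallow: /wp-admin/
Disallow: /search?
Allow: /wp-admin/admin-ajax.php

Sitemap: https://www.miweb.com/sitemap.xml
```

Remember that Disallow blocks crawling, not indexing: a blocked URL can still appear in results if other sites link to it.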
11. Web architecture
Onpage SEO depends a lot on usability and this factor changes as navigation standards and technology evolve.
Currently, the optimal architecture of a website is one that has:
- a clear and accessible menu
- content that can be reached in as few clicks as possible (ideally no more than 4)
- intuitive and fluid access to every part of the site.
From an SEO point of view, this architecture must be user-friendly and focused on the objectives you have set, highlighting the most relevant products or services and preventing low-value sections from occupying areas that Google considers important. The architecture should be as clear as possible and make the importance of each part evident to crawlers at all times.
12. Friendly URLs
The URL is nowadays one of the most important elements in on-page SEO, which is why its analysis and continuous improvement will absorb a lot of effort within the strategy.
A friendly URL is one that includes main and secondary keywords, related to the content of the page and framed within our keyword strategy.
The tasks related to this optimization are the following:
- Review of unfriendly URLs.
- Optimization of existing friendly URLs.
- Creation of new friendly URLs.
Example:
Wrong: example.com/blog/category/post?=34645
Correct: example.com/blog/travels/andalucia-2-days
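If you generate slugs programmatically, a minimal sketch of turning a page title into a friendly, keyword-bearing slug could look like this (the function name and example title are illustrative):

```python
import re
import unicodedata

def slugify(title: str) -> str:
    """Turn a page title into a friendly, keyword-bearing URL slug."""
    # Strip accents so "Andalucía" becomes "Andalucia"
    text = unicodedata.normalize("NFKD", title).encode("ascii", "ignore").decode("ascii")
    # Lowercase, then collapse runs of non-alphanumerics into single hyphens
    text = re.sub(r"[^a-z0-9]+", "-", text.lower()).strip("-")
    return text

print(slugify("Andalucía in 2 days"))  # -> andalucia-in-2-days
```

Most CMSs do this for you, but it is worth checking that theirs handles accents and punctuation the same way.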
13. Titles
The title tag is the page title that appears in the search results when the website is shown for a specific keyword.
Optimizing the title is more complex than it seems, since it must serve SEO objectives, help promote the brand and respect the length limits imposed by Google.
14. Descriptions
The descriptions of all pages obey rules similar to those of the title, except that, having a larger character allowance, they offer more room for optimization and creativity.
Although meta descriptions are not an SEO positioning factor, they can have a decisive influence on the CTR, getting more users to click on your result, so it is very interesting to make the most of their potential.
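Both tags live in the page `<head>`. A sketch with illustrative content; the character counts in the comments are common guidance, not hard limits:

```html
<head>
  <!-- Title: roughly 50-60 characters tends to display in full -->
  <title>2 Days in Andalucia: Itinerary and Tips | MiWeb</title>
  <!-- Meta description: ~150-160 characters, written to earn the click -->
  <meta name="description" content="Plan a 2-day trip to Andalucia: the best
    routes, where to eat and what not to miss, with a printable itinerary.">
</head>
```

Note that Google may rewrite either of them if it judges they do not match the query or the page content.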
For optimizing the "title" and "meta description" tags, as well as the "alt" attributes, headings, etc. that we will see later, we recommend the SEOcrawl extension for Google Chrome: a very practical way to identify these and other values of whichever URL you are browsing.

15. Breadcrumbs
Breadcrumbs are the navigation links present on a web page, which serve both to show users their location and to let them navigate back to previous sections or categories.
In SEO, breadcrumbs offer both usability and keyword optimization, since each “crumb” allows the inclusion of a keyword.
In addition, breadcrumbs allow you to introduce schema.org's breadcrumb-specific data markup, which helps Google understand the hierarchies of your architecture.
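A breadcrumb trail can be marked up with schema.org's BreadcrumbList type via JSON-LD; a sketch for a hypothetical blog post:

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "BreadcrumbList",
  "itemListElement": [
    {"@type": "ListItem", "position": 1, "name": "Blog",
     "item": "https://www.miweb.com/blog/"},
    {"@type": "ListItem", "position": 2, "name": "Travels",
     "item": "https://www.miweb.com/blog/travels/"},
    {"@type": "ListItem", "position": 3, "name": "Andalucia in 2 days"}
  ]
}
</script>
```

The last item (the current page) can omit its `item` URL; Google's Rich Results Test will validate the markup.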

16. ALT tags
In SEO every factor matters, and the sum of them all determines which website is better optimized. That is why you should also pay attention to ALT attributes, which you can optimize with the keywords you are most interested in. The ALT text has a double function: it is displayed if the image fails to load, and it helps both Google and the user understand the content of the image.
Certain SEO tools, such as crawling programs like Screaming Frog or similar, can help you identify missing ALTs in certain parts of the web so that you can optimize them much faster.
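For a quick spot check without a full crawler, a small script using Python's standard-library HTML parser can list images with a missing or empty alt attribute (the sample HTML below is illustrative):

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collects <img> tags that lack a non-empty alt attribute."""
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if not attrs.get("alt"):  # missing or empty alt
                self.missing.append(attrs.get("src", "(no src)"))

html = """
<img src="/img/beach.jpg" alt="Beach in Cadiz at sunset">
<img src="/img/logo.png">
<img src="/img/banner.jpg" alt="">
"""
finder = MissingAltFinder()
finder.feed(html)
print(finder.missing)  # -> ['/img/logo.png', '/img/banner.jpg']
```

A dedicated crawler remains the right tool for a whole site, but this illustrates exactly what those tools are checking.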
17. Correct hierarchy of headings
Headings are not only useful for the user; they also carry weight in web design. Users increasingly "scan" texts, looking for the paragraph that best matches the exact information they need.
By using them correctly, i.e. using:
- a single H1 per page defining the main idea
- several H2s that support the semantically relevant keywords
- H3s that provide extra information, etc.
we will be helping not only the user, but also Google, to properly understand the content of the page.
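The hierarchy described above can be sketched in HTML like this (a hypothetical travel post; indentation is only for readability):

```html
<h1>Andalucia in 2 days</h1>
  <h2>Day 1: Seville</h2>
    <h3>Where to eat</h3>
  <h2>Day 2: Cordoba</h2>
    <h3>Getting there</h3>
```

One H1 states the main topic, each H2 covers a major subtopic, and H3s hang from their parent H2 without skipping levels.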

Analyze your website and make sure that all the headings have been implemented correctly. It is very common to fail at this point for design reasons: remember that the function of headings is to order the information, never to lay out the content, for which we have other visual resources. Pages often end up with a large number of H1s, which can confuse Google's spiders.
18. Images
Images are graphic resources that can visually enrich a web page, but you must pay attention to their optimization.
On the one hand, you should reduce their file size as much as possible without affecting quality. On the other hand, you should make the most of their optimization options: alternative texts and titles, links, declared dimensions, etc.
Finally, do not fall into the temptation of saturating the content with an excess of images, as this can affect the loading of the web and reduce the usability of the site.
19. Internal links
Internal link juice serves to boost the positioning of your main pages and, sometimes, to pass part of their authority on to pages with less weight, as long as an appropriate strategy is drawn up and the links are dofollow, so that the search engine takes them into account.
Prioritizing the main pages and designing an internal link distribution strategy is key to ensuring they receive the highest possible authority, and prevents less relevant pages from carrying much more weight than they should.
20. Mobile Friendly
With the generalization of mobile devices and Google's announcement of the Mobile First Index (whereby the mobile version of a website is considered the main one and is the first to be indexed), it goes without saying that any website must be multi-device. This means it must be "Mobile Friendly": accessible, fast, usable and functional from any phone.
We can easily check it from Search Console or through the Lighthouse tool.

21. Hreflang
Hreflang tags are used to tell Google that there are different language versions of a website. They are necessary in multi-language websites and must be correctly implemented, according to Google's guidelines.
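A sketch of hreflang annotations in the `<head>` of a site with Spanish, Mexican Spanish and English versions (URLs are illustrative):

```html
<link rel="alternate" hreflang="es-es" href="https://www.miweb.com/es/" />
<link rel="alternate" hreflang="es-mx" href="https://www.miweb.com/mx/" />
<link rel="alternate" hreflang="en" href="https://www.miweb.com/en/" />
<link rel="alternate" hreflang="x-default" href="https://www.miweb.com/" />
```

Per Google's guidelines, every language version must carry the full set of annotations, including one pointing to itself, and the references must be reciprocal; `x-default` marks the fallback for unmatched languages.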

It is very common to make mistakes in its implementation and, in the case of very similar language variants, such as European and Latin American Spanish, errors can even lead to duplicate content.
You can check that they have been implemented correctly with the Google Chrome extension Hreflang Tag Checker or with the SEO extension SEOcrawl, which will analyze all the implemented versions and inform you if there is any error in any of them.

22. Correct use of pagination
Pagination is one of the on-page elements least cared for by SEO professionals, when in fact it is highly important.
A correct pagination is not only functional, it must also be optimized. These are very common mistakes:
- Using a canonical from every paginated page to the first page (this is serious, since each page has different content).
- Preventing Google from crawling them correctly.
- Adding them via JS code that crawlers cannot read.
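One common approach that avoids the canonical-to-page-1 mistake is to give each paginated page its own URL and a self-referencing canonical; a sketch for page 2 of a category (URLs illustrative):

```html
<!-- Page 2 of a category: its own URL, its own self-referencing canonical -->
<link rel="canonical" href="https://www.miweb.com/blog/travels/page/2/">
<title>Travels - Page 2 | MiWeb</title>
```

The paginated pages should also be reachable through plain crawlable `<a href>` links, not only through JS-driven "load more" buttons.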
23. Filters
Filters can become the key piece of a website's SEO, especially in ecommerce. If you have a page where they have a place, you should try to optimize them correctly and decide whether you are going to work the SEO on the resulting pages or if you prefer not to index them and prioritize the pages of categories or products that interest you most.
It is an important decision but, in any case, we recommend that you optimize each page well, so that you always have the potential to index it and make it attractive to the user.
An incorrect use of the filters can generate hundreds of thousands of urls resulting from combinations between them, causing Google to waste time crawling and indexing these pages, even above your own main products, which can weigh down the whole project.
24. Indexing analysis
From the first day you allow indexing, you should analyze from Search Console how it evolves. Google's tool will identify possible errors and detect low-quality indexed pages, among other issues. In addition, you can perform the same analysis (with some extended functions) from the SEOcrawl Indexing option.

The objective of the indexing analysis is both to confirm that the urls we are interested in are well indexed, and to try to correct all the errors or lack of optimization that the others may be suffering.
25. 404 errors
A 404 error occurs when the content or resource the user is trying to access is no longer available. 404 errors have a very bad reputation, sometimes deserved, when in fact they represent an opportunity.
With tools like Search Console or any url crawler plugin, you can identify if there are broken links on your website that are leading to a 404 error page.
SEO optimization consists of deciding whether to redirect that page to an equivalent page or to create an optimized 404 page, which offers the user other entry options and avoids losing their visit.

Having 404 errors on a website is not bad per se, but Google identifies it as something natural; it is the fact of not dealing with them and the fact that they have more and more presence on our site that can become a problem.
26. Soft 404 Errors
Google mainly identifies these errors as incorrect redirects, i.e. redirecting old pages that no longer have value or exist in a massive way to the home page: when a redirect occurs, Google wants it to be to an equivalent page, if we do it systematically to the home page, they will end up being considered as soft 404 errors.
Imagine you are in a physical store and have walked to the very back looking for a particular model of sneakers, but when you get there they tell you it is sold out and, instead of offering you something similar, they walk you back to the entrance to start your search again. That wouldn't be appropriate, right? Well, on the web it is something similar.
27. Redirects - Redirection Chains
Be careful with redirection chains, that is, redirections of other redirections. On the one hand they will increase the loading time of your website and, in addition, they can motivate the search robot to stop crawling through them.
Example:
- http://ejemplo.com (redirect 1, with a 301)
- https://ejemplo.com (redirect 2, with a 301)
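The chain above can be checked programmatically. A minimal sketch that follows a redirect map (here a hypothetical in-memory map, so no network is needed; in practice you would collect the map from a crawl or from HTTP responses):

```python
def resolve_chain(url, redirects, max_hops=10):
    """Follow a redirect map until a final URL, a loop, or the hop limit."""
    chain = [url]
    while url in redirects and len(chain) <= max_hops:
        url = redirects[url]
        if url in chain:  # loop detected, stop
            break
        chain.append(url)
    return chain

# Hypothetical redirect map: two hops where one would do
redirects = {
    "http://ejemplo.com/": "https://ejemplo.com/",
    "https://ejemplo.com/": "https://www.ejemplo.com/",
}
chain = resolve_chain("http://ejemplo.com/", redirects)
print(" -> ".join(chain))
print("Redirects in chain:", len(chain) - 1)
```

The fix is always the same: point every old URL directly at the final destination, so the chain collapses to a single 301.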
28. Optimized and personalized 404 page
As we said before, a 404 page can be an opportunity for any error encountered by a user visiting your website.

Pour in all the creativity you can, take advantage of SEO-optimized and promotional text, and offer the user a list of valid links that encourages them to stay on the website.
29. Canonical
The canonical url or canonical link is the one used to avoid duplicate content. With it we communicate to Google which url is the original and which can be a derivative copy, for example, in online stores.
Use canonical links correctly and make sure that your website does not suffer from duplicate content.
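The canonical annotation is a link tag in the `<head>` of the duplicate or derivative page, pointing at the original (URL illustrative):

```html
<link rel="canonical" href="https://www.miweb.com/blog/travels/andalucia-2-days">
```

Canonical pages themselves commonly carry a self-referencing canonical, which also protects them against URL-parameter duplicates.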

30. Urls cannibalization
Cannibalization occurs when two or more URLs try to rank for the same keyword. Normally it means one of them is wasted, but it can have more serious consequences, such as neither of them ranking, or a poor impression for the user.
With SEOcrawl's Cannibalizations feature you can identify which urls are trying to rank for the same keywords and should be optimized independently.

Technical SEO
Within this section of our SEO checklist we have added some points that require more advanced knowledge, but that are a fundamental part once we have the rest of the points well optimized.

31. Rendering
With Search Console we can check whether our pages render correctly, i.e. whether crawlers are having any difficulty loading and understanding the content of our URLs. It is very common, especially when we start using JS-based technologies, for crawlers to run into difficulties when rendering the content of a page.
Using Search Console, tools like Screaming Frog or software like SEOcrawl (Crawler function), we can get a comparison between the code and what Google actually sees.

32. Rich Snippets
Rich Snippets are rich code snippets, based on Schema.org data markup, that appear prominently in search results.
Not all URLs can take advantage of Rich Snippets, but those that can should not miss out on their potential benefits.
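As an example, a product page could carry JSON-LD markup like this sketch (product details are illustrative):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "Product",
  "name": "Trail running shoes",
  "image": "https://www.miweb.com/img/shoes.jpg",
  "offers": {
    "@type": "Offer",
    "price": "79.95",
    "priceCurrency": "EUR",
    "availability": "https://schema.org/InStock"
  }
}
</script>
```

This is the kind of markup that can surface price and availability directly in the result; Google's Rich Results Test will tell you whether the page qualifies.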
33. Open Graph tags
Social networks have popularized Open Graph tags, which are used to identify which elements can be shared on these platforms. That is, how we want them to be seen once we share content on social networks. You can also check them from the SEOcrawl Crawler functionality.
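The basic Open Graph properties go in the `<head>`; a sketch for a hypothetical blog post:

```html
<meta property="og:title" content="Andalucia in 2 days">
<meta property="og:description" content="The best 2-day itinerary through Andalucia.">
<meta property="og:image" content="https://www.miweb.com/img/andalucia.jpg">
<meta property="og:url" content="https://www.miweb.com/blog/travels/andalucia-2-days">
<meta property="og:type" content="article">
```

These tags control the title, description and preview image a post shows when shared; without them, each platform guesses from the page content.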

34. Server Logs Analysis
Technical SEO goes all the way to the server. Log analysis allows us to read what really happens on our website, i.e. what exactly Google spiders are doing on our site, thus discovering problems of crawl budget, thin content, crawling, etc.
Log analysis is a very important part: crawling programs give us a simulation of what happens on our website, but only by contrasting it with the logs do we get the full picture of our site.
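To make the idea concrete, a minimal sketch of log analysis: given a few lines in Apache "combined" format (the sample lines below are fabricated), count which status codes Googlebot is receiving.

```python
import re
from collections import Counter

# Fabricated sample lines in Apache "combined" log format
log_lines = [
    '66.249.66.1 - - [10/Oct/2023:10:00:01 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '66.249.66.1 - - [10/Oct/2023:10:00:02 +0000] "GET /old-page HTTP/1.1" 404 320 "-" "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
    '83.45.12.9 - - [10/Oct/2023:10:00:03 +0000] "GET /blog/ HTTP/1.1" 200 5120 "-" "Mozilla/5.0 (Windows NT 10.0)"',
]

# Extract request path, status code, and user agent from each line
pattern = re.compile(
    r'"(?:GET|POST) (?P<path>\S+) HTTP/[\d.]+" (?P<status>\d{3}) .*"(?P<agent>[^"]*)"$'
)

googlebot_status = Counter()
for line in log_lines:
    m = pattern.search(line)
    if m and "Googlebot" in m.group("agent"):
        googlebot_status[m.group("status")] += 1

print(googlebot_status)  # status codes Googlebot is actually hitting
```

A spike of 404s or 301s in this count is exactly the kind of crawl-budget problem the section describes; note that serious analysis should also verify Googlebot's IPs, since the user agent can be spoofed.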
35. WPO
Web Performance Optimization basically consists of analyzing all the elements that affect the loading time of a website and optimizing them for maximum efficiency.
Google's PageSpeed Insights is a free and very detailed tool that can help you in this important technical SEO task. We recommend contrasting its results with those of another similar tool, such as GTMetrix, to detect all the points of speed loss.

SEO Content
Content is king for Google and, consequently, also for its search and indexing robot. That's why you should optimize the content of all your pages as well as possible if you want to beat the competition.
When analyzing your website, confirm that the content meets these requirements.

36. The text is structured in paragraphs
Readability is an **important SEO factor**. Structure the text of your pages in paragraphs so that it is attractive and comfortable for the user.
37. The main question is answered at the beginning
It is not only an optimal resource in terms of information: Google values it to the point of having created "position zero", which rewards pages that answer questions directly and concisely. In addition, always remember to validate and cite your figures and data so that everything is as credible and trustworthy as possible.
38. Bold type is used
Bold type is still very useful, since it allows users to quickly "scan" the content and jump to the section that really interests them. A text with well-used bold is a useful text and, therefore, has a better chance of ranking.
39. Questions are answered
Google's “position zero” is also designed for pages that answer questions that users have. If this answer is placed at the top of the page, you have more chances to reach it.

40. Multimedia content is used
Without loading the weight of the web, you should take advantage of multimedia content, to enrich the rest of the page and have more optimized elements.

41. Thin Content
The concept of Thin Content refers to low quality content, usually due to the absence of sufficient textual content or because, in fact, what there is has very little informative value.
Identifying and optimizing these pages is key, as they can generate penalization problems if Google considers that you have an excess of them.
Off page SEO
In the analysis of your website's SEO positioning, off-page SEO covers the influential factors that occur outside your website. Even so, you have many optimization options you can take advantage of.

42. Toxic inbound links
You can take advantage of many tools to analyze inbound links to your website, such as the official Google Search Console, or the best known in this branch such as Ahrefs, SemRush, MOZ, Majestic SEO...
It is important to have a considerable number of inbound links, but in these cases quality matters more than quantity.
Toxic links derive from websites with unrelated or even inappropriate content, which can negatively affect your SEO positioning. For this reason, you should invalidate them as soon as you detect them.
Whether they are unwanted toxic links or come from a failed SEO strategy, you should keep in mind that any link to your website must come from a site related to your subject matter, valid for search engines and if possible from a site with authority.
43. Is there any disavow uploaded?
By means of the disavow file that is uploaded to Google, we indicate which of the incoming links we do not want the search engine to take into account. It is important to know that when you upload a new disavow file, the previous one is replaced, so always check whether an earlier file exists and update it, rather than overwrite it.
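The disavow file is a plain text file in the format Google accepts: `domain:` lines to disavow whole domains, full URLs for single pages, and `#` for comments. A sketch with placeholder domains:

```text
# disavow.txt — keep the previous entries when updating, then add new ones

# Disavow a whole domain
domain:spammy-links.example

# Disavow a single URL
http://toxic.example/page-with-link.html
```

The file is uploaded through Google's Disavow Links tool, and disavowing whole domains is generally more thorough than listing individual URLs.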
44. Anchor text profile
You must create a natural anchor text profile suited to your link building strategy. It should cover different terms and include related keywords, avoiding a focus solely on the brand + main keyword combination. If the profile does not look natural and Google suspects manipulative techniques, it may end up invalidating the links you get.
45. Natural growth of inbound links
Our link building strategy must look "natural" in the eyes of Google, which means avoiding abrupt growth in the number of referring domains. If, for example, we create a website from scratch, it is not normal for it to receive a high number of links as soon as it goes live. Rushing to increase our authority in a short time can end up being a double-edged sword. It is better to grow steadily, without sudden jumps.
46. Checking the most used metrics (DA/PA, TF/CF, DR/UR)
In SEO positioning, the metrics provided by the best-known tools worldwide, such as Moz, Majestic SEO and Ahrefs, are usually used as a reference.
DR/UR, DA/PA and TF/CF are the metrics belonging to each of them:
- DR (Ahrefs): Domain Rating. Measures the quality of a website's link profile through the quality and quantity of its external links. It is currently considered one of the most reliable ways to measure a link profile. It goes from 0 to 100.
- UR (Ahrefs): URL Rating. Where DR refers to the authority of the whole domain, UR refers to a specific URL. It goes from 0 to 100.
- DA (Moz): "Domain Authority", Moz's own metric, measures the authority of a website through different factors, not only the quality and quantity of links. It goes from 0 to 100.
- PA (Moz): "Page Authority", another Moz metric, indicates the authority of a specific page.
- TF/CF (Majestic): Trust Flow and Citation Flow, another way to measure the quantity (CF) and quality (TF) of links. Remember that you should always look for links on websites where the TF is higher than the CF.
47. Has the profile of competitors been analyzed?
The off page SEO comes into play in the analysis of the competitors' profile. You must maintain this monitoring and evolve as your rivals do in order to look for niches, opportunities and strategies with which to overcome them.
48. Correct distribution of links to different pages
Link building is the most important off-page SEO strategy and also the most difficult. You must seek inbound links from third-party sites to different pages, not always to the same one, or Google will detect over-optimization.
Bonus: monitor your off page strategy with SEOcrawl's Links function.
In addition to working on growing your link profile, you should also take care to monitor the links you've earned, to make sure they don't get lost or change status between follow and nofollow, for example. So that you don't waste time in this monitoring, SEOcrawl offers you the Link Monitoring function, which takes care of this work for you.

With this complete SEO checklist you will be able to analyze your website and keep it in perfect condition to climb the Google rankings. Analyze each point in detail, develop corrective measures where appropriate and review them periodically.
If you have any questions or suggestions, please send us a message or leave a comment!

CEO & Co-Founder at SEOcrawl
I first encountered SEO in 2011 and since then it has been a huge part of my life — something I am completely passionate about. It is a pleasure to be the CEO of SEOcrawl, an innovative all-in-one SEO software that is changing the way companies manage their SEO strategies.


