Using SEO Tools Correctly: Not Every Problem is Truly a Problem

17.06.2024 | SEO

2,561 missing alt attributes, 978 duplicate title tags, 722 pages without H2 headers, 11,243 pages lacking meta descriptions.

Is this the end of my website’s visibility in search results?

No, not at all. Not every “problem” flagged by online marketing tools is a genuine issue, and not all data is equally relevant. When you understand how SEO tools work and how to interpret their data, they become powerful allies that significantly aid our efforts. However, if metrics and numbers are misunderstood or taken out of context, they can lead to wrong conclusions, wasting time and money.

In this article, we’ll explore some of the most common misconceptions about SEO tools, how to use them effectively, and how to accurately interpret the data they provide.

SEO Tools and Their Data

SEO tools excel at analyzing various aspects of a website and presenting the results in numerical form. These programs are unmatched in terms of both precision and efficiency. However, they fall short in interpreting these results accurately. It is our responsibility to determine the significance of the numbers and draw the correct conclusions from them.

While it is the tools’ job to generate data, it is our job to interpret it.

Without context-specific interpretation, the numbers are meaningless and disconnected from reality. Only those who understand the numbers and can evaluate their relevance can make the right decisions for necessary optimization measures. This article will focus on how to derive the right actions from these numbers.

Don’t Panic: Not Every Problem is Truly a Problem

The primary function of SEO tools is to identify errors. Therefore, they will find errors on almost every website. This can quickly unsettle beginners or website owners without a solid understanding of SEO, as they encounter numerous “warnings” and “errors” from various SEO tools. But how significant are these error messages really? Before panicking, take a deep breath and ask yourself:

  1. Which errors are relevant and which are irrelevant?
  2. Which problems must be resolved, which are merely cosmetic, and which are not problems at all?
  3. What actually impacts rankings and user experience?
  4. Lastly, how much effort is required to fix the error? Is it worth it, or is the effort disproportionate to the effect?

Warnings in Google Search Console

Let’s first look at arguably the most important tool available to webmasters and SEOs: Google Search Console. It is the only tool that provides data directly from Google. We recommend that everyone who runs or builds a website verify it in Google Search Console to access this valuable data.

Google Search Console provides information about the indexing status of a website, rankings, clicks from Google Search, and Core Web Vitals. Naturally, the GSC also issues warnings and errors.

Some people get alarmed by every warning, while others simply ignore the messages and delete them from their inbox. Reacting to every warning causes unnecessary stress, while ignoring these messages can result in neglecting issues that may significantly impact the website’s performance.

Among the most common warnings from Google Search Console are:

Non-Indexed Pages

The GSC provides information on which pages are not indexed and why. Whether this is a problem or not depends on the context and must be assessed individually. Most of the time, there is no need to worry, but sometimes the error messages point to serious issues.

Is it a problem if 114,000 URLs are excluded from indexing via “noindex”?

Problem: The warning might indicate that a “noindex” tag was accidentally added without your knowledge.

No Problem: If the pages were excluded from indexing for a good reason, such as preventing Google from wasting resources on crawling and indexing unimportant pages, then everything is fine.
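Whether the exclusion is intentional can be verified directly in the page source. The sketch below (using only Python’s standard library; the sample HTML is made up for illustration) checks whether a page carries a “noindex” robots meta tag:

```python
from html.parser import HTMLParser

class RobotsMetaParser(HTMLParser):
    """Collects the content of every <meta name="robots"> tag."""
    def __init__(self):
        super().__init__()
        self.robots_directives = []

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            attrs = dict(attrs)
            if attrs.get("name", "").lower() == "robots":
                self.robots_directives.append(attrs.get("content", "").lower())

def has_noindex(html: str) -> bool:
    parser = RobotsMetaParser()
    parser.feed(html)
    return any("noindex" in d for d in parser.robots_directives)

# A page deliberately excluded from the index:
page = '<html><head><meta name="robots" content="noindex, follow"></head></html>'
print(has_noindex(page))  # True
```

If the tag is there on purpose (filter pages, internal search results, thin tag archives), the GSC warning is working as intended. If it shows up on a key landing page, you have found the accident the warning was hinting at.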

Are 404 Errors a Problem?

Problem: If important URLs are affected—such as landing pages or articles with many backlinks—significant SEO value can be lost. 404 errors due to faulty internal links are also problematic for users and the overall quality of the website.

No Problem: If you’ve just deleted 500 products from your shop or cleaned out old, unimportant blog posts, numerous 404 errors will naturally appear. After all, the pages were deleted, and a 404 code from the server is the correct response—provided there are no other products that meet the same user intent that you could redirect to.
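The triage logic described above can be sketched as a small decision function. The rules and the backlink check are illustrative assumptions, not official guidance:

```python
def handle_deleted_url(url, backlink_count, equivalent_url=None):
    """Decide how to respond to a request for a deleted page.

    The decision rules here are an illustrative sketch of the triage
    described in the article, not a definitive policy.
    """
    if equivalent_url:
        # A product/article serving the same user intent exists: 301 there.
        return (301, equivalent_url)
    if backlink_count > 0:
        # Link equity is at stake but no equivalent target exists:
        # flag for manual review instead of silently returning 404.
        return ("review", url)
    # Deleted on purpose, nothing links here: 404 is the correct answer.
    return (404, None)

print(handle_deleted_url("/shop/blue-widget", 0))  # (404, None)
print(handle_deleted_url("/shop/blue-widget", 0, "/shop/widgets"))
```

The point is the order of the checks: only a URL with neither an equivalent page nor incoming links should quietly stay a 404.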

Core Web Vitals

It is well known that loading time is a crucial factor — both for users and for Google. The term “loading time” was long vaguely defined and therefore difficult to measure, which is why Google introduced the “Core Web Vitals” to provide webmasters with clear data on a website’s loading time. These metrics are visible in both GSC and PageSpeed Insights. Simply put, they focus on three key factors:

  1. How quickly does a page load? Specifically, how long does it take for the most important and largest element on a page to be displayed so that the user no longer sees a white screen? This metric is called LCP (Largest Contentful Paint).
  2. How quickly does a page respond to user actions? In other words, how fast does something happen when users click a link, fill out a form, or perform a search? As of March 2024, this metric, known as INP (Interaction to Next Paint), has replaced the First Input Delay (FID).
  3. Lastly, visual stability is measured. This is measured by CLS (Cumulative Layout Shift), which becomes noticeable when you try to click on something, but the element shifts at the last moment, causing you to unintentionally click elsewhere.

For each of these KPIs, GSC provides three assessments: Good, Needs Improvement, and Poor. PageSpeed Insights additionally gives an overall performance score from 0 to 100, with 100 being the best possible score.
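These three verdicts map to thresholds that Google has published for each metric (LCP in seconds, INP in milliseconds, CLS unitless). A minimal classifier makes the boundaries explicit:

```python
# Google's published Core Web Vitals thresholds:
# Good if value <= first number, Poor if value > second number,
# Needs Improvement in between.
THRESHOLDS = {
    "LCP": (2.5, 4.0),   # seconds until the largest element is painted
    "INP": (200, 500),   # milliseconds from interaction to next paint
    "CLS": (0.1, 0.25),  # unitless cumulative layout-shift score
}

def assess(metric: str, value: float) -> str:
    good, poor = THRESHOLDS[metric]
    if value <= good:
        return "Good"
    if value <= poor:
        return "Needs Improvement"
    return "Poor"

print(assess("LCP", 9.8))  # Poor
print(assess("INP", 180))  # Good
```

Note that these verdicts are based on field data from real users (the 75th percentile of visits), so lab measurements on your own fast machine can look better than what GSC reports.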

While a higher score is obviously better, the question remains: How much better? And is the effort worth it? In other words, does the optimization impact the rankings?


As is often the case with SEO, various factors come into play:

  • Who is the target audience, where are they located, and what devices are they using? If the end users are in metropolitan areas and primarily use high-end devices, an average score will suffice. Users will hardly notice the difference. However, if the target audience is internationally distributed, located in areas with poor internet coverage, and using outdated devices, much more effort is required to provide them with a decent user experience.
  • How fast is the competition? If you are the sole provider of high-quality content on a specific topic, you will rank well even if your loading times are all in the red zone. Loading time does not trump content!


Let’s examine these criteria using the example of an informational search for seal species in the UK:

For this keyword, a six-year-old blog post ranks first, featuring excellent content, 18 great self-made photos, and well-researched informative text.

However, the page’s loading time is a disaster. The Largest Contentful Paint (LCP) takes almost 10 seconds, which is not surprising since the first image is over 8 MB in size.

Despite this, the page ranks number one. And this is despite the competition having nearly perfect PageSpeed scores and much higher domain authority, such as the WWF page on seals.

This is not to say that a completely unoptimized page has no impact on rankings. But it demonstrates that for Google, the quality and relevance of a page’s content still come first. If you have by far the best and most relevant content, you can rank well even with a page that is poor from an SEO perspective.

But beware: it doesn’t work the other way around! Just because a page loads quickly does not mean it will rank well.

This means that while loading times are a ranking factor (although no one knows exactly how this works or how much weight it carries), it is fundamentally about optimizing the user experience.

If you get lost in the numbers and optimize only for a better score, you will most likely waste resources and achieve only moderate success, if any. However, if you analyze the numbers with the goal of improving the user experience, you save resources and achieve multiple improvements at once: better UX, higher conversion rates, and better rankings.

Our tip: Focus on UX, not just better numbers.

Visibility Index

Some SEO tools, such as Sistrix or Xovi, calculate an overall visibility index for websites. This index gives a good overview of a website’s ranking development over a certain period of time. At a glance, you can see whether the website is generally strong and if it has been affected by various Google updates.

But is the visibility index a KPI that should hold special significance for your own website?

Let’s take a closer look at what’s behind it: The Sistrix Visibility Index is based on a set of one million commonly used keywords. The tool checks how many of these keywords a specific domain ranks for in the top 100. The resulting value depends on several factors and is subject to a number of uncertainties:

  • The choice of keywords: Which and how many keywords does the tool consider? Does the number or selection of keywords change? In the case of Sistrix, yes, as the tool is constantly adding new keywords in response to new trends. Xovi does not do this and remains faithful to a constant list.
  • Size of the niche: How big is the niche in which a particular site is located? The broader the topic, the higher the calculated visibility. A site with a lot of content on popular topics can have very high visibility. On the other hand, a site on a niche topic that only ranks for 5 to 10 keywords will have a low – or meaningless – value.
  • Search volume: The tools take search volume into account when calculating visibility – so not all keywords are weighted equally. If the search volume for a set of keywords changes, this can quickly lead to fluctuations in visibility. The same principle also leads to distortions between well-known and unknown brands. Both will be ranked number 1 for their brand name, but this ranking will have a much stronger effect for a well-known brand with high search volume than for an unknown brand.
  • CTR – Click-through rate: The CTR also affects the calculated visibility. Changes to the search results page, such as reordering elements or adding a featured snippet, can have a significant impact on visibility.
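Put together, these factors amount to a weighted sum: each top-100 ranking contributes roughly its expected click share times the keyword’s search volume. The toy model below illustrates the brand distortion described above; the CTR curve is invented for illustration and is not any vendor’s actual formula:

```python
# Toy model of a visibility index. The click-through rates per position
# are illustrative assumptions, not Sistrix's or Xovi's real curve.
CTR_BY_POSITION = {1: 0.30, 2: 0.15, 3: 0.10, 4: 0.07, 5: 0.05}

def visibility(rankings):
    """rankings: list of (position, monthly_search_volume) tuples."""
    return sum(
        CTR_BY_POSITION.get(pos, 0.01) * volume
        for pos, volume in rankings
        if pos <= 100  # only top-100 rankings count
    )

big_brand = visibility([(1, 50_000)])   # well-known brand name, rank 1
small_brand = visibility([(1, 200)])    # unknown brand, same rank 1
print(big_brand > small_brand)  # True: same position, very different score
```

The same number-one ranking produces wildly different scores depending on search volume, which is exactly why comparing the raw index across brands of different sizes is misleading.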


So if your site is in a small niche and you are targeting rankings for a few important keywords, the Visibility Index is almost meaningless. In this case, it is essential to feed your own keywords into the tool and track rankings specifically for these important keywords.

On the other hand, if you are playing in a broad field and want to rank for thousands of keywords, not all of which can be tracked via lists, the visibility index is definitely a meaningful value.

However, these tools are particularly valuable for agencies, as you can gain insight into the SEO performance of websites without needing access to their analytics or Search Console data. You can compare several market players using the same metrics. As a bonus, you can also find useful historical data for all of them.

SEO Score & Analysis from (Free) Tools

The SEO score is often used as a metric to evaluate websites, but this value should be viewed with nuance. By itself, the SEO score is largely meaningless. It is only significant in conjunction with other metrics (e.g., the previous year’s value) or when compared to the numbers of a direct competitor.

In addition to the SEO score, these tools typically provide a list of suggested to-dos, some marked as “very important.” With exclamation points and red markings, they can create a sense of urgency. Is disaster imminent? Will the website plummet in Google rankings?

Take it easy! While the statements are generally correct, they lack context and the necessary details.

Let’s take a look at some of the most common error messages:

2,561 Missing Alt Attributes

Whether this error message is significant depends on whether you want your images to rank or not. Ask yourself: Are my images relevant? Is accessibility important for my users? If not, there is no compelling reason to go through the effort of writing alt attributes.
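If you do decide the images matter, it is worth separating genuinely missing alt attributes from deliberately empty ones: an empty `alt=""` is valid HTML for decorative images and should not be “fixed.” A minimal check using only the standard library (sample HTML invented for illustration):

```python
from html.parser import HTMLParser

class MissingAltFinder(HTMLParser):
    """Collects the src of every <img> that has no alt attribute at all.

    An empty alt="" is valid HTML for decorative images, so it is
    deliberately NOT flagged here.
    """
    def __init__(self):
        super().__init__()
        self.missing = []

    def handle_starttag(self, tag, attrs):
        if tag == "img":
            attrs = dict(attrs)
            if "alt" not in attrs:
                self.missing.append(attrs.get("src", "?"))

finder = MissingAltFinder()
finder.feed(
    '<img src="/hero.jpg" alt="Grey seal on a beach">'
    '<img src="/spacer.gif" alt="">'
    '<img src="/chart.png">'
)
print(finder.missing)  # ['/chart.png']
```

A pass like this often shrinks “2,561 missing alt attributes” to a much shorter list of images that actually deserve a description.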

978 Duplicate Title Tags

This is a problem if all your important landing pages have the same title tag. However, in 99% of cases, the warning is for pages that do not have a unique title. The legal page for Austria has the same title tag as the legal page for Germany? No problem. Minor versions of product pages, all pointing to the main version via a canonical tag, have the same title tag? No problem.
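The canonical-tag exception above is easy to automate: only pages that are their own canonical should count as real duplicates. A sketch (the page-dict shape is an assumption for illustration; a real crawl export would need mapping into this form):

```python
from collections import defaultdict

def real_title_duplicates(pages):
    """pages: dicts with 'url', 'title', and an optional 'canonical' key.

    Variants whose canonical points to a different URL are ignored,
    so only self-canonical pages can form a duplicate group.
    """
    by_title = defaultdict(list)
    for p in pages:
        canonical = p.get("canonical") or p["url"]
        if canonical == p["url"]:  # skip canonicalized variants
            by_title[p["title"]].append(p["url"])
    return {t: urls for t, urls in by_title.items() if len(urls) > 1}

pages = [
    {"url": "/shoe", "title": "Running Shoe"},
    {"url": "/shoe?size=42", "title": "Running Shoe", "canonical": "/shoe"},
    {"url": "/a", "title": "Landing Page"},
    {"url": "/b", "title": "Landing Page"},
]
print(real_title_duplicates(pages))  # {'Landing Page': ['/a', '/b']}
```

Filtered this way, a scary “978 duplicates” often collapses to a handful of landing pages that genuinely need distinct titles.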

722 Pages Without H2

Ideally, every page should have a well-structured hierarchy of headings with an H1 containing the main keyword, followed by several H2s and, if needed, H3 to H5. However, the tool does not distinguish whether the H2 is missing on an important landing page or on a page that you don’t intend to rank. It’s your job to determine the significance of each page and weigh whether the effort is worth it.

11,243 Pages Without Meta Descriptions

Not the end of the world: Firstly, meta descriptions are not a ranking factor. Secondly, Google generates a snippet automatically if a page doesn’t provide a meta description. Sometimes, the effort isn’t worth it, and you can safely leave it to Google to handle.


This almost sounds like you could disregard all warnings without consequences. However, that’s not the point. The key is that evaluations from – mostly free – SEO tools are often not very nuanced, and you shouldn’t let the many warnings unsettle you. It’s advisable to acknowledge the warnings but not to panic immediately.

Focus on the Basics

Our recommendation is: Don’t be unsettled by the data and don’t chase every trend or blindly aim to improve every score. Take your time to analyze which data is relevant to you and focus on the basics:

  • Solid Keyword Strategy: What are the relevant search terms for your target audience?
  • Meaningful Structure: Create a sensible information architecture and organize the website well.
  • Quality Content: Content generated 100% by AI like ChatGPT will only rank temporarily and won’t satisfy users in the long run.
  • Build Authority: Earn trust through demonstrated expertise, quality backlinks, and consistently useful content.


This is how you win the race, not by increasing your SEO score by 5 points.