🔥 The SEOs Diners Club - Issue #108 - Weekly SEO Tips & News

Here are the weekly SEO insights for the SEOs Diners Club members.

Welcome to another insightful week with The SEOs Diners Club newsletter. Leveraging over a decade of SEO expertise, I aim to distill the week's most critical developments into a concise read that will take less than seven minutes. Let's dive into the insights that matter.

Investigate Traffic Drops With Google Search Console

A guide to diagnosing and fixing traffic drops on your website with Google Search Console.

On March 5, 2024, Google rolled out its March 2024 core update and spam update simultaneously. These updates caused significant fluctuations in organic traffic for many websites.

If your website has been experiencing a decrease in traffic lately, these updates are a likely culprit, but it's worth confirming the cause with data rather than assuming it.

This week, we'll look at three basic ways to analyze traffic drops using Google Search Console. With these methods, you can diagnose the cause of a traffic drop and make the necessary corrections.

First, if you haven't already, I recommend reading Google's own guide to debugging drops in Search traffic.

After reviewing that guide, you can investigate the possible causes of a traffic drop with the following methods (a small API sketch follows the list below):

1. Performance Report:

  • Period Selection: Select a period starting from March 5, 2024.

  • Comparison: Compare the decline period to a period before the update.

  • Problem Identification: Examine metrics (clicks, impressions, click-through rate) that may be associated with the drop in traffic.

  • Data Visualization: Track trends and anomalies through graphs and tables.

2. Search Queries Report:

  • Queries Experiencing Traffic Loss: Identify queries experiencing a decrease.

  • Ranking Changes: Check out the changes in your rankings after the update.

  • Click-Through Rate (CTR): Is the decline due to a drop in CTR?

  • Competitive Analysis: Examine the change in your competitors' rankings.

3. Pages Report:

  • Pages Experiencing Traffic Loss: Identify the pages experiencing a decrease.

  • Impressions: Is the decrease in impressions due to a change in ranking?

  • Click-Through Rate (CTR): Is there a decrease in the CTR of the pages?

  • Technical Issues: Check for technical issues that may have caused the drop.
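For those who prefer to pull the numbers programmatically, here is a minimal sketch of the same before/after comparison using the Search Console API. It assumes the google-api-python-client package and a service-account key that has been granted access to the property; the property URL, key file name, and date ranges are placeholders you would adapt to your own site.

```python
# Minimal before/after comparison of Search Console data around the March 5 update.
# Placeholders: SITE (your property) and KEY_FILE (service-account credentials).
from google.oauth2 import service_account
from googleapiclient.discovery import build

SITE = "sc-domain:example.com"     # placeholder: your Search Console property
KEY_FILE = "service-account.json"  # placeholder: credentials with access to it

creds = service_account.Credentials.from_service_account_file(
    KEY_FILE, scopes=["https://www.googleapis.com/auth/webmasters.readonly"]
)
gsc = build("searchconsole", "v1", credentials=creds)

def totals(start_date, end_date, dimension):
    """Return {query_or_page: (clicks, impressions)} for one date range."""
    body = {
        "startDate": start_date,
        "endDate": end_date,
        "dimensions": [dimension],
        "rowLimit": 250,
    }
    rows = gsc.searchanalytics().query(siteUrl=SITE, body=body).execute().get("rows", [])
    return {row["keys"][0]: (row["clicks"], row["impressions"]) for row in rows}

# Compare a window before the March 5 update with the window after it.
before = totals("2024-02-05", "2024-03-04", "query")
after = totals("2024-03-05", "2024-04-02", "query")

# Queries that lost the most clicks are the first candidates to investigate.
losses = sorted(
    ((q, before[q][0] - after.get(q, (0, 0))[0]) for q in before),
    key=lambda item: item[1],
    reverse=True,
)
for query, lost in losses[:20]:
    if lost > 0:
        print(f"{query}: ~{lost:.0f} fewer clicks than before the update")
```

Switching the dimension from "query" to "page" gives you the same comparison for the Pages report, and the response rows also carry CTR and position values if you want to answer the CTR and ranking questions above.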

I hope this guide helps you analyze traffic drops in Google Search Console and understand how the March 5 updates affected your website.

Google Confirms: High-Quality Content Is Crawled More Often

Google confirmed that high-quality and user-focused content is crawled and indexed more frequently.

The Google Search team made some interesting statements in this week's episode of the Search Off The Record podcast. Google's spokespeople explained that high-quality, user-focused content is crawled and indexed more frequently. This means webmasters should focus on high-quality content to increase their SEO success.

Google's spokespeople also cleared up misconceptions about crawl budget and explained that crawl priority is largely determined by content quality. My advice to site owners is to keep improving content quality and user experience so that their valuable pages are discovered, crawled, and indexed.

Google's prioritization of high-quality content underlines that webmasters should focus on content quality. Creating user-oriented, informative content makes it more likely that your website is crawled frequently and ranks well in search engines.

John Mueller's Interesting Robots.txt Adventure

Google spokesperson John Mueller's blog was removed from the Google index. Let's take a look at how this amusing incident unfolded.

I want to tell you about an interesting development regarding John Mueller's blog. A Reddit user suggested that the robots.txt file on Mueller's blog was faulty and that the site had been dropped from Google's index because it was hit by the Helpful Content update. The claim made a significant impact on the SEO community.

However, John Mueller denied this claim in a statement on Twitter. Mueller clarified that the error in the robots.txt file was unrelated to the Helpful Content update and that his blog was still being indexed.

So what happened?

There was a small error in the robots.txt file, and it was preventing Googlebot from crawling some pages of the blog. After the error was fixed, the blog's pages were indexed again. Who would add a directive that blocks their own robots.txt from being crawled? Now we all know who 🙂

The lesson from this incident is just how important the robots.txt file is for SEO. Even a tiny mistake in robots.txt can cause big problems, so it's essential to check your robots.txt file regularly and fix any errors.
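If you want a quick way to sanity-check your own robots.txt rules, here is a minimal sketch using only Python's standard library; the domain and paths are placeholders. Note that Python's parser does not replicate Google's implementation exactly, so the robots.txt report in Search Console remains the authoritative check.

```python
# Quick, local sanity check of robots.txt rules using only the standard library.
# "https://example.com" and the sample paths are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser()
parser.set_url("https://example.com/robots.txt")
parser.read()  # fetches and parses the live robots.txt file

# Verify that the pages you care about are actually crawlable by Googlebot.
for path in ["/", "/blog/", "/blog/some-post/", "/robots.txt"]:
    allowed = parser.can_fetch("Googlebot", f"https://example.com{path}")
    print(f"Googlebot {'may' if allowed else 'may NOT'} fetch {path}")
```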

New INP Metric Added to the Google Search Console Core Web Vitals Report

Google Search Console added the INP (Interaction to Next Paint) metric to the Core Web Vitals report. What does this new metric mean, and how will it affect SEO?

I want to tell you about an important update made in Google Search Console. Google added the INP (Interaction to Next Paint) metric to the Core Web Vitals report.

INP measures the time between a user's interaction with an element on a web page (for example, clicking a button) and the moment the browser paints the visual response to that interaction. In other words, it indicates how fast and responsive a web page feels.

Google began using the INP metric as an official part of Core Web Vitals this month, replacing First Input Delay (FID). Therefore, website owners and SEO experts should monitor this new metric closely and optimize their websites accordingly.
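If you want to keep an eye on INP outside of Search Console, here is a small sketch for pulling real-user (field) INP data from the Chrome UX Report API. It assumes the requests package and a Google API key with the Chrome UX Report API enabled; the URL and key are placeholders, and the metric and field names follow the public CrUX API schema.

```python
# Pull field INP data for a URL from the Chrome UX Report (CrUX) API.
# Placeholders: API_KEY and the page URL. The record is only returned
# when CrUX has enough real-user data for that URL.
import requests

API_KEY = "YOUR_API_KEY"
ENDPOINT = f"https://chromeuxreport.googleapis.com/v1/records:queryRecord?key={API_KEY}"

payload = {
    "url": "https://example.com/",
    "formFactor": "PHONE",
    "metrics": ["interaction_to_next_paint"],
}

resp = requests.post(ENDPOINT, json=payload, timeout=30)
resp.raise_for_status()
inp = resp.json()["record"]["metrics"]["interaction_to_next_paint"]

# p75 is the value Google uses to assess Core Web Vitals; <= 200 ms is "good".
print("75th percentile INP (ms):", inp["percentiles"]["p75"])
for bucket in inp["histogram"]:
    print(bucket)
```

Google evaluates Core Web Vitals at the 75th percentile of real-user data, and an INP of 200 ms or less is considered "good".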

Here are some things that can be done to optimize the INP metric:

  • Reducing the size and complexity of JavaScript

  • Optimizing images

  • Optimizing the critical rendering path

  • Leveraging browser caching

Google Explains How It Uses Page Experience and Core Web Vitals as Ranking Signals

Google has clarified how page experience and Core Web Vitals are used as ranking signals. How will this change affect SEO?

Recently, Google published a document clarifying how page experience and Core Web Vitals are used as ranking signals. The document drew a great deal of interest in the SEO community.

According to the document, Core Web Vitals are used as a ranking signal on a limited basis. Other page experience signals are not used directly as ranking signals.

This means Core Web Vitals still matter. However, they alone are not a determining factor for ranking. To rank higher, websites must be user-friendly, fast, and mobile-compatible.

Some key points included in the document are:

  • Core Web Vitals consist of three metrics: Largest Contentful Paint (LCP), Interaction to Next Paint (INP), and Cumulative Layout Shift (CLS).

  • These metrics measure how good the user experience of a web page is.

  • Good Core Web Vitals scores can help a web page rank higher in search.

  • However, Core Web Vitals alone are not a determining factor for ranking.

How will this change affect SEO?

This change will require SEO experts to take a more comprehensive perspective. Ranking signals are more than just Core Web Vitals. Many factors must be considered to create a user-friendly website.

Free JavaScript SEO Analysis Tool

You can troubleshoot your JavaScript-related SEO problems with JetOctopus's free JavaScript SEO analysis tool.

JetOctopus' free JavaScript SEO tool lets you analyze the effects of your website's JavaScript on SEO. With this tool, you can discover common JavaScript SEO issues such as JavaScript load time, missing content, technical SEO errors, page rendering errors, JavaScript redirects, and broken SSR.
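To make the idea concrete, here is a rough sketch of the kind of check such tools perform: comparing the raw HTML with the JavaScript-rendered DOM to find content that only appears after rendering. This is not how JetOctopus works internally, just an illustration; it assumes requests and Playwright are installed (pip install requests playwright, then playwright install chromium), and the URL and test phrase are placeholders.

```python
# Does important content exist in the raw HTML, or only after JavaScript rendering?
# URL and PHRASE are placeholders; this is an illustration, not JetOctopus's method.
import requests
from playwright.sync_api import sync_playwright

URL = "https://example.com/"
PHRASE = "Key product description"   # content that must be visible to crawlers

raw_html = requests.get(URL, timeout=30).text

with sync_playwright() as p:
    browser = p.chromium.launch()
    page = browser.new_page()
    page.goto(URL, wait_until="networkidle")
    rendered_html = page.content()
    browser.close()

in_raw = PHRASE in raw_html
in_rendered = PHRASE in rendered_html

if in_rendered and not in_raw:
    print("Phrase appears only after rendering: it depends on JavaScript.")
elif not in_rendered:
    print("Phrase missing even after rendering: possible rendering/SSR problem.")
else:
    print("Phrase is present in the raw HTML: no JavaScript dependency for it.")
```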

This Week's Tool Recommendations

  • WhitePress: WhitePress is a content marketing platform that enables accessible article publishing across 90,571 websites in 30 languages, offering native copywriting and automated global publication services.

  • Cannibalization Explorer Looker Studio Report is a tool for identifying keyword cannibalization and providing insights into how keywords perform across different pages.

  • SurferSEO is a tool that helps you create and optimize content for SEO. It analyzes the top-ranking pages for your target keywords and gives you suggestions on how to improve your content. Surfer AI also has a feature that can generate ready-to-rank articles in minutes.

  • Copilot Pro: Copilot Pro is available for $20/month for cloud-based GenAI models and $27/month with Microsoft 365. It is integrated with Microsoft 365 applications and provides access to GPT-4 Turbo and advanced AI rendering tools.

  • AlsoAsked is a helpful tool that can boost your visibility in the “People Also Ask” section of Google's search engine results pages.

  • Semrush Map Rank Tracker is a game-changing tool for local businesses looking to dominate their niche and attract more customers. I've been using the free version for a while now, and I'm impressed with its ease and effectiveness.

  • ContentShake AI is an intelligent writing tool that combines AI with real-life competitor insights. It guides you from ideation to publishing directly to the blog, generates SEO-friendly articles, and helps you optimize them for organic traffic and engagement.

  • Microsoft Clarity has recently introduced a feature called Copilot, which incorporates generative AI through large language models (LLMs) into the analytics tool. This integration aims to make data more accessible and understandable by leveraging the same OpenAI technology that powers the new Bing and ChatGPT.

  • InLinks is an SEO tool that uses a knowledge graph to optimize content for search engines. It automates internal linking and schema markup, simplifying SEO processes and content strategies.

Subscribe to my newsletter to stay informed about the latest in SEO and digital marketing. Next week, I'll continue to share the newest insights, tactics, and strategies for navigating the ever-evolving digital marketing landscape.

Support My Newsletter:

If you find my content valuable, consider buying me a coffee to support my work. Your contribution will help me continue sharing the latest SEO developments. Let's achieve more together!

Best,

Mert