
Google Unveils “GoogleOther” as a New Web Crawler: Essential Details

Google recently introduced a new web crawler, “GoogleOther,” designed to take some of the load off its primary search index crawler, Googlebot.

According to Google analyst Gary Illyes, the new crawler will handle non-essential tasks like research and development (R&D) crawls, allowing Googlebot to concentrate on its core duty of indexing the web.

This update is expected to streamline and enhance the efficiency of the tech giant’s web crawling operations.

While Illyes assured that the new crawler won’t have a significant impact on websites, it raises the question: What implications does this hold for search engine optimization (SEO) endeavors?

In this blog post, we’ll delve into GoogleOther’s limitations, its potential influence on search engine rankings, and whether businesses should be apprehensive.

A Swift Overview Of Web Crawlers, User Agents, And Googlebot

To comprehensively grasp how GoogleOther updates affect web crawling, it’s crucial to first revisit the fundamentals of web crawlers, Googlebot user agents, and Googlebot’s role in the web crawling process.

Google Web Crawlers and User Agents

Web crawlers, also known as bots or search engine spiders, systematically explore and scan websites by traversing links from one page to another. Search engines use these crawlers to gather information about web pages and determine which results to return for search queries.

To identify themselves to servers, Google web crawlers employ a user agent, a text string included in the request headers sent to the server.

The Googlebot user agent then informs the server about which bot is making the request to crawl the page. This enables website owners to monitor bot activity and restrict Google crawling access if needed.

The server responds with a status code indicating whether the crawler’s request succeeded or was blocked.
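To make this exchange concrete, here’s a minimal Python sketch of how a server might decide a response status code from a crawler’s user agent string. The `respond` function and the `BLOCKED_BOTS` list are hypothetical illustrations; real servers typically enforce this through server configuration or robots.txt rather than application code.

```python
# Hypothetical blocklist of crawler tokens this server refuses to serve.
BLOCKED_BOTS = {"BadBot"}

def respond(user_agent: str) -> int:
    """Return an HTTP status code based on the requesting bot's user agent."""
    if any(bot in user_agent for bot in BLOCKED_BOTS):
        return 403  # crawling not permitted
    return 200      # crawling allowed

print(respond("Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"))  # 200
print(respond("BadBot/1.0"))  # 403
```

The same substring check is how site owners typically spot individual crawlers when reviewing their server logs.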

Googlebot and Rankings

If crawling is permitted, Googlebot scans the web page, including its text, images, and links, and adds it to the index. Indexed pages are then ranked based on relevance, with the highest-ranked pages deemed the most pertinent to a given query.

This search result ranking relies on an algorithm that takes into account various factors like keywords, content, and backlinks from reputable sites.

To enhance a website’s ranking, many businesses enlist the services of a technical SEO consultant for tasks like website optimization or on-page optimization.

Where GoogleOther Fits In

The web crawling process is ongoing, with Googlebot continuously visiting and revisiting websites to ensure the Google search index remains current with the latest information.

However, given the billions of pages to be indexed, one can imagine the resource-intensive nature of this task. Google web crawlers, including Googlebot, must adapt to efficiently handle the escalating volume of data.

With the addition of GoogleOther, Google can relieve some of the burden on Googlebot by assigning non-essential tasks to the new crawler.

Dividing Responsibilities Between Googlebot & GoogleOther

GoogleOther will primarily be used by Google’s product teams for internal crawls that are not part of building the Search index. As Illyes mentioned on LinkedIn:

“We added a new crawler, GoogleOther to our list of crawlers that ultimately will take some strain off of Googlebot. This is a no-op change for you, but it’s interesting nonetheless I reckon.

As we optimize how and what Googlebot crawls, one thing we wanted to ensure is that Googlebot’s crawl jobs are only used internally for building the index that’s used by Search. For this we added a new crawler, GoogleOther, that will replace some of Googlebot’s other jobs like R&D crawls to free up some crawl capacity for Googlebot.”

In essence, GoogleOther will take over various tasks that were traditionally handled by Googlebot, including research and development (R&D) crawls.

GoogleOther’s Constraints And Features

GoogleOther inherits the same infrastructure as Googlebot, meaning it carries the same limitations and features when crawling web pages. This encompasses:

  • Host Load Limits: Bound by the same restrictions on how much load it can generate on a server, preventing it from overwhelming a site’s resources or causing downtime.
  • Robots.txt Restrictions: Abides by the same rules in robots.txt as Googlebot, albeit with its own distinct user agent token. This allows site owners to dictate which parts of their site are crawled and which are not.
  • HTTP Protocol Version: Utilizes the same HTTP version as Googlebot, presently HTTP/1.1 and HTTP/2 (if supported by the site).
  • Fetch Size Limit: Subject to the same page size limit as Googlebot, presently set at 10MB. This prevents large pages from consuming excessive resources, which could hinder the Google crawling process.

As Illyes pointed out, GoogleOther is essentially Googlebot with a different name.

What GoogleOther Signifies For Your SEO Approach: Insights From Experts

Although Google has assured webmasters that the new crawler won’t have a substantial impact on websites, many SEO experts are still pondering its potential effects on site rankings.

It is too early to determine how GoogleOther will impact SEO efforts. Given that GoogleOther is a recent addition, there are no case studies available to indicate how it might influence rankings and traffic.

Google regularly updates its algorithms and crawlers, and these changes can impact search results and website rankings. However, until more information is available about GoogleOther, it’s impossible to predict how it might impact SEO.

When it comes to optimizing your SEO campaign strategy, it is advisable to stick with your current systems while keeping an eye on this new update. However, if your progress has been stagnant or slow, consider exploring new opportunities to rank well and attract more visitors to your site.

One thing is certain: you should continue to focus on creating high-quality content relevant to your target users. This is your best bet for ranking higher in Google SERPs and attracting a larger audience.

In addition to high-quality content, on-page optimization services, such as keyword research, title and meta tags optimization, image optimization, and internal linking can also significantly improve your website’s SEO performance.

These services help search engines better understand your website’s content and context, making indexing and ranking your pages for relevant queries easier.

How To Monitor GoogleOther

If you’re still feeling wary about GoogleOther, here are some steps to incorporate into your SEO campaign strategy to monitor its crawling activities:

  • Review Server Logs: Regularly monitor server logs to identify GoogleOther requests. This helps you understand its crawling behavior and the pages it visits.
  • Keep robots.txt File Updated: Make sure your robots.txt file includes any special instructions for GoogleOther so you can control which parts of your site it crawls.
  • Monitor Google Search Console (GSC) Crawl Stats: Use GSC to track changes in crawl frequency, crawl volume, and the number of indexed pages since GoogleOther was introduced.
  • Track Website Performance: Monitor performance indicators, such as bounce rates, load times, and user engagement, to spot issues that could arise from GoogleOther crawling your pages.

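As a starting point for the log-review step above, here is a small Python sketch that tallies requests per crawler token found in combined-format access log lines. The sample log lines and IP addresses are hypothetical; in practice you would read from your real access log, and you should also verify the requester’s IP address, since user agent strings can be spoofed.

```python
from collections import Counter

# Hypothetical combined-format access log lines; the quoted user agent
# is the last field on each line.
log_lines = [
    '66.249.66.1 - - [10/May/2023:10:00:00 +0000] "GET /pricing HTTP/1.1" 200 5123 "-" '
    '"GoogleOther"',
    '66.249.66.2 - - [10/May/2023:10:00:05 +0000] "GET /blog HTTP/1.1" 200 8211 "-" '
    '"Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)"',
]

def crawler_hits(lines):
    """Count requests per Google crawler token found in the user-agent field."""
    counts = Counter()
    for line in lines:
        ua = line.rsplit('"', 2)[-2]  # second-to-last quote-delimited piece is the UA
        for token in ("GoogleOther", "Googlebot"):
            if token in ua:
                counts[token] += 1
                break  # attribute each request to at most one crawler
    return counts

print(crawler_hits(log_lines))
```

A tally like this, run daily, makes it easy to see whether GoogleOther’s share of your crawl traffic is growing and which sections of the site it visits most.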
Adapt To GoogleOther Changes With Ease

As “GoogleOther” continues to raise questions about its potential impact on SEO, website owners and businesses can take proactive steps to stay ahead. One such step is to partner with top SEO experts like Leadshouse.

By partnering with Leadshouse, businesses can gain a competitive advantage and adapt quickly to any changes in the SEO landscape.

Our technical SEO consultant will help optimize your website content, build high-quality backlinks, conduct comprehensive keyword research and much more to improve your rankings.

We offer a range of website optimization services, including on-page optimization services, off-page optimization services, technical SEO, local SEO and franchise SEO.

Contact Leadshouse today to see how our website optimization services can help you stay prepared for any potential impact from GoogleOther.
