If you’re managing a website, you’ve likely heard about various web crawlers and their impact on SEO. One that’s making waves right now is OpenAI’s SearchBot.
In this article, I’ll explain how you can allow or disallow OpenAI’s SearchBot web crawler on your site and what this means for your SEO strategy.
What is OpenAI’s SearchBot?
OpenAI has introduced a new web crawler known as SearchBot. This bot is designed to index websites and provide relevant search results for users of OpenAI’s SearchGPT prototype. Unlike some crawlers that gather data for training AI models, SearchBot is focused solely on surfacing website links in search results.
The full user-agent string for SearchBot is:
Mozilla/5.0 AppleWebKit/537.36 (KHTML, like Gecko); compatible; OAI-SearchBot/1.0; +https://openai.com/searchbot
OpenAI also publishes the IP addresses the bot crawls from, which you can use to verify that requests claiming to be SearchBot are genuine.
Dive deeper into the difference between a search engine and an AI answer engine.
How to Allow or Disallow OpenAI’s SearchBot Web Crawler
Allowing SearchBot is straightforward if you want your site to appear in SearchGPT’s results. Simply update your robots.txt file to include:
User-agent: OAI-SearchBot
Allow: /
This setting ensures that SearchBot can crawl your entire site, which can be beneficial for visibility in SearchGPT.
On the other hand, if you prefer not to have SearchBot access your site, you can disallow it by updating your robots.txt file with:
User-agent: OAI-SearchBot
Disallow: /
What About Other OpenAI Bots?
ChatGPT-User and GPTBot are other notable bots from OpenAI. While ChatGPT-User visits web pages on demand to provide context and links in ChatGPT’s responses, GPTBot crawls content that may be used to train and improve OpenAI’s generative AI models. If you want to manage these bots:
- To allow ChatGPT-User, use:
User-agent: ChatGPT-User
Allow: /
This bot doesn’t crawl for indexing but may visit pages to include links in responses.
- To disallow GPTBot, use:
User-agent: GPTBot
Disallow: /
Blocking GPTBot means OpenAI won’t use your content for training its models.
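If you want to manage all three bots in one file, here is a sketch of one possible setup, assuming you want search visibility and link previews but not model training (adjust the rules to match your own content policy):
User-agent: OAI-SearchBot
Allow: /

User-agent: ChatGPT-User
Allow: /

User-agent: GPTBot
Disallow: /
Each crawler follows the group that matches its own user-agent, so the GPTBot rule doesn’t affect the other two bots.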
Also check out Google AI Overviews and their impact on SEO.
Impact on SEO
So, how does allowing or disallowing these bots affect your SEO? Well, if you allow SearchBot and it indexes your site, you might see increased traffic from SearchGPT users. However, if you block it, your site won’t appear in its search results, which might limit your reach.
The same goes for GPTBot—blocking it prevents your content from contributing to AI training, which might impact how OpenAI’s tools interact with your site. But remember, managing these bots effectively can help tailor your SEO strategy to ensure you get the best results.
Should You Block or Allow?
In my experience, it’s crucial to weigh the benefits of increased visibility against the potential for content use by AI models.
While allowing SearchBot might enhance your site’s presence in OpenAI’s search results, blocking GPTBot can protect your content from being used in AI training.
Ultimately, the choice comes down to your SEO goals and content strategy. Whether you choose to allow or disallow OpenAI’s bots, it’s essential to stay informed and make decisions that align with your website’s objectives.
Frequently Asked Questions
What is the purpose of OAI-SearchBot?
OAI-SearchBot is used to index content for OpenAI’s SearchGPT search engine, not for training AI models.
Can I allow OAI-SearchBot to crawl only specific parts of my website?
Yes, you can specify directories in your robots.txt file to control which parts of your site are accessible to OAI-SearchBot.
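For example, the hypothetical rules below let OAI-SearchBot crawl everything except a /private/ directory (a placeholder name you would replace with your own path):
User-agent: OAI-SearchBot
Disallow: /private/
Allow: /
Most modern crawlers apply the most specific matching path, so only URLs under /private/ are kept out of SearchGPT’s index.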
How can I block GPTBot from crawling my website?
Add the following to your robots.txt:
User-agent: GPTBot
Disallow: /
Does blocking OpenAI’s web crawlers affect my SEO on other search engines?
No, blocking OpenAI’s crawlers won’t impact your ranking on Google or other search engines.