This might help detect robot user agents while also keeping things more organized:
JavaScript
const detectRobot = (userAgent) => {
  const robots = new RegExp([
    /bot/, /spider/, /crawl/,                        // GENERAL TERMS
    /APIs-Google/, /AdsBot/, /Googlebot/,            // GOOGLE ROBOTS
    /mediapartners/, /Google Favicon/,
    /FeedFetcher/, /Google-Read-Aloud/,
    /DuplexWeb-Google/, /googleweblight/,
    /bing/, /yandex/, /baidu/, /duckduck/, /yahoo/,  // OTHER ENGINES
    /ecosia/, /ia_archiver/,
    /facebook/, /instagram/, /pinterest/, /reddit/,  // SOCIAL MEDIA
    /slack/, /twitter/, /whatsapp/, /youtube/,
    /semrush/,                                       // OTHER
  ].map((r) => r.source).join("|"), "i");            // BUILD REGEXP + "i" FLAG
  return robots.test(userAgent);
};
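A quick sanity check with two sample user-agent strings (the first is the UA string Google documents for its desktop Googlebot crawler; the second is a typical desktop Chrome UA):

const googlebotUA = "Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)";
const chromeUA = "Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/120.0.0.0 Safari/537.36";

console.log(detectRobot(googlebotUA)); // true  ("Googlebot" matches /bot/)
console.log(detectRobot(chromeUA));    // false (no pattern matches)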
TypeScript
const detectRobot = (userAgent: string): boolean => {
  const robots = new RegExp(([
    /bot/, /spider/, /crawl/,                        // GENERAL TERMS
    /APIs-Google/, /AdsBot/, /Googlebot/,            // GOOGLE ROBOTS
    /mediapartners/, /Google Favicon/,
    /FeedFetcher/, /Google-Read-Aloud/,
    /DuplexWeb-Google/, /googleweblight/,
    /bing/, /yandex/, /baidu/, /duckduck/, /yahoo/,  // OTHER ENGINES
    /ecosia/, /ia_archiver/,
    /facebook/, /instagram/, /pinterest/, /reddit/,  // SOCIAL MEDIA
    /slack/, /twitter/, /whatsapp/, /youtube/,
    /semrush/,                                       // OTHER
  ] as RegExp[]).map((r) => r.source).join("|"), "i"); // BUILD REGEXP + "i" FLAG
  return robots.test(userAgent);
};
Use on the server (e.g. in an Express request handler):
const userAgent = req.get('user-agent') ?? ''; // header may be absent
const isRobot = detectRobot(userAgent);
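If you want the flag available to every route, a minimal sketch of wrapping this in Express middleware (res.locals.isRobot is just a name chosen here, not anything the function above requires):

app.use((req, res, next) => {
  const userAgent = req.get('user-agent') ?? ''; // header may be absent
  res.locals.isRobot = detectRobot(userAgent);
  next();
});

app.get('/', (req, res) => {
  // e.g. serve a pre-rendered page to crawlers
  res.send(res.locals.isRobot ? 'Hello, crawler' : 'Hello, human');
});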
Use on "client" / some phantom browser a bot might be using:
const userAgent = navigator.userAgent;
const isRobot = detectRobot(userAgent);
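For example, to keep crawlers out of your analytics (initAnalytics is a hypothetical stand-in for whatever tracking setup you use):

if (!isRobot) {
  initAnalytics(); // skip tracking for bots and headless browsers
}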
Overview of Google crawlers:
https://developers.google.com/search/docs/advanced/crawling/overview-google-crawlers