If you want to cache the output of a render array differently depending on whether the visitor is a known crawler, this is the module you need.
Use cases
Say you want Google to index all pages of a search result, even though the average user gets an infinite scroll. You might then want to display a "next page" link to Googlebot only, or to crawlers in general:
$build['next'] = [
  '#type' => 'link',
  '#url' => $some_url,
  '#title' => t('Next page'),
  '#cache' => [
    'contexts' => [
      // This will limit it to Googlebot only.
      'crawlers_cache_context:googlebot',
    ],
  ],
  '#access' => $access,
];
$build['next'] = [
  '#type' => 'link',
  '#url' => $some_url,
  '#title' => t('Next page'),
  '#cache' => [
    'contexts' => [
      // This will limit it to crawlers in general.
      'crawlers_cache_context',
    ],
  ],
  '#access' => $access,
];
What are the crawlers?
This module relies on the package jaybizzle/crawler-detect. If you download the module with Composer, the package is installed automatically for you.
Currently you can get a general idea of which crawlers are detected by looking at the crawler list shipped with the jaybizzle/crawler-detect package.
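If you want to check programmatically whether a given user agent string counts as a crawler, you can call the underlying package directly. A minimal standalone sketch, assuming the package has been installed via Composer:

```php
<?php

require __DIR__ . '/vendor/autoload.php';

use Jaybizzle\CrawlerDetect\CrawlerDetect;

$detect = new CrawlerDetect();

// Check an explicit user agent string instead of the current request's.
$ua = 'Mozilla/5.0 (compatible; Googlebot/2.1; +http://www.google.com/bot.html)';

if ($detect->isCrawler($ua)) {
  // getMatches() returns the part of the user agent that matched.
  echo 'Crawler detected: ' . $detect->getMatches() . PHP_EOL;
}
```

This is the same detection the cache context relies on, so it is a quick way to verify which of your visitors would get the crawler cache variant.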
Project information
- Maintenance fixes only: considered feature-complete by its maintainers.
- Module categories: Performance, Search Engine Optimization (SEO)
- 15 sites report using this module
- Created by eiriksm
- Stable releases for this project are covered by the security advisory policy.
Releases
Development version: 8.x-1.x-dev updated 1 Apr 2024 at 06:21 UTC