Info: Available in SCORE 2.1
Web site owners use the /robots.txt file to give instructions about their site to web robots. According to the specification, the robots.txt file must be in the top-level directory of the host, accessible through the appropriate protocol and port number. A static robots.txt file works just fine if Sitecore hosts only one site, but a multi-tenant setup may require different rules for every site. This topic describes how you can achieve that with SCORE.
There are several steps to add dynamic behavior to the /robots.txt file:
- Switch HTTP handler for site
- Create File Template
- Configure site settings
Turning on ASP.NET HTTP handler for /robots.txt
By default, IIS delivers all *.txt files using the StaticFileHandler HTTP handler. That module loads the file from disk and returns its content to the caller. We need to register a custom HTTP handler that can generate the right tenant-specific content.
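For orientation, a handler of this kind typically implements IHttpHandler and writes the generated text with a text/plain content type. The sketch below is a hypothetical minimal version, not SCORE's actual RobotsTxtHandler; the content-lookup method is a placeholder assumption.

```csharp
using System.Web;

namespace Score.Custom.HttpHandlers
{
    // Hypothetical sketch of a dynamic robots.txt handler.
    // SCORE's real RobotsTxtHandler resolves tenant-specific content
    // from site settings; here we only illustrate the handler shape.
    public class RobotsTxtHandler : IHttpHandler
    {
        public bool IsReusable => true;

        public void ProcessRequest(HttpContext context)
        {
            // Placeholder for SCORE's site-settings lookup and
            // variable substitution (e.g. the <sitemap> token).
            string robotsContent = ResolveRobotsTxtForCurrentSite(context);

            context.Response.ContentType = "text/plain";
            context.Response.Write(robotsContent);
        }

        private string ResolveRobotsTxtForCurrentSite(HttpContext context)
        {
            // A real implementation would read the current tenant's
            // robots.txt template; this default allows everything.
            return "User-agent: *\nDisallow:";
        }
    }
}
```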
...
Add the entry below to the end of the <handlers> section:
```xml
<handlers>
  ...
  <add name="Score.Robots.Txt" verb="GET" path="robots.ashx"
       type="Score.Custom.HttpHandlers.RobotsTxtHandler, Score.Custom"
       preCondition="runtimeVersionv4.0" />
</handlers>
```
Note: Depending on your application pool's pipeline mode, the section may be named differently: Integrated mode uses <handlers> (under <system.webServer>), while Classic mode uses <httpHandlers> (under <system.web>).
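For comparison, if your application pool runs in Classic mode, the registration would go in the <httpHandlers> section instead. This is a sketch assuming the same handler type; Classic-mode entries do not take the name or preCondition attributes.

```xml
<system.web>
  <httpHandlers>
    <add verb="GET" path="robots.ashx"
         type="Score.Custom.HttpHandlers.RobotsTxtHandler, Score.Custom" />
  </httpHandlers>
</system.web>
```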
Option #1 - Creating a robots.txt as a site setting content item
...
At this time, RobotsTxtHandler supports only one variable: the <sitemap> token in the file template is substituted with the full URL of the site's XML sitemap.
The <sitemap> value is generated using the following pattern:
```csharp
xmlSitemapFilename = $"{scheme}://{targetHostName}/{xmlSitemapFilename}.gz";
```
All values are taken from the corresponding site attributes.
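As an illustration, suppose a site is configured with scheme `https`, targetHostName `www.example.com`, and xmlSitemapFilename `sitemap.xml` (these values, and the Disallow rule below, are made up for the example). A file template containing the <sitemap> token:

```
User-agent: *
Disallow: /sitecore/

Sitemap: <sitemap>
```

would then be served as:

```
User-agent: *
Disallow: /sitecore/

Sitemap: https://www.example.com/sitemap.xml.gz
```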
Warning: The sitemap URL assumes that the CompressXmlSitemapFiles compression processor is used and the sitemap file is saved with a *.gz extension.
...