If you have a single application in IIS with several domain names pointing to it, you can serve a different robots.txt file based on the domain name of the current request.
1. Add a route that maps requests for robots.txt to a controller (see the route sketch after this list)
2. Create the RobotsController (sketched below)
3. The SeoHelper implements a simple way of determining which robots.txt file to return (sketched below)
4. Add the different versions of robots.txt to the site root (an example layout follows this list) – note that the robots.txt content could be served from any source; I have just chosen physical files to keep the example simple
5. Run the site and request robots.txt – see the testing note after this list
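For step 1, a minimal sketch of the route registration in ASP.NET MVC (the `RouteConfig` placement and the route name are my assumptions; older MVC versions register routes in Global.asax instead):

```csharp
using System.Web.Mvc;
using System.Web.Routing;

public class RouteConfig
{
    public static void RegisterRoutes(RouteCollection routes)
    {
        // Send requests for the literal URL /robots.txt to RobotsController.Index.
        // Register this before the default route so it takes precedence.
        routes.MapRoute(
            name: "Robots",
            url: "robots.txt",
            defaults: new { controller = "Robots", action = "Index" });

        routes.MapRoute(
            name: "Default",
            url: "{controller}/{action}/{id}",
            defaults: new { controller = "Home", action = "Index", id = UrlParameter.Optional });
    }
}
```

Note that depending on your IIS configuration, requests ending in .txt may be picked up by the static file handler before they ever reach MVC; in IIS 7+ integrated mode, setting `runAllManagedModulesForAllRequests="true"` in web.config is one way to let such a request reach the routing module.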
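For step 2, a RobotsController along these lines would serve whichever file the helper picks. The `GetRobotsFileName` method and the `robots.default.txt` fallback are assumptions of this sketch, not necessarily what the downloadable source contains:

```csharp
using System.Web.Mvc;

public class RobotsController : Controller
{
    // GET /robots.txt
    public ActionResult Index()
    {
        // Ask the helper which file in the site root matches the current host name.
        string fileName = SeoHelper.GetRobotsFileName(Request.Url.Host);
        string path = Server.MapPath("~/" + fileName);

        // Fall back to a default file when no domain-specific version exists.
        // System.IO.File is fully qualified to avoid clashing with Controller.File.
        if (!System.IO.File.Exists(path))
            path = Server.MapPath("~/robots.default.txt");

        // Serve the file as plain text, the content type crawlers expect.
        return File(path, "text/plain");
    }
}
```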
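For step 3, one simple way the SeoHelper could map a host name to a file; the `robots.<host>.txt` naming convention is an assumption made to keep the sketch self-contained:

```csharp
public static class SeoHelper
{
    // Picks a robots file based on the host name of the current request,
    // e.g. "www.example.com" -> "robots.example.com.txt".
    public static string GetRobotsFileName(string host)
    {
        // Normalize the host so "www.example.com" and "example.com"
        // resolve to the same file.
        host = host.ToLowerInvariant();
        if (host.StartsWith("www."))
            host = host.Substring("www.".Length);

        return "robots." + host + ".txt";
    }
}
```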
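For step 4, with the hypothetical naming convention above, the site root might then contain, for example:

```
robots.example.com.txt    - rules for the primary domain
robots.example.org.txt    - rules for the secondary domain
robots.default.txt        - fallback for any unmapped host name
```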
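For step 5, one way to test locally (using placeholder domains) is to point the extra domain names at 127.0.0.1 in your hosts file and add matching IIS bindings; a request for http://example.com/robots.txt should then return the contents of robots.example.com.txt, while http://example.org/robots.txt returns robots.example.org.txt.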
Get the source code: DynamicRobotsTxt.rar (21.66 KB)