#
User-agent: *
Disallow: /Admin/
#
User-agent: *
#
# Disallow rules for WebResource.axd caching issues. Several instances below to cover all search engines.
#
# To specify matching the end of a URL, use $:
# Disallow: /*.axd$
#
# However, WebResource.axd and ScriptResource.axd always include a query string parameter, so the URL does
# not end with .axd; thus, the correct robots.txt record for Google would be:
# Disallow: /*.axd
#
# Not all crawlers recognize the wildcard '*' syntax, so plain prefix records are also included to comply
# with the robots.txt draft RFC. Note that the records are case sensitive, and the error page shows the
# requests in lower case, so let's include both cases below:
#
Disallow: /ScriptResource.axd
Disallow: /ScriptResource.axd$
Disallow: /ScriptResource.axd*
Disallow: /scriptresource.axd
Disallow: /WebResource.axd
Disallow: /webresource.axd
Disallow: /App_Browsers/
Disallow: /App_Code/
Disallow: /App_Data/
Disallow: /App_Themes/
Disallow: /aspnet_client/
Disallow: /bin/
Disallow: /js/
Crawl-delay: 10
Visit-time: 0000-0800

User-agent: MJ12bot
Disallow: /

User-agent: Java/1.4.1_04
Disallow: /

User-agent: Java/1.6.0_04
Disallow: /

User-agent: Java/1.8.0_31
Disallow: /
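
A minimal verification sketch follows; it is separate from the robots.txt records above and should not be copied into the file itself. It uses Python's standard-library urllib.robotparser, which, like the stricter crawlers mentioned in the comments, understands only plain prefix rules (no '*' or '$' patterns), to show that a prefix record such as "Disallow: /ScriptResource.axd" blocks the handler even though the real URL carries a query string and never ends in ".axd". The host name and query string are assumptions chosen for illustration, and only a condensed excerpt of the records is parsed.

from urllib.robotparser import RobotFileParser

# Condensed excerpt of the records above (prefix rules only; the example
# host and query string below are hypothetical).
RULES = """\
User-agent: *
Disallow: /Admin/
Disallow: /ScriptResource.axd
Disallow: /WebResource.axd
"""

rp = RobotFileParser()
rp.parse(RULES.splitlines())

# Prefix matching does not care whether a query string follows the path,
# which is why "Disallow: /ScriptResource.axd" works where the end-anchored
# "Disallow: /*.axd$" would never match.
print(rp.can_fetch("*", "http://example.com/ScriptResource.axd?d=abc123"))  # False (blocked)
print(rp.can_fetch("*", "http://example.com/Default.aspx"))                 # True (allowed)

Running the sketch prints False for the ScriptResource.axd request and True for an ordinary page, matching the behaviour the comments above describe for crawlers that only support prefix matching.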