3. Note the difference between Disallow: /abc and Disallow: /abc/ in the robots file. Disallow: /abc blocks access to both /abc.html and /abc/123.html, while Disallow: /abc/ allows access to /abc.html but blocks /abc/123.html.
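The trailing-slash distinction can be checked locally with Python's standard-library robots.txt parser. This is a minimal sketch; the example.com URLs and /abc paths are placeholders, not from any real site:

```python
from urllib.robotparser import RobotFileParser

# Rules without a trailing slash: blocks /abc.html AND everything under /abc/.
rp = RobotFileParser()
rp.parse("""User-agent: *
Disallow: /abc
""".splitlines())
print(rp.can_fetch("*", "https://example.com/abc.html"))      # False
print(rp.can_fetch("*", "https://example.com/abc/123.html"))  # False

# Rules with a trailing slash: /abc.html stays accessible.
rp2 = RobotFileParser()
rp2.parse("""User-agent: *
Disallow: /abc/
""".splitlines())
print(rp2.can_fetch("*", "https://example.com/abc.html"))      # True
print(rp2.can_fetch("*", "https://example.com/abc/123.html"))  # False
```

The parser does simple prefix matching on the URL path, which is exactly why the trailing slash changes what is blocked.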
First, decide what content you want to block. Then register with Google Webmaster Tools and use its search features to find the content you want to keep out of the index. Next, log in to Baidu Webmaster Tools, go to Site Configuration > Crawler Permissions > robots.txt Test, enter the rules you want to test into the text box, and click the Test button below it. The test results will appear, and if there are errors you can adjust the rules accordingly.
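Before pasting rules into the online testing tools, you can also sanity-check a draft locally with the same standard-library parser. A small sketch, assuming a hypothetical draft and made-up example.com URLs:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical draft of the rules you intend to upload as robots.txt.
draft = """\
User-agent: *
Disallow: /admin/
Disallow: /tmp/
"""

rp = RobotFileParser()
rp.parse(draft.splitlines())

# Spot-check a few representative URLs before uploading the file.
for url in ("https://example.com/admin/login.html",
            "https://example.com/index.html"):
    print(url, "->", "blocked" if not rp.can_fetch("*", url) else "allowed")
```

This catches obvious mistakes (a blocked homepage, a typo in a path) early; the webmaster tools remain the authoritative check for each engine's own parsing.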
1. The robots.txt file is the first thing a search engine requests when it arrives at a site, and a file that is too long will slow the spider's crawling. So rather than listing every link you want the search engine to skip in robots.txt, you can apply the nofollow attribute on individual links where appropriate, which stops those links from passing site weight.
4. robots.txt supports the wildcards "*" and "$": "$" matches the end of a URL, and "*" matches zero or more of any character.
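Python's standard-library parser implements the original robots.txt specification and does not handle these wildcard extensions, so here is a hand-rolled matcher purely to illustrate the semantics. The function name and the sample patterns are my own, not from any engine's implementation:

```python
import re

def robots_pattern_matches(pattern: str, path: str) -> bool:
    """Illustrative matcher for robots.txt-style patterns:
    '*' matches any run of characters, a trailing '$' anchors
    the end of the URL path; otherwise matching is prefix-based."""
    regex = re.escape(pattern).replace(r"\*", ".*")
    if regex.endswith(r"\$"):
        regex = regex[:-2] + "$"
    return re.match(regex, path) is not None

print(robots_pattern_matches("/*.php$", "/index.php"))       # True
print(robots_pattern_matches("/*.php$", "/index.php?x=1"))   # False: '$' anchors the end
print(robots_pattern_matches("/private*", "/private/data"))  # True: no '$', prefix match
```

For example, "Disallow: /*.php$" blocks any URL ending in .php, while "Disallow: /private*" blocks everything starting with /private.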
With the above knowledge, you might say you still don't know how to set up the robots file; what should you do then?
In SEO website optimization, the robots.txt file is crucial, because the first step a search engine takes when crawling a site's information is to read robots.txt, which guides its crawling. Used well, the robots file concentrates weight on the pages of the site that matter and keeps the search engine away from files you don't want crawled, making it a very necessary SEO tool. Yet many SEO practitioners don't particularly understand the robots file: they know that Disallow blocks and Allow permits, and a few common ways to forbid certain search engines access, but they don't know how to write anything more complex. I didn't know where to start with robots either, but with time and experience doing SEO, I have summed up my own method for robots. First, let me introduce some points to note when setting up robots:
2. The order of the Disallow and Allow lines matters: the search engine determines whether a URL may be visited from the first Allow or Disallow line that successfully matches it.
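This first-match behavior can be observed with Python's standard-library parser, which also applies rules in file order. Note that not every engine works this way: Google, for instance, prefers the most specific matching rule rather than the first one. The paths below are made up for illustration:

```python
from urllib.robotparser import RobotFileParser

# The Allow line comes before the broader Disallow, so it wins
# for the one URL it matches; everything else under /abc/ is blocked.
rules = """\
User-agent: *
Allow: /abc/seo.html
Disallow: /abc/
"""

rp = RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("*", "https://example.com/abc/seo.html"))    # True
print(rp.can_fetch("*", "https://example.com/abc/other.html"))  # False
```

If the two rule lines were swapped, the Disallow would match /abc/seo.html first and both URLs would be blocked, which is why the ordering deserves care.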