• Tracking the actual effect of robots.txt rules on website optimization

    A survey of the results found that Google reacted to robots.txt relatively quickly: by the third day the signs were already visible in Webmaster Tools. Baidu's performance was less satisfactory. To say that Baidu does not recognize robots.txt rules would be nonsense, but its reaction period is so long that it inevitably leaves the suspicion of being slow.

    In the 20 days since the adjustment, there are two rules I ended up deleting. I opened the blog's robots.txt and compared the current version against the one written 20 days ago to see what had changed. The reason for the adjustment: with the rules as written on the 20th, by the second day I found that three addresses submitted through the Sitemaps section of Webmaster Tools were flagged as blocked by robots.txt rules, an unnecessary conflict. I cannot find the original screenshot, but the three flagged items are captioned below (a sketch of how such a conflict arises follows them):

    These are the robots.txt file rules enacted on the 20th

    The Sitemap
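    To make the conflict concrete, here is a minimal hypothetical robots.txt (not my actual rules; example.com is a placeholder): the sitemap it declares still lists tag-archive URLs that the rules above it disallow, so Webmaster Tools flags those sitemap entries as blocked.

        User-agent: *
        # Tag archives are closed to crawling...
        Disallow: /tag/

        # ...but the declared sitemap still lists /tag/... URLs, so the
        # webmaster console reports them as blocked by robots.txt.
        Sitemap: http://example.com/sitemap.xml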

    I searched through every page and found no problems, apart from a few /?p= short links that still turn up and leave a bad taste; otherwise everything is in order. Strictly speaking, under the robots.txt rules those addresses should no longer exist, and Google has implemented the robots.txt rules one hundred percent. A Google search for "site:***.com inurl:?p=" turns up only 14 incomplete entries (no title or snippet), which is expected behavior: robots.txt stops crawling rather than indexing, so already-indexed URLs linger as bare entries until they are dropped. These addresses will be cleared out in time.
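    For reference, a rule of the kind that shuts crawlers out of these short links might look like this (a hypothetical sketch, not my actual file; the * wildcard is an extension honored by Google and Baidu rather than part of the original robots.txt standard):

        User-agent: *
        # Block WordPress-style short links such as /?p=123
        Disallow: /*?p=

    Matching runs against the path plus query string, so /*?p= catches /?p=123 as well as /index.php?p=123.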

    On October 20th I carried out a large-scale adjustment of the blog, much as one prunes a tree so that it grows into a better shape, and robots.txt was the tool I leaned on most. Now that a week has passed: are the robots.txt rules written correctly, and have they actually taken effect? Have Baidu and Google responded to the rules by adjusting what they index? As the webmaster I need to keep studying the blog's indexing status to stay on top of it.
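    Whether the rules are written correctly can be sanity-checked mechanically before waiting on the engines. A minimal sketch with Python's standard urllib.robotparser (the domain and paths are placeholders, not my real ones; note the stdlib parser does plain prefix matching and ignores the * wildcard extension):

        # check_robots.py - test which URLs the live robots.txt blocks
        from urllib.robotparser import RobotFileParser

        rp = RobotFileParser("http://example.com/robots.txt")  # placeholder domain
        rp.read()  # fetch and parse the live file

        tests = [
            "http://example.com/tag/seo/",   # tag archive, expected blocked
            "http://example.com/feed/",      # feed, expected blocked
            "http://example.com/about/",     # normal page, expected allowed
        ]
        for url in tests:
            # can_fetch("*", url) answers for the generic user agent
            print("BLOCKED" if not rp.can_fetch("*", url) else "allowed", url)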


    More than 2,000 URLs restricted by the robots.txt rules

    Were the rules enacted on the 20th not getting through, or were my eyes deceiving me? I checked the IIS logs: since the 20th, Baidu has repeatedly downloaded the robots.txt file, and the server returned 200 (success) each time.
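    That check is easy to repeat. A small sketch that pulls every robots.txt request out of a W3C-format IIS log (the file name is a placeholder; column names come from the log's own #Fields header):

        # list robots.txt fetches (crawler + status code) from an IIS W3C log
        fields = []
        with open("u_ex111020.log", errors="replace") as log:  # placeholder name
            for line in log:
                if line.startswith("#Fields:"):
                    fields = line.split()[1:]  # remember this log's column layout
                elif not line.startswith("#") and "robots.txt" in line:
                    row = dict(zip(fields, line.split()))
                    # cs(User-Agent) shows the bot (e.g. Baiduspider), sc-status the result
                    print(row.get("date"), row.get("cs(User-Agent)"), row.get("sc-status"))

    A run of 200 responses for Baiduspider confirms the file was fetched successfully, so any delay is on the indexing side, not the server's.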


    Under the robots.txt rules, Google has stopped crawling more than 2,000 URLs. The 500-plus "not found" addresses are the after-effect of deleting the article tags a while ago. Here is a screenshot:

    Google's response to the robots.txt submission

    Baidu's response to the robots.txt rules
