Best way to learn web design, business web design, graphic web design, web design development, web design services
Thursday, September 24, 2009
Use the power of robots.txt
If you have a website up and running, you should make sure that search engine visitors can reach all the pages you want them to index. Sometimes, though, you want search engines to skip certain pages, or even to bar a particular search engine from the whole site. This is done with a simple, small two-line text file called "robots.txt".

robots.txt lives in the root directory of your website (on Linux systems, this is usually your /public_html/ directory) and looks something like this:

    User-agent: *
    Disallow:

The first line names the "bot" whose visits to your site you want to control, and the second line lists which parts of the site it is not allowed to visit. To address more bots, simply repeat the two lines above. For example:

    User-agent: googlebot
    Disallow:

    User-agent: askjeeves
    Disallow: /

This allows Google (whose user-agent name is googlebot) to visit every page and directory, while barring Ask Jeeves from the entire site. To build a sensible list of rules, look up the user-agent names of the robots that visit your site.

Even if you want all robots to index all the pages on your website, it is still highly recommended to create a robots.txt file. It keeps your error log from filling up with entries caused by search engines requesting a robots.txt file that does not exist. For more information, see the many online resources devoted to robots.txt.

For a page full of tips and tricks to get the most from your existing website, subscribe to our WebSiteSecrets101 newsletter today.
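If you want to check how a crawler will interpret your rules before publishing them, Python's standard urllib.robotparser module can evaluate a robots.txt file for you. A minimal sketch, using the two example records above (the page path queried is illustrative):

```python
from urllib import robotparser

# The example rules from the post: googlebot may crawl everything,
# while "askjeeves" is barred from the entire site.
rules = """\
User-agent: googlebot
Disallow:

User-agent: askjeeves
Disallow: /
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

# Ask whether a given bot may fetch a given path.
print(rp.can_fetch("googlebot", "/any/page.html"))  # True
print(rp.can_fetch("askjeeves", "/any/page.html"))  # False
```

A quick check like this catches the most common mistake: an empty `Disallow:` allows everything, while `Disallow: /` blocks the whole site, and the two are easy to mix up.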