Harnessing the Power of Robots.txt

Once we have a web site up and running, we need to make sure that all visiting search engines can access all the pages we want them to look at. Sometimes, though, we may want search engines not to index certain parts of the site, or even to ban a particular search engine from the site altogether. This is where a simple little two-line text file called robots.txt comes in.

Robots.txt lives in your web site's root directory (on Linux systems this is typically your /public_html/ directory) and looks something like the following:

User-agent: *
Disallow:

The first line names the robot the rule applies to; the second line states whether that robot is allowed in, or which parts of the site it is not allowed to see. Disallow takes a path prefix, so for example Disallow: /private/ blocks only that directory while leaving the rest of the site open. If you want to address multiple bots, simply repeat the lines above for each one:

User-agent: googlebot
Disallow:

User-agent: askjeeves
Disallow: /

This allows Google (user-agent name GoogleBot) to visit every page and directory, while banning Ask Jeeves from the site completely. For a reasonably current list of robot user-agent names, visit http://www.robotstxt.org/wc/active/html/index.html

Even if you want to allow every robot to index every page of your site, it is still a very good idea to put a robots.txt file on the site. It stops your error logs filling up with entries from search engines trying to request a robots.txt file that doesn't exist.

For more information on robots.txt, see the full list of resources at http://www.websitesecrets101.com/robotstxt-further-reading-resources.
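If you want to double-check how a robots.txt file will be interpreted before uploading it, Python's standard urllib.robotparser module can evaluate the rules for you. The sketch below is only an illustration of the example above, not something from the original article; the www.example.com URL and the page path are placeholders.

from urllib.robotparser import RobotFileParser

# Rules mirroring the example above: GoogleBot may crawl everything,
# Ask Jeeves is banned from the whole site.
rules = """
User-agent: googlebot
Disallow:

User-agent: askjeeves
Disallow: /
""".splitlines()

parser = RobotFileParser()
parser.parse(rules)

# www.example.com and the path are placeholders used purely for the check.
print(parser.can_fetch("googlebot", "http://www.example.com/some/page.html"))  # True
print(parser.can_fetch("askjeeves", "http://www.example.com/some/page.html"))  # False

Because parse() accepts the rules as a list of lines, you can test a draft file before it ever goes live; the module's set_url() and read() methods can be used instead to fetch and check a robots.txt that is already published.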
