Robots.txt
Gerrit-Jan Linker
29.10.07 at 19:57:09
In my website log I found 800 failed requests for the file robots.txt, so I tried to find out why.
Apparently robots.txt is a small text file that search engines use to keep some of a website's pages out of their indexes.
The robots.txt file is a simple text file placed in the root of the web server. It works on the basis of the robots exclusion protocol, which can be used to exclude parts of a website from being indexed by certain search engines.
You can just make a plain text file, call it robots.txt, and place it in the root of your web server, so that it is reachable at http://www.yoursite.com/robots.txt.
The contents of the robots.txt file are as follows:
User-agent: [spider name]
Disallow: [file or directory]
In the following example, all robots are disallowed from the /cgi-bin/ and /private/ directories.
User-agent: *
Disallow: /cgi-bin/
Disallow: /private/
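Incidentally, those 800 failed requests are simply robots asking for a robots.txt that does not exist: well-behaved crawlers request this file before crawling a site, and the server answers with a 404 when it is missing. Placing even a permissive robots.txt in the root stops the failed requests. A Disallow line with no value means nothing is excluded:
User-agent: *
Disallow: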

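For completeness, here is a minimal sketch of how a crawler can honour these rules, using Python's standard urllib.robotparser module (the example.com URLs are just placeholders):

from urllib.robotparser import RobotFileParser

# Fetch and parse the site's robots.txt (placeholder URL).
rp = RobotFileParser()
rp.set_url("http://www.example.com/robots.txt")
rp.read()

# With the example rules above, this prints False, then True.
print(rp.can_fetch("*", "http://www.example.com/private/page.html"))
print(rp.can_fetch("*", "http://www.example.com/index.html"))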