Standard for robot exclusion



World-Wide Web
A proposal to prevent the havoc wreaked by many of the early World-Wide Web robots, which retrieved documents too rapidly or retrieved documents that had side effects (such as casting votes). The proposed standard for robot exclusion addresses these problems with a file called “robots.txt” placed in the document root of the web site.
Described in an appendix note to the W3C HTML 4 specification (http://w3.org/TR/html4/appendix/notes.html#h-B.4.1.1).
(2006-10-17)
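A robots.txt file lists, per user agent, which paths a well-behaved robot should not retrieve. As a minimal sketch (the rules, robot name, and URLs below are hypothetical), Python's standard `urllib.robotparser` module can evaluate such a file:

```python
# Sketch: evaluating robots.txt rules with Python's standard library.
# The rules, robot name, and site URLs here are invented for illustration.
from urllib import robotparser

rp = robotparser.RobotFileParser()
# Normally one would call rp.set_url("http://example.com/robots.txt")
# followed by rp.read(); here the file's lines are supplied directly.
rp.parse([
    "User-agent: *",          # applies to all robots
    "Disallow: /cgi-bin/",    # scripts that may have side effects
    "Disallow: /vote/",       # e.g. pages that cast votes
])

print(rp.can_fetch("MyBot", "http://example.com/vote/cast"))   # False
print(rp.can_fetch("MyBot", "http://example.com/index.html"))  # True
```

Compliance is voluntary: the file only advises robots, and nothing prevents a badly behaved one from ignoring it.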

Read Also:

  • Standard function

    noun 1. (computing) a subprogram provided by a translator that carries out a task, for example the computation of a mathematical function such as sine or square root

  • Standard-gauge

    noun 1. See under gauge (def 13).
    verb (used with object), gauged, gauging. 1. to determine the exact dimensions, capacity, quantity, or force of; measure. 2. to appraise, estimate, or judge. 3. to make conformable to a standard. 4. to mark or measure off; delineate. 5. to prepare or mix (plaster) with a definite proportion […]



  • Standard generalised markup language

    spelling: ISO spells it “Standard Generalized Markup Language”. (1996-12-13)

  • Standard generalized markup language

    language, text (SGML) A generic markup language for representing documents. SGML is an International Standard that describes the relationship between a document’s content and its structure. SGML allows document-based information to be shared and re-used across applications and computer platforms in an open, vendor-neutral format. SGML is sometimes compared to SQL, in that it enables […]
