ControllingWebRobots

Dealing with search robots.

Table of contents

  1.  Information
  2.  Further information

1.  Information

This site has a directive called '(:robots ... :)' enabled that lets you instruct web robots what to do with a page.

Here is the syntax:

(:robots index,follow:)
(:robots index,nofollow:)
(:robots noindex,follow:)
(:robots noindex,nofollow:)

The meaning of the arguments is not documented here; see e.g. Cookbook:ControllingWebRobots for more information. Basically, this directive controls what is put into the HTML code of the generated page, i.e. the tag <meta name='robots' content='...' />.
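
For example, a page containing (:robots noindex,nofollow:) should end up with a line along these lines in the head of its generated HTML (a sketch of the typical output; exact quoting and attribute order may differ):

<meta name='robots' content='noindex,nofollow' />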

To put it briefly, add the markup

(:robots noindex,nofollow:)

to a specific page in order to instruct web robots to ignore the page.

If you wish all the pages in a group to be ignored, place the markup in the group header of that group. For example, to instruct web robots to ignore all pages in the group 'Playground/', edit that group's header page (in PmWiki this is normally the page Playground.GroupHeader) and there insert the markup

(:robots noindex,nofollow:)
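
The group header page could then look like this, a minimal sketch assuming the standard PmWiki group header mechanism (the (:comment:) line is optional):

(:comment This header is prepended to every page in Playground/, so each of them carries the robots directive.:)
(:robots noindex,nofollow:)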

2.  Further information

See Cookbook:ControllingWebRobots in the PmWiki documentation.

Category: Site
