Warning: This page has not been updated in over a year and may be outdated or deprecated.
administration:robots.txt
This shows the differences between two revisions of the page: administration:robots.txt [2015/12/14 19:15] (page moved from robots.txt to administration:robots.txt by demiankatz) and administration:robots.txt [2020/06/04 15:06] (current, last edited by demiankatz).
===== File Location =====
The most important thing to know about robots.txt is that it must exist at the root of your server. If VuFind is running in the root of your server, this means you can simply create a robots.txt file there.
To summarize: the URL //must// be ''<your server>/robots.txt''.
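Crawlers derive the robots.txt location from the host name alone, never from a subdirectory. A minimal sketch with Python's standard library illustrates this (the host name is a placeholder, not taken from this page):

```python
from urllib.parse import urlsplit, urlunsplit

def robots_url(page_url: str) -> str:
    """Return the robots.txt URL a crawler would request for any page URL."""
    parts = urlsplit(page_url)
    # Scheme and host are kept; path, query, and fragment are discarded.
    return urlunsplit((parts.scheme, parts.netloc, "/robots.txt", "", ""))

# Even for a VuFind install in a subdirectory, the crawler still asks the root:
print(robots_url("http://library.example.edu/vufind/Record/12345"))
# -> http://library.example.edu/robots.txt
```

This is why a robots.txt placed inside a subdirectory install is silently ignored.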
<code>
Disallow: /
Disallow: /
Disallow: /vufind/EDS
Disallow: /
Disallow: /
Disallow: /
Disallow: /
Disallow: /
Disallow: /
Disallow: /
Disallow: /
Disallow: /
</code>
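Rules like the ones above can be sanity-checked before deployment with Python's standard-library robots.txt parser; the rule and paths below are illustrative placeholders, not taken from this page:

```python
from urllib.robotparser import RobotFileParser

# A minimal sketch: parse robots.txt rules and check which paths a
# well-behaved crawler would be allowed to fetch.
rules = """\
User-agent: *
Disallow: /vufind/Search/Results
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

print(parser.can_fetch("*", "/vufind/Search/Results"))  # blocked -> False
print(parser.can_fetch("*", "/vufind/Record/12345"))    # not matched -> True
```

Running such a check against your real robots.txt helps confirm that record pages stay crawlable while result listings are excluded.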
We've recently added the Browse module to avoid redundancy. We also disabled access to the Alphabrowse and Results pages in compliance with Google's crawling guidelines and to reduce server strain. We recommend providing a sitemap of all records to the bot to make sure each of your records is crawled. See here for more information:
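One common way to point crawlers at a sitemap is the ''Sitemap'' directive in robots.txt itself, which is supported by the major search engines (the URL below is a placeholder, not taken from this page):

<code>
Sitemap: http://your-server.example.edu/sitemap.xml
</code>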
+ | |||
+ | ===== More Information ===== | ||
+ | |||
+ | Google offers some [[https:// | ||