The domain is running on a Linux server, and what I'm trying to accomplish is to have a robots.txt file that search engine crawlers can read but that no user can access or view via a browser.
2006-08-26 04:17:25 · 1 answer · asked by Omnis in Computers & Internet ➔ Internet
The whole point of using a robots.txt file is to keep folders/files from ending up in search results on Google, Yahoo, etc., where the public could then access them. But even though it hides folders/files from search engine crawlers, the robots.txt file itself is still accessible at http://www.domain.com/robots.txt, which lists all the folders/files I'm trying to hide — unless you can place a robots.txt file in the actual directory you're trying to hide? I know you can password-protect directories, and that's what I intend to do as well.
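One approach worth noting (a sketch only, and not real security, since a User-Agent header is trivially spoofed and any crawler names here are assumptions): on Apache you could use mod_rewrite in an .htaccess file to return 403 Forbidden for robots.txt unless the request claims to come from a known crawler.

```apache
# Hypothetical .htaccess sketch: serve /robots.txt only to requests whose
# User-Agent matches known crawlers (Googlebot, Yahoo Slurp, msnbot).
# Anyone can fake a User-Agent, so this only deters casual browsing --
# sensitive directories still need real protection (passwords/auth).
RewriteEngine On
RewriteCond %{REQUEST_URI} ^/robots\.txt$
RewriteCond %{HTTP_USER_AGENT} !(Googlebot|Slurp|msnbot) [NC]
RewriteRule .* - [F]
```

Even with this in place, the safer practice is to avoid listing sensitive paths in robots.txt at all and rely on password-protected directories, as you already plan to do.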
2006-08-26 16:21:38 · update #1