Answers
Oct 31, 2006 - 08:24 PM
The search engines should be easy enough to stop: just put a file called "robots.txt" in the root directory with this in it:
User-agent: *
Disallow: /
For stopping users, I would use whatever method your host offers to password-protect directories. Alternatively, if you're using Apache on Linux, you can create a .htaccess file in the root of the website containing:
AuthName "Members only in here"
AuthType Basic
AuthUserFile /path/to/your/.htpasswd
AuthGroupFile /dev/null
require valid-user
and then use the utility htpasswd (in the directory you want to create the .htpasswd in) like this:
htpasswd -c .htpasswd username (the -c flag creates the file, so use it the first time only; omit it when adding more users)
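A minimal session might look like this (the path matches the AuthUserFile placeholder above, and the usernames are made-up examples; htpasswd will prompt for each password):

```shell
# Create the password file and add the first user (-c creates the file)
htpasswd -c /path/to/your/.htpasswd alice

# Add further users WITHOUT -c, so the existing file is not overwritten
htpasswd /path/to/your/.htpasswd bob
```

Make sure the .htpasswd file sits outside the public web root, or is itself blocked from being served, so visitors can't download it.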
Nov 01, 2006 - 01:03 AM
Thanks a lot nhinds
Sep 04, 2010 - 04:50 AM
But can you disable the crawler? I can't get into a game that I downloaded, so can you disable it, please?
Dec 27, 2012 - 09:46 PM
rel="nofollow" and robots.txt are the only things that stop search engines from crawling your website, I think. Thanks.
SOURCE(s):
Search engine optimization consultant