Recently I have been getting a lot of link requests for a couple of sites I have done.
Although I have politely said no, one request made me fume: a company wanted to link to a certain page. That page is on a site I built that is not yet live, so I have an htaccess file that redirects all visitors to a holding page.
Now, most people will not even know how to bypass this, but one page that is not used and is still hidden was accessed and "viewed", and that is the page they want to link exchange to.
OK, I guessed it was a robot doing the searching for the info, but this page is not being used and won't be used on the site; it is just there as a backup for now.
OK, this is how I am combating robots:
a robots.txt file that targets all user agents with * and disallows only certain directories, mainly the cgi-bin, images, and such, while allowing access to the rest of the site.
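For reference, a robots.txt along those lines might look like this (the directory names here are just placeholders for whatever is actually being disallowed):

```
User-agent: *
Disallow: /cgi-bin/
Disallow: /images/
```

Worth remembering that robots.txt is purely advisory: polite crawlers honor it, but nothing forces an impolite one to.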
an htaccess file to redirect everyone to a holding page, unless I give certain links out, and I hardly ever give out links.
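A redirect like that could be sketched as follows, assuming Apache with mod_rewrite enabled (holding.html is a made-up filename, not necessarily what is actually in use):

```apache
RewriteEngine On
# Let requests for the holding page itself through, otherwise the redirect loops
RewriteCond %{REQUEST_URI} !^/holding\.html$
RewriteRule ^ /holding.html [R=302,L]
```

A 302 (temporary) redirect is the sensible choice here, since the site will eventually go live at its real URLs.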
What else can I do to protect sites from nonsense like this, and deal only with good, honest folk?
Trying to 'hide' sites that are live and indexed, I don't think it's possible.
Some robots/spiders probably are polite and respect directives, but I'm guessing the ones used by
the companies that contact you aren't.
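If an impolite crawler can be identified in the access logs, one option is to refuse it at the server rather than rely on robots.txt. A minimal sketch, again assuming Apache with mod_rewrite (the user-agent string here is hypothetical):

```apache
# Return 403 Forbidden to a known bad crawler, matched by User-Agent
RewriteEngine On
RewriteCond %{HTTP_USER_AGENT} BadLinkBot [NC]
RewriteRule ^ - [F]
```

Blocking by IP address works the same way but is easier for a crawler to dodge; for a site that truly must stay private before launch, password protection (HTTP Basic Auth) is the only thing a robot can't just ignore.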