
How to stop programs gaining access & contacting you?

# January 20, 2009 at 5:26 pm

    Hey everyone,

Recently I have been getting a lot of link requests for a couple of sites I have done.

Although I have politely said no, one request made me fume: a company wanted to link to a certain page. This page is on a site I have done that is not yet live, so I have an .htaccess file that redirects all visitors to a holding page.

Now, most people will not even know how to bypass this, but one page that is not used and is still hidden was accessed and "viewed", and this is the page they want to link-exchange with.

OK, I guess it is a robot doing the searching for the info, but this page is not being used and won't be used on the site; it is just there as a backup for now.

OK, this is how I am combatting robots:
a robots.txt file that addresses all user agents with * and disallows only certain directories (mainly the cgi-bin, images, and such) while allowing access to the rest of the site;
an .htaccess file that redirects everyone to a holding page, unless I give certain links out, and I hardly give out links.
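A rough sketch of that setup (the directory names and holding-page filename are my own placeholders, and the redirect assumes mod_rewrite is available in .htaccess):

```apache
# robots.txt — advisory only; polite crawlers honour it, bad bots ignore it:
#   User-agent: *
#   Disallow: /cgi-bin/
#   Disallow: /images/

# .htaccess — send every request to the holding page until launch.
# The holding page itself is excluded so the redirect doesn't loop.
RewriteEngine On
RewriteCond %{REQUEST_URI} !^/holding\.html$
RewriteRule ^ /holding.html [R=302,L]
```

Worth stressing: robots.txt is a request, not access control, so on its own it cannot stop an impolite bot.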

My contact scripts use JavaScript to make someone fill in the form, plus a check on the message body: if nothing is in the body, the submission redirects to Google.
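As a rough illustration of the empty-body check described above (the function name and form field names are my assumptions, not taken from the original scripts):

```javascript
// Sketch of the empty-body spam check: returns true when the message
// body is missing or whitespace-only, i.e. the kind of submission
// that should be bounced (e.g. redirected to Google).
function isEmptyBody(body) {
  return typeof body !== "string" || body.trim().length === 0;
}

// Hypothetical wiring into a form's submit handler:
// document.querySelector("#contact").addEventListener("submit", function (e) {
//   if (isEmptyBody(this.elements.message.value)) {
//     e.preventDefault();
//     window.location = "https://www.google.com/";
//   }
// });
```

Keep in mind a client-side check alone is trivial for a bot to skip, so the same test should also run server-side.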

What else can I do to secure sites from nonsense like this, so I can concentrate on just good honest folk?

    # January 21, 2009 at 3:24 am

    Hi ikthius

    For sites that shouldn’t be live:
I guess you could set up a rule in Apache to only allow certain IP addresses to access the site/page.
    http://httpd.apache.org/docs/2.2/mod/mod_authz_host.html
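For instance, with Apache 2.2's mod_authz_host (the directory path and IP address below are placeholders to replace with your own):

```apache
# Sketch only: deny everyone except one trusted IP (Apache 2.2 syntax).
<Directory "/var/www/staging">
    Order Deny,Allow
    Deny from all
    Allow from 203.0.113.5
</Directory>
```

Unlike robots.txt, this is enforced by the server itself, so it works even against bots that ignore directives.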

I don't think it's possible to "hide" sites that are live and indexed.
Some robots/spiders are probably polite and respect directives, but I'm guessing the companies that contact you aren't.

    /Jocke

    # January 21, 2009 at 12:02 pm
    "jocke76" wrote:
    Hi ikthius

    For sites that shouldn’t be live:
I guess you could set up a rule in Apache to only allow certain IP addresses to access the site/page.
    http://httpd.apache.org/docs/2.2/mod/mod_authz_host.html

I don't think it's possible to "hide" sites that are live and indexed.
Some robots/spiders are probably polite and respect directives, but I'm guessing the companies that contact you aren't.

    /Jocke

    Hey Jocke76

    I will look at the apache thing later, after football.

I have been getting all of these requests since putting up a robots.txt to stop spiders, which was really concerning.

But it would be good to know the best way to stop nonsense coming through for any site.

If anyone else has other information, please share.


