The forums ran from 2008-2020 and are now closed and viewable here as an archive.

Solutions to Hide Things from Search Engine

Viewing 4 posts - 1 through 4 (of 4 total)
  • #31812

    I’d like to know what you guys are using these days to prevent SE from indexing pages, as well as individual parts of a page.

    the most common use for this is, of course, hiding comment links.

    i remember there used to be a noindex tag.. but i wouldn’t be surprised if it got deprecated.

    so what are the solutions?


    Creating a robots.txt file is the best way to prevent search engines from indexing entire pages.

    You can find out more information on RobotsText
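
    For example, a minimal robots.txt placed at the site root might look like this (the /comments/ path is just a placeholder for whatever directory you want hidden):

    ```text
    # Applies to all well-behaved crawlers
    User-agent: *
    # Keep the (hypothetical) comments directory out of the index
    Disallow: /comments/
    ```

    Note that robots.txt works per-URL, so it can only block whole pages, not individual blocks inside a page.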


    TT_Mark, robots.txt is cool, and thanks to your link i found this article:
    which reminded me of


    and it’s cool too, but it’s still easier to simply put a *noindex* tag around the element where those comments go and be sure the whole block won’t be indexed.

    UPD: ahh, i remember the “problem” with noindex now!
    The Russian search engines Yandex and Rambler introduced a new tag which prevents indexing only of the content between the tags, not the whole Web page.

    Do index this text block.
    <noindex>Don't index this text block.</noindex>

    so that tag only works with those specific search engines, which i don’t think anybody cares about.

    damn, that’s a really nice tag :’-(
    does anybody know an equivalent for Google?


    Social bookmarking and photo sharing websites that use the rel="nofollow" tag for their outgoing links include YouTube (for most links); websites that don’t use the rel="nofollow" tag include one now-inactive website, Yahoo! My Web 2.0, and Technorati Favs.

    well, if those guys are using it, then i suppose it’s the only way :-(

    can anyone point me to a script that would automatically insert the *nofollow* attribute into every link inside a specified block?
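
    A quick-and-dirty sketch of what such a script could do, in Python with a regex (the function name is made up, and a regex is fine for simple markup but not a substitute for a real HTML parser):

    ```python
    import re

    def nofollow_block(html):
        """Add rel="nofollow" to every <a> tag in the given HTML snippet
        that does not already carry a rel attribute."""
        def patch(match):
            tag = match.group(0)
            if 'rel=' in tag:
                return tag  # leave existing rel attributes alone
            # drop the closing '>' and re-append it after the new attribute
            return tag[:-1] + ' rel="nofollow">'
        return re.sub(r'<a\b[^>]*>', patch, html)

    comments = '<div class="comments"><a href="http://example.com">link</a></div>'
    print(nofollow_block(comments))
    # → <div class="comments"><a href="http://example.com" rel="nofollow">link</a></div>
    ```

    You would run this over just the comments block of your template, not the whole page, so links elsewhere keep passing rank.
    
    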

  • The forum ‘Other’ is closed to new topics and replies.