SEO factors affecting web visibility in search results?

    # December 12, 2012 at 9:37 am

    Hi,

    What factors do you know of that affect how well a website displays in search results?

    Please add what you know. Here is what comes to my mind (see the markup sketch after this list):

    * page title
    * H1 headings
    * heading “weight” (h2–h6)
    * text on page (keyword density)

    * meta data (keywords in code)
    * code structure (clean markup is better than bad markup)
    * robots.txt
    * XML sitemap

    * age of domain (does a 1-month-old domain fare worse than a 10-year-old one?)
    * number of in-page links
    * number of visitors coming to the site
    * backlinks leading to the site
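
    For concreteness, a minimal sketch of how several of the on-page items above might look in markup (the site name, title, and keyword text are made-up placeholders):

    ```html
    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <!-- Page title: one of the strongest on-page signals -->
      <title>Piano Lessons in St. Louis | Example Studio</title>
      <!-- The "keywords in code" item above; note that Google has said
           it ignores this tag when ranking web search results -->
      <meta name="keywords" content="piano lessons, st louis">
    </head>
    <body>
      <!-- One h1 per page, carrying the main topic -->
      <h1>Piano Lessons in St. Louis</h1>
      <!-- Lower-"weight" headings (h2-h6) structure the rest of the content -->
      <h2>Lesson Plans and Pricing</h2>
      <p>Body text where the relevant keywords appear naturally.</p>
    </body>
    </html>
    ```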

    # December 12, 2012 at 10:35 am

    At the moment, having fresh, current content is a pretty big thing. If you have a blog or something on your site, it’s usually a good idea to at least display your most recent article or articles somewhere on there.

    For the company I work for, we have found it helpful to have an overall site/page description with keywords/phrases in small text below the footer. SEO is always changing, and I’ve found this blog to be a great resource for keeping up: [http://searchengineland.com/](http://searchengineland.com/)

    # December 12, 2012 at 11:41 am

    Actually, fresh and current content doesn’t matter on its own. In fact, a page that is 10 years old can rank much higher than a page that is 2 months old. In some cases fresh content will perform better, especially when an article’s information is time-sensitive. For instance, I wouldn’t care to read an article about CSS media queries dated August 2006; that would be pointless, and I would simply leave the page. Just creating new content is pointless unless that content is dated. This is part of the reason I put dates on virtually all the articles I publish online, so that people know the content is current.

    XML sitemaps have nothing to do with SEO other than helping search engines crawl links contained in PHP, Java Server Pages, JavaScript, Flash, etc. Other than that, XML sitemaps are worthless and don’t matter. As long as your site has a clear structure that is easy to navigate, you are fine. This is another reason Google is still telling people to create “static” pages and use text links. Why? Because they are simple to crawl, and you don’t need a foreign translator to make sense of them.
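
    As an illustration of dating content, a minimal sketch using the HTML5 `time` element (the title, date, and wording are placeholders):

    ```html
    <article>
      <h1>Understanding CSS Media Queries</h1>
      <!-- datetime gives crawlers a machine-readable date;
           the element's text gives readers a human-readable one -->
      <p>Published <time datetime="2012-12-12">December 12, 2012</time></p>
      <p>Article body goes here.</p>
    </article>
    ```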

    Code structure does matter! If you are creating link menus, say sidebars full of links, using the `h1` tag, search engines will penalize you for it; they think you are trying to game the system, and Google has stated they take action against this. Anyone can build a website, but are they adhering to search engine guidelines? I think that is the real question.
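
    To make that concrete, a sketch of the pattern being warned against versus plain text links in a list (the URLs and class name are invented):

    ```html
    <!-- Don't: wrap navigation links in h1 tags to inflate their weight -->
    <!-- <h1><a href="/pacman">Pacman</a></h1> -->

    <!-- Do: keep sidebar navigation as a plain list of text links -->
    <nav class="sidebar">
      <ul>
        <li><a href="/pacman">Pacman</a></li>
        <li><a href="/pacman/strategy">Pacman strategy</a></li>
      </ul>
    </nav>
    ```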

    @jkinney “keywords/phrases in small text below your footer”? That is considered keyword spamming, as Google’s Matt Cutts puts it, and they are seeking to de-rank sites that overdo it. Your keywords should be directly in your content, not obscured from it! If your webpage is about Pacman, then talk about Pacman in your page’s headings, paragraphs, etc., not below the footer of the page.

    And just my opinion, but Search Engine Land is worthless. Sites like that are full of self-proclaimed know-it-alls who do nothing but rant on and on about keywords, over and over again. They are SEO specialists, but most of them don’t know web design. How does that work, when every webpage in existence is built on HTML and CSS as its underlying technology?

    If SEO specialists are so smart, then how come so many are getting sued since Google’s recent Panda and Penguin updates took effect? JCPenney has filed a multi-million dollar lawsuit against their previous SEO firm! They’re all crying “Penalties!” I wasn’t penalized for anything; then again, I’m a web designer and I understand how webpages, search engines, and the internet work, so I have no reason to cheat the system, which is exactly what they do, because they know no other way!

    # December 12, 2012 at 11:53 am

    Relevant and frequently updated content will improve your SEO ‘ranking’.

    Certainly, I would not pay any so-called SEO specialist… it’s extremely unlikely (nay, almost impossible) that they have any clue as to how Google’s (the prime example) search algorithm works.

    > Just creating new content is pointless unless that content is dated.

    I disagree with this except to say that any new content should be relevant to your target market/arena.

    Certainly it makes sense to date one’s content, but that’s for the readers’ benefit more than anything else. Google can tell when the page content was loaded/updated; that’s part of the bot’s job.

    # December 12, 2012 at 12:42 pm

    And if nothing else, frequently updated content gets Googlebot to crawl more often, which, as @Paulie_D so kindly pointed out, can help or hurt your rankings depending on the relevancy of the content.

    # December 12, 2012 at 2:30 pm

    SEO has become a complex, over-confused issue, to be honest. But there are a few simple things to think about. Google’s (and others’) algorithms are probably smart enough to pick up on sites deliberately trying to artificially inflate their rankings, and in some cases they penalize them for it.

    If you take that into consideration, it doesn’t actually leave much to do for SEO other than simple best practices and the bits of advice given by the search engines themselves; designers probably forget about these more than they care to admit.

    Watch Chris’s screencast on it; it’s quite good: http://css-tricks.com/video-screencasts/83-thoughts-on-seo/

    # December 12, 2012 at 2:45 pm

    @Paulie_D I think you misunderstood what I was trying to state. I meant that new content does not necessarily rank higher than content from 10 years ago. That said, Google’s Matt Cutts has stated that you shouldn’t leave a website idle with no updates for months on end. HubPages started setting their “hubs” (articles) to idle if they haven’t been updated for too long a period, which makes no sense; they have plenty of new content on a daily basis. It’s best to create new content and update old content when possible.

    However, I know from my own experience that changing your content can cause a page to lose its rank for its keywords. This happens a lot. I’ve seen time and time again on Google’s webmaster forums people complaining that they lost 95% of their organic traffic after redesigning their sites. That isn’t because they changed their site’s appearance; it’s because they changed their site’s content. Any time you change your content, Google takes you through the entire process of matching your content to keywords all over again. This is also part of what caused WhateverLife.com’s organic traffic to plummet earlier this year: they redesigned the site and its overall content (the site also crashed god knows how many times), and it has definitely not been in good shape, that’s for sure.

    Change your content = Lose your rank and start all over again. “If it ain’t broke, then don’t fix it!” That’s my motto.

    I have one website I have made no changes to in 4 years, and it ranks for 28 sets of keywords, even though it’s a localized site consisting of only 4 pages. The HTML even has errors in it and the site looks like crap, but it gets over 1,000 visitors a month from people looking for piano lessons in St. Louis.

    # December 12, 2012 at 6:47 pm

    I think you got them all there; I can’t think of any more. Maybe a meta tag in the header of your page to explain the page to search bots.
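
    If that means the description meta tag, a minimal sketch (the description text is a placeholder):

    ```html
    <head>
      <!-- A meta description summarizes the page for search bots;
           engines often reuse it as the snippet under the result link -->
      <meta name="description" content="A short, human-readable summary of what this page is about.">
    </head>
    ```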
