SEO factors affecting web visibility in search results?
# December 12, 2012 at 9:37 am
What factors do you know of that affect how well a site ranks in search results?
Please add what you know; here is what comes to my mind:
* page title
* H1 headings
* heading “weight” (h2 – h6)
* text on page (keyword density)
* meta data (keywords in code)
* code structure (clean markup is better than broken markup)
* robots.txt
* XML sitemap
* age of domain (does a 1-month-old domain rank worse than a 10-year-old one?)
* number of on-page links
* number of visitors coming to the site
* backlinks leading to the site

# December 12, 2012 at 10:35 am
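Several of the on-page items in the opening list (page title, meta data, heading weight) can be sketched in one minimal page skeleton. This is only an illustration; the title text, description, and headings are placeholder assumptions, not a recommendation of specific wording:

```html
<!DOCTYPE html>
<html lang="en">
<head>
  <!-- Page title: one of the strongest on-page signals in the list above -->
  <title>Piano Lessons in St. Louis | Example Studio</title>
  <!-- Meta description: keywords belong here and in the visible content,
       not hidden in small text below the footer -->
  <meta name="description" content="Beginner and advanced piano lessons in St. Louis.">
</head>
<body>
  <!-- A single h1 that matches the page topic -->
  <h1>Piano Lessons in St. Louis</h1>
  <!-- Lower-"weight" headings (h2–h6) structure the rest of the content -->
  <h2>Beginner courses</h2>
  <p>Keyword-relevant body text lives here, at a natural density.</p>
</body>
</html>
```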
At the moment, having fresh current content is a pretty big thing. If you have a blog or something on your site, it’s usually a good idea to at least display your most recent article or articles somewhere on there.
For the company I work for, we have found it helpful to include an overall site/page description with keywords/phrases in small text below the footer. SEO is always changing, and I’ve found this blog to be a great resource for keeping up: http://searchengineland.com/

# December 12, 2012 at 11:41 am
Code structure does matter! If you create link menus, say sidebars full of links wrapped in `h1` tags, search engines will penalize you, because it looks like you are trying to game the system; Google has stated they take action against this. Anyone can build a website, but are they adhering to search engine guidelines? I think that is the real question.
@jkinney “keywords/phrases in small text below your footer”? That is what Google’s Matt Cutts calls keyword spamming, and Google is seeking to de-rank sites that do it too much. Your keywords should be directly in your content, not obscured from it! If your webpage is about Pacman, then talk about Pacman in your page’s headings, paragraphs, etc., not below the footer of the page.
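The two points above can be contrasted in markup. This is a hypothetical sketch (the page names and URLs are made up): heading tags used purely to inflate link weight versus a plain navigation list with keywords in the real content:

```html
<!-- Risky: heading tags used only to boost menu links -->
<h1><a href="/pacman">Pacman</a></h1>
<h1><a href="/ghosts">Ghosts</a></h1>

<!-- Safer: a plain list for navigation, keywords in visible content -->
<nav>
  <ul>
    <li><a href="/pacman">Pacman</a></li>
    <li><a href="/ghosts">Ghosts</a></li>
  </ul>
</nav>
<h1>Pacman strategy guide</h1>
<p>Talk about Pacman here, in the page's actual headings and paragraphs.</p>
```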
And just my opinion, but Search Engine Land is worthless. Sites like that are full of self-proclaimed know-it-alls who do nothing but rant on and on about keywords, over and over again. They are SEO specialists, but most of them don’t know web design. How does that work, when every webpage in existence uses HTML and CSS as its underlying technology? If SEO specialists are so smart, then how come they’re all getting sued since Google’s recent Panda and Penguin updates took effect? JCPenney has filed a multi-million-dollar lawsuit against their previous SEO firm! They’re all crying “penalties!”

I wasn’t penalized for anything; then again, I’m a web designer and I understand how webpages, search engines, and the internet work, so I have no reason to cheat the system, which is exactly what they do because they know no other way!

# December 12, 2012 at 11:53 am
Relevant and frequently updated content will improve your SEO ‘ranking’.
Certainly, I would not pay any so-called SEO specialist…it’s extremely unlikely (nay, almost impossible) that they have any clue as to how Google’s (for the prime example) search algorithm works.
> Just creating new content is pointless unless that content is dated.
I disagree with this, except to say that any new content should be relevant to your target market/arena.
Certainly it makes sense to date one’s content, but that’s for the reader’s benefit more than anything else. Google can tell when the page content was loaded/updated; that’s part of the bot’s job.

# December 12, 2012 at 2:30 pm
SEO has become a complex, over-confused issue tbh. But there are a few simple things to think about. Google’s (and others’) algorithms are probably smart enough to pick up on sites deliberately trying to artificially inflate their rankings, and in some cases they penalize them for it.
If you take that into consideration, it doesn’t actually leave much to do for SEO other than simple best practices and the bits of advice given by the search engines themselves; designers probably forget about these more than they care to admit.
Watch Chris’s screencast on it; it’s quite good: http://css-tricks.com/video-screencasts/83-thoughts-on-seo/

# December 12, 2012 at 2:45 pm
@paulie I think you misunderstood what I was trying to say. I meant that new content does not necessarily rank higher than content from 10 years ago. That said, Google’s Matt Cutts has stated that you shouldn’t leave a website idle with no updates for months on end. HubPages started setting their “hubs” (articles) to idle if they haven’t been updated for too long a period, which makes no sense, since they have plenty of new content on a daily basis. It’s best to create new content and, where possible, update old content.

However, I know from my own experience that changing your content can cause a page to lose the rank it held for its keywords. This happens a lot. I’ve seen time and time again on Google’s webmaster forums people complaining that they lost 95% of their organic traffic after redesigning their sites. This isn’t because they changed their site’s appearance; it’s because they changed their site’s content. Any time you change your content, Google takes you through the entire process of matching your content to keywords all over again. This is also part of what caused whateverlife.com’s organic traffic to plummet earlier this year. They redesigned the site and its overall content (the site also crashed god knows how many times), and it’s definitely not been in good shape, that’s for sure.
Change your content = Lose your rank and start all over again. “If it ain’t broke, then don’t fix it!” That’s my motto.
I have one website I have made no changes to in 4 years, and it ranks for 28 sets of keywords, yet it’s a localized site consisting of only 4 pages. The HTML even has errors in it and the site looks like crap, but it gets over 1,000 visitors a month from people looking for piano lessons in St. Louis.