SEO is an evolving practice that is part art and part science, depending on how you look at it. Over time the search engines have refined the way they grade websites and determine which should be given the highest standing in the SERPs (search engine results pages).
What may have worked three years ago to push a website to the top of the search engines will not work the same today, and most likely will not work the same a year from now.
There is no way of knowing precisely how Google, Yahoo, or Bing calculates its SERPs, but a solid strategy, proven through trial and error, does exist.
For the purposes of this article I will only discuss techniques that would be considered ‘white hat’, or, you might say, ‘officially approved’ by the Internet community.
There are many sources out there perpetuating the idea that you cannot get indexed quickly or rank near the top using only ‘white hat’ SEO. This is simply not true. In many cases site owners actually damage their ranking status and lower their SERP position by trying to use some quick ‘black hat’ trick to get their website into the top position overnight.
In some cases a ‘black hat’ strategy may produce a temporary increase in page rank and SERP position, but in the long run those ‘quick ranking’ tools are only an illusion. Over time the search engines weed out the trash and give high rank and good SERP placement to the websites that genuinely provide quality content, outstanding service, or fill a vacant niche. Usually it is the websites that manage to accomplish all three of these that sit in the number one position.
Ok everyone the lecture is over. Let’s start talking about the things that we can do to get our website ranked high and stay at the top.
First and foremost is the basic design and structure of the website itself. A carefully planned, user-friendly, logical, and straightforward landing page or home page is critical. Remember, this is the page your first-time visitors usually see first, and it is the one the search engines’ bots see first as well.
‘You never get a second chance to make a first impression.’
Actually, I tend to believe that the search engines are far more forgiving than the typical Internet customer. The search engines’ bots will keep coming back to crawl your site no matter how badly it sucks. Your website could be full of broken links, scripting errors, spelling errors, and grammatical mistakes, and the bot doesn’t care at all. Well, actually it does care; it reports all of these things back to search engine HQ.
Because it is only a bot, and bots do not have feelings, it does not hold a grudge for long; in a few days, or sometimes weeks, it will come crawling back to you again (and yes, the pun is intended).
In all seriousness, however, if a new ‘human’ guest lands on your site for the first time and runs into those same kinds of issues: broken links, scripting errors, misspelled words, or poorly worded content, they are more than likely not going to spend the time to come back and see whether the website has been updated or corrected. That particular ‘potential customer’ has just become a ‘statistic’ (definitely in the red and not in the black).
You will hear a lot of talk about ‘meta tags’ and ‘meta descriptions’. These are words or phrases included in the website’s code that are visible only to the search engines. The meta description, however, is something many people see quite frequently without knowing what they are looking at.
What is placed in a web page’s meta description is normally what the search engine will display in the area just below the website’s title in the SERP.
Using ordinary English and being descriptive in the meta description on each page of the website will result in a good-looking SERP listing.
Meta keywords are also important. These are the keyword phrases that best target the specific page on which they are used. They should be targeted and related to the content contained within the page body.
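To make this concrete, here is a rough sketch of how the meta description and meta keywords might look inside a page’s head section. The business name, title, and keyword phrases below are invented purely for illustration:

```html
<head>
  <title>Alamo City PC Repair - Computer Repair in San Antonio, TX</title>
  <!-- The description is what the search engine typically shows below the title in the SERP -->
  <meta name="description" content="Fast, affordable computer repair and data recovery in San Antonio, TX. Free diagnostics and same-day service on most laptop and desktop repairs.">
  <!-- Keyword phrases targeted to this specific page's content -->
  <meta name="keywords" content="computer repair, data recovery, laptop repair, San Antonio">
</head>
```

Notice that the description reads like an ordinary, descriptive sentence rather than a list of keywords; that is what produces the good-looking SERP listing mentioned above.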
Keyword density refers to the ratio of how many times a specific keyword phrase appears within a body of content to the total number of words in that content. For example, a keyword phrase that appears five times in a 500-word article has a keyword density of 1%. In the early days of SEO this number was critical to ranking high in the SERPs. Today it is much less important; while the appearance of a targeted keyword within article content still matters, keyword phrases should be used in a natural-sounding context and not ‘overloaded’ or ‘stuffed’ into the content.
Other practices that should be completely avoided are the use of hidden content and the use of content entirely unrelated to the overall website subject matter. Embedding very frequently searched keywords that have no relationship to the actual website was once a common and effective way to increase SERP rank. This does not work today, so do not be tempted to try it. Some Internet marketers might try to convince unwary website owners to engage in this technique; do not bother, as it will not be effective.
Use the H1 header tag to focus each web page on a particular keyword phrase. Embedding the keyword phrase that best summarizes the content of a page in its H1 header tag helps the search engine organize the content of your website and rank the importance of each page. And, of course, if you have chosen a topically relevant and appropriate keyword phrase, it will help your website guest know they have found the right page.
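As a small illustration (the keyword phrase and page copy here are hypothetical), embedding a page’s target phrase in the H1 tag might look like this:

```html
<!-- One H1 per page, carrying the keyword phrase that best summarizes this page's content -->
<h1>Data Recovery Services in San Antonio</h1>
<p>We recover lost files from failed hard drives, corrupted memory cards, and damaged laptops.</p>
```

The body copy that follows the H1 should then actually deliver on that phrase, so the heading and the content reinforce each other.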
Maintaining a structured, easy-to-navigate, uncluttered, and not overly complicated web design results in lower bounce rates. A confusing or disorganized design will turn off most Internet surfers very quickly. The typical web surfer expects a website with information that is easy to find, well organized, and truly informative.
In designing any website, probably the most important thing to keep in mind during the actual development of the content is to include as much useful, high-quality, informative material about the business, industry, or focal point of the website as possible. The more original content that appears within the body of the website, the better the search engines will like it. Your target market will appreciate it as well.
In addition to highly descriptive content about the products and services a website provides, a glossary of terms related to the industry the website specializes in is a good addition, as is a FAQ (frequently asked questions) page.
When it comes to SEO, there can never be too much information included in a website, as long as it is all completely original, unique content (or, where appropriate, references are duly noted).
Posting articles with content that relates to the target website, and then including links with your keyword phrase as the anchor text, either embedded within the article or as part of the author’s signature, is an excellent way of building high-quality back-links.
Blogs are another great place to submit articles, where there is a good chance a posted article will be viewed by a large audience. Blogs also allow the author to receive feedback through comments left by readers.
This brings us to the topic of links, and more specifically ‘back-links’. The term back-link refers to a hypertext link, located on an external website, that points back to one’s own website. A back-link is made up of two parts:
1. The ‘URL’ of the destination site (the site the link points to).
2. The ‘anchor text’, which is made up of the ‘keyword phrase’.
An example of a ‘back-link’ written in HTML code would look like the following:
<a href="http://www.your-website.com/">Your Keyword Phrase</a>
When displayed, this link would appear as:
Your Keyword Phrase
‘Your Keyword Phrase’ would be an active link pointing to the URL that follows the href attribute in the link above. (The URL shown here is only a placeholder; substitute your own website’s address.)
To help prevent the misuse of link farming and other types of link manipulation, the ‘nofollow’ attribute was created and implemented. When applied to a link, it directs the search engine to award no PR, or ‘PageRank’, benefit to the landing site, no matter how high the PageRank of the referring site may be.
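For illustration, here is the difference between an ordinary link and one carrying the ‘nofollow’ attribute (the URL and anchor text are placeholders):

```html
<!-- An ordinary link: passes PageRank benefit to the destination site -->
<a href="http://www.example.com/">Your Keyword Phrase</a>

<!-- A nofollow link: tells the search engine to pass no PageRank benefit -->
<a href="http://www.example.com/" rel="nofollow">Your Keyword Phrase</a>
```

Many blog platforms automatically add rel="nofollow" to links left in comments, which is exactly the kind of link manipulation the attribute was designed to curb.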
The attribute has been only somewhat effective in reducing the spamming and ‘spamdexing’ used to try to boost SERP position. Still, the practice of simply blasting back-links out all across the Internet is on the decline as its effectiveness diminishes.
There is, however, some benefit to having a good number of back-links inbound to the target site. Today it is more a matter of how to acquire numerous high-quality back-links than simply an exercise in who can accumulate the most.
I own a Computer Repair and Data Recovery business in San Antonio, TX. I spent 10 years in database development and the past seven mainly repairing and servicing hardware. However, I am now also offering Website Development, Internet Marketing, SEO, and Hosting.
Author: donmiller