Search Engine Optimization Strategy

Introduction to Search Engine Optimization

The goal of search engine optimization and submission is to attract targeted traffic by attaining high positions in the search results. This is done by targeting the keywords most relevant to the content of your site. This article suggests strategies to optimize your web site and improve its search engine ranks for the appropriate keywords, together with search engine optimization and placement advice and tips.

In the past, it was enough to tweak some of your meta tags and certain on-the-page factors to attain top search engine ranks. Today, search engine optimization is not that simple. It is no longer enough to modify meta tags and submit your site to the search engines. When optimizing a web site, there are a great many factors to keep in mind.

If you are new to the concept of search engine optimization and placement, I would advise you to read the basics first. Click on this link for the concepts of Search Engine Optimization.

The foremost requirement for a web site to attain and retain top search engine ranks over a long period is great content. If your site is not rich in quality content, add content to make it one. One of the final and crucial steps of a search engine optimization campaign is the proper submission of your web site to the major search engines.

The in-between steps are pure search engine optimization. These steps are critical to obtaining first-page ranks; most sites unknowingly ignore them, but without them you are destined to be a search engine failure.

Remember – there are no search engine optimization secrets – just optimization methodologies to follow. In order to beat your competitors and obtain first page ranks in the search engine results for your most desired keywords you have to follow certain steps.

How the search engines work

The term “search engine” is commonly used to describe both crawler-based search engines and human-powered search directories.

Crawler-based search engines, such as Google or Altavista, use software programs (called robots, crawlers or spiders) to create their index. These programs crawl the world wide web to accumulate information about new or modified sites. When people search, they are actually searching these indexes.

Human-edited search directories, such as Yahoo, ODP or Looksmart, depend on human editors to include web sites. They maintain a hierarchical structure of directories and sub-directories, called categories and sub-categories, and web sites are submitted to their relevant categories.

However, some search engines present mixed results, sometimes drawing on human-edited directories and at other times on a crawler-built index. These are sometimes termed hybrid search engines. Yahoo presents mixed results to its surfers, giving preference to one type of listing over the other depending on the search query: it prefers to provide human-powered results for popular key terms, and presents crawler-based results (powered by Google) for search terms that are cryptic or less popular.

Understanding how the search engines rank web pages

When a surfer conducts a search, the search engine software scans through its entire index and returns a list of web pages that it deems relevant to the search query, ranked in descending order of relevance and popularity. This happens in a fraction of a second. Each search engine has its own algorithm for ranking web pages.

The algorithms that the search engines use to determine the relevancy of a page, when confronted with hundreds of millions of web pages to sort through, are closely kept trade secrets. In spite of the differences in their algorithms, they still follow a set of common rules. I will discuss the common factors in this section of the article.

If you are interested in how a particular search engine works, see the separate ins-and-outs articles on the individual search engines (Altavista, AOL, Fast/Alltheweb, Google, Hotbot, Inktomi, MSN and Overture).

However, it is advisable to read this article first to get a detailed overall picture of the major search engines, and then move on to the details of specific ones.

One of the main on-the-page factors that determine the relevance of a web page to a search query is called the location-frequency factor.

Location means the different parts of the web page where the keyword is present. Frequency is the number of times the keyword was mentioned in the web page. The search engines evaluate the locations as well as the frequency of the keywords to analyze the relevance of the page. The web pages with a higher frequency are often deemed more relevant than other web pages.
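
The idea above can be sketched in a few lines of code. This is purely an illustration: the location weights below are invented for the example, and no search engine publishes its real values.

```python
# Illustrative location-frequency scoring. The weights are assumptions
# made up for this sketch; real engines keep their weighting secret.
LOCATION_WEIGHTS = {
    "title": 3.0,             # keywords in the title tag count most
    "heading": 2.0,           # heading tags
    "first_paragraphs": 1.5,  # opening text of the body
    "body": 1.0,              # remaining visible text
}

def location_frequency_score(page_sections, keyword):
    """page_sections maps a location name to the text found there."""
    keyword = keyword.lower()
    score = 0.0
    for location, text in page_sections.items():
        frequency = text.lower().split().count(keyword)
        score += LOCATION_WEIGHTS.get(location, 1.0) * frequency
    return score

page = {
    "title": "Australia travel guide",
    "heading": "Travel deals",
    "body": "Plan your travel across Australia with our travel tips.",
}
print(location_frequency_score(page, "travel"))  # → 7.0
```

The same keyword is worth more in the title than in the body, so two pages with identical keyword counts can score quite differently.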

So, what are these special locations of a web page that are important in search engine optimization?

Titles, Meta descriptions, Meta keywords, Headings,
Alt tags, Anchor texts, First two paragraphs

Pages with the keywords appearing in the title tag are often assumed to be more relevant to the search query than others. Remember, the title tag is the most important location of a web page after the main content of the body tag. Almost all search engines give importance to this tag.

Some search engines also give importance to the presence of the keyword in the meta description and keyword tag.

The next most important location, to which all search engines give importance, is the body tag. The main content of your web page should contain the keyword a considerable number of times to make the search engine spiders consider the content relevant to the search query. The first two paragraphs are especially important in search engine optimization.

The headings of the web pages are also deemed as important locations by some search engines. The other important locations are – the text in the alt tags of the images and the anchor tags of the hyperlinks.

Almost all the major search engines follow the above-mentioned location-frequency algorithm to some extent, but no two search engines do it the same way. Some search engines index more web pages than others, and no two have exactly the same collection of web pages to search through. This is why the same search conducted on different search engines produces different results.

In the early days of the search engines, meta tags were assumed to be the secret to propelling a web site to the top of the results. But today not all search engines read meta tags. Meta tags can still be part of the ranking algorithm, but they are no longer the dominant factor in deciding relevance.

However, the search engines may also penalize web pages, or ban them from their index, if they detect spamming. Search engine spamming is the practice of using unethical tricks to rank high in the results, thereby manipulating the search engine into producing irrelevant or low-quality results. Common examples include unnecessary repetition of keywords and hiding text by using a font color identical to the page's background color.

Click here for a detailed article on search engine spamming.

The programmers who created the search engine software are aware that many web masters make use of spam techniques in order to get their low-quality web sites ranked high in the search engine results. Spammers sometimes go to the extent of creating their web pages by reverse engineering the location-frequency algorithm.

In order to prevent spam, almost all search engines use certain off-the-page factors in their ranking algorithm. These are the factors that are not easy to influence, but are important in search engine optimization.

The major ones among them are link analysis and click-through measurement. In a link analysis system, only the most relevant and popular sites attain a high search engine position. Link analysis is a very useful and reliable way to determine which pages are good for particular key phrases: it is a mathematical way to determine whether other web sites related to your industry think that your site is important.

Link analysis is the analysis of the number and quality of the incoming links pointing to your site. It goes beyond a simple count: it is not just the number of links that matters, but the quality of those links.
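
A toy comparison makes the point. The quality ratings below are hypothetical scores assigned for illustration; no search engine exposes such numbers directly.

```python
# Two sites compared by inbound links. Quality is an assumed 0-10
# rating of each linking site, invented for this example.
def link_popularity(inbound_links):
    """inbound_links: list of (site, quality) pairs."""
    return sum(quality for _site, quality in inbound_links)

site_a = [("big-directory.example", 9), ("trade-journal.example", 8)]
site_b = [("linkfarm1.example", 1), ("linkfarm2.example", 1),
          ("linkfarm3.example", 1), ("linkfarm4.example", 1)]

print(len(site_a), link_popularity(site_a))  # 2 17
print(len(site_b), link_popularity(site_b))  # 4 4
```

Site B has twice as many inbound links as Site A, yet scores far lower once the quality of the linking sites is considered.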

Click here for a detailed article on link popularity analysis.

Click-through, or click popularity, is the number of times your web site gets a click when it is displayed in the search engine results. It also depends on how much time the visitors who come to your site from the search results spend on your web site.

These days, many crawler-based search engines as well as human-edited directories seem to use click popularity within their ranking algorithm. The objective behind click-through measurement is to determine which web sites satisfy the queries of the surfers.
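
As a rough illustration of the idea, a click-popularity measure might combine the click-through rate with the time visitors spend on the site. The formula and the one-minute cap below are invented for this sketch; they are not any engine's actual metric.

```python
# Hypothetical click-popularity measure: click-through rate damped by
# how long visitors stay. Formula and cap are illustrative assumptions.
def click_popularity(clicks, impressions, avg_seconds_on_site):
    ctr = clicks / impressions if impressions else 0.0
    dwell_factor = min(avg_seconds_on_site / 60.0, 1.0)
    return ctr * dwell_factor

# 30 clicks out of 1000 listings shown, visitors staying two minutes:
print(click_popularity(30, 1000, 120))  # → 0.03
```

A listing that gets clicks but whose visitors leave within seconds would score lower than one that both gets clicked and holds its visitors.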

Search Engine Optimization Budget – Pay-For-Inclusion charges

We, at Search Engine Optimization Ethics, suggest that any web master who wants to launch a new commercial web site, or promote an existing one, should set aside a search engine submission budget for the online promotion of his business.

When an entrepreneur wants to start an offline business, he budgets for certain essential expenditures, like business registration, a trade license, a professional tax certificate, etc. These are the startup costs of an offline business. Similarly, an online business has startup costs, like domain registration and web hosting charges.

In the recent past, this was enough. But nowadays people have realized that search engine visibility is the lifeblood of any online business, so it is wise for any site owner to establish a search engine optimization budget.

However, I would like to remind you that free search engine submission is still possible; many web masters depend on it entirely, and many of them get through too. The paid programs simply accelerate the indexing process and will almost certainly start generating traffic from the search engines sooner.

The most essential: Yahoo

The one search engine submission cost you must consider is submission to the Yahoo! directory. Yahoo! charges an annual fee of $299 for inclusion, which easily pays for itself in traffic for most sites.

Remember that this fee does not guarantee inclusion; it guarantees that your site will be reviewed by a Yahoo! editor within seven days and that you will get a reply of acceptance or rejection. If you are rejected, you will be given the reason and a one-month period in which to appeal. As a matter of fact, most sites using the paid inclusion program get through if they follow the editorial guidelines.

Again, you can get listed for free if you are a non-commercial site, but this can sometimes take months.

Crawler submission budget

Usually, web masters interested in getting their sites listed in the major search engines want to become visible quickly. If you can afford a little more money, you may consider the paid submission programs of some of the major crawlers, like Inktomi, Altavista and Alltheweb. These paid inclusion programs will get you included in their respective indexes within a few days. The programs guarantee inclusion in the index for a year (Altavista’s express inclusion is for a term of six months), but be aware that they do not guarantee rankings.

Design for the search engines from the beginning

More than 85% of the Internet population finds new web sites through the search engines. Still, most web masters do not take the search engines into account when creating their web sites, so their sites fail to capture the search engine traffic they could receive with little extra effort.

In this section, I point out certain design strategies that may prove to be roadblocks on the path to search engine optimization success. Imagine spending several weeks designing a web site and then, when you plan to optimize it, being told by a search engine optimization consultant that you have used a methodology that prevents the search engine spiders from reading your site. In that situation, you might have to re-design the entire web site to make it visible to the spiders.

I believe prevention is better than cure. In this section, I warn you of certain design constructs that you should avoid.

But what if you have a web site which already has used one of these constructs? Do not panic. There are solutions. I will cover those as and when required.

But if you are in the process of creating, or plan to create, a web site, take our tips.

Many web sites make use of dynamically generated pages built with Perl, ASP or ColdFusion. These technologies work great from a user standpoint, but from a search engine standpoint they can be difficult: they often produce URLs that contain a ? symbol, which tends to stop the spiders from crawling those pages.

So, if you manage to get rid of the ? in your web site addresses and make them look like static URLs, you will do a tremendous service to the promotion and sales of your company.
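
In practice this is usually handled by server-side URL-rewriting rules, but the mapping itself is simple. The sketch below, using a hypothetical product.asp URL and parameter names, shows the kind of translation from a dynamic address to a static-looking one:

```python
# Sketch: turning a dynamic, query-string URL into a static-looking
# path. The URL pattern and parameter names are made up for the example.
from urllib.parse import urlparse, parse_qs

def to_static_url(dynamic_url):
    parts = urlparse(dynamic_url)
    params = parse_qs(parts.query)
    # e.g. /product.asp?cat=shoes&id=42 -> /product/shoes/42.html
    page = parts.path.rsplit("/", 1)[-1].split(".")[0]
    segments = [values[0] for values in params.values()]
    return "/" + "/".join([page] + segments) + ".html"

print(to_static_url("http://example.com/product.asp?cat=shoes&id=42"))
# → /product/shoes/42.html
```

The real work is then configuring the web server to translate the static form back to the dynamic script, so that both the spiders and the application are satisfied.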

Click here for detailed information on search engine optimization of dynamic web pages.

Frames can be another roadblock. If you are creating a web site, stay away from frames from the very beginning. If you have already used them, getting rid of them is the best idea. Sometimes frames can make the web site easily navigable. But they create significant problems for the search engine spiders.

The first and best suggestion is to get rid of the frames. But if you insist on keeping them for some reason, optimize them. Click here for detailed information on how to optimize a framed site and reduce the impact of frames.

In the past, when web masters designed their sites, they usually checked how the pages would look in the popular browsers, making sure they looked impressive in all of them. As the search engines gain importance for online businesses, clever web masters will also keep the search engines in mind while designing their pages.

Planning Your Search Engine Optimization Strategy

In this section, I will guide you on how to plan search engine optimization for your web site. I will not recommend ways to trick the search engines into granting top ranks. In fact, there are no search engine secrets. But a number of small changes in your web site and submission methodology can produce big results.

Research and analysis of the scope of promotion

The foremost step in optimizing a web site is to study the nature and scope of the business. A lot of brainstorming is involved. This step gives you an insight into the scope and aspects of promotion of your web site. It later becomes useful in the different stages of search engine optimization strategy.

Brainstorm a little and think about the web sites that are not in direct competition with you but whose businesses are in some way related to yours. This becomes very useful in link building and, to some extent, in keyword research too.

Suppose you have a shoe shop. Your direct competitors are other shoe shop or company owners, who might be reluctant to link to you. But a shoe manufacturer might not be so reluctant. If you brainstorm a little more, you may find that links from web sites discussing the latest trends in footwear could also benefit your site.

Assume you have a web hosting business and want to boost the link popularity factor. Links from other web hosting companies would certainly be useful, but getting links from competitors is always difficult, though not impossible. Instead of confining yourself to other web hosting companies and trying without much success, you may request links from domain registration companies, web design companies and perhaps search engine optimization companies. The chances are that they will link to you more readily.

In both examples, a little thought uncovers more places to request links for your web site. The insight you gain from this step helps not only in link popularity but also in keyword research and content building.

Let us consider the first example. If the web master finds that keywords related to the latest trends in footwear are popular, that the competition for them is not very tough, and that it would be easy and beneficial to rank for them, he might choose some of those keywords and develop a new section of information-rich pages around them. When these pages rank in the search engines, visitors will follow the links within them to the other parts of the site, such as the homepage.

Now the other example. If the web master finds that keywords related to domain registration are very popular and the competition is less severe than for web hosting, he might choose some popular domain registration keywords and develop a few information-rich pages on the subject. This section will then draw traffic to the other parts of the web site.

Identifying your competition

I have mentioned earlier in this article that the objective is to beat your competitors and obtain first page ranks in the search engine results. Identifying them and reviewing their optimization strategies from time to time helps you throughout the search engine optimization process.

For this, you need an idea of who your competitors are. Reviewing their sites and promotion techniques from time to time will give you many new search engine optimization ideas and tips, which might help you in your online business.

How do you identify your competitors? Use the search engines. Search with the keywords that you expect surfers to use to locate your web site. The web sites that are ranked in the top positions are the competitors whom you have to beat to reach the top.

How does this help? It helps when you are trying to build links to your web site and do not know whom to ask for a link. Suppose you search with your most desired keyword and come across a few sites whose link popularity scores are very high. Find out how they scored so high in the link analysis system. You can then implement those ideas on your own site and improve your ranks.

How do you know which sites are linking to your competitors’ sites? Again, use the search engines. Analyze how and why those sites link to your competitors. You can incorporate the same strategies.

Sometimes you will find that your competitors offer a number of informative articles that other sites may republish for free, provided they include a link back to the original site. At other times, you may find that a competitor’s newsletter is its main attraction, and that sites without a newsletter of their own love to link to it.

In this way you can take ideas from your competition. But if a competitor is doing something that can be termed spam, do not implement it even by mistake. Forget about that competitor right away; sooner or later the search engines will detect the spam and penalize them.

Click here for detailed information on search engine spamming.

Picking the target keywords

Choosing the appropriate keywords for your web site is a very important step of the search engine optimization process. The keywords you pick here are used throughout the entire process.

So, what are the target keywords? Obviously they will vary from site to site. The first step is to brainstorm and make a list of the terms your probable customers are likely to search with when they look for a product like yours.

Imagine you have a travel agency operating in Australia, called John Travels. Apply your thought processes to make a list of keywords that net surfers interested in travelling to Australia would search with, and ignore any one-word keyword that comes to your mind, e.g. travel.

One-word keywords should be avoided because they are hyper-competitive. Instead of wasting your effort on them, concentrate on keywords that are easily manageable: two- or three-word phrases. Examples of three-word keywords are:

travel to Australia
tourism in Australia

So, after going deep into the minds of your probable customers, you come up with a list of keywords which they might be searching with.

travel australia
travel agencies australia
tourism australia

Now you have the initial list of keywords. Next, I will work out the other details needed to arrive at the final list.

You still need to be sure about your choices, and you need to check how popular your chosen keywords are. Otherwise you may end up with unpopular keywords; perhaps only one surfer has searched for a particular keyword in the last five years. The idea is to come up with keywords that are relevant to your business and popular among the net surfers.

Another thing to consider is the number of competitors for those keywords. The best keywords are those with high popularity and fewer competitors, and it takes a lot of time, skill and study to find them. This step belongs to the initial stages of search engine optimization; if you start it after the design is complete and the content has been written, it will be too late.

Fortunately, there are services on the Internet, built on proprietary databases, that let you access the search history of the net surfers. Each of these services can provide you with concrete numbers that indicate the effectiveness of a keyword. The balance between a keyword’s popularity and the number of web sites competing for it ultimately decides its effectiveness, though the final choice of keywords can be quite subjective.
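
One concrete number of this kind, popularised by Wordtracker, is the Keyword Effectiveness Index (KEI): popularity squared divided by the number of competing pages. The search counts and competitor figures below are invented for illustration.

```python
# KEI: searches squared divided by competing pages. A higher KEI means
# a better popularity-to-competition balance. All numbers are made up.
def kei(searches, competing_pages):
    if competing_pages == 0:
        return float(searches ** 2)
    return searches ** 2 / competing_pages

candidates = {
    "travel australia": (900, 400_000),
    "travel agencies australia": (150, 9_000),
    "tourism australia": (600, 120_000),
}

ranked = sorted(candidates, key=lambda k: kei(*candidates[k]), reverse=True)
for keyword in ranked:
    print(keyword, round(kei(*candidates[keyword]), 3))
```

Note how the most searched phrase is not the most effective one: a moderately popular keyword with few competitors can beat it.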

Some of the most popular and free keyword research tools

WordTracker Keyword Research and Popularity Tool
Compiled database of keywords that people search for. Tells how often people search for the term and how many competing sites use that same term on a search engine. Helps measure keyword effectiveness.

Google AdWords Keyword Suggestions
7Search Keyword Suggestion Tool
This tool generates keyword popularity by the number of times that a keyword was searched in the 7Search database the previous month. It also displays related terms and their popularity.

Overture Keyword Suggestion Tool
Displays how many times a certain keyword was searched for in a given month. Shows all related searches for the entered keyword.

The best keyword research tool as of today is Wordtracker. Let us discuss a little how this tool works.

Step 1: Pick the first keyword from your initial list and test it at Wordtracker. It will return a list of keywords that partially or fully contain the phrase you entered.

Step 2: Click on the first keyword in the list to generate related keywords that are also popular. Use your own judgment to choose those that are appropriate for your business. Take a piece of paper, label it “Final List”, and note down the appropriate keywords.

Step 3: Repeat step 2 for all the keywords in the returned list.

Step 4: Repeat steps 1-3 for all the keywords in your initial list.

Now you have a list of keywords that are extremely popular among the net surfers, but a little editing is still required: check the number of competitors for those keywords in the search engines. Instead of visiting every search engine, you may choose one and treat it as the standard. Let us choose Google.

Visit Google, search with each keyword from your list, and check the number of competitors for each. If the number of competitors for a keyword seems too high for you to beat, and you have other, moderately competitive keywords on your list, you may decide to exclude it.

This gives you your final list of keywords, where each keyword is extremely relevant to your web site, remarkably popular, and one for which you are confident you can beat your competitors.

How to use your title tag and meta tags

The title tag is the most important tag in search engine optimization after the actual visible content of the web page. All search engines make use of the text in this tag in their ranking algorithm.

Let us see how to create a title tag that generates results. Our goal is a keyword-rich title that will also entice surfers to click on your web site’s link. Remember to use only those keywords that appear in the visible textual content of your page.

When you start creating the title tag of your web pages, first revisit the final list of your keywords. You will then have some idea of the words that you are going to use.

Have a look at this title:

How does it look? The foreground color used here is the general color the search engine uses for links. To make the title stand out, the site owner has used all uppercase characters; however, this has only made it harder to read.

If your services are centered on a particular locality, state or country, you should try to focus on targeted traffic based on location. Our example travel agency is based in Australia and offers travel packages within Australia to a global market.

With that in mind, let’s try to figure out what the web surfers who are interested in travelling to Australia would type in the search box:

travel Australia
travel agencies Australia
tourism Australia

Our objective should be to combine all the keywords in order to create an attractive phrase. Remember that the title tag is the first impression that your visitors have of your web site, so the marketing attitude should be the dominating factor. Let us see how many keywords I could combine without hurting the attractiveness of the words.

Australian travel agency – John Travels

I am happy with this title, even though I could not include the last keyword, as I had to keep it attractive. I started the title with Australian, which increases our chances of ranking highly when a search query is Australia-specific.

Web masters sometimes spend more time and effort editing the meta tags than is needed. Meta tags are important, but they are definitely not a dominant factor in a web site’s ranking. You can, however, use their potential to the fullest if they reinforce the visible textual content of your web page.

Simply inserting a keyword in your meta tags does not imply that you will attain high search engine positions for that keyword. The meta tags can just be used to add more support to the actual text of the web page.

It is highly recommended that you add meta description and keyword tags to the HTML code of your web pages. Some search engines read them and may give a slight boost to pages that contain them.

They belong in the head section of an HTML page, between the <head> and </head> tags. If you have any JavaScript on your page, it is advisable to place the meta tags before it, as the search engines give more importance to text that appears higher up in the page source.

Remember, meta tags are mainly a design tool for helping web pages with very little textual content to be better acknowledged by the search engines.

The meta description tag is a snippet of HTML code, usually placed after the title tag and before the meta keyword tag. Its syntax is:

<meta name="description" content="the text you want to put in the description tag.">

This meta tag fulfills two objectives. First, the text placed within it is given some weight by most search engines, and can help a page rank slightly higher in the search results.

Second, if no meta description tag is used, the search engines will often take the first few lines of visible text on the web page as the description shown in the results; if the tag is present, they use its text instead. The tag thus lets you control the description of your site in the search engine results.

The description should definitely use the page’s important keywords and should be written so that it tempts surfers to click through to your web site. Think of it as a marketing tool as well as a search engine optimization tool.

Suppose you have a web page without the meta description tag. The title of the page says “John’s Shoe Shop”, the first visible text on the web page is a <h1> tag that says “Welcome to John’s shoe shop!” and some navigational links after that. The link of this web page will look like this in the search results:

John’s Shoe Shop
Welcome to John’s shoe shop!

This listing is unappealing. Let us see how to make it attractive.

Let us suppose that this page is about sports shoes, fancy shoes for ladies and kids, footwear fashion, etc. You can re-write the description tag.

<meta name="description" content="Every kind of footwear – sports shoes, fancy shoes at reasonable rates. All shoes are of the latest footwear fashion.">

Now your web site link will look something like this in search engine results:

John’s Shoe Shop
Every kind of footwear – sports shoes, fancy shoes at reasonable rates. All shoes are of the latest footwear fashion.

Most search engines index approximately 200 characters of the meta description tag. Do not repeat words in this tag; you may, however, use different forms of words, such as plural and singular forms or different tenses. Write the tag as an actual sentence rather than a series of keywords. Limit it to one good descriptive sentence; if you use two, make sure they are short.
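
A quick script can check a draft description against the 200-character figure mentioned above. The limit parameter is just that figure; real engines vary, so treat it as a guideline.

```python
# Check a meta description against a character limit and, if it is too
# long, trim it at the last whole word that fits.
def check_description(text, limit=200):
    """Return (possibly trimmed text, fits_within_limit)."""
    if len(text) <= limit:
        return text, True
    return text[:limit].rsplit(" ", 1)[0], False

desc = ("Every kind of footwear – sports shoes, fancy shoes at reasonable "
        "rates. All shoes are of the latest footwear fashion.")
trimmed, ok = check_description(desc)
print(ok, len(trimmed) <= 200)  # → True True
```

Running your candidate descriptions through a check like this before publishing saves you from having the engines cut them off mid-word.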

Going over the limit does not mean that your tag will be discarded; the search engine simply ignores the text beyond the limit. Still, stick to the limit to be on the safe side.

The first words of any tag are given more weight than later ones, so put the important keywords first. Try to lead with the same keywords you use in the title tag.

In the early days of search engine optimization, the meta keyword tag was a great tool that search engines used to help determine how to rank sites. But soon people started abusing the tag by stuffing it with keywords, sometimes even irrelevant ones, if they happened to be popular on the Net. This brought untargeted visitors and, in the end, was good for neither the search engines, the net surfers, nor the web sites.

Over time, the search engines withdrew the importance they had given to this tag, and the main content became the dominant on-the-page factor. But as there is no harm in placing your most important keywords in this tag, you may still do it. The syntax of the meta keyword tag is:

<meta name="keywords" content="keyword1,keyword2,keyword3">

Optimizing a web site for the search engines is a cumulative effort. You should use every factor that the spiders might give some weight to, and so you should use the meta keywords tag.

Creating, improving and adding good content to your web site

Content is king – this is a unanimous and everlasting truth in the search engine optimization and placement industry. The long-term success or failure of a web site in the search engines depends to a great extent on its content.

Therefore, if you do not have enough high-quality textual content related to your keywords, start developing it right now. Writing articles related to your industry can be a fantastic way to add content. The underlying intention is to convince the search engine spiders that your web site content is closely related to the keywords you have chosen. Good content will also make it easier to get links from other web sites.

Developing the content of your site is time-consuming and requires a lot of effort and skill. If you do not have that much time to spend, hire a professional copywriter to do it for you. Great content is a must.

Make sure that there are no grammatical mistakes in your web pages. Proof-read the entire content of your web site for spelling mistakes and typos before you start promoting it. Misspellings and other textual errors create a bad impression on visitors.

Optimizing the page content

Relevant, optimized content is the foundation of success in the search engines, and it is the most important factor to focus on. Relevant content strengthens the keywords used in the title and meta tags. Optimized content means placing keywords within the main content to achieve the best keyword density.

Position your target keywords to appear in the important locations. The heading tags are important: include the keywords in them, and try to write them so that the keywords sit closer to the beginning of the tag.
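As a quick sketch, headings for the shoe example might lead with the target keywords like this (the keywords and wording are illustrative):

```html
<!-- The target keywords "sports shoes" and "footwear fashion" lead each heading -->
<h1>Sports Shoes for Ladies and Kids</h1>
<h2>Footwear Fashion at Reasonable Rates</h2>
```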

Some say that text with a larger font size, or text that is made bold, will carry more weight than text in the normal font size of the page. Similarly, text with a smaller font size than the regular font will get less weight from the search engine spiders.

It is believed that the first words of an html tag are of more importance to the search engine than the words that come later. Some spiders may not index the whole page, so it’s useful to have the code of your important body content as close to the top of the page as possible.

The first two paragraphs are extremely important when search engine spiders look at the content of your web page. It is believed that the search engines prefer pages where keywords appear near the top, so include them in the first two paragraphs. However, the entire content is important.

The bottom line is to ensure a proper keyword density while keeping the copy as attractive as possible to the human visitors.

However, placement of the body content near the top becomes a problem if you have a lot of JavaScript in the head section, or if you are using a table structure; the table or JavaScript code pushes the real textual content downwards.

If you have a lot of JavaScript, put all the JavaScript code in an external .js file and call that file from within your HTML file.
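For example, a long inline script in the head section can be reduced to a single line that points at an external file (the file name menu.js is invented for the example):

```html
<head>
  <!-- Before: a long inline script pushes the body content far down the source.
       After: the same script lives in an external file, so the spider reaches
       the body text almost immediately. -->
  <script type="text/javascript" src="menu.js"></script>
</head>
```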

Click here to learn more about search engine optimization of JavaScript.

Some people in the SEO industry believe that search engine spiders are not able to read the text in tables. Tables can also push your textual content further down, making your keywords appear less relevant because they sit lower on the page. This happens because tables break apart when search engine spiders read them. For example, visualize a two-column table where the first column holds the links and the second column holds the main text.

Figure 1 – Humans will see the two columns side by side:

[Links]   Fruits are extremely nutritious and contain many essential vitamins. Let fruits constitute the major part of your diet.

Figure 2 – Spiders will read the cells one after another:

[Links]
Fruits are extremely nutritious and contain many essential vitamins. Let fruits constitute the major part of your diet.

See how the keyword-rich main content has moved downwards. That said, I have never found tables to be an obstacle to high search engine rankings, provided you have great content, optimized meta tags and a good link popularity score. However, there are a few things to remember while using tables.

Remember, the spider reads the table left to right and top to bottom, so the text in the first column of the first row carries more importance than any other cell. Try to keep your navigational links or your most important content in the first cell/column of a table, and use textual navigation links in that first column instead of graphical buttons.

Another suggestion is to place some great headlines, using the header tags, at the very top of your page, outside the table. These will be read first by the engines.
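Putting these two suggestions together, a page fragment might look like the sketch below (the topic, file names and link labels are invented for the example):

```html
<!-- A keyword-rich headline sits outside the table, so the spider reads it first;
     text links occupy the first cell, the main content the second -->
<h1>Fresh Fruits and Healthy Diets</h1>
<table>
  <tr>
    <td>
      <a href="fruits.html">Fruits</a><br>
      <a href="diet.html">Diet tips</a>
    </td>
    <td>
      <p>Fruits are extremely nutritious and contain many essential vitamins.</p>
    </td>
  </tr>
</table>
```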

Keep the structure of the table as simple as possible. Nested tables sometimes create problems. A nested table is a table inside another table. There’s one main rule when it comes to nesting – do not go too deep.

If you want to use every known method, you may try Jere Matlock's Table Trick, a technique that helps you place the code of your main table section first. I have not tried it personally, but some people claim that it is good, and it is not considered a spam technique.

Other important locations for your keywords are the text within the anchor tags and the alt tags of your images. Search engines read and lay emphasis on the text within links, so use it to impress the search engine spiders. Try to use textual links wherever possible, since spiders cannot read images.

Place your keywords in the alt tags of your images. Do not just write a series of keywords; make proper grammatical sentences.
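A minimal sketch of both ideas, with invented file names, might look like this:

```html
<!-- Keywords appear in the link text and in the image alt text,
     written as readable phrases rather than keyword lists -->
<a href="sports-shoes.html">Latest sports shoes for ladies and kids</a>
<img src="fancy-shoes.jpg" alt="Fancy shoes in the latest footwear fashion">
```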

Some believe that spiders assign more importance to the last few sentences of a web page, since they are probably looking for concluding information. Include your important keywords at least once in the last three sentences.

Making navigation easy for the crawler

Try to read your web pages from the standpoint of a search engine spider. It cannot read images, frames, JavaScript or nested tables. All it does is crawl the world wide web and collect information about web sites. When it comes across a new web page, it puts it in the index, then follows the links within that page to find the other pages of the site.

If you want to rank high in the search engine listings, all you have to do is make the spider's job easier.

Sometimes web masters, in order to give their web site a pleasant appearance, place text in the form of graphics. Often you will come across a site that has nothing but a big image on its homepage, from which visitors can click through to the other pages. This might not create any difficulty for human visitors.

But a search engine cannot read images and therefore cannot enter the site. Often web masters use only image map links in their web pages; a search engine spider that can't follow these links will not be able to read the other pages of the site. I highly recommend that you avoid image map links and use text links whenever possible.

If you still want to use image map links, offset the drawback by adding text links at the bottom of the page. The spider will then be able to find the other pages of your web site through these text links.

Add a sitemap page to your site and name it site-map.html. Link it textually to every page of your web site, and submit this page to the search engines. It will help them locate the pages within your web site.
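Such a sitemap page is nothing more than a plain list of text links covering the whole site; a sketch with invented page names might be:

```html
<!-- site-map.html: one text link for every page of the site -->
<h1>Site Map</h1>
<ul>
  <li><a href="index.html">Home</a></li>
  <li><a href="sports-shoes.html">Sports Shoes</a></li>
  <li><a href="fancy-shoes.html">Fancy Shoes</a></li>
  <li><a href="contact.html">Contact Us</a></li>
</ul>
```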

Be sure to internally link the pages of your site with each other in order to help the spider find your complete site. Make all your pages link to your homepage.

Link popularity building

The search engines incorporated popularity factors into their ranking algorithm in order to create better relevancy formulae. This does not mean that the on-the-page factors or the importance of the main content can be ignored. If there are two sites with equally optimized meta tags and attractive content, the site having more link popularity will win the search engine optimization game.

Therefore, if you want to succeed in the search engine ranking battle, on-the-page optimization will no longer help unless it is enhanced with high-end link popularity.

The link popularity of a web site is the total number of web sites linking to it. Link analysis is closely related to link popularity, but with a twist: it considers both the number and the quality of the inbound links pointing to your site. Here it is not the number of links that matters, but their quality.

In a link analysis system, it is easy to filter out the spammers who set up several free sites and then make them link to their main web site. Nowadays, search engines do not give importance to links from free web sites in their link popularity scores. They combine link popularity with link analysis and use it to assign a weighted link popularity score. Presently, many major search engines make use of link analysis in their ranking algorithm.

So, in order to rank well in the search engines, start building links to your site. How do you do that? This is the most difficult and time-consuming aspect of search engine optimization. I have a special article that covers every aspect of link analysis, along with methodologies to increase your score in a link analysis system.

However, the oldest and best way of increasing link popularity is to ask for a link. In this process, you directly contact the owner of a site via email or other means, asking for a link to your site. In your email you may briefly explain why you think the link will benefit their visitors. You would generally offer to link back to them in exchange for this courtesy.
