When it comes to framed sites and their effect on search engine rankings, there are two schools of thought. Some people say that framed sites, if done properly, have no problem getting good rankings in the search engines. Others claim that if search engine optimization is important to you, you should never use frames.
In my opinion, the truth lies somewhere in between. Yes, the use of frames does raise a few issues when it comes to getting good rankings in the search engines that don't understand frames. Hence, when you are designing a new site, I would recommend that you avoid using frames unless you have a specific reason for doing so. However, if you already have a site which uses frames, all is not lost. You can still get good rankings in the search engines even though you have used frames. Furthermore, using frames also has its own advantages when it comes to search engine placement, as we shall see later on. This article assumes that you have a working knowledge of frames.
If your site uses frames, the key to getting good rankings lies in using the NOFRAMES tag. The NOFRAMES tag is intended to help framed web sites display some content to visitors who are using browsers that do not support frames. The search engines which don't understand frames also look at the NOFRAMES tag. Hence, if you are using frames, you need to add content within the NOFRAMES tag.
What you should do is add a complete web page within the NOFRAMES tag. Ensure that this page repeats the important keywords for your site a number of times. Also, in order to make the content within the NOFRAMES tag as prominent as possible to the search engines, you should put the NOFRAMES tag right after the opening FRAMESET tag. However, don't put the NOFRAMES tag before the FRAMESET tag. If you do that, Internet Explorer will display your site correctly, but Netscape will display the content present in the NOFRAMES tag rather than the content present within the frames. Furthermore, you should also place a link to the page containing your navigation links within the NOFRAMES tag. This ensures that the search engines are able to spider the internal pages of your site.
Here's what the source code of your page might look like:
<html>
<head>
<title>Put an attractive title which contains keywords</title>
<meta name="description" content="Put an attractive description which also contains keywords">
<meta name="keywords" content="Your target keywords separated by commas">
</head>
<frameset border="0" cols="150,75%">
<noframes>
<body>
<h1>Heading containing keywords</h1>
<p>Here, you should add a lot of content and should repeat your keywords a number of times.</p>
<p>More keyword rich text for the search engines.</p>
<a href="left.html">Link to page containing navigation links</a>
</body>
</noframes>
<frame src="left.html" name="left" scrolling="no" noresize>
<frame src="main.html" name="main" scrolling="auto">
</frameset>
</html>
One problem that occurs when you use frames is that the search engines may often display one of the internal pages in your site in response to a query. If this internal page does not contain a link to the home page of your site, the user will be unable to navigate through your entire site. The solution, of course, is to add a link to the home page from that internal page. When the visitor clicks on that link, she is brought within the context of the frames.
However, simply adding a link to the home page presents yet another problem. If the visitor had already been viewing that page within the context of the frames and then clicks on the link to the home page, a new set of frames will be created in addition to the frames already being used. In order to solve this problem, you have to use the TARGET="_top" attribute in the link, i.e. the HTML code for the link would be something like
<a href="index.html" target="_top">Go to our home page</a>
Wrapping things up:
As I mentioned earlier, there is also an advantage to using frames. Since most Internet users are now using frames-enabled browsers, not many people are going to see what's present in your NOFRAMES tag. This allows you to repeat your keywords a few more times in the NOFRAMES tag than you could on a page which humans would also see (of course, don't repeat the keywords too many times - that can cause your web site to be penalized for spamming). You can also avoid using tables, graphics etc. which you would otherwise need to use if humans would also view that page. However, it is debatable whether this small advantage is sufficient to justify using frames.
http://www.articles-hub.com/Article/47465.html
Saturday, June 16, 2007
Page Cloaking - To Cloak or Not to Cloak
Page cloaking can broadly be defined as a technique used to deliver different web pages under different circumstances. There are two primary reasons that people use page cloaking:
i) It allows them to create a separate optimized page for each search engine and another page which is aesthetically pleasing and designed for their human visitors. When a search engine spider visits a site, the page which has been optimized for that search engine is delivered to it. When a human visits a site, the page which was designed for the human visitors is shown. The primary benefit of doing this is that the human visitors don't need to be shown the pages which have been optimized for the search engines, because the pages which are meant for the search engines may not be aesthetically pleasing, and may contain an over-repetition of keywords.
ii) It allows them to hide the source code of the optimized pages that they have created, and hence prevents their competitors from being able to copy the source code.
Page cloaking is implemented by using some specialized cloaking scripts. A cloaking script is installed on the server, which detects whether it is a search engine or a human being that is requesting a page. If a search engine is requesting a page, the cloaking script delivers the page which has been optimized for that search engine. If a human being is requesting the page, the cloaking script delivers the page which has been designed for humans.
There are two primary ways by which the cloaking script can detect whether a search engine or a human being is visiting a site:
i) The first and simplest way is by checking the User-Agent variable. Each time anyone (be it a search engine spider or a browser being operated by a human) requests a page from a site, it reports a User-Agent name to the site. Generally, if a search engine spider requests a page, the User-Agent variable contains the name of the search engine. Hence, if the cloaking script detects that the User-Agent variable contains the name of a search engine, it delivers the page which has been optimized for that search engine. If the cloaking script does not detect the name of a search engine in the User-Agent variable, it assumes that the request has been made by a human being and delivers the page which was designed for human beings.
However, while this is the simplest way to implement a cloaking script, it is also the least safe. It is pretty easy to fake the User-Agent variable, and hence, someone who wants to see the optimized pages that are being delivered to different search engines can easily do so.
ii) The second and more complicated way is to use I.P. (Internet Protocol) based cloaking. This involves the use of an I.P. database which contains a list of the I.P. addresses of all known search engine spiders. When a visitor (a search engine or a human) requests a page, the cloaking script checks the I.P. address of the visitor. If the I.P. address is present in the I.P. database, the cloaking script knows that the visitor is a search engine and delivers the page optimized for that search engine. If the I.P. address is not present in the I.P. database, the cloaking script assumes that a human has requested the page, and delivers the page which is meant for human visitors.
Although more complicated than User-Agent based cloaking, I.P. based cloaking is more reliable and safe because it is very difficult to fake I.P. addresses.
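To make the mechanics concrete, here is a minimal, purely illustrative sketch (in Python) of how a cloaking script might combine both detection methods. The spider names, I.P. addresses and file names are made-up examples, not any real search engine's details - and, as discussed below, actually using cloaking is not recommended.

# Hypothetical spider User-Agent substrings and known spider I.P. addresses.
SPIDER_USER_AGENTS = ("googlebot", "scooter", "slurp")
SPIDER_IPS = {"192.0.2.10", "192.0.2.11"}  # example addresses only

def page_to_serve(user_agent, remote_ip):
    """Return the file name of the page to deliver for this request."""
    ua = (user_agent or "").lower()
    # Method i): User-Agent based detection (simple, but easy to fake).
    if any(name in ua for name in SPIDER_USER_AGENTS):
        return "page-optimized-for-spiders.html"
    # Method ii): I.P. based detection (harder to fake).
    if remote_ip in SPIDER_IPS:
        return "page-optimized-for-spiders.html"
    # Otherwise, assume a human visitor.
    return "page-for-humans.html"

# A request reporting Googlebot's User-Agent gets the optimized page.
print(page_to_serve("Mozilla/5.0 (compatible; Googlebot/2.1)", "203.0.113.5"))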
Now that you have an idea of what cloaking is all about and how it is implemented, the question arises as to whether you should use page cloaking. The one word answer is "NO". The reason is simple: the search engines don't like it, and will probably ban your site from their index if they find out that your site uses cloaking. The reason that the search engines don't like page cloaking is that it prevents them from being able to spider the same page that their visitors are going to see. And if the search engines are prevented from doing so, they cannot be confident of delivering relevant results to their users. In the past, many people have created optimized pages for some highly popular keywords and then used page cloaking to take people to their real sites which had nothing to do with those keywords. If the search engines allowed this to happen, they would suffer because their users would abandon them and go to another search engine which produced more relevant results.
Of course, a question arises as to how a search engine can detect whether or not a site uses page cloaking. There are three ways by which it can do so:
i) If the site uses User-Agent cloaking, the search engines can simply send a spider to a site which does not report the name of the search engine in the User-Agent variable. If the search engine sees that the page delivered to this spider is different from the page which is delivered to a spider which reports the name of the search engine in the User-Agent variable, it knows that the site has used page cloaking.
ii) If the site uses I.P. based cloaking, the search engines can send a spider from a different I.P. address than any I.P. address which it has used previously. Since this is a new I.P. address, the I.P. database that is used for cloaking will not contain this address. If the search engine detects that the page delivered to the spider with the new I.P. address is different from the page that is delivered to a spider with a known I.P. address, it knows that the site has used page cloaking.
iii) A human representative from a search engine may visit a site to see whether it uses cloaking. If she sees that the page which is delivered to her is different from the one being delivered to the search engine spider, she knows that the site uses cloaking.
Hence, when it comes to page cloaking, my advice is simple: don't even think about using it.
http://www.articles-hub.com/Article/47461.html
Creating a Robots.txt file
Some people believe that they should create different pages for different search engines, each page optimized for one keyword and for one search engine. Now, while I don't recommend that people create different pages for different search engines, if you do decide to create such pages, there is one issue that you need to be aware of.
These pages, although optimized for different search engines, often turn out to be pretty similar to each other. The search engines now have the ability to detect when a site has created such similar looking pages and are penalizing or even banning such sites. In order to prevent your site from being penalized for spamming, you need to prevent the search engine spiders from indexing pages which are not meant for them, i.e. you need to prevent AltaVista from indexing pages meant for Google and vice-versa. The best way to do that is to use a robots.txt file.
You should create a robots.txt file using a text editor like Windows Notepad. Don't use your word processor to create such a file.
Here is the basic syntax of the robots.txt file:
User-Agent: [Spider Name]
Disallow: [File Name]
For instance, to tell AltaVista's spider, Scooter, not to spider the file named myfile1.html residing in the root directory of the server, you would write
User-Agent: Scooter
Disallow: /myfile1.html
To tell Google's spider, called Googlebot, not to spider the files myfile2.html and myfile3.html, you would write
User-Agent: Googlebot
Disallow: /myfile2.html
Disallow: /myfile3.html
You can, of course, put multiple User-Agent statements in the same robots.txt file. Hence, to tell AltaVista not to spider the file named myfile1.html, and to tell Google not to spider the files myfile2.html and myfile3.html, you would write
User-Agent: Scooter
Disallow: /myfile1.html
User-Agent: Googlebot
Disallow: /myfile2.html
Disallow: /myfile3.html
If you want to prevent all robots from spidering the file named myfile4.html, you can use the * wildcard character in the User-Agent line, i.e. you would write
User-Agent: *
Disallow: /myfile4.html
However, you cannot use the wildcard character in the Disallow line.
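If you want to see how a spider would actually interpret rules like these, Python's standard urllib.robotparser module can test them. The small sketch below uses the example rules given above; the results simply reflect how the robots exclusion standard resolves them.

from urllib import robotparser

rules = """
User-Agent: Scooter
Disallow: /myfile1.html

User-Agent: Googlebot
Disallow: /myfile2.html
Disallow: /myfile3.html

User-Agent: *
Disallow: /myfile4.html
"""

rp = robotparser.RobotFileParser()
rp.parse(rules.splitlines())

print(rp.can_fetch("Scooter", "/myfile1.html"))       # False - blocked for Scooter
print(rp.can_fetch("Googlebot", "/myfile1.html"))     # True - only Scooter is blocked
print(rp.can_fetch("SomeOtherBot", "/myfile4.html"))  # False - blocked for all robots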
Once you have created the robots.txt file, you should upload it to the root directory of your domain. Uploading it to any sub-directory won't work - the robots.txt file needs to be in the root directory.
I won't discuss the syntax and structure of the robots.txt file any further - you can get the complete specifications from here.
Now we come to how the robots.txt file can be used to prevent your site from being penalized for spamming in case you are creating different pages for different search engines. What you need to do is to prevent each search engine from spidering pages which are not meant for it.
For simplicity, let's assume that you are targeting only two keywords: "tourism in Australia" and "travel to Australia". Also, let's assume that you are targeting only three of the major search engines: AltaVista, HotBot and Google.
Now, suppose you have used the following convention for naming the files: each page is named by separating the individual words of the keyword for which the page is being optimized with hyphens, and then adding the first two letters of the name of the search engine for which the page is being optimized.
Hence, the files for AltaVista are tourism-in-australia-al.html and travel-to-australia-al.html.
The files for HotBot are tourism-in-australia-ho.html and travel-to-australia-ho.html.
The files for Google are tourism-in-australia-go.html and travel-to-australia-go.html.
As I noted earlier, AltaVista's spider is called Scooter and Google's spider is called Googlebot.
A list of spiders for the major search engines can be found here.
Now, we know that HotBot uses Inktomi and from this list, we find that Inktomi's spider is called Slurp.
Using this knowledge, here's what the robots.txt file should contain:
User-Agent: Scooter
Disallow: /tourism-in-australia-ho.html
Disallow: /travel-to-australia-ho.html
Disallow: /tourism-in-australia-go.html
Disallow: /travel-to-australia-go.html
User-Agent: Slurp
Disallow: /tourism-in-australia-al.html
Disallow: /travel-to-australia-al.html
Disallow: /tourism-in-australia-go.html
Disallow: /travel-to-australia-go.html
User-Agent: Googlebot
Disallow: /tourism-in-australia-al.html
Disallow: /travel-to-australia-al.html
Disallow: /tourism-in-australia-ho.html
Disallow: /travel-to-australia-ho.html
When you put the above lines in the robots.txt file, you instruct each search engine not to spider the files meant for the other search engines.
When you have finished creating the robots.txt file, double-check to ensure that you have not made any errors anywhere in it. A small error can have disastrous consequences - a search engine may spider files which are not meant for it, in which case it can penalize your site for spamming, or, it may not spider any files at all, in which case you won't get top rankings in that search engine.
A useful tool to check the syntax of your robots.txt file can be found here. While it will help you correct syntactical errors in the robots.txt file, it won't help you correct any logical errors, for which you will still need to go through the robots.txt file thoroughly, as mentioned above.
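As a supplement to such a syntax checker, you can also sanity-check the logic yourself. The sketch below (again using Python's standard urllib.robotparser module) assumes you have saved the robots.txt file shown above as robots.txt in the current directory; it verifies that each spider is allowed to fetch its own pages and is blocked from the pages meant for the other two engines.

from urllib import robotparser

# Which pages belong to which spider, following the naming convention above.
pages = {
    "Scooter":   ["/tourism-in-australia-al.html", "/travel-to-australia-al.html"],
    "Slurp":     ["/tourism-in-australia-ho.html", "/travel-to-australia-ho.html"],
    "Googlebot": ["/tourism-in-australia-go.html", "/travel-to-australia-go.html"],
}

rp = robotparser.RobotFileParser()
with open("robots.txt") as f:  # the robots.txt file created above
    rp.parse(f.read().splitlines())

for spider in pages:
    for owner, files in pages.items():
        for path in files:
            allowed = rp.can_fetch(spider, path)
            # Each spider should only be allowed to fetch its own pages.
            assert allowed == (spider == owner), (spider, path)

print("Each spider is blocked from the pages meant for the other engines.")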
http://www.articles-hub.com/Article/47466.html
Improve the Link Popularity of your site
Link popularity, i.e. the number of sites which are linking to your site, is an increasingly important factor as far as search engine placement is concerned. Other things remaining the same, the more links there are to your site, the higher its ranking will be.
What is important is not only the number of links to your site, but also the types of sites which are linking to you. A link from a site which is related to yours is more valuable than a link from an unrelated site.
In this article, I explore different methods by which you can improve the link popularity of your site. I start with the methods that you shouldn't bother using, then go on to the moderately effective methods, and then end with the most effective methods you can use to boost the link popularity of your site.
1) Submitting your site to Free For All (FFA) pages
A common misconception among many Internet marketers is that while FFA pages may not directly bring in traffic to your site, they will help to improve the link popularity of your site, and hence will indirectly bring in traffic through the search engines.
Nothing could be further from the truth. Most FFA pages can contain only a certain number of links at a time. This means that when you submit your site to an FFA page, your site will be placed at the top of the page. However, as more and more people submit their sites to the FFA page, your site will be pushed down, and finally, when it reaches the bottom of the page, it will be removed.
Now, since you can bet that plenty of other people are also submitting their sites to the FFA pages, your site will remain in these pages for only a short span of time. Hence, in order to ensure that the search engines see your site if and when they come to spider the FFA page, you will need to ensure that you submit your site to these FFA pages on a regular basis - at least once a week.
Even if you used an automatic submission program to do it, can you imagine a worse way to spend your time and/or money? Furthermore, many search engines recognize pages which only contain links to other sites as FFA pages and may completely ignore them. And while I haven't yet seen any evidence that submitting to FFA pages will actually penalize your site, there is every possibility that this might happen in the future.
Hence, when it comes to FFA pages, my advice is simple: don't even think about them.
2) Joining Reciprocal Link Services
Some people have recommended that in order to increase the link popularity of your site, you can join some reciprocal link services. The basic idea behind these services is that you add some pages to your site which contain links to other sites which are members of that service, and in exchange, these members will also add pages to their sites which will contain a link to your site. Theoretically, the more members the service has, the greater your link popularity.
However, I have plenty of reservations about using this method to boost the link popularity of your site:
i) Most of these services require that you add a visible graphical or text link from your home page to the pages containing the links. If they require a graphical link, it can completely destroy the general look and feel of your site. Even if they require a text link, how would you feel if a visitor to your site clicked on such a link and found one of your competitors (who is also a member of this service) right at the top of a page?
ii) Most of these services give the same pages containing the links to every member, i.e. the pages that you are required to upload to your site are exactly the same as the pages which all the other members of that service are required to upload to their servers. Even the file names of the pages tend to be the same for all the members. Most search engines are now able to detect such duplicate pages in different domains and may either ignore the pages or may even penalize all these domains for spamming.
iii) Instead of linking only related sites with each other, most of these services link all the members with each other. This means that lots of unrelated sites will be linking to your site. As I mentioned before, links from unrelated sites are simply not as valuable as links from related sites.
Hence, I don't recommend that you join any reciprocal link programs.
3) Exchanging links with other webmasters
Another way of improving the link popularity of your site is to exchange links with other webmasters who have sites which are related to yours, but are not direct competitors. Here's how you can do it:
First, open a database program like Microsoft Access and create a new table containing fields like FirstName, LastName, Email Address, URL etc. Then, make a list of the sites belonging to your competitors. Then, go to AltaVista, and type in the following in the search box:
link:somesite.com -url:somesite.com
where somesite.com is the domain name of one of your competitors. This will give you a list of all the sites which are linking to that competitor. Then, find out in what context a particular site has linked to your competitor. If this site is an affiliate of your competitor, then your chance of getting a link from this site is limited, unless you offer an even better affiliate program. However, if you find that this site has a Links page which contains links to other sites, one of which is a link to your competitor, then it is an excellent prospect for exchanging links. Find out the name and email address of the webmaster of the site and add them to your database. In this way, go through all the sites which are linking to your competitors, locate those sites which you think may want to exchange links with you, and build up your database.
Once you have done that, create a Links page in your site, and add the URLs of these sites to the Links page. Then, send an email to these webmasters, introduce yourself and your site, congratulate them on building an excellent web site, tell them that you have already added a link to their sites from yours, and then ask them whether they would be kind enough to add a link to your site. In your email, emphasize the fact that exchanging links in this way will be mutually beneficial for both of you because it will help both of you drive traffic to your sites. Wait for a month or so to see the response. Some webmasters will agree to link to you. Others will simply not respond. After a month, remove the links to those sites that are not interested in exchanging links and, using the methods outlined above, try to locate more sites with which to exchange links.
When you send the email to the webmasters, make sure that you personalize each email. Don't begin every email with "Hello Webmaster" - begin with "Hello Mike". If you want, you can use email merge programs to automatically personalize each email. You can check out some email merge programs by going to http://download.cnet.com and searching for "email merge" (without the quotes).
Another thing that you can do is to mention in your Links page that you are willing to exchange links with other web sites. This allows other webmasters who come to your web site to propose a link exchange.
4) Starting an Awards Program
A moderately effective method of improving the link popularity of your site is to start an awards program. You can have web sites which are related to yours apply for an award from your site. The sites which win the award get the chance to display the logo for your award. This logo is linked to your site, preferably to a page which contains more information on the award.
If you publish a newsletter, consider declaring the winners in your newsletter. You can also perform a review of the winners' sites in your newsletter. This adds useful content to your newsletter and also gives more webmasters the incentive to apply for your award, since you may review their sites in your newsletter. This also gives them the incentive to subscribe to your newsletter to see if they win the award.
Make sure that you give awards to only those sites which deserve to win. If you give your award to sites which don't deserve it, your award will have little credibility, which will, in turn, hurt the credibility of your company. Furthermore, make sure that the logo you design for the award looks professional. If it doesn't, not many webmasters will want to display it in their sites.
5) Giving testimonials
This may sound a bit unusual, but giving testimonials for products or services which you find useful can be another moderately effective way of improving the link popularity of your site. If you really like a product, simply write to the company and tell them why you liked the product so much and how it has helped you. Chances are, the company will write back to you to thank you for your comments and will ask you for permission to display your comments in their web site. Tell the company that you have no problems if they publish your comments, but request them to add a link to your site along with the testimonial. There is every possibility that the company will agree since publishing the URL of your web site gives more credibility to the testimonial.
Of course, please don't go about giving testimonials to every company you can locate just because it will improve your link popularity :-)
6) Posting to Message Boards and Discussion Lists
Another moderately effective method of increasing the link popularity of your site is to post to online message boards. At the end of every message that you post, you can sign off by mentioning your name and the URL of your web site. If the message board allows it, you can even include a short promotional blurb about your site at the end of your posts. However, make sure that the individual messages that are posted to that message board are archived in static HTML pages (i.e. the URLs for the individual messages should not contain a "?"). Otherwise, the search engines will consider these pages to be dynamic pages and may not spider these pages and hence, will not be able to find your link.
Email based discussion lists which are archived on the web in static HTML pages can also be used to boost the link popularity of your site in a similar manner. In this case, the signature file that you use with your email program should contain the URL for your web site.
7) Starting a Link Contest
A good method of improving the link popularity of your site is to give away prizes to other webmasters if they link to you. The prizes that you give out should ideally be something which other webmasters will find valuable enough to want to link to you, but which do not cost you too much. For instance, if you publish a newsletter and have unsold ad inventory, you can give away some free advertisements in your newsletter to the winners. If you sell software (or an ebook), you can give away a free copy of your software or ebook to the winners, since it doesn't cost you anything to produce an additional copy of digital goods like software and ebooks.
Link contests work best if you run the contest on a continuous basis and if you declare new winners frequently. If you run the contest for a few months, and then stop it, the webmasters who had linked to you will all remove their links. However, if you run it on a continuous basis, and declare new winners every month or so, the webmasters will have the incentive to keep their links to your site.
Also, make sure that you require all participants to have a link to your site either in their home page, or in an internal page of their site which is linked to their home page. Also ensure that the page which contains the link is no more than two levels deep from their home page (i.e. it should not take more than two clicks to go from the home page to the page containing the link). If they don't do this, the search engine spiders may not index the page which contains the link to your site, and hence, may not find your link.
8) Writing articles and allowing them to be re-published
This is by far one of the best ways of improving the link popularity of your site, and one of my favorites. Whenever I write an article on search engine placement, I first publish it in my newsletter and then I publish the article in my site as a separate web page. I also submit it to the following article submission sites:
http://www.ezinearticles.com/add_url.html
http://www.ideamarketers.com
http://www.marketing-seek.com/articles/submit.shtml
http://certificate.net/wwio/ideas.shtml
http://www.web-source.net/articlesub.htm
Many webmasters and ezine publishers frequent these article directories in search of articles. Submitting my articles to these directories gives them the opportunity of re-publishing my articles. While I have had some success with each of the above directories, by far the best among them is the ezinearticles.com directory.
Now, at the end of each article, I mention that people are free to re-publish the article as long as they include my resource box (i.e. my bio) at the end of the article. I always include the URL of my site in the resource box. This means that whenever someone publishes one of my articles in his/her web site, I have another site linking to my site. Also, many ezine publishers archive their ezines in their web sites. If they have re-published my article in a particular issue, I again get a link.
Writing articles is also an excellent viral marketing tool. As some webmasters and ezine publishers publish my articles, other webmasters and ezine publishers will read my article. Some of them, in turn, will publish my article, which will again be read by other webmasters and ezine publishers, some of whom will publish it... and so on.
Also, since only web sites related to mine would be interested in publishing my articles, all these links tend to come from related sites, which, as I mentioned earlier, are more valuable than links from unrelated sites.
Writing articles, of course, has another very important benefit - if you write good articles, it makes you known as an expert in your field. This helps to improve your credibility, which makes people more comfortable about buying your products or services.
Some notes about writing articles:
i) I have learnt through experience that some webmasters will publish other people's articles and will display the complete resource box but will not link to the URL mentioned in the resource box. In order to prevent this, you need to explicitly state that the article can be published only if the URL mentioned in the resource box is linked to your site.
ii) Your resource box should not be too long - it should be no more than 6 lines long, formatted at 65 characters per line. Otherwise, other webmasters and ezine publishers will hesitate to publish your article.
9) Starting your own affiliate program
This is another excellent way by which you can improve the link popularity of your site. When you have your own affiliate program, you give other webmasters the incentive to link to you. In this case too, since most of these web sites will be related to the industry in which you are operating, these links will be more valuable than links from unrelated sites.
Now, when you start your affiliate program, you need to decide whether you want to run the program yourself, or whether you want to outsource it from a third party. While outsourcing your affiliate program has a number of benefits, doing so will not help you improve the link popularity of your site, because affiliates are going to link to the third party's site. In order to improve the link popularity of your site, you need to ensure that the affiliate links are pointing to your domain.
The affiliate program software that I highly recommend and have used for the affiliate program of our search engine positioning services is Kowabunga Technologies' My Affiliate Program. Although this software is hosted in a domain owned by Kowabunga Technologies, they give you the option of having the affiliate links pointing to your own domain.
10) Submitting to the directories
This is by far the most important step as far as improving the link popularity of your site is concerned. As I mentioned before, what is important is not only the number of links to your site, but also the quality of the links to your site. No links are as important as links from some of the major directories like Yahoo!, the Open Directory etc. However, Yahoo! currently requires a payment of $299 per year in order to list any commercial site. Paying them $299 per year just to improve your link popularity is probably not cost effective. But, the Open Directory is free, and you should definitely get your site listed in the Open Directory.
Also, you should submit your site to as many of the smaller directories as possible. You can get a list of such directories here.
http://www.articles-hub.com/Article/47467.html
What is important is not only the number of links to your site, but also the types of sites which are linking to you. A link from a site which is related to yours is more valuable than a link from an unrelated site.
In this article, I explore different methods by which you can improve the link popularity of your site. I start with the methods that you shouldn't bother using, then go on to the moderately effective methods, and then end with the most effective methods you can use to boost the link popularity of your site.
1) Submitting your site to Free For All (FFA) pages
A common misconception among many Internet marketers is that while FFA pages may not directly bring in traffic to your site, it will help to improve the link popularity of your site, and hence, will indirectly bring in traffic through the search engines.
Nothing could be further from the truth. Most FFA pages can contain only a certain number of links at a time. This means that when you submit your site to a FFA page, your site will be placed at the top of the page. However, as more and more people submit their sites to the FFA page, your site will be pushed down, and finally, when it reaches the bottom of the page, it will be removed.
Now, since you can bet that plenty of other people are also submitting their sites to the FFA pages, your site will remain in these pages for only a short span of time. Hence, in order to ensure that the search engines see your site if and when they come to spider the FFA page, you will need to ensure that you submit your site to these FFA pages on a regular basis - at least once a week.
Even if you used an automatic submission program to do it, can you imagine a worse way to spend your time and/or money? Furthermore, many search engines recognize these pages which only contains links to other sites as FFA pages and may completely ignore them. And while I haven't yet seen any evidence that submitting to the FFA pages will actually penalize your site, there is every possibility that this might happen in the future.
Hence, when it comes to FFA pages, my advice is simple: don't even think about them.
2) Joining Reciprocal Link Services
Some people have recommended that in order to increase the link popularity of your site, you can join some reciprocal link services. The basic idea behind these services is that you add some pages to your site which contain links to other sites which are members of that service, and in exchange, these members will also add pages to their sites which will contain a link to your site. Theoretically, more the members of that service, more your link popularity.
However, I have plenty of reservations about using this method to boost the link popularity of your site:
i) Most of these services require that you add a visible graphical or text link from your home page to the pages containing the links. If they require a graphical link, it can completely destroy the general look and feel of your site. Even if they require a text link, how would you feel if a visitor to your site clicked on such a link and found one of your competitors (who is also a member of this service) right at the top of a page?
ii) Most of these services give the same pages containing the links to each of its members, i.e. the pages that you are required to upload to your site are exactly the same as the pages which all the other members of that service are required to upload to their servers. Even the file names of the pages tend to be the same for all the members. Most search engines are now able to detect such duplicate pages in different domains and may either ignore the pages or may even penalize all these domains for spamming.
iii) Instead of linking only related sites with each other, most of these services link all the members with each other. This means that lots of unrelated sites will be linking to your site. As I mentioned before, links from unrelated sites are simply not as valuable as links from related sites.
Hence, I don't recommend that you join any reciprocal link programs.
3) Exchanging links with other webmasters
Another way of improving the link popularity of your site is to exchange links with other webmasters who have sites which are related to yours, but are not direct competitors. Here's how you can do it:
First, open a database program like Microsoft Access and create a new table containing fields like FirstName, LastName, Email Address, URL etc. Then, make a list of the sites belonging to your competitors. Then, go to AltaVista, and type in the following in the search box:
link:somesite.com -url:somesite.com
where somesite.com is the domain name of one of your competitors. This will give you a list of all the sites which are linking to that competitor. Then, find out in what context a particular site has linked to your competitor. If this site is an affiliate of your competitor, then your chance of getting a link from this site is limited, unless you offer an even better affiliate program. However, if you find that this site has a Links page which contains links to other sites, one of which is a link to your competitor, then it is an excellent prospect for exchanging links. Find out the name and email address of the webmaster of the site and add them to your database. In this way, go through all the sites which are linking to your competitors, locate those sites which you think may want to exchange links with you, and build up your database.
Once you have done that, create a Links page in your site, and add the URLs of these sites to the Links page. Then, send an email to these webmasters, introduce yourself and your site, congratulate them on building an excellent web site, tell them that you have already added a link to their sites from yours, and then ask them whether they would be kind enough to add a link to your site. In your email, emphasize the fact that exchanging links in this way will be mutually beneficial for both of you because it will help both of you drive traffic to your sites. Wait for a month or so to see the response. Some webmasters will agree to link to you. Others will simply not respond. After a month, remove the links to those sites who are not interested in exchanging links and using the methods outlined above, try to locate more sites with which to exchange links.
When you send the email to the webmasters, make sure that you personalize each email. Don't begin every email with "Hello Webmaster", begin with "Hello Mike". If you want, you can use email merge programs to automatically personalize each email. You can check out some email merge programs by going tohttp://download.cnet.com and searching for "email merge" (without the quotes).
Another thing that you can do is to mention in your Links page that you are willing to exchange links with other web sites. This allows other webmasters who come to your web site to propose a link exchange.
4) Starting an Awards Program
A moderately effective method of improving the link popularity of your site is to start an awards program. You can have web sites which are related to yours apply for an award from your site. The sites which win the award get the chance to display the logo for your award. This logo is linked to your site, preferably to a page which contains more information on the award.
If you publish a newsletter, consider declaring the winners in your newsletter. You can also perform a review of the winners' sites in your newsletter. This adds useful content to your newsletter and also gives more webmasters the incentive to apply for your award, since you may review their sites in your newsletter. This also gives them the incentive to subscribe to your newsletter to see if they win the award.
Make sure that you give awards to only those sites which deserve to win. If you give your award to sites which don't deserve it, your award will have little credibility, which will, in turn, hurt the credibility of your company. Furthermore, make sure that the logo you design for the award looks professional. If it doesn't, not many webmasters will want to display it in their sites.
5) Giving testimonials
This may sound a bit unusual, but giving testimonials for products or services which you find useful can be another moderately effective way of improving the link popularity of your site. If you really like a product, simply write to the company and tell them why you liked the product so much and how it has helped you. Chances are, the company will write back to you to thank you for your comments and will ask you for permission to display your comments in their web site. Tell the company that you have no problems if they publish your comments, but request them to add a link to your site along with the testimonial. There is every possibility that the company will agree since publishing the URL of your web site gives more credibility to the testimonial.
Of course, please don't go about giving testimonials to every company you can locate just because it will improve your link popularity :-)
6) Posting to Message Boards and Discussion Lists
Another moderately effective method of increasing the link popularity of your site is to post to online message boards. At the end of every message that you post, you can sign off by mentioning your name and the URL of your web site. If the message board allows it, you can even include a short promotional blurb about your site at the end of your posts. However, make sure that the individual messages that are posted to that message board are archived in static HTML pages (i.e. the URLs for the individual messages should not contain a "?"). Otherwise, the search engines will consider these pages to be dynamic pages and may not spider these pages and hence, will not be able to find your link.
Email based discussion lists which are archived on the web in static HTML pages can also be used to boost the link popularity of your site in a similar manner. In this case, the signature file that you use with your email program should contain the URL for your web site.
7) Starting a Link Contest
A good method of improving the link popularity of your site is to give away prizes to other webmasters if they link to you. The prizes that you give out should ideally be something which other webmasters will find valuable enough to want to link to you, but which do not cost you too much. For instance, if you publish a newsletter, and have unsold ad inventory, you can give away some free advertisements in your newsletter to the winners. If you sell a software (or an ebook), you can give away a free copy of your software or ebook to the winners, since it doesn't cost you anything to produce an additional copy of digital goods like software and ebooks.
Link contests work best if you run the contest on a continuous basis and if you declare new winners frequently. If you run the contest for a few months, and then stop it, the webmasters who had linked to you will all remove their links. However, if you run it on a continuous basis, and declare new winners every month or so, the webmasters will have the incentive to keep their links to your site.
Also, make sure that you require all participants to have a link to your site either in their home page, or in an internal page of their site which is linked to their home page. Also ensure that the page which contains the link is no more than two levels deep from their home page (i.e. it should not take more than two clicks to go from the home page to the page containing the link). If they don't do this, the search engine spiders may not index the page which contains the link to your site, and hence, may not find your link.
8) Writing articles and allowing them to be re-published
This is by far one of the best ways of improving the link popularity of your site, and one of my favorites. Whenever I write an article on search engine placement, I first publish it in my newsletter and then I publish the article in my site as a separate web page. I also submit it to the following article submission sites:
http://www.ezinearticles.com/add_url.html
http://www.ideamarketers.com
http://www.marketing-seek.com/articles/submit.shtml
http://certificate.net/wwio/ideas.shtml
http://www.web-source.net/articlesub.htm
Many webmasters and ezine publishers frequent these article directories in search of articles. Submitting my articles to these directories gives them the opportunity of re-publishing my articles. While I have had some success with each of the above directories, by far the best among them is the ezinearticles.com directory.
Now, at the end of each article, I mention that people are free to re-publish the article as long as they include my resource box (i.e. my bio) at the end of the article. I always include the URL of my site in the resource box. This means that whenever someone publishes one of my articles in his/her web site, I have another site linking to my site. Also, many ezine publishers archive their ezines in their web sites. If they have re-published my article in a particular issue, I again get a link.
Writing articles is also an excellent viral marketing tool. As some webmasters and ezine publishers publish my articles, other webmasters and ezine publishers will read my article. Some of them, in turn, will publish my article, which will again be read by other webmasters and ezine publishers, some of whom will publish it... and so on.
Also, since only web sites related to mine would be interested in publishing my articles, all these links tend to come from related sites, which, as I mentioned earlier, are more valuable than links from unrelated sites.
Writing articles, of course, has another very important benefit - if you write good articles, it makes you known as an expert in your field. This helps to improve your credibility, which makes people more comfortable about buying your products or services.
Some notes about writing articles:
i) I have learnt through experience that some webmasters will publish other people's articles and will display the complete resource box but will not link to the URL mentioned in the resource box. In order to prevent this, you need to explicitly state that the article can be published only if the URL mentioned in the resource box is linked to your site.
ii) Your resource box should not be too long - it should be no more than 6 lines long, formatted at 65 characters per line. Otherwise, other webmasters and ezine publishers will hesitate to publish your article.
9) Starting your own affiliate program
This is another excellent way by which you can improve the link popularity of your site. When you have your own affiliate program, you give other webmasters the incentive to link to you. In this case too, since most of these web sites will be related to the industry in which you are operating, these links will be more valuable than links from unrelated sites.
Now, when you start your affiliate program, you need to decide whether you want to run the program yourself, or whether you want to outsource it to a third party. While outsourcing your affiliate program has a number of benefits, doing so will not help you improve the link popularity of your site, because affiliates are going to link to the third party's site. In order to improve the link popularity of your site, you need to ensure that the affiliate links point to your own domain.
The affiliate program software that I highly recommend and have used for the affiliate program of our search engine positioning services is Kowabunga Technologies' My Affiliate Program. Although this software is hosted in a domain owned by Kowabunga Technologies, they give you the option of having the affiliate links pointing to your own domain.
10) Submitting to the directories
This is by far the most important step as far as improving the link popularity of your site is concerned. As I mentioned before, what is important is not only the number of links to your site, but also the quality of the links to your site. No links are as important as links from some of the major directories like Yahoo!, the Open Directory etc. However, Yahoo! currently requires a payment of $299 per year in order to list any commercial site. Paying them $299 per year just to improve your link popularity is probably not cost effective. But, the Open Directory is free, and you should definitely get your site listed in the Open Directory.
Also, you should submit your site to as many of the smaller directories as possible. You can get a list of such directories here.
http://www.articles-hub.com/Article/47467.html
Glossary of Search Engine Ranking Terms
Here is a glossary of commonly used terms in the world of search engine ranking.
Alt Tag: The alternative text that the browser displays when the surfer does not want to or cannot see the pictures present in a web page. Using alt tags containing keywords can improve the search engine ranking of the page for those keywords.
Bridge Page: See Doorway Page.
Click Popularity: A measure of the relevance of sites obtained by noting which sites are clicked on most and how much time users spend in each site.
Cloaking: The process by which your site can display different pages under different circumstances. It is primarily used to show an optimized page to the search engines and a different page to humans. Most search engines will penalize a site if they discover that it is using cloaking.
Comment Tag: The text present within the comment tags (<!-- and -->) in a web page. Most search engines will ignore the text within the Comment Tags.
Crawler: See Spider.
Directory: A site containing links to other sites which are organized into various categories. Examples of directories are Yahoo! & Open Directory.
Doorway Page: A page which has been specially created in order to get a high ranking in the search engines. Also called gateway page, bridge page, entry page etc.
Dynamic Content: Information in web pages which changes automatically, based on database or user information. Search engines will index dynamic content in the same way as static content unless the URL includes a ? mark. However, if the URL does include a ? mark, many search engines will ignore the URL.
Entry Page: See Doorway Page.
Frames: An HTML technique allowing web site designers to display two or more pages in the same browser window. Many search engines do not index framed web pages properly - they only index the text present in the NOFRAMES tag. Unless a web page which uses frames contains relevant content in the NOFRAMES tag, it is unlikely to get a high ranking in those search engines.
Gateway Page: See Doorway Page.
Hallway Page: A page containing links to various doorway pages.
Heading Tags: A paragraph style that is displayed in a large, bold typeface. Having text containing keywords in the Heading Tags can improve the search engine ranking of a page for those keywords.
Hidden Text: Text that is visible to the search engines but is invisible to humans. It is mainly accomplished by using text in the same color as the background color of the page. It is primarily used for the purpose of including extra keywords in the page without distorting the aesthetics of the page. Most search engines penalize web sites which use such hidden text.
Image Map: An image containing one or more invisible regions which are linked to other pages. If the image map is defined as a separate file, the search engines may not be able to index the pages to which that image map links. The way out is to have text hyperlinks to those pages in addition to the links from the image map. However, image maps defined within the same web page will generally not prevent search engines from indexing the other pages.
Inktomi: A database of sites used by many of the larger search engines like HotBot, MSN etc. For more information, see http://www.inktomi.com
JavaScript: A scripting language commonly used in web pages. Most search engines are unable to index these scripts properly.
Keyword: A word or phrase that you type in when you are searching for information in the search engines.
Keyword Frequency: Denotes how often a keyword appears in a page or in an area of a page. In general, the higher the number of times a keyword appears in a page, the higher its search engine ranking. However, repeating a keyword too often in a page can lead to that page being penalized for spamming.
Keyword Prominence: Denotes how close to the start of an area of a page that a keyword appears. In general, having the keyword closer to the start of an area will lead to an improvement in the search engine ranking of a page.
Keyword Weight: Denotes the number of times a keyword appears in a page as a percentage of all the other words in the page. In general, the higher the weight of a particular keyword in a page, the higher the search engine ranking of the page for that keyword. However, repeating a keyword too often in order to increase its weight can cause the page to be penalized by the search engines.
Link Popularity: The number of sites which link to a particular site. Many search engines use link popularity as a factor in determining the search engine ranking of a web site.
Meta Description Tag: The tag present in the header of a web page which is used to provide a short description of the contents of the page. Some search engines will display the text present in the Meta Description Tag when the page appears in the results of a search. Including keywords in the Meta Description Tag can improve the search engine ranking of a page for those keywords. However, some search engines ignore the Meta Description Tag.
Meta Keywords Tag: The tag present in the header of a web page which is used to provide alternative words for the words used in the body of the page. The Meta Keywords Tag is becoming less and less important in influencing the search engine ranking of a page. Some search engines ignore the Meta Keywords tag.
Meta Refresh Tag: The tag present in the header of a web page which is used to display a different page after a few seconds. If a page displays another page too soon, most search engines will either ignore the current page and index the second page or penalize the current page for spamming.
Pay Per Click Search Engine: A search engine in which the ranking of your site is determined by the amount you are paying for each click from that search engine to your site. Examples of pay per click search engines are Overture, HootingOwl etc.
Robot: In the context of search engine ranking, it means the same thing as Spider. In a different context, it is also used to refer to software which visits web sites and collects email addresses to be used for sending unsolicited bulk email.
Robots.txt: A text file present in the root directory of a site which is used to control which pages are indexed by a robot. Only robots which comply with the Robots Exclusion Standard will follow the instructions contained in this file.
Search Engine: A program that searches for information and returns sites which provide that information. Examples of search engines are AltaVista, Google, HotBot etc.
Search Engine Placement: The practice of trying to ensure that a web site obtains a high rank in the search engines. Also called search engine positioning, search engine optimization etc.
Spamdexing: See Spamming.
Spamming: Using any search engine ranking technique which causes a degradation in the quality of the results produced by the search engines. Examples of spamming include excessive repetition of a keyword in a page, optimizing a page for a keyword which is unrelated to the contents of the site, using invisible text, etc. Most search engines will penalize a page which uses spamming. Also called spamdexing. In a different context, spamming is also used to mean the practice of sending unsolicited bulk email.
Spider: A program that visits web sites and indexes the pages present in those sites. Search engines use spiders to build up their databases. Example: the spider for AltaVista is called Scooter.
Stop Word: A word that often appears in pages, yet has no significance by itself. Most search engines ignore stop words while searching. Examples of stop words are: and, the, of etc.
Title Tag: The contents of the Title tag are generally displayed by the browser at the top of the browser window. The search engines use the Title tag to provide a link to the sites which match the query made by the user. Having keywords in the Title tag of a page can significantly increase the search engine ranking of the page for those keywords.
http://www.articles-hub.com/Article/47468.html
Which keywords should you optimize your site for?
In this article, we focus on the correct way of finding the keywords for which you should optimize your site. The article will also give you the formula for the Keyword Effectiveness Index (KEI) - a mathematical formula which I have developed to help you determine which keywords you should be optimizing your site for.
Step 1: Open your text editor or word processor and write down all the words and phrases that you might have searched for if you were looking for a company which offers products and services similar to yours. For example, suppose your company organizes packaged tours to Australia. Here's a list of phrases that I might have searched for if I were planning to make a trip to Australia:
tourism in Australia
travel to Australia
travelling in Australia
travel agencies in Australia
travelling agencies in Australia
Australian travel agencies
Of course, the keywords that came to your mind may have been different. But that's not important - the important thing is to get an initial list of keywords.
You may be wondering why I have not used single word keywords. Here's why:
Firstly, single word keywords tend to be hyper-competitive. A search for "tourism" or "travelling" in any search engine will probably generate hundreds of thousands of pages. While it is possible that you may get your page in the top 10 for such a single word keyword, it is quite unlikely.
Secondly, because of the sheer number of pages that single word searches can throw up, most search engine users have realized that they can get more relevant pages if they search for phrases rather than individual words. Statistical research has shown that most people are now searching for 2 or 3 word phrases rather than for single words.
Thirdly, single word keywords won't get you targeted traffic. When people search for "tourism", they are not necessarily looking for tourist destinations in Australia - they may be interested in any other country of the world. Even if you got your site into the top 10 for tourism, you gain nothing from such visitors. However, when someone searches for "tourism in Australia", he/she is your potential customer, and hence, it makes sense for you to try and get a top ranking for your site for that keyword.
Hence, whenever you are trying to generate keywords, try to be location specific. Try to think of keywords which apply to the geographic area that your product or service is designed to serve.
Step 2: Open any spreadsheet program that is installed on your computer. I assume you are using Microsoft Excel. If you are using some other spreadsheet program, just adapt the spreadsheet-related procedures outlined here to fit your program.
Create 4 columns - one for the keyword, one for the popularity of the keyword, one for the number of sites that appear in AltaVista for that keyword and the last for something I call the Keyword Effectiveness Index (don't worry - I'll explain what KEI means later on). In order to ensure that you can follow what I am saying, I recommend that you add the following column headers to the first four columns of the first row of your spreadsheet:
Keyword
Popularity
No. of Competitors
KEI
In case you don't want to take the trouble of creating your own spreadsheet, download the keywords.zip file. The file contains a sample spreadsheet in Excel 97 format.
Step 3: A great way to obtain a list of keywords related to the ones you have developed in the first step is to use WordTracker's keyword generation service. Click on the "Trial" option at the top of the site. In the page that appears, type in your name and email address and click on the "Start the trial >>" button. In the next page, click on "Click here to start the trial". In the next page, type in the first keyword that you developed in Step 1, i.e. "tourism in Australia", in the text box. Click on the "Proceed >>" button.
Step 4: In the next page, WordTracker will display a list of keywords related to the keyword that you had typed in. (Just scroll down the left pane to see the keywords). Now, click on the first keyword in the left pane which is applicable for your site. In the right pane, WordTracker will show a list of keywords which contain the keyword you had clicked on in the left pane.
Then in the table that you have created in your spreadsheet, copy each of the keywords in the right pane and paste them in the first column of the table. Also, copy the number of times those keywords have been used (i.e. the figure present in the Count column in WordTracker) and paste them in the second column. In order to ensure that you can follow me, make sure that you type the first keyword in the second row of your spreadsheet. Of course, you should only bother adding a keyword to your spreadsheet if it is applicable for your site.
Once you have added all the keywords in the right pane which are applicable for your site, click on the next keyword in the left pane which is applicable for your site. Once again, WordTracker will display a list of keywords in the right pane which contain the keyword you had clicked on in the left pane. Again, copy the keywords in the right pane which are applicable for your site and paste them in the first column of your spreadsheet. Also, copy the figures present in the Count column and paste them in the second column beside the corresponding keywords.
Repeat this process for each of the keywords in the left pane.
Step 5: Once you have finished with all the keywords in the left pane, press your browser's Back button a number of times until WordTracker again displays the text box which asks you to type in a keyword. Type in the second keyword in your original list (i.e. "travel to Australia"), click on the "Proceed >>" button and repeat Step 4.
Do this for each of the keywords that you developed in Step 1.
Step 6: Go to AltaVista. Search for the first keyword that is present in your spreadsheet using an exact match search (i.e. wrap the keyword in quotation marks). AltaVista will return the number of sites which are relevant to that keyword. Add this number to the third column of the spreadsheet in the same row in which the keyword is present. Repeat this process for each of the keywords present in your spreadsheet.
Once you have done that, your first column will contain the keywords, your second column will show the popularity of the keywords and your third column will contain the number of sites you are competing against to get a high ranking for those keywords.
Now it's time to calculate the KEI!
Step 7: The Keyword Effectiveness Index is the square of the popularity of a keyword, multiplied by 1000 and divided by the number of sites which appear in AltaVista for that keyword. It is designed to measure which keywords are worth optimizing your site for. The higher the KEI, the better the keyword. For example, a keyword with a popularity of 120 and 40,000 competing sites would have a KEI of 120 x 120 x 1000 / 40,000 = 360. How the formula for the KEI is arrived at is beyond the scope of this article. If you want to know, send a blank email to kei@sendfree.com.
If you downloaded the spreadsheet file that I created for you (see Step 2), you won't need to enter the formula for calculating the KEI yourself. The KEI will be calculated automatically the moment you enter the values in columns 2 and 3. You can go straight to Step 8.
In case you didn't download the file, here's how you can calculate the KEI.
I am assuming that you have created the spreadsheet columns in the way I recommended in Step 2 and that you are using Microsoft Excel. If you are using some other spreadsheet program, you will need to adjust the formula to the requirements of your spreadsheet program. Click on cell D2. Type in the following exactly as it is shown:
=IF(C2<>0,B2^2/C2*1000,B2^2*1000)
Then click on the Copy button to copy the formula, select all the cells in column 4 which have keywords associated with them and press the Paste button to paste the formula. The KEI for each keyword will be displayed.
Step 8: Use your spreadsheet program's Sort feature to sort the rows in descending order of the KEI. In Excel 97, you would click on the Data menu, click on the Sort menu item, choose KEI from the drop-down combo box named "Sort by", click on the "Descending" option next to it, and then click on OK.
And guess what - that's it! You now know the keywords which you should optimize your site for. You can now start optimizing your site one by one for each keyword, starting with the keyword with the highest KEI. Exactly how many of the keywords you choose to optimize your site for largely depends on the amount of time that you can spare from your normal business activities. But whatever the number of keywords that you target, it obviously makes sense to go for the most effective keywords first.
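If you are comfortable with a little scripting, the same calculation can also be done outside a spreadsheet. The short Python sketch below simply mirrors the KEI formula from Step 7 and the sort from Step 8; the keywords, popularity figures and competitor counts in it are made up purely for illustration, and Python is just one convenient choice of language for the example.

keywords = [
    # (keyword, popularity, number of competing sites reported by the engine)
    ("tourism in australia", 120, 40000),
    ("travel to australia", 95, 25000),
    ("australian travel agencies", 30, 3000),
]

def kei(popularity, competitors):
    # KEI = popularity squared, multiplied by 1000, divided by the number of
    # competing sites - the same logic as =IF(C2<>0,B2^2/C2*1000,B2^2*1000)
    if competitors == 0:
        return popularity ** 2 * 1000
    return popularity ** 2 / competitors * 1000

# Sort the keywords in descending order of KEI, as in Step 8
ranked = sorted(keywords, key=lambda row: kei(row[1], row[2]), reverse=True)

for keyword, popularity, competitors in ranked:
    print(f"{keyword}: KEI = {kei(popularity, competitors):.1f}")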
Tying up the loose ends:
The number of related keywords that WordTracker displays in the trial version is limited. In order to get all the keywords which are related to the keywords you had developed in Step 1, you would need to subscribe to WordTracker's paid service. We highly recommend that you do subscribe to WordTracker's paid service as otherwise, you will miss out on a lot of keywords that can prove to be extremely valuable to you.
http://www.articles-hub.com/Article/47469.html
Google Page Rank - important or just another number?
In my last newsletter I wrote about how your website's Alexa rating is not actually that important to the success of your online business. In this issue, I want to look at another popular statistic - Google Page Rank - and ask a similar question: is it that important?
First a quick overview as to what the Google Page Rank actually is...
Google Page Rank (or PR, as it is often called) is simply an indication of the number of websites that link to a specific website. It also attempts to indicate the quality of those links. PR ranges from 0 to 10 (with 10 being the 'best' PR and 0 being the 'worst'). The vast majority of small business websites will usually find they have a PR of between 0 and 5.
To calculate a particular site's PR, Google uses a fairly complicated algorithm based on the number of web links that it is aware of pointing to the site in question. This algorithm also takes into account the PR of the page that is providing the link, so a link from a web page with a PR of 7 will be considered more valuable than a link from a page with a PR of 4.
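For readers who want to see what that idea looks like in practice, here is a minimal Python sketch based on the PageRank formula published in the original paper by Google's founders: PR(A) = (1 - d) + d * sum(PR(T) / C(T)) over every page T that links to A, where C(T) is the number of outbound links on T and d is a damping factor (commonly 0.85). Google's production algorithm is not public, and the 0-10 toolbar figure is generally believed to be a roughly logarithmic scaling of scores like these, so treat this purely as an illustration; the three-page 'web' in it is invented for the example.

# A tiny, made-up link graph: each page maps to the pages it links to
links = {
    "home": ["about", "articles"],
    "about": ["home"],
    "articles": ["home", "about"],
}

def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    pr = {page: 1.0 for page in pages}      # start every page with an equal score
    for _ in range(iterations):             # repeat until the scores settle down
        new_pr = {}
        for page in pages:
            inbound = (pr[src] / len(links[src])
                       for src in pages if page in links[src])
            new_pr[page] = (1 - d) + d * sum(inbound)
        pr = new_pr
    return pr

for page, score in sorted(pagerank(links).items(), key=lambda item: -item[1]):
    print(f"{page}: {score:.3f}")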
Because links from pages with a higher PR are considered more important, many people choose to buy links from websites with high PRs just so that they can increase their own PR. I have seen sites selling a simple text link on their home page for over $700 a month purely on the strength of having a PR of 7 or above. This may seem like a lot of money on its own, but when you consider that the website owners buying these links often have websites that are in no way relevant to the content of the site linking to them, it is absolutely ridiculous.
Take this example: let's say you have a website about health and fitness and you buy a link for $500 a month from a random website because it has a PR of 7. This random website has no relevance to your health and fitness site, so what is going to happen? Well, your own PR may increase as a result of the link. You may get a bit of extra traffic, but probably not much, since people don't click on links that they are not interested in. You will definitely be $500 poorer at the end of the month!
Instead, why not spend the $500 on pay-per-click advertising and benefit from some quality, targeted traffic?
Of course, there is a bit more to it than that, and the reason most people want to increase their PR is that Google takes this statistic into account when determining where a website will be displayed in its search results. Many people assume that a high PR automatically equals a high search engine placement for their chosen keywords. Not so....
PR is just one of over 100 different factors that Google takes into account when deciding where your website will feature (and these factors and the main algorithm change on a very regular basis). It is perfectly possible for a website with a PR of 5 to get a higher ranking than a PR 7 site if it has better content or is more relevant for the search term in question.
Remember that relevance is all important with Google and a link from a website that is not relevant to your own site will be considered far less important than a relevant one (which makes buying links from random sites purely because they have a high PR even more crazy).
I have read several rumours lately that Google hasn't updated PRs for a couple of months and is considering phasing PR out or modifying it in some way. This is pure speculation, but it wouldn't surprise me in the least. PR is easily manipulated (for example, by purchasing links as described above) and Google doesn't like to have its calculations or search results manipulated. It stands to reason that it will be looking at ways of preventing this.
So, in summary, is Google Page Rank important to your business?
Well, it is a good indicator of how many other sites link to yours and how important Google considers your site to be BUT I personally don't place too much importance on this statistic and I certainly won't be paying out for a link from a website just because it has a high PR.
As I said above, Google changes its rules on a regular basis and I see little point in chasing a particular PR on the basis that it might get you higher search engine rankings. If Google does decide to do away with PR, all your work will have been for nothing.
Instead, concentrate on building quality, relevant links from sites that are connected in some way to your own site content. This will ensure that any traffic you receive via these links will at least have an interest in your site. Building links on this basis will automatically increase your PR over time (without the need to pay out for overpriced, irrelevant links). If you do things this way and Google does scrap the PR indicator, it shouldn't affect you in any way and the links you have in place will continue to benefit you.
Remember, in the same way that a low Alexa rating doesn't guarantee traffic or sales, neither does a high PR. Sure, a high PR is a 'nice to have', but lots of traffic and high sales are even nicer :-)
http://www.articles-hub.com/Article/47486.html
Your Alexa rating - is it really that important?
A discussion in my forum has prompted this newsletter, as I think it is important for any online entrepreneur to understand how Alexa ratings work so that they can see why those ratings may not be that important to the success of their online business. Too many people make the mistake of concentrating entirely on improving their Alexa rating but, at the end of the day, these rankings are just numbers - they won't put money in the bank.
Alexa is a company owned by the Amazon group and it aims to rank every single website on the Internet in terms of how much traffic it is receiving.
Quite simply, the lower your Alexa ranking, the more traffic your website gets. The ideal scenario would be to have an Alexa ranking of '1'. This would mean, in theory, that your website receives more traffic than any other website in the world. Currently this position is held by Yahoo.com and as you would expect, other top sites include Microsoft, Google and eBay.
It is generally considered that a website with an Alexa rating of 100,000 or less is receiving a reasonable level of traffic but Alexa can be wildly misleading and very easily manipulated.
To understand why the rankings are misleading, you need to understand how Alexa gathers the data that it uses to create the rankings in the first place. This is really very simple - Alexa has a free toolbar that you can download and install within your Internet browser and this reports back to Alexa with the details of every single website that you visit.
http://www.articles-hub.com/Article/47488.html
Is it worth chasing search engine rankings?
Copyright © 2005 Richard Grady
We all know that the best way of getting quality, free traffic is by achieving a first page ranking on one of the main search engines. For the past few years, Google has held the position of being the most popular and probably the best search engine there is and as such, this is the search engine that everyone wants their site listed on.
Of course, I am no exception to this and my various websites hold numerous first page rankings on Google and have done for several years.
However, like so many things on the Internet, your search engine ranking is completely out of your control. Sure, there are things you can do to make your website more 'attractive' to the search engines but if they decide that what was attractive yesterday is not attractive today, then you can be dropped like a stone.
Towards the end of last year (2003) I wrote about how Google had suddenly changed its algorithm (the formula that it uses to decide which order websites appear in within the search results) and caused me to lose countless top place rankings. These were rankings that I had held for up to three years without change but suddenly they were gone.
Getting those rankings had taken many weeks of work and, whilst it had been worthwhile, it wasn't something that I fancied doing all over again only to find that Google changed its mind once more and put me back at square one.
Therefore, I took the decision that I wasn't going to make any changes to my websites and if Google no longer wanted to include them for some keywords, then so be it.
Following my decision, two things happened that are really the moral of this story....
Firstly, despite the loss of the above rankings, my sales were not affected in any noticeable way.
And secondly, this week, most of my rankings have come back AND I have gained some extra ones for competitive keywords that I have spent years trying to obtain but always without success!!
Remember, I didn't make any changes to my websites to 'win back' these positions. It was simply down to Google amending its algorithm once again.
Naturally, I am very pleased that my website rankings on Google have been restored but I am even more pleased that I didn't spend time back in December re-designing my sites to try and get Google to re-index them. Had I done this, I could very well have gained the rankings back for three months and then lost them again this week.
Of course, just because my websites are back where they should be in the Google index (in my opinion at least!), it doesn't mean that they will stay there. There is every possibility that I could lose the rankings again tomorrow, next week, next month, next year.... It could go the other way and I might gain even more top positions - who knows?
Running an online business involves a huge amount of testing, and this doesn't just apply to small website owners like you and me. It includes the big companies like Google too. When Google (or any other search engine) changes its algorithm, it is simply testing to see if it can improve its service, increase revenue, or whatever.
Do you really want to spend your time chasing a top position on Google or any other search engine when the factors that enable you to gain that position could be changed completely at any moment, entirely outside your control?
My advice is to stick to the basic rules when designing a website - excellent content, quality inbound links and so on - and let the search engines sort themselves out.
http://www.articles-hub.com/Article/47490.html