Have you started paying attention to web technology and to search engine marketing for your website? If you have not begun marketing your website yet, it is not too late, but you should start now without further delay. Here are concrete reasons why you should market your website through search engines and directories.
* Your competitors are ranking well on the search engines.
Your competitors have adopted search engine optimization and have started attracting your prospective customers. It is frustrating to watch companies that are inferior in quality and service win more and more business simply through internet marketing.
* Your customers and potential customers are searching the web to find your products.
Gone are the days when potential customers gathered their initial information about products and services from yellow pages and business directories. They have become more technology oriented and use the internet to read about and research products and services. They shortlist promising companies and open negotiations with them. If your site is not well listed in the search engines, you will miss the chance to quote.
* A well-promoted website makes it easy for people to refer new customers to you.
For most businesses, referrals are the most important source of new customers. A well-promoted website makes referrals easy to encourage, because customers can simply send your URL to their business contacts. And referrals often sway potential customers toward one manufacturer over another, so it pays to be the one being recommended.
* Search engine optimization turns your website into a global online office, open to your potential customers 24 hours a day, 7 days a week, 365 days a year.
Even when you are relaxing, your website is working for you, constantly providing information about your company, business, and services. Prospective customers do not have to contact you personally for basic information; they need only get in touch by email once they are convinced they want some kind of business association with your company. This saves you a great deal of time and energy.
* From remote destinations or distant countries, your customers can buy your products and pay for them without leaving their computers.
Selling online has become so common around the world that people now close deals through websites regardless of the nature or value of the product. You can do business with customers in any part of the world, with payment made directly into your bank account. You can also be confident of being paid, because you do not dispatch the product until the money arrives, whereas in the local market you often have to ship first and then wait for payment, sometimes for a long time. Online, you are effectively doing business in cash while widening the reach of your product across the world.
* Search engine marketing increases the value of your product.
You can publish page after page about your company, products, and services in far more detail than anyone else can offer. A website that is well placed in the search engines draws the attention of prospective customers and gives you the opportunity to explain your superiority over your competitors.
* Any new product or update to your product range reaches your customers faster.
A well-promoted site conveys the message through its search engine listings. If your site ranks well, updates to it are reflected quickly, because the search engine spiders visit your site regularly. So when you introduce or upgrade a product and a prospective customer searches for it, your site will draw attention on the relevant keywords, and new business comes your way sooner. Just compare this rapid updating of your website in the search engines with reprinting, redesigning, and redistributing a new sales brochure.
* Search engine marketing helps you reach potential customers worldwide, nationwide, or locally at no extra cost.
Thanks to search engine marketing, the wider your business prospects, the lower your promotion cost per prospect. It has rightly been said that search engine marketing is a godsend for small but genuinely ambitious businesses that want to reach a global audience. If your website is well listed, it opens the door to business opportunities from all over the world.
* Search engine marketing can save you a lot of money.
Just compare the cost of printing leaflets, advertising in magazines, and mailing against the cost of search engine marketing. Search engine marketing reaches a far wider audience than conventional marketing activities, and those activities cost much more than search engine promotion.
* Your website's positioning in the search engines increases your credibility.
Imagine the miracle of technology: your website listed above multimillion-dollar companies, or your company placed above the world leader in your product range. Prospective customers who visit your website will grant you credibility on a par with those multimillion-dollar companies. A high-ranking website makes a powerful impact on a potential customer and creates confidence in you. If your website is well designed and well positioned, visitors cannot help but be impressed.
* You can test new products or services instantly at no extra cost.
Once your site is well positioned in the search engines, you can add new information to the website and have that content rank highly as well. Based on the response you receive about new services or products through the search engines, you can make business decisions about their scope. In other words, search engine marketing can replace conventional market research, gathering feedback from a larger and wider audience at no extra cost.
* Search engine optimization helps you stay in contact with potential customers.
A prospective buyer may be interested in your product but not ready to purchase right now. He will visit your site just to collect information, and when he is ready to buy he will search the internet again. If you work with a knowledgeable search engine marketing company, your position in the search engines will have improved or at least held steady, giving that buyer more confidence. Staying at the top of the search engine rankings keeps your company fresh in visitors' minds.
* Leading in the search engines suggests that you are a leader in your business too.
If your website ranks highly in most of the search engines and directories, customers will encounter it wherever and whenever they surf. This creates a solid impression of your company, products, and services in the mind of the prospective buyer; in other words, it establishes leadership status in the customer's mind, which leads to more business opportunities. So get to the top: take up search engine marketing with a knowledgeable firm. It is among the most effective and most economical marketing technologies available.
http://www.topranker.in/seo_articles.htm
Wednesday, May 30, 2007
How search engine marketing tools can work for you: or, searching is really all about finding
Summary
This is the second of three articles. Part 1 appeared in the August issue of Information Outlook.
Search engine optimization and marketing covers a wide range of activities, many of which are similar to what a reference librarian, systems librarian, or market researcher does. Although the focus is the World Wide Web, many of the tools that are used have broader applications for special librarians.
Internal corporate processes. Web analytics tools measure and analyze corporate sales, customer preferences and problems, viable products and channels, and other issues that may provide answers for questions received by special librarians.
Competitive intelligence/market research. Keyword research, Web site saturation and popularity tools can provide information on a company's competitors: how they are marketing on the Internet, what they are spending on online marketing campaigns, how they are pricing their products.
Legal issues. Who Is tools can provide valuable information relating to copyright and trademark issues. Link Popularity tools can show who is deep-linking to your site. Log files, in conjunction with Who Is tools, can tell you who may be committing click fraud on your paid placement campaigns or spamming your e-mail servers.
Back end knowledge of how Web sites work. These tools can show you what may be keeping search engines from indexing your site and can highlight customer service issues.
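To make that last point concrete, here is a minimal Python sketch of the kind of check such back-end tools automate: asking whether a crawler is even permitted to fetch a given page under the site's robots.txt. The site and page URLs below are hypothetical placeholders, and this is only an illustration, not any particular vendor's tool.

```python
# Minimal sketch: is a page blocked from crawling by robots.txt?
# The site and page below are hypothetical placeholders.
from urllib.robotparser import RobotFileParser

SITE = "https://www.example.com"           # hypothetical site
PAGE = SITE + "/products/widgets.html"     # hypothetical page to test

parser = RobotFileParser()
parser.set_url(SITE + "/robots.txt")
parser.read()                              # download and parse robots.txt

for agent in ("Googlebot", "*"):
    verdict = "allowed" if parser.can_fetch(agent, PAGE) else "blocked"
    print(f"{agent}: {verdict} from fetching {PAGE}")
```

A robots.txt block is, of course, only one possible barrier; the tools discussed above also look at frames, JavaScript navigation, session identifiers and similar obstacles that a short script like this cannot see.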
http://findarticles.com/p/articles/mi_m0FWE/is_10_9/ai_n15890934
GeoTrust updates TrustWatch Search engine
INTERNET BUSINESS NEWS-(C)1995-2005 M2 COMMUNICATIONS LTD
Additional features have been added to the TrustWatch Search engine by GeoTrust Europe Inc, a provider of identity verification solutions for e-business and digital certificates.
TrustWatch Search is a free trusted search service aimed at preventing consumers from being victims of identity theft, phishing scams and web-based fraud. It verifies the security of a website by placing a green, yellow or red verification symbol beside each search result and displays CNET's Certified Store rating data, allowing consumers to evaluate merchant sites using additional data.
As well as verifying secure sites for e-commerce and other transactions, the service now also verifies a number of content sites.
TrustWatch Search offers a site report facility that covers site verification and SSL encryption, store ratings and website reviews, security certification, site attribute information and privacy policies. It enables consumers to evaluate and verify the integrity of a website, review merchants and report suspicious, phishing-related activity.
Feedback is collected and evaluated by GeoTrust, forwarded to anti-fraud organisations as appropriate and kept for potential future use in TrustWatch ratings. The updated TrustWatch Search can be found as a search box in the TrustWatch anti-phishing toolbar from GeoTrust and through http://www.TrustWatch.com as a destination consumer search site.
http://findarticles.com/p/articles/mi_m0BNG/is_2005_Oct_27/ai_n15761539
Tuesday, May 29, 2007
Reviews of Two Legal Search Engines for Consumers
I'm a believer in the old adage "A little knowledge is a dangerous thing," but not when I need the services of a lawyer. I find that when I'm familiar with the basic legal facts about a matter, my lawyer and I can accomplish my goal with a minimum of confusion, in the shortest time possible.
For example, if I'm drawing up a will, it helps to know who can be chosen as an executor, what a living trust is, and how property can be divided, so that I can mull over my own preferences about these issues before seeing my attorney.
Of course, if my legal facts are not accurate, they will be less than worthless. Two specialized search engines that provide access to reputable legal information for consumers are USLaw.com and FindLaw for the Public.
USLaw.com - CLEAR, COMPREHENSIVE LEGAL INFORMATION FOR CONSUMERS
USLaw.com is without a doubt the best place to go for consumer information about legal matters. The information you'll find here is consistently well-organized, well-written and informative. Although the articles are not overly long, they are amazingly comprehensive, and touch on the major issues relating to a topic.
There are about 2,000 articles in USLaw's database. The main areas it covers are family law, employment, immigration, real estate, finance, health and injury, crime and the courts, commercial issues and small business. Most seem to have been written specifically for USLaw.com by professionals in various fields of law, or have been excerpted from books.
Although the database is not large, its scope is also relatively limited - the common legal concerns of individuals and small businesses - so most searches still result in a fairly long list of relevant, helpful articles.
Search Tips
No search operators are available in USLaw.com. Entering one or two search words having to do with your subject will usually retrieve the best results. For example, if you are withholding your rent payments because your landlord is not providing heat and you want to find information about the legal ramifications, you might enter
withhold rent
This would retrieve 13 articles, most with direct relevance to your subject (e.g. Landlord and Tenant: Solutions for Breach of the Warranty of Implied Habitability and Landlord and Tenant: When Payment of Rent is Excused).
Each search result includes the title of the item and a short description.
Special Features
Special features at USLaw include a search engine for locating USLaw.com prescreened attorneys in your area, plus two fee-based services - access to the advice of a staff lawyer either in real time or by e-mail, and a service enabling you to create your own legal documents.
FindLaw FOR THE PUBLIC - RETRIEVES LEGAL INFORMATION FROM MANY SOURCES
FindLaw for the Public is part of FindLaw, a very large site that is geared to providing information for legal professionals. A particularly useful feature of FindLaw for the Public is its set of legal "guides" covering many areas of law that are of special interest to consumers. These guides contain links to various types of information and resources, including articles, FAQs, legal forms, message boards, etc.
Search Tips
FindLaw for the Public gives you a choice of 10 databases to search (on its drop-down menu) - FindLaw, document library, legal dictionary, legal news, legal websites, all websites, US government sites, US Supreme Court, all Circuit Courts, and the US Constitution. For the average consumer, searching the legal websites will provide the most useful information.
When searching the legal websites, you can use the minus sign to exclude a word and quotation marks to indicate a phrase.
Results
The format of the items retrieved differs among the databases. Items resulting from a search of the legal websites contain the title, a description and the URL.
Special Features
You can find lawyers in many specialties using FindLaw's extensive search engine. In addition, you can get information and advice relating to your topic, and read many sad stories at FindLaw's message boards.
http://www.suite101.com/article.cfm/search_engines/88089/1
Monday, May 28, 2007
Surplus Of Search Engine Marketing Reports
It's been a busy few weeks for reports about search engine marketing issues, with three different publications having been released recently. They cover the issues of selecting a search engine marketing firm, which search engine marketing strategies seem to work and a review of how the web sites of Fortune 100 companies rate in terms of search engine friendliness.
The first, "Buyers' Guide to Search Engine Optimization & Positioning Firms," is an outstanding resource that fills a long-needed gap. There has simply never been this type of comprehensive guide to firms that conduct search engine marketing, and the report is a real must buy for anyone considering outsourcing their search engine marketing efforts.
Published by MarketingSherpa, the 100-page report covers 24 search engine marketing firms, with at least another 31 to be added over the coming months. Along with profiles of the firms, it also has an outstanding introduction to the many different ways that search engine optimization firms approach the task of building traffic. The report costs $119 when downloaded in PDF format, or $129 for a print version.
Who made the cut to be reviewed? A request for recommended search engine marketing firms was posted to several mailing lists frequented by web marketers. Over 100 responses came back, and the initial list of 24 firms was compiled once duplicates and subcontractors were removed. The idea was to use this type of survey to identify the leading firms, at least in terms of reputation.
It was a good approach, especially since the SEO industry lacks any type of trade group to determine who is biggest. Nor does biggest necessarily mean best. After all, independent consultants might perform work as outstanding as that of firms with many employees. At least this report provides a first benchmark for "who's who" in the industry.
One part of the MarketingSherpa report calls search engines "the ultimate, cost-effective online guerrilla marketing tactic" and adds that "if search engines are not driving a solid percentage of your traffic, then you have not optimized properly." But what's a solid percentage? The report notes this could vary from 5 to 100 percent, depending on your overall marketing mix.
Isn't there a better figure to be found? That's one of several questions that "Search Engine Optimization Strategies: A Marketer's Perspective" aimed to answer.
I know this new 46-page report well, because I cowrote it with Search Engine Watch associate editor Chris Sherman for CyberAtlas. Whereas the MarketingSherpa report is a first in surveying SEO firms, the CyberAtlas report is a first in surveying so many web site owners about the wide range of tactics they follow to achieve success with search engines. The report is $195 when downloaded in PDF format, using the special URL below.
The focus wasn't on specific search engine optimization techniques, such as how many search terms can go into a meta keywords tag. Instead, it was an overview of the collective search engine marketing experiences of over 400 web site owners and marketers.
For instance, we found that there was no conclusive answer to the "how much traffic should you get from search engines" question. Nearly as many respondents (24 percent) said they received more than 75 percent of their web traffic from search engines as those who indicated they got less than 25 percent. In fact, answers for the entire range of traffic volumes were nearly evenly matched in responses. Truly, how much traffic others get is no great measure of how well or poorly you are doing.
The report examined the take-up of paid participation programs. Paid submission to Yahoo and LookSmart and paid listings with GoTo were by far the most popular programs, each used by over 30 percent of those surveyed (you could select as many programs as you used). Paid inclusion from Inktomi and Google's paid placement programs also had significant usage, around the 20 percent mark.
Another finding was that about half (46 percent) of those responding said they spent less than 0.5 percent of their annual marketing budgets on search engine optimization, which was pretty shocking, given that search engine listings are becoming widely recognized as one of the most targeted and cost-effective types of advertising available.
The third search engine-related report released recently comes from search engine positioning firm iProspect. The company surveyed the web sites of the Fortune 100 companies and found that 97 percent of them had some type of site architecture problem that could hinder their being found by search engines.
In particular, the report examined the use of JavaScript, Flash, frames and some popular dynamic delivery systems, all of which may cause crawler-based search engines to miss indexing pages. The study also examined the use of title tags and both the meta keywords and description tags, as well as link popularity.
One finding highlighted in the report's press release is that half the companies reviewed had meta keywords tags. However, those tags generally didn't mention the key products or services offered by the companies. If they had, it might have increased the companies' chances of getting some valuable traffic from search engines.
The press release overstates the problem, however, suggesting that the lack of accurate meta data means that pages about products and services from these companies "will in all likelihood not appear in the query results." The web's major human powered search engines, such as Yahoo, make no use of meta data to compile their listings. In addition, Google -- one of the web's most popular crawler-based search engines -- also doesn't use the meta keywords tag. In these instances, accurate or inaccurate meta data would have no bearing on rankings.
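To make the meta tag finding concrete, here is a rough Python sketch of the kind of check the study describes: download a page and see whether its meta keywords and description tags mention the company's key products. The URL and product terms are made-up placeholders, and this is only an illustration, not the study's actual methodology.

```python
# Rough sketch: do a page's meta keywords/description mention key product terms?
# The URL and the product terms below are hypothetical placeholders.
from html.parser import HTMLParser
from urllib.request import urlopen

class MetaTagExtractor(HTMLParser):
    """Collects the content of <meta name="keywords"> and <meta name="description">."""
    def __init__(self):
        super().__init__()
        self.meta = {}

    def handle_starttag(self, tag, attrs):
        if tag == "meta":
            a = dict(attrs)
            name = (a.get("name") or "").lower()
            if name in ("keywords", "description"):
                self.meta[name] = (a.get("content") or "").lower()

url = "https://www.example.com/"            # hypothetical corporate home page
product_terms = ["widgets", "gadgets"]      # hypothetical key products/services

html = urlopen(url).read().decode("utf-8", errors="replace")
extractor = MetaTagExtractor()
extractor.feed(html)

for name in ("keywords", "description"):
    content = extractor.meta.get(name, "")
    hits = [term for term in product_terms if term in content]
    print(f"meta {name}: {', '.join(hits) if hits else 'no key product terms found'}")
```

Run against a real corporate home page with that company's actual product names, a check like this would reproduce the gist of the study's meta tag comparison.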
The report is available in two forms. A $195 summary version provides an overview of findings, with some interesting comparisons to how things have changed since 1998 (better, but there's plenty of room for improvement), along with an industry-by-industry breakdown. The $2,000 full copy presumably provides a close-up look at each company's web site, with details on the problems that were found.
http://searchenginewatch.com/showPage.html?page=2164261
The first, "Buyers' Guide to Search Engine Optimization & Positioning Firms," is an outstanding resource that fills a long-needed gap. There has simply never been this type of comprehensive guide to firms that conduct search engine marketing, and the report is a real must buy for anyone considering outsourcing their search engine marketing efforts.
Published by MarketingSherpa, the 100 page report covers 24 search engine marketing firms, with at least another 31 to be added over the coming months. Along with profiles of the firms, it also has an outstanding introduction to the many different ways that search engine optimization firms approach the task of building traffic. The report costs $119, when downloaded in PDF format, or $129 for a print version.
Who made the cut to be reviewed? A post for recommended search engine marketing firms was sent to several mailing lists frequented by web marketers. Over 100 responses came back, and a list of the initial 24 firms was compiled, when duplicates and subcontractors were removed. The concept was to use this type of survey to find what are the leading firms, at least in terms of reputation.
It was a good approach, especially in that since the SEO industry lacks any type of trade group to determine who is biggest. Nor does biggest necessarily mean the best. After all, independent consultants might perform outstanding work as well as firms with many employees. At least this report provides a first benchmark to "who's who" in the industry.
One part of the MarketingSherpa report calls search engines "the ultimate, cost-effective online guerrilla marketing tactic" and adds that "if search engines are not driving a solid percentage of your traffic, then you have not optimized properly." But what's a solid percentage? The report notes this could vary between 5 to 100 percent, depending on your overall marketing mix.
Isn't there a better figure that can be found? That's one of the questions several questions that "Search Engine Optimization Strategies: A Marketer's Perspective" aimed to answer.
I know this new 46-page report well, because I cowrote it with Search Engine Watch associate editor Chris Sherman for CyberAtlas. Whereas the MarketingSherpa report is a first in surveying SEO firms, the CyberAtlas report is a first in surveying so many web site owners about a wide-range of tactics they follow to achieve success with search engines. The report is $195, when downloaded in PDF format, using the special URL below.
The focus wasn't about specific search engine optimization techniques, such as how many search terms can go into a meta keywords tag. Instead, it was an overview of the collective search engine marketing experiences of over 400 web site owners and marketers.
For instance, we found that there was no conclusive answer to the "how much traffic should you get from search engines" question. Nearly as many respondents (24 percent) said they received more than 75 percent of their web traffic from search engines as those who indicated they got less than 25 percent. In fact, answers for the entire range of traffic volumes were nearly evenly matched in responses. Truly, how much traffic others get is no great measure of how well or poorly you are doing.
The report examined the take-up of paid participation programs. Paid submission to Yahoo and LookSmart and paid listings with GoTo were by far the most popular programs, each used by over 30 percent of those survey (you could select as many programs as you used). Paid inclusion from Inktomi and Google's paid placement programs also had some significant usage, around the 20 percent mark.
Another finding was that about half (46 percent) of those responding said they spent less than 0.5 percent of their annual marketing budgets on search engine optimization, which was pretty shocking, given that search engine listings are becoming widely recognized as one of the most targeted and cost-effective types of advertising available.
The third search engine-related report released recently comes from search engine positioning firm iProspect. The company surveyed the web sites of the Fortune 100 companies and found that 97 percent of them had some type of site architecture problem that might give them problems being found by search engines.
In particular, the report examined the use of JavaScript, Flash, frames and some popular dynamic delivery systems, all of which can may cause crawler-based search engines to miss indexing pages. The study also examined the use of title and both the meta keywords and description tags, as well as link popularity.
One fact from the report's press release is that the study found that half the companies reviewed had meta keyword tags. However, it determined that the tags didn't make use of key products or services offered by those companies. If they had, it might have increased the chances of the companies getting some valuable traffic from search engines.
The press release overstates the problem, however, suggesting that the lack of accurate meta data means that pages about products and services from these companies "will in all likelihood not appear in the query results." The web's major human powered search engines, such as Yahoo, make no use of meta data to compile their listings. In addition, Google -- one of the web's most popular crawler-based search engines -- also doesn't use the meta keywords tag. In these instances, accurate or inaccurate meta data would have no bearing on rankings.
The report is available in two forms. A $195 summary version provides an overview of findings, with some interesting comparisons to how things have changed since 1998 (better, but there's plenty of room for improvement), along with an industry-by-industry breakdown. The $2,000 full copy presumably provides a close up look at each company's web sites, providing details on problems that were found.
http://searchenginewatch.com/showPage.html?page=2164261
Sunday, May 27, 2007
Spam Rules Require Effective Spam Police
What's spam? Search columnist Kevin Ryan wasn't quite correct when he said in his recent article that there are no standards. Indeed, the two major search providers each publish standards about what they find unacceptable. Here they are: Google's and Yahoo's.
Ryan's summary of things to avoid, like others that have appeared recently from Shari Thurow (here and here), Dave Wallace and Jill Whalen, is a good one.
Clearly, knowing what's widely considered spam isn't hard. So why aren't people "playing by the rules?" Here are just a few reasons:
* The rules for each search engine aren't necessarily the same, though they do seem closer today than in the past.
* Some people have a "the cheaters are winning" mentality. If they see spam slip through, they feel like they should do the same (regardless of the fact that the "cheat" technique might not actually be what's helping a page rank well).
* Some people simply don't agree with the search engine rules. For example, they might justify the use of "hidden text" via cascading style sheets to make up for the fact that their home page has no text (the sketch after this list shows what that looks like). To them, search engines are simply stupid for not accepting this.
* Some people simply don't care. As far as they're concerned, as long as a person gets to something relevant to their search topic, who cares how it came up?
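To show what the hidden text mentioned in the list above actually looks like, here is a crude Python sketch with made-up markup. It flags text inside elements hidden by inline CSS; it is purely an illustration of the technique being discussed, not how any search engine actually detects it.

```python
# Crude illustration: flag text inside elements hidden with inline CSS.
# The sample markup is made up; this is not how search engines detect hidden text.
import re
from html.parser import HTMLParser

HIDDEN_STYLE = re.compile(r"display\s*:\s*none|visibility\s*:\s*hidden", re.I)
VOID_TAGS = {"br", "hr", "img", "input", "link", "meta"}   # elements with no closing tag

class HiddenTextFinder(HTMLParser):
    def __init__(self):
        super().__init__()
        self.depth = 0          # > 0 while inside an element hidden by inline CSS
        self.hidden_text = []

    def handle_starttag(self, tag, attrs):
        if tag in VOID_TAGS:
            return
        if self.depth or HIDDEN_STYLE.search(dict(attrs).get("style") or ""):
            self.depth += 1

    def handle_endtag(self, tag):
        if tag not in VOID_TAGS and self.depth:
            self.depth -= 1

    def handle_data(self, data):
        if self.depth and data.strip():
            self.hidden_text.append(data.strip())

sample = ('<p>Welcome to our site.</p>'
          '<div style="display:none">cheap widgets cheap widgets cheap widgets</div>')
finder = HiddenTextFinder()
finder.feed(sample)
print("Hidden text:", finder.hidden_text or "none found")
```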
How About Standards?
Ryan's commentary sees one solution to spam as lobbying for standards. This idea has been floating around for ages and has gone nowhere.
SEM pioneer Paul Bruemmer pushed for search engine optimization certification back in 1998. But as I wrote then, just having a "rule book" doesn't mean an end to spam.
There was also a push in 2001 for search engine marketing standards, which has likewise gone nowhere in terms of reducing spam in search engines.
How About Better Search Engine Disclosure?
Want a real solution to the spam problem? Then let's have the search engines agree to publish lists of firms and companies that they have banned. That would help consumers seeking an SEM firm understand which firms to avoid. Or, if they do use a banned firm, at least they've been warned about the consequences of going with a "rule" breaker.
It's something I've suggested before, and the search engines themselves have discussed the idea at various times in the past. It's never gone forward, because the search engines seem fearful of legal actions, should they out-and-out call a firm a spammer.
Given this, it's with some sympathy that I've defended the still new SEMPO search engine marketing industry group, when it has come under fire for not trying to ensure its members adhere to search engine spam guidelines.
SEMPO recently posted a FAQ explaining why it has declined to do this. I'd put it even more forcefully: if the search engines aren't brave enough to enforce their own laws, why should the onus be on a third-party group that doesn't even create these rules?
They Do Enforce!
Of course, search engines do police for spam. If they catch it, a page might be penalized or banned entirely. But that's neither the same as, nor as effective as, providing an offenders list, for a variety of reasons:
* An offenders list ensures that people who aren't ranking well or aren't listed for perfectly innocent reasons can discover that their problems are NOT due to a spam penalty. Far too often, people assume that they've "accidentally" spammed when they haven't, which may lead them to make changes they don't need to.
* Sometimes spam is simply allowed to continue. Google is especially famous for this, preferring to seek "algorithmic" solutions to removing spam rather than reacting immediately to spam reports and yanking material. Why? Google says it wants to detect overall patterns and come up with more comprehensive solutions. Unfortunately, the waiting period for this fuels the "anything goes" fears that some search marketers have when they see spam escaping prosecution.
* Disclosure helps searchers, not just companies that want to be listed. We've seen press outcry over filtering of adult content or filtering content in response to national laws, with the idea that perhaps Google and other search engines should disclose what they remove. But as I've written several times before, no search engine discloses what they've removed for spam reasons. That's something a searcher might want to know.
Disclosure...
To further expand on my last point, Google currently provides you with two ways to discover if they've removed material because of the Digital Millennium Copyright Act, a US law that can cause search engines to bar listings. A DMCA case involving the Church of Scientology in 2002 is probably the most famous example of this.
Google forwards the DMCA takedown requests it receives to the Chilling Effects Clearinghouse. So do a search there for Google, and you can review the material Google's been asked to remove.
I don't believe that Yahoo does the same thing and provides copies of its DMCA takedown requests. However, you can get some sense by searching for Yahoo at Chilling Effects. Ironically, this brings back requests to Google that were probably also bulk-directed to Yahoo and other search engines.
Google will also provide "inline" notification if it has removed or suppressed material that might otherwise have shown up in the results you were reviewing. For example, try a search for kazaa. At the bottom of the page, you'll see this notice:
In response to a complaint we received under the Digital Millennium Copyright Act, we have removed 5 result(s) from this page. If you wish, you may read the DMCA complaint for these removed results.
This is excellent disclosure (and not something I believe Yahoo or other search engines do). Google's not only telling you that it removed results; you can also click through to read the actual complaint that it reacted to.
...And Lack Of Disclosure
Now return to the spam issue. As a searcher, were you informed that material was removed from your search request? Nope. And might there be an economic incentive for search engines to ban sites? Absolutely. Removing sites may cause those sites to resort to taking out ads, exactly the accusation that Google came under (and strongly denied) at the end of last year, when many sites lost rankings.
Let's not single out Google. Long-time search engine marketer Greg Boser nearly brought an audience of search engine marketers to its feet in applause when he criticized the search engines themselves as needing better standards during a session at Search Engine Strategies in 2002.
One of the issues Boser complained about was how sites would get pulled, only to then hear from various search engines about how they could get back in through advertising or paid inclusion.
Real-Life Disclosure Example
Want a real-life example of the need for disclosure? Earlier this month, NetIQ, the maker of WebTrends, purchased the rank checking tool WebPosition. It was a pretty big deal. Perhaps someone went to Google and did a search for webposition to find out more about the software.
Good luck finding the official WebPosition site. It's not in the top results, which is unusual given that Google built its reputation largely on providing good navigation to official sites. Why not? Because WebPosition has no pages listed in Google at all. For ages, I've not seen Google list pages from WebPosition. It's probably banned, but of course Google doesn't confirm these things. And as a searcher, that's not something that was disclosed to you.
The Google-WebPosition problem? Google doesn't like the burden the popular rank checking software places on its system, so it explicitly warns people not to use it. But ironically, you'll notice that Google has no problem taking ads for the software from WebPosition's many resellers.
Let me stress -- similar things are happening at other search engines as well. The need for better spam disclosure is universal, not just a Google-specific problem.
Protect Yourself
Better disclosure would be helpful for so many reasons. It would help confused web site owners. It would help guide those seeking the many good, search engine marketing firms that diligently try to avoid trouble and play by the search engine rules. I'd love to see it happen.
Until it does -- or, more likely, since it may never come -- here are some additional suggestions to guide you:
* When it comes to spam, the search engines are the lawmakers. You operate within their borders and are subject to whatever rules they may or may not publish. Break the rules, and you can get tossed out of the country. So learn the rules, as much as you can -- assuming you want to avoid trouble.
* Outsourcing? Knowing the rules may not help if the SEM company simply makes up its own euphemisms to cover up things search engines don't like. So get references. You might also try contacting the search engines themselves for advice about the company, but don't hold your breath waiting for a response. Still, do it. Should you get banned, at least you have some evidence that you tried to do due diligence with the search engines themselves.
Looking for more on the topic of spam? Search Engine Watch members have access to the Search Engine Spamming page, which has a summary of things commonly considered bad to do, as well as a compilation of articles on the topic. The Search Engine Optimization Articles and Search Engine Marketing Articles pages also have a compilation of articles that touch on spam and past guidelines and standards attempts.
Another new spam resource is a scholarly paper that's just appeared out of Stanford, Web Spam Taxonomy. It's a nice summary, though not entirely accurate. Google still stands by the fact that it does not give out "URL spam" penalties for URLs that contain too many keywords (though I personally wouldn't do this, despite such assurances). And the meta keywords tag example the report gives as spam actually doesn't look like spam at all. But only a search engine could tell you for certain -- and none of them do.
For fun, you might also look at the new Black Hat SEO site. This is a humorous directory of unsavory search engine marketing pitches that have been received by Aaron Wall.
Finally, the Outsourcing Search Engine Marketing page is a compilation of articles covering advice on seeking out SEM firms to work with, such as the recent SEO outsourcing article we published in SearchDay last month.
http://searchenginewatch.com/showPage.html?page=3344581
Ryan's summary of things to avoid, like other ones that have appeared recently from Shari Thurow (here and here) Dave Wallace and Jill Whalen, is a good one.
Clearly, knowing what's widely considered spam isn't hard. So why aren't people "playing by the rules?" Here are just a few reasons:
* The rules for each search engine aren't necessarily the same, though they do seem closer today than in the past.
* Some people have a "the cheaters are winning" mentality. If they see spam slip through, they feel like they should do the same (regardless of the fact that the "cheat" technique might not actually be what's helping a page rank well).
* Some people simply don't agree with the search engine rules. For example, they might justify the use of "hidden text" via cascading style sheets to make up for the fact that their home page has no text. To them, search engines are simply stupid for not accepting this.
* Some people simply don't care. As far as they're concerned, as long as a person gets to something relevant to their search topic, who cares how it came up?
How About Standards?
Ryan's commentary sees one solution to spam as a lobbying for standards. This idea has been floating around for ages and has gone nowhere.
SEM pioneer Paul Bruemmer pushed for search engine optimization certification back in 1998. But as I wrote then, just having a "rule book" doesn't mean and end to spam.
We also had a push in 2001 for search engine marketing standards, which also has gone nowhere in terms of reducing spam in search engines.
How About Better Search Engine Disclosure?
Want a real solution to the spam problem? Then let's have the search engines agree to publish lists of firms and companies that they have banned. That would help the consumer seeking an SEM firm to understand which firms to avoid. Or, if they do use a banned firm, at least they've been warned about the consequences if they still want to go with a "rule" breaker.
It's something I've suggested before, and the search engines themselves have discussed the idea at various times in the past. It's never gone forward, because the search engines seem fearful of legal actions, should they out-and-out call a firm a spammer.
Given this, it's with some sympathy that I've defended the still new SEMPO search engine marketing industry group, when it has come under fire for not trying to ensure its members adhere to search engine spam guidelines.
SEMPO recently posted a FAQ explaining why it has declined to do this. I feel even more forcefully. If the search engines aren't brave enough to enforce their own laws, why should the onus be on a third party group that doesn't even create these rules?
They Do Enforce!
Of course, search engines do police for spam. If they catch it, a page might be penalized for banned entirely. But that's not the same or as effective as providing an offenders list, for a variety of reasons.
* An offenders list ensures that people who aren't ranking well or aren't listed for perfectly innocent reasons can discover that their problems are NOT due to a spam penalty. Far too often, people assume that they've "accidentally" spammed when they haven't. That leads them to make changes they don't need to make.
* Sometimes spam is simply allowed to continue. Google is especially famous for this, preferring to seek "algorithmic" solutions to removing spam rather than reacting immediately to spam reports and yanking material. Why? Google says it wants to detect overall patterns and come up with more comprehensive solutions. Unfortunately, the waiting period for this fuels the "anything goes" fears that some search marketers have when they see spam escaping prosecution.
* Disclosure helps searchers, not just companies that want to be listed. We've seen press outcry over filtering of adult content or filtering content in response to national laws, with the idea that perhaps Google and other search engines should disclose what they remove. But as I've written several times before, no search engine discloses what they've removed for spam reasons. That's something a searcher might want to know.
Disclosure...
To further expand on my last point, Google currently provides you with two ways to discover if they've removed material because of the Digital Millennium Copyright Act, a US law that can cause search engines to bar listings. A DMCA case involving the Church of Scientology in 2002 is probably the most famous example of this.
Google provides DMCA takedown requests that they receive to the Chilling Effects Clearinghouse. So, do a search there for Google, and you can review the material Google's been asked to remove.
I don't believe that Yahoo does the same thing and provides copies of DMCA takedown requests. However, you can get some sense by searching for Yahoo at Chilling Effects. Ironically, this brings back requests sent to Google that were probably also bulk-directed to Yahoo and other search engines.
Google will also provide "inline" notification if it has removed or suppressed material that might otherwise have shown up in the results you were reviewing. For example, try a search for kazaa. At the bottom of the page, you'll see this notice:
In response to a complaint we received under the Digital Millennium Copyright Act,
we have removed 5 result(s) from this page. If you wish, you may
read the DMCA complaint for these removed results.
This is excellent disclosure (and not something I believe Yahoo or other search engines do). Google's not only telling you that they removed results, but you can also click through to read the actual complaint that they reacted to.
...And Lack Of Disclosure
Now return to the spam issue. As a searcher, were you informed that material was removed from your search request? Nope. And might there be an economic incentive for search engines to ban sites? Absolutely. Removing sites may cause those sites to resort to taking out ads, exactly the accusation that Google came under (and strongly denied) at the end of last year, when many sites lost rankings.
Let's not single out Google. Long-time search engine marketer Greg Boser nearly brought an audience of search engine marketers to its feet in applause when he criticized search engines themselves as needing better standards during a session at Search Engine Strategies in 2002.
One of the issues Boser complained about was how sites would get pulled, only to then hear from various search engines about how they could get back in through advertising or paid inclusion.
Real-Life Disclosure Example
Want a real-life example of the need for disclosure? Earlier this month, NetIQ, the maker of WebTrends, purchased rank checking tool WebPosition. It was a pretty big deal. Perhaps someone may have gone to Google and done a search for webposition to find out more about the software.
Good luck finding the official WebPosition site. It's not in the top results, unusual given that Google built its reputation largely on providing good navigation to official sites. Why not? Because WebPosition has no pages listed in Google at all. For ages, I've not seen Google list pages from WebPosition. It's probably banned, but of course Google doesn't confirm these things. And as a searcher, it's not something that was disclosed to you.
The Google-WebPosition problem? Google doesn't like the burden the popular rank checking software places on its system, so explicitly warns people not to use it. But ironically, you'll notice that Google has no problems taking ads for the software from WebPosition's many resellers.
Let me stress -- similar things are happening at other search engines as well. The need for better spam disclosure is universal, not just a Google-specific problem.
Protect Yourself
Better disclosure would be helpful for so many reasons. It would help confused web site owners. It would help guide those seeking the many good search engine marketing firms that diligently try to avoid trouble and play by the search engine rules. I'd love to see it happen.
Until it does -- or more likely, given that it may never come -- here are some additional suggestions to guide you.
* When it comes to spam, the search engines are the lawmakers. You operate within their borders and are subject to whatever rules they may or may not publish. Break the rules, and you can get tossed out of the country. So learn the rules, as much as you can -- assuming you want to avoid trouble.
* Outsourcing? Knowing the rules may not help, if the SEM company simply makes up its own euphemisms to cover up things search engines don't like. So get references. You might also try contacting the search engines themselves for advice about the company, but don't hold your breath waiting for a response. Still do it. Should you get banned, at least you have some evidence that you tried to do due diligence with the search engines themselves.
Looking for more on the topic of spam? Search Engine Watch members have access to the Search Engine Spamming page, which has a summary of things commonly considered bad to do, as well as a compilation of articles on the topic. The Search Engine Optimization Articles and Search Engine Marketing Articles pages also have a compilation of articles that touch on spam and past guidelines and standards attempts.
Another new spam resource is a scholarly paper that's just appeared out of Stanford, Web Spam Taxonomy. It's a nice summary, though not entirely accurate. Google still stands by the fact that it does not give out "URL spam" penalties for URLs that contain too many keywords in them (though I personally wouldn't do this, despite such assurances). And the meta keyword tag spam example given in the report actually doesn't look like spam at all. But only a search engine could tell you for certain -- and none of them do.
For fun, you might also look at the new Black Hat SEO site. This is a humorous directory of unsavory search engine marketing pitches that have been received by Aaron Wall.
Finally, the Outsourcing Search Engine Marketing page is a compilation of articles covering advice on seeking out SEM firms to work with, such as the recent SEO outsourcing article we published in SearchDay last month.
http://searchenginewatch.com/showPage.html?page=3344581
Saturday, May 26, 2007
Leading SEO Training School Expands Into New Mexico and Arizona Search Engine Academy
Leading SEO industry educators Robin Nobles and John Alexander expand their SEO skill-building training schools to include New Mexico and Arizona as they welcome new associate SEO educator Emily Leach to their Search Engine Academy network.
Albuquerque, NM (PRWEB) April 18, 2007 -- SEO industry educators Robin Nobles and John Alexander recently announced more new localized Search Engine Academy classes are soon to be available in Albuquerque, New Mexico and Arizona in the ever expanding North American network of http://www.searchengineacademy.com training schools.
John Alexander, Director of the Search Engine Academy officially welcomed Emily Leach of http://www.WhiteHatWorkshops.com this month stating, "Robin Nobles and I are delighted to have Emily joining our network of licensed educators. Emily is an advanced SEO graduate who brings a strong SEO technical background to the Search Engine Academy. We anticipate hearing some wonderful SEO student success stories out of New Mexico and Arizona in the near future."
Alexander asked Emily if she would mind sharing a few of her views on SEO and giving people a brief overview of her SEO training plans for New Mexico and Arizona.
The interview followed with a question and answer period on SEO certification:
Q: "Emily, can you please describe what is unique or special about your workshops?
A: Absolutely! There is more than one aspect that makes this seminar unique. First is the fact that they are truly hands-on: students bring their own projects to class, instead of being given a canned project that works perfectly and then having to translate what they learned once they get back to the office. Secondly, I do not teach anything that has not been tested and proven by the Search Engine Academy. And third, the seminar is followed up with six months of mentoring.
Q: How would you identify the value of structured SEO training in a personalized, hands-on workshop?
A: Personalized, hands-on SEO training is invaluable. This workshop takes the participant right into the actual tasks of SEO on the same project they will be working on after class. The opportunity for questions is far greater because of the students' interaction with the process; those types of questions usually don't come up until after a lecture, and then they never get answered. But here, every question does get answered.
Q: Do you have any advice for those who would be interested in starting a successful SEO business?
A: I do! This is a field screaming for attention, and a person with the skill set, ethics and ability to communicate with clients can make a very good living and feel great when they go to sleep at night knowing they are helping companies grow.
Q: Can you talk a little bit about the importance of SEO certification?
A: SEO has received a bad reputation over the years with submission services and all the changes in meta tags. Certification creates a bar that the public can relate to and allows them to discern which optimizers know the latest information and which ones may be using strategies from three or four years ago. Until now, anyone could call themselves a search engine optimizer without any training. I am very happy to be offering a way for companies to know they are getting what is being advertised if the SEO they are hiring is certified.
Q: Emily, how long have you been practicing search engine optimization or SEO?
A: Almost 6 years; I found the power of Search Engine Optimization in the fall of 2001.
Q: Do you offer any "industry specific" SEO courses for business owners who are in a highly competitive industry (for example, real estate)?
A: I am working with the Search Engine Academy on defining specific industries that would benefit from more directed training, as well as working with the Department of Labor to offer certification classes through their program. There is a large need in the real estate industry not just to understand the options when it comes to search engine optimization but also to learn how to handle the process themselves.
Q: Tell us about the name "White Hat" Workshops.
A: I chose the name WhiteHat Workshops because I feel it represents the tactics I always adhere to in my SEO consulting and training; there is no need to attempt to 'trick' the search engines - there are far more beneficial methods that have been proven to work over and over again.
Q: Well spoken! In one final question, do you have any last minute SEO tips or advice for us?
A: I provide a free Introduction to SEO workshop on the second Friday of every month; it is a great place to have your questions answered and get your toes wet. To expand your knowledge of SEO, you can sign up online at http://www.whitehatworkshops.com for a free SEO Tip of the Day. I will send you a fresh SEO tip each and every day to keep you informed of updates, free tools, and ideas you may not think of on your own.
Thank you for your time today and we wish you and your students the greatest of success in the future.
http://www.prweb.com/releases/2007/04/prweb519349.htm
Friday, May 25, 2007
Google's "Florida" Update
On November 16th, 2003, Google commenced an update which had a catastrophic effect on a very large number of websites and, in the process, turned search engine optimization on its head. It is usual to give alphabetical names to Google's updates in the same way that names are given to hurricanes, and this one became known as the "Florida" update.
In a nutshell, a vast number of pages, many of which had ranked at or near the top of the results for a very long time, simply disappeared from the results altogether. Also, the quality (relevancy) of the results for a great many searches was reduced. In the place of Google's usual relevant results, we are now finding pages listed that are off-topic, or their on-topic connections are very tenuous to say the least.
The theories about the Florida update
The various search engine related communities on the web went into overdrive trying to figure out what changes Google had made to cause such disastrous effects.
SEO filter (search engine optimization filter)
One of the main theories that was put forward, and that, at the time of writing, is still believed by many or most people, is that Google had implemented an 'seo filter'. The idea is that, when a search query is made, Google gets a set of pages that match the query and then applies the seo filter to each of them. Any pages that are found to exceed the threshold of 'allowable' seo are dropped from the results. That's a brief summary of the theory.
At first I liked this idea because it makes perfect sense for a search engine to do it. But I saw pages that were still ranked in the top 10, and that were very well optimized for the searchterms that they were ranked for. If an seo filter were being applied, they wouldn't have been listed at all. Also, many pages that are not SEOed in any way were dropped from the rankings.
Searchterm list
People realized that this seo filter was being applied to some searchterms but not to others, so they decided that Google is maintaining a list of searchterms to apply the filter to. I never liked that idea because it doesn't make a great deal of sense to me. If an seo filter can be applied to some searches on-the-fly, it can be applied to all searches on-the-fly.
LocalRank
Another idea that has taken hold is that Google have implemented LocalRank. LocalRank is a method of modifying the rankings based on the interconnectivity between the pages that have been selected to be ranked: pages in the selected set that are linked to from other pages in the selected set are ranked more highly. (Google took out a patent on LocalRank earlier this year.) But this idea cannot be right. A brief study of LocalRank shows that the technique does not drop pages from the results, as the Florida algorithm does. It merely rearranges them.
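To make the LocalRank idea concrete, here is a minimal sketch in Python of the kind of re-ranking pass the theory describes (toy data and scoring invented for illustration; this is not Google's patented algorithm): pages in the candidate set that are linked to from other pages in the same set get a boost, but nothing is ever removed.

def local_rerank(candidates, links, boost=1.0):
    # 'candidates' is the initially ranked list of URLs for a query;
    # 'links' maps each URL to the set of URLs it links out to.
    selected = set(candidates)
    scores = {}
    for rank, page in enumerate(candidates):
        base = len(candidates) - rank  # higher score = better original rank
        in_set_links = sum(
            1 for other in selected
            if other != page and page in links.get(other, set())
        )
        scores[page] = base + boost * in_set_links
    # Every candidate stays in the list; only the order changes.
    return sorted(candidates, key=lambda p: scores[p], reverse=True)

candidates = ["a.example", "b.example", "c.example", "d.example"]
links = {"a.example": {"c.example"}, "b.example": {"c.example"}}
print(local_rerank(candidates, links))  # c.example moves up; nothing disappears

As the sketch makes plain, a pass like this only reorders the set it is given, which is exactly why it cannot account for pages vanishing altogether.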
Commercial list
It was noticed that many search results were biased towards information pages, and commercial pages were either dropped or moved down the rankings. From this sprang the theory that Google is maintaining a list of "money-words", and modifying the rankings of searches that are done for those words, so that informative pages are displayed at and near the top, rather than commercial ones.
Google sells advertising, and the ads are placed on the search results pages. Every time a person clicks on one of the ads, Google gets paid by the advertiser. In some markets, the cost per click is very expensive, and the idea of dropping commercial pages from the results, or lowering their rankings, when a money-word is searched on is to force commercial sites into advertising, thereby putting up the cost of each click and allowing Google to make a lot more money.
Comment on the above theories
All of the above theories are based on the idea that, when a search query is received, Google compiles a set of results and then modifies them in one way or another before presenting them as the search results. In other words, they all rest on the premise that Google modifies the result set. I am convinced that all of the above theories are wrong, as we will see.
Stemming
Finally, there is a theory that has nothing to do with how the results set is compiled. Google has implemented stemming, which means that, in a search query, Google matches words of the same word-stem; e.g. drink is the stem of drink, drinks, drinking, drinker and drinkers. So far, this is not a theory - it's a fact, because Google say it on their website. The theory is that stemming accounts for all the Florida effects. Like the other theories, this one cannot be right, as I will show.
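As a rough illustration of what query-time stemming means in practice, here is a naive suffix-stripping toy in Python (nothing like whatever stemmer Google actually uses):

def stem(word):
    # Naive stemmer: strip a few common suffixes so that words sharing
    # a stem ("drink", "drinks", "drinking", "drinker") match each other.
    for suffix in ("ers", "er", "ing", "s"):
        if word.endswith(suffix) and len(word) > len(suffix) + 2:
            return word[: -len(suffix)]
    return word

def matches(query_word, page_words):
    q = stem(query_word.lower())
    return any(stem(w.lower()) == q for w in page_words)

print(matches("drinking", ["fine", "drinks", "menu"]))  # True
print(matches("drinking", ["fine", "wines", "menu"]))   # False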
Evidence
There are a number of pieces of evidence (Florida effects) that can be seen in the results, but I won't go into detail about them all. One piece of evidence that everyone jumped to conclusions about is the fact that you can modify the searches to produce different results. For instance, a search for "uk holidays" (without quotes) shows one set of results, but if you tell Google not to include pages that contain a nonsense word, e.g. "uk holidays -asdqwezxc" (without quotes), you will get a different set of results for some searches, but not for others. Also, the results with the -nonsense word looked the same as they did before the update began, so they appeared to be the results before a filter was applied.
This is what led people to come up with the idea of a list of searchterms or a list of money-words; i.e. a filter is applied to some searches but not to others. It was believed that the set of results without the -nonsense word (the normal results) were derived from the set produced with the -nonsense word. But that was a mistake.
What really happened
Although Google's results state how many matches were found for a searchterm (e.g. "1 - 10 of about 2,354,000"), they will only show a maximum of 1000 results. I decided to retrieve the entire sets of results produced with and without the -nonsense word and compare them, to see if I could discover why a page would be filtered out and why other pages made it to the top. I did it with a number of searchterms, and I was very surprised to find that, in some cases, over 80% of the results had been filtered out and replaced by other results. Then I realised that the two sets of results were completely different - the filtered sets were not derived from the unfiltered sets.
The two sets in each pair were completely different. The 'filtered' set didn't simply contain what was left of the 'unfiltered' set, and very low-ranking pages in the 'unfiltered' set got very high rankings in the 'filtered' set. I saw one page, for instance, that was ranked at #800+ unfiltered and #1 filtered. That can't happen with a simple filter: a page can't jump over the other pages that weren't filtered out. All the theories about various kinds of filters and lists were wrong, because they all assumed that the result set is always compiled in the same way, regardless of the searchterm, and then modified by filters. That clearly isn't the case.
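The comparison itself is straightforward to reproduce in principle. Here is a rough sketch, assuming you have already collected the two ranked URL lists for a searchterm (the scraping or exporting step is left out), of how to measure how much of one set survives in the other and how far the survivors move:

def compare_result_sets(filtered, unfiltered):
    # 'filtered' and 'unfiltered' are ranked lists of URLs for the same
    # query, without and with the -nonsense word respectively.
    filtered_rank = {url: i + 1 for i, url in enumerate(filtered)}
    unfiltered_rank = {url: i + 1 for i, url in enumerate(unfiltered)}
    shared = set(filtered_rank) & set(unfiltered_rank)
    replaced = 1 - len(shared) / len(unfiltered) if unfiltered else 0.0
    movement = {url: unfiltered_rank[url] - filtered_rank[url] for url in shared}
    return replaced, movement

# Toy data: three of the four 'unfiltered' results are gone, and the one
# survivor jumps from #4 unfiltered to #1 filtered -- the kind of movement
# a drop-only filter could not produce.
unfiltered = ["a.example", "b.example", "c.example", "d.example"]
filtered = ["d.example", "e.example", "f.example", "g.example"]
print(compare_result_sets(filtered, unfiltered))  # (0.75, {'d.example': 3})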
It is inescapable that the Google engine now compiles the search results for different queries in different ways. For some queries it compiles them in one way, and for others it compiles them in a different way. The different result sets are not due to filters; they are simply compiled differently in the first place. I.e. the result set without the -nonsense word and the result set with the -nonsense word are compiled in different ways and are not related to each other as the filter theories suggest. One set is not the result of filtering the other set.
The most fundamental change that Google made with the Florida update is that they now compile the results set for the new results in a different way than they did before.
That's what all the previous theories failed to spot. The question now is, how does Google compile the new results set?
Back in 1999, a system for determining the rankings of pages was conceived and tested by Krishna Bharat. His paper about it is here. He called his search engine "Hilltop". At the time he wrote the paper, his address was Google's address, and people have often wondered if Google might implement the Hilltop system.
Hilltop employs an 'expert' system to rank pages. It compiles an index of expert web pages - these are pages that contain multiple links to other pages on the web of the same subject matter. The pages that end up in the rankings are those that the expert pages link to. Of course, there's much more to it than that, but it gives the general idea. Hilltop was written in 1999 and, if Google have implemented it, they have undoubtedly developed it since then. Even so, every effect that the Florida update has caused can be attributed to a Hilltop-type, expert-based system. An important thing to note is that the 'expert' system cannot create a set of results for all search queries. It can only create a set for queries of a more general nature.
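To give a feel for the general shape of such an expert-based approach, here is a toy sketch with invented data; it is only an illustration of the idea in the Hilltop paper, certainly not what Google may actually run. Target pages are scored by how many distinct expert pages on the topic link to them, and a page that no expert links to simply never enters the result set.

def expert_rank(experts):
    # 'experts' maps each expert page URL to the list of target URLs it
    # links to for a given topic. Targets are scored by how many distinct
    # experts point at them.
    scores = {}
    for expert_url, targets in experts.items():
        for target in set(targets):
            scores[target] = scores.get(target, 0) + 1
    return sorted(scores.items(), key=lambda kv: kv[1], reverse=True)

experts = {
    "travel-links.example/uk": ["visitbritain.example/holidays",
                                "uk-info.example/where-to-stay"],
    "holiday-directory.example/uk": ["visitbritain.example/holidays",
                                     "cheap-deals.example/uk"],
}
print(expert_rank(experts))
# A commercial page that no expert page links to gets no score at all,
# which is consistent with pages disappearing rather than being demoted.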
We see many search results that once contained useful commercial sites now containing much more in the way of information or authority pages. That's because expert pages would have a significant tendency to point to information pages. We see that the results with and without the -nonsense word are sometimes different and sometimes the same. That's because an expert system cannot handle all search queries, as the Krishna Bharat paper states. When it can't produce a set of results, Google's normal mechanisms do it instead. We see that a great many home pages have vanished from the results (that was the first thing that everyone noticed). That's because expert pages are much more likely to point to the inner pages that contain the information than to home pages. Every effect we see in the search results can be attributed to an expert system like Hilltop.
I can see flaws in every theory that has been put forward thus far. The flaw in the seo filter idea is that there are highly SEOed pages still ranking in the top 10 for searchterms that they should have been filtered out for. The flaw in the LocalRank theory is that LocalRank doesn't drop pages, but a great many pages have been dropped. The flaw in the list of searchterms is that if a filter can be applied to one searchterm, it can be applied to them all, so why bother maintaining a list. The flaw in the money-words list idea is that, if it ever came out that they were doing it, Google would run the risk of going into a quick decline. I just don't believe that the people at Google are that stupid. The flaw in the stemming theory is not that Google hasn't introduced stemming, it's that the theory doesn't take into account the fact that the Florida results set is compiled in a different way to the -nonsense set. Stemming is additional to the main change, but it isn't the main change itself.
The expert-system, or something like it, accounts for every Florida effect that we see. I am convinced that this is what Google rolled out in the Florida update. Having said that, I must also add that it is still a theory, and cannot be relied upon as fact. I cannot say that Google has implemented Hilltop, or a development of Hilltop, or even a Hilltop-like system. What I can say with confidence is that the results without a -nonsense word (the normal results) are not derived from the results with a -nonsense word, as most people currently think. They are a completely different results set and are compiled in a different way. And I can also say that every effect that the Florida update has caused would be expected with a Hilltop-like expert-based system.
Where do we go from here?
At the moment, Google's search results are in poor shape, in spite of what their representatives say. If they leave them as they are, they will lose users and risk becoming a small engine, as other top engines have done in the past. We are seeing the return of some pages that were consigned to the void, so it is clear that the people at Google are continuing to tweak the changes.
If they get the results to their satisfaction, the changes will stay and we will have to learn how to seo Google all over again. But it can be done. There are reasons why certain pages are at the top of the search results and, if they can get there, albeit accidentally in many cases, other pages can get there too.
If it really is an expert system, then the first thing to realise is that the system cannot deal with all searchterms, so targeting non-generalised and lesser searchterms, using the usual search engine optimization basics, will still work.
For more generalised searchterms, the page needs to be linked to by multiple expert pages that are unaffiliated with the page. By "unaffiliated" I mean that they must reside on servers with different IP C-block addresses from each other and from the target page, and their URLs must not use the same domain name as each other or as the target page. These expert pages can either be found and links requested, or they can be created.
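As a rough sketch of that "unaffiliated" test (hypothetical URLs and IPs; the IP lookup itself and anything smarter than a literal domain and C-block comparison is out of scope), two pages would fail the test if they share a domain name or an IP C block:

from urllib.parse import urlsplit

def c_block(ip):
    # First three octets of an IPv4 address, e.g. "203.0.113" for 203.0.113.10.
    return ".".join(ip.split(".")[:3])

def hostname(url):
    return urlsplit(url).hostname

def unaffiliated(page_a, page_b):
    # Each page is a (url, host_ip) pair. Unaffiliated means different
    # domain names AND different IP C blocks.
    (url_a, ip_a), (url_b, ip_b) = page_a, page_b
    return hostname(url_a) != hostname(url_b) and c_block(ip_a) != c_block(ip_b)

expert = ("http://www.expert-links.example/uk.html", "203.0.113.10")
target = ("http://www.my-site.example/holidays.html", "198.51.100.22")
print(unaffiliated(expert, target))  # True: different domains, different C blocks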
Latest
8th December 2003
Since soon after the Florida update began, some pages that disappeared from the results have been returning. In some cases they are back at, or close to, the rankings that they had before Florida. In other cases they are quite highly ranked but lower than before. Day after day, more of them are returning.
I put this down to Google recognizing that Florida caused a sharp decline in the quality (relevancy) of the search results. It appears that they are adjusting the algorithm's parameters, trying to find a balance between the new page selection process and good relevancy. In doing so, some of the pages that were dumped out of the results are getting back into the results set, and because they already matched the ranking algorithm quite well, they achieve high rankings once they are back in.
Reminder
Don't forget that all this is just theory, but what we see happening does appear to fit an expert system, although there can be other explanations. We can be sure that Google compiles the results sets in different ways depending on the searchterm, and that the Florida results are not derived, via one or more filters, from the -nonsense results, but we can't yet be certain that an expert system is used to compile the Florida results set.
22nd December 2003
Google has now dealt with the -nonsense search trick for seeing the non-Florida results, and it no longer works. That doesn't mean the two sets of results are no longer different; it's just that we can no longer see the non-Florida set.
5th January 2004
Dan Thies, of SEO Research Labs, came up with the interesting theory that the Florida changes are due to Google now using Topic-Sensitive PageRank (TSPR). His PDF article can be found here. It's interesting because, like the 'expert system' theory, it would have Google using two different algorithms depending on the searchterm used. To date, it's the only other theory that I believe has a chance of being right.
http://www.webworkshop.net/florida-update.html
Google and Inbound Links (IBLs)
The effect of inbound links
It's common knowledge that Google evaluates many factors when working out which page to rank where in response to a search query. They claim to incorporate around 100 different ranking factors. And it's common knowledge that the most powerful of these ranking factors is link text. Link text is the text that you click on when clicking a link. Here's an example of link text:- miserable failure. The words "miserable failure" are the link text. Link text is also known as anchor text.
I used that particular example because it shows the power of link text - the link text effect. If you click on it, it searches Google for "miserable failure", and you may be surprised to see which page is ranked at #1. If you click on the "Cached" link for that #1 ranked listing, you will see Google's cache for the page, and you will see each word of the phrase "miserable failure" highlighted in yellow in the page - or that's what you would see if the page actually contained either of those words, but it doesn't.
So how come the George Bush page is ranked at #1 for a phrase that isn't anywhere to be found in the page? The cache page itself tells us. In the header of Google's cache page are the words, "These terms only appear in links pointing to this page: miserable failure". The link texts of links that point to that page contain the words "miserable failure", and it's the power of those link texts that got the page to #1.
That demonstrates the power of link text in Google. Some people decided to get the George Bush page ranked #1 for "miserable failure", and they did it by linking to the page using the link text "miserable failure". It's known as "Googlebombing".
Why are inbound links so powerful?
It's because of the way that Google stores a page's data, and the way that they process a search query.
Google's Regular index consists of two indexes - the short index and the long index. They are also known as the short barrels and the long barrels. The short index is also known as the "fancy hits" index. Google also has a Supplemental index, but that's not part of the Regular index, and it's not relevant to this topic.
The short index is used to store the words in link texts that point to a page, the words in a page's title, and one or two other special things. But when they store the link text words in the short index, they are attributed to the target page, and not to the page that the link is on. In other words, if my page links to your page, using the link text "Miami hotels", then the words "Miami" and "hotels" are stored in the short index as though they appeared in your page - they belong to your page. If 100 pages link to your page, using those same words as link text, then your page will have a lot of entries in the short index for those particular words.
The long index is used to store all the other words on a page - its actual content.
And here's the point...
When Google processes a search query, they first try to get enough results from the short index. If they can't get enough results from there, they use the long index to add to what they have. It means that, if they can get enough results from the short index - that's the index that contains words in link texts and page titles - then they don't even look in the long index where the actual contents of pages are stored. Page content isn't even considered if they can get enough results from the link texts and titles index - the short index.
That is the reason why link texts are so powerful for Google rankings. They are much more powerful than page titles, because a page can have the words from only one title in the short index, but it can have the words from a great many link texts in there. That is the reason why the George Bush page ranks #1 for "miserable failure". All the link texts from all the pages that link to the George Bush page using the "miserable failure" link text, are in the short index - and they are all attributed to the George Bush page.
Page titles are the second most powerful ranking factor, because they are stored in the short index.
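A minimal sketch of that two-index lookup may help (toy dictionaries, not Google's real short and long barrels). Note how anchor text is filed under the page it points to, and how the long index is only consulted when the short index cannot supply enough results:

short_index = {}  # word -> set of URLs credited with the word via link text or title
long_index = {}   # word -> set of URLs whose body content contains the word

def add_link(from_url, to_url, anchor_text):
    # Anchor text is credited to the *target* page, not the page carrying the link.
    for word in anchor_text.lower().split():
        short_index.setdefault(word, set()).add(to_url)

def add_page(url, title, body):
    for word in title.lower().split():
        short_index.setdefault(word, set()).add(url)
    for word in body.lower().split():
        long_index.setdefault(word, set()).add(url)

def search(query, wanted=10):
    words = query.lower().split()
    results = set.intersection(*(short_index.get(w, set()) for w in words))
    if len(results) < wanted:  # only now fall back to page content
        results |= set.intersection(*(long_index.get(w, set()) for w in words))
    return results

add_link("blog-a.example/", "whitehouse.example/bio", "miserable failure")
add_link("blog-b.example/", "whitehouse.example/bio", "miserable failure")
print(search("miserable failure"))  # found via anchor text alone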
URL-only listings
We sometimes see a page listed in the rankings, but its URL is shown and linked instead of its title, and there is no description snippet for it. These are known as URL-only listings. Google says that they are "partially indexed pages". I'll explain what that means, since it's relevant to this topic.
When Google spiders a page and finds a link to another page on it, but they don't yet have the other page in the index, they find themselves with some link text that they want to attribute to the other page, so that it can be used in the normal search query processing. They treat it as normal, and place it in the short index, attributing it to the other page which they haven't got. Sometimes they will store the words from more than one link to the other page before they have spidered and indexed the page itself.
Sometimes that link text data in the short index will cause the other page to be ranked for a search query before the page has been spidered and indexed. But they don't have the page itself, so they don't have its title, or anything from the page that can be used for the description snippet. So they simply display and link its URL.
That's what is meant by "partially indexed", and it's why we sometimes see those URL-only listings. Google will later spider the other page, its data will be stored as normal, and its listings in the search results will be displayed normally.
Note: When a page is indexed, not only is its content indexed, but also link texts that point to it are indexed as part of the page itself. So when links that point to a page are indexed, the page itself is partially indexed, even though it hasn't yet been spidered.
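In the same toy spirit, a URL-only listing falls out naturally: a page can be found through anchor text filed against it, but until it has been spidered there is no title or snippet to show, so only the bare URL is displayed. A purely illustrative sketch:

crawled_titles = {}  # url -> title, filled in only once a page has been spidered

def render_result(url):
    title = crawled_titles.get(url)
    if title:
        return f"{title} - {url}"  # normal listing
    return url                     # URL-only listing: no title, no snippet

# The page has inbound link text but has not been spidered yet:
print(render_result("whitehouse.example/bio"))   # just the URL
crawled_titles["whitehouse.example/bio"] = "Biography"
print(render_result("whitehouse.example/bio"))   # normal listing with a title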
http://www.webworkshop.net/google-and-inbound-links.html
It's common knowledge that Google evaluates many factors when working out which page to rank where in response to a search query. They claim to incorporate around 100 different ranking factors. And it's common knowledge that the most powerful of these ranking factors is link text. Link text is the text that you click on when clicking a link. Here's an example of link text:- miserable failure. The words "miserable failure" are the link text. Link text is also known as anchor text.
I used that particular example because it shows the power of link text - the link text effect. If you click on it, it searches Google for "miserable failure", and you may be surprised to see which page is ranked at #1. If you click on the "Cached" link for that #1 ranked listing, you will see Google's cache for the page, and you will see each word of the phrase "miserable failure" highlighted in yellow in the page - or that's what you would see if the page actually contained either of those words, but it doesn't.
So how come the George Bush page is ranked at #1 for a phrase that isn't anywhere to be found in the page? The cache page itself tells us. In Google's head are the words, These terms only appear in links pointing to this page: miserable failure. The link texts of links that point to that page contain the words "miserable failure", and it's the power of those link texts that got the page to #1.
That demonstrates the power of link text in Google. Some people decided to get the George Bush page ranked #1 for "miserable failure", and they did it by linking to the page using the link text "miserable failure". It's known as "Googlebombing".
Why are inbound links so powerful?
It's because of the way that Google stores a page's data, and the way that they process a search query.
Google's Regular index consists of two indexes - the short index and the long index. They are also known as the short barrels and the long barrels. The short index is also known as the "fancy hits" index. Google also has a Supplemental index, but that's not part of the Regular index, and it's not relevant to this topic.
The short index is used to store the words in link texts that point to a page, the words in a page's title, and one or two other special things. But when they store the link text words in the short index, they are attributed to the target page, and not to the page that the link is on. In other words, if my page links to your page, using the link text "Miami hotels", then the words "Miami" and "hotels" are stored in the short index as though they appeared in your page - they belong to your page. If 100 pages link to your page, using those same words as link text, then your page will have a lot of entries in the short index for those particular words.
The long index is used to store all the other words on a page - its actual content.
And here's the point...
When Google processes a search query, they first try to get enough results from the short index. If they can't get enough results from there, they use the long index to add to what they have. It means that, if they can get enough results from the short index - that's the index that contains words in link texts and page titles - then they don't even look in the long index where the actual contents of pages are stored. Page content isn't even considered if they can get enough results from the link texts and titles index - the short index.
That is the reason why link texts are so powerful for Google rankings. They are much more powerful than page titles, because a page can have the words from only one title in the short index, but it can have the words from a great many link texts in there. That is the reason why the George Bush page ranks #1 for "miserable failure". All the link texts from all the pages that link to the George Bush page using the "miserable failure" link text, are in the short index - and they are all attributed to the George Bush page.
Page titles are the second most powerful ranking factor, because they are stored in the short index.
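To make the short-index/long-index idea a little more concrete, here is a toy sketch of my own. It only illustrates the behaviour described above; it is not Google's actual code or data structures, and the page names and the result threshold are invented:

    # Toy model of the two-index idea: the short index holds words from link texts
    # and page titles, attributed to the TARGET page; the long index holds each
    # page's own body text. A query is answered from the short index first.
    from collections import defaultdict

    short_index = defaultdict(set)   # word -> pages credited with it via link text or title
    long_index = defaultdict(set)    # word -> pages whose own content contains it

    def add_link(from_page, to_page, link_text):
        # The link text counts for the page being linked TO, not the page the link is on.
        for word in link_text.lower().split():
            short_index[word].add(to_page)

    def add_page(page, title, body):
        for word in title.lower().split():
            short_index[word].add(page)
        for word in body.lower().split():
            long_index[word].add(page)

    def search(query, wanted=10):
        words = query.lower().split()
        results = set.intersection(*(short_index[w] for w in words))
        if len(results) >= wanted:
            return results               # enough results: page content never consulted
        body_hits = set.intersection(*(long_index[w] for w in words))
        return results | body_hits       # otherwise top up from the long index

    add_page("bush-biography-page", "President Biography", "official biography text here")
    for n in range(100):
        add_link("blog-%d" % n, "bush-biography-page", "miserable failure")
    print(search("miserable failure"))   # found purely through link text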
URL-only listings
We sometimes see a page listed in the rankings, but its URL is shown and linked instead of its title, and there is no description snippet for it. These are known as URL-only listings. Google says that they are "partially indexed pages". I'll explain what that means, since it's relevant to this topic.
When Google spiders a page and finds a link to another page on it, but they don't yet have the other page in the index, they find themselves with some link text that they want to attribute to the other page, so that it can be used in the normal search query processing. They treat it as normal, and place it in the short index, attributing it to the other page which they haven't got. Sometimes they will store the words from more than one link to the other page before they have spidered and indexed the page itself.
Sometimes that link text data in the short index will cause the other page to be ranked for a search query before the page has been spidered and indexed. But they don't have the page itself, so they don't have its title, or anything from the page that can be used for the description snippet. So they simply display and link its URL.
That's what is meant by "partially indexed", and it's why we sometimes see those URL-only listings. Google will later spider the other page, its data will be stored as normal, and its listings in the search results will be displayed normally.
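As a small illustration of my own (again, not Google's actual code), this is roughly why such a page can only be shown as a bare URL - there simply isn't any title or body text to build a normal listing from yet:

    # Sketch of an assumption about the mechanism: a page known only from inbound
    # link text has no stored title or body, so its listing can only be its URL.
    def render_listing(url, crawled_pages):
        page = crawled_pages.get(url)
        if page is None:
            return url                                   # URL-only listing
        return "%s\n%s\n%s" % (page["title"], url, page["body"][:150])

    crawled_pages = {}                                   # target page not yet spidered
    print(render_listing("http://example.com/new-page", crawled_pages))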
The Madness of King Google
When Google arrived on the scene in the late 1990s, they came in with a new idea of how to rank pages. Until then, search engines had ranked each page according to what was in the page - its content - but it was easy for people to manipulate a page's content and move it up the rankings. Google's new idea was to rank pages largely by what was in the links that pointed to them - the clickable link text - which made it a little more difficult for page owners to manipulate the page's rankings.
Changing the focus from what is in a page to what other websites and pages say about a page (the link text), produced much more relevant search results than the other engines were able to produce at the time.
The idea worked very well, but it could only work well as long as it was never actually used in the real world. As soon as people realised that Google were largely basing their rankings on link text, webmasters and search engine optimizers started to find ways of manipulating the links and link text, and therefore the rankings. From that point on, Google's results deteriorated, and their fight against link manipulations has continued. We've had link exchange schemes for a long time now, and they are all about improving the rankings in Google - and in the other engines that copied Google's idea.
In the first few months of this year (2006), Google rolled out a new infrastructure for their servers. The infrastructure update was called "Big Daddy". As the update was completed, people started to notice that Google was dropping their sites' pages from the index - their pages were being dumped. Many sites that had been fully indexed for a long time were having their pages removed from Google's index, which caused traffic to deteriorate, and business to be lost. It caused a great deal of frustration, because Google kept quiet about what was happening. Speculation about what was causing it was rife, but nobody outside Google knew exactly why the pages were being dropped.
Then on the 16th May 2006, Matt Cutts, a senior Google software engineer, finally explained something about what was going on. He said that the dropping of pages is caused by the improved crawling and indexing functions in the new Big Daddy infrastructure, and he gave some examples of sites that had had their pages dropped.
Here is what Matt said about one of the sites:
Some one sent in a health care directory domain. It seems like a fine site, and it’s not linking to anything junky. But it only has six links to the entire domain. With that few links, I can believe that out toward the edge of the crawl, we would index fewer pages.
And about the same site, he went on to say:
A few more relevant links would help us know to crawl more pages from your site.
Because the site hasn't attracted enough relevant links to it, it won't have all of its pages included in Google's index, in spite of the fact that, in Matt's words, "it seems like a fine site". He also said the same about another of the examples that he gave.
Let me repeat one of the things that he said about that site. "A few more relevant links would help us know to crawl more pages from your site." What??? They know that the site is there! They know that the site has more pages that they haven't crawled and indexed! They don't need any additional help to know to crawl more pages from the site! If the site has "fine" pages then index them, dammit. That's what a search engine is supposed to do. That's what Google's users expect them to do.
Google never did crawl all sites equally. The amount of PageRank in a site has always affected how often a site is crawled. But they've now added links to the criteria, and for the first time they are dumping a site's pages OUT of the index if it doesn't have a good enough score. What sense is there in dumping perfectly good and useful pages out of the index? If they are in, leave them in. Why remove them? What difference does it make if a site has only one link pointing to it or a thousand links pointing to it? Does having only one link make it a bad site that people would rather not see? If it does, why index ANY of its pages? Nothing makes any sort of sense.
So we now have the situation where Google intentionally leaves "fine" and useful pages out of their index, simply because the sites haven't attracted enough links to them. It is grossly unfair to website owners, especially to the owners of small websites, most of whom won't even know that they are being treated so unfairly, and it short-changes Google's users, since they are being deprived of the opportunity to find many useful pages and resources.
So what now? Google has always talked against doing things to websites and pages, solely because search engines exist. But what can website owners do? Those who aren't aware of what's happening to their sites simply lose - end of story. Those who are aware of it are forced into doing something solely because search engines exist. They are forced to contrive unnatural links to their sites - something that Google is actually fighting against - just so that Google will treat them fairly.
Incidentally, link exchanges are no good, because Matt also said that too many reciprocal links cause the same negative effect: the site isn't crawled as often, and fewer pages from the site are indexed.
It's a penalty. There is no other way to see it. If a site is put on the Web, and the owner doesn't go in for search engine manipulation by doing unnatural link-building, the site gets penalised by not having all of its pages indexed. It can't be seen as anything other than a penalty.
Is that the way to run a decent search engine? Not in my opinion it isn't. Do Google's users want them to leave useful pages and resources out of the index, just because they haven't got enough links pointing to them? I don't think so. As a Google user, I certainly don't want to be short-changed like that. It is sheer madness to do it. The only winners are those who manipulate Google by contriving unnatural links to their sites. The filthy linking rich get richer, and the link-poor get poorer - and pushed by Google towards spam methods.
Google's new crawling/indexing system is lunacy. It is grossly unfair to many websites that have never even tried to manipulate the engine by building unnatural links to their sites, and it is very bad for Google's users, who are intentionally deprived of the opportunity to find many useful pages and resources. Google people always talk about improving the user's experience, but now they are intentionally depriving their users. It is sheer madness!
What's wrong with Google indexing decent pages, just because they are there? Doesn't Google want to index all the good pages for their users any more? It's what a search engine is supposed to do, it's what Google's users expect it to do, and it's what Google's users trust it to do, but it's not what Google is doing.
At the time of writing, the dropping of pages is continuing with a vengeance, and more and more perfectly good sites are being affected.
A word about Matt Cutts
Matt is a senior software engineer at Google, who currently works on the spam side of things. He is Google's main spam man. He communicates with the outside world through his blog, in which he is often very helpful and informative. Personally, I believe that he is an honest person. I have a great deal of respect for him, and I don't doubt anything that he says, but I accept that he frequently has to be economical with the truth. He may agree or disagree with some or all of the overwhelming outside opinion concerning Google's new crawl/index function, but if he agrees with any of it, he cannot voice it publicly. This article isn't about Matt Cutts, or his views and opinions; it is about what Google is doing.
The thread in Matt's blog where all of this came to light is here.
Update:
Since writing this article, it has occurred to me that I may have jumped to the wrong conclusion as to what Google is actually doing with the Big Daddy update. What I haven't been able to understand is the reason for attacking certain types of links at the point of indexing pages, instead of attacking them in the index itself, where they boost rankings. But attacking certain types of links may not be Big Daddy's primary purpose.
The growth of the Web continues at a great pace, and no search engine can possibly keep up with it. Index space has to be an issue for the engines sooner or later, and it may be that Big Daddy is Google's way of addressing the issue now. Search engines have normally tried to index as much of the Web as possible, but, since they can't keep pace with it, it may be that Google has made a fundamental change to the way they intend to index the Web. Instead of trying to index all pages from as many websites as possible, they may have decided to allow all sites to be represented in the index, but not necessarily to be fully indexed. In that way, they can index pages from more sites, and their index could be said to be more comprehensive.
Matt Cutts has stated that, with Big Daddy, they are now indexing more sites than before, and also that the index is now more comprehensive than before.
If that's what Big Daddy is about, then I would have to say that it is fair, because it may be that Google had to leave many sites out of the index due to space restrictions, and the new way would allow pages from more sites to be included in the index.
http://www.webworkshop.net/google-madness.html
Google's "Big Daddy" Update
In December 2005, Google began to roll out what they called the "Big Daddy" update, and by the end of March 2006 it had been fully deployed in all of their datacenters. It wasn't a normal update - normal updates are often algorithm changes. Big Daddy was a software/infrastructure change, largely to the way that they crawl and index websites.
As the update spread across the datacenters, people started to notice that many pages from their sites had disappeared from the regular index. Matt Cutts, a senior software engineer at Google, put it down to "sites where our algorithms had very low trust in the inlinks or the outlinks of that site. Examples that might cause that include excessive reciprocal links, linking to spammy neighborhoods on the web, or link buying/selling."
That statement pretty much sums up the way that the Big Daddy update affects websites. Links into and out of a site are being used to determine how many of the site's pages to have in the index. Matt then went on to give a few examples of sites that had been hit, and what he thought might be their problems...
About a real estate site, he said, "Linking to a free ringtones site, an SEO contest, and an Omega 3 fish oil site? I think I’ve found your problem. I’d think about the quality of your links if you’d prefer to have more pages crawled. As these indexing changes have rolled out, we’ve improving how we handle reciprocal link exchanges and link buying/selling."
About another real estate site, he said, "This time, I’m seeing links to mortgages sites, credit card sites, and exercise equipment. I think this is covered by the same guidance as above; if you were getting crawled more before and you’re trading a bunch of reciprocal links, don’t be surprised if the new crawler has different crawl priorities and doesn’t crawl as much."
And about a health care directory site, he said, "your site also has very few links pointing to you. A few more relevant links would help us know to crawl more pages from your site."
The Big Daddy update is mainly a new crawl/index function that evaluates the trustability of links into and out of a site, to determine how many of the site's pages to have in the index. Not only does it evaluate the trustability of the links, but it takes account of the quantity of trustable links. As the health care site shows, if a site doesn't score well enough, it doesn't get all of its pages indexed, and if the site already had all of its pages indexed, many or most of them are removed.
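As a rough illustration of how I read that behaviour - the page counts and the per-link allowance below are invented, and this is certainly not Google's actual algorithm:

    # Toy guess at the Big Daddy behaviour described above: the number of a site's
    # pages kept in the index is capped by how many trusted links it has, and
    # untrusted links (excessive reciprocals, bought links) don't count.
    def pages_to_index(total_pages, inbound_links, pages_per_trusted_link=50):
        trusted = [link for link in inbound_links if link.get("trusted")]
        return min(total_pages, len(trusted) * pages_per_trusted_link)

    health_directory_links = [{"trusted": True}] * 6     # "only has six links to the entire domain"
    print(pages_to_index(4000, health_directory_links))  # only 300 of 4000 pages stay indexed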
I've written about the gross unfairness of evaluating links for that purpose here, so I won't go over it again, but I want to suggest a reason why Google has done it.
Since Google came on the scene with their links-based rankings system, people have increasingly arranged links solely for ranking purposes. For instance, link exchange schemes are all over the Web, and link exchange requests plague our email inboxes. Over the years, such links have increased and, because of them, the quality of Google's index/rankings has deteriorated. Google's system relies on the natural linking of the Web, but in implementing the system, they ruined natural linking, which in turn has eroded the quality of Google's index and rankings. It's my belief that Big Daddy is Google's way of addressing the problem. They are evaluating the trustability of both inbound and outbound links to try and prevent unnatural links from benefiting websites.
They had to address the problem but, in my opinion, they've done it in the wrong way. Nevertheless, it's done, and we have to live with it. We still have a lot to learn about the Big Daddy update, but the way I see it is that reciprocal and off-topic links are not dead, but they won't help a site as they did before. Perhaps those links won't count against a site, but they won't count for it, and links are now needed that count for a site, if it is to be fully indexed.
The best links to have are one-way on-topic links into the site, but because of what Google did to natural linking, they aren't easily found. Google caused people to not link naturally, and most sites don't naturally attract links, but the links must be found. The most obvious places to get them are directories. DMOZ can take a very long time to review a site, and even then the site may not be included, but it's a very good directory to be listed in, so it's always worth submitting to it (read this before submitting to DMOZ).
Other directories are well worth submitting to, and a good sized list of decent ones can be found at VileSilencer. Google may not credit all the links from all of them, but that doesn't matter as long as some of them are credited - and all of them may send some traffic.
Google isn't against link-building, and their own people suggest doing it. But it is ludicrous that we now have the situation where Google first destroyed the natural linking of the Web, and then turned around to suggest ways of unnaturally getting natural links, just so that a website can be treated fairly by them. It's a ludicrous situation, but that's the way it is. Some of the unnatural ways that Google suggests are writing articles that people will link to, writing a blog that people will link to, and creating a buzz. But most people don't want to write articles and blogs, and would have nothing to write or blog about, and very few sites can create a buzz, so for most people, a buzz is a complete non-starter.
Update:
Since writing this article, it occurred to me that I may have jumped to the wrong conclusion as to what Google is actually doing with the Big Daddy update. What I haven't been able to understand is the reason for attacking certain types of links at the point of indexing pages, instead of attacking them in the index itself, where they boost rankings. But attacking certain types of links may not be Big Daddy's primary purpose.
The growth of the Web continues at a great pace, and no search engine can possibly keep up with it. Index space has to be an issue for them sooner or later, and it may be that Big Daddy is Google's way of addressing the issue now. Search engines have normally tried to index as much of the Web as possible, but, since they can't keep pace with it, it may be that Google has made a fundamental change to the way they intend to index the Web. Instead of trying to index all pages from as many websites as possible, they may have decided to allow all sites to be represented in the index, but not necessarily to be fully indexed. In that way, they can index pages from more sites, and their index could be said to be more comprehensive.
Matt Cutts has stated that, with Big Daddy, they are now indexing more sites than before, and also that the index is now more comprehensive than before.
If that's what Big Daddy is about, then I can't find fault with it. But it doesn't make any difference to webmasters. We still need to find more of those one-way on-topic inbound links to get more of our pages in the index.
http://www.webworkshop.net/googles-big-daddy-update.html
Effective SEO Comes Cheap
Search engine optimization, or SEO, is the hottest way to drive targeted traffic to your website. Maximizing the benefits of a well-optimized website will yield lots of earnings for the marketer. However, SEO-ing your site could cost you thousands of dollars if you are a newbie in this field.
But to tell you the truth, you can get information on low-cost SEO almost anywhere on the Internet. Only a few sources, however, really show you how to work out an affordable search engine optimization campaign. This article is one of those few.
1. Link exchanges
One cheap SEO method that can get you good results is link exchanges, or linking to and from other web sites. Depending on the websites that you would like to exchange links with, this could even cost you nothing at all. Contact the author or owner of the web site you want to exchange links with. You may be surprised at how quickly your page ranking climbs using this means of getting your website optimized.
2. Write or acquire key word rich articles
Writing truly informative and keyword-rich articles is one surefire way to make your Internet business more visible. Either you write your own articles or you get them from article directories that allow you to post their articles on your website, as long as you keep the resource box or the author's byline intact. Just don't stuff your articles with so many keywords that readers get bored reading them. The readability and freshness of your articles will still determine whether your readers keep coming back to your website.
3. Catchy Domain Name
Nothing will make your target visitors remember your website better than an easy-to-recall domain name. Something short and sweet will prove invaluable. Registering your domain name is not free - but creativity is.
4. Organize your site navigation
Providing easy steps for navigating your site is one way to put your visitors at ease with it. This, in turn, will improve the flow of traffic to your website (a small sketch of a shallow navigation structure follows this list).
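For illustration only - the section and page names below are made up - a shallow navigation structure keeps every page within a click or two of the home page:

    # Hypothetical example of a shallow, easy-to-follow site structure:
    # every page sits at most two clicks from the home page.
    site_nav = {
        "Home": "/",
        "Products": {
            "Widgets": "/products/widgets.html",
            "Gadgets": "/products/gadgets.html",
        },
        "Support": {
            "FAQ": "/support/faq.html",
            "Contact": "/support/contact.html",
        },
    }

    def print_nav(menu, depth=0):
        # Render the menu as an indented text outline, one line per entry.
        for label, target in menu.items():
            if isinstance(target, dict):
                print("  " * depth + label)
                print_nav(target, depth + 1)
            else:
                print("  " * depth + label + " -> " + target)

    print_nav(site_nav)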
Low-cost SEO is always evolving, like any other approach in information technology. There are many methods that can land you in the top ten rankings of Google or any other search engine. Some may cost a lot, but there are methods that can give you the same results at a low price - or that you can even carry out on your own, such as those mentioned above.
About the Author
John Ugoshowa. You are welcome to use this article on your website or in your ezines as long as you have a link back to http://www.quickregister.net/partners/. For more information on internet marketing see the Internet Marketing section of Quickregister.net Free Search Engine Submission Service at: http://www.quickregister.net/partners/
Search engine optimisations
For many businesses, optimization of their web site and good search engine rankings can make or break them. This is the modern world of search engine marketing, where professional and expert companies can offer successful search engine optimization to effectively market web sites on the worldwide web.
It is no longer sufficient to have a great web site design, informative content, useful functions, and top products and services. Today the internet is a humming marketing world, offering online customers many choices of companies specializing in any type of product or service. It is becoming more and more important for the success of your online business to ensure that search engines find and include your web site in their listings.
Research has found that 80-90% of internet traffic is generated through search engines. That is, most online users find products, services and information using search engines. Therefore, for your web site to receive more traffic, it is imperative for it to be listed high on top search engines such as Google, Yahoo and MSN. With expert and professional search engine optimization services, your web site can be optimized in such a way as to ensure that search engines find and include it in their listings.
Today, when online users search for products, services and information, they will often refine their search using key words or terms that describe the desired product or service. Search engine optimization techniques use researched key terms and words, included in informative content on your web site, to improve both the quantity and the quality of the online clients you attract.
With expert SEO services, you can expect greater interest in your web site. Professional techniques will allow well-suited potential customers to find your web site and let your products and services be identified by those customers. SEO is a subset of Search Engine Marketing (SEM), and most SEM companies will offer SEO as a service. Usually this service includes content writing containing researched key terms and words, creating inbound links from other highly ranked web sites (sometimes with the help of link-building software), search engine submissions, and monitoring of your web site's progress on the search engines.
Choose an expert company which can offer all these services, has a history of experience in the SEO field, and can provide professional services from skilled staff and personnel.
About the Author
Dylan Brent wrote this for the online marketers for MVI Search (http://www.mvisearch.com). He is interested in modern search engine optimisation practices.
Internet Marketing, the latest Trend of online marketing
Internet Marketing is a new trend for promoting your product or your website over the Internet. With an increasing number of people becoming net surfers, it has become a vital part of your marketing to sell your product or promote your website via the Internet.
In brief, you can say that "Internet Marketing" is a phrase that could be the perfect definition of a successful online business. The latest trend among online businesses is to regard Internet Marketing as one of the best techniques for generating remarkable revenues through their websites.
Latest Internet Marketing Techniques:
•E-Mailers: E-mailers are simple e-mails, like the ones we send to our friends. In the case of business e-mails, we send information about our services or products to a potential client who may be a buyer for that service or product. Using e-mailer software, you can send e-mails in bulk containing special offers that come with your product. This particular Internet marketing technique is called e-mail marketing.
•Affiliates: This is one of the most popular methods on the Internet. You offer money to a person if he or she is able to sell your product over the Internet through his or her website.
•Paid Listing on Web Directories: Most web directories offer the facility of submitting your website to their relevant categories. They charge some money for this, which can be annual or monthly as per their policy. After paying, your website will appear in the directory within the relevant category.
•Pay Per Click Advertising: Nowadays the most in-demand technique for Internet marketing is Pay Per Click advertising, which is offered by all the major search engines. By paying for your chosen keyphrases - this is called bidding - your website will appear in the search engine's sponsored listings. This is an easy technique for getting quick returns on investment (a rough cost-and-return sketch follows this list).
•Search Engine Optimization: Search engine optimization is another globally accepted technique for making a website visible on the search engines. It takes time and tremendous effort. Search engine optimization is another name for enhancing a website according to the algorithms of the search engines; your website should follow the criteria set by the search engines to get top rankings. With search engine optimization you do not need to pay the search engines anything to show your results. The results that come through ethical search engine optimization will be reliable and stable for a long period, provided you keep updating your website from time to time in line with the search engines' algorithms.
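To put some rough numbers on the pay-per-click point above - the bid, click and conversion figures are invented purely for illustration, and real values vary widely by keyphrase and market:

    # Back-of-the-envelope pay-per-click arithmetic with made-up numbers.
    bid_per_click = 0.40       # paid to the engine each time the sponsored listing is clicked
    clicks = 500               # clicks bought in a month
    conversion_rate = 0.02     # fraction of clicks that turn into a sale
    profit_per_sale = 60.00    # profit on each sale before advertising costs

    ad_spend = bid_per_click * clicks                 # 200.00
    sales = clicks * conversion_rate                  # 10 sales
    gross_profit = sales * profit_per_sale            # 600.00
    print("spend %.2f, profit %.2f, return %.1fx" % (ad_spend, gross_profit, gross_profit / ad_spend))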
About the Author
http://www.articletrader.com/internet/seo/internet-marketing-the-latest-trend-of-online-marketing.html
On Page Optimization Techniques
As every Internet user knows, search engine optimization, or SEO, is incredibly important. You have a web site because you want visitors, and without visitors there's really no reason to have a web site at all - right? But you can't get visitors without a high ranking with the search engines, and this is where SEO comes in. If you want your site to work, you have to know how to make search engines work for you.
But search engines can be a tricky thing, and not everyone knows how to really use them. A lot of Internet users go to a search engine first, no matter what they might be looking for online. The thousands and thousands of sites that the search engine doesn’t find simply don’t get visited, because no one knows that they exist. Anyone who has ever had a web site knows a little something about SEO, but not everyone knows exactly how to make the most out of it. Search engine optimization can be used to get you more visitors than you ever dreamed, and that’s the whole point of the Internet.
Your web site consists of many different pages, and every single one of these pages can be maximized for search engine optimization. Using these search engine techniques on every page will definitely increase your web traffic and your ranking with the search engines. The tips below will help you turn your site into a number one:
- Internal links
First of all, search engines like sites that have lots of links. Put links on every single page of your site. Internal links are the best, because these links will keep your visitors within your site no matter how much they click around and choose different links. Internal links can even link to different places on the same page of the site. Putting your links within content on the page will also help increase your ranking with the search engines. Placing lots of internal links on your site is a great way to optimize your site for the search engines.
- External links
You definitely want to have fewer external links than internal links. External links will only lead visitors away from your site, and you don't want that. However, search engines give higher rankings to sites that have both kinds of links - so be sure to include some within the pages of your site. These links can be tucked away at the margins or at the bottom of the page, if you wish, but they should still be present.
- Keywords
Naturally, your keywords are the most important part of your web site. Each and every single page of your site should be optimized for keywords. Whatever your keywords are, you want them to appear at least four times per page of your site. Keywords must always appear within content, meaning alongside other text, on the site. Search engines, while they can't read, can scan text. The more content you have, the better, and you want your keywords to appear frequently within that text on your pages (a quick count, like the sketch below, makes this easy to check). Optimize every page for keywords, and you'll get a much higher ranking with the search engines.
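Here is a rough sketch of that count; the file name and the keywords are only placeholders, and a real check would also strip the HTML tags before counting:

    # Rough check of the "at least four times per page" rule of thumb above.
    import re

    def keyword_count(text, keyword):
        # Whole-phrase, case-insensitive occurrences of the keyword.
        return len(re.findall(re.escape(keyword), text, flags=re.IGNORECASE))

    page_text = open("page.html", encoding="utf-8").read()   # placeholder file name
    for kw in ["miami hotels", "cheap flights"]:             # placeholder keywords
        n = keyword_count(page_text, kw)
        note = "ok" if n >= 4 else "consider using it more often"
        print("%-15s %2d occurrences (%s)" % (kw, n, note))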
Everyone with a web site knows how important search engine optimization is. Search engines will bring in visitors, and a web site must have lots of visitors to thrive. By optimizing every page of your site, you’ll increase your chances of getting a high ranking. High ranking with search engines means more visitors. More visitors mean more profits and more business for you. That’s why SEO is so important, and that’s why it’s something that you absolutely have to do!
About the Author
Charles Preston is the president of Click Response. Click Response offers small businesses a search engine optimization service that guarantees 1st page ranking at an affordable price.
5 Great Simple Search Engine Optimization Tips
Here are five of my favorite simple search engine optimization tips:
1. Always work on increasing your link popularity. You can do this with reciprocal links, and there is automated software that handles the exchanges; a little research will turn up a good option. Writing and publishing solid informational articles for your industry is another great way to earn incoming links and establish yourself as an expert. Directory listings are still worthwhile, so don't forget about those either.
2. Make sure your TITLE tag is optimized for your keywords and keyphrases, and that your META DESCRIPTION tag is written just as carefully. The title and description tags are the only meta tags many search engines still pay attention to these days, so if you have a long list of other meta tags, consider removing them to keep the page clean. (Tips 2, 3, and 5 are sketched out in the example after this list.)
3. Make sure you have a clean site map on your site. This helps the bots find and index all of your pages quickly. Try to limit the site map to no more than 60 links; if you have 200 pages, for example, just include the most important ones.
4. When you register your domain name, try to include one or two keywords or keyphrases in it. To reach a top ranking a little faster, don't aim for very hard, one-word keywords; aim for more exact keyphrases instead. If your site is in the auto industry, don't target "auto" alone; a URL like online-auto-parts.com will serve you far better.
5. When you create subpages, always name them so they include your keyphrases, for example: online-auto-parts.com/bmw_parts.htm.
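Tips 2, 3, and 5 are easy to mock up in code. The Python sketch below builds a keyphrase-led title tag and meta description for each page, derives file names from the keyphrases, and prints a simple site map capped at 60 links. The site name, page titles, and descriptions are invented placeholders, not anything from this article, so treat it as a sketch rather than a finished tool.

# page_scaffold.py -- a minimal sketch of tips 2, 3, and 5, standard library only.
# The site name, keyphrases, and page list are made-up placeholders; swap in your own.

SITE = "online-auto-parts.com"     # placeholder: a keyphrase-rich domain, as in tip 4
PAGES = {                          # placeholder: page title -> short description
    "BMW Parts": "Discount BMW parts with fast worldwide shipping.",
    "Audi Parts": "Genuine and aftermarket Audi parts at low prices.",
}

def slug(title):
    # Tip 5: file names built from the keyphrase, e.g. bmw_parts.htm
    return title.lower().replace(" ", "_") + ".htm"

def head_tags(title, description):
    # Tip 2: a keyphrase-led title tag plus a meta description tag
    return (f"<title>{title} - {SITE}</title>\n"
            f'<meta name="description" content="{description}">')

def site_map(pages, limit=60):
    # Tip 3: a clean site map page, capped at the 60 most important links
    links = [f'<li><a href="/{slug(t)}">{t}</a></li>' for t in list(pages)[:limit]]
    return "<ul>\n" + "\n".join(links) + "\n</ul>"

for title, desc in PAGES.items():
    print(slug(title))
    print(head_tags(title, desc))
    print()
print(site_map(PAGES))

Running it prints the suggested file name and head tags for each page, followed by the site map markup, which you can paste into your own templates and adjust.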
I am sure these tips will help you greatly in your search engine optimization efforts. Always do your own research as well, and you will reach the high rankings you are after.
About the Author
Choose your own SEO service from our new SEO services site.