Saturday, June 2, 2007

How Do Search Engines Work to List Websites?

To learn how a search engine works, we first have to understand what a search engine is. Universities and search engine developers offer various definitions; the most commonly used is:

A search engine is a program designed to find information stored on a computer system, such as the World Wide Web or a personal computer. It allows users to ask for content meeting specific criteria (typically content containing a given word or phrase) and retrieves a list of references that match those criteria. Search engines use regularly updated indexes to operate quickly and efficiently.

There are two types of search engines: robot-based search engines and human-edited search engines. Below we explain in detail how both types work, an understanding that is essential for an SEO company applying search engine optimization strategies to a website in order to generate better results.

Robot-based search engines:
This type is also known as a crawler-, spider- or ant-based search engine. It relies on information that is collected, sorted and analyzed automatically from websites by indexing spiders. A software program (known as a "robot", "spider" or "crawler") reads and indexes web pages, follows links between pages and sites, and stores the collected information for later use. The collected information is analyzed into an "index", a large database of all the sites the crawler has visited and read. A web crawler is one type of bot, or software agent. In general, it starts with a list of URLs to visit, called the seeds. As the crawler visits these URLs, it identifies all the hyperlinks on each page and adds them to the list of URLs to visit, called the crawl frontier. URLs from the frontier are repeatedly visited according to a set of policies.

Web crawlers behave according to a combination of policies:

* A selection policy that states which pages to download
* A re-visit policy that states when to check pages for updates
* A politeness policy that states how to avoid overloading websites
* A parallelization policy that states how to coordinate distributed web crawlers
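
The seed-and-frontier behaviour described above can be sketched in a few lines. This is a minimal illustration, not a real crawler: the "web" is a hypothetical dict mapping page names to their outgoing links, standing in for actually fetching and parsing pages, and `max_pages` plays the role of a simple selection policy.

```python
from collections import deque

def crawl(seeds, get_links, max_pages=100):
    """Breadth-first crawl: visit seed URLs, queue newly found links."""
    frontier = deque(seeds)   # crawl frontier: URLs waiting to be visited
    visited = set()           # URLs already read and indexed
    while frontier and len(visited) < max_pages:
        url = frontier.popleft()
        if url in visited:
            continue
        visited.add(url)
        for link in get_links(url):   # hyperlinks found on the page
            if link not in visited:
                frontier.append(link)  # add to the frontier
    return visited

# Toy "web" standing in for real pages (hypothetical page names).
web = {
    "a.html": ["b.html", "c.html"],
    "b.html": ["c.html"],
    "c.html": [],
}
pages = crawl(["a.html"], lambda u: web.get(u, []))
```

Starting from the single seed `a.html`, the crawler discovers and visits all three pages by following links. A real crawler would add the re-visit, politeness and parallelization policies listed above (delays between requests to one host, scheduled re-fetches, and so on).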

Human-powered search engines:
This type of search engine relies on humans to submit information that is subsequently indexed and catalogued. Only submitted information is put into the index. You submit a short description of your entire website to the directory, or an editor writes one for the websites they review. A search then looks for matches only in those submitted descriptions.

It is very important to know how sites are listed in search engines. The spider crawls a website starting from known pages and follows all the links within it. The spider also visits pages that are submitted manually. Some search engines encourage site owners to pay for the privilege of having their pages visited by the spider.

In both types of search engine, when you search for a word or a specific query, you are actually searching through the index created by the search engine. The results depend on the contents of the index and the competition among the websites in it. Each page stored in the database is ranked based on its contents, including the page title, meta tags, text, images and so on. A good site, with good content, might be more likely to get reviewed.

A specific page's relevance ranking for a specific query depends on factors such as the page's relevance to the words and concepts in the query, its overall link popularity, and whether or not it is being penalized for excessive search engine optimization (SEO).

The index of a search engine is a large database of information methodically collected and stored by the search engine; search result pages depend on it. Search engines frequently update their indexes, since thousands of new sites are added every day. Because results are based on the index, if the index hasn't been updated after a web page becomes invalid, the search engine treats the page as still active even though it no longer is, and it will remain that way until the index is updated. Likewise, when you add or update content on your website and resubmit it, the search engine stores the updated information in its database, but the updated result only appears once the search engine updates its index.

Sometimes the same search in different search engines shows different results. One reason is that each results page depends on the algorithm used to search through the index, and every search engine uses a different algorithm. The algorithm is what the search engine uses to determine the relevance of the information in the index to what the user is searching for.

When a search engine ranks a page, it gives special weight to keywords that appear early in the title, heading tags, bold text, the description, the alt attributes of images, and the keyword and keyword-phrase meta tags.

One of the elements a search engine algorithm scans for is the frequency and location of keywords on a web page. Pages with a higher number of keywords are typically considered more relevant. For example, one method is to rank hits according to how many times your keywords appear and in which fields they appear (heading tags, titles, meta tags or plain text). Another method is to determine which documents are most frequently linked to by other documents on the Web. A further common element that algorithms analyze is the way pages link to other pages within the website. By analyzing how pages link to each other, an engine can determine both what a page (and the pages it links to) is about and whether that page is considered "important" and deserving of a boost in ranking.
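
The "frequency and field" idea can be made concrete with a toy scoring function. The field weights below are purely illustrative assumptions, not any real engine's values; they only show why a keyword in the title counts for more than the same keyword in body text.

```python
def score(query_words, title, body, title_weight=3, body_weight=1):
    """Toy relevance score: weight keyword hits by the field they appear in."""
    title_words = title.lower().split()
    body_words = body.lower().split()
    total = 0
    for word in query_words:
        w = word.lower()
        total += title_weight * title_words.count(w)  # title hits count more
        total += body_weight * body_words.count(w)    # plain-text hits count less
    return total

# Hypothetical pages: one mentions "seo" in its title and twice in its body,
# the other only once in its body.
page_a = score(["seo"], "SEO tips", "seo basics and seo tools")
page_b = score(["seo"], "Web design", "one mention of seo")
```

Here `page_a` scores 5 (one title hit at weight 3 plus two body hits at weight 1) and `page_b` scores 1, so the page with the keyword in a prominent field ranks higher, which is the behaviour the paragraph above describes.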

As far as the user is concerned, relevancy ranking is critical, and it becomes more so as the total volume of information grows with the Web. Most people have no time to go through scores of hits to determine which hyperlinks they should actually explore. The more clearly relevant the results are, the more we are likely to value the search engine.

http://www.topranker.in/search_engine_work_list_website.htm#working_of_search_engine_for_site_listing

Importance of a sitemap for website promotion

A sitemap is a web page that lists all the pages on a website, typically organized hierarchically. A sitemap helps visitors and search engine bots find pages on the site.

The sitemap lets a search engine robot find pages quickly. Depending on the size of your website, it may actually link to every one of your pages, which means that once the robot finds the sitemap page, it can visit every page on your entire site. Having all of your content included in the search engine's database improves your chances of a good ranking on result pages when someone performs a search related to your topic.

Sitemaps improve the search engine optimization of a website by making sure that all its pages can be found. A sitemap is especially important when a website uses Flash or script-driven menus that do not include HTML links.

How to prepare a search engine friendly sitemap?
* Include text links to your most important pages; the sitemap may even link to every page.
* Write a short description for each link to inform visitors about that page.
* Provide a pathway for search engine robots to follow in order to reach your most important web pages.
* Give your visitors information about your site and show them how to reach the page they are looking for.
* To help the search engine robot understand what each page is about, include important keyword phrases in the sitemap text and hypertext links.
* Help search engine robots by linking to static web pages instead of dynamically generated pages, which the search engine may not otherwise find.
* It is a good habit to have a sitemap even for a small website; adding a sitemap for visitors and for search engine robots becomes part of the overall search engine optimization strategy.
* To make your sitemap attractive to both robots and visitors, include meaningful text along with the page URLs and links, using keywords appropriate to the content of each page you link to.
* Make your site easy to navigate; when it is easy for search engine robots to find content on your site, your chances of being positively listed in their results increase. A number of properly interlinked web pages within a website indicates to the search engine that the site's information architecture is reliable, which helps you achieve top rankings.
* It is also very important for the sitemap to sit in the main site directory, so that search engine spiders can easily find the sitemap page and follow its links, ensuring your main site content is instantly accessible to spiders and visitors.
* Finally, the link color for visited links should differ from that of unvisited links, so visitors can see which pages they have already seen; this saves time and sustains their interest in your website.
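
The points above (text links plus a meaningful description per page) can be sketched as a tiny generator for an HTML sitemap page. The page names and descriptions are made-up examples; in practice the list would come from your actual site structure.

```python
def html_sitemap(pages):
    """Build a simple HTML sitemap list from (URL, description) pairs."""
    items = "\n".join(
        f'<li><a href="{url}">{text}</a></li>' for url, text in pages
    )
    return "<ul>\n" + items + "\n</ul>"

# Hypothetical pages with keyword-rich link text, as recommended above.
page = html_sitemap([
    ("index.html", "Home - search engine optimization services"),
    ("services.html", "SEO services and web design"),
])
print(page)
```

Each entry is a plain text link, so even a robot that ignores scripts and Flash menus can follow it to the page.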

What is Google Sitemap?
* A Google sitemap is a tool established by Google to make indexing an entire website, and keeping up with its latest updates, easier for the search engine. By submitting a list of all the pages on your website, along with when those pages were last updated, you enable Google to easily update its index.
* The goal of sitemaps is simple: to provide website owners with a method of getting more pages into Google's index and of notifying Google about recent updates. The catch is that Google is already fairly competent at indexing the majority of a website and predicting when a site will update, so many website owners have found it to be a lot of work for relatively little benefit.
* A final point on the importance of a sitemap: Google, in its Webmaster Guidelines, recommends that sites have a sitemap.
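
A Google sitemap is an XML file following the public Sitemaps protocol (sitemaps.org), with a `<urlset>` of `<url>` entries each carrying a `<loc>` and optionally a `<lastmod>`. The sketch below builds such a file from a list of pages; the URLs and dates are invented examples.

```python
# Hypothetical (URL, last-modified date) pairs for the site's pages.
urls = [
    ("http://www.example.com/", "2007-06-01"),
    ("http://www.example.com/services.html", "2007-05-20"),
]

entries = "\n".join(
    f"  <url>\n    <loc>{loc}</loc>\n    <lastmod>{mod}</lastmod>\n  </url>"
    for loc, mod in urls
)
sitemap = (
    '<?xml version="1.0" encoding="UTF-8"?>\n'
    '<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">\n'
    f"{entries}\n"
    "</urlset>"
)
print(sitemap)
```

The resulting file would be saved as something like `sitemap.xml` in the site root and submitted to Google, giving the crawler both the full page list and the update dates mentioned above.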

A sitemap also helps you plan your site before you even start developing it. Once you decide which pages you want, your job is much simpler when you start designing the web pages. A site structure helps you understand how many pages the website will have and how they will be laid out. So another important point is that a sitemap should actually be the first step in planning a website.

The other important reason the sitemap is vital is that it helps visitors understand the website's structure and layout, and thus quickly reach what your website has to offer.

http://www.topranker.in/sitemap_website_promotion.htm#Importance_of_sitemap_for_website_promotion

Search Engine Optimization & Web Design

If You Expect Success from Good Website Design, Search Engine Optimization Is the Solution.

SEO Compatible Standards
Search Engine Optimization, also known as search engine placement or search engine positioning, is the process of improving a website for higher search engine rankings, and it should be the first step toward achieving top rankings. Building websites to SEO-compatible standards can greatly enhance their rankings and placement in search engines. Search engines are intelligent and constantly refine their algorithms to rank websites more fairly. Search engine ranking is determined automatically, based on a site's content, its coding, its link popularity, and other SEO factors.

Each search engine has its own proprietary algorithm for ranking websites according to the keywords the user is searching for. A directory, by contrast, ranks your site based on the relevancy a human editor assigns it upon review.

Process for Search Engine Optimization

The SEO process begins with a complete analysis of the website, including:

* Site architecture
* Robot accessibility
* SEO Copywriting
* Keyword competitiveness
* Link popularity
* Creation of Meta Information
* Presence of sitemap, robots.txt file
* Use of alt image tags and body tags
* Web Site Load Time and HTML Validation
* Analysis of competitor web sites
* Browser Compatibility and Resolution Checking
* Search Engine and Directory Submission

Worldwide use of search engines
All over the world, 24 hours a day, millions of people use search engines and directories to search for products, services and information. When keywords and phrases related to your business are entered into search engines, you want your website to rank at the top of the listings. Research has shown that if a site does not appear in the first three pages of search results, it will rarely, if ever, receive traffic from search engines. Studies show that search engines are the primary way people find websites: over 85% of web users find the sites they are looking for by using a search engine.

Search Engine Friendly Website Designs
To make a website search engine friendly, its pages are checked for browser compatibility, which ensures your website downloads correctly in different countries with different browsers in use.

How does a search engine index a website?
* Indexing of site title
* Indexing of meta tags
* Indexing of major headings
* Indexing of text content of pages
* Finally the search engine looks at the links on website.
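
The indexing order listed above (title, meta tags, headings, then body text) can be sketched with Python's standard `html.parser` module. This is an illustrative indexer, not how any real engine is implemented, and the sample page fed to it is made up.

```python
from html.parser import HTMLParser

class PageIndexer(HTMLParser):
    """Collect the fields a search engine looks at: title, meta
    description, headings, and plain body text."""
    def __init__(self):
        super().__init__()
        self.fields = {"title": "", "description": "", "headings": [], "text": []}
        self._tag = None

    def handle_starttag(self, tag, attrs):
        self._tag = tag
        if tag == "meta":  # index the meta description tag
            d = dict(attrs)
            if d.get("name") == "description":
                self.fields["description"] = d.get("content", "")

    def handle_endtag(self, tag):
        self._tag = None

    def handle_data(self, data):
        data = data.strip()
        if not data:
            return
        if self._tag == "title":                 # site title
            self.fields["title"] = data
        elif self._tag in ("h1", "h2", "h3"):    # major headings
            self.fields["headings"].append(data)
        else:                                    # remaining body text
            self.fields["text"].append(data)

p = PageIndexer()
p.feed('<html><head><title>SEO Guide</title>'
       '<meta name="description" content="How engines index pages">'
       '</head><body><h1>Indexing</h1><p>Body text here.</p></body></html>')
```

After `feed()`, `p.fields` holds the title "SEO Guide", the heading "Indexing", the meta description, and the body text separately, so each field could be weighted differently, as discussed earlier. A link-extraction pass over `<a href>` attributes would cover the final step in the list.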

Because Web Standards sites are coded correctly, they tend to do better in search engines.
Index Page: The index page is the most important page because most of the time that is where visitors start. The purpose of the home page is to get visitors' attention and let them find information quickly and easily. With compelling headlines and benefit-oriented links to inner pages, visitors can find the information that steers them to the products and services they would like to purchase. The home page is the base of operations in most cases.

Search Robot Friendly URLs: The website uses mod_rewrite or a flat file structure to ensure that URLs are crawlable.
Valid Markup: Proper usage of tags and elements, including H1-H6 and P tags, to organize content into logical structures. Search engines pay special attention to text marked with a heading tag, as such text is set off from the rest of the page content as more important. The title tag is the most important bit of text on a web page as far as search engines are concerned: they not only assign the words in the title tag more weight, they also typically display the title tag in the search results.

Anchor Text: Anchor text is the visible text of a link, used by search engines as a ranking factor in their hypertextual algorithms. It has become increasingly important to use relevant anchor text in links, and this applies not just to the external links pointing into your site but to the internal links throughout it.

Browser Compatibility: W3C-valid sites typically rank well. Insert the DOCTYPE tag at the top of every web page. A DOCTYPE ("document type declaration") informs the validator which version of HTML you're using for your web pages. DOCTYPEs are also essential to the proper rendering and functioning of web documents in compliant browsers such as Mozilla, Netscape, IE5/Mac, and IE6/Win. DOCTYPEs are a key component of compliant web pages: your markup and CSS won't validate without them.

For example:
<!DOCTYPE HTML PUBLIC "-//W3C//DTD HTML 4.01 Transitional//EN" "http://www.w3.org/TR/html4/loose.dtd">

Clean Code: Reduce code bloat by moving JavaScript to external files, and limit the use of tables and other code-heavy elements. You want more content than code.

Keyword Authenticity: The ethical practice of using keywords throughout your copy as well as in file names, alt attributes, and title and description tags.

Visible Text Links: Text links rule. Use CSS rather than Flash or JavaScript, which inhibit search engine spiders, and always provide a sitemap. Use Cascading Style Sheets (CSS) to implement a clean design throughout the site: this reduces the time needed to apply a consistent text or layout style and makes it easy to update the whole site when future changes are needed.

Relevant Outside Linking: Links pointing out from the site should focus mainly on subject matter that directly relates to content within the site.

Relevant Cross Linking: Links within the site use descriptive anchor text to describe the page being linked to. Avoid using "click here".

Websites need continuous refinement and improvement through professional web design services. Original design can also include a professional Flash introduction, logo design, web page development, multimedia presentations, and web-based software programming for a search-engine-friendly website.

The benefits of search engine optimization in website design include awareness of your company's products and services, and reduced advertising and communication costs along with an expanded marketplace. A search-engine-friendly website easily attracts customers locally, nationally and internationally, maximizing your marketing investment.

http://www.topranker.in/search_engine_optimization_web_design.htm#search_engine_optimization_web_design