Friday, May 18, 2007

6 Things High Ranking Websites Have That Yours Doesn't

I've looked at a lot of websites over the years and helped a lot of clients, and I've yet to meet anyone who is totally committed to making their site as successful as it could be.


This is particularly true when it comes to search engine optimization. Most online businesses make a half-hearted attempt to get better search engine rankings, but rarely implement more than one or two of the things that could really help make their site a success.

Which of the following does your website have?

1. Unique Page Titles.
Take a look at a selection of web pages that come up high in the search engine results for any search term and you'll notice they all have one thing in common: unique page titles. Denoted by the <title> tag in HTML, page titles tell the search engines what your web page is about, and are thought to be a critical factor in how search engines determine the order in which pages are displayed in the search engine results pages (SERPs) for a specific search term.
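
To make this concrete, here is a minimal sketch of one way to give every page its own title on a simple PHP-templated site. The file names and titles are invented for illustration:

<?php
// Hypothetical map of pages to unique, descriptive titles.
$titles = array(
    '/index.php'   => 'Acme Rebounders - Exercise Mini-Trampolines',
    '/reviews.php' => 'Mini-Trampoline Reviews | Acme Rebounders',
);
$page  = $_SERVER['SCRIPT_NAME'];
$title = isset($titles[$page]) ? $titles[$page] : 'Acme Rebounders';
?>
<title><?php echo htmlspecialchars($title); ?></title>

However you generate it, the point is that no two pages should share the same title.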

2. Descriptive Keyword Links.
How do search engines find and index your site? They follow links from one page to another, indexing content as they go. If the links they follow contain keywords and phrases that are relevant to the page content, the search engines will boost the ranking of that page in their results. You should also use keywords and phrases in the site's navigation (menu), as those terms are (or should be) highly relevant for the page they link to.
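
As a quick sketch, here is what keyword-rich navigation might look like when generated with PHP. The URLs and anchor text are made up for illustration:

<?php
// Navigation links with descriptive, keyword-rich anchor text.
// A generic label like "click here" tells the engines nothing;
// these labels say exactly what each target page is about.
$nav = array(
    '/mini-trampolines/'   => 'Exercise Mini-Trampolines',
    '/trampoline-reviews/' => 'Mini-Trampoline Reviews',
);
foreach ($nav as $url => $anchor) {
    echo '<a href="' . $url . '">' . htmlspecialchars($anchor) . '</a>' . "\n";
}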

3. Keywords in Your On-page Copy.
If you want the search engines to know what your page is about, and rank it appropriately, you must scatter keywords throughout your on-page copy. Keywords in your copy establish your site's relevance for words searchers use when they're looking for your product or service.

4. Clean, Accessible Website Design.
Messy, bloated HTML code, 404 errors, redirects, too many graphics, content hidden behind forms, etc., all hinder the search engines' ability to index your site. And if they can't index it, they can't rank it. Follow the W3C's accessibility guidelines when building your website. Better yet, ask your website designer what they know about standards-compliant website design. If they can't answer, find someone who can.

5. Focused Site Topic.
It seems logical that the more focused your site is, the higher it will rank for related search terms. For example, a site that focuses on selling exercise mini-trampolines will probably do better for the search term "mini trampolines" than a site that tries to sell a selection of unrelated products or many different kinds of exercise equipment. In addition, it makes sense to create specialty websites whenever possible. That way, you don't fall into the trap of trying to do too many things and ending up doing none of them well.

6. Relevant Incoming Links.
The number of other sites that link to your site's pages is important but the quality of those sites, and the text used in the link, carries much more weight with the search engines. For example, one relevant link from an "authority" site such as a .org, .gov, or a site that's proven itself as a reliable source, provides more value than several links from unrelated or "unproven" sites.

Of course, there are several other factors that go into determining a site's ranking (far too many to go into here) but these are some of the most important.

They're all easy to implement, so there's really no excuse for not taking advantage of them, especially if you want to make sure your site is found and visited by as many people as possible. Then all you'll need to worry about is getting those people to buy something from you on a regular basis.

And that's really what it's all about.
To your success,
Julia

Read and comment on this article at http://www.juliahyde.com/mw/high-ranking-websites

About The Author:

Julia Hyde is an advertising copywriter and consultant specializing in search engine optimization, search engine marketing, and traditional advertising. She currently runs Creative Search Media, a full-service advertising and search engine marketing agency. You can contact Julia via her website at http://www.juliahyde.com

Factors Affecting PageRank

PageRank is a link analysis algorithm which assigns a numerical weighting to each element of a hyperlinked set of documents, such as the World Wide Web, with the purpose of "measuring" its relative importance within the set. The algorithm may be applied to any collection of entities with reciprocal quotations and references.
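
The formula published in the original PageRank paper is PR(A) = (1 - d) + d * (PR(T1)/C(T1) + ... + PR(Tn)/C(Tn)), where T1...Tn are the pages linking to page A, C(T) is the number of outbound links on page T, and d is a damping factor, usually set to 0.85. As a rough sketch, here is that iteration in PHP for a made-up three-page web (the pages and their links are invented for illustration):

<?php
// Toy PageRank power iteration. $links[$page] lists the pages
// that $page links out to.
$links = array(
    'A' => array('B', 'C'),
    'B' => array('C'),
    'C' => array('A'),
);
$d  = 0.85; // damping factor
$pr = array('A' => 1.0, 'B' => 1.0, 'C' => 1.0);

for ($i = 0; $i < 50; $i++) {                    // repeat until the scores settle
    $next = array('A' => 1 - $d, 'B' => 1 - $d, 'C' => 1 - $d);
    foreach ($links as $page => $outlinks) {
        $share = $pr[$page] / count($outlinks);  // PR is split across outlinks
        foreach ($outlinks as $target) {
            $next[$target] += $d * $share;
        }
    }
    $pr = $next;
}
print_r($pr); // C ends up highest: it receives all of B's PR plus half of A's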


Google might use the following to determine the ranking of your pages:
• The frequency of web page changes
• The amount of web page changes (substantial or shallow changes)
• The change in keyword density
• The number of new web pages that link to a web page
• The changes in anchor texts (the text that is used to link to a web page)
• The number of links to low trust web sites (for example, too many affiliate links on one web page)

Domain Name Considerations:
• The length of the domain registration (one year vs. several years)
• The address of the web site owner, the admin and the technical contact
• The stability of the site's data and of the hosting company
• The number of pages on a web site (web sites must have more than one page)

How Google might rate the links to your web site:
• The anchor text and the discovery date of links are recorded
• The appearance and disappearance of a link over time might be monitored
• The growth rates of links as well as the link growth of independent peer documents might be monitored
• The changes in the anchor texts over a given period of time might be monitored
• The rate at which new links to a web page appear and disappear might be recorded
• The distribution rating for the age of all links might be recorded
• Links with a long life span might get a higher rating than links with a short life span
• Links from fresh pages might be considered more important
• If a stale document continues to get incoming links, it will be considered fresh
• Google doesn't expect that new web sites have a large number of links
• If a new web site gets many new links, this will be tolerated if some of the links are from authoritative sites
• Google indicates that it is better if link growth remains constant and slow
• Google indicates that anchor texts should be varied as much as possible
• Google indicates that burst link growth may be a strong indicator of search engine spam

For more details on PageRank, visit www.halfvalue.com and www.halfvalue.co.uk

http://www.selfseo.com/story-19296.php

Google's New Web Page Spider

Search engines use automated software programs that crawl the web. These programs, called "crawlers" or "spiders", go from link to link and store the text and the keywords from the pages in a database. "Googlebot" is the name of Google's spider software.
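
To illustrate the pattern, here is a toy crawler in PHP. Real spiders also honour robots.txt, throttle their requests, and store the full page text; the start URL below is just a placeholder:

<?php
// Follow links from page to page, recording each page's title.
$queue   = array('http://www.example.com/');
$visited = array();

while ($queue && count($visited) < 10) {   // small cap for the example
    $url = array_shift($queue);
    if (isset($visited[$url])) {
        continue;
    }
    $visited[$url] = true;

    $html = @file_get_contents($url);      // fetch the page
    if ($html === false) {
        continue;
    }

    // "Store the text and the keywords" - here we just record the title.
    if (preg_match('/<title>(.*?)<\/title>/is', $html, $m)) {
        echo $url . ' => ' . trim($m[1]) . "\n";
    }

    // Queue every absolute link found on the page.
    if (preg_match_all('/href="(http[^"]+)"/i', $html, $m)) {
        foreach ($m[1] as $link) {
            $queue[] = $link;
        }
    }
}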


Types of Google Spiders:
Many webmasters have noticed that there are now two different Google spiders that index their web pages. At least one of them is performing a complete site scan:

• The normal Google spider: 66.249.64.47 - "GET /robots.txt HTTP/1.0" 404 1227 "-" "Googlebot/2.1"

• The additional Google spider: 66.249.66.129 - "GET / HTTP/1.1" 200 38358 "-" "Mozilla/5.0"

Difference between these two Google spiders
The new Google spider uses a different user agent: "Mozilla/5.0". As the log entries above show, it also requests pages with the HTTP 1.1 protocol, and it might be able to understand more content formats, including compressed HTML.
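
If you want your pages to take advantage of compression, a sketch like the following serves gzip-compressed HTML only to clients whose Accept-Encoding header says they can handle it (whether a given spider actually requests gzip is, of course, up to the spider):

<?php
// Compress output for clients that advertise gzip support.
$encodings = isset($_SERVER['HTTP_ACCEPT_ENCODING'])
    ? $_SERVER['HTTP_ACCEPT_ENCODING'] : '';

if (strpos($encodings, 'gzip') !== false) {
    ob_start('ob_gzhandler'); // PHP's built-in gzip output handler
} else {
    ob_start();
}
?>
<html>
<head><title>Example page</title></head>
<body>Page content here.</body>
</html>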

AdWords Spider
Google is using a new crawler software program for their AdWords advertising system that automatically spiders and analyzes the content of advertising landing pages. Google tries to determine the quality of the ad landing pages with the new bot. The content of the landing page will be used for the Quality Score that Google assigns to your ads. Google uses the Quality Score and the amount you are willing to pay to determine the position of your ads. Ads with a high quality score can rank higher even if you pay less than others for the ad.
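
Google hasn't published the exact formula, but a simplified illustration of the idea - assume ad rank is just bid multiplied by Quality Score, which is an assumption for this example - shows how a cheaper ad can win:

<?php
// Hypothetical, simplified ad ranking: rank = bid * Quality Score.
$ads = array(
    'Ad A' => array('bid' => 1.00, 'quality' => 8),
    'Ad B' => array('bid' => 2.00, 'quality' => 3),
);
foreach ($ads as $name => $ad) {
    $rank = $ad['bid'] * $ad['quality'];
    printf("%s: rank %.1f\n", $name, $rank); // Ad A (8.0) beats Ad B (6.0)
}

Here Ad A pays half as much per click but still outranks Ad B, because its ad and landing page are judged to be of higher quality.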

Purpose of Google Spider
Google hasn't revealed the purpose of the new spider yet. There are two main theories:

• The first theory is that Google uses the new spider to spot web sites that use cloaking, JavaScript redirects and other dubious web site optimization techniques. As the new spider seems to be more powerful than the old spider, this sounds plausible.

• The second theory is that Google's extensive crawling might be a panic reaction because the index needs to be rebuilt from the ground up in a short time period. The reason for this might be that the old index contains too many spam pages.

What does this mean to your web site?
If you use questionable techniques such as cloaking or JavaScript redirects, you might get into trouble. If Google really uses the new spider to detect spamming web sites, it's likely that these sites will be banned from the index. To obtain long-term results on search engines, it's better to use ethical search engine optimization methods.

Receive Email When Google Spiders Your Page
A search engine spider is an automated software program that locates and collects data from web pages for inclusion in a search engine's database. The name of Google's spider is "Googlebot". If you have a web site that allows you to use PHP code, then your web pages can inform you when Google's spider has indexed them. A little piece of PHP code can recognize Googlebot when it visits a web page, and inform you by email that Googlebot has been there.
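
Here is a minimal sketch of such a script. Replace the address with your own; PHP's mail() function needs a working mail setup on the server:

<?php
// Googlebot identifies itself in the User-Agent header, so watch for it.
$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';

if (stripos($agent, 'Googlebot') !== false) {
    mail(
        'you@example.com', // your address (placeholder)
        'Googlebot visit',
        'Googlebot requested ' . $_SERVER['REQUEST_URI'] . ' on ' . date('r')
    );
}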

For more details on Google's spider, visit www.halfvalue.com and www.halfvalue.co.uk

http://www.selfseo.com/story-19305.php

Thursday, May 17, 2007

Domain Names

things to consider when choosing a domain name
Choosing the right domain name for a website is important in several ways. One is branding, another is search engine rankings, and a third is the target audience/market. Mistakes can be made, and it is better to prevent them than to try to correct them, so right from the start of creating a new website, the domain name itself should be given some consideration.


Target Audience / Market
The major search engines, such as Google, have regional (country-specific) versions in which people are able to search only for websites and pages that are to do with the country. There are two ways for the engines to know which websites are to do with a particular country. One is the location of the server where the site is hosted, and the other is the top level domain name extension (TLD). The UK's TLDs, for instance, are .co.uk, .org.uk, etc. Unless a website has a country's domain name extension, or is hosted in the country, it won't be returned in the results when people search the regional engines for country-specific sites and pages.

If a business website is aiming only at people in its own country, then it is important to have the top level domain name extension for that country. A .com domain may be OK if the site is hosted in the country but, if the site targets one country, it's much safer to go with that country's top level domain name extension, and have the site hosted in the country.

For a world-wide audience and searches that aren't country-specific, it isn't really necessary to consider the TLD, although .com, .net, etc. domain names may be favoured in the search results.

Search Engine Rankings (keywords in domain names)
Search engines use the words in domain names as a ranking factor, and unless a site is being heavily branded, it is good to include keywords in the domain name. The domain name words should be separated with hyphens - e.g. london-hotels.co.uk rather than londonhotels.co.uk. Search engines treat hyphens as spaces, so the first domain name is "london hotels" as far as the engines are concerned, whereas the second one is "londonhotels" - all one word, and a word that people don't search on, so it doesn't help with the rankings. The first example does help with ranking the site for 'London hotels'.


Branding (business name)
It goes without saying that, when branding a business or company is important, the domain name should be the name of the brand. Even so, it is still necessary to have a country-specific domain name if the business targets people in a particular country.

Using a branded domain name does not prevent the site from doing well in the search engines for its important keywords - it just doesn't add any help for ranking purposes.

Some online brands combine the brand with keywords. For instance, in the UK, we have a company called cottages4u that advertises on TV. Unfortunately, the word "cottages" in the domain name doesn't help with ranking it for 'cottages', because it isn't a separate word. On the other hand, using separate words (cottages-4u), so that the domain name helps with the rankings, isn't good for people to type into their browsers. Keywords in domains are helpful for rankings, but not so much that it's better to include hyphens in a brand name - it isn't. Separating keywords with hyphens is good when people are not expected to type the domain URL into a browser but, for branding, it's better not to use hyphens.

Summary

Right from the start of creating a website, consideration must be given to its domain name. If the business targets only one country, use the domain name extension of that country. If the business is to be strongly branded, use the domain name for the brand name, and not for keywords, unless they coincide. If the domain name isn't important as a brand, then make it up of important keywords, separated by hyphens.

http://www.webworkshop.net/domain-names.html

Website Hosting Scams

how they are done and how to spot them

Rogue Website Hosting Companies
Outright scams by website hosting companies are rare, but they do exist and should be watched out for. We had a case in the forum (Rogue web hosting company changing customers' websites) where the web host used the server to intercept all page requests for all sites of all of their website hosting customers. If a request was from a person, the normal page was returned, but if a request was from a major search engine spider, the web host's server modified the page that was returned by dynamically adding a set of links to it. This was done to all the web host's customers' sites, without their knowledge or permission.

Most of the added links pointed to the web host's own websites, but some pointed to non-existent (virtual) sub-folders within the customer's site. When any search engine spider requested a page within those non-existent sub-folders, the system dynamically created a page of links to the host's own sites, and the host's sites got even more links pointing to them. When people requested any of the non-existent pages, they were redirected to the site's home page, but for search engines, each hosting customer's site contained sections that the customer didn't put there, and didn't know existed. The extra sections couldn't be seen by FTP, because the sub-folders and pages didn't physically exist. They were virtual, dynamically created, sub-folders and pages, which were only ever seen by search engine spiders.

The purpose of the scam was to add many links, with targeted link text, to the website host's own sites, so that they would be pushed up the search engines' rankings. In doing it, the customers lost out in ranking positions, because the host was channelling PageRank out of their sites, which caused them to lose a little ranking power.

We were able to get Google to deal with the web host's site without harming the innocent customer sites, so that case worked out ok. But it isn't an isolated case, and the possibility of a website hosting company scamming customers is something to be aware of. The problem with that type of scam is that most customers wouldn't know how to spot it. There are no extra files to be seen by FTP, and, without knowing the file paths and names of the virtual pages, how can such a scam be spotted?


How to spot the scam
Fortunately, Google comes to our aid. Do any search in Google. At the foot of the results page there is a link to "Language Tools". Click it. Near the top of the returned page is a box in which you can enter the URL of a page to be translated ("Translate a web page:"). Enter your page URL there, leave the setting at German to English, and click "Translate". There will be no German words in your page, so it will all come back in English. If the web hosting company has changed anything in your page, you will see the changes in the translated page.

How does that work? When the web server serves one page to people, and a different page to search engine spiders, it is called cloaking. The scam that I've described serves different pages to search engines than those that are served to people. Search engines get the added links, but people don't. It is cloaking. Cloaking looks at the IP address of the requestor before deciding which page to send. If it is the IP of a search engine spider, the page with the added links is returned. We can't spoof a Google spider's IP address, but Google's translation tool uses a Google spider's IP address, because it is a Google spider, so the host returns the page that is intended for search engines. That's how you can see any modifications that have been made to your pages by your website hosting service.
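
One related check you can do yourself: fetch the same page twice, once with a normal browser user agent and once claiming to be Googlebot. This only exposes user-agent-based cloaking - it can't detect the IP-based kind described above, which is what the translation trick is for - but it costs nothing to try. The URL below is a placeholder:

<?php
// Fetch a page while claiming to be a particular user agent.
function fetch_as($url, $userAgent) {
    $context = stream_context_create(array(
        'http' => array('header' => "User-Agent: $userAgent\r\n"),
    ));
    return file_get_contents($url, false, $context);
}

$url         = 'http://www.example.com/'; // placeholder - use your own page
$asBrowser   = fetch_as($url, 'Mozilla/5.0');
$asGooglebot = fetch_as($url, 'Googlebot/2.1 (+http://www.google.com/bot.html)');

if ($asBrowser !== false && $asGooglebot !== false
        && $asBrowser !== $asGooglebot) {
    echo "The two versions differ - possible cloaking.\n";
}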

If your website hosting company is a small business, you might want to check a few pages from time to time.
Unfortunately, the original forum thread about that scam was lost in a technical glitch, but the thread linked to at the top of this piece shows what the cloaked pages looked like, and some links to 'saves' of the original thread pages are posted at the end of the thread.

Hacking Problems
A case (sitemaps hacked??) recently appeared in the SearchEngineWatch forum where websites had been hacked so that pages were modified when search engine spiders requested them. Those particular modifications were to benefit porn sites. In this case, the hack is easy to spot by FTP, because real sub-folders and files are added to the sites, so it isn't invisible as in the previous case.

Google people checked it out, and concluded that it wasn't the hosts who were to blame, but the likelihood is that cPanel usernames and passwords had been hacked so that the new sub-folders and files could be placed in the sites. The hack caused one of the sites to be penalised by Google, but it is likely to be re-included now that the cause has been exposed.

This type of hack is bound to be on the increase, and should be watched out for. It is easily spotted by FTP, due to the extra sub-folders, but it is better to prevent it in the first place by using totally obscure passwords.


http://www.webworkshop.net/hosting-scams.html

WebProWorld are Spammers

I hate email spam. Email spam is sending unsolicited bulk emails to people, i.e. the people who receive the emails didn't ask for them, and didn't agree to receive them. But what if a company tricks you by placing an agreement to receive the emails in a place where you are unlikely to see it when you click on a link for something else? Cheating? It is in my book. In that circumstance, you didn't knowingly opt in or agree to receive the emails; therefore, the emails are unsolicited - spam.

iEntry is a publishing company that produces a myriad of newsletters delivered by email. The purpose of the newsletters is to sell advertising in them. But iEntry have a problem. They can't sell advertising for very much money if the newsletters don't have reasonably large circulations. The larger the circulation, the more money they can charge for the advertisements.

Many websites offer newsletters. There is usually an opt in system on the site so that people can choose to receive them, and iEntry is no different in that respect. But those systems don't always produce the number of optins that would increase the circulation enough to make good money from the advertisements, and it must be very tempting to find other ways of acquiring optins, even if it means hiding automatic optins from people. After all, the advertisers, who would be paying more money for the ads, are unlikely to realise that most of the 'optins' didn't know that they were opting in to anything, and that they didn't agree to receive anything, and that they don't want to receive anything. The advertisers would end up paying higher prices to deliver their ads to people who they think chose to receive them - but they didn't. Not only did they not choose to receive them, but they specifically don't want to be spammed by them. But that's exactly what iEntry does.

How do they do it? Many website owners want their websites to be listed in the various online directories, so they submit them to the directories. iEntry owns a directory called Jayde. So a website owner goes to the Jayde site and submits his site. In submitting the site, he confirms that he has read Jayde's terms and conditions. Of course he doesn't read them, because it's just a directory and the T&Cs are bound to be perfectly normal. Or if he does decide to read them, he sees the length of the page, and decides not to bother. After all, it's only a directory, just like all the other directories. That's the way that most people would do it, and that's exactly what iEntry rely on to pick up newsletter optins that aren't willing optins at all. The T&Cs include a statement that submitting a site to the directory automatically opts the person in to receive a newsletter.

Yes, the statement is there to be seen, but who reads lengthy T&Cs when submitting a site to a perfectly ordinary directory? To my way of thinking, the statement is intentionally hidden from people for the purpose of getting newsletter optins without people realising it - hence, unsolicited emails.

Another way that iEntry gets unknowing optins is with their WebProWorld forum. When people register in the forum, without realising it, they automatically opt in to receive a newsletter - hence, unsolicited emails - spam. This is how it works...

The forum is perfectly normal in that, when a person registers, a confirmation email is sent. The email contains a link to activate the new account. Such confirmation emails are normal for many kinds of sites, and people are used to them, so they click the link without reading the rest. What they don't realise is that, buried at the bottom of the email, is a statement that clicking the link automatically opts them in to receive a newsletter. They didn't knowingly opt in to receive the newsletter - hence, unsolicited emails - spam.

The spamming was discussed in a thread in WebProWorld's forum. During the discussion more and more people posted to say that they were receiving the spam as well. Nobody had realised that the forum's confirmation email contained the newsletter statement, because it was hidden at the bottom, and out of sight without scrolling all the way down to see it. As a result of that discussion, WebProWorld moved the statement up a little, but it is still too far down, especially when everyone knows what the email is, and just clicks the link to complete the registration.

There was another discussion in this site's forum, called "WebProWorld are spammers".

I am sure that iEntry have more ways of getting unknowing optins that I don't know about. The upshot of it is that many, probably most, people receive their newsletters without ever knowingly opting in. They didn't choose to receive them, they were unsolicited, and that makes the newsletters spam. Since many or most of the emails are spam and unwanted, the advertisers must be paying way over the odds, wrongly believing that people who receive the newsletters chose to receive them.

Unsubscribe?
So what about opting out of the newsletters? It can be done. There's an "unsubscribe" link at the bottom of each one, and it works. But sometime down the line, it's a near certainty that the spam newsletters will start coming again - not always because iEntry adds you to the list again, although they do do that, but because the tricks are still out there waiting to catch the unwary - all of us! I have unsubscribed from every newsletter several times, but they always start up again. That's because the tricks were well hidden, and like everyone else, I didn't know they even existed.

http://www.webworkshop.net/webproworld-are-spammers.html

Doorway Pages & Links

Before Google came onto the scene with their links-based PageRank, doorway pages were the most effective of the search engine optimization techniques, because they didn't usually need any pages to link to them, and top rankings were relatively easy to achieve. But with PageRank and link popularity playing such a big role these days, are doorway pages still as effective as they used to be?

Doorway Pages Overview
A doorway page is a web page that is optimized to rank well for a particular search term. It is called a 'doorway' because people enter the website by clicking on the doorway page's listing in the search engines' search results.

It is common for a doorway page to be specially created as a doorway for a particular search term, but it is equally valid to select the site's most relevant existing page for the search term, and optimize it to rank highly in the search engines so that it becomes the doorway page for that term.

Back in the days when search engines based the rankings solely on a page's content, it was normal for one doorway page to be created for each search term. Each page was submitted to the engines and, if they were optimized better than the competition's pages, they would achieve top rankings. They could stand alone and didn't need to be a part of the 'normal' site. Therefore, they didn't need to be designed for users to see, and so they could include an instant redirect to the most relevant of the site's content pages. Search engine optimization was easy back then!

But times change, and search engines no longer base the rankings solely on a page's content. These days inbound links and inbound link text are both very important to achieving top rankings. An 'inbound link' is a link that comes into a page - from the receiving page's viewpoint it is 'inbound'. 'Inbound link text' is the text on the remote page that users click on.

For Google, inbound links and inbound link text are among the most important factors in determining rankings. It means that the traditional, specially created, stand-alone doorway pages are no longer as successful as they used to be, because they need to be linked to from other pages on the web, and acquiring good links for them from other websites is just about impossible. What site would want to link to a doorway page?

But all is not lost for doorway pages. All that's needed is to rethink what they are and what they do. Here are two main strategies:-

http://www.webworkshop.net/doorway-pages-and-links.html