Friday, July 13, 2007

How Usage Data Will Affect Your Future Rankings

With all that has changed in search engine optimization over the years, there is one thing that remains constant. Useful websites outperform useless websites over the long haul, from both a ranking and a profit perspective.

When people find your website, what do they do? Do most of them "bounce" by hitting the back button? Or do they stay on the site for a while and read an average of, say, six pages per visit? If you can't answer these questions, you cannot properly optimize your website for rankings and sales. Why? Because usage data, as it's known, is becoming more and more important in search engine algorithms.

An Obvious Point

Before we continue talking about search engine optimization, let me get one glaringly obvious point out of the way. Usage data is most important because it tells you how well people can use your website, and how useful they find it. Of course, it also tells you how well your visitors are converting (subscribing, downloading, purchasing, whatever), and these are the most important factors of all. But this is an SEO article, so let's talk about the impact of usage data on your search engine rankings.

Search Engine Evolution

Search engine companies make most of their money from advertising (sponsored search / pay-per-click). The more people using their search engines, the more money they can make from advertising revenues. In order to retain their user base (and, ideally, to grow that base), the search engine companies have to protect the quality and relevance of their search listings. You don't have to be a genius to connect those dots.

Protecting the Quality of Search Results

How do search engines ensure the quality and relevance of their listings? By continuously developing technology that rewards popular websites. But many in the SEO industry have figured out how to trick the search engine algorithms by making their websites appear larger and more popular than they really are. Call it "black hat SEO" if you want -- labels are beside the point.

The bottom line is that search engines are always trying to weed out the manipulative websites and reward the truly useful, popular websites -- websites, for example, that attract inbound links based on usefulness, not by renting and buying the majority of their links. Link building has become big business, and the more the link-building industry grows, the harder it will be for search engines to count link popularity as the number-one factor of website relevance and popularity. So they use many ranking factors. One of the factors seemingly on the rise (especially with Google) is usage data.

What Is Usage Data?

As the label implies, usage data is information that shows how people are using your website. When people find your website through a search engine, that search engine can track the visitor long enough to find out how they respond to your website (until the visitor closes the browser or deletes their cookie files). So, for example, if 90% of the people who find a website through search engines hit the back button upon reaching the home page, then the usage data suggests a low-quality website. Maybe it's a link farm, or maybe it has poor usability. Whatever the reason, search-driven visitors are leaving the website in droves, and the search engine can track this behavior.

Conversely, if 90% of the people who find a website through the search engines go on to read several pages, actually spending time on the site, then the usage data suggests a quality website (and a good match with the person's search query).

How Important Is Usage Data?

Search engine optimization "gurus" will argue all day about the current and future importance of usage data. Let them argue, I say. One thing is obvious. If you're a search engine company, and you have data that suggests the quality and relevance of a website as viewed by your users (people using search engines), you'd be a fool not to use that data to improve your ranking algorithm.

Let's remember the business model here. Better, more relevant search results help the search engine company grow its user base. A larger user base allows them to sell more advertising and at a better price. More advertising revenue helps the search engine company survive and thrive in a highly competitive industry.

Still One Factor of Many

It's also important to note that usage data is still one of many factors the search engines use to determine website rankings. You still need a well structured website; you still need plenty of relevant, original content; and you still need strong link popularity. But usage data plays a role too, and I predict that role will only expand in the coming years.

Killing Two Birds...

When you improve your website's usage data, you are killing two birds with one stone. You are helping your search engine / SEO cause, but you are also making your website more successful in general. That's because positive usage data suggests a useful website, the kind of website that people come back to and recommend to others. And that's a fundamental requirement in the success of any website.

So how do you improve your website in this regard? Here are some tips.

1. Overall usability - The easier your website is for people to use, the more likely they are to actually use it. All other things being equal, this means they will navigate more easily and read more content than they would on a website with poor usability. Your conversion rates will also be higher if you improve usability, so this is a double benefit.

2. Home page usability - Overall website usability is important, but home page usability is absolutely critical. There are two primary reasons for this. First of all, the majority of people who find your website through search engines will enter through the home page (unless you have sub-directories with unique topics). So you should work extra hard to make sure your home page is enticing, inviting and easy to understand. Secondly, people who enter your site through an internal page will often go to the home page to find their way around. It's the equivalent of a "You are here" poster in a shopping mall. If people can't figure out your home page, they can't figure out your website.

3. Home page enticement - In the last point, I briefly mentioned the enticement factor. Even if you have pages and pages of useful, interesting content, you have to lure people in to read it. This will reduce the number of people who "bounce" elsewhere upon reaching your home page, and it will also increase your conversion rates and general website success. Determine your website's primary benefit to visitors, and make sure that benefit is crystal clear on the home page.

4. Quality of content - I always get a good laugh out of SEO "experts" who say that content doesn't matter, that only links matter. These people are shortsighted to the point of being harmless, so I usually let them think what they want. Linking may increase a website's search engine ranking, but without quality content your search-driven traffic will bail out on you. Goodbye visitors. Goodbye conversions. Goodbye profit. And goodbye usage data!

5. Organization of content - Organization is a sub-theme of usability, which we talked about above. All other things being equal, good organization will separate a successful website from a flop. But it also takes some effort. I always say that "it's damn hard to make a website easy," but you have to do it. You have to draw your little sketches and reason things out in order to create a website that makes sense to people. There should be a place for everything, and everything should be in its place.

6. Interlinking for stickiness - If you have plenty of quality content on your site (and lots of pages as a result), interlinking can help you increase your website's "stickiness." In this context, interlinking is linking between various pages on your website. If you're writing a page about purple widgets, and that page mentions orange widgets as well, make it a hyperlink to the page on orange widgets. WebMD.com and About.com both do this well.
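
To make that concrete, here is a minimal sketch of what such an internal link might look like in a page's HTML (the page name, path, and anchor text are made up for illustration):

    <!-- On the purple widgets page: the mention of orange widgets becomes a link -->
    <p>Our purple widgets pair nicely with our
    <a href="/widgets/orange-widgets.html">orange widgets</a>.</p>

The point is simply that the mention itself carries the link, so readers (and crawlers) can move naturally from one related page to the next.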

Conclusion

Despite the endless arguments in SEO circles, usage data matters. It matters for search engine ranking and will continue to evolve in this role. But more importantly, it matters to you because it's a direct measurement of how people are using your website, which relates to your website's success in general.


http://www.isedb.com/db/articles/1684/


How Brands Can Mitigate eBay’s Aggressive Search Engine Advertising Tactics

There is a way for brands to address unauthorized sales of new merchandise on eBay, activity that is essentially an attempt to low-ball or undercut the retail pricing established through corporate and authorized retailer channels.

Many companies are unaware of this, but eBay has a program to empower brands and eliminate trademark infringement and copyright violations in eBay auctions. The result of enforcement is a precipitous drop in the number of third parties leveraging a brand’s IP (intellectual property) assets for profit. The name of eBay’s program is VeRO, and this acronym stands for Verified Rights Owner.

The VeRO program provides participants with the ability to identify and request removal of allegedly infringing items and materials, particularly the unlawful use of trademarks and copyright-protected content. If the infringing eBay user attempts to publish the same or similar listing, their account will be suspended by eBay.


How It Works

Once you have identified a listing on eBay that violates your copyright, trademark, or other intellectual property rights, all you need to do is download eBay’s Notice of Claimed Infringement (NOCI) form, fill it out, and fax it to eBay. You can download the form by visiting eBay.com. After eBay has received your first NOCI, they will send you an electronic version of the NOCI form as well as instructions on how to submit future reports electronically.

In addition, eBay will allow organizations to educate eBay users about their products and legal positions by creating an "About Me" page. eBay states that many of their users cease listing potentially infringing items when presented with such information. For example, Motorola’s “About Me” page defines trademark infringement and copyright violation, then further explains why an auction or sale of “Motorola” or Motorola “related” products and/or services may be halted.

If your company is a direct selling or network marketing brand, such as Mary Kay or Herbalife, you may also cite the violation of a trade agreement between your company and your distributors to remove listings.

Incidentally, Apple Inc., the manufacturer of the iPhone, does not currently participate in the VeRO Program.


http://www.isedb.com/db/articles/1685/

Getting Into Google

Last night was our third SEMNE event (Search Engine Marketing New England), and we were humbled to have Dan Crow, director of crawl systems at Google, spilling the beans about how to get your site into Google. He talked for a half hour or so, and then proceeded to answer audience questions for at least another hour. As I sat there listening to him (yes, I actually listened to this one!), I was struck by what an awesome opportunity it was for everyone in that room to be provided with such important information -- straight from Google. It was clear that the 100 or so people in the room agreed. In fact, at 7:30 on the dot, everyone spontaneously stopped their networking activities and simply took their seats without being asked to. These folks definitely came to hear Google!

What Is Indexing?

Dan started out his presentation discussing what "indexing" means and how Google goes about it. Basically, the process for the Google crawler is to first look at the robots.txt file in order to learn where it shouldn't go, and then it gets down to business visiting the pages it is allowed to visit. As the crawler lands on a page, it finds the relevant information contained on it, then follows each link and repeats the process.

Robots.txt Explored

Dan proceeded to explain how to use your robots.txt file for excluding pages and directories of your site that you might not want indexed, such as the cgi-bin folder. He told us how each of the major search engines has its own commands for this file, but that they're working to standardize things a bit more in the future.
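
To illustrate, a minimal robots.txt sketch that keeps crawlers out of the cgi-bin folder and one other directory (the second directory name is just a placeholder) could look like this:

    User-agent: *
    Disallow: /cgi-bin/
    Disallow: /private/

The wildcard User-agent line applies the rules to all crawlers, Googlebot included; a crawler reads this file first, as Dan described, before visiting the pages it is allowed to visit.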

In terms of what the crawler looks at on the page, he said there are over 200 factors, with "relevance" playing a big part in many of them.

Google Still Loves Its PageRank

Dan also discussed the importance of PageRank (the real one that only Google knows about, not the "for-amusement-purposes-only" toolbar PR that many obsess over). He let us know that having high-quality links is still one of the greatest factors towards being indexed and ranked, and then he proceeded to explain how building your site with unique content for your users is one of the best approaches to take. (Now, where have you heard that before? ;) He explained how creating a community of like-minded individuals that builds up its popularity over time is a perfect way to enhance your site.

Did You Know About These Tags?

We were also treated to some additional tips that many people may not have known about. For instance, did you know that you could stop Google from showing any snippet of your page in the search engine results by using a "nosnippet" tag? And you can also stop Google from showing a cached version of your page via the "noarchive" tag. Dan doesn't recommend these for most pages since snippets are extremely helpful to visitors, as is showing the cache. However, Google understands that there are certain circumstances where you may want to turn those off.
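
Both directives are set with a robots meta tag in the page's head section. As a minimal sketch, assuming you wanted to apply each one, the tags would look something like this:

    <meta name="robots" content="nosnippet">
    <meta name="robots" content="noarchive">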

Breaking News!

Google is coming out with a new tag called "unavailable_after" which will allow people to tell Google when a particular page will no longer be available for crawling. For instance, if you have a special offer on your site that expires on a particular date, you might want to use the unavailable_after tag to let Google know when to stop indexing it. Or perhaps you write articles that are free for a particular amount of time, but then get moved to a paid-subscription area of your site. Unavailable_after is the tag for you! Pretty neat stuff!
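
Going by the example format Google circulated around the announcement, the directive sits in a googlebot meta tag with an expiry date; a sketch (the date itself is made up for illustration) would look something like:

    <meta name="googlebot" content="unavailable_after: 25-Aug-2007 15:00:00 EST">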

Webmaster Central Tools

Dan couldn't say enough good things about their Webmaster Central tools. I have to say that seems to be very common with all the Google reps I've heard speak at various conferences. The great thing is that they're not kidding! If you haven't tried the webmaster tools yet, you really should because they provide you with a ton of information about your site such as backward links, the keyword phrases with which people have found each page of your site, and much, much more!

Sitemaps Explored

One of the main tools in Webmaster Central is the ability to provide Google with an XML sitemap. Dan told us that a Google sitemap can be used to provide them with URLs that they would otherwise not be able to find because they weren't linked to from anywhere else. He used the term "walled garden" to describe a set of pages that are linked only to each other but not linked from anywhere else. He said that you could simply submit one of the URLs via your sitemap, and then they'd crawl the rest. He also talked about how sitemaps were good for getting pages indexed that could be reached only via webforms. He did admit later that even though those pages would be likely to be indexed via the sitemap, at this time they would still most likely be considered low quality since they wouldn't have any PageRank. Google is working on a way to change this in the future, however.
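
For reference, a minimal sitemap following the standard XML sitemap protocol, listing a single "walled garden" entry URL (the domain, path, and date are placeholders), looks like this:

    <?xml version="1.0" encoding="UTF-8"?>
    <urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
      <url>
        <loc>http://www.example.com/walled-garden/entry-page.html</loc>
        <lastmod>2007-07-01</lastmod>
      </url>
    </urlset>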

Flash and AJAX

Lastly, Dan mentioned that Google still isn't doing a great job of indexing content that is contained within Flash and/or AJAX. He said that you should definitely limit your use of these technologies for content that you want indexed. He provided a bit of information regarding Scalable Inman Flash Replacement (sIFR), and explained that when used in the manner for which it was intended, it's a perfectly acceptable solution for Google. (You can read more about sIFR here.) Dan said that Google does hope to do a better job of indexing the information contained in Flash at some point in the future.

The Q&A

Many of the points mentioned above were also covered in greater detail during Dan's extensive Q&A session. However, many additional enlightening tidbits were covered as well. For instance, Sherwood Stranieri from Catalyst Online asked about Google's new Universal Search, specifically as it applied to when particular videos (that were not served up from any Google properties) would show up in the main search results. Dan explained that in Universal Search, the videos that show up are the same ones that show up first when using Google's video search function.

The Dreaded Supplemental Results

Of course, someone just *had* to ask about supplemental results and what causes pages to be banished there. (This is one of the most common questions that I hear at all SEO/SEM conferences.) Dan provided us with some insights as to what the supplemental results were and how you could get your URLs out of them. He explained that basically the supplemental index is where they put pages that have low PageRank (the real kind) or ones that don't change very often. These pages generally don't show up in the search results unless there are not enough relevant pages in the main results to show. He had some good news to report: Google is starting to crawl the supplemental index more often, and soon the distinction between the main index and the supplemental index will be blurring. For now, to get your URLs back into the main results, he suggested more incoming links (of course!).

There was a whole lot more discussed, but I think this is enough to digest for now! All in all, my SEMNE co-founder Pauline and I were extremely pleased with how the night unfolded. We had a great turnout, met a ton of new contacts, caught up with a bunch of old friends, and received some great information straight from Google!


http://www.isedb.com/db/articles/1687/

Google Algorithm Update Analysis

Anybody who monitors their rankings with the same vigor that we in the SEO community do will have noticed some fairly dramatic shifts in the algorithm starting last Thursday (July 5th) and continuing through the weekend. Many sites are rocketing into the top 10, which, of course, means that many sites are being dropped at the same time. We were fortunate not to have any clients on the losing end of that equation; however, we have called and emailed the clients who saw sudden jumps into the top positions to warn them that further adjustments are coming. After a weekend of analysis, there are some curiosities in the results that simply require further tweaks in the ranking system.

This update seems to have revolved around three main areas: domain age, backlinks and PageRank.

Domain Age

It appears that Google is presently giving a lot of weight to the age of a domain and, in this SEO's opinion, disproportionately so. While the age of a domain can definitely be used as a factor in determining how solid a company or site is, there are many newer sites that provide some great information and innovative ideas. Unfortunately, a lot of these sites got spanked in the last update.

On this tangent, I have to say that Google's use of domain age as a whole is a good filter, allowing them to "sandbox" sites on day one to ensure that they aren't just being launched to rank quickly for terms. Recalling the "wild west days" of SEO, when ranking a site was a matter of cramming keywords into content and using questionable methods to generate links quickly, I can honestly say that adding this delay was an excellent step that ensured the benefits of pumping out domains became extremely limited. So I approve of domain age being used to value a site – to a point.

After a period of time (let's call it a year, shall we?), age should have, and generally has had, only a very small influence on a site's ranking, with the myriad of other factors overshadowing the site's whois data. This appears to have changed in the recent update, with age holding a disproportionate weight. In a number of instances, this has resulted in older, less qualified domains ranking higher than newer sites of higher quality.

This change in the ranking algorithm will most certainly be adjusted as Google works to maximize the searcher's experience. We'll get into the "when" question below.

Backlinks

The way that backlinks are being calculated and valued has seen some adjustments in the latest update as well. The way this has been done takes me back a couple of years to the more easily gamed Google of old. That statement alone reinforces the fact that adjustments are necessary.

The way backlinks are being valued appears to have lost some grasp on relevancy and placed more importance on sheer numbers. Sites with large, unfocused reciprocal link directories are outranking sites with fewer but more relevant links. Non-reciprocal links have lost the "advantages" that, until recently, they held over reciprocal links.

Essentially, the environment is currently such that Google has made itself more easily gamed than it was a week ago. In the current environment, building a reasonably sized site with a large reciprocal link directory (even an unfocused one) should be enough to get you ranking. For obvious reasons, this cannot (and should not) stand indefinitely.

PageRank

On the positive side of the equation, PageRank appears to have lost some of its importance, including the importance of PageRank as it pertains to the value of a backlink. In my opinion, this is a very positive step on Google's part and shows a solid understanding of the fact that PageRank means little in terms of a site's importance. That said, while PageRank is a less-than-perfect calculation, subject to much abuse and manipulation from those pesky people in the SEO community, it did serve a purpose, and while it needed to be replaced, it doesn't appear to have been replaced with anything of substantial value.

A fairly common belief has been that PageRank would be, or is being, replaced by TrustRank, and that Google would not give us a green bar to gauge a site's trust (good call, Google). With this in mind, one of two things has happened: either Google has decided that TrustRank is irrelevant and so is PageRank, and scrapped both (unlikely), or they have shifted the weight from PageRank to TrustRank to some degree and are just now sorting out the issues with their TrustRank calculations (more likely). Issues that may have existed with TrustRank may not have been clear due to its weight in the overall algorithm, and with this shift reducing the importance of PageRank, the issues that face the TrustRank calculations may well be becoming more evident.

In truth, the question is neither here nor there (as important a question as it may be). We will cover why this is in the ...

Conclusion

So what does all of this mean? First, it means that this Thursday or Friday we can expect yet another update to correct some of the issues we've seen arise out of the most current round. This shouldn't surprise anyone too much; we've been seeing regular updates out of Google quite a bit over the past few months.

But what does this mean regarding the aging of domains? While I truly feel that an aging delay or "sandbox" is a solid filter on Google's part, it needs to have a maximum duration. A site from 2000 is not, by default, more relevant than a site from 2004. After a year or so, the trust of a domain should hold steady or, at most, carry a very slight weight. This is an area in which we are very likely to see changes in the next update.

As far as backlinks go, we'll see changes in the way they are calculated unless Google is looking to revert back to the issues they had in 2003. Lower-PageRank, high-relevancy links will once again surpass high-quantity, less relevant links. Google is getting extremely good at determining relevancy, so I assume the current algorithm issues have more to do with the weight assigned to different factors than with an inability to properly calculate a link's relevancy.

And in regards to PageRank, Google will likely shift back slightly to what worked and give more importance to PageRank, at least while they figure out what went awry here.

In short, I would expect that with an update late this week or over the weekend we're going to see a shift back to last week's results (or something very close to them), after which they'll work on the issues they've experienced and launch a new (hopefully improved) algorithm shift the following weekend. And so, if you've enjoyed a sudden jump from page 6 to the top 3, don't pop the cork on the champagne too quickly, and if you've noticed some drops, don't panic. More adjustments to this algorithm are necessary, and if you've used solid SEO practices and been consistent and varied in your link-building tactics – keep at it and your rankings will return.


http://www.isedb.com/db/articles/1689/