Wednesday, June 6, 2007

So Google Has Feedburner, What Now?

According to TechCrunch.com, FeedBurner is in the final stages of a $100 million all-cash acquisition by Google - with little or no chance of the deal going south.

What is Feedburner?
"FeedBurner is the leading provider of media distribution and audience engagement services for blogs and RSS feeds. Our Web-based tools help bloggers, podcasters and commercial publishers promote, deliver and profit from their content on the Web."

In my words, FeedBurner is a central hub for hundreds of thousands of blog and podcast feeds owned by corporations, small businesses and individuals alike, who can use its software to publicize their feeds, monitor subscriptions and user interaction, and monetize their feeds.

How Might Owning Feedburner Help Google?
* An Edge Over Other Analytics Solutions: Google Analytics could gain a capability that few (if any) competing analytics programs offer - RSS feed statistics. One of FeedBurner's most alluring capabilities is its ability to track who subscribes to a feed and what those subscribers do (a rough sketch of how subscriber counting can work follows this list). Folding this into Google Analytics would give Google a powerful advantage over other analytics solutions.

* Monetization of Over 700,000 Feeds: FeedBurner has built a significant feed advertising solution where advertisers can pay per impression for visibility within a chosen marketplace. Publishers (those who create the feeds) can opt to monetize their feeds using FeedBurner's system. Google will now have access to this network, and over time I expect they will enhance it dramatically with the new and existing pay-per-click solutions that have made them so successful.

* More Visibility: FeedBurner's total feed count sits at 700,000 as I write this posting (according to FeedBurner). Now, a mere 700,000 may seem small in comparison to Google's massive reach, but when you consider that some feeds have hundreds of thousands of subscribers (see a list of the top 40 here), it becomes clear this acquisition offers some appealing numbers.
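
To make the analytics angle concrete, here is a minimal sketch of how feed-level subscriber counting can work. It assumes an ordinary web server access log and relies on the convention that large aggregators (Google's Feedfetcher, Bloglines and others) report how many readers they poll on behalf of in their User-Agent strings; the log file name and parsing details are illustrative, not FeedBurner's actual implementation.

```python
import re
from collections import defaultdict

# Aggregators such as Feedfetcher-Google advertise a reader count in their
# User-Agent, e.g. "Feedfetcher-Google; (+http://...; 3 subscribers; feed-id=...)".
SUBSCRIBER_RE = re.compile(r"(\d+)\s+subscribers", re.IGNORECASE)

def estimate_subscribers(log_lines):
    """Rough per-aggregator subscriber estimate taken from User-Agent strings."""
    counts = defaultdict(int)
    for line in log_lines:
        # In the common/combined log format the User-Agent is the last quoted field.
        if line.count('"') < 2:
            continue
        user_agent = line.rsplit('"', 2)[-2]
        match = SUBSCRIBER_RE.search(user_agent)
        if match:
            # Keep the highest count each aggregator reported in this log window.
            agent_name = user_agent.split(";")[0].strip()
            counts[agent_name] = max(counts[agent_name], int(match.group(1)))
    return counts

if __name__ == "__main__":
    with open("access.log") as log:   # hypothetical log file name
        for agent, subs in estimate_subscribers(log).items():
            print(f"{agent}: ~{subs} subscribers")
```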

So they did it again... Google has made yet another smart acquisition. If the report from TechCrunch is accurate, the acquisition is set to be completed within 2-3 weeks.

http://www.isedb.com/db/articles/1660/

Link Bait is the New Reciprocal Link

Link bait and reciprocal linking programs have quite a bit in common. Without disparaging anyone who engages in link baiting strategies, I do believe that many forms of link baiting are just a dressed-up version of reciprocal links without the reciprocation. Note that I said many forms. I don't mean all link baiting, but I would guess the bigger slice of the link baiting pie holds no more value than the old-fashioned reciprocal link schemes.

A brief history of reciprocal linking

We've all heard it said that reciprocal links are dead. Well, they're not. But some forms of reciprocal linking most certainly are. Remember when Google showed up and started evaluating incoming links as part of the ranking algorithm? This started everybody down the path of acquiring as many links as possible. For a while, the most efficient way to do this was to simply barter a link exchange. "Hey, if you link to me, I'll link to you!"

This started the trend of websites adding dozens of "reciprocal link" pages to their sites, linking out to other sites with which they were able to work out a link "partnership." And for a while, this type of reciprocation worked. At least until the search engines learned to identify these non-valuable pages. Of course this led to three- and four-way reciprocal linking tactics and the creation of link "directories." Again, these worked for a while and perhaps still do to a smidgen of a degree.

The basic problem here was one of motivation. It was all about getting a lot of links, many of which were largely irrelevant. It was link building based on volume and volume alone. Eventually, reciprocal linkers began to target their links a bit more selectively, staying more within the realm of relatedness, but the focus was still on links, not user value.

Along comes link baiting
When you hear people talk about link baiting today, it's almost as if we are in the early stages of the reciprocal link schemes all over again. It's all about link volume rather than link relevance. This isn't always the case, but for many, if not most, it is.

Rand has a good series of posts all about how to link bait the linkerati. Essentially, Rand explains how to attract links from the masses of people who are most likely to link to you, not necessarily the ones who are most interested in your product or service. I appreciate Rand's posts and I believe he, more than most, tries to get relevant links through his link bait, but getting links from linkers, rather than a targeted audience or related websites, is going right back to seeking out links for quantity rather than quality.

Like reciprocal linking, this will work. At least for a while. But sooner or later the search engines will get even smarter about analyzing links. Just as links from reciprocal link pages were deemed irrelevant, I believe that links from the "linkerati" will largely be deemed irrelevant as well. Search engines, in their quest to provide the best search results for each query, understand that popular doesn't necessarily mean relevant. Especially popularity among a group of people who are always interested in what's popular.

Just as the way sites earn quality links today isn't the same as the way links were earned several years ago, the link baiting of tomorrow isn't going to be the link baiting of today. True success in link baiting will come from those who learn how to bait their target audience. Some of that will include the linkerati, but most of it won't. The true skill of link baiting won't be getting to the top of Digg, but getting to the top of the minds of your target audience so they can't help but want to link to your site.

http://www.isedb.com/db/articles/1661/

Optimizing Content for Universal Search

By now, you’ve all heard about Google’s new Universal Search concept, which combines all the information within its vertical databases into one index to serve a single set of Web search results. As you can imagine, this will require some adjustments to standard search engine optimization techniques. If you have been following the Bruce Clay methodology, then you should already be on the right track to optimizing every aspect of your Web site that is under your control. With the arrival of universal search, it's not just a good idea; it's a necessity.

Google Vice President of Search Products and User Experience Marissa Mayer said the company’s goal for universal search is to create “a seamless, integrated experience to get users the best answers.” Mayer stated on the official Google blog that the universal search vision would be “one of the biggest architectural, ranking, and interface challenges” the search engine would face.

Mayer first suggested this concept to Google back in 2001. Since then, the company has been building the infrastructure, algorithms and presentation mechanisms needed to blend the different content from Images, Video, News, Maps, Blogs et al into its Web results. This is Google’s first step toward removing the partition that separates its numerous search silos, integrating these vast repositories of information into a universal set of search results. The object is to make queries more relevant for users, but what are the ramifications for SEO?

Google Relevancy Challenge
Based on industry research, Google has a relevancy problem because its database is too vast. Back in 2005, Jupiter Research touched on this, stating it had identified an opportunity for vertical search engines. The study found that general search engines were good at classifying vast amounts of information, but not very good at serving results that helped users make decisions.

A year later, Outsell came out with “Vertical Search Delivers What Big Search Engines Miss,” a study that also pointed to an opportunity for vertical search due to dissatisfaction with general search engines. This report published the oft-quoted figure that the average Internet search failure rate is 31.9 percent. The study identified two market trends contributing to the growth of vertical search – failed general searches and rising keyword prices in paid search.

Another noteworthy study was conducted by Convera. Over 1,000 online business users were asked about their search practices, successes, and failures. Only 21 percent of the respondents felt that their queries on general search engines were understood, and a mere 10 percent found critical information on the first try. This study concluded, “To date, professionals have not been adequately served by consumer search engines.”

The results of these studies show that Google and other general search engines are challenged to produce relevant results, suggesting vertical and niche search engines could eliminate such problems because the niche databases contain topic-specific information, serving targeted, more relevant answers to user queries.

Google’s Solution to Relevancy
Given Google’s move toward universal search, one can only assume it has considered the above problems and decided that pulling all its databases together, comparing and ranking them accurately at warp speed, could be the solution to relevancy. Doing this requires new technical infrastructure, including new algorithms, software and hardware, which Google has been working on since 2001 and is now in the process of implementing. Universal search has implications for search marketers because it is a departure from the uniformity that characterized search marketing in the past, and it requires adjustments in SEO methodology. Since the modifications will be implemented in steps, immediate changes in the SERPs won’t be obvious, and there is time to develop new optimization strategies.

Search Personalization
In addition to universal search, Google is also focusing on personalization in the SERPs. This means users will see different SERPs based on their previous queries if they are signed into their Google accounts. Users may or may not notice many changes due to universal search and personalization, depending on their level of sophistication and/or powers of observation. Marketers, however, will be scrambling. They will need to get their clients listed in as many niche databases as possible to increase the breadth of coverage for universal search. Social media optimization techniques can be used to enhance both universal and personalized search results.

Universal Search Optimization Strategies
The focus on personalization and universal search requires more emphasis on social media SEO strategies because of user interest in creating content and the vast amounts of new multimedia content created daily on the Web. Marketers are beginning to drive traffic via social networking sites, and these efforts are known to enhance search engine optimization campaigns. Strategies include creating multimedia content such as blogs, videos and podcasts, and then getting them listed on social search sites like Del.icio.us, Digg, Reddit and StumbleUpon, as well as niche search engines like Technorati, Podzinger and Blinkx.

When creating multimedia content, you must ensure that it is tagged and cataloged correctly. Multimedia content is optimized through established fundamental SEO techniques: creating keyword-rich, user-friendly content, writing unique meta tags, building good site navigation and structure, and implementing a successful linking strategy. Below are a few suggestions for creating and submitting multimedia content to several of Google’s vertical databases to gain extended reach through universal search.

Google Image Search: It has always been a good idea to use images on your site to illustrate your products and services. Now this becomes a way for customers to find your site via Google Image Search. Optimize your images with descriptive, keyword-rich file names and ALT tags, and use accurate descriptions of your image files for the benefit of the vision-impaired and others who may view the site with text only (a small audit sketch follows this list of verticals).

Google Video (beta): As with image optimization, use descriptive, keyword-rich file names for your video files. Also create a keyword-rich title tag, description tag, and video sitemap. Create a Web page to launch your video, optimizing its content for SEO and using anchor text wherever possible. Besides submitting to Google Video, also submit to Blinkx and other video search and sharing sites like YouTube and Podzinger (an audio and video search engine).

Google News: Here’s where you can submit your press releases for display as "news" and subsequent indexing. Issue press releases containing current information about new products and events your site is involved with, and Google News will likely pick them up.

Google Maps: This is also known as Google Local, a vertical that has been included in Google search results for a while. Give your site a local presence through the Google Maps Local Business Center where local businesses can get a free basic listing to extend their reach in the SERPs.

Google Blog Search (beta): You all have a corporate blog, right? This is how modern companies communicate with their customers and stakeholders. Tag it (Digg, del.icio.us, StumbleUpon, etc.), submit it to Google Blog Search, and extend your reach for Web searches on Google.
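
As a concrete companion to the image advice above, here is a minimal sketch of a page audit that flags images with missing or thin ALT text and non-descriptive file names. It assumes the third-party requests and BeautifulSoup libraries, and the URL and thresholds are illustrative assumptions, not anything Google prescribes.

```python
import os
from urllib.parse import urlparse

import requests
from bs4 import BeautifulSoup

def audit_images(page_url):
    """Flag <img> tags whose file names or ALT text are unlikely to help image search."""
    html = requests.get(page_url, timeout=10).text
    soup = BeautifulSoup(html, "html.parser")
    for img in soup.find_all("img"):
        src = img.get("src", "")
        alt = (img.get("alt") or "").strip()
        filename = os.path.basename(urlparse(src).path)
        problems = []
        if not alt:
            problems.append("missing ALT text")
        elif len(alt.split()) < 2:
            problems.append("ALT text is probably not descriptive")
        # File names like "IMG_0042.jpg" or "dsc123.gif" say nothing about the subject.
        stem = os.path.splitext(filename)[0]
        if len(stem) < 6 or stem.lower().startswith(("img", "dsc", "image")):
            problems.append("file name is not keyword-rich")
        if problems:
            print(f"{filename or src}: " + "; ".join(problems))

if __name__ == "__main__":
    audit_images("http://www.example.com/products.html")   # hypothetical URL
```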

In closing, there are many ways social and multimedia content can enhance your SEO efforts. Experiment and learn how to use social media to improve your rankings. As you become familiar with the many niche databases that accept multimedia content, submitting to them can go a long way toward gaining visibility through Google’s personalized and universal search.

http://www.isedb.com/db/articles/1664/1/Optimizing-Content-for-Universal-Search/Optimizing-Content-for-Universal-Search.html

Protect Your Ecommerce Customers From Identity Theft

There will be $24.3 billion in online transactions this holiday season; this will undoubtedly be accompanied by a corresponding rise in the number and variety of attacks against the security of online payment systems and against ecommerce consumers. Some of these attacks will exploit vulnerabilities that have come to light in third-party components used by websites, such as shopping cart software. Other fraud attempts will likely exploit vulnerabilities common to any web application, which can allow a knowledgeable hacker to penetrate the defenses of an ecommerce host webserver.

There are also the more popular electronic fraud techniques of "spoofing" and "phishing." This activity is on the rise, with growing numbers of phishers sharpening their talents and producing quite convincing e-mails and dummy notices from ecommerce websites; the number of consumers victimized this way is increasing rapidly. Ecommerce merchants have a clear mandate to exercise due diligence by doing everything possible to protect their hard-won customers from identity theft.

Know Your Software Vulnerabilities
There are a number of security vulnerabilities in some shopping cart and online payment systems that make them a target for sophisticated hackers. Being aware of these can help you take steps to secure your side of the customer's information pathway more effectively.
Some of these pitfalls are:

* SQL injection – this technique typically begins with a hacker sending the single-quote (') character as a test; this can yield a detailed error message that reveals the server's back-end technology and, in some programs, allows the attacker to access normally restricted areas of the server. In one well-known occurrence, a mid-level programmer in suburban California discovered how to uncover credit card numbers, transaction details, etc. from a number of ecommerce websites using crafted URLs containing meta-characters used in SQL. (The standard defense, parameterized queries, is sketched along with other countermeasures after this list.)
* Buffer overflow vulnerabilities – this involves sending in a large number of bytes to web applications that are not geared to deal with them. Sometimes a hacker can learn the path of the PHP functions being used on the ecommerce gateway server by sending in a very large value in the input fields and interpreting the resultant error message. The server error pages that are returned by these efforts can serve as a valuable source for critical information enabling the hacker to manipulate scripts for fraudulently obtaining customer financial information.
* Cross-site Scripting (XSS) Attack – This method targets a web form that takes in user input, processes it, and echoes the original input back into the resulting web page without sanitizing it. A cleverly crafted URL can then inject script that steals the user's cookie, which often contains the session ID and other sensitive information.
* Remote Command Execution – This powerful vulnerability occurs when a CGI script performs inadequate input validation, allowing the execution of operating system commands. It is most common with the use of the 'system' call in Perl and PHP scripts. A hacker using the command separator and other shell metacharacters can execute commands with the privileges of the web server process.
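
To make those defenses concrete, here is a minimal Python sketch of the standard countermeasures for the vulnerabilities above: parameterized SQL instead of string concatenation, a blunt input-length limit, HTML-escaping of echoed input, and running external commands without a shell. The table, field names, and invoice command are hypothetical, and a real shopping cart would layer further validation on top of this.

```python
import html
import sqlite3
import subprocess

MAX_FIELD_LENGTH = 256   # arbitrary cap; reject oversized input instead of choking on it

def find_orders(conn: sqlite3.Connection, customer_email: str):
    """SQL injection defense: bind user input as a parameter, never concatenate it."""
    if len(customer_email) > MAX_FIELD_LENGTH:
        raise ValueError("input too long")   # blunt guard against oversized input
    cursor = conn.execute(
        "SELECT order_id, total FROM orders WHERE email = ?",   # hypothetical schema
        (customer_email,),
    )
    return cursor.fetchall()

def echo_search_term(term: str) -> str:
    """XSS defense: escape user input before writing it back into a page."""
    return f"<p>You searched for: {html.escape(term)}</p>"

def generate_invoice_pdf(order_id: int) -> None:
    """Command execution defense: pass arguments as a list, never through a shell."""
    # Shell metacharacters can't break out here because no shell is ever invoked.
    subprocess.run(["/usr/local/bin/make_invoice", str(int(order_id))], check=True)
```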


Guide Your Customer To Safety
The aforementioned "phishing" and "spoofing" scams rely on clever copies of an ecommerce website that request "password validation" or "account updating" and link to a dummy page that captures the financial information the hapless victim has been duped into entering. Another ruse is to have a pop-up form requesting a username and password appear over a legitimate ecommerce website page. These all play on consumer unawareness and inattention.

The primary responsibility that any ecommerce merchant has is to caution the customer to be wary, help them recognize a false ecommerce webpage, and give those customers a way to verify the credibility of any communications they may receive that appear to represent the ecommerce store.

There are new fraud techniques appearing every week. The goal of any reputable ecommerce entrepreneur should be to give the customer the most rewarding, satisfying, and above all safe online shopping experience.

http://www.isedb.com/db/articles/1663/

Entrepreneurial Advice: Ask an Expert

As a lifestyle, entrepreneurialism has always demanded that the self-employed be prepared to wear many hats. Business owners often perform executive, sales, managerial, production and janitorial tasks on a daily basis, partly out of pride of ownership and a sense of excellence, and partly because technology allows them to. While this gives owners a better sense of all aspects of their businesses, it can also cloud their critical judgment and decision-making abilities.

I was once asked to review a new website for an online business. The site took its owner several years to complete. He made it while working two other jobs in order to fund the creation of his own business. Building the website helped him focus on his dream, keeping the spirit alive during those long years of saving money from double shift days with nights and weekends spent on the business. Needless to say, the site is massive. Unfortunately it’s also kinda clunky.

A good SEO is an honest SEO, though sometimes one’s honesty is a bear trap waiting to bite you. The site, though it was a labour of love, was not ready for primetime, and I had to tell him so. Knowing his business back story didn’t help matters. I hoped he hadn’t quit his day job. Needless to say, the experience left a mark on me. Unfortunately, it was kinda clunky.

The person was a dedicated do-it-yourselfer. Naturally he defended his work, saying he had read articles by several SEO experts, including my own, and applied the ideas throughout the site. I responded that some of what he had done was OK, but overall there were so many unique issues that a simple review would not have been useful to him. Eventually, this story came to a happier ending than it could have. He ended up redoing large sections of his website based on suggestions he talked me and other SEOs into giving him. Thousands of others aren’t so lucky.

In a similar vein, yesterday while interviewing Neil Patel on WebmasterRadio our conversation turned toward the need to hire social media optimizers to keep up with the ever-expanding social networking space. I found myself wondering how a self-employed business person, who also doubles as their own chief bottle washer, makes informed decisions about where to spend a relatively small marketing budget, especially when they are just starting their online business.

Recalling my experience with the emerging entrepreneur and his five-year-old dream of building a supply business online, I often wonder how business people new to the web make critical decisions in the first place.

Why an online business? Who told them an online business was a good idea in the first place? Where do they get their initial information from? Are local business centers adequately prepared to help set up online businesses? How complete or up to date are their sources? Are there secondary advisors or professional mentors helping them make sense of the information available to them on and off-line? Is it an SEO’s responsibility to act as online business coach if the business clearly needs it?

The advantages of using the Internet as a means of communication are obvious. The ways the Internet has changed entrepreneurs and their decision making processes is somewhat more subtle. I’m not sure I’ll ever fully understand that.

For the past fifteen years, the Internet has provided unprecedented opportunities for individuals to create and manage their own businesses. Many online businesses have been created by people who, in most cases, would otherwise not have found a way to satisfy their dreams of self employment. My first attempts at entrepreneurialism nearly two decades ago were both almost dead before they started due to the enormous entry costs associated with opening a brick-and-mortar business.

It is hardly surprising that fifteen years on, the Internet economy is doing pretty well. What is surprising is that it has done so well even as most of us participating in it have been making it up as we’ve gone along. Along with unprecedented opportunities, the Internet has brought us unprecedented situations.

Twenty years ago, nobody could have envisioned a better advertising system than television. This year, more money will be invested in online advertising than will be spent on television ads. Though a major chunk of the incoming cash comes from Madison Avenue, the bulk of the “new” online money comes from the small to micro-business sector.

I hold huge respect for small businesses and the people who run them. It’s a tough and sometimes terrifying reality of long hours and sleepless nights. It takes a lot of courage to start one’s own business, but more than courage, it takes a lot of knowledge. Making critical decisions about something as detailed as the structure of an online business is tough and, even for experienced webmasters, it isn’t getting easier. In most communities there are small business support systems such as a Chamber of Commerce or non-profit community and business development organizations. It would be good to know those organizations were able to offer fundamental information to emerging online entrepreneurs.

http://www.isedb.com/db/articles/1665/

Changes to Google Webmaster Guidelines

Not only has Matt Cutts offered up some insight on getting out of Google's supplemental index, Google has also updated the "quality guidelines" area of their Webmaster Guidelines, providing a one-two punch for search marketers obsessed with Google's stance on all things optimization.

The main difference in the Google Webmaster Guidelines is that Google is now offering up more information on many of the topics mentioned. (In other words, the rules are still mostly the same; Google is just giving you more information about them.)

For example, if you've ever wondered what Google means by "no cloaking or sneaky redirects" you now have quick and easy access to a full answer.

When Googlebot indexes a page containing Javascript, it will index that page but it cannot follow or index any links hidden in the Javascript itself. Use of Javascript is an entirely legitimate web practice. However, use of Javascript with the intent to deceive search engines is not. For instance, placing different text in Javascript than in a noscript tag violates our webmaster guidelines because it displays different content for users (who see the Javascript-based text) than for search engines (which see the noscript-based text). Along those lines, it violates the webmaster guidelines to embed a link in Javascript that redirects the user to a different page with the intent to show the user a different page than the search engine sees. When a redirect link is embedded in Javascript, the search engine indexes the original page rather than following the link, whereas users are taken to the redirect target. Like cloaking, this practice is deceptive because it displays different content to users and to Googlebot, and can take a visitor somewhere other than where they intended to go.

It's also worth noting that Google has now included "official" statements on why they want webmasters to report paid links.

However, some SEOs and webmasters engage in the practice of buying and selling links, disregarding the quality of the links, the sources, and the long-term impact it will have on their sites. Buying links in order to improve a site's ranking is in violation of Google's webmaster guidelines and can negatively impact a site's ranking in search results.

Not all paid links violate our guidelines. Buying and selling links is a normal part of the economy of the web when done for advertising purposes, and not for manipulation of search results. Links purchased for advertising should be designated as such. This can be done in several ways, such as:

* Adding a rel="nofollow" attribute to the <a> tag
* Redirecting the links to an intermediate page that is blocked from search engines with a robots.txt file

Google works hard to ensure that it fully discounts links intended to manipulate search engine results, such as link exchanges and purchased links. If you see a site that is buying or selling links, let us know by clicking Report spam in the index under the Tools menu in Webmaster Tools. We'll use your information to improve our algorithmic detection of paid links.

Interesting. Basically, Google is saying that if you sell links and refuse to nofollow them, then you're putting yourself (and your advertisers) at risk.

Somehow, I don't think that's going to go over well.
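
For publishers who do sell clearly labelled ad links, here is a minimal sketch of one way to comply with the first option: add rel="nofollow" to every link inside a designated advertising block before the page is published. It assumes the BeautifulSoup library and a hypothetical "sponsored" CSS class; the other option Google mentions, routing paid links through an intermediate page blocked in robots.txt, works just as well.

```python
from bs4 import BeautifulSoup

def nofollow_sponsored_links(page_html: str) -> str:
    """Add rel="nofollow" to every link inside elements marked as advertising."""
    soup = BeautifulSoup(page_html, "html.parser")
    # "sponsored" is a hypothetical class name used here to mark paid placements.
    for ad_block in soup.find_all(class_="sponsored"):
        for link in ad_block.find_all("a", href=True):
            rel = set(link.get("rel") or [])
            rel.add("nofollow")
            link["rel"] = sorted(rel)
    return str(soup)

if __name__ == "__main__":
    sample = '<div class="sponsored"><a href="http://advertiser.example">Buy widgets</a></div>'
    print(nofollow_sponsored_links(sample))
    # The paid link comes back out carrying rel="nofollow".
```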

http://www.searchengineguide.com/searchbrief/senews/010139.html