Wednesday, September 29, 2010

Google Instant: The Impact On Paid Search

Google Instant rolled out September 8th to much fanfare and ballyhoo. The search marketing industry has been abuzz ever since with speculation about the impacts on both paid and natural search.
Our firm has taken a pretty close look at the initial impact on paid search performance and we want to share our findings with you folks.
Methodology
We studied Google AdWords data from the period prior to the launch of Instant and compared it to the first week-plus following the launch for a wide range of clients. Our client base is heavily retail, so those in other sectors may have different findings.
We looked at the impacts both in aggregate and by advertiser to see if averages hid meaningful shifts. We looked exclusively at data from competitive, non-brand search terms.
We tried to answer the following questions:
• What impact has Google Instant had on impressions and clicks on paid search ads overall?
• Does Instant create a greater emphasis on ads served at the top of the page and diminish the traffic on ads served closer to the bottom of the page? Some speculated that Instant would bias users against scrolling and effectively increase the incentive for higher positioning.
• Does Instant help or hurt the long tail of paid search? Some have wondered if watching the results change as you type would encourage users to keep typing as results get more and more targeted, or on the contrary encourage more to stop early and click on the first mildly relevant link.
• Does Instant help or hurt conversion rates? The stated goal of the product is to get people where they want to go faster. Does it also help them find more relevant ads?
• Does Instant impact some types of keywords more than others?
Findings
Question #1: Overall traffic. Initially following the Google Instant launch we did see a small spike in Google impressions and clicks, both week over week and relative to our traffic levels on Bing and Yahoo. It should be noted that part of the week-over-week increase benefited from favorable comps to a slow Labor Day weekend, while, even at its peak following the launch, Google’s traffic share just slightly cracked two percentage points above its 30-day average.


We’re not sure whether the media coverage around the Instant launch delivered a genuine lift in impressions and even click traffic, or whether this blip is more related to roll-out glitches. Whether Instant will garner Google even greater market share, or will ultimately prove little more than an upgrade to the suggest function, remains to be seen.
Question #2: Top vs. bottom.
We haven’t seen a large shift in traffic composition between ads at the top of the page and those at the bottom for most advertisers.
We looked at a handful of our larger accounts and measured the baseline fraction of traffic that comes in from ads in the top 3 positions to see if that fraction has materially increased.

As the chart shows, the median shifts aren’t zero, but they aren’t huge either. The median advertiser saw a slight dip in the number of ads in the top 3 positions—likely unrelated to Google Instant—but a slight increase in the fraction of click traffic those high-positioned ads represent. Very interestingly, the fraction of orders increased slightly more than the fraction of clicks, possibly indicating improved conversion; it is not crazy to suggest that the people who stop typing early to select an ad do so precisely because they’ve read the ad more carefully than the average searcher has historically.
On the flip side, have we seen a relative decrease in the importance of ads in positions 8 through 12? It bears mentioning that since these already account for a fairly small portion of the traffic for most advertisers, the percentage changes are easier to influence materially. For example, if these low-positioned ads usually account for 5% of traffic and that drops to 4%, that shows up as a 20% decline in the table below.

Given the spikiness of this data set (particularly the order counts), these numbers aren’t alarming either.
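To make that base-rate effect concrete, here is a quick sketch using the hypothetical 5%-to-4% numbers from above:

```python
# Hypothetical shares: low-position ads fall from 5% of clicks to 4%.
# A one-point absolute drop in a small base reads as a large relative decline.
before_share = 0.05
after_share = 0.04

absolute_change = after_share - before_share      # one percentage point
relative_change = absolute_change / before_share  # a 20% relative decline
print(f"absolute: {absolute_change:+.1%}  relative: {relative_change:+.0%}")
```

This is why the low-position percentages in the table should be read with the underlying traffic volumes in mind.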
Question #3: Head vs. tail. This is a bit more interesting. Here we picked an arbitrary number of clicks as defining a “head” term, and looked to see whether these head terms had become more or less important as a percentage of the whole.

The first observation is that the number of keywords meeting the definition of a head term increased for most advertisers. That could simply be a seasonal increase in traffic, but in two-thirds of the cases studied the increase in head ads was greater than could be explained by sheer traffic volume.
These ads did drive a larger fraction of the total clicks and sales for most advertisers in the week after Google Instant launched, but given the increase in the number of ads, this by itself doesn’t mean the tail has been significantly diminished.
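The head/tail split described above can be sketched in a few lines; the click counts and the 500-click threshold here are invented for illustration:

```python
# Classify keywords as "head" or "tail" by an arbitrary click threshold,
# then measure what share of total clicks the head terms drive.
clicks = {
    "shoes": 5000,                  # head term
    "red shoes": 900,               # head term
    "red suede shoes size 9": 12,   # tail term
}
HEAD_THRESHOLD = 500  # arbitrary cutoff, as described in the text

head = {kw: c for kw, c in clicks.items() if c >= HEAD_THRESHOLD}
head_share = sum(head.values()) / sum(clicks.values())
print(f"head terms: {len(head)}, share of clicks: {head_share:.1%}")
```

Running the same split on the before and after periods, and comparing both the count of head terms and their share of clicks, is the comparison made above.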
Question #4: Conversion rates. Early, early indications are that conversion rates for both head and tail may have improved as a result. We’re not ready to proclaim victory on this front just yet, and we’re not talking about a large change, but we’re seeing some indications that this could create a bit of a virtuous cycle.
Question #5: Keyword-level effects. While there may not be a sea-change in overall performance, there are some interesting keyword-level effects that can be tremendously important for some advertisers.
For example, we’ve noticed a huge shift with respect to treatment of singulars and plurals, with the more popular of the two seeming to become the default. For advertisers with tremendous volumes of business tied to a handful of terms, this can be a very big deal.
For several advertisers, ads running on competitors’ trademarks and domain names seem to have dropped off the map. This may be an Instant effect, or possibly a change to Quality Score algorithms. We haven’t studied this comprehensively enough to guess whether it is universal.
We’ve also noticed some odd effects on keywords that have other completely unrelated meanings. As an example: if you type in “toothpaste” all the results are geared towards “toothpaste for dinner” with no ads showing. This is true even after you’ve typed the whole word with a space after it. When you hit “enter” the ads for toothpaste appear.
The reverse of this is what Glenn Edelman of Wine Enthusiast refers to as “short typing.” You’re looking for “Wine Enthusiast” and you get to “Wine En” and see the results you want. You reflexively hit enter and Google now brings back results for “wine en” which turn out to be different. We’re not sure this is a large enough effect to worry about, but if it is, please refer to it as “short typing” and credit Glenn!
We strongly encourage everyone to run a keyword-level report from the period before Google Instant launched and compare it to a similar report for the period after. Use a VLOOKUP to match each keyword’s post-launch traffic volume to the pre-launch volume on your top keywords, and vice-versa. The results are fascinating!
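For those who prefer a script to a spreadsheet, the same two-way VLOOKUP can be sketched in a few lines of Python. The keyword data here is invented for illustration:

```python
# Invented click counts per keyword, as exported from two keyword-level
# reports: one for the period before the Instant launch, one after.
before = {"wine enthusiast": 900, "wine rack": 400}
after = {"wine enthusiast": 700, "wine rack": 410, "wine en": 120}

# Match every keyword seen in either period (the "and vice-versa" part),
# filling in 0 where a keyword didn't appear at all.
for kw in sorted(set(before) | set(after)):
    b, a = before.get(kw, 0), after.get(kw, 0)
    note = "new" if b == 0 else "gone" if a == 0 else f"{(a - b) / b:+.0%}"
    print(f"{kw:18s} before={b:4d} after={a:4d} {note}")
```

Keywords flagged "new" or "gone" are exactly the singular/plural and short-typing effects worth investigating.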
Conclusions
At this point, we don’t see cause for massive alarm for most advertisers in paid search. I’d love to see more data from the SEO community to see if the impact is more pronounced there.

Friday, July 9, 2010

Paid Search Ads: Tips & Tricks For Creative Ad Copywriting

Tips & Tricks For Creative Ad Copywriting
Create simple, enticing ads. What makes your product or service stand out from your competitors? Highlight these key differentiating points in your ad. Be sure to describe any unique features or promotions you offer.
Include prices and promotions. The more information about your product that a user can gain from your ad text, the better. For example, if a user sees the price of a product and still clicks the ad, you know they’re interested in a potential purchase at that price. If they don’t like the price, they won’t click your ad, and you save yourself the cost of that click.

Use a strong call-to-action.
Your ad should convey a call-to-action along with the benefits of your product or service. A call-to-action encourages users to click on your ad and ensures they understand exactly what you expect them to do when they reach your landing page. Some call-to-action phrases are Buy, Purchase, Call today, Order, Browse, Sign up, and Get a quote; while ‘find’ and ‘search’ may be accurate verbs, they imply that the user is still in research mode, and may not encourage the user to perform the action you’d most like them to take.

Include one of your keywords in your ad text.
Find the best performing keyword in your ad group and include it in your ad text, especially in the title. Whenever a user types that keyword and sees your ad, the keyword phrase will appear in bold font within your ad on Google. This helps draw the user’s attention to your ad and shows users that your ad relates to their search.
Choose the best destination URL. Review the website you’re advertising and find the specific page that has the information or product described in your ad. If users do not find what is promised as soon as they arrive, they are more likely to leave your website. Be sure that any promotions and particular products mentioned in your ad are visible on your landing page.

Test multiple ads in each ad group.
Experiment with different offers and call-to-action phrases to see what’s most effective for your advertising goals. Our system automatically rotates ads within an ad group and shows the better-performing ad more often.


Remember the intent!
You should have a pretty good idea of where in the buying cycle searchers are when they use specific keywords. Are they at the top of the funnel or ready to buy at the bottom? Make sure you address user intent in your creative. If they’re just looking for information, your ad should reflect that—you’ll write a quite different ad for people ready to buy than for people simply browsing for the best deal.

Stand out from the crowd.
If all of your competitors are touting price, try a different tactic such as listing the benefits from your product/service or some awards you may have won. Eyeballs are drawn to differences on the search result page, not similarities.

If you’re all about price, stay on top of your competitors.

If your entire strategy revolves around an offer, such as a percentage off or a low-cost deal, then you must monitor your keyword landscape closely. If your price is at $20, someone might come in at $19.99 and make off with your customers.

Include local terms for your top markets.

If you’re a local business, this is a no-brainer. Make sure someone who is nearby realizes that you are too. It immediately creates a trust factor if they know they can drive out to see you. For national (or international) advertisers, you can let potential customers know that you have a focus in their area and that you are addressing their specific needs. Make sure, however, to back this up on the landing page. Many large advertisers split up their campaigns by region, state or even DMA. That can mean a lot of unique landing pages, but the increased conversion rate due to the relevancy factor may justify the cost.

“Official site” is golden.

If you are indeed the official site for your product or service, using the words “official site” works really well. I haven’t seen any research for it, but I’ve used this tactic time and again for clients, and it’s never let me down.
“Free shipping” is also a crowd pleaser. This is something you definitely need to test, but it seems to work really well as a general marketing technique, especially on the internet, where shipping costs sometimes negate the benefit of ordering direct. But don’t go this route unless you’ve done the math to make sure free shipping won’t be too costly for you, either.

Use www. in your display URL.

Including “www” in the display URL tends to boost clickthrough rates. 80.6% of the ads in our database also included it, leading us to believe that this is a general rule. If you are among the nearly 20% of advertisers not including “www” in your URL, you should consider testing it.

Use language that turns away the “wrong” users.
Not every paid search click is a valuable one. Make sure to spot trends and tailor your ad copy accordingly. For example, if you sell car parts in bulk, and find that your ads are attracting users who just want to buy one or two individual parts, you may want to indicate that you only sell in large quantities. Remember, you pay per click, so getting the wrong users to not click your ads is as important as getting the right ones to do so.

Link Building Best Practices - A Guide to Effective Link Building Methods

Link building is the process of building backlinks to your site. The more backlinks (links from other sites to your site) a site has, the higher it ranks on the search engines. Link building is therefore a mission of all money-making webmasters.


Link building methods have changed from time to time and a specific method that worked yesterday may not work today. This is due to changes in the industry over time and in some cases extreme abuse. If a specific link building method has been abused too much, then that method will not work anymore.

It is therefore very important to know which link building methods work today so that you can spend your time in the most productive way.

There are many link building methods out there today, some working better than others. Below are some of today’s popular link building methods:

Article Submission:-

If you are reading this article, then you know that you can publish your articles in article directories. Not only do you gain backlinks to your site from your published articles (see the bottom of this article), you will get some traffic as a bonus. After all, you are reading this article right now; many people might be reading your published articles as well.

Article submission is a great way of building backlinks as it provides you with 100% relevant contextual backlinks that Google loves.

Directory Submission:-

Directory submission used to work a lot better than it does today, but it is still a popular link building method that works if done properly. How many directories are out there as of this date is anyone’s guess, but one thing is certain: most of them aren’t worth submitting to.

Google considers a Yahoo Directory link a quality backlink, so if you can afford and justify the cost, it is worth submitting to the Yahoo Directory. Yahoo charges $300 for a yearly submission, and there is no assurance that your site will be approved!

DMOZ is another directory worth submitting to and can provide great benefit to your SEO campaign. It may take months to get approval from DMOZ, and the chance of getting approved is pretty slim. Many small directories use the DMOZ directory categories, so getting listed with DMOZ means bonus listings on many other web directories.

Another good directory to submit to is the BOTW directory, which costs $99 for a yearly submission and $299 for a permanent listing.

Other than the above, you should look for quality directories where you can submit your site. You can judge the quality of the directories by analyzing the number of sites listed, number of backlinks the directory has, PR, age of the directory, etc.

Social Bookmarking:-

Social bookmarking worked like a charm only a few months ago. If you bookmark on authority and quality bookmarking sites like Digg, Mixx, Propeller, etc., then you can still make good use of social bookmarking. Other than backlinks, social bookmarking also offers you some bonus traffic, depending on where you submit. If you can manage to put together or collect a good list of bookmarking sites, then social bookmarking can still be a useful link building method.

Blog Commenting:-

This is a link building method where a lot of spamming has already been done, so to make the best of it you need to work a little harder. It is best to find quality blogs related to your category and make on-topic, relevant comments. Not only do your comments add value to the blog post, you have a greater chance of getting your comment approved, and your comments have a greater chance of staying on those blog posts.

Press Releases:-

Submitting to press release sites can get you some backlinks as well. It is, however, hard to put together a list of good press release sites that are worth submitting to. If a good list can be managed, then this method can yield some quality, relevant backlinks as well.

Social Media and Web 2.0 Pages:-

There are a lot of quality social media and Web 2.0 sites where you can publish your articles for backlinks. You should write articles that are relevant to your site and publish them on these social media and Web 2.0 sites with your keywords hyperlinked to your site. Some of the authority sites are Squidoo, Hubpages, Blogger, Wordpress.com, etc.

Social media and Web 2.0 links are very popular these days because they work great. They give quality, relevant contextual backlinks that Google and other major search engines love. There are lots of other link building methods out there, but if you can utilize the above ones to their fullest potential, there is no need for any other methods.

SEO Worst Practices

SEO Worst Practices:-
1. Do you use pull-down boxes for navigation? Search engine spiders can’t fill out forms, even short ones with just one pull-down. Thus, they can’t get to the pages that follow. If you’re using pull-downs, make sure there is an alternate means of navigating to those pages that the spiders can use. Note this is not the same as a mouseover menu, where sub-choices show up upon hovering over the main navigation bar; that’s fine if done using CSS (rather than Javascript.)
2. Does your primary navigation require Flash, Java or Javascript? If you rely on search engine spiders executing Flash, Java or Javascript code in order to access links to deeper pages within your site, you’re taking a big risk. The search engines have a limited ability to deal with Flash, Java and Javascript. So the links may not be accessible to the spiders, or the link text may not get associated with the link. Semantically marked up HTML is always the most search engine friendly way to go.
3. Is your site done entirely in Flash or overly graphical with very little textual content? Text is always better than graphics or Flash animations for search engine rankings. Page titles and section headings should be text, not graphics. The main textual content of the page should ideally not be embedded within Flash. If it is, then have an alternative text version within div tags and use SWFObject to determine whether that text is displayed based on whether the visitor has the Flash plugin installed.
4. Is your home page a “splash page” or otherwise content-less? With most websites, as mentioned above, the home page is weighted by the search engines as the most important page on the site (i.e., given the highest PageRank score.) Thus, having no keyword-rich content on your home page is a missed opportunity.
5. Does your site employ frames? Search engines have problems crawling sites that use frames (i.e., where part of the page moves when you scroll but other parts stay stationary). Google advises not using frames: “Frames tend to cause problems with search engines, bookmarks, emailing links and so on, because frames don’t fit the conceptual model of the Web (every page corresponds to a single URL).” Furthermore, if a frame does get indexed, searchers clicking through to it from search results will often find an “orphaned page”: a frame without the content it framed, or content without the associated navigation links in the frame it was intended to display with. Often, they will simply find an error page. What about “iFrames”, you ask? iFrames are better than frames for a variety of reasons, but the content within an iframe on a page still won’t be indexed as part of that page’s content.
6. Do the URLs of your pages include “cgi-bin” or numerous ampersands? As discussed, search engines are leery of dynamically generated pages. That’s because they can lead the search spider into an infinite loop called a “spider trap.” Certain characters (question marks, ampersands, equal signs) and “cgi-bin” in the URL are sure-fire tip-offs to the search engines that the page is dynamic and thus to proceed with caution. If the URLs have long, overly complex “query strings” (the part of the URL after the question mark), with a number of ampersands and equals signs (which signify that there are multiple variables in the query string), then your page is less likely to get included in the search engine’s index.
7. Do the URLs of your pages include session IDs or user IDs? If your answer to this question is yes, then consider this: search engine spiders like Googlebot don’t support cookies, and thus the spider will be assigned a new session ID or user ID on each page on your site that it visits. This is the proverbial “spider trap” waiting to happen. Search engine spiders may just skip over these pages. If such pages do get indexed, there will be multiple copies of the same pages, each taking a share of the PageRank score, resulting in PageRank dilution and lowered rankings. If you’re not quite clear on why your PageRank scores will be diluted, think of it this way: Googlebot will find minimal links pointing to the exact version of a page with a particular session ID in its URL.
8. Do you unnecessarily spread your site across multiple domains? This is typically done for load balancing purposes. For example, the links on the JCPenney.com home page point off to www2.jcpenney.com, or www3.jcpenney.com, or www4.jcpenney.com and so on, depending on which server is the least busy. This dilutes PageRank in a way similar to how session IDs in the URL dilute PageRank.
9. Are your title tags the same on all pages? Far too many websites use a single title tag for the entire site. If your site falls into that group, you’re missing out on a lot of search engine traffic. Each page of your site should “sing” for one or several unique keyword themes. That “singing” is stifled when the page’s title tag doesn’t incorporate the particular keyword being targeted.
10. Do you have pop-ups on your site? Most search engines don’t index Javascript-based pop-ups, so the content within the pop-up will not get indexed. If that’s not good enough reason to stop using pop-ups, you should know that people hate them – with a passion. Also consider that untold millions of users have pop-up blockers installed. (The Google Toolbar and Yahoo Companion toolbar are pop-up blockers, too, in case you didn’t know.)
11. Do you have error pages in the search results (“session expired” etc.)? First impressions count . . . a lot! So make sure search engine users aren’t seeing error messages in your search listings. Hotmail took the cake in this regard, with a Google listing for its home page that, for years, began with: “Sign-In Access Error.” Not exactly a useful, compelling or brand-building search result for the user to see. Check to see if you have any error pages by querying Google, Yahoo and Bing for site:www.yourcompanyurl.com. Eliminate error pages from the search engine’s index by serving up the proper status code in the HTTP header (see below) and/or by including a meta robots noindex tag in the HTML.
12. Does your “file not found” error page return a 200 status code? This is a corollary to the tip immediately above. Before the content of a page is served up by your Web server, an HTTP header is sent, which includes a status code. A status code of 200 is what’s usually sent, meaning that the page is “OK.” A status code of 404 means that the requested URL was not found. Obviously, a “file not found” error page should return a 404 status code, not a 200. You can verify this with a server header checker: enter a bogus URL at your domain, such as http://www.yourcompanyurl.com/blahblah, into its form. An additional, and even more serious, consequence of returning a 200 for URLs that are clearly bogus is that your site will look less trustworthy to Google (Google does check for this). Note that other error status codes may be more appropriate than a 404 in certain circumstances, such as a 403 if the page is restricted or a 500 if the server is overloaded and temporarily unavailable; a 200 (or a 301 or 302 redirect that points to a 200) should never be returned, regardless of the error, to ensure the URL with the error does not end up in the search results.
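The same check can be scripted instead of using a web-based header checker. A minimal sketch using Python’s standard library (the domain is a placeholder; substitute your own):

```python
# Fetch a clearly bogus URL and report the status code the server returns.
# A proper "file not found" page returns 404; a 200 here is a "soft 404".
from urllib.request import urlopen
from urllib.error import HTTPError, URLError

def status_of(url):
    """Return the HTTP status code for url, even on error responses."""
    try:
        return urlopen(url, timeout=10).status
    except HTTPError as e:
        return e.code   # urllib raises on 4xx/5xx; the code rides on the error
    except URLError:
        return None     # DNS failure or unreachable server

code = status_of("http://www.yourcompanyurl.com/blahblah")
if code is None:
    print("could not reach the server")
elif code == 200:
    print("problem: bogus URLs return 200 (soft 404)")
else:
    print(f"OK: server returned {code}")
```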
13. Do you use “click here” or other superfluous copy for your hyperlink text? Wanting to rank tops for the words “click here,” eh? Try some more relevant keywords instead. Remember, Google associates the link text with the page you are linking to, so make that anchor text count.
14. Do you have superfluous text like “Welcome To” at the beginning of your title tags? No one wants to be top ranked for the word “welcome” (except maybe the Welcome Inn chain!) so remove those superfluous words from your title tags!
15. Do you unnecessarily employ redirects, or are they the wrong type? A redirect is where the URL changes automatically while the page is still loading in the user’s browser. Temporary (status code of 302) redirects — as opposed to permanent (301) ones — can cost you valuable PageRank. That’s because temporary redirects don’t pass PageRank to the destination URL. Links that go through a click-through tracker first tend to use temporary redirects. Don’t redirect visitors when they first enter your site at the home page; but if you must, at least employ a 301 redirect. Whether 301 or 302, if you can easily avoid using a redirect altogether, then do that. If you must have a redirect, avoid having a bunch of redirects in a row; if that’s not possible, then ensure that there are only 301s in that chain. Most importantly, avoid selectively redirecting human visitors (but not spiders) immediately as they enter your site from a search engine, as that can be deemed a “sneaky redirect” and can get you penalized or banned.
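To audit a redirect chain yourself, fetch each hop without auto-following redirects and record its status code. This is a sketch using Python’s standard library (HTTP only, HEAD requests; the starting URL in the commented example is a placeholder):

```python
# Walk a redirect chain hop by hop, recording each status code, so that a
# 302 sitting where a 301 belongs is easy to spot.
import http.client
from urllib.parse import urljoin, urlsplit

def redirect_chain(url, max_hops=5):
    hops = []
    for _ in range(max_hops):
        parts = urlsplit(url)
        conn = http.client.HTTPConnection(parts.netloc, timeout=10)
        conn.request("HEAD", parts.path or "/")
        resp = conn.getresponse()
        hops.append((resp.status, url))
        conn.close()
        if resp.status not in (301, 302, 303, 307, 308):
            break
        # The Location header may be relative; resolve it against the
        # current URL before following.
        url = urljoin(url, resp.getheader("Location"))
    return hops

# Example (requires network access):
# for status, url in redirect_chain("http://example.com/old-page"):
#     print(status, url)  # any 302 in the chain is worth fixing
```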
16. Do you have any hidden or small text meant only for the search engines? It may be tempting to obscure your keywords from visitors by using tiny text that is too small for humans to see, or as text that is the same color as the page background. However, the search engines are on to that trick.
17. Do you engage in “keyword stuffing”? Putting the same keyword everywhere, such as in every ALT attribute, is just asking for trouble. Don’t go overboard with repeating keywords or adding a meta keywords tag that’s hundreds of words long. (Why even have a meta keywords tag? They don’t help with SEO, they only help educate your competitors on which keywords you are targeting.) Google warns not to hide keywords in places that aren’t rendered, such as comment tags. A good rule of thumb to operate under: if you’d feel uncomfortable showing to a Google employee what you’re doing, you shouldn’t be doing it.
18. Do you have pages targeted to obviously irrelevant keywords? Just because “britney spears” is a popular search term doesn’t mean it’s right for you to be targeting it. Relevancy is the name of the game. Why would you want to be number one for “britney spears” anyway? The bounce rate for such traffic would be terrible.
19. Do you repeatedly submit your site to the engines? At best this is unnecessary. At worst this could flag your site as spam, since spammers have historically submitted their sites to the engines through the submission form (usually multiple times, using automated tools, and without consideration for whether the site is already indexed). You shouldn’t have to submit your site to the engines; their spiders should find you on their own — assuming you have some links pointing to your site. And if you don’t, you have bigger issues: like the fact your site is completely devoid of PageRank, trust and authority. If you’re going to submit your site to a search engine, search for your site first to make sure it’s not already in the search engine’s index and only submit it manually if it’s not in the index. Note this warning doesn’t apply to participating in the Sitemaps program; it’s absolutely fine to provide the engines with a comprehensive Sitemaps XML file on an ongoing basis (learn more about this program at Sitemaps.org).
20. Do you incorporate your competitors’ brand names in your meta tags? Unless you have their express permission, this is a good way to end up at the wrong end of a lawsuit.
21. Do you have duplicate pages with minimal or no changes? The search engines won’t appreciate you purposefully creating duplicate content to occupy more than your fair share of available positions in the search results. Note that a dynamic (database-driven) website inadvertently offering duplicate versions of pages to the spiders at multiple URLs is not a spam tactic, as it is a common occurrence for dynamic websites (even Google’s own Googlestore.com suffers from this), but it is something you would want to minimize due to the PageRank dilution effects.
22. Does your content read like “spamglish”? Crafting pages filled with nonsensical, keyword-rich gibberish is a great way to get penalized or banned by search engines.
23. Do you have “doorway pages” on your site? Doorway pages are pages designed solely for search engines that aren’t useful or interesting to human visitors. Doorway pages typically aren’t linked to much from other sites or much from your own site. The search engines strongly discourage the use of this tactic, quite understandably.
24. Do you have machine-generated pages on your site? Such pages are usually devoid of meaningful content. There are tools that churn out keyword-rich doorway pages for you, automatically. Yuck! Don’t do it; the search engines can spot such doorway pages.
25. Are you “pagejacking”? “Pagejacking” refers to hijacking or stealing high-ranking pages from other sites and placing them on your site with few or no changes. Often, this tactic is combined with cloaking so as to hide the victimized site’s content from search engine users. The tactic has evolved over the years; for example, “auto-blogs” are built entirely from pagejacked content (lifted from RSS feeds). Pagejacking is a big no-no! Not only is it very unethical, it’s illegal; and the consequences can be severe.
26. Are you “cloaking”? “Cloaking” is the tactic of detecting search engine spiders when they visit and varying the content specifically for the spiders in order to improve rankings. If you are in any way selectively modifying the page content, this is nothing less than a bait-and-switch. Search engines have undercover spiders that masquerade as regular visitors to detect such unscrupulous behavior. (Note that cleaning up search engine unfriendly URLs selectively for spiders, like Yahoo.com does on their home page by dropping their ylt tracking parameter from all their links, is a legitimate tactic.)
27. Are you submitting to FFA (“Free For All”) links pages and link farms? Search engines don’t think highly of link farms and such, and may penalize you or ban you for participating on them. How can you tell link farms and directories apart from each other? Link farms are poorly organized, have many more links per page, and have minimal editorial control.
28. Are you buying expired domains with high PageRank scores to use as link targets? Google underwent a major algorithm change a while back to thwart this tactic. Now, when domains expire, their PageRank scores are reset to 0, regardless of how many links point to the site.
29. Are you presenting a country selector as your home page to Googlebot? Global corporations sometimes present first-time visitors with a list of countries and/or languages to choose from upon entry to their site. An example of this is at EMC.com. This becomes a “worst practice” when that country list is represented to the search engines as the home page. Happily, EMC has done its homework on SEO and is detecting the spiders and waving them on. In other words, Googlebot doesn’t have to select a country before entry. You can confirm this yourself: do a Google search for “cache:www.emc.com” and you will see EMC’s U.S. home page.

SEO Best Practices

1. Are the keywords that you are targeting not only relevant but also popular with searchers? There is no point going after high rankings for keywords that no one searches for. Compare relative popularity of keywords using Google’s free tools (Google AdWords Keyword Tool and Google Insights for Search) and/or paid tools like KeywordDiscovery.com and WordTracker.com before deciding what keywords to employ on your Web pages.

Despite the popularity of individual words, it’s best to target two- or three-word phrases (or even longer). Because of the staggering number of Web pages indexed by the major search engines, competing for a spot on the first or second page of search results on a one-word keyword will typically be a losing battle (unless you have killer link authority). This should go without saying, but the keywords you select should be relevant to your business.

2. Do your page titles lead with your targeted keywords? The text within your page title (a.k.a. the title tag) is given more weight by the search engines than any other text on the page. The keywords at the beginning of the title tag are given the most weight. Thus, by leading with keywords that you’ve chosen carefully, you make your page appear more relevant to those keywords in a search.

3. Is your body copy of sufficient length and keyword-rich? Ideally, incorporate at least several hundred words on each page so there’s enough “meat” for the search engines to sink their teeth into and determine a keyword theme for the page. Include relevant keywords high up in the page, where they will be weighted more heavily by the search engines than keywords mentioned only at the bottom of the page, where they read almost like an afterthought. This is known as keyword prominence. Think in terms of keyword prominence in the HTML, not the rendered page on the screen; Google doesn’t realize that something is at the top of the third column if it appears low in the HTML. Be careful not to go overboard to the point that your copy doesn’t read well; that’s called “keyword stuffing” and is discussed later, under “Worst Practices.”

4. Does the anchor text pointing to various pages within your site include good keywords? Google, Yahoo, and Bing all treat the anchor text of a hyperlink as highly relevant to the page being linked to. So, use good keywords in the anchor text to help the engines better ascertain the theme of the page you are linking to. Keep the link text relatively succinct and tightly focused on just one keyword or key phrase. The longer the anchor text, the more diluted the overall theme conveyed to the engines.

5. Do you employ text links from your home page to your most important secondary pages? Text links are, by far, the better option over ALT attributes in conveying to the search engine the context of the page to which you are linking. (An ALT attribute is the text that appears in a small box when you hover your cursor over an image.) ALT attributes can have an effect, but it’s small in comparison with that of text links. If you have graphical navigation buttons, switch them to keyword-rich text links; if that’s not an option, at least include text link navigation repeated elsewhere on the page, such as in the footer (note however that footer links are partially devalued), or consider the CSS image replacement technique, described below.

6. If you must have graphical navigation, do you use the CSS image replacement technique as a workaround, and do those graphics have descriptive, keyword-rich ALT attributes that are useful for both humans and search engines? Image replacement is a technique that employs CSS (Cascading Style Sheets) to substitute replacement copy and HTML – such as a text link or heading tag – when the stylesheet is not loaded (as is the case when the search engine spiders come to visit). The text-based replacement is weighted more heavily by the engines than the IMG ALT attribute – thus it is preferable to relying solely on the ALT attribute. Of the many ways to implement image replacement, most use CSS to physically move the text off the screen (text-indent: -9999em; left: -9999em; display: none; etc.), which is not ideal because the search engines may discount this as hidden text.

Important: resist the temptation to work additional keywords or text into the replacement copy, or your site may be hit with a penalty. A few CSS image replacement methods exist that are preferable because they don’t physically move the content off-page and are still accessible, namely The Leahy/Langridge Method, The Gilder/Levin Method and The ‘Shea Enhancement’. It is still useful to have ALT attributes on your images, more for usability/accessibility than for SEO. ALT attributes should contain relevant keywords that convey the key information from the image that the user would not receive if she had image loading turned off.
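As a rough sketch of the Gilder/Levin approach mentioned above (the selector name, dimensions, and image path are all hypothetical): the heading keeps its real text, and an empty span is layered over it and painted with the image, so nothing is moved off-screen:

```html
<h1 id="logo">Acme Widgets<span></span></h1>
```

```css
/* The heading keeps its real text for the engines; the empty span
   is stretched over it and painted with the logo image. */
#logo { position: relative; width: 200px; height: 50px; overflow: hidden; }
#logo span {
  position: absolute; top: 0; left: 0;
  width: 100%; height: 100%;
  background: url(logo.png) no-repeat;
}
```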

7. Does your Web site have an XML Sitemap, as well as an HTML site map with text links? An XML Sitemap file provides the search engines with a comprehensive list of all the URLs corresponding to the pages/documents which are contained on your website. This helps ensure all of your pages end up getting indexed by the search engines. But the XML Sitemap is more than just a list of URLs; it can include additional information about each URL, such as the page’s last modified date and priority (which can impact how frequently the page is visited by the search engine spiders and thus how quickly it is refreshed.)

It’s a best practice to also include the location of your sitemap file(s) in your site’s robots.txt, so that the search engines can “autodiscover” the sitemaps on their own without you having to specify the location of the file(s) in each search engine’s Webmaster Center. An HTML sitemap is a different thing altogether. It’s simply a page on your website with links to all your important pages, displayed usually in a hierarchical fashion. A link to the sitemap is typically present in the footer of every page of the site.

HTML sitemaps have long been touted as good “spider food” because they provide the search engine spiders with links to key pages to explore and index. Use text links, since they are more search-engine-friendly than graphical links, as already mentioned. Bear in mind that you should ideally stay within 100 links per page, a rough guideline recommended by Google rather than a hard and fast rule. That may mean breaking up your site map into multiple HTML pages.
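To make the XML Sitemap and autodiscovery pieces concrete, a minimal Sitemap file might look like this (the domain, date, and priority are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://www.example.com/</loc>
    <lastmod>2010-09-01</lastmod>
    <priority>1.0</priority>
  </url>
</urlset>
```

And the autodiscovery entry in robots.txt is a single directive:

```
Sitemap: http://www.example.com/sitemap.xml
```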

8. Are the URLs of your dynamic (database-driven) pages short, simple and static-looking? Pages with URLs that contain a question mark and numerous ampersands and equals signs aren’t as palatable to the search engines as simple, static-looking URLs. Either install a server module/plug-in that allows you to “rewrite” your URLs, or recode your site to embed your variables in the path info instead of the query string; or, if you need to minimize resource requirements by your IT team, you can enlist a “proxy serving” solution such as Organic Search Optimizer.
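On Apache, for example, the rewriting might be sketched with mod_rewrite (the path and parameter names here are invented for illustration, not a recommended scheme):

```apache
RewriteEngine On
# Serve the static-looking /products/123 from the underlying dynamic
# script, without exposing the query string to search engines.
RewriteRule ^products/([0-9]+)$ /product.php?id=$1 [L]
```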

I’ve written about this at length in this two-part article. Another, oft-neglected aspect of URL optimization is making them short for improved click-through from the search results. In my previous article on URL optimization I discussed an interesting study by MarketingSherpa that found that short URLs get clicked on twice as often as long URLs in the Google SERPs.

9. Does your home page and other key pages of your site have sufficient PageRank (link authority)? PageRank is Google’s way of quantifying the importance of a Web page. Put another way, it’s as much about the quality of the links pointing to a given Web page as it is about the quantity (more so, actually). PageRank has been the cornerstone for Google’s ranking algorithm since the beginning. The more important (PageRank-endowed) pages wield more voting power; the page’s “vote” gets divvied up among all the links on the page and passed on to those pages.

Of course, this is a massive over-simplification, and the PageRank algorithm has evolved over the years to include such things as trust and authority to stay ahead of the spammers. Nonetheless, a form of PageRank is still in use today by Google. You can check Google PageRank scores using the Google Toolbar. Mouse over the toolbar’s PageRank meter to display the numerical rating, an integer value between 0 and 10. Yahoo’s importance-scoring equivalent to PageRank has been referred to internally as both LinkFlux and Yahoo! Web Rank at various times. It’s best to refer to the PageRank-like algorithms of the three major engines more generally as “link authority,” “link equity,” or “link juice.” The PageRank scores delivered by Google’s toolbar server are on a logarithmic scale, meaning that integer increments are not evenly spaced. Thus, garnering more links and gaining in PageRank score from 3 to 4 is easy, but going from 6 to 7 is a lot harder.

Also bear in mind that the PageRank displayed in the Google Toolbar is not the same PageRank as what is used by Google’s ranking algorithm. In fact, the correlation between the two PageRanks has degraded over time. A potentially better predictor of your true PageRank score is the “mozRank” score available from Linkscape. “mozRank” approximates Google PageRank using a sophisticated algorithm and an index of 30+ billion pages. mozRank scores are also on a logarithmic scale. A PageRank or mozRank score for your home page of 7 or 8 is a laudable goal. Note that each page has its own PageRank score. Because most of the inbound links your site has garnered point to the home page, your home page almost invariably ends up being the highest PageRank-endowed page of your site. The PageRank that has accumulated on your home page is passed to your internal pages through your internal linking structure.
Bottom line: if a given page on your site doesn’t have enough PageRank (I’m referring to the super-secret, internal PageRank that Google doesn’t share with us SEOs via the Toolbar), then it doesn’t deserve to rank.

10. Does your site have an optimized internal linking structure? Your site’s hierarchical internal linking structure conveys to the search engines how important you consider each page of your site, comparatively. This of course impacts these pages’ PageRank scores and ultimately their Google rankings. The deeper down a page is in the site tree (i.e. the more clicks away the page in question is from the home page), the less PageRank with which that page will be endowed.

Therefore, it’s critical you think carefully about how you spend that hard-earned PageRank, i.e. where and how you link from your home page and from your site-wide navigation to the rest of your site. Generally speaking, the deeper in your hierarchy you hide key content, the less important that content appears to the search engines — if they even find it (which is not a given if it’s very deep). As an aside, this concept applies not only to your linking structure but also to your URL structure: too many slashes in the URL (i.e. too many sub-directories deep) and you convey to the engines that the page is unimportant. A flat directory structure, where you minimize the number of slashes in the URL, helps ensure more pages of your site get indexed.
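The way a page’s “vote” gets divvied up among its outbound links can be illustrated with a toy power-iteration sketch in Python. This is a textbook simplification, not Google’s actual algorithm, and the three-page site and damping factor are invented for illustration:

```python
def pagerank(links, damping=0.85, iterations=50):
    """Toy power iteration: each page splits its current score evenly
    among the pages it links to, and every page keeps a small base share."""
    pages = list(links)
    rank = {p: 1.0 / len(pages) for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / len(pages) for p in pages}
        for page, outlinks in links.items():
            share = rank[page] / len(outlinks)
            for target in outlinks:
                new[target] += damping * share
        rank = new
    return rank

# Hypothetical three-page site: the home page links to two sections,
# and each section links back to home.
site = {"home": ["a", "b"], "a": ["home"], "b": ["home"]}
scores = pagerank(site)
```

Because both internal pages point back to it, the home page ends up with the largest share, mirroring the point above that the home page is almost invariably the most PageRank-endowed page of a site.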

11. Do your pages have keyword-rich meta descriptions with a compelling call to action? Because meta tags are tucked away in the HTML and hidden from the view of the human visitor, they have been abused like crazy by spammers trying to hide keywords out of view. The original purpose of meta tags was to provide meta-information about the page which could then be used by search engine spiders and other algorithms. One such piece of meta-information is a description of the page (e.g., its content and its purpose), a.k.a. the “meta description”. Although it won’t improve your rankings to define a meta description (or meta keywords or any other meta tag, for that matter), it is useful from the standpoint of influencing what text appears within your listing in the search results (i.e. the “snippet”), in order to better persuade the user to click through to your site.

Yahoo will frequently employ the meta description as the description in your search results listing. Bing also displays meta descriptions in its search listings. Google may incorporate some or all of your meta description into the snippet displayed in your search listing; this is more likely if the searcher’s keywords are present in your meta description. More on the intricacies of Google snippets here. The user’s search terms – and related keywords, like those with the same root – are bolded in the search listing, which improves the clickthrough rate to your page (from the search results). This is known as KWiC (KeyWords in Context).
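For instance, a keyword-rich meta description with a call to action might look like this (the copy is invented for illustration):

```html
<meta name="description"
      content="Shop over 200 styles of blue widgets. Free shipping and easy returns - order today.">
```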

12. Does your site have a custom error page that returns the correct “status code”? Don’t greet users with the default “File not found” error page when they click through from a search engine results page to a page on your site that no longer exists. Offer a custom error page instead, with your logo and branding, navigation, site map, and search box. Important from an SEO standpoint – make sure that “File not found” error page returns in the HTTP header a “status code” of 404 (or potentially a different 400- or 500-level status code, depending on the nature of the error), or 301 redirects to a URL that returns a 404. You can check this with a server header checker, such as this one. If you mistakenly send a 200 status code instead, this error page will likely end up in the index, and thus the search results. This is discussed further under “Worst Practices.” No matter what the reason for the page’s unavailability (e.g., discontinued product, site redesign, file renamed, server or database issues), you shouldn’t be driving visitors away with an ugly error page that doesn’t provide a path to your home page and other key areas of your site.
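A quick way to sanity-check this yourself is a small Python sketch; the URL-fetching helper is illustrative (a dedicated server header checker works just as well):

```python
from urllib.request import Request, urlopen
from urllib.error import HTTPError

def status_of(url):
    """Return the HTTP status code for a URL, fetching headers only."""
    try:
        return urlopen(Request(url, method="HEAD")).status
    except HTTPError as err:
        # urllib raises on 4xx/5xx responses; the code is what we want.
        return err.code

def is_soft_404(status_code):
    """A missing page that answers with a 2xx success code is a
    "soft 404" and risks getting its error page indexed."""
    return 200 <= status_code < 300
```

For a page you know doesn’t exist, `status_of` should come back in the 400 or 500 range; if `is_soft_404` returns True, the error page is misconfigured.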

13. Do your filenames and directory names include targeted keywords? Google engineer Matt Cutts has blogged that this is a useful “signal” to Google, so if it’s easy to do, why not? Separate keywords with hyphens, not with underscores. Avoid working more than a few keywords into a filename or directory name, as it could look spammy to the search engines.

14. Are you actively building links to your site? A steady stream of high-quality links doesn’t just “happen,” just as ongoing, consistently great media coverage doesn’t just “happen.” If it did, link builders and public relations pros would all be out of a job. The most basic starting point for link building is the authoritative directories, like the Yahoo Directory and the Open Directory. Not only do high-quality directories improve your PageRank and consequently your rankings; they also drive direct click-through traffic. If you aren’t already listed in the Yahoo Directory or Open Directory, identify the category most relevant to your business and submit your site. A listing in Open Directory also ensures a listing in the (largely forgotten) Google Directory and numerous other directories powered by Open Directory.

Submitting to Yahoo’s directory costs $299, then $299 per year recurring (it’s free for noncommercial sites, though). Submitting to Open Directory is free, but it has become practically impossible to get into, at least in the most appropriate category for your site, since the Open Directory’s owner (AOL) and its volunteer editors have left the Directory semi-abandoned. Don’t waste your time and money submitting to hundreds of directories; just pick the most critical ones that are relevant to your business/industry and that Google would likely consider authoritative and trustworthy.

For example, a business-to-business company may wish to submit to business.com and ThomasNet.com. Directories that primarily target webmasters and SEOs to sell them listings, rather than end users who would actually browse the directory, are most likely being devalued by Google and thus would be a waste of your time and money to submit to.

Thursday, April 15, 2010

PPC Bid Management

Steps To Set Keyword Max Bids
Set Low Maximum Bids
If you accept a majority of the tool's keyword suggestions, you'll be bidding on a number of different keywords and generating greatly increased clicks and impressions. Additionally, you should enjoy two pricing advantages:
• In authoring relevant, compelling ad text and in creating closely related keyword groupings, you are increasing Quality Score and lowering your costs.
• Combining this with long-tail accumulation and manipulation tools enables you to find a rush of extremely cheap clicks. This means that you can afford to avoid overpaying for more expensive keywords.
Track Performance
While this provides you with a fire-hose of inexpensive traffic, it is still useful to track the performance of your Ad Groups. The steps here are simple:
• Allot a small amount of time every week to briefly review which groups and keywords are under-performing and over-performing.
• Determine whether to raise bids, lower bids, or pause a group, depending on its performance.
Use Common Sense!
You know your business: take a look at underperforming Ad Groups. Is that a term that makes sense? If not, pause the Ad Group or lower the bid.
Tracking performance and managing bids
Having set a low CPC, it's worthwhile to keep an eye on the performance of Ad Groups and even individual keywords.
First, log in to your AdWords account. Click the "Campaign Summary" link and navigate to the account that is currently running.

You can then quickly investigate your account group performance in a few simple steps:
• Sort by Cost - We recommend first sorting by cost to determine if any Ad Groups are eating up spend without converting. Simply view the Ad Groups producing the greatest spend, and consider manipulating those groups (more on how to manipulate a group in a minute).
• Sort by Cost/Conv. - This will again give you a bird's eye view of under-performing groups. Quickly scan high-spending groups to see if their cost-per-conversion is higher than you're comfortable with. If so, optimize the group. The same is true in the other direction: for groups that are performing very well, consider raising your maximum CPC bids to generate more clicks and exposure.
• Sort by Conversion - This allows you to view the groups that are performing. You can then drill down to discover keywords that are performing well within your managed Ad Groups, and can dedicate a group to them specifically. This provides two benefits:
1. More Conversions - By creating a textual ad that's even more specialized, you can differentiate yourself for the queries this keyword is generating, providing you with more highly qualified clicks (and almost certainly, more conversions).
2. Lower Costs - By the same token, the conversions you do get will cost you less. The reason for this is that you're more aggressively qualifying these visitors, and you're driving up your Quality Score; meaning that overall, your costs will go down.
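The sort-and-inspect workflow above can also be mimicked outside the AdWords interface with a few lines of Python over exported report rows (the sample ad groups here are made up):

```python
def worst_ad_groups(ad_groups, top_n=5):
    """Sort ad groups by cost per conversion, most expensive first.
    Groups with spend but zero conversions sort to the very top."""
    def cost_per_conv(group):
        if group["conversions"] == 0:
            return float("inf")  # spend with no conversions: worst case
        return group["cost"] / group["conversions"]
    return sorted(ad_groups, key=cost_per_conv, reverse=True)[:top_n]

# Hypothetical exported rows: name, total cost, total conversions.
groups = [
    {"name": "widgets", "cost": 120.0, "conversions": 10},
    {"name": "gadgets", "cost": 80.0, "conversions": 0},
    {"name": "gizmos", "cost": 45.0, "conversions": 9},
]
```

Here "gadgets" surfaces first (spend with zero conversions), then "widgets" at $12 per conversion, matching the "Sort by Cost/Conv." review described above.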
Raising bids and lowering bids to optimize ad groups
There are three tactics we generally recommend in altering bids for campaign optimization:
• Raise Bids - If you see a keyword or Ad Group that is converting very well, but has a low average ad position and a limited number of impressions and/or clicks, raise the bid to get that term more exposure.
• Lower Bids - For terms that are slightly underperforming or costing more per conversion than you are comfortable with, lower the bid to place less account emphasis on this term.
• Pause an Ad Group or Keyword - If a keyword is really eating up your spend, you might consider pausing that keyword. You can then try a more specific variation of that keyword that is more likely to convert, optimizing more than just your bid management.
PPC Bid Management-Step One
Effective Bid Management - Step One: Filtering Keywords with Clicks & No Conversions
Filtering Keywords with Clicks & No Conversions identifies irrelevant Keywords in an account. It also identifies Keywords that have lost traffic and/or conversions due to previous PPC bid optimization. Doing this based on 1-2 days' or weeks' worth of data is not recommended unless you are dealing with a product that generates an extremely high volume of traffic, clicks, and conversions within that time frame. A recommended time frame is at least a month or more. After identifying these keywords, ask yourself:
• Is the Max. CPC too high?
• As a result is the Avg. Position too high and therefore generating a lot of junk traffic?
• Is the ad copy not relevant to the search query?
• Is the landing page appropriate for the keyword and ad copy?
The fact of the matter is, if a keyword has not converted in over a month and has only contributed to wasted spend, it is unlikely to convert anytime in the future, and you should consider pausing it or deleting it from the account. Another important fact to note: you might have only a dozen keywords with high spend, or you might have 1000s of keywords with only $1-$2 in spend each, but those 1s and 2s add up.
Lastly, for those 1s and 2s mentioned above, you can use a statistical benchmark to determine whether a keyword has actually had enough exposure to convert. By taking your overall account’s conversion rate, you can determine the minimum number of clicks needed for a keyword to convert. If the keyword has not incurred that many clicks, you might want to consider leaving it active for a while.
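One way to formalize that benchmark in Python, under the simplifying assumption that every click converts independently at the account-wide rate (real keywords vary, so treat this as a rough screen, not a verdict):

```python
import math

def min_clicks_before_judging(account_conv_rate, confidence=0.95):
    """Clicks after which a zero-conversion keyword becomes suspect:
    the smallest n where the chance of at least one conversion in n
    independent clicks reaches the given confidence level."""
    return math.ceil(math.log(1 - confidence) /
                     math.log(1 - account_conv_rate))
```

For example, at a 2% account-wide conversion rate, a keyword needs roughly 149 clicks before a zero-conversion streak is surprising at 95% confidence; below that threshold, leaving the keyword active is defensible.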
• It is advised that you base this decision on a large chunk of data - hence the first check: is there at least 2 months' worth of data available?
• Depending on your product/service, only a week’s worth of data might be enough for you to conduct this analysis due to the high volume of Impressions, Clicks, & Cost.
• Always keep track of your Ad Copy and Landing Page changes, as those can affect conversions too.
• This analysis has helped me in eliminating Keywords with high cost and identifying Keywords that have contributed negatively to the account and were not suitable for their respective campaigns.
Lost Impressions, Clicks, & Conversions:
• Could be due to Seasonality?
• Could be due to Increased Competition?
• Could be due to previous PPC Bid optimization (maybe a keyword was not converting as effectively, or bid was decreased aggressively)?
PPC Bid Management-Step Two
Effective Bid Management - Step Two: Filtering Keywords with No Clicks and No Conversions
Filtering Keywords with No Clicks & No Conversions identifies irrelevant Keywords in an account or Keywords that are not getting enough exposure - mainly due to low Max. CPCs. It also identifies Keywords that have lost traffic and/or conversion based on previous PPC bid optimizations. This is usually an interesting set to monitor. One way to approach this:
• Identify the positions that convert the most and meet your business goals/objectives.
• Compare that to the position of these Keywords.
• If the position of these Keywords is better than the average position at which your account is converting, then these Keywords are just not relevant to your account and hence should be paused/deleted.
• If the position of these Keywords is worse than the average position at which your account is converting, then bidding to position (only for this set of Keywords) is an option to test their worth.
• It is advised that you base this decision on a large chunk of data - hence the first check: is there at least 2 months' worth of data available?
• Depending on your product/service, only a week’s worth of data might be enough for you to conduct this analysis due to the high volume of Impressions, Clicks, & Cost.
• Always keep track of your Ad Copy and Landing Page changes, as those can affect conversions too.
• This analysis has helped me identify Keywords that have no search volume or whose Max. CPC was set too low to prove their relevancy to the account.
• When testing a position based strategy, be cautious and consciously try to set a conservative Max. CPC ceiling for the Keywords, as bidding to position can sometimes generate a lot of clicks and cost without any returns, hence constant monitoring would also help.
Lost Impressions, Clicks, & Conversions:
• Could be due to Seasonality?
• Could be due to Increased Competition?
• Could be due to previous PPC Bid optimization (maybe a keyword was not converting as effectively, or bid was decreased aggressively)?
PPC Bid Management-Step Three
Effective Bid Management - Step Three: Filter keywords that generate Conversions but Negative ROI or higher than your target CPA.
If your PPC account is structured with your business goals in mind and you have a solid keyword list in the account, you will only have a handful of keywords fitting this profile - and, for that matter, only a handful of keywords converting in your account overall (the 80-20 rule always applies). Ask yourself these questions:
• Is the Max. CPC too high?
• As a result is the Avg. Position too high and therefore generating a lot of junk traffic and bringing down your overall Conversion Rate?
• What is the CPA or ROI goal and how does it compare to the CPA or ROI of the keyword?
Based on your business setting, use any formula you wish to use to adjust those bids. I personally tend to use the following two formulas the most:
New Max. CPC = (Target CPA / Actual CPA) × Current Max. CPC
OR
New Max. CPC = Target CPA × Conversion Rate
If you have an ROI goal, figure out what CPA value will help you achieve that goal and use the formula above to calculate bids.
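The two formulas translate directly into Python; rounding to cents is my own convention here, not part of the original formulas:

```python
def bid_from_cpa(target_cpa, actual_cpa, current_max_cpc):
    """(Target CPA / Actual CPA) x Current Max. CPC: scale the current
    bid down (or up) by how far actual CPA is from the target."""
    return round(target_cpa / actual_cpa * current_max_cpc, 2)

def bid_from_conv_rate(target_cpa, conversion_rate):
    """Target CPA x Conversion Rate: the most you can pay per click
    and still hit the target cost per acquisition."""
    return round(target_cpa * conversion_rate, 2)
```

For example, with a $25 target CPA, a $50 actual CPA, and a current $1.20 max CPC, the first formula halves the bid to $0.60; with a 4% conversion rate, the second formula caps the bid at $1.00.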
• It is advised that you base this decision on a large chunk of data.
• Depending on your product/service, only a week’s worth of data might be enough for you to conduct this analysis due to the high volume of Impressions, Clicks, Cost, & Conversions.
• Always keep track of your Ad Copy and Landing Page changes, as those can affect conversions too.
Before making a decision to change the bids or pausing or deleting the keyword, find out if the keyword has always performed similarly. If the answer is Yes, then ask yourself:
• Could it be due to Seasonality?
• Could it be due to Increased Competition?
• Could it be due to previous PPC Bid optimization (maybe a keyword was not converting as effectively, or bid was decreased or increased aggressively)?
PPC Bid Management-Step Four
Effective Bid Management - Step Four: Filtering Keywords that Convert and Meet Your ROI or CPA Goals
Filter keywords that are converting as well as meeting your ROI or CPA goals. You could set up another test around these by increasing their bids and being a bit more aggressive with them. Some recommendations:
• Create a separate campaign around them to focus only on those keywords with highly targeted ad copies (make sure you pause the original keyword).
• Analyze their position and test whether a higher or lower position increases or decreases conversions over time (be careful to monitor them so that you don't lose all your good work with them during the test).
• Testing their bids is another test that will eventually lead you to the previous point, i.e., positions getting higher or lower.
• If you can, set up a new account and domain around the keywords converting at an acceptable cost. Generally, setting up a domain themed around keywords can contribute to incremental conversions.
• Or you can simply leave them as is and monitor them over time to make sure they maintain their performance.
• This list of keywords is also a good seed list to expand and to test variations of the keywords against.