Everything You Should Know About Google Mobile First Index

Google is rolling out a new mobile-first index. This means Google will create and rank its search listings based on the mobile version of content, even for listings that are shown to desktop users. Wondering how this will all work? We’ve compiled answers below.

Since the announcement, we have been tracking what Googlers have been saying about the change based on the industry’s questions. Below you will find a compilation of those questions and answers based on coverage from Jenny Halasz, Jennifer Slegg and me.

What is changing with the mobile-first index?

As more and more searches happen on mobile, Google wants its index and results to represent the majority of their users — who are mobile searchers.

Google has started to use the mobile version of the web as their primary search engine index. A search engine index is a collection of pages/documents that the search engine has discovered, primarily through crawling the web through links. Google has crawled the web from a desktop browser point of view, and now Google is changing that to crawl the web from a mobile browser view.

What if I don’t have a mobile website?

Google said not to worry. Although Google wants you to have a mobile site, it will crawl your desktop version instead. Google said, “If you only have a desktop site, we’ll continue to index your desktop site just fine, even if we’re using a mobile user agent to view your site.”

If you have a mobile site, then you need to make sure the content and links on the mobile site are similar enough to the desktop version so that Google can consume the proper content and rank your site as well as it did by crawling your desktop site.
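A rough way to sanity-check that parity is to extract the visible text from each version and measure the overlap. Below is a minimal sketch, assuming you already have both HTML documents in hand; the class and function names are purely illustrative:

```python
from html.parser import HTMLParser

class TextExtractor(HTMLParser):
    """Collects visible text, skipping <script> and <style> contents."""
    def __init__(self):
        super().__init__()
        self.skip = 0
        self.words = []

    def handle_starttag(self, tag, attrs):
        if tag in ("script", "style"):
            self.skip += 1

    def handle_endtag(self, tag):
        if tag in ("script", "style") and self.skip:
            self.skip -= 1

    def handle_data(self, data):
        if not self.skip:
            self.words.extend(data.split())

def content_overlap(desktop_html, mobile_html):
    """Fraction of desktop words that also appear in the mobile version."""
    def words(html):
        parser = TextExtractor()
        parser.feed(html)
        return set(w.lower() for w in parser.words)
    desktop, mobile = words(desktop_html), words(mobile_html)
    return len(desktop & mobile) / len(desktop) if desktop else 1.0
```

A ratio well below 1.0 suggests the mobile page is missing content that Google would otherwise see and index.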

My mobile site has less content than my desktop site. Should I be nervous?

Potentially, yes. Google has said that it will look at the mobile version of your site. If that has less content on page A than the desktop version of page A, then Google will probably just see the mobile version with less content.

This is why Google recommends you go with a responsive approach — the content is the same on a page-by-page basis from your desktop to your mobile site. You can do the same with other mobile implementations, but there is more room for error.

What about expandable content on mobile?

With desktop sites, Google said that content hidden in tabs, accordions, expandable boxes and other methods would not be weighted as high. But when it comes to mobile, Google’s Gary Illyes said content like this will be given full weight if done for user experience purposes. The idea is that expandable content makes sense on mobile and not so much on desktop.

Will this change the Google rankings in a big way?

Both Gary Illyes and Paul Haahr from Google said this should not change the overall rankings. In fact, they want there to be minimal change in rankings around this change. Of course, it is too early to tell, they said — but their goal is not to have this indexing change impact the current rankings too much.

When will this fully roll out?

Google said they have already begun testing this mobile-first index to some users. But it looks like we are still months away from this fully rolling out. Google won’t give us a date because they are still testing the rollout, and if things go well, they may push it sooner. If things do not go well, they may push it back.

Google did say they will push this out to more and more searchers over time as they become more confident with the mobile-first index.

Is this a mobile-friendly ranking boost?

Google has previously said that content that’s not deemed mobile-friendly will not rank as well. That remains the case with this new index.

In the current index, which most people will continue to get results from, desktop content is indexed and used for showing listings to both desktop and mobile users. A special mobile-friendly ranking system is then used to boost content for Google’s mobile listings. Content that’s not mobile-friendly doesn’t perform as well.

In the new mobile-first index, which some people will get results from as Google rolls it out, mobile content is indexed and used for showing listings to both desktop and mobile users. Then the mobile-friendly ranking boost is applied, as with the current system, to mobile-friendly pages.

How can I tell if Google sees my mobile pages?

The best way is to use the Fetch and Render tool in the Google Search Console. Specify the mobile:smartphone user-agent and look at the preview after the fetch and render is complete. What Google shows you in the rendered results is likely what Google can see and index from your mobile site. If content is missing, then you should look into how to fix that and run the tool again.
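Outside of Search Console, you can get a quick approximation of what a smartphone crawler receives by requesting the page with a mobile user-agent header. The UA string below is an example of a smartphone browser identifier, not Googlebot's actual one:

```python
import urllib.request

# Example smartphone UA string (an assumption, not Googlebot's exact UA).
MOBILE_UA = ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X) "
             "AppleWebKit/537.36 (KHTML, like Gecko) "
             "Chrome/41.0.2272.96 Mobile Safari/537.36")

def build_mobile_request(url):
    """Build a request that identifies itself as a smartphone browser."""
    return urllib.request.Request(url, headers={"User-Agent": MOBILE_UA})

def fetch_as_mobile(url):
    """Fetch the page the way a mobile client would request it."""
    with urllib.request.urlopen(build_mobile_request(url)) as resp:
        return resp.read().decode("utf-8", errors="replace")
```

Comparing the result against a plain desktop fetch of the same URL gives a first hint of whether your server sends different content to mobile clients.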

Ranking signals will come from your mobile, not desktop version

Google has ranked your mobile site based on many signals from your desktop site, as we covered before. That is going to flip, and Google will rank your mobile and desktop sites based on signals they get from crawling your site from a mobile view.

So the page speed of your mobile site will determine the rankings of your mobile site and desktop site in Google. Google will also likely look at your title, H1s, structured data and other tags and content generated from your mobile site, and use them over your desktop site.
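To compare those tags between your mobile and desktop versions, a small parser over the two HTML documents is enough. This sketch (names are illustrative) pulls out the <title> and any <h1> headings so the two versions can be diffed:

```python
from html.parser import HTMLParser

class TagAudit(HTMLParser):
    """Records <title> and <h1> text for a mobile/desktop comparison."""
    def __init__(self):
        super().__init__()
        self._current = None
        self.title = ""
        self.h1s = []

    def handle_starttag(self, tag, attrs):
        if tag in ("title", "h1"):
            self._current = tag

    def handle_endtag(self, tag):
        if tag == self._current:
            self._current = None

    def handle_data(self, data):
        if self._current == "title":
            self.title += data
        elif self._current == "h1":
            self.h1s.append(data.strip())

def audit(html):
    parser = TagAudit()
    parser.feed(html)
    return {"title": parser.title.strip(), "h1s": parser.h1s}
```

Running `audit()` on both versions and comparing the dictionaries shows at a glance whether the mobile page carries the same headline signals as the desktop page.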

Doesn’t this just flip the issue the other way to where Google is ranking its desktop results based on how it sees your mobile site? Yes, but Google knows that, and the trend is that mobile keeps growing and more and more searchers will use mobile over desktop to search.

Will Google have different indexes for mobile and desktop?

Eventually, Google plans to have only one index, one which is based on mobile content, to serve listings for both mobile and desktop users. During this rollout period, there will be two: desktop-first and mobile-first. A smaller group of users will get results out of the mobile-first index. It’s not something that anyone will be able to control. People will likely have no idea which index they’re actually using.

As Google grows confidence in the mobile-first index, eventually that will be the only index used. Or if the new index isn’t deemed useful, Google could go back to a desktop-first index. It has, after all, called the mobile-first index an “experiment.”

Google said in their blog post, “Our search index will continue to be a single index of websites and apps, our algorithms will eventually primarily use the mobile version of a site’s content to rank pages from that site.”

Paul Haahr from Google reiterated it by saying, “Index of mobile pages for mobile users and index of desktop pages for desktop users won’t happen.”

Will links and rankings change because of this?

There is a concern that mobile content tends to have fewer links than desktop content. This is a concern that is similar to the concern listed above around mobile content having less content than desktop content. Google’s search results are very dependent on links and content. So if both links and content are impacted, will the rankings be impacted?

Google said they are still testing, so it isn’t 100 percent clear. Gary Illyes said, “I don’t want to say anything definite about links yet. It’s too early for that cos things are very much in motion.”

Canonicals: Will you need to change them?

Google said the canonicals will not need to be changed, just keep your canonical tags as is, and follow their recommendations as listed on their blog post.

Can I see the change and the impact in the search results now?

Google said you shouldn’t be able to see the change and impact of the mobile-first index rollout now. In fact, Google said it hopes there is little to no impact after it is fully rolled out. Paul Haahr said, “I would be very surprised to detect any effects of mobile-first indexing at this stage.”

Technically, this is a global rollout, so it won’t be hitting specific regions only.

This should be a fairly comprehensive recap of all the questions and answers compiled.


5 Ways to Boost Brand Awareness

The process of building a strong brand for professional services isn’t only about developing an impeccable image and reputation. The market fluctuates and the competition never sleeps, so image and reputation alone are not enough to achieve the company’s business goals.

What matters are the means of maintaining the status quo (an excellent reputation for the service) in the minds of the target group. This becomes possible by influencing how the brand is perceived: the more visible and recognizable it is, the easier it is for people to remember and use it. The five tools discussed below can help you achieve this.

1. Showcasing Your Services Using Added Value

Companies that provide professional services allot most of their budgets to marketing, especially advertising. Although traditional advertising campaigns ensure market visibility, they lack the capacity to strengthen the company’s reputation. This is where two elements that help you build a brand efficiently come in: means of conveying declarative and procedural knowledge, such as articles and e-books. You can use them to provide comprehensive information about your services and to share your experience.

An ad for your service delivers a blunt message to the target group: ‘I’m selling this service, because I say so.’ A well-crafted article about your work, on the other hand, lets its audience draw conclusions and assess the quality of your service on their own. An article is a direct way to showcase the depth of knowledge and experience of the specialists in a given industry.

By lifting the veil on professional knowledge, you don’t create a one-way relation with the audience, but rather a win-win situation. You share your knowledge and competence, and in exchange, you receive something way more valuable – the trust and loyalty of your target group.

2. Focusing on High-Quality Content

Taking good care of the type and quality of the content you share is a key aspect of modern branding. The professional knowledge you convey should be kept up to date with the most recent research and current affairs. This is the only way to convey reliable knowledge that will strengthen the trustworthiness of your services. Writing e-books about the scope of your operations isn’t aimed only at winning new customers, but also at fostering ties with your existing target group, which is directly linked to the next step.

3. Establishing & Maintaining Relations with Co-Partners

Initiating and maintaining contacts is directly tied to the high-quality content you share with your audience. It may be a good idea to start working with other professional service providers whose work will indirectly shape the image of your brand. The quality of shared content is closely linked to the use of research-based knowledge. Conducting your own research boosts the brand’s credibility, and it often means partnering with a company that specializes in methodology and statistical analysis. Only precise, standardized and thorough research results confirm the quality of the content you publish.

4. Being Visible Online

Among the most basic platforms whose efficient use will make your business more recognizable is the Internet. These days, most people looking for services turn to online search engines, so businesses with a great reputation and a strong brand but no online presence significantly narrow their pool of potential clients. Simply having a website isn’t enough either. Competition online is fierce, and one superior rule governs the myriad of offers: only the strong survive, which in Internet terms means whoever is better optimized for search engines. Optimizing your website’s content and keywords is therefore of utmost importance. Start with an analysis of what has been done so far, then find aspects that can be improved upon.

5. Social Media

Once you’re sitting on high-quality content, you need channels to reach as wide an audience as possible. Strictly speaking, to make the word come to life, you need means to spread it, and social media provide exactly that. Through social networks you can share your expertise and also initiate interactions, which are a great source of feedback. Social media can be used to establish new contacts with potential clients as well as to foster ties with your current target group; a service provider can even reach specific groups within the local community. Together, the tools discussed here will help your brand stand out in the sea of similar professional services, which will in turn reinforce the reputation of your business.

Massive Google Update: 16th January 2016

We thought the 9th of January was big, then Google surprised us with the core algorithm update on the 10th, but today… today is something we’ve never seen before. Algoroo is in red again, but now at nearly 4 roos.


Today’s update really dwarfs everything else. It’s a Burj Khalifa.

There hasn’t been a day like this recorded since we started tracking Google’s results. I’m going to have to check my data to make sure this is not some sort of false alarm, but from what I can see, other trackers such as Mozcast are showing a very hot day indeed.

Community comments

‘Burj Khalifa’ added a few more storeys today. Seems like lots of things combined again to me, but I’m only surmising. UK finance and law look to have taken a wellying yesterday in the UK winners and losers, whilst Google.com ironically was one of the “biggest losers in France according to SERP Watch.”

Everything To Know About Google Panda Update 4.2

“This past weekend we began a Panda update that will rollout over the coming months,” said a Google spokesperson about Panda. “As you know, we’re always working to improve Google so search results are higher quality and more relevant for everyone and this is just one way we do that.” The latest refresh, which hit the search space almost ten long months after the previous one, has affected approximately two to three percent of English-language queries, per the official announcement.

They also confirmed the percent of queries that are impacted by Panda this time around.

2-3% of queries affected

Google has confirmed that this refresh affects only 2-3% of search queries, which is lower than the previous refresh of 3-5% in September 2014 and the last true update in May 2014 which affected 7.5% of search queries.

Don’t forget that “affected search queries” doesn’t mean all of those queries saw pages drop out of their search results because of this new refresh. It also includes pages that have made a return to those search results.

Haven’t noticed any major changes in your site’s rankings or traffic? That’s because this rollout is happening at a snail’s pace. Yes, we dug through all the forums, closely tracked the Google+ and Twitter profiles of the influencers, and kept refreshing the SEO blogs a few times a day, but there’s no news yet of any direct impact. All we know is that it started over the previous weekend and may well be spread over the next few months. So, if you’re lucky, your site may not see the impact for months. On the flip side, you might see your page rankings shift within a matter of days.

“Coming months”

What exactly does “coming months” mean? Does it mean two months? Six months? Or is it in limbo right now depending on how fast (or not) they decide to roll it out on any given day or week?

That said, long rollouts are nothing new for Google; we have seen them in the past. No one should really be surprised that they aren’t pushing the entire refresh out at once, even though some people are pretty vocal about the length of the rollout.

Hopefully, this is something we will get further clarification on, or at least an announcement of some variety when it has finished rolling out in the “coming months.”

Who could be a loser?

1. Websites with thin, duplicate or poor-quality content

2. Websites that were affected by previous Panda updates and haven’t recovered

Who could be a winner?

1. Websites with high-quality content and regular updates

2. Websites that were affected by previous Panda updates and have since revised or replaced their content

White Papers: Key to B2B Enterprise SEO Success

Enterprise marketers, particularly in the B2B space, often find content development for SEO a difficult proposition. The company’s thought leaders, although happy to pontificate, are often afraid of committing to a writing schedule. And the marketer, admittedly, is trying to get others to “squeeze in” content, essentially “off the books.” With no reward, why should anyone help with the difficult, thankless work of writing articles? The trick is for the marketer to take advantage of three key tactics:

• Appealing to the ego

• Re-purposing

• Transcription

Any enterprise that can harness these three elements can pump out high volumes of quality content. This white paper will show how enterprises can use White Papers as the basis of their content generation process, leverage these three elements to supercharge this process, and diversify their traffic sources beyond SEO in the process.

Very High-Quality White Papers: The Core of the System

The approach presented here (see Figure 1) uses the white paper as the core item that all other items flow from. If you have visions of outsourcing your white paper to India, or having an intern write it, forget it. When writing a white paper, think in terms of creating a comprehensive resource for a topic that only someone intimately familiar with your industry could have created.

This means either you personally are going to have to write it, one of your company’s thought leaders is going to have to write it, or you’re going to have to pay an extremely high-end professional writer to write it. However you handle it, make sure you include a lot of refinement, back-and-forth editing and perfection in any white paper. The better it is, the better everything that flows from it will be.


“When writing a white paper, think in terms of creating a comprehensive resource for a topic that only someone intimately familiar with your industry could have created”

Start By Simply Creating Lots of Lists

B2B marketing is really a Features/Benefits, Problem/Solution, and often, a “Total Cost of Ownership” game; all of these lend themselves to lists, which are a great, easy way to pull together raw material for a white paper. Start out by creating the following lists:

• Customer problems

• Implications of those problems

• Aspects of those problems

• Complications or impacts arising from those problems

• Different approaches to solving those problems

• Pluses and minuses of the different approaches

Once you have these lists, a white paper is merely a “peeling the onion” presentation of whichever of those lists you think are worth exploring. It’s the old “tell them what you’re going to tell them, tell them, then tell them what you told them” game.

First, create a flow or circle diagram (just use PowerPoint: a big circle in the middle with arrows going to other circles). That will be a summary diagram for the background section that gives a sense of the structure of the white paper; it can be built around the life cycle of a problem, a product, elements that contribute to a problem, players in a market, or anything else.

Then the individual elements of the diagram become the sections of the white paper, each with its own table or list. Whenever you present a list or a table (say, problems and implications), present each entry as a bullet item followed by a sentence, then expand on it with a paragraph in the text itself. Pretty soon you have a section, with a table, and with text that describes what’s in the table.

One overall structure for B2B white papers is:

1. Cover Page/Title/1 paragraph description (1 page)

2. Introduction: “tell them what you’re going to tell them” (1/2 page)

3. Background: a section that sets the stage, with the diagram, elements in the diagram, and a sentence on each element. (1 page)

4. A Section Dedicated to Each Element: with a table (list) of sub-elements, and then a paragraph about each sub-element. There should be three to five of these sections (1 page each)

5. Conclusion: “tell them what you told them” (1/2 page)

The total length runs around 6-8 pages. If you end up with 10-12 pages, consider splitting it into two or three separate white papers, and then beefing *those* up until they each reach 6-8 pages. One approach that works well is to simply put down as much material as you can into one massive document, then carve white papers out of it.

Take the High Road — Don’t Mention Your Product or Service

A white paper is supposed to be “thought leading.” It’s crass to sell your product with it; your purpose instead is to impress the prospect with your expertise and to entice them to research whether they want to further a relationship with your company. So don’t mention your product — it knocks your company off the pedestal you’re trying to set it on.

Attain Credibility by Quoting Statistics Wherever Possible

Wherever possible, quote your own data or other authorities (results of surveys and so on; include the original URLs so readers can verify them for themselves).

Here’s a good trick, though: if you have knowledge but there is no actual study or numerical data you can easily point to…you can say “Acme’s extensive field experience with widgets has shown that…” After all, who can argue with extensive field experience?

Why It Has To Be Such a Long White Paper

Because if it were three pages, there would be very little to re-purpose, of course! Also, contrary to popular belief, David Ogilvy was right when he said “Long copy sells.”

Also, if someone downloads a white paper and it’s just a few pages, there is a bit of a feeling of being shortchanged. You should make sure your white papers are substantial, meaty and chock-full of solid facts, thoughts, diagrams, and tables – readers will devour it, and it’s all great material for subsequent marketing efforts.

The Real Work Starts When the White Paper Is Done

Now that you’ve created the perfect warhead for your marketing attack, you need to deliver it many different ways — both by delivering it as is, and by re-purposing it.

“Delivery” activities include:

1. Putting it on your website for download under “White Papers.” Be sure to require users to give minimal information – name, email and company — so you can capture them as a lead without discouraging them from downloading.

2. Using it as a Paid Search call-to-action (“Download Free White Paper Now!”).

3. Announcing the White Paper’s availability in a press release.

4. Announcing it on your blog.

5. Cannibalizing sections of it for individual blog postings.

6. Turning the diagram(s) and table(s) into a presentation.

7. Having thought leaders deliver the presentation at conferences.

8. Publishing the transcribed conference presentation on your blog.

9. Discussing the white paper during your regularly scheduled podcast (more below on this).

10. Publishing the transcribed podcast on your blog.

11. Having thought leaders deliver the presentation as a webinar.

12. Publishing the webinar as a video.

13. Publishing the transcribed webinar on your blog.

14. Re-purposing many of the above incarnations in your email newsletter.

That is a lot of mileage for one marketing piece! This is why spending a lot of time making a *great* white paper can pay off in spades. If you did everything listed above, and let’s say you got 5 blog postings out of the material in the white paper… that’s at least *thirteen* blog postings, by my count. If you were to crank out one very high-quality white paper every two months, and then did all of these follow-on activities, you’d be way ahead of the game — much further ahead than if you tried to hound three or four thought leaders and got an occasional short article out of them.

How This Strategy Leverages the Three Elements

This strategy *appeals to the ego.* By providing a thought leader with the perfect raw material, and excuse, to do presentations at a conference, and a webinar, you’re helping them succeed. What thought leader doesn’t like speaking at a conference? It’s a huge ego boost.

This strategy *re-purposes* content as text, audio, video, and presentations. Notably, much of it ends up being reworded/paraphrased, just in the natural course of things – which is great from an SEO perspective. When a thought leader presents the concepts at a conference, he or she will naturally use their own way of describing them; the podcasters, when discussing the white paper topic, will have their own.

Finally, *transcription* is the secret weapon of this strategy. When you can’t get people to write for you, you often can get them to *talk* for you. You’d be shocked at how much content you can create through transcription of talks, podcasts, and webinars. Try Speechpad at $1/minute. If you have historical content locked away in videos and audio recordings, it could be the cheapest way to produce high-quality content at your command.

Google Update – Google Confirms Quality Update

This week, several sources began discussing what appeared to be a significant update to Google’s algorithm, although the search giant hadn’t made an announcement about the change. Now, Google has confirmed that it made an update to its core search algorithm, and that the latest changes are not related to the Panda algorithm update that is expected shortly.

Multiple reports indicate that although Google updates its algorithm often, the June 17 update was more significant than most, though Google isn’t providing any details and the specifics of the change aren’t yet clear. Other reports suggest this week’s update may instead have been similar in scope to the near-constant updates Google makes to its algorithm.

The search giant said to expect additional core search algorithm updates in the future as they continue to work on improving search quality.

“This is not a Panda update,” wrote Google in a statement to Search Engine Land. “As you know, we’re always making improvements to our search algorithms and the Web is constantly evolving. We’re going to continue to work on improvements across the board.”

Google’s Gary Illyes wrote on Twitter that he “can’t comment more on this though, we make hundreds of changes every year, you know the rest.”

One interesting aspect of what has been reported is that many automated tracking tools, such as Mozcast, showed massive spikes this week in terms of changes happening in the Google search results. Initially, Dr. Pete Meyers from the Moz team thought the HTTPS algorithm might have been updated and given more weight, but Illyes said on Twitter that this was not the case. One possible reason so many tools showed a spike could be Wikipedia, frequently the number one Google search result, switching all of its URLs to HTTPS this week. That changed a large share of Google search results, which could cause the tools to spike.

Overall, it seems that what is known is that this was not Panda, not HTTPS and also not Penguin. It may instead simply be a normal Google core search update that the company isn’t providing any details on. We’ll be analyzing the situation further as more information becomes available.

It’s all about the quality (as usual!)

In this past update Google changed how it calculates “quality” content. While Google of course won’t disclose the calculation or parameters, it could be a shift in the relative weights of existing parameters. At the end of the day, it’s just another step toward Google’s goal of making life better for the user, sometimes even at the expense of those using Google to generate traffic; companies that repeatedly get hit hard by these updates are more prone to reallocating budgets to other traffic-driving methods.

But it’s important to remember that for Google, it really is all about the user experience. Panda, Penguin, ‘Mobilegeddon,’ PageSpeed, Caffeine, the Knowledge Graph. All of these updates work toward that aim.

How Much Are the Search Results Worth to Google?

Not once but twice, Google has accidentally displayed values on its search results publicly. Unfortunately, both times the values were encoded, and we can’t see that data on an ongoing basis.

Unfortunately, the only data points Google shares with marketers are:

  • the price they charged you for a click
  • how many clicks they sold you
  • any conversion data you decide to share with them
  • rough estimates of click value via their Traffic Estimator tool

You only get the first three after you are charged for the traffic, and the fourth offers rough estimates which are likely to be wildly off unless you understand the various ad matching types and normalize the data.

Putting the Value of SEO in Context

As large as Google’s revenues are, more searchers click on the free/organic search results than on the paid ads. With that in mind, I thought it would be fun and rewarding to create a guide to estimating the value of ranking for particular keywords in major search engines, for three reasons:

  1. to raise awareness of the importance and value of SEO
  2. to help motivate businesses and individuals to optimize their web presence
  3. to encourage top ranked websites to improve the quality of their sites in an attempt to protect and improve upon their current rankings (and thus improve the quality of the web as a whole)

Appreciating Google’s Market Domination

U.S. Search Market

As shown in this image from Search Engine Land, most internet traffic rating services show Google to have 60% to 70% of the U.S. search market.

(CP = Compete.com, CS = Comscore, HW = Hitwise, & NR = Nielsen//NetRatings)

Given the above data, it is no surprise that advertiser spend at large SEM companies is aggressively focused on Google. Efficient Frontier’s Q1 2008 data [PDF] shows that Google enjoys 77.2% of their clients’ search ad spend.

Google Under-monetizes to Ensure Long Term Growth

Google is less comprehensive with ad coverage than competing services. When Google chooses not to display ads, it is not maximizing short-term revenue. Comscore research shows that Google is selective about when it displays ads, only showing them when relevancy is strong.

Holding advertisers to higher minimum bids and higher relevancy standards shows that Google is intentionally foregoing short-term revenue to ensure long-term growth, which makes its current market domination even more impressive.

International Market Domination

In many international markets, that domination is even fiercer. In the UK, Google not only has nearly 90% of the search market but, according to Hitwise, controls a full 36.55% of the traffic going to UK websites. And, due to how search works, that 36.55% of traffic is generally more targeted and more valuable than the remaining pool of traffic.

The 2007 Global Search Report shows that Google dominates most European and Asian countries. Google leads the market in most countries in the report – with the exceptions of China, Czech Republic, Estonia, Japan, Russia, and South Korea.

Step 1: Establishing a Baseline Keyword Value

Given a fairly constant ranking position and traffic stream you should be able to estimate visitor value AND how much additional value would be created by improving your rankings. If you are not yet tracking your traffic and conversion trends you have a few options:

  1. Use analytics tools to start tracking your traffic and conversions.
    1. If you have a newer small site you may rank well in Microsoft before you rank well in Google, since Google typically takes longer to rank in when you are starting from scratch.
    2. You can look at search marketshare numbers (from companies like Compete.com, Comscore, Hitwise, and Nielsen Netrating – reported monthly on Search Engine Land) to roughly estimate Google click volume based on how many clickthroughs you get from your Microsoft ranking.
  2. Set up an AdWords account, bid aggressively on appropriate keywords to ensure your ad gets strong distribution, and determine the value of a click based on tracking…
    1. your volume of traffic
    2. your conversion rate
    3. the average customer value (based on a metric like direct immediate ROI or lifetime customer value).
    4. Yahoo! offers a free ROI calculator to help you figure out how much you can afford to pay for traffic, and estimate your potential monthly profit or loss based on (search volume * cost per click * conversion rate * profit per conversion).
  3. Estimate the value of traffic based on tools offered from search engines, including the Google Traffic Estimator and Microsoft Ad Intelligence.
    1. The Google Traffic Estimator can be particularly rough because there are many variables that go into its calculations (including: bid price, ad CTR, quality score, etc.)
    2. This data can be refined by comparing new keywords against keywords you have ranked well for months or years.
    3. Microsoft Ad Intelligence offers top category keywords, and category based value estimates, though they have less traffic and a less efficient ad network than Google does.
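The baseline math from option 2 above can be sketched as a quick calculation. This is an illustrative sketch with placeholder numbers, not real campaign data:

```python
# Back-of-the-envelope visitor value for one keyword.
# All numbers are placeholders for illustration.
monthly_clicks = 1_000        # visits from the ranking (or a test AdWords ad)
conversion_rate = 0.02        # 2% of visitors convert
profit_per_conversion = 50.0  # average profit per sale or lead

monthly_profit_potential = monthly_clicks * conversion_rate * profit_per_conversion
value_per_click = monthly_profit_potential / monthly_clicks  # break-even bid ceiling

print(f"monthly profit potential: ${monthly_profit_potential:,.2f}")
print(f"value per click: ${value_per_click:.2f}")
```

The value per click figure is the most you could afford to pay for a visitor before losing money, which is why tracking conversion rate and customer value comes before bidding aggressively.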

Sample Data from Google Traffic Estimator

If you leave the bid price blank while inserting your keywords in the Google Traffic Estimator, they will return click cost estimates and estimated clicks per day that are associated with ranking in the top ad position for 85% of relevant search queries.

Keep in mind that the value of these clicks depends on several major factors:

  • market competition – high volume and high value keywords are more efficient
  • AdWords ad placement – if ads appear above the organic search results and are exceptionally relevant they can get a 10% to 30% clickthrough rate. Ads that appear on the side of the search results are clicked much less often – typically only ~1% of the time.
  • language & country – settings like language and country help you control what market you are researching
  • ad match type – Google offers 3 common ad matching types that control ad distribution, and 2 that help filter your ads from showing

Ads Above the Organic Search Results

Here is a sample of click stats from a campaign spending a couple thousand dollars a month, where this ad is typically the only advertiser for the keywords, and the ads appear above the organic search results. Please note that for the keywords where the average ad position is 1.5 it is for broad match versions of the keyword, where there are competing advertisers for some search queries. Each row of data represents a different keyword.

Sidebar Ads

Here is a sample of data where the ads more commonly appear on the sidebar, though for a couple of the leading search queries in the group (those showing a 3% to 6% clickthrough rate) the ad sometimes appears on the top of the search results.

Accuracast Data

Accuracast shared their data on the CTR by ad position for over 1 million ad clicks.

Understanding Google AdWords (& Google Traffic Estimator) Keyword Match Types

  • [exact match] – will show values for that specific keyword
  • “phrase match” – will show values for searches containing that keyword (also includes exact match values)
  • broad match – will show values for additional related keywords (includes exact match & phrase match values)
  • negative match – this is used inside AdWords to prevent your ads from showing up. Google also has a match type called embedded match that less than 1% of advertisers probably use, and is beyond the scope of this document.

From my experience, depending on the commercial intent of the keyword, the number of advertisers appearing above the organic search results, and the user intent of a query you can multiply the Google Traffic Estimator numbers by about 4 to 7 to come up with traffic estimates that a #1 organic ranking would garner.
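As a sketch of that rule of thumb (the click figure below is a made-up example; the 4x to 7x range is the estimate from the paragraph above):

```python
# Hypothetical Google Traffic Estimator figure for the top ad position.
estimator_clicks_per_day = 120

# The 4x to 7x multiplier range suggested above for a #1 organic ranking.
low_estimate = estimator_clicks_per_day * 4
high_estimate = estimator_clicks_per_day * 7

print(f"#1 organic ranking estimate: {low_estimate} to {high_estimate} clicks/day")
```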

By default our keyword tool offers a convenient export option that generates exact match, phrase match, and broad match versions of keywords. We also offer a free keyword wrapping tool that wraps your keyword lists in the various match type formats.

Sample Data from Microsoft Ad Intelligence

Microsoft Ad Intelligence offers a lot of free data. Here is a sample of top health insurance category keywords.

Here are Automotive Category KPIs – notice how the average cost per click is highest in the high margin trucks & SUVs category.

Step 2: Typical Click Distribution Profile

A friend recently said “whether we’re 15 or 150 doesn’t make much of a difference.” Indeed, search clicks are heavily concentrated on the top portion of the first page of search results, and this trend toward traffic consolidation has accelerated over time.

AOL’s Leaked Search Data

In August of 2006 AOL leaked millions of search records. Some SEOs scoured through this data to look at click data by ranking. A comment on Jim Boykin’s blog reveals the percent of clicks for each position for 9,038,794 searches and 4,926,623 clicks. Donna Fontenot shared the relative click volume of lower ranked results relative to the top ranked site.

Overall Percent of Clicks & Relative Click Volume

Position  % of Clicks  Clicks     vs. #1
1         42.13%       2,075,765  –
2         11.90%       586,100    3.5x less
3         8.50%        418,643    4.9x less
4         6.06%        298,532    6.9x less
5         4.92%        242,169    8.5x less
6         4.05%        199,541    10.4x less
7         3.41%        168,080    12.3x less
8         3.01%        148,489    14.0x less
9         2.85%        140,356    14.8x less
10        2.99%        147,551    14.1x less

1st page totals: 89.82%, 4,425,226 clicks
2nd page totals: 10.18%, 501,397 clicks

Based on this data, if you are ranking 8, 9, or 10 you may be able to increase your traffic for that keyword 1,400% by ranking #1. Even jumping from #8 to #3 can triple your traffic.
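Using the AOL percentages above, a small sketch can estimate the traffic multiple of moving between ranking positions:

```python
# Percent of clicks by ranking position, from the leaked AOL data above.
aol_ctr = {1: 42.13, 2: 11.90, 3: 8.50, 4: 6.06, 5: 4.92,
           6: 4.05, 7: 3.41, 8: 3.01, 9: 2.85, 10: 2.99}

def traffic_multiplier(current_pos, target_pos):
    """Estimated traffic multiple from moving from one position to another."""
    return aol_ctr[target_pos] / aol_ctr[current_pos]

print(round(traffic_multiplier(10, 1), 1))  # ~14.1x, the 1,400% jump noted above
print(round(traffic_multiplier(8, 3), 1))   # ~2.8x, roughly triple
```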

Eye Tracking Research

Gord Hotchkiss put together 2 eye tracking studies showing how searchers interact with search results.

Notice the following

  • how tightly Google keeps attention focused on the upper left corner of the search results
  • how attention drops off the further down you go on the page
  • ads in the right column generally do not get much attention

More Organic CTR Research

Here is an image from SEO Researcher highlighting the results of a much smaller eye tracking study by Cornell from 2004.

In May, 2010, ad network Chitika shared traffic stats coming into their network by Google ranking. It shows a similar profile to the leaked AOL data.

In December, 2010, Optify did a study of search traffic by ranking. It shows a similar profile to the leaked AOL data. In addition to showing the typical clickthrough rate data that other studies have shown, they also highlighted how keyword popularity and click price impact organic search clickthrough rates. For expensive keywords their data suggest a majority of the clicks go to AdWords ads, whereas on cheap keywords almost all the clicks go to the organic search results. On a per-keyword basis things like user intent & relevancy signals can further impact search click distribution.

Slingshot SEO also created a SERP CTR curve.

AWR launched an interactive organic CTR curve using data from Google Webmaster Tools. It includes features for comparing clickthrough rates across keywords based on: search intent, category, the number of words in the search, branded vs unbranded queries, and variations in the number of ads on the result page. NetBooster offers a CTR template in Excel which allows webmasters to analyze data exports from Google WMT to create their own CTR curve.

The thing to be aware of with the CTR curves is that they are generally illustrative, but they may not be precise for your website & your keywords. (If they were precise across the board then clearly more of the studies would line up exactly against each other in aggregate.)

Here are some examples of how keywords may be different. Some keywords:

  • have 3 AdWords ads above the organic results
  • have an additional Google Advisor ad slot
  • have irrelevant ads and/or organic search results, which cause users to look deeper through the result set
  • are brands, which have many people clicking on the official website listing (some brands are given a larger listing with multiple sitelinks in it)
  • have graphical product ads in the search results
  • are localized to rank local sites higher in some areas
  • have results from Google verticals: Youtube videos, Google News, Google images, Google Product Search, Google Squared, flight data, etc.
  • appear differently on different search engines, with not only different rankings, but also different ads & promotional pieces in and around the search results content area

Since every keyword is different, it is valuable to consider the CTR curves as generally illustrative, rather than a precise measure that is true across the board. If you want more granular data for your website, Google’s Webmaster Tools provides aggregate CTR data on a per-keyword basis & CTR data based on ranking position.

Organic Search Result Click Trends

A large part of the reason that the top listing gets so many clicks is because many searches are navigational / brand focused. Hitwise research shows over 80% of brand searches end up at the brand owners’ website.

But even beyond that brand bias, there is a significant advantage to ranking #1 for other search queries. Jakob Nielsen wrote a 2005 column titled The Power of Defaults, in which he cited research showing…

  • 42% of users clicked the top search result while 8% clicked the second listing.
  • When the top two listings were swapped 34% of searchers still clicked the top listing and 12% clicked the second listing.

Since that point in time the power of defaults has only increased. Over the years the general trend is more searchers clicking listings on the first page with fewer clicks on the second page or subsequent search results. This 2008 iProspect report reveals how drastic this trend has been:

Year 1st page top 3 pages > 3 pages
2002 48% 81% 19%
2004 60% 87% 13%
2006 62% 90% 10%
2008 68% 92% 8%

From my experience those iProspect search habits skew a bit low, and a May 2008 blog post by Enquisite shows nearly 90% of searchers click on the first page of search results. Their data also includes image results, so the consolidation of traffic on the first page of text search results might be even greater than represented in their overall data sample.

As search grows more sophisticated, the web offers more quality signals, search companies collect more usage data, and we grow more accustomed to search, the trend of clicking high on the first page is likely to accelerate.

Pay Per Click Ad CTR Research

In 2004, Atlas DMT shared data about how ad rank affects the clickthrough rate of their clients’ Google AdWords and Overture ads.

Since 2004 all major PPC networks have become more aggressive at implementing quality scores. Throughout 2007 Google tightened quality scores to show fewer ads across fewer search queries, which has had the effect of giving greater weight to the top-ranked ads and reducing the opportunity to buy cheap ads where relevancy is limited.

Some of the other search engines show ads across a much larger percentage of their overall search queries, as shown in this chart from Search Engine Land.

Comparing Organic Search Clicks to PPC Clicks

Conversion Rates Are Roughly Equal

A 2004 MarketingSherpa survey of 3,007 marketers highlighted that paid search works slightly better for B2B oriented offers, while organic search works better for B2C offers.

Survey results revealed that much depends on what your target demographic is and what your conversion action is. So, b-to-b marketers seeking lead generation wound up with 7.6% conversions from paid search versus 6.7% conversions from organic clicks. On the other hand, b-to-c ecommerce sites with an average sale of $51-100 converted 4.8% of paid search clicks to buyers versus 6.5% of organic clicks.

A 2006 WebSideStory study gave paid search a slight edge on B2C ecommerce sites, with paid search converting at 3.4% versus 3.13% for organic search.

Organic Search Gets More Click Volume

A 2004 iProspect study [PDF] showed the clickthrough rates at the major engines for organic vs paid search results.

Please note that a large part of that data skew for Microsoft was because they used LookSmart, which was far more aggressive with ad placement than current search engines are.

  • In 2004 at the New York Search Engine Strategies conference a JupiterMedia analyst stated that 5 out of 6 commercial purchases which originated from search originated from the organic search results. They also stated “algorithmic listings in search indexes generate an estimated six in seven commercially natured search referrals.”
  • In early 2008 Google’s Avinash Kaushik stated that 14% of Google clicks are on paid search ads and 86% of clicks are on organic search results.
  • 2008 Penn State research titled Determining the informational, navigational and transactional intent of Web queries [PDF] found that roughly 80% of search queries were informational, while approximately 10% were each navigational and transactional. With so many searches being informational and navigational, it is unsurprising that people click the organic search results more often than the associated PPC ads.

Using Competitive Research Tools to Estimate Click Distribution

Compete.com Search Analytics allows you to view the top 5 destinations for the exact match version of a keyword for free, and offers further data if you subscribe to their paid service.

For the keyword credit cards you can search Google to see what the search results look like, and look at Compete.com’s keyword destination results. Notice how Compete.com shows the keyword share that each site gets…that is the percentage of the search traffic from that keyword that they are capturing. This is exceptionally useful if you already rank in the search results and want to see how much lift a better ranking might bring.

* Due to their limited sample size this strategy is only effective for high volume keywords. I believe Compete.com data is currently focused on United States based searchers, though they were recently purchased and are looking to expand their offering.

If search results have changed over time, that may also throw off the ability to pull meaningful click distribution data from the current search rankings, but many of the highest value keywords are fairly stable, largely because many are based on large volumes of link equity.

Wikipedia Shares Their Stats for Free!

Google ranks Wikipedia content across a wide array of keywords. This website allows you to see how many pageviews each Wikipedia page gets and lists the top 1,000 pages; here is a random sample of that data.

Many of these pageviews are driven by Google organic search rankings. Tie this sample data in with the leaked AOL search data referenced above, and for any Wikipedia page that ranks well in Google you should have a good idea of the potential traffic in that space.
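As a hedged sketch of that approach — dividing a page’s pageviews by the AOL clickthrough share for its ranking position — with purely illustrative numbers:

```python
# Illustrative only: a Wikipedia page ranking #3 with 30,000 monthly
# pageviews, combined with the AOL click share for position 3.
aol_click_share = {1: 0.4213, 2: 0.1190, 3: 0.0850}

wikipedia_pageviews = 30_000
ranking_position = 3

estimated_monthly_searches = wikipedia_pageviews / aol_click_share[ranking_position]
print(f"~{estimated_monthly_searches:,.0f} monthly searches for that keyword basket")
```

This assumes most of the page’s traffic comes from Google organic search, so treat the output as an upper-bound guess rather than a measurement.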

Because of its encyclopedic nature, each Wikipedia page can rank for a broad basket of related keywords rather than ranking for just a single keyword – so you can think of the Wikipedia stats as being similar to the traffic totals for a group of related broad matched keywords.

Step 3: Factors That Modify Click Distribution

A variety of factors must be taken into account when estimating overall search volume distribution. While this list is not exhaustive, it contains many of the common factors worth considering. Whenever possible we also offer tips for how to overcome these data biases.

  • brand – if people are looking for a specific brand or intend to go to another location it is hard to outrank the core brand and hard to be perceived as being more relevant or clickworthy
  • user intent & information scent – some users research while others are looking to buy.
  • paid search ads – if paid ads appear above the organic search results they drive down the organic listings
  • sitelinks, subdomains, & multiple listings – Google gives some top ranked sites up to 8 sublinks beneath their listing, which drives down competing listings. For brand related queries it is also common for subdomains to show up.
  • related search suggestions – some search results recommend related search queries. This can play a particularly large role in important verticals like health.
  • clickworthiness – some listings are easier and more appealing for searchers to click on
  • seasonal trends – some searches are particularly seasonal
  • universal search / vertical search – some search results feature house content, selected editorial partner content and/or content from vertical databases
  • level of competition – if the opportunity cost is too great then it might not be worth the effort to try to rank
  • geography – known local sites tend to get a boost in local search results.
  • personalization – if people frequently visit your website then Google may promote your pages in subsequent search results.
  • keyword tail – in some categories the core keyword has most of the search volume, while traffic is more spread out in other areas. This factor will be covered more in step 4.


Brand

If someone is looking for a specific brand it is hard to outrank the official site or appear more relevant for core brand related search queries, plus people end up visiting the destination brand site on most navigational searches.


  • An easy way to compete for brand related queries is to rank for reviews, coupons, specific part numbers, accessories, or other related keyword terms that do not force you to compete directly with a company like Dell Computers for the keyword Dell.
  • If you are a brand that sells directly consider setting up a policy that prevents affiliates from outbidding you and driving up your costs on your own brand related keywords.
  • In some cases it may make sense to bid on your brand related keywords, especially if your brand is generically descriptive in nature (like CreditCards.com).
  • If you own a branded web property it helps to create at least one strong subdomain to help drive down any negative publicity and/or competing ads that may arise down the road.

User Intent

Most people who search are looking to research information rather than to buy an item, and yet most pay per click ads aim to sell items.


  • If a person searches for a research oriented keyword and your listing uses words like compare and reviews in it then the searcher will find your listing more relevant to their needs and be more likely to click on your listing.
  • Microsoft offers an online commercial intention tool, which offers insights into search intent.

Paid Search Ads

If paid search ads appear above the organic search results they drive down the organic search results and take away many of the potential clicks. Google is aggressively focused on keeping attention focused on the upper left corner of the search results, and will only display ads on top of the search results if they have high perceived relevancy driven by a high CTR.

Tip: We typically scan search results in groups of 3 or 4. If 3 paid search ads appear above the search results for a keyword you really need to be in the top 3 organic search results to get much traffic.

Sitelinks, Subdomains, & Multiple Listings

When Google believes a query has a good chance of being navigational in nature they may place a list of sitelinks under the first listing. Sitelinks can really drive down competing listings.


  • If you have sitelinks for a highly valuable commercial keyword it may also make sense to bid on the related AdWords ads to further reinforce that you are the default market leader in that space.
  • You are more likely to get sitelinks if your domain name exactly matches the search words.
  • If you resell a brand which has sitelinks and subdomains listed for the core term make sure you target some related keyword tail phrases rather than trying to rank for the core word. Unless your site is exceptionally authoritative or your market is not competitive, for the core keyword you might be better off buying AdWords ads.
  • For moderately competitive keywords authoritative sites may be able to get a double listing which may double the probability of a searcher clicking on your site since you have 2 listings in the search results and your site is set apart because one of those listings is indented.

Related Search Suggestions

All major global search engines offer suggested related queries in the search results for some search queries. Sometimes they appear at the top of the search results and sometimes they appear at the bottom.

Tip: If core industry related words are too competitive, you can rank for related keywords that search engines recommend. That allows you to tap the traffic flow of the competitive keywords without requiring the authority needed to rank for them.


Clickworthiness

Many people target keywords without understanding search intent. If your listing is more relevant than competing services then people are more likely to click on it.


  • Search for some of your most important keywords that you rank for and look at how your listing compares to competing listings. Could you make your presentation more unique or relevant?
  • When creating new pages on important topics consider the search presentation when crafting your page title, meta description, and filenames.

Seasonal Search Trends

Some keywords grow popular around news items or particular dates.


  • Subscribe to related news publications and blogs to track conversation in your marketplace.
  • Use tools like Google Trends to evaluate when search volume picks up for important keywords.
  • Promote important seasonal time-sensitive offers at least one month before they become popular so that when people search for them your site already ranks.
  • When using paid search set important seasonal keywords in their own ad groups or ad campaigns and monitor and manage them closely when search volume starts picking up.

Universal Search / Vertical Search

Google owns many vertical properties and sometimes integrates results from editorial partners (in the news vertical) or house content (Google Local, images, YouTube videos, etc.) in their search results. In April of 2008 iProspect released a Blended Search Results Study [PDF].


  • If Google has placed vertical search results in their organic results, or you think there is a good chance they may, then optimize your presence in those verticals.
  • Ensure your site has editorial, tools, brand, and/or some other value add which still allows you to remain a destination as Google pulls more data and value directly into their search results.

Level of Competition

Trying to rank for a competitive keyword in 2008 with a new site may not be feasible unless you have significant capital, significant social currency, and/or great ideas.


  • If you are on a brand new website ensure you are using appropriate keyword modifiers and are aiming at some long tail keywords your site has the ability to rank for.
  • Download SEO for Firefox and watch the associated how to video to understand how to evaluate the competitive landscape of a keyword.
  • When you see weak competitors ranking surprisingly well, use competitive research tools to see what else they are ranking for.


Geography

People in different locations see different search results.


  • If your target market is foreign, consider doing at least one of the following:
    • registering a domain name matching that country
    • hosting your site in that country
    • registering your site as being associated with that country inside Google Webmaster Tools (this can be done at the domain, subdomain, or folder level)
  • Use this Google Global Firefox extension to quickly look up your rankings.
  • If you want to track where you rank over time our Firefox Rank Checker is also worth downloading.


Personalization

People who have seen your site before are more likely to see it ranked well in subsequent search results.


  • Participate in your community. Make sure you know your market well enough to use the language that they do, so people come across your site early in the buy cycle.
  • Create the type of content that people want to cite.
  • Create lots of content that makes your site relevant for a wide array of keywords.
  • Use a clean and effective site design and show social proof of value so your site is easy to trust and subscribe to.
  • Build interactivity and community into your website so people keep coming back to your site.
  • Advertise aggressively if it makes sense with your current business model. If you sell something you may further extend your reach via the use of affiliate programs.

Step 4: Tapping the Keyword Tail

The Tail is Long

People search for everything under the sun. When the leaked AOL search data was manually classified into 20 different categories, the category with the leading volume was “other.”

That “other” category is bigger than most people appreciate.

  • In a 2004 presentation on Challenges in Running a Commercial Web Search Engine [PDF] Google’s Amit Singhal mentioned that out of over 200 million unique daily search queries seen by Google over 100 million are unique.
  • On May 16, 2007, at the Searchology event, Google’s Udi Manber stated that 20 to 25% of the queries that Google sees in any given day are queries that they have never seen before. Matt Cutts later clarified this data point, saying that it is accurate if you look back through the most recent month’s queries.

While earlier research showed more focus on head keywords, in January of 2007 OneStat shared search volume breakdown by number of keywords in the search query for 2 million website visitors. From their data 4 word search queries were more popular than 1 word queries.

In early 2008 Google’s Avinash Kaushik stated that the average Google query consisted of 4 words.

Competitive Research

There are many competitive research tools on the market which show how many keywords a website ranks for. Compete.com Search Analytics also estimates what percent of a site’s traffic is driven by each keyword it ranks for.

  • By comparing the number of keywords you rank for compared to competing sites you can get a good idea how long your keyword tail is compared to their tail.
  • By looking at the percentage breakdowns of their top keywords you can find important keywords you forgot to target.
  • It may also be helpful to look at competitive research stats for slightly broader keywords and slightly broader websites to look for trends amongst them.

Comparing Search Volumes for Keyword Match Types

Google Traffic Estimator and Microsoft Ad Intelligence allow you to compare the size of the keyword tail to a core keyword. Looking at these ratios and how competitive the search results are for the related keywords can help you determine where to focus your energies.


  • If head keywords [exact match] have a lot of search volume try to buy a domain name that matches those keywords.
  • If keywords have a long tail, create content focused on the tail, especially if few competitors are targeting it.

Compete.com Search Analytics shows keyword distribution estimates for both exact match and broad match versions of keywords. If a site appears low for the exact match version of a keyword but high for the broad match version then they are doing a good job of capturing the keyword tail.
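That exact-vs-broad comparison can be sketched as a simple ratio; the share figures below are hypothetical, not real Compete.com data:

```python
# Hypothetical Compete.com-style traffic shares for one site.
exact_match_share = 0.02  # share of exact match keyword traffic captured
broad_match_share = 0.08  # share of broad match (tail-inclusive) traffic captured

tail_ratio = broad_match_share / exact_match_share
print(f"broad vs exact capture: {tail_ratio:.1f}x")
if tail_ratio > 1:
    print("the site is capturing the keyword tail well")
```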

Search Suggestion Tools

Search engines like Google and Yahoo! help auto-complete search queries via services like Google Suggest and Yahoo! Search Suggest. They generally place these recommended keywords in order of popularity / search volume.

Some search results also offer search refinement options that help drive searchers to related search queries.

The Importance of Keyword Modifiers

Many keywords with a long tail share a number of common modifiers. By working these modifiers into your content you can rank for a much broader basket of keywords. Here is an XLS spreadsheet of common keyword modifiers and categories.
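A sketch of how modifiers multiply a keyword list (the core terms and modifiers below are just examples):

```python
from itertools import product

# Example core keywords and common modifiers (illustrative words only).
core_keywords = ["credit cards", "health insurance"]
modifiers = ["cheap", "best", "compare", "reviews"]

expanded = [f"{modifier} {core}" for modifier, core in product(modifiers, core_keywords)]
print(len(expanded))  # 2 cores x 4 modifiers = 8 keyword variations
```

Each additional modifier multiplies the size of the list, which is why modifier-rich content can rank for such a broad basket of queries.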

User Generated Content

Some of the highest value web publishing networks derive much of their value from encouraging readers to contribute content to their websites. Dozens of product reviews or comments added to a page offer a lot of unique content, which helps the page rank for a much wider net of related keywords. Here are stats showing a single blog post that ranked for hundreds of unique keywords.