In late 2011, Google announced an effort to make search behavior more secure. Logged-in users were switched from http to httpS. This encrypted their search queries from prying eyes, and kept the queries from being passed on to the websites users visit after seeing search results.
This led to the problem we Marketers, SEOs, and Analysts fondly refer to as not provided.
Following revelations of NSA activities via Mr. Snowden, Google has now switched almost all users to secure search, resulting in even more user search queries showing up as not provided in all web analytics tools.
Yahoo! has recently announced switching to httpS as standard for all mail users, indicating secure search might follow next. That of course will mean more referring keyword data will disappear.
At the moment it is not clear whether Bing, Baidu, Yandex and others will move to similarly protect users’ search privacy; if and when they do, the result will be loss of even more keyword-level user behavior data.
Initially, I was a little conflicted about the whole not provided affair.
As an analyst, I was upset that this change would hurt my ability to analyze the effectiveness of my beloved search engine optimization (SEO) efforts – which are really all about finding the right users using optimal content strategies.
But it is difficult to not look at the wider picture. Repressive (and some not-overtly repressive) regimes around the world aggressively monitor user search behavior (and more). This can place many of our peer citizens in grave danger. As a citizen of the world, I was happy that Google and Yahoo! want to protect user privacy.
I'm a lot less conflicted now. I've gone through the five stages of the Kübler-Ross model. Besides, I've also come to realize that there is a lot I can still do!
In this post I want to share four angles on secure search:
1. Implications of Secure Search Decision.
2. What Is Not Going Away. #silverlinings
3. Alternatives For Keyword Data Analysis.
4. Possible Future Solutions.
While not provided is not an optimal scenario, you'll see that things are not as bad as initial impressions might indicate. Yes, there are new challenges, but we also have some alternative solutions, and the SEO industry is not done innovating. Ready?
1. Implications of Secure Search Decision.
No keyword data in analytics tools.
We are headed towards having zero referring keywords from Google and, perhaps, other search engines.
This impacts all digital analytics tools, regardless of which company makes them and whether they use JavaScript or log files or magic beans to collect data.
Depending on the mobile device and browser you are using (for example, Safari since iOS 6), you have already been using secure search for a while regardless of the search engine you use. So that data has been missing for some time.
There are a number of "hacks" out there promising to get you close-enough keyword data, or to marry not provided with some of the remaining data and landing pages. These are well meaning, but almost always yield zero value or, worse, drive you in a sub-optimal direction. Please be careful if you choose to use them.
No keyword data in competitive intelligence/SEO tools.
Perhaps you (like me) use competitive intelligence or SEO tools to monitor keyword performance. For example, for L'Oreal:
Secure Search will also impact data in these tools. It will be increasingly distorted because it will reflect only traffic from the small audience of visitors who are not yet using secure search, or who are using other non-secure search engines, or the type of people who allow their behavior to be 100% monitored – including SSL/httpS. Sample and sampling bias.
You can read this post to learn how these tools collect data: The Definitive Guide To (8) Competitive Intelligence Data Sources.
I really loved having this data. It was such a great way to see what competitors were doing or where I was beating them on paid or organic or brand or category terms. Sadly, it does not matter which tool you use. These tools will only show you a more distorted view of reality. Please be very careful about what you do with keyword data from these tools (though they provide a lot of other data, all of which remains of the same quality as in the past).
[sidebar]
These changes impact my AdWords spend sub-optimally. A lot of the keywords I used to add to my campaigns came from the long, long tail I saw in my organic search data (I would take the best performers there and use PPC to get more traffic) and from competitive intelligence research. With both of these sources gone, my AdWords spend may take a dive because I can't find these surprising keywords — even using the tools you'll see me mention below! How is this in Google's interest?
[/sidebar]
No keyword-level conversion analysis.
We have a lot of wonderful detailed data at a keyword level when we log into SiteCatalyst or WebTrends or Google Analytics. Bounce Rates, % New Visits, Visit Duration, Goal Conversions, Average Order Value.
All this data will no longer be available for organic search keywords.
As hinted above, our ability to understand the long tail — often as much as 80% of search traffic — will be curtailed. We can guess our brand terms and product keywords, but the wonderful harvest of category-type, and beyond, keywords is gone.
Current keyword data is only temporarily helpful.
Remember: On a daily basis 15% of the queries on Google have never been seen before by the search engine. Daily! For all 15 years of Google's existence!
That is one reason the data we have for the last year or so, even as not provided ramped up, might only be temporarily helpful in our analysis.
Another important reason historical data becomes stale pretty quickly is that any nominally functioning business will have new products, new content, new business priorities, and all that impacts your search strategy.
Finally, with every change in the search engine interface the way people use search changes. This in turn mandates new SEO (and PPC) strategies, if we don't want to fail.
So, use the data you have today for a little while to guesstimate your SEO performance or optimize your website. But know that the view you have will become stale and provide a distorted view of reality pretty soon.
2. What Is Not Going Away. #silverlinings
While we are losing our ability to do detailed keyword analysis, we are retaining our ability to do strategic analysis. Search engine optimization continues to be important, and we can still get a macro understanding of performance and identify potentially valuable keywords.
Aggregated search engine level analysis.
The Multi-Channel Funnels folder in Google Analytics contains the Top Conversion Paths report. At the highest level, looking across visits by focusing on unique people, the report shows the role search plays in driving conversions.
You can see how frequently it is the starting point for a later conversion, you can see how frequently it is in the middle, and you can see how frequently it is the last click.
I like starting with this report because it allows us to have a smarter beyond-the-last-click discussion and answer these questions: What is the complete role of Search in the conversion process? How does paid search interplay with organic search?
From there, jump to my personal favorite report in MCF, Assisted Conversions.
We can now look at organic and paid search differently, and we are able to see the complete value of both. We can see how often search is the last click prior to conversions, and how often it assists with other conversions.
The reason I love the above view is that for each channel, I'm able to present our management team a simple, yet powerful, understanding of the contribution of our marketing channels – including search.
Selfishly, now we can show the complete value, in dollars and cents, we deliver via SEO.
[Bonus: For more on next steps and attribution modeling please see: Multi-Channel Attribution Modeling: The Good, Bad and Ugly Models.]
If you are interested in only the last-click view of activities (please don't be interested in this!), you can of course look at your normal Channels or All Traffic reports in Google Analytics.
This is a simple custom report I use to look at the aggregated view:
As the report above demonstrates, you can still report on your other metrics, like Unique Visitors, Bounce Rates, Per Visit Value and many others, at an aggregated level. You can see how Google is doing, and you can see how Google Paid and Organic are doing.
So from the perspective of reporting organic search performance to senior management, you are all set. Where we are out of luck is taking things down from here to the keyword level. Yes, there will still be some data in the keyword report, but since not provided is an unknown unknown, you have no idea what that segment represents.
Organic landing pages report.
Search engine optimization is all about pages and the content in those pages.
You can use a custom landing pages report (click that link to download) and apply your organic search segment to that report to get a view that looks like this:
The top landing pages getting traffic from organic search. And of course our Acquisition, Behavior, Outcome metrics.
See Page Value there? Now you also know how much value is delivered when each of these pages is viewed by someone who came from organic search.
So let's say you spent the last few weeks optimizing pages #2, #3 and #5; well, now you can be sad that they are delivering the lowest page value from organic search. Feel sad.
Or, just tell your boss/client: "No, no, no, you misunderstood. I was optimizing page #4!" : )
The custom landing pages report also includes the ability to drill down to the keyword level. Just click on the page you are interested in and you'll see this:
With every passing day this drilldown will become more and more useless. But for now, it is there if you want to see it.
Let me repeat a point. I've noticed some of our peer SEOs making strong recommendations to take action based on the keywords you are able to see beyond not provided. I'm afraid that is a career-limiting move. You have no idea what these words represent – head, mid, tail, something else – or what is in the blank not provided bucket. Be very careful.
Paid search keyword analysis report.
We all of course still have access to keyword level analysis for our paid search spend.
There is one really interesting bit in the paid search reports that you can use for SEO purposes.
When you submit your keywords and bids, the search engine will match them against user search queries. In Google Analytics you have Keyword, in your AdWords report, as above, but if you create a custom report you can drill down from Keyword to Matched Search Query. The latter is what people actually type. So for "chrome notebook," above, if I look at the Matched Search Query I can see all 25 variations the users typed. This is very useful for SEO.
You can download my custom report, it is #2 in this post: Paid Search/AdWords Custom Reports
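If you export that Keyword-to-Matched Search Query drilldown, it is easy to roll the query variants up per bid keyword and mine them for SEO ideas. A minimal sketch, assuming a CSV export with columns named "Keyword", "Matched Search Query" and "Visits" (the file name and headers are assumptions, adjust to whatever your export actually contains):

```python
# A minimal sketch, not an official report: roll up the user query variants
# behind each bid keyword from an exported CSV. Column names are assumptions.
import csv
from collections import defaultdict

variants = defaultdict(list)  # bid keyword -> list of (user query, visits)

with open("matched_search_queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        variants[row["Keyword"]].append(
            (row["Matched Search Query"], int(row["Visits"]))
        )

# For each bid keyword, list every query variant users actually typed,
# biggest traffic first -- raw material for SEO keyword research.
for keyword, queries in variants.items():
    print(keyword)
    for query, visits in sorted(queries, key=lambda q: -q[1]):
        print(f"  {query}: {visits} visits")
```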
Beyond this, be judicious about what inferences you draw from your paid search performance. Some distinguished SEO experts have advocated that you should use the distribution of visits/conversions/profits of your AdWords keywords and use that to make decisions about effectiveness of your SEO efforts. Others are advising you to bid on AdWords and guesstimate various things about SEO performance. Sadly these are also career-limiting moves.
Why?
When you look at your AdWords data, you have no idea which of these four scenarios is true for your business:
And if you don't know which is true — and you really won't with not provided in your way — is it prudent to use your AdWords performance to judge SEO? I would humbly suggest not.
If you want to stress test this, go back to your 2011 (pre-not provided) data for paid and organic and see what you can find. And remember, since then Google has made sixteen trillion changes to how both paid and organic search work, and your business has made at least 25.
Don't assume that your SEO strategy should reflect the prioritizations implied by your AdWords keyword data. The reason SEO worked so well is that you would get traffic you might not have known/guessed/realized you wanted/deserved.
3. Alternatives For Keyword Data Analysis.
With not provided eliminating almost all of our keyword data, initially for some search engines/browsers and likely soon from all, we face challenges in understanding performance. Luckily we can avail ourselves of a couple of alternative, if imperfect/incomplete, options.
Webmaster Tools.
Here are the challenges Google's Webmaster Tools solves: Which search queries does my website show up for, and what does my click-through rate look like?
I know this might sound depressing, but this is the only place you'll see any SEO performance data at a keyword level. Look at the CTR column. If you do lots of good SEO — you work on the page title, url, page excerpt, author image and all that wonderful stuff — this is where you can see whether that work is getting you more clicks. You work harder on SEO, you raise your rankings (remember don't focus on overall page rank, it is quite value-deficient), you'll see higher CTRs.
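To act on that CTR column at scale, you can download the search queries table and flag the queries where you get lots of impressions but few clicks. A minimal sketch, assuming a CSV download with "Query", "Impressions" and "Clicks" columns; the file name, headers and thresholds are assumptions, not GWT's exact export format:

```python
# A minimal sketch: find queries with many impressions but a weak CTR, the
# first candidates for better titles, URLs and snippets. File name, column
# headers and thresholds are assumptions -- adjust to your own export.
import csv

rows = []
with open("gwt_search_queries.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        impressions = int(row["Impressions"])
        clicks = int(row["Clicks"])
        ctr = clicks / impressions if impressions else 0.0
        rows.append((row["Query"], impressions, clicks, ctr))

opportunities = [r for r in rows if r[1] >= 1000 and r[3] < 0.02]
for query, impressions, clicks, ctr in sorted(opportunities, key=lambda r: -r[1]):
    print(f"{query}: {impressions} impressions, {clicks} clicks, CTR {ctr:.1%}")
```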
You will see approximately 2,000 search queries. These are not all the search queries for which your site shows up. (More on this in the bonus section below.)
There are a couple of important things to remember when you use this data.
If you go back in history and do comparative analysis for last year's data when not provided was low, you'll notice that your top 100 keywords in Google Analytics or Site Catalyst are not quite the same as those in WebMaster Tools. They use two completely different sources of data and data processing.
Be aware that even if you sort by Clicks (and always sort by clicks), the order in which these queries appear is not a true indication of their importance (in GA when I could see it, I would see a different top 25 as an example). The numbers are also soft or directional. For example, even with 90% not provided Google Analytics told me I had 500 visits from "avinash kaushik" and not 150 clicks as shown above.
Despite these two caveats, Webmaster Tools should be a key part of your SEO performance analysis.
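If you want to quantify the first caveat for your own site, compare last year's keyword exports from the two systems. A minimal sketch, assuming each export is a CSV sorted by traffic with a "Keyword" (GA) or "Query" (GWT) column; the file names and headers are assumptions:

```python
# A minimal sketch comparing the top 100 keywords in two exports.
# File names and column headers are assumptions -- adjust to your own files.
import csv

def top_terms(path, column, n=100):
    with open(path, newline="", encoding="utf-8") as f:
        reader = csv.DictReader(f)
        return {row[column].strip().lower() for _, row in zip(range(n), reader)}

ga_top = top_terms("ga_keywords_2012.csv", "Keyword")
gwt_top = top_terms("gwt_queries_2012.csv", "Query")

print(f"Overlap in the top 100: {len(ga_top & gwt_top)} keywords")
print("In GA but not in GWT:", sorted(ga_top - gwt_top)[:10])
print("In GWT but not in GA:", sorted(gwt_top - ga_top)[:10])
```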
It is my hope that if this is how search engines are comfortable sharing keyword level data, then over time they will invest resources in this tool to increase the number of keywords and improve the data processing algorithms.
Our good friends at Microsoft also provide Bing Webmaster Tools, and don't forget the excellent Yandex Webmaster Tools. Take keyword performance data from anyone who'll give it to you reliably.
[Bonus]
1. Google's Webmaster Tools only stores your data for 90 days. If you would like to have this data for a longer time period, you can download it as a CSV. Another alternative is to download it automatically using Python. Please see this post for instructions: Automatically download webmaster tools search queries data. (A minimal sketch for archiving those downloads follows this bonus section.)
2. GWT only shows you data for approximately 2,000 queries which returned your site in search results. Hence it only displays a sub-set of your query behavior data. The impact of this is in the top part of the table above, Impressions and Clicks. During this time period my site received 1,800k Impressions in search results, but GWT is only showing data for 140k of those impressions because it is only displaying 2,574 user queries. Ditto for Clicks. If I download all the data for the 2k queries shown in GWT, that will show behavior for just 8,000 of the 50,000 clicks my site received from Google in this time period. Data for 42,000 clicks is not shown because those queries are beyond the 2k limit in GWT.
Update: 3. In his comment below Jeff Smith shares a tip on how to structure your GWT account to possibly expand the dataset and get more information. Please check it out.
Update: 4. Another great tip. Kartik's comment highlights that you can link your GWT account with your AdWords account and get paid and organic click data for the same keyword right inside AdWords. Click through to read a how-to guide and see the available metrics.
[/Bonus]
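Related to bonus item 1 above: a minimal retention sketch (not an official tool) that appends each downloaded CSV, stamped with a date, into a local SQLite table so your history survives past the 90-day window. The file name and the Query/Impressions/Clicks headers are assumptions:

```python
# A minimal retention sketch for bonus item 1: append each GWT download to a
# local SQLite table so the data outlives GWT's 90-day window. The CSV file
# name and the Query/Impressions/Clicks headers are assumptions.
import csv
import sqlite3
from datetime import date

def archive(csv_path, db_path="gwt_history.db", snapshot=None):
    snapshot = snapshot or date.today().isoformat()
    con = sqlite3.connect(db_path)
    con.execute(
        """CREATE TABLE IF NOT EXISTS search_queries
           (snapshot TEXT, query TEXT, impressions INTEGER, clicks INTEGER)"""
    )
    with open(csv_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            con.execute(
                "INSERT INTO search_queries VALUES (?, ?, ?, ?)",
                (snapshot, row["Query"], int(row["Impressions"]), int(row["Clicks"])),
            )
    con.commit()
    con.close()

archive("gwt_search_queries.csv")
```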
Google Keyword Planner.
The challenge Google Keyword Planner solves: What keywords (user search queries) should my search engine optimization program focus on?
In the Keyword Planner you have several options to identify the keywords — the user search queries — most relevant to your website. The simplest way to start is to look for keyword recommendations for a specific landing page.
I choose the "Search for new keyword and ad group ideas" option and, in the landing page field, type in the URL I'm interested in. Just as an example, I'm using the Macy's women's activewear page:
A quick click of the Get Ideas button gives us … the ideas!
I can choose to look at the Ad Group ideas or the Keyword Ideas.
There are several specific applications for this delightful data.
First, it tells me the keywords for which I should be optimizing this specific page. I can go and look at the words I'm focused on, see if I have all the ones recommended by the Keyword Planner, and if not, I can include them for the next round of search engine optimization efforts.
Second, I have some rough sense for how important each word is, as judged by Avg. Monthly Searches. The volume can help me prioritize which keywords to focus on first.
Third, if this is my website (and Macy's is not!), I can also see my Ad Impression Share. Knowing how often my ad shows up for each keyword helps me prioritize my search engine optimization efforts.
It would be difficult to do this analysis for all your website pages. I recommend the top 100 landing pages (check that the 100 include your top product pages and your top brand landing pages — if not, add them to the list).
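To make that page-by-page review less manual, you can diff the Keyword Planner's ideas against the terms a page already targets. A minimal sketch, assuming a Keyword Planner CSV export with "Keyword" and "Avg. Monthly Searches" columns plus a hand-maintained list of current targets (all names below are assumptions):

```python
# A minimal sketch: surface high-volume keyword ideas a page is not yet
# optimized for. The export file, column headers and the hand-maintained
# "already_targeted" list are all assumptions.
import csv

already_targeted = {"women's activewear", "workout clothes"}  # hypothetical

candidates = []
with open("keyword_planner_ideas.csv", newline="", encoding="utf-8") as f:
    for row in csv.DictReader(f):
        keyword = row["Keyword"].strip().lower()
        volume = int(row["Avg. Monthly Searches"].replace(",", "") or 0)
        if keyword not in already_targeted:
            candidates.append((keyword, volume))

# Highest-volume ideas the page is not yet optimized for, first.
for keyword, volume in sorted(candidates, key=lambda c: -c[1])[:25]:
    print(f"{keyword}: ~{volume:,} monthly searches")
```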
With the advent of not provided we lost our ability to know which keywords we should focus on for which page; the Google Keyword Planner helps solve that problem to an extent.
You don't have to do your analysis just by landing pages. If you would like, you can have the tool give you data for specific keywords you are interested in. Beyond landing pages, my other favorite option is to use the Product Category to get data for a whole area of my business.
For example, suppose I'm assisting a non-profit hospital with its analytics and optimization efforts. I'll just drill down to the Health category, then the Health Care Service sub-category and finally the Hospitals & Health Clinics sub-sub-category:
Press the Get Ideas button and — boom! — I have my keywords. In this case, I've further refined the list to focus only on a particular part of the US:
I have the keyword list I need to focus my search engine optimization efforts. Not based on what the Hospital CEO wants or what a random page analysis or your mom suggested, but rather based on what users in our geographic area are actually typing into a search engine!
A quick note of caution: As you play with the Keyword Planner, you'll bump into a graph like this one for your selected keyword or ad group ideas. It shows Google's estimate of how many possible clicks you could get at a particular cost per click.
Other than giving you some sense for traffic, this is not a relevant graph. I include it here just to show you that it is out there and I don't want you to read too much into it.
Google Trends.
The challenge Google Trends tool solves: What related and fastest-rising keywords should I focus on for my SEO program?
Webmaster tools focuses us on clicks and the Keyword Planner helps us with keywords to target by landing pages. Google Trends is valuable because it helps expand our keyword portfolio (top searches) and the keywords under which we should be lighting a fire (rising searches).
Here's an example. I'm running the SEO program for Liberty Mutual, Geico, AAA or State Farm. My most important query is car insurance (surprise!). I can create a report in Google Trends for the query "car insurance" and look at the past 12 months of data for the United States.
The results are really valuable:
I can see which brand shows up at the top (sadly it’s not me, it’s Progressive), I can see the queries people are typing, and I can see the fastest-rising queries and realize I should worry about Safeco and Arbella. I can also see that Liberty Mutual's massive TV blitz is having an impact in increasing brand awareness and Geico seems to be having support problems with so many people looking for its phone number.
I can click on the gear icon at the top right and download a bunch more data, beyond the top ten. I can also focus on different countries, or just certain US states, or filter for the last 90 days.
The options are endless.
There are two specific uses for this data.
First, I get the top and rising queries to consider for my SEO program. Not just queries either, but deeper insights like brand awareness, etc.
Second, I can use this to figure out the priorities for the content I need to create on my website to take advantage of evolving consumer interests and preferences.
If you have an ability to react quickly (not real-time, just quickly) the Google Trends tool can be a boon to your SEO efforts.
Competitive Intelligence / SEO Tools.
Competitive intelligence tools solve the challenge of knowing: What are my competitors up to? What is happening in my product/industry category when it comes to search?
SEO tools solve the challenge of knowing: What can I do to improve my page ranks, inbound links, content focus, social x, link text y, etc.?
There are many good competitive intelligence tools out there. They will continue to be useful for other analysis (referring domains, top pages, display ads, overall traffic etc.), but as I mentioned at the top of this post, the search keyword level data will attain an even lower quality. Here's a report I ran for L'Oreal:
If you see any keyword level data in these tools, you should assume that you are getting a distorted view of reality. Remember, all other data in these tools is fine. Just not any of the keyword level data.
There are many good SEO tools out there that provide a wide set of reports and data. As in the case of the CI tools, many other reports in these SEO tools will remain valuable, but not the keyword level reports. As not provided moves toward 100% due to search switching to httpS, they will also lose their ability to monitor referring keywords (along with the aforementioned repressive and sometimes not-so-overtly repressive regimes).
When the keywords are missing, the SEO tools will have to figure out if the recommendations they are making about "how to rank better with Bing/Google/Yahoo!" or "do a, b, c and you will get more keyword traffic" are still valid. At a search engine level they will remain valid, but at a keyword level they might become invalid very soon (if they're not already).
Even at a search engine level, causality (in other words, “do x and y money will come to you”) will become tenuous and the tools might switch to correlations. That is hard and poses a whole new set of challenges.
Some of the analysis these tools start to provide might take on the spirit of: "We don't know whether factors m, n, and q that we are analyzing/recommending, or all this link analysis and link text and brand mentions and keyword density, specifically impact your search engine optimization/ranking at a keyword level, or if our recommendations move revenue, but we believe they do and so you should do them."
There is nothing earth-shatteringly wrong about it. It introduces a fudge factor, a risky variable. I just want you to be aware of it. And if you want to feel better about this, just think of how you make decisions about offline media – that is entirely based on faith!
Just be aware of the implications outlined above, and use the tools/recommendations wisely.
4. Possible Future Solutions.
Let's try to end on a hopeful note. Keyword data is almost all gone; what else could take its place in helping us understand the impact of our search engine optimization efforts? Just because the search engines are taking keywords away does not mean SEO is dead! If anything, it is even more important.
Here are a couple of ideas that come to my mind as future solutions/approaches. (Please add yours via comments below.)
Page "personality" analysis.
At the end of the day, what are we trying to do with SEO? We are simply trying to ensure that the content we have is crawled properly by search engines and that during that process the engines understand what our content stands for. We want the engines to understand our products, services, ideas, etc. and know that we are the perfect answer for a particular query.
I wonder if someone can create a tool that will crawl our site and tell us what the personality of each page represents. Some of this is manifested today as keyword density analysis (which is value-deficient, especially because search engines got over "density" nine hundred years ago). By personality, I mean what does the page stand for, what is the adjacent cluster of meaning that is around the page's purpose? Based on the words used, what attitude does the page reflect, and based on how others are talking about this page, what other meaning is being implied on a page?
If the Linguistic Inquiry and Word Count (LIWC) can analyze my email and tell me the 32 dimensions of my personality, why can't someone do that for my site’s pages beyond a dumb keyword density analysis?
If I knew the personality of the page, I could optimize for that and then the rest is up to the search engine.
Crazy idea? Or crazy like a fox idea? : )
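To make the idea a tiny bit more concrete, here is a toy sketch — nothing like LIWC, and certainly not the tool I'm wishing for: it simply summarizes what a page's text is "about" via its most prominent words and two-word phrases, with stopwords removed. Everything in it, including the tiny stopword list and the sample text, is an assumption for illustration:

```python
# A toy sketch of the "page personality" idea, not a real engine: summarize
# what a page's text is about via its most prominent words and word pairs.
# The stopword list and sample text are illustrative assumptions.
import re
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "to", "in", "for", "is",
             "on", "with", "that", "this", "it", "as", "are", "be", "by"}

def page_themes(text, top_n=10):
    words = [w for w in re.findall(r"[a-z']+", text.lower()) if w not in STOPWORDS]
    unigrams = Counter(words)
    bigrams = Counter(zip(words, words[1:]))
    return unigrams.most_common(top_n), bigrams.most_common(top_n)

sample = "Women's activewear and workout clothes: leggings, sports bras, running shorts."
top_words, top_phrases = page_themes(sample)
print(top_words)
print([(" ".join(pair), count) for pair, count in top_phrases])
```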
Non-individualized (not tied to visits/cookies/people) keyword performance data.
A lot of the concern related to privacy is valid, and even urgent when these search queries are tied to a person. The implications can be grim in many parts of the world.
But, I wonder if Yahoo!/Bing/Google/Yandex would be open to creating a solution that delivers non-individualized keyword level performance data.
I would not know that you, let's say Kim, came to my website on the keyword "avinash rocks so much it is pretty darn awesome" and you, Kim, converted delivering an order of $45. But the engines could tell us that the keyword "avinash rocks so much it is pretty darn awesome" delivered 100 visits of which 2% converted and delivered $xx,xxx revenue.
Think of it as turbo-charged webmaster tools – take what it has today and connect it to a conversion tracking tag. This protects user privacy, but gives me (and you) a better glimpse of performance and hence better focus for our organic search optimization efforts.
Maybe the search engines can just give us all keywords searched more than 100 times (to protect privacy even more). Still non-individualized.
I don't know the chances of this happening, but I wanted to propose a solution.
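To make the proposal concrete, here is a minimal sketch of the threshold idea, with purely hypothetical data and names (only the search engine would hold the raw events): keywords seen fewer than a minimum number of times are suppressed entirely, so nothing individual is ever reported.

```python
# A minimal sketch of the non-individualized reporting idea. The raw
# (query, converted, revenue) events are hypothetical; keywords with fewer
# than MIN_SEARCHES visits are suppressed entirely to protect privacy.
from collections import defaultdict

MIN_SEARCHES = 100

def aggregate(events):
    stats = defaultdict(lambda: {"visits": 0, "conversions": 0, "revenue": 0.0})
    for query, converted, revenue in events:
        s = stats[query]
        s["visits"] += 1
        s["conversions"] += int(converted)
        s["revenue"] += revenue
    return {q: s for q, s in stats.items() if s["visits"] >= MIN_SEARCHES}

# Hypothetical events: (query, converted?, order value).
events = [("avinash rocks", True, 45.0)] * 120 + [("rare personal query", False, 0.0)] * 3
for query, s in aggregate(events).items():
    rate = s["conversions"] / s["visits"]
    print(f"{query}: {s['visits']} visits, {rate:.0%} converted, ${s['revenue']:,.2f} revenue")
```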
Controlled experimentation.
Why not give up on the tools/data and learn from our brothers and sisters in TV/Print/Billboards land and use sophisticated controlled experiments to prove the value of our SEO efforts?
(Remember: Using the alternative data sources covered above, you already know which keywords to focus your efforts on.)
In the world of TV/Radio/Print we barely have any data – and what we do have is questionable – hence the smartest in the industry are using media mix modeling to determine the value delivered by an ad.
We can do the same now for our search optimization efforts.
First, we follow all the basic SEO best practices. Make sure our sites are crawlable (no JavaScript-wrapped links, pop-ups with crazy code, Flash-heavy gates, page tabs using magic to show up, etc.), the content is understandable (no titles in images, no unclear product names, no crazy stuff), and that you are super fantastically sure about what you are doing when you make every page dynamic and "personalized customized super-relevant" to each visitor. Now it does not matter what ranking algorithm the search engine is using; it understands you.
Now it's time for the SEO Consultant's awesomely awesome SEO strategy implementation.
Try not to go whole hog. Pick a part of the site to unleash the awesomely awesome SEO strategy. One product line. One entire directory of content. A section of solutions. A cleanly isolatable cluster of pages/products/services/solutions/things.
Implement. Measure the impact (remember you can measure at a Search Engine and Organic/Paid level). If it’s a winner, roll the strategy out to other pages. If not, the SEO God you hired might only be a seo god.
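For the "measure the impact" step, the simplest defensible yardstick is a test-versus-control comparison. A minimal sketch with made-up numbers; the sections, metric and figures are assumptions, not a recommendation of specific thresholds:

```python
# A minimal sketch of the measurement step: difference-in-differences between
# the optimized section and a comparable untouched section. All numbers are
# made up for illustration.
def lift(test_before, test_after, control_before, control_after):
    test_growth = (test_after - test_before) / test_before
    control_growth = (control_after - control_before) / control_before
    # Growth in the test section minus the growth you would have seen anyway.
    return test_growth - control_growth

estimated_lift = lift(
    test_before=12000, test_after=15600,      # optimized product line, weekly organic visits
    control_before=9800, control_after=10300  # untouched product line, same weeks
)
print(f"Estimated incremental lift from the SEO work: {estimated_lift:.1%}")
```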
At some level, exactly as in the case of TV/Radio/Print, this is deeply dissatisfying because it takes time, it requires your team to step up their analytical skills, and often you only understand what is happening and not why. But it is something.
I genuinely believe the smartest SEOs out there will go back to school and massively upgrade their experimentation and media mix modeling skills. A path to more money via enriching skills and reducing reliance on having perfect data.
There is no doubt that secure search, and the delightful result not provided, creates a tough challenge for all Marketers and Analysts. But it is here, and I believe here to stay.
My effort in this post has been to show that things are not as dire as you might have imagined (see the not going away and alternatives sections). We can fill some gaps, we can still bring focus to our strategy. I'm also cautiously optimistic that there will be future solutions that we have not yet imagined that will address the void of keyword level performance analysis. And I know for a fact that many of us will embrace controlled experimentation and thereby rock more and charge more for our services or get promoted.
Carpe diem!
As always, it is your turn now.
I'm sure you have thoughts/questions on why not provided happened. You might not have made it through all five stages of the Kübler-Ross model yet. That is OK, I respect your questions and your place in the model. Sadly I'm not in a position to answer your questions about that specifically. So, to the meat of the post …
Is there an implication of not having keyword level data that I missed covering? From the data we do have access to, search engine level, is there a particular type of analysis that is proving to be insightful? Are there other alternative data sources you have found to be of value? If you were the queen of the world and could create a future solution, what would it do?
Please share your feedback, incredible ideas, practical solutions and OMG you totally forgot that thing thoughts via comments.
Thank you.
PS: Here's my post on how to analyze keyword performance in a world where only a part of the data was in the not provided bucket: Smarter Data Analysis of Google's https (not provided) change: 5 Steps. For all the reasons outlined above, this smarter data analysis option might not work any more. But if only a small part of your data, for any reason, is not provided, please check out the link.
Avinash,
I think one thing worth considering, to mine "organic" searches, is to enable Site Search tracking (or log the keyword searches) on your site if your site has search functionality (do this!).
Oftentimes users come in from Google Search and then narrow down with more long-tail'ish keywords.
If you're not looking at what users are searching on your site after they get there, you might be missing out on some interesting keywords.
Joseph: I. Love. This. Idea!
I can't believe I forgot this. But yes, internal site search is a goldmine of ideas for SEO and even looking at conversion performance (even if by the time the person is on our site the commercial intent is stronger).
I'll update this post with your thought. Thank you.
-Avinash.
PS: My post: Kick Butt With Internal Site Search Analytics
Since Google is still willing to sell your PPC search details to advertisers, it makes its claim of 'doing it for security and privacy' farcical.
Cody: You are right, we continue to be able to see paid search performance at a keyword level inside GA.
It is in section two of the post.
-Avinash.
Cody: About a week before you posted this, I watched Googler Bill Kee get hammered over exactly this point @ a DMA session. It was supposed to be a discussion about attribution, but once someone pulled on this thread it got real heated. Even a fellow panelist from Adobe piled on.
The problem is twofold.
First, it reminds us all that we need Google more than they need us. And that sucks. In so many ways Google is single-handedly responsible for our robust eCommerce industry, but when decisions like this get handed down without community input, it, well, sucks to be us.
Secondly, they have zero credibility when they argue that it's based on privacy concerns yet still provide on the paid side. It's either a privacy issue or it isn't. Period.
Hi Avinash,
As always an illuminating and balanced overview of the real implications and solutions to the (not provided) situation.
One item that I thought deserved a little more attention: monitoring rank.
The new situation means that we need to join the dots between keyword research, volume and quality of traffic to individual pages, and in my opinion average positions for monitored keywords.
The tricky part will be correlating the various factors – a bewildering combination of the work we do, algorithm changes, competition's efforts, volumes of traffic and rankings. This in my opinion is vital.
My guess is that one of the big players (Moz is the most likely) will figure out a means of joining the dots and making the connections between these data sources. If not there's a great opportunity to build a much sought-after tool – but for someone far cleverer than me.
Dave: GWT is providing you average rank, it is nice to have that directly from Google. : )
I'm not sure that, with the keywords going away, we are necessarily in a worse situation regarding rank. People said they could guess what causes rank to go up or down; usually they were just making bad guesses. These guesses are made much worse by Google/Bing/Yandex constantly making changes to their algorithms.
SEO remains as complicated today as it was yesterday. There are general themes of what we are supposed to do: great content, crawlable sites, earned links, videos, active social presence, solid customer love – expressed digitally, etc. We still do the same thing (and still without being able to concretely guess which one is doing what!).
The disappointing part is that we used to be able to show "Look, my efforts resulted in keyword x moving profit by $yy,yyy! Give me more money!!" We are going to have to get more creative about how to show that.
-Avinash.
AK,
Thanks for sharing such detailed articles.
I'm yet to figure out the connection between the NSA, Mr. Snowden, and Google's decision to encrypt all search queries after the revelation. Maybe I missed that bit of news.
At the end of all this we are left with nothing but frustration in trying to find keyword traffic, which is bread and butter for our community.
Google Webmaster Tools' search query (and other report) downloading via the API doesn't work. I've tried both a PHP script and Python. It worked with PHP and I managed to download the desired data after tweaking and playing around with the script, but guess what, the data is incorrect. The downloaded data doesn't match what is shown in GWT.
At the moment I have to do the donkey work of manually downloading data from GWT on a monthly basis. And that's not practical if you work with 500-odd websites. A simple solution to keep track of keyword traffic would keep SEO people happy; it doesn't matter if they stick it in GA or GWT.
Ajay: I would recommend working with an expert if you need more help with automating the GWT download. Perhaps one of the GACPs can help you, here's a list: http://www.bit.ly/gaac
Also, please do try again. Recently the GWT team has implemented many upgrades (including providing 2k queries instead of just 1k).
And do see the second item I mention in the bonus section of GWT. It is important to know this is not all of the solution (at least for now).
-Avinash.
This was one of the rare instances in which I wasn't looking forward to reading your article…the connections you made and the reasoning behind them simply did not make any sense.
I also find it a bit hypocritical, and frankly, comical, that GA is now offering demographic data. There is actually less privacy for users now: demographic data, geographic data, landing page data, and user behavior data … who needs keywords?
Bragi: Remember, demographic and psychographic data is massively aggregated (regardless of whether you use Yahoo! or Google or AOL or any other digital platform). If you want to dig into this a bit more please consider two things: Sample bias, sampling bias.
I would absolutely not use the data in GA, YWA etc. to try to tie behavior to a specific person. It is ok for general trends or, in a best case scenario, micro-segments.
Avinash.
Avinash,
As one of life's optimists, the positive spin I've put on this recent development is:
1. Google's hummingbird update renders exact keyword matching less important.
2. It creates a level playing field as far as quality of content is concerned and the best content will do well even if it's not quite so great in terms of SEO.
I know, clearly I have my head in the sand, but shifting attention to high quality content using common sense approaches to keywords in the way you've shown above can't be so bad can it?
Susie: You had comment #8, I think; it warms my heart to bump into an optimist!
1. I have betrayed my bias about rank in my reply to Dave in this post. Like other updates, Hummingbird evolves the algorithm but I'm not sure from an Analyst/SEO perspective it makes keyword data less relevant. It would have been nice to know, as Google/Bing/Yahoo! evolve at their end, what the impact is, at least on clusters of keywords.
2. This is very important. I wish it did not come this way, but I do think this will reduce dramatically some of the deep obsession with every little tiny nuance of every tiny thing displayed by some in the SEO community. They obsess well past the point of diminishing returns. This I think will go away now (or dramatically reduce), freeing up time for more productive endeavors.
That said, if we produce high quality content we should care about ensuring we do good SEO. Not at a nitty-gritty let me rework the Bing algorithm level, but the six things I mention in my reply to Dave above.
Thanks!
Avinash.
PS: Your head is not in the sand. You found this post, you commented!
Great post, as ever. You & Rand Fishkin move the bar up and we all follow the pathways. The bottom line for me is the foundation that SEO can work off for all the players that do not have the deep pockets to pay the auction engine ferryman. The "keyword" keywords root domain is about brand recall and the ability to carry elements common to the TM brand. Get the early stage strategy right or compete with deep pockets.
Thanks for updating the detailed report on this topic.
I am a regular reader of your posts and they help me a lot in my research work.
Keep on sharing such posts; they help people like me and others know and understand this better.
Hi Avinash,
Great article, once again!
I know you've covered GWT organic search data in your article. However, by linking your GWT account to your AW account (yes, provided you are an AW advertiser, that is!), you now get to see the interplay between Paid & Org data. And, many a time, I see more queries & kws data in that awesomely awesome report.
Would you recommend that as a potential source of keyword mining (for PPC) & query sourcing (SEO)?
Avinash, here is another tip for your bonus or #silverlinings section:
If your website has URLs with classic subdirectory structure like domain.com/topic/subtopic/subsubtopic, then you can create a profile for each subdirectory (as long as it has children) in Google Webmaster Tools.
This can expand the dataset significantly, reaching deep into the long tail depending on the overall traffic level and diversity of keywords people use for your type of content.
If you haven't done so, give it a try and you will probably be pretty happy you did.
Jeff, would this impact the connection between webmaster tools and analytics?
Bill, so far what I've seen is that the GA account uses the data from the main profile, but not the sub-profiles. It seems like the caps might be set on the GA side. I guess you could make GA profiles that match the GWT profiles to see if GA will show deeper data, but I haven't done so. I haven't done this primarily because I import the GWT data to a database and populate a table with other internal data, which includes some high-level stuff from GA and some from our own homegrown toolkit. It would be interesting to see if the GA team would consider a feature to expand the data automatically if other profiles are created in GWT.
I'm not a security person, and maybe you know more, but can you help explain to me how breaking the standard protocol of how httpS works is making things more secure?
For example, normal httpS works such that when data transfers from httpS to httpS it gets passed through securely and only between those two parties. Now, Google has changed this process so that httpS to httpS does NOT pass one specific piece of data for one channel and still passes it for another. This was a specific change to how httpS works and I'd think this is a less secure way given the modification of httpS, which leaves it open potentially to more hacks.
Would love to know if I'm wrong on this?
Micah: Every page on this blog is proof that I'm not a security expert either.
From what has been publicly shared, I've understood that almost all of the web is http and while your behavior on a httpS site would be "hidden," sharing data over to a http site would still be open to being monitored. Hence the action.
Perhaps the entire web should switch to httpS with 2048-bit encryption keys. That would mean more privacy/security for everyone (regardless of search/keyword). And maybe then options become available for data sharing.
I'm going to try and figure out how to do this on my little blog.
-Avinash.
Okay, but you did say it was more secure and I'm wondering how that could be if Google introduces a hack that changes how httpS-to-httpS functions.
@micah @avinash There has been no change to how https to https functions.
HTTPS > HTTP = no referrer sent.
HTTPS > HTTPS = yes referrer sent
BUT – Google has chosen not to send the full document.referrer EVEN when the connections are SSL on both sides.
In other words, they still don't "play by the rules" even when both parties are HTTPS.
In my humble opinion, this mainly has to do with the following lawsuit that Google recently settled and is a way to appease the courts & public opinion.
siliconbeat.com/2013/07/22/google-settles-privacy-suit-over-search-queries/
The sad thing about this whole 'privacy' thing is that Google didn't leverage it: telling webmasters / publishers that they won't be able to get keyword referral data unless they upgrade to HTTPS would have made large swaths of the internet a much more secure place. IF this was really about privacy then Google wouldn't pass keyword referrals to their Advertisers. But advertisers "obviously" need this data, so that is simply out of the question.
Danny Sullivan has written about this time and again. Most recent article I could find is this one. searchengineland.com/post-prism-google-secure-searches-172487 Danny's stuff is always worth reading.
Maybe if the Google folks get the message that this is bad for Advertisers (as Avinash mentioned) and overall search quality, *maybe* they'll start pushing referrers through to other secure sites. IMHO, that's never going to happen though. G is a money making machine (Q3 was enormous) and their behavior this past year has demonstrated that they really don't give a damn about stomping all over whomever they please. Just do a simple search for "weather" with your zip code and you'll see that they scrape their (admittedly very nicely displayed) weather data directly from Weather.com / Weather Underground / and Accuweather who will no longer get the organic search traffic and ad revenue they used to. Google shows that data on their homepage now for the benefit of the "user."
*Sigh* ->
Yehoshua
Yehoshua, thanks for the clarification! Glad it's not hacking httpS, sucks that they aren't following standard protocols, but at least it's not harming the web directly further.
I fully agree with you Yehoshua. The NSA argument is a red herring. Google prides itself on matching search results with intent. It's not like these spying and/or repressive governments could only see KW data and nothing else before the era of "not provided". The referrer doesn't matter if they are still tracking where you go online, and they can still track your browser activity if they want to (let alone key loggers, malware, etc.). This is not about governments. It's about civil lawsuits and taking actions to appease privacy advocates who don't like Google. Whether it's the Street View wifi data collection issue or the unified privacy policy allowing shared data across all properties, Google had to throw a bone to these advocates.
I find the whole argument by Google completely disingenuous.
@avinash, just for the record, I've passed all of the stages. Just because I've reached acceptance of the new reality doesn't mean I have to buy bad PR ;-)
That being said, for us, with or without KW data we've found a lot more value in evaluating landing pages for their contributions to revenue (directly on last click as well as in MCF models). We've always found that unless you have crazy amounts of traffic, focusing on specific KW phrases yields data too fragmented to draw conclusions from in any reasonable amount of time. KW data can definitely help with some directional understanding and generate ideas, but it wasn't the be-all and end-all either.
I'd hoped that you'd give a shout out to closed loop reporting. So few companies have actually done this. Few companies have implemented the new costing functionality in GA and so many companies, including mine, have 90+% offline sales which are not GA tracked anyway.
The move to Not Provided is an opportunity for people to reevaluate what and how they measure marketing performance. If they can't easily calculate ROI and understand the specific level of contribution that their efforts make to the organization, then they should start NOW on connecting their systems to marry up website and other business data. It's not that hard to do. It just requires a little bit of technical knowledge (hire a GAAC if you don't have the internal resources) and the dedication to see it through.
Would it be possible to "SSL" based on country? (Perhaps Google.com is non-SSL vs. Google.cn, etc.)
Thanks for the insight Avinash, this 'not provided' change is quite a pain.
But at the same time I think you've pointed out some great tools that we've all probably heard of before but may not have necessarily been using enough due to reliance on our Google keyword reports.
So this change could be a good thing – inspiring creativity and experimentation.
What is the tool you are showing the screenshots from for competitive analysis?
Thank You
Tyler: I've deliberately not named any CI/SEO tools in this post because that might distract from the overall conversation. Don't want to point any fingers and have that discussion.
But I do link to my CI tools post above, you are welcome to check that out as I do discuss many specific tools there.
Avinash.
Thank You
Hi,
I asked this question on LinkedIn but couldn't find any response.
Can anyone explain to me why I am still getting some organic keywords in the keyword report if Google has implemented encryption (HTTPS)? I think it should be 100% not provided, so what is the reason for getting some Google organic keywords in the report?
Shiv: Secure search is rolling out at a different pace in different places. Also for a varied number of reasons some people might not get secure search.
For these two reasons you will see a bunch of not provided, but also see referral keywords from non-secure search sessions.
Avinash.
One of my niche sites gets keyword rankings and visitors from search engines, but I was confused by the "not provided" keywords.
After reading this article I am clear.
Thanks for your great information.
Hi Avinash,
I seriously like the fact that Google protects us, online searchers, by implementing secure search. But then, if they really want to protect us, they must apply this to paid search as well. No matter how I look at it, it is just about generating more revenue for Google AdWords rather than protecting users' privacy.
Thanks!
Truth!
A detailed post indeed.
But it is a general practice that we try to collate the keyword-level data we're getting from the paid as well as the organic sources. We always add keywords to the paid campaigns from all the sources, like SEO, DSA queries, search queries, and recently we started with the PLA queries as well.
I agree that the structure of queries changes a lot, and for a campaign the ideal scenario is when you have all the possible queries in exact match, but it never happens (that's why they call it ideal), so we just keep digging for all the information we can extract from all other possible sources.
My bet is on the trending keywords (queries) which we can get from Google Trends. That is definitely the thing to go for when expanding.
Hi Avinash,
I read all your posts, and even took the MM certification, so I am not going to elaborate on how much I appreciate your work and always-valuable insights, if you don't mind :)
I wonder what you think about the Paid & Organic report in AdWords? The numbers seem pretty specific rather than bucketed; where do they come from? I think it could also be worth looking at, among other resources, to build up the puzzle.
Thanks
Eva
Using the plethora of evidence that remains can help to uncover the 'most likely' search terms that lead to a visit – a clue to the intent at least… Explanation is here: slideshare.net/DavidSewell2/the-not-provided-tool
Another tool that can help discover new keywords that relate to the intent behind the query is here: searchintent.co.uk
Both are prototypes – so tell me what you think @seoeditors – and hopefully help bring a 'candle in a dark cave' when determining what people are actually trying to do when searching :)
Avinash – I have been waiting for your post on this topic, and you hit the nail right on the head.
The thing I love about this post is that it lays out the issue/problem, but it goes to the next step to provide solutions. As a community, we all need to be more solution driven, things are going to change and we have to adapt.
As an analyst this is a part of the journey and excitement. Who knows what new tool this might lead to. Sometimes it can be frustrating, but you can spend all your time whining and creating zero value, or you can jump back in the mud and start playing again.
SEO is just as important today as it was yesterday. We just have to continue to put out EPIC content that follows SEO good practices, and get creative in the way we measure our efforts.
It's not like we have no data!
Great post!
Cheers,
Dominic
P.S. You inspired me to start my own analytics blog. It is still very rough and it will be sometime before I am producing the epic content you have here.
Great post, Avinash.
With Google's attribution of the increase in Not Provided to the NSA revelations, what are your thoughts on Google's willingness to continue sharing paid search terms with advertisers?
To me, it doesn't seem fair that they can make such a huge change in the name of privacy while they continue to allow advertisers to see what keywords are driving clicks.
David: There is certainly some cognitive dissonance when we think about this decision.
But not being involved with the policy setting part, I'm unfortunately not privy to any decisions that might be made regarding the paid search data. Though as might be obvious in this post, I'm an advocate for finding the right balance.
-Avinash.
And we appreciate your insights on finding that balance. :) I was just curious about your thoughts regarding that dissonance. Thanks again for a great post – looking forward to the next one!
David: I believe that different people have different amounts of information. This asymmetry means that our view might be incomplete.
I don't question the motivation of the people who made the decisions, and I know that they have more information around the decision than I do. Hence I'm reluctant to launch attacks on their very existence and cast aspersions. :)
You and I have the same information in this case. It is hard from our point of view to reconcile the no to organic and yes to paid. I hope we get more context that helps us understand the decision. Silence is not helpful, though in some cases it is the best we get.
-Avinash.
PS: Make no mistake, as an Analyst I'm not advocating for taking away paid keywords!
I agree completely on the silence thing. As an analyst of both paid search and SEO campaigns, having keywords in one marketing effort but not in the other has definitely prompted some interesting conversations with my clients, and it would be nice to have a clear(er) explanation of why the powers that be have adjusted certain things but not others. The guise of post-NSA privacy concerns just doesn't cut it a lot of the time. In fact, I'll be interested to hear what their revenue is like after a quarter or two of securing searches, as I know many agencies will be launching AdWords campaigns to gather keyword insight. This might just be the conspiracy theory marketer in me though. ;)
Even so, I'm taking this shift as a blessing in disguise so to speak and have used this as an opportunity to talk about goals with my clients instead of keywords. After all, ranking number 1 for a keyword does not make them money – increases in leads and sales do. I've seen too many marketers let their clients get "down in the weeds" with things like that and failing to focus on the things that impact their bottom lines.
I'm interested to see how Google continues to deal with privacy in search, although I'm more interested in the different ways marketers like you and I accommodate the shift and change the way we define success with our marketing initiatives. As Dylan said: The times they are a-changin'.
Super helpful as always, Avinash! Thanks.
Thanks for providing nice information through this article, as it upgrades my SEO knowledge. In my view, if you want to get success in SEO then you should keep pace with Google.
I have never done PPC, but I always use the AdWords keyword tools (Keyword Planner), because if you have a small business and you want to succeed through SEO then this one is very helpful.
Wow, thank you so much for this information. I had no idea that things were changing so much with SEO/keywords.
I'm just getting back into blogging after a few years off and it is amazing how much has changed. I'm literally having to relearn everything.
Thanks for taking the time to clearly lay everything out to novices like me.
Hi Avinash,
I'm a huge fan of your blog. You have a great talent for analyzing, distilling, and communicating!
A point of clarification regarding the appearance of search data in competitive intelligence tools like Compete PRO. You state that search data is disappearing from such tools. Depending on the vendor, this may or may not be correct. With their panelists' permission, vendors like Compete have the ability to capture and report on HTTPS activity. For such vendors, the move to HTTPS may reduce the amount of search data (and therefore the sample size) if panelists do not allow that vendor to see HTTPS. However, we expect that the search data in Compete PRO will continue to be viable and useful. For instance, at Compete, we estimate that we still see greater than 90% of the search activity that we had been seeing before the change.
Regards,
Conor O’Mahony,
VP of Products at Compete.
Hi,
Is there any advanced segment we can apply in GA to get back keywords instead of "not provided"? I really need this.
Thanks for a great post.
Atul: No, I'm afraid the keyword is not sent as a referrer by Google, hence no web analytics tool will be able to report those keywords.
Not provided directly means that, not provided.
Avinash.
Recently I have started to expand my horizons diving into the analytics world. With everything I read I see much I still don't know. Or how much there is to learn (as with everything in life, I guess).
Your posts, the ones I have read so far, have always been very enlightening, and this one is no different.
I had already been exploring the not provided issue when I encountered it, and came to all kinds of different 'solutions' (I use the term lightly) – up to the hugely overcomplicated approach of figuring out the parameters in Google's search result pages.
As I am in favour of pragmatic approaches, I see a whole range of possibilities in your post – and in the comments – that everyone can use.
I assume the same applies for the 'not set'.
Arjen: I'm glad you found the post to be of value.
(not set) is not the same issue as not provided. Not Set typically denotes some kind of technical configuration issue or a piece of missing data where Analytics was expecting one.
Please see this article for detailed guidance on how to deal with not set: What the value (not set) means https://support.google.com/analytics/answer/2820717?hl=en
-Avinash.
I should have added a '?' to my 'not set' statement (because it was a question).
I asked it because I had read that 'not set' comes from private tabs/browsing, which will only increase in use as it increasingly becomes the default.
In my company's statistics I saw a huge rise in (not set) a while ago, while now I cannot find a trace of it anymore.
Did something change?
Will there be an effect from the private browsing?
While reading your post, I felt as if a relative had died.
Or, is it myself? A part of me? Or my job? Or only the way I've been working?
I must admit that, sadly, this is the very first time I feel so much anger against Google. And it is of no help…
I think they took a wrong decision. And a brutal one.
This will destroy the business of many smart marketers doing a clever job to overcome their competition.
This will make things harder for the smartest to find their place.
This is an obscurantist decision.
And more importantly: Will this decision prevent NSA from accessing our personal data? And how?
So much effort for nothing! So much analysis, so many studies, and now we are "back to black", as Amy would say!
They switched off the light, and we shall return to our cave without keyword analysis, back to checking keyword positions in Google!!!
And still… I can't help thinking this is another attempt to kill SEO!
Why can't we see Google Webmaster Tools in Google Analytics the same way as other profiles? Why can't we apply segments to Webmaster Tools data in Analytics?
Why just 90 days of Keywords in Google Webmaster Tools?
How can I show my customers that we are doing well with 90 days?
When will Google understand that their attempt to kill white hat SEO is like committing suicide?
Google, by discouraging white hat SEO talent, you are driving away a lot of attention from your core business to other areas you do not control that well!
This is sad! I won't give up on this one. I'm too angry for that, because it is a decision against talent, study, and intelligence.
Antoine: I share your feeling of angst around losing this data. It was valuable in optimizing our search efforts (both paid and organic).
I humbly disagree with you that this is an existential threat to SEO. Search engine optimization today is just as important as it was yesterday (for the smartest SEOs, this is an opportunity to get paid even more because they will know how to accommodate the new normal in their work).
With regards to only 90-day data in GWT… I've added a link to a python script that will allow you to download the data and keep it around as long as you want to (a small archival sketch also follows this reply). Two other commenters were kind enough to share even more options for getting your organic search behavior data; please see updates #3 and #4 in the GWT section above.
I'm glad to hear that you are not going to give up and you are going to do the type of strategic analysis that is still possible to ensure that your business gets the max search traffic that it deserves!
-Avinash.
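For readers who want a concrete starting point for the "keep it around as long as you want" idea above, here is a minimal sketch in Python. It is not the script linked in the post; it simply appends each Webmaster Tools "Search Queries" CSV export you download to a cumulative local archive. The column names ("Query", "Impressions", "Clicks") and the example file name are assumptions – adjust them to whatever your exports actually contain.

# A minimal archival sketch, assuming you periodically download the GWT
# "Search Queries" report as CSV. Column names are assumptions; adjust
# to match your actual export.
import csv
from datetime import date
from pathlib import Path

ARCHIVE = Path("gwt_search_queries_archive.csv")

def archive_export(export_path, snapshot_day=None):
    """Append one downloaded GWT export to a cumulative local archive,
    tagging each row with the snapshot date so trends can be rebuilt
    long after the 90-day window has passed."""
    snapshot_day = snapshot_day or date.today().isoformat()
    write_header = not ARCHIVE.exists()
    added = 0
    with open(export_path, newline="", encoding="utf-8") as src, \
         open(ARCHIVE, "a", newline="", encoding="utf-8") as dst:
        reader = csv.DictReader(src)
        writer = csv.writer(dst)
        if write_header:
            writer.writerow(["snapshot_date", "query", "impressions", "clicks"])
        for row in reader:
            writer.writerow([snapshot_day,
                             row.get("Query", ""),
                             row.get("Impressions", 0),
                             row.get("Clicks", 0)])
            added += 1
    return added

# Example: run after each weekly download (hypothetical file name).
# print(archive_export("TopSearchQueries_2014-01-13.csv"))

Run on a regular schedule, a small script like this gives you a query-level history that outlives the reporting window.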
Excellent! Thanks Avinash, this was a great post to find at just the right time (kudos google).
Anyway, I have been wondering if we were going to lose GWT keyword data and was happy to see it is remaining. So you feel using GWT keyword data to calculate non-branded organic traffic trends from month to month would be a bad idea? I know there is a huge data set missing, but isn't there some consistency in the data they do report?
This is my primary KPI with many clients.
Thanks!
Chris: You should absolutely use GWT!
At the moment two thousand of your top keywords are provided by GWT; it was only one thousand recently, so this is great progress. You are able to download it (see links in the post) and you are able to do lots of great analysis with it, including brand and non-brand (a small sketch of that split follows below).
Avinash.
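To make the brand/non-brand idea concrete, here is a minimal sketch in Python run over queries downloaded from Webmaster Tools. The brand terms and the sample rows are placeholders, not anything from the post; substitute your own brand variants and your real export.

import re

# Hypothetical brand variants – replace with your own.
BRAND_TERMS = ["acme", "acme corp"]
BRAND_RE = re.compile("|".join(re.escape(t) for t in BRAND_TERMS), re.IGNORECASE)

def split_brand_nonbrand(rows):
    """rows: iterable of (query, clicks) pairs from a GWT download.
    Returns total clicks for the brand and non-brand buckets."""
    totals = {"brand": 0, "non-brand": 0}
    for query, clicks in rows:
        bucket = "brand" if BRAND_RE.search(query) else "non-brand"
        totals[bucket] += clicks
    return totals

sample = [("acme shoes", 120), ("running shoes review", 340), ("buy acme corp boots", 55)]
print(split_brand_nonbrand(sample))  # {'brand': 175, 'non-brand': 340}

Trending the two buckets month over month recovers much of the branded/non-branded KPI discussed elsewhere in these comments.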
Also, I'm not a huge fan of the interface myself, but Raven Tools uses Webmaster Tools data and stores it for a longer period of time.
Avinash, thanks for this great article… best article I've read on this topic.
Hi, Avinash!
There's one part of keyword analysis that I didn't see mentioned here – brand awareness.
It used to be very useful to compare branded & non-branded organic search. We're in the public sector here, so we don't have access to any of the competitive intelligence tools that could step into that gap.
What else would you suggest for replacing the branded/non-branded analysis?
Thanks! (as always)
Ramona: Depending on the size of your government, and the size of your agency, you might have data in Google Trends or in some competitive intelligence tools.
Barring that, you do have access to Webmaster Tools reports via Google, Bing, and Yandex that should still show you this data. In Google's case that is the top 2k queries; for the others you might have more or less.
Avinash.
The Google Keyword Planner outbound link is broken in the article… may want to fix that?
This is a brilliant article. I really appreciate the ways you show to use some of Google's other tools, as well as some third party programs to define areas of interest for our current and potential visitors.
All in all I am glad this move was made, regardless of what the security implications are or are not… I think it's important to focus on what our viewers are interested in: what voice do they click with, what kinds of articles get the most love and shares. Before httpS it was much more about the hard data, which almost left you tweaking your content around specific search terms – redefining your content and outreach, and pushing personality and soul out the door. With this I think we will see an increase in people simply trying to tell a heck of a story and not worrying about keyword density and exact-match terms. With a little more personality in the posts, it becomes about storytelling, and we will end up with much more compelling pieces taking the best spots.
Do you think this will improve content? With Hummingbird, don't you think the keyword focus will become blurred anyway, at least to a degree? I see it moving rapidly toward identifying intent much more accurately. In that case, keywords be damned – it's about telling the best story on the trending subjects that are related to you.
Gerald: I wish it did not come about this way, I still love my keyword data, but I agree with you that these changes will drive strategic analysis and get us all to focus on bigger things.
Google, Bing, Yandex etc. will continue to make changes to their algorithms, we just have to focus on understanding intent, understanding ourselves and our value, and focus on customer centricity. Birds humming, bears dancing, or whatever, we'll be just fine.
I do expect changes to happen much much faster than in the past, and it is our ability to have a portfolio of great Own and Rent existences that will be most valuable. (See #5 here: http://goo.gl/uEHf7H)
-Avinash.
We cannot see the details of keywords, but there are still so many ways to find the needed information. If we can make good use of these tools, we will find our gold mine in the future.
Thanks for sharing, Avinash!
When I first started seeing the increase in not provided I too was wondering what sort of cloak and dagger tactics were being used.
Thanks for sharing the post
Morgan
Hi Avinash
Thanks for this article. The pain behind it is obvious. I think we saw the writing on the wall back in 2011, didn't we?
It would be great to have some official clarification on exactly who Google is wanting to withhold the keyword data from. Back in 2011 the claim was made (http://googleblog.blogspot.co.uk/2011/10/making-search-more-secure.html) that it was operators and lurkers on open wireless networks. Now it appears that it's website owners themselves.
Some clarification on who keywords are being withheld from, and why, would be appreciated.
Best regards
– Alan
Alan: I'm afraid I'm not sure I can add anything you might find to be of value, though the reasons outlined in the post you've shared all seem valid now (and of course more people fit in the "lurkers" category now).
Perhaps the fact that there has been no incremental information means there is more evolution yet to happen in the policy.
Avinash.
Hi Avinash
If we knew that, contrary to what was stated in the original blog post, Google was trying to hide keywords from site owners, then it would be pointless offering workarounds and solutions.
If we knew it was for some other reason then alternatives could be offered.
Every decent website has a mission – a reason that website exists. Any site owner worth their salt should be measuring and improving their website against its mission, whether its visitors come directly, from natural search, from paid search, from links on other websites or wherever. The referrer has been built into web standards from pretty much day 1, partly in order to allow this measuring and improvement to take place.
Google knows this. It has talked about it extensively at conferences and online. You personally have spoken about it, and I personally have witnessed you personally speaking about it. :)
Google still provides keywords to advertisers, precisely to allow this measurement and improvement to take place in the commercial world. But they've stopped providing it to organic listings on the SAME search results as those advertisers, meaning that sites whose mission is not necessarily commercial gain are not able to measure and improve in the same way as advertisers.
Google describes itself as an ethical company. My understanding of an ethical action is the one that does the most good and the least harm within the environment that is affected by the outcome of that action. I don't see Google removing referrer information as an ethical action within the online environment. If Google wants to be considered as an ethical company, then it needs to explain who it is withholding referrer information from, and why withholding that information does more good and less harm to the Web than not withholding it.
Alan
I seriously think that the hue and cry made by the SEO industry when Google first announced the "Not Provided" change was futile. My views expressed regarding "not provided" in Nov. 2011 and Oct. 2011 in the following posts evoked a lot of criticism:
blog.webpro.in/2011/11/search-queries-googles-encrypted-not.html
seocopywriting.com/content-marketing/why-googles-recent-changes-mean-good-news-for-the-seo-industry/
But my views still remain the same, and I think that when keyword data is no longer available, website owners will shift their attention to more meaningful metrics. This will result in more creative discussions rather than chasing keywords.
I fail to understand why this made SEOs attack Google with accusations, because in my view all these changes are drawing a clear, distinct line between organic search campaigns and paid campaigns, and carving a niche for the SEO industry where the focus is on the holistic and overall quality aspects of the website rather than the cost paid for every click received.
Hey Avinash,
What you refer to as: 'Non-individualized (not tied to visits/cookies/people) keyword performance data' totally makes sense. Having worked with PII for many years, making data anonymous and secure is actually a pretty simple and necessary process. For someone like Google it would be pretty easy to do I'd have thought. It's whether they see value in doing it that is the key! If all other search engines are in the same boat, then what's their incentive?
Shall we start a petition! :)
Will: I believe that current digital companies are in the very early stages of tracking / privacy / security innovation. I do believe that the companies see the value in it, the challenge is how to balance the million factors at play and the million players at play.
But, it is early days. I'm optimistic we will see interesting solutions over the next year.
-Avinash.
PS: I seriously doubt petitions will be effective in this particular case (or for that matter most cases in this context regardless of geo, company, group).
I'm optimistic too. It's given me a business idea!!
The petition reference was meant to be 'tongue in cheek' – I still fail to remember that my humour doesn't come across well in this medium!!! :) Totally agree that a petition would be pointless.
Cheers,
Will
Hi, Avinash. Love your work, by the way. Reading about future, mining and seeing through a news site perspective (Globo.com), I think about the new Segment Builder interface in Google Analytics. I love that now we can segment users and not just visits.
Anyway, I was wondering about the "Day of the First Visit" feature. When setting the "first visit", will Google Analytics consider the user's first visit ever in his whole 2-year cookie history? Or will it consider only the visits made during the time frame I choose? This is really important, because it impacts all my cohort analysis, right?
Thank you so much, hope you can enlighten me.
Rebecca: You might find this help article, about dates, to be of value:
https://support.google.com/analytics/answer/3123951?hl=nl
And do please remember that segments are applied over the time period you select. So if the time period in your report is March 2014, that is the data you are looking at.
-Avinash.
Thank you so much for taking time to answer that. I had already found that support article. I know that segments are applied over the time period I select. But what about Count of Visits? Doesn't Count of Visits count all of the visits of a unique user in a website's history, regardless of the time period selected?
If that's so, Date of the First Visit only works if I want to analyze New Users, right?
Maybe I'm not making myself very clear; what I want is to see how many users have made a specific number of visits in a period of time. Is that possible, or will only Universal Analytics make it possible?
Thank you in advance. Best regards.
I've been debating whether I wanted to leave a comment on this post, Avinash. Sadly, we're doing all the same things you're advising and there's just no replacing keywords completely. Yes, we can move on and find different reports, but the user intent will never be as clear as it once was for organic visitors.
I find it completely odd that Google can securely serve ads and send the keywords along in those reports, but it's not the same for organic. No one has ever explained to me why you couldn't securely pass along organic referring keyword data via a similar process as what AdWords does. I guess I'll always just assume this change was made to increase revenue but presented as a privacy issue, which really sets me back on the ol' Kubler-Ross model.
Great Article! Thanks Avinash.
Has anyone ever done a correlation study between key SEO on-page factors and search term rankings? I guess the annual SEO Moz survey would help…
I've been cutting analytics by organic as a source and then looking at landing pages, adding in page title as a secondary dimension (this is a hack – look it up if you want to know how; a rough sketch also follows this comment).
More often than not (and particularly if the H1 heading matches the page title) the head term rankings, and therefore traffic, will be for words and phrases in the page title.
SEOs know that this (in conjunction with an exact match anchor text link) is still the most impactful factor in SEO, so testing this, as you say, is the way to go. This to an extent means nothing has changed.
Great article. Thank you
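To illustrate the landing-page / page-title workaround described in the comment above, here is a rough Python sketch. It assumes you have exported a report of organic landing pages with page title as a secondary dimension to CSV; the column names ("Page Title", "Sessions") and the example file name are assumptions, so match them to your actual export.

import csv
from collections import Counter

STOPWORDS = {"the", "a", "an", "and", "or", "of", "for", "to", "in", "on", "|", "-"}

def head_term_candidates(export_path, top_n=20):
    """Weight each page-title word by the organic sessions its title earned,
    surfacing the words most likely acting as head terms."""
    title_sessions = Counter()
    with open(export_path, newline="", encoding="utf-8") as f:
        for row in csv.DictReader(f):
            sessions = int(row["Sessions"].replace(",", "") or 0)
            title_sessions[row["Page Title"]] += sessions
    word_weight = Counter()
    for title, sessions in title_sessions.items():
        for word in title.lower().split():
            if word not in STOPWORDS:
                word_weight[word] += sessions
    return word_weight.most_common(top_n)

# Example (hypothetical file name):
# print(head_term_candidates("organic_landing_pages.csv"))

It is only a proxy for the lost keyword data, but it is cheap to run and pairs well with the Webmaster Tools queries discussed earlier.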
Another way – that doesn't make use of SEO tools – would be asking your website visitors directly. Mini surveys on the sidebar or in a popup that say something like "Thank you for visiting our site! You can help us improve our services by letting us know how you found us" and then add a field asking for keyword entered, referral sites (for niche stats), how satisfied they were by the site, etc.
Since Google applied a manual penalty to one of my websites, I'm turning to user surveys and mailing lists, and so far they work like a charm. :)
– Luana S.
Google's (long running) agenda to move all searches to HTTPS has been in place since 2011. At most, they used Snowden/NSA merely as an excuse to accelerate it.
The phrase "following revelations of NSA activities via Mr. Snowden, Google has now switched almost all users to secure search" implies causation. It should be clear that is not the case. If SEO agencies and in-house search departments can't prove or measure the results of their work, they're likely to be contracted and resourced less and less over time.
And that's good for Google – some of the resources dedicated to SEO will be shifted to PPC, ultimately increasing competition in the PPC space, and thus raising bidding prices. It goes hand in hand with Google's recent move to penalize Adwords campaigns which do not use extensions. It raises the level of competition and CPC in the PPC space.
Google's ultimate goal is to have no one spending time and money on SEO, only on PPC. At some point, they will succeed.
Eddie: I'm afraid I can't add to your comment on causation, or for that matter the impact of this change on the PPC space (as I'd mentioned above, in my large account the impact on PPC has been negative as the thousands of keywords we would add from the long tail are gone now).
I'll disagree on the SEO bit. For all search engines, Google and others, it is critical that sites are well built from a technical perspective so that they are crawlable (SEO!), written with focused business content (SEO!) that is updated on a timely basis (SEO!), and supported by inbound marketing efforts – including outreach and social (SEO!).
If as you say people don't spend time and money on SEO, it hurts search engines substantially. And hence I believe if anything, SEO is even more important now and the smart SEOs (see my suggestions in the post) will be paid more than they ever have been.
-Avinash.
Avinash, there might be a semantic basis for our disagreement.
"Sites are well built from a technical perspective so that are crawlable" can be considered to be part of SEO, but can also be considered the Webmaster's job – hence the name "Google Webmaster Tools". Writing/Updating content, IMHO, is the content writer/copywritter's job. Putting "in place inbound marketing efforts" can also be done by the SEO guy, but it is increasingly getting automated (consider for example, WordPress Jetpack's publicize options, which can be set by anyone).
In the end, the job of making SERPS rank higher, the core and ultimate goal of the SEO job, will be replaced by making Ads rank higher. In Google's vision of the future, they just want people to keep creating better and better content – not daring to use techniques to make said content rank higher. (heck, you could even use this last line as a summary to 80% of Matt Cutt's answers in his videos).
I admit that my point is slightly exaggerated, and we can disagree on how much, out of the total effort, falls on the lap of the SEO guy. But if his core objective, along with ways to prove ROI on it, are going to disappear, how much of effort will really be left for him, in the end?
Eddie: You are absolutely right. I do see SEO much more expansively in terms of what the effort is, in terms of who does it and, hence, what can and can't get automated.
For all of those reasons, I'm substantially more optimistic.
But it is ok to have a difference in perspective. : )
Avinash.
Avinash,
Does the GA Premium version show keyword details for organic? Or is it the same as the free one?
Ramakrishnan: I'd noted at the top of the article that the change was implemented by Google and not Google Analytics.
As such, it affects all digital analytics tools exactly the same way. If the data is not there, no tool from any company has it available to provide to you.
Avinash.
Hi Avinash
One possible workaround would be to pass the keyword data through another mechanism, i.e. not the referrer. This workaround would only be available to Google, not to any other analytics provider (not without Google's assistance, anyway), as Google is in the position of being at both ends of the transaction (search engine AND site analytics provider). This workaround could work much like the AdWords gclid.
This is why I was asking who Google is hiding the keyword data from as, if it's the site owner, there would be no point even discussing such workaround mechanisms.
– Alan
Alan: I'm afraid I disagree with the "only Analytics get the KW" data bit. One thing I like about this policy, and this entire post is a testament to how much I don't like this, is that it is consistently applied to all web analytics tools. Anything else would be profoundly sub-optimal.
At the moment everyone has access to the top 2k queries in GWT, anyone with any tool can have that data. Hopefully, as I mention in the post, GWT gets investment and attention from Google to provide even more data and even more features.
Over the long term, perhaps there is a way that Google, and other entities that decide to go down the https path, will find innovative ways to pass data that clients will find to be of value. Fingers crossed.
Avinash.
Hi Avinash
> I'm afraid I disagree with the "only Analytics get the KW" data bit.
Do you disagree with the fact that Analytics "could" get the keyword data, or that it "should"? Because it definitely "could" get it, using a mechanism other than the referrer. If you disagree that it "should", then you're not disagreeing – i.e. I actually agree with you – as, for the data to only be available through Google Analytics would be an abuse of power.
> Over the long term, perhaps there is a way that Google, and other entities that decide to go down the https path, will find innovative ways to pass data that clients will find to be of value.
They could do that right now if they wanted to. The fact that the keyword data is withheld even over https is the biggest indication that Google is choosing to withhold keyword data from site owners, as, under https, the whole referrer would be encrypted just as securely as the page content itself. There's no reason at all for Google to remove the referrer from an https connection, unless they don't want the site owner to have it.
Given this, it's difficult to avoid the conclusion that Google is choosing to give keyword data to paying advertisers, yet withhold the same keyword data – on the same search results – from the site owners upon whose sites the entire Google service is built.
Alan
Hi Avinash,
Since we are heading towards 100% NP keywords, how can we analyze performance from the SEO point of view to differentiate "branded keywords" from "non-branded keywords"? This is one of the key metrics of SEO performance for many companies.
Kallol: As mentioned in the post, the keyword data that is not available in analytics tools is simply not available.
You can use one of the options in section #2 for strategic analysis and #3 for keyword analysis (Webmaster Tools, Google Trends, Competitive Intelligence tools, Google AdWords Planner etc.). They all allow you to do some brand/category analysis at a company or industry level.
Avinash.
Thanks Avinash. I will try to aggregate data from the above-mentioned tools and analyze it. It still won't give an exact interpretation of the data, but it will surely narrow down the assumptions. Thanks.
Hi Avinash,
Nice write-up. I've been looking at the Paid & Organic report as a solution (I've linked to my working so far through my username here).
You mention in this post that GWMT is limited daily to approx 2,000 queries. Can you confirm if you are seeing the same limitation with the Paid & Organic report? I ask as I've not got access to a sizeable site currently.
Tom
Tom: You are seeing what you are supposed to be seeing. : )
Kartik mentioned it in his comment above, and I'd added Update 4 in the GWT section relating to this. Please check it out.
Avinash.
Now reaching over 90% not provided on my blog.
I can understand Google's move, thinking about bad SEO practices, but by making it more complex to even guesstimate SEO data, and accessible only to a small group of professionals, Google is leaving millions of entrepreneurs without vital data for making business decisions.
Things are becoming far too complex for many people who don't have the resources to contract an SEM agency to find out which keywords are driving the best visits.
Thank you for sharing this post. I think there is a lot of confusion out there with all the major changes – 100% Secure Search and Hummingbird rolling out.
This really gives you the bird's-eye view of how it will affect SEOs and how we can find workarounds to analyze data moving forward.
I have a question that is unrelated to this post – I was in Keyword Planner today and saw TM symbols next to some of the keywords – trademarked keywords?!!
What's up with that?! Can anyone clarify – might be a good idea for a new post topic.
Thanks,
Adam
Adam: The Keyword Planner is helping you make smarter decisions by clearly identifying which terms are trademarks. This should help eliminate surprises when keywords get disapproved for trademark reasons.
Here's an article that shares more on trademark and AdWords:
~ AdWords Trademark Policy
-Avinash.
Thank you Avinash. The only thing left is to ask Google to bring it back; let's all raise our voices and ask them to bring it back.
Finding resources like this one makes me confident that we will overcome the "not provided" challenge. Thanks again for a great walk-through :)
Hi Avinash,
Since GA is now not reporting on organic keywords, are the aggregate organic search numbers still correct? One of my clients noticed a steady increase in Direct traffic numbers and was wondering if the organic search numbers were now being mislabelled as Direct?
Thanks,
Rohit
Rohit: All traffic coming from Google for organic search is clearly marked as Google traffic. Underneath that, if the keyword is not being shared then those visits are marked clearly as (not provided). They will never be in Direct.
If you see a spike in Direct traffic, check out the causes outlined in this post:
~ Excellent Analytics Tip #18: Make Love To Your Direct Traffic
You can also work with a GACP to help identify the specific issue at your end. Here's a list: http://www.bit.ly/gaac
Avinash.
*All traffic, that is, unless some of the mobile browsers are again impacting Google search traffic due to Google's odd implementation of not provided. :)
Hi Avinash,
This post inspired some creative thinking to get at the analysis I needed. Due to some data capture mistakes, the source data was convoluted for our client, and through landing page and segmentation slicing I was able to find what I needed. Thanks!
Here is more detail on exactly what I did if you or anyone is interested: digitalperformance.squarespace.com/blog/2014/1/13/get-creative-how-to-use-landing-pages-and-segments-to-make-up-for-missing-not-provided-keyword-data
Very interesting and provides much food for thought. My experience level is advanced beginner, and I am just organizing my first campaign. I can see I have some challenges, as I was hoping to use organic search before launching a PPC strategy. I will be reading and re-reading this article and the posts along the way. You put quite a lot of effort into this article and I want to do it justice. Any further suggestions on getting the best data for organic searches will be welcome.
Thanks again
Great post. I totally see this working.
Thanks for sharing this.
Pablo Abbate
I'm liking moz's new keyword research tool – I find it to be one of the best.
*I'm not an employee of moz lol – I just like the tool :-)
Hi Avinash,
What is your opinion on "SEO is dead" in 2018?
Regards,
Brian
Brian: SEO is not dead, and I can't think of a near future where it is going to be dead. I urge getting over this "dead" thing.
SEO as quick fixes and low-quality tricks is certainly difficult and does not work. SEO as the full spectrum – creating content that is unique to you, doing so on a platform that is discoverable, ensuring there is a consistent strategy to update and refresh and stay relevant – all of this is SEO to me, and it will continue to be immensely valuable.
Avinash.