Concealed Keywords to Bring Back Classic SEO Methods

Google’s apparent plan to conceal all keywords from Google Analytics users will drive publishers to dust off classic SEO methods, including those Google disdains.

Google appears to be moving to secure all organic searches. In a recent article, Post-PRISM, Google Confirms Quietly Moving To Make All Searches Secure, Except For Ad Clicks, Danny Sullivan outlines additional new evidence that Google will be concealing all keywords from Google Analytics, so that SEOs will see only the dreaded “(not provided)” in the list of keywords that drive customers to each page in a site. The implications are quite astounding, and represent a possible power consolidation by Google at the expense of publishers and their SEO teams.

Keyword Not Provided

Shown here, in the very place in Google Analytics where we’re supposed to see a helpful list of keywords that our customers are using to find us, we instead see that Google is not sending us the keywords. To be more specific, the Google search engine is sending traffic to the page with the q= parameter stripped out of the referer header. Without this information, there’s no way to know with certainty what keyword or phrase led to the referral.
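To see why stripping q= matters, here is a minimal sketch of how an analytics tool recovers the keyword from a classic search referrer, and what happens when Google sends its secure-search referrer instead. The function name and URLs are illustrative, not any particular product’s code.

```python
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer):
    """Return the search phrase from a Google referrer URL, or None
    when the q= parameter has been stripped (secure search)."""
    parsed = urlparse(referrer)
    if "google." not in parsed.netloc:
        return None
    terms = parse_qs(parsed.query).get("q")
    return terms[0] if terms else None

# Classic referrer: the keyword is right there in the query string.
print(keyword_from_referrer(
    "http://www.google.com/search?q=english+bulldog+diseases&hl=en"))
# prints "english bulldog diseases"

# Secure-search referrer: q= is gone, so the keyword is unknowable.
print(keyword_from_referrer("https://www.google.com/"))
# prints "None"
```

The second call is the whole problem in miniature: the visit still arrives and still counts as Google traffic, but nothing in the referrer says what was searched.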

In the example shown here, from 24 September 2013, Google search withheld 79% of all keywords. And with Google search providing as much as 80% of search traffic, few keywords leak through for publishers to use.

Keyword Not Provided to Harm Publishers

Publishers will be harmed by Google’s withholding of search keyword information. A publisher will not know with certainty what keywords land customers on each page of the publisher’s site. Several harmful consequences will result.

  1. Without knowing what actual keywords bring customers to a page, publishers will not be able to tailor the page to fulfill customer expectations and to provide the specific information customers want to see. The veterinarian who publishes a page on English Bulldogs won’t see that many people are coming to his page on “English Bulldog Diseases”, wanting ideas on how to care for their sick dogs.
  2. Publishers will have difficulty understanding high bounce rates due to unexpected traffic from related keywords that have a different meaning. The unfortunate clothing reseller with men’s and women’s belts to offer won’t be able to see that customers looking to buy industrial fan belts may also be bouncing from his site. The infamous example applies: the builder of maple night stands and desks will bounce traffic from young singles looking for one night stands. Perhaps a silly example, but there’s a point in there.
  3. Conversion optimization based on keyword or search phrase will be dead in the water. Without a keyword, there’s no way to optimize without paying Google for their PPC ads.

Despite Google’s chest-thumping that publishers should provide a good user experience for visitors, the move to conceal keywords will greatly inhibit publishers’ ability to do so.

So finally we come to the suspicion that Google won’t tell you what keyword was used unless you pay for the privilege: only Google AdWords customers will see the keywords that drive traffic to the pages in a publisher’s site.

Classic SEO Techniques Coming Back

Google’s move to hide keywords will force SEOs to bring back tools they haven’t used as often recently. Google despises some of these, but their return will be due purely to decisions Google is making now.

  • Keyword per Page – Publishers will be more often inclined to tune individual pages in their sites for a single keyword, from the page URL to on-page content, and all points in between. A lot of formerly attractive sites could end up looking like keyword-stuffed landing pages. With a page tuned for “red shoes”, the publisher will be able to infer that the keyword used was fairly close to “red shoes”.
  • Keyword Rank Tracking – Google hates SEOs who bang long lists of four or five thousand keywords against the search engine to check the search rank of each page. However, with the Google Analytics keyword data gone, and the Google Webmaster Tools data being about as accurate as mush, publishers will want to know the keywords for which each page ranks so that they can infer the incoming keyword.
  • Webserver Log File Analysis – Sometimes the webserver log file shows you customer detail that Google does not. The log file can also provide good detail on the exact sequence of pages that individual customers visit, page by page; Google has nothing like this at the individual customer level. Webserver log file analysis tools are also far cheaper than tools like ClickTale.
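As a rough illustration of the last point, here is a minimal log-analysis sketch, assuming the common Apache/Nginx “combined” log format. It pulls whatever q= keywords still leak through in referrers, and reconstructs per-visitor page sequences by IP address (a crude but serviceable visitor key). The sample log lines are invented for the example.

```python
import re
from collections import defaultdict
from urllib.parse import urlparse, parse_qs

# Matches the Apache/Nginx "combined" log format.
LOG_RE = re.compile(
    r'(?P<ip>\S+) \S+ \S+ \[(?P<time>[^\]]+)\] '
    r'"(?P<method>\S+) (?P<path>\S+) [^"]*" '
    r'(?P<status>\d{3}) \S+ "(?P<referrer>[^"]*)" "(?P<agent>[^"]*)"')

def analyze(log_lines):
    """Return (keyword counts, per-visitor page sequences)."""
    keywords = defaultdict(int)
    sequences = defaultdict(list)
    for line in log_lines:
        m = LOG_RE.match(line)
        if not m:
            continue  # skip lines that aren't in combined format
        sequences[m.group("ip")].append(m.group("path"))
        ref = urlparse(m.group("referrer"))
        q = parse_qs(ref.query).get("q")
        if q:
            keywords[q[0]] += 1
    return dict(keywords), dict(sequences)

sample = [
    '1.2.3.4 - - [24/Sep/2013:10:00:00 -0400] "GET /bulldogs HTTP/1.1" '
    '200 5120 "http://www.google.com/search?q=english+bulldog+diseases" "Mozilla/5.0"',
    '1.2.3.4 - - [24/Sep/2013:10:01:00 -0400] "GET /bulldogs/diseases HTTP/1.1" '
    '200 2048 "http://example.com/bulldogs" "Mozilla/5.0"',
]
keywords, sequences = analyze(sample)
print(keywords)   # {'english bulldog diseases': 1}
print(sequences)  # {'1.2.3.4': ['/bulldogs', '/bulldogs/diseases']}
```

The page-by-page sequence is exactly the individual-visitor detail the bullet above describes, and it survives in the logs regardless of what Google strips from the referrer.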

In addition to these ancient tools, it’s possible that publishers will resort to heavier use of site search tools to try to induce customers to re-enter their keywords. Sort of like stopping people at the door and asking for the password.

In any case, it’s hard to see Google’s concealed keywords as anything beneficial for publishers, unless those publishers sell log file analyzers, keyword rank checkers, and site search tools.

About Bruce Brownlee

Bruce Brownlee is a data scientist for AVOXI and founder of Bruce Brownlee Company. Bruce works with Google Cloud Platform tools and machine learning APIs, as well as R and Python for analysis, modeling, and product development. He has led establishment of SEO, PPC, web analytics, and web development practices for AVOXI and Bruce Brownlee Company. Bruce studied mathematical optimization, probability and statistics, and electrical engineering in college and worked as an engineer and software developer before starting web development and site tuning in 1997.
