Google has long been at the forefront of search engine technology, constantly refining and improving its online search methodologies. An example is Google Patent Search, a specialised search service that remains subject to ongoing improvement. Google has aggressively pursued its vision of collecting, retrieving and presenting information in the most accessible and useful manner, keeping the user's overall experience and the safety of personal data as top priorities. However, some of Google's recent endeavours, while welcomed by web users across the world, have proved far less popular with the Google Analytics and SEO community.
Understanding the Situation: Why Is Google Secure Search Being Criticised?
Google’s endeavour to constantly improve the quality and accuracy of searches is closely linked to Search Engine Optimization (SEO). Yes, this has helped Google emerge as the market leader in search engine technology, often dictating terms and setting benchmarks for other search engine brands. In the recent past, however, doubts have arisen about whether Google’s search engine changes are actually “useful”. For instance, Google Secure Search has not been received positively by the global SEO community or by webmasters working with Google Analytics. This is primarily because secure search makes a user’s queries more private, making it hard for online marketing experts to decode how and when their website is being found. While Google maintains that secure search is meant to guard a user’s online confidentiality, many Google Analytics experts believe it is a masked effort by Google to further raise its revenue share.
Google Secure Search versus SEO Analysts: What Is the Fuss About?
Google Secure Search was launched on October 18, 2011, directing users signed into their Google account to an encrypted, more personalised browsing experience. While an average web user might not grasp the dynamics of this change, for webmasters it is a major roadblock to identifying the behaviour of visitors to their websites. This is best understood by looking at how organic, i.e. unpaid, search results function on Google. A user who visits a website after clicking a listing on Google’s search-results page is tracked in the form of an individual query. Such data is then compiled and used to analyse how well a website engages and sustains a visitor’s attention.
Please note that under Google Secure Search, Google no longer passes the keyword in the traditional manner, i.e. via the referring URL. Thus, if you perform a search on the secure site, the keywords you searched for are not recorded as part of the referrer information. Referrer information normally includes the query string of the search, which the analytics community uses to determine which keywords have been the most profitable for bringing visitors from Google to a client’s site.
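To make the mechanism concrete, here is a minimal Python sketch of how analytics code typically pulled the keyword out of a Google referrer URL. The function name, example URLs and the "(not provided)" placeholder are illustrative, not Google's or any vendor's actual code: classic HTTP referrers carried the query in the `q` parameter, while secure-search referrers omit it.

```python
from urllib.parse import urlparse, parse_qs

def keyword_from_referrer(referrer: str) -> str:
    """Extract the search query from a Google referrer URL.

    Classic (HTTP) referrers carried the query in the `q` parameter;
    secure-search referrers strip the query string, so analytics
    tools fall back to a placeholder such as "(not provided)".
    """
    parsed = urlparse(referrer)
    query = parse_qs(parsed.query).get("q", [""])[0]
    return query if query else "(not provided)"

# Classic referrer: the keyword is visible in the URL.
print(keyword_from_referrer("http://www.google.com/search?q=buy+red+shoes"))
# Secure-search referrer: the query string is gone.
print(keyword_from_referrer("https://www.google.com/"))
```

The second call illustrates the analysts' complaint: the visit is still counted, but the keyword that drove it is no longer recoverable from the referrer.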
Until recently, webmasters used such data to refine their site’s structure and navigational features; under Google’s secure search, however, such data is highly compromised. Google Webmaster Tools presents a further limitation: the history of a search term is retained and displayed for only 30 days, and information about which internal pages of a site were browsed is quite restricted.
Google’s Explanations Have Few Takers
Google has defended itself, insisting that it is dedicated to making the browsing experience more secure. The search engine giant has explained this with an example: users on open WiFi networks are at serious risk from session-hijacking tools such as Firesheep, a Firefox extension that intercepts unencrypted browsing data on a shared network without any notification to the victim. This essentially means that users on an open WiFi network face an increased risk of having their personal data hijacked. Google therefore justifies secure search by insisting that encryption enables users on all types of connections to guard their confidential data.
However, these sentiments have not been received well by search marketers, who insist that Google is indirectly pushing more websites towards paid search, since paid (inorganic) searches fall outside the limitations imposed by secure search. This perspective is plausible, since a major chunk of Google’s revenue is sourced from its paid search advertising platform, Google AdWords.
The argument is further fuelled by the fact that search-query information remains available to paid advertisers in the same comprehensive detail as before, even as sites ranking well in organic results continue to suffer. A site’s PageRank, too, is determined by Google’s complex set of parameters, so websites appear to be at Google’s mercy for visibility on the search results page unless they upgrade to Google AdWords.
“Solutions” to Neutralize Google Secure Search Are Already Emerging!
For SEO analysts this is a major handicap, as unearthing search trends and keywords has become even more challenging under Google Secure Search. Yes, unpaid sites still receive a form of aggregated information through Google Webmaster Tools, covering the top 1,000 search queries that brought traffic to the site over the last 30 days. However, this data is abysmally limited compared to the detailed information that was accessible earlier. Some vendors, like Keystonesolutions, have come forward with their own codes that make it possible to track visitor behaviour even when the user has arrived through organic results on Google Secure Search pages.
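The article does not describe how such vendor codes work, but one plausible approach is to fall back on historical data: when a visit arrives with its keyword hidden, infer likely queries from the keywords previously known to have driven traffic to that landing page. The sketch below is entirely hypothetical; the function names and sample data are illustrative, not any vendor’s actual implementation.

```python
# Hypothetical fallback: when the keyword is hidden ("(not provided)"),
# guess likely queries from historical keyword data for the landing page.
# The sample mapping below is purely illustrative.
HISTORICAL_KEYWORDS = {
    "/red-shoes": ["buy red shoes", "red shoes sale"],
    "/blue-hats": ["blue hats online"],
}

def infer_keywords(landing_page: str, reported_keyword: str) -> list:
    """Return the reported keyword if present, else historical guesses."""
    if reported_keyword != "(not provided)":
        return [reported_keyword]
    return HISTORICAL_KEYWORDS.get(landing_page, [])

# A secure-search visit to /red-shoes yields the page's historical queries.
print(infer_keywords("/red-shoes", "(not provided)"))
```

Such inference is necessarily approximate, which is why analysts regard these workarounds as a partial remedy rather than a replacement for the referrer data they have lost.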
However, there is much more to follow in this hotly debated niche, since Google is likely to challenge the use of such codes, which are bound to eat into the company’s advertising revenue. There is a likelihood that Google will set up its own mechanisms to detect such codes and even penalise the sites using them.