Security

Google Sitemaps Flaw Draws User Concern

A flaw in Google Sitemaps is allowing users to see the statistics for AOL, MSN, and any other Internet property with a Google Verification File on its server, according to Search Engine Roundtable, a search engine marketing forum.

The flaw allows users to view the top search queries and top search query clicks for a competitor’s sites. The flaw also exposes crawl stats, page analysis, and index statistics.

Jason Dowdell, who operates MarketingShift, a blog focused on media research and technology, told TechNewsWorld that Google Sitemaps attempts to help users determine which keywords are searched the most, but a flaw in the program is backfiring on some of its users.

“Google is sharing information specific to member Web sites with anybody who finds out about this hack,” Dowdell said. “From a competitive standpoint, essentially that means you can confirm some suspicions about how your competition is doing, like what words are their top five most trafficked keywords, and use that for your own advantage.”

Google was not immediately available for comment.

The Privacy Question

Ironically, the Google Sitemap FAQ tells users that the company uses the verification process to keep unauthorized users from seeing detailed statistics about a site. But Dowdell said Google uses the same verification file for everybody.

“Only you can see these details, and only once we verify you own the site,” Google wrote. “We don’t use the verification file we ask you to create for any purpose other than to make sure you can upload files to the site.”
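
As described above, the weakness comes down to a verification step that is not tied to a specific account: if the verification file Google looks for is the same for everybody, anyone can point the tool at a competitor's domain and pass the check whenever that file, or anything answering for it, is already reachable. The Python sketch below is only an illustration of that general idea; the file name and the existence check are hypothetical and do not reflect Google's actual implementation.

```python
# Hypothetical illustration of the weakness described in the article: a
# verification check that only asks "does this file exist on the site?"
# can be satisfied by anyone if the expected file name is shared by all accounts.
from urllib import request, error

# Placeholder name -- the real verification file name and check are not public here.
VERIFICATION_FILE = "google-verification.html"

def appears_verifiable(domain: str) -> bool:
    """Return True if the shared verification file is reachable on the domain."""
    url = f"https://{domain}/{VERIFICATION_FILE}"
    try:
        with request.urlopen(url, timeout=10) as resp:
            return resp.status == 200
    except error.URLError:
        return False

if __name__ == "__main__":
    # If a competitor's server already serves the shared file (or answers 200
    # to any request), a third party could claim that site in their own account.
    print(appears_verifiable("example.com"))
```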

The flaw has some users up in arms over what they view as a privacy violation. So who is at fault? British search engine optimization specialist David Naylor, more commonly known as DaveN, wrote in his blog that Google is mostly to blame for not properly thinking through the verification process.

“But perhaps the real question should be: How much do you want to trust Google with your data when they get caught making mistakes such as this? This kind of data generally isn’t too sensitive, but imagine if we put a competitor’s site in there?” Naylor asked in his blog. “At very least we’d be able to know exactly what keywords to target.”

Looking for a Solution

SearchEngineWatch.com Editor Danny Sullivan wrote in the site’s forum that Google “chose a bad way to do things.”

“Let’s get productive. How should they rebuild security?” he asked. “I can see having to put code in robots.txt, easy to do, and nice way to say you want control over the entire domain. Something page-based also makes sense. But JavaScript on a page at the root level, seems hard for someone to fake.”

Another forum participant, TwisterMC, a Mac search engine optimization designer, agreed that Google has some fixing to do. “I think the verify code should change every time someone adds a new site to their Google Site Map admin area,” he said. “That way the code would be truly unique.”
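
The sketch below is a minimal illustration of the kind of per-site code TwisterMC describes, assuming each account-and-site pair is issued its own random token so that a code seen on one site is useless to any other account. The names and in-memory storage are purely illustrative and are not Google's actual design.

```python
# Minimal sketch of a unique-per-site verification code, as suggested in the forum:
# each (account, site) pair gets its own random token, so a code issued to one
# account cannot be reused by anyone else. Hypothetical names and storage.
import secrets

# In a real system this would be a database; a dict keeps the sketch self-contained.
_tokens: dict[tuple[str, str], str] = {}

def issue_verification_code(account_id: str, site: str) -> str:
    """Create (or return) the unique verification code for this account/site pair."""
    key = (account_id, site)
    if key not in _tokens:
        _tokens[key] = secrets.token_hex(16)  # 32 hex chars, unguessable
    return _tokens[key]

def is_verified(account_id: str, site: str, code_found_on_site: str) -> bool:
    """Only the code issued to this specific account verifies ownership."""
    expected = _tokens.get((account_id, site))
    return expected is not None and secrets.compare_digest(expected, code_found_on_site)

# Example: the code issued to one account does nothing for another account.
alice_code = issue_verification_code("alice", "example.com")
print(is_verified("alice", "example.com", alice_code))    # True
print(is_verified("mallory", "example.com", alice_code))  # False
```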

Google had made no public statement about its plans to correct the problem at the time this article was written.
