SPECIAL REPORT

What Web Performance Metrics Really Mean

As reliance on and investment in e-commerce grow, so too does companies’ overall investment in Web monitoring software and services. In 2002, the average Web analytics company saw sales increase 15 percent, according to Forrester Research. Solutions for both in-house and external monitoring are available from a slew of companies, at costs ranging from free to thousands of dollars.

Still, no matter how sophisticated the underlying technologies, determining what all this data really means can be a challenge. Web performance companies sometimes report that a given transaction at a company’s site was successful only 95 percent — or even 80 percent — of the time. If one-fifth of customers really were being turned away, that would be terrible news for e-commerce vendors and the general public.

Fortunately, industry executives dispute this interpretation of performance data, citing multiple variables often unrelated to technology that could affect the rate of success for online transactions.

“In general, transaction success rates are much higher than 95 percent,” Ken Godskind, vice president of marketing at Web performance tracker AlertSite, told the E-Commerce Times.

The Abandonment Factor

For example, some failed transactions might be due to user abandonment rather than a technological mishap.

“Abandonment is a very difficult thing to track,” Ed Gondek, an interactive designer at IBM Tivoli in Raleigh, North Carolina, told the E-Commerce Times. “It’s all about the user experience.”

Some transactions are dubbed “unsuccessful” because the user merely was comparing prices or features with no intention of buying anything, Gondek said. Also, at busy times of the year, such as Christmas, Valentine’s Day and Mother’s Day, heavy traffic can slow response times, pushing some transactions into “failure” mode because they exceed time-out constraints.

“When companies are reporting that 20 percent of transactions fail, to be quite blunt, it’s really a leap of faith,” Gomez editorial director Alan Alper told the E-Commerce Times. “There’s no way that figure can be supported. The transaction may actually have gone through.”

Added John Lovett, senior performance analyst at Gomez: “From our data, we’re seeing that’s simply not happening.”

Beat the Clock

Developers of performance metric software and solutions can tailor their programs to meet individual clients’ needs. However, most have a default time limit before they deem a transaction unsuccessful. Gomez, for example, uses 25 seconds per page, Lovett said.

“There is a time-out element to the transaction,” Roopak Patel, senior Internet analyst at Keynote Systems, agreed in an interview with the E-Commerce Times. “We have a default number we use typically. This, as a rule, is 12 seconds per page. We believe that’s a pretty reasonable threshold.”

AlertSite customers select the cut-off line between success and failure, according to Godskind, and they can choose anywhere from 1 to 90 seconds. “Very few customers set it below 30 seconds,” he noted.
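In practice, such a threshold is simply a deadline wrapped around each step of a scripted transaction. The Python sketch below illustrates how a per-page timeout might be enforced; it is illustrative only, and the URLs and the check_page helper are hypothetical rather than any vendor’s actual tooling.

import time
import urllib.request

# Hypothetical per-page deadline, in seconds. For comparison: Gomez
# cites 25 seconds, Keynote defaults to 12, and AlertSite customers
# choose anywhere from 1 to 90.
PAGE_TIMEOUT = 12.0

def check_page(url: str, timeout: float = PAGE_TIMEOUT) -> dict:
    """Fetch one page of a scripted transaction and classify the result.

    A page that misses the deadline is logged as a failure even though
    the purchase may still have completed on the server side, which is
    one reason reported failure rates can overstate real ones.
    """
    start = time.monotonic()
    try:
        # The timeout bounds each blocking socket operation; a rough
        # stand-in for a true whole-page deadline.
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read()
        return {"url": url, "ok": True,
                "seconds": time.monotonic() - start, "bytes": len(body)}
    except OSError as exc:  # timeouts, connection errors, HTTP errors
        return {"url": url, "ok": False,
                "seconds": time.monotonic() - start, "error": str(exc)}

# A two-step "transaction" fails if any page misses its deadline.
pages = ["https://shop.example.com/cart", "https://shop.example.com/checkout"]
success = all(check_page(u)["ok"] for u in pages)

Note that lowering PAGE_TIMEOUT from 25 to 12 seconds can, by itself, reclassify slow-but-successful purchases as failures.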

Also, while some monitoring companies test only against high-bandwidth connections, others include dial-up connections as well, and some even focus on them.

“We actually do our testing from both a broadband perspective and dial-up,” Lovett said. “We try to evaluate from the last mile.”
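Connection speed interacts directly with those time-out thresholds. A bit of back-of-the-envelope Python, using illustrative page sizes and line speeds rather than any vendor’s figures, shows why a page that sails through on broadband can hit a 12- or 25-second deadline over a modem:

# Illustrative assumptions: a 150 KB page, a 56 kbps modem, and a
# 1.5 Mbps DSL line. Transfer time = page size / line throughput.
PAGE_BYTES = 150_000
DIALUP_BYTES_PER_SEC = 56_000 / 8        # ~7 KB/s
BROADBAND_BYTES_PER_SEC = 1_500_000 / 8  # ~187 KB/s

dialup_seconds = PAGE_BYTES / DIALUP_BYTES_PER_SEC        # ~21.4 s
broadband_seconds = PAGE_BYTES / BROADBAND_BYTES_PER_SEC  # ~0.8 s

At roughly 21 seconds before any server time is counted, the dial-up fetch is already brushing against Gomez’s 25-second cutoff, which is why testing “from the last mile” changes the picture.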

User Perspective

Another reason for seemingly high transaction “failure” rates could be companies’ move to define failure on a subjective level, viewing the experience through a consumer’s eyes, as well as on an empirical level that takes only the numbers into account.

“We are measuring Web sites not necessarily from the inside of the Web site, but from the outside of the Web site as the customer would experience it,” Alper said. “We look at things from a user-intention perspective.”

The user’s experience is most important, TeaLeaf vice president Geoff Galat agreed. “An end user can access a Web application that delivers a blank screen. That page will no doubt load very quickly, and the system is, indeed, up. To twentieth-century notions of availability and performance, that application has succeeded, but that user’s attempt at transacting business has failed,” he told the E-Commerce Times.

“Most e-commerce companies today rely upon system uptime and page download speed as arbiters of a site’s success. At TeaLeaf, we believe the only arbiter of an application’s success is the end user’s ability to conduct business with that application,” Galat added. “And the only way to truly measure that success is to capture each and every user’s complete Web session.”
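In code terms, the distinction Galat draws is between confirming that a response arrived quickly and confirming that it contained something a user could act on. Below is a minimal sketch of the idea, assuming a hypothetical URL and marker string; it is not TeaLeaf’s actual method, which captures complete user sessions.

import urllib.request

def user_visible_success(url: str, expected_marker: str,
                         timeout: float = 12.0) -> bool:
    """Return True only if the page loads AND shows content a user can act on."""
    try:
        # urlopen raises for connection errors and 4xx/5xx statuses, so
        # reaching read() means the server reported success.
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            body = resp.read().decode("utf-8", errors="replace")
    except OSError:
        return False  # timeout, connection failure, or HTTP error
    # "Up and fast" is not enough: a blank 200 loads quickly but fails the user.
    return len(body.strip()) > 0 and expected_marker in body

# Hypothetical usage: the checkout page must actually render an order form.
ok = user_visible_success("https://shop.example.com/checkout", "Place your order")

By this measure, the blank screen in Galat’s example fails even though uptime and download-speed metrics would score it a success.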

Read Between the Lines

While they may differ in their approaches to monitoring, industry executives agree that companies that rely on e-commerce must invest in technology to boost performance. If they do not, consumers will choose other sites, telephone sales, or their local mall, Galat cautioned.

“When you begin to look at application success from perspectives other than uptime and download speed, you see that myriad other possibilities for failure exist,” he said. “In essence, this amounts to a ‘death by a thousand cuts’ scenario. I believe customers are not coming back [if a transaction fails]. They are going to other Web sites, picking up the phone, or heading to the mall.

“There has been a good deal of self-congratulation with respect to e-commerce, and much of it is deserved,” Galat added, “but e-commerce won’t truly hit its sweet spot until companies start measuring success or failure of their Web customers in the same manner as their in-person customers.”
