
EXCLUSIVE INTERVIEW

AOL CPO Jules Polonetsky on Hitting the Privacy Sweet Spot

The privacy requirements of Internet users can vary widely. Some divulge the most personal details of their lives on blogs and social networking sites; others want to remain as anonymous as possible.

Users know they have control over what they push onto the Web — but what about data that portals and other online giants retain on the back end? Online firms are piling up user data to get to know their users better — and to serve ads based on keywords in searches and e-mails.

Although some online users are delighted when companies match their shopping patterns with on-target suggestions, others are discomfited by the feeling they’re being tracked and resent invasions of their privacy.

Online giants' efforts to quell users' fears about data retention haven't been helped much by privacy policies, which often run upwards of a thousand words and are written in ponderous legal language.

AOL, for one, wants to streamline its communications to make users more aware of what’s happening to their data and what their rights are.

The E-Commerce Times recently spoke with Jules Polonetsky, AOL’s senior vice president and chief privacy officer, about data retention, privacy regulations and education.

E-Commerce Times: What user data does AOL retain, and how long is it kept?

Jules Polonetsky:

AOL is a combination of a number of different businesses and services. The AOL ISP (Internet service provider) that people use to dial up retains information for a very limited amount of time, generally just long enough to ensure that people are connected.

AOL is also a portal. We own and operate a number of Web sites, and the features on those Web sites have different retention periods based on what their particular service is. For instance, there is a myAOL personalized home page for folks that don’t want to do a lot of work dragging modules around and like the idea of clicking and saying, “I like these kinds of articles, so find me things that are related to that and personalize my home page for me.” That has a feature that allows users to simply clear [information] because they don’t want that history to be displayed anymore. …

On our ad network side of the business, where Platform A is the structure that includes Tacoda and Advertising.com and Quigo, they serve ads on lots and lots of Web sites, and depending on what the relationship with the Web site is, [we] may end up having data long enough to deliver an ad and provide a report.

One area where we’ve been very specific about having a retention policy is in the area of search data. When I work with businesses to try to set these retention periods, my focus is on what is the area where there are [privacy] reasons to have tighter retention periods? And then what are the areas where [the data] — well, it’s summary reporting, it’s log file data used for sort of auditing but it’s not used for anything, it’s not available — doesn’t really have a significant privacy ramification? Then, what is the data that is more personal? What is the data that is more related to “here’s the things I’m looking for online?” That’s obviously data that has a higher priority.

We keep search data, for instance, for no more than 13 months because we’re aware that it is a type of data that is of much greater privacy impact. [We keep] some data for hardly any time at all, literally traffic running through the ISP. …

We’ve been in the midst of looking at the different kinds of data and trying to figure out what should we make a priority for quicker deletion, what is really needed for auditing, for fraud, for delivery of the service, for research, and then what isn’t needed at all. Search is the one where we’ve been most definitive, because that’s been the highest priority.
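In practice, a retention policy like the one Polonetsky describes often reduces to a mapping from data category to maximum age, enforced by a periodic purge job. The sketch below is purely illustrative: only the 13-month search figure comes from the interview, while the other category names, periods and purge logic are assumptions for the example, not a description of AOL's actual systems.

```python
from datetime import datetime, timedelta, timezone
from typing import Optional

# Hypothetical retention periods by data category. Only the 13-month search
# figure is mentioned in the interview; the rest is invented for illustration.
RETENTION = {
    "search_logs": timedelta(days=396),        # roughly 13 months
    "isp_connection_data": timedelta(days=2),  # kept just long enough to keep users connected
    "ad_serving_logs": timedelta(days=90),     # reporting, auditing and fraud review
    "summary_reports": timedelta(days=730),    # aggregated data with low privacy impact
}

def is_expired(timestamp: datetime, category: str, now: Optional[datetime] = None) -> bool:
    """Return True if a record in `category` has outlived its retention period."""
    now = now or datetime.now(timezone.utc)
    return now - timestamp > RETENTION[category]

def purge(records: list) -> list:
    """Keep only the records that are still inside their retention window."""
    return [r for r in records if not is_expired(r["timestamp"], r["category"])]
```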

ECT: How else is this information used?

Polonetsky:

It depends on the product. Users visit Web sites that are part of the Advertising.com network, for instance. Advertisers purchase ads across the Web sites in that network, Advertising.com delivers the ads, the users hopefully like the content they’re seeing, appreciate that it’s supported by the ads — and sometimes perhaps they even click on [ads] because they want to know more about something. The advertisers expect to get reports that tell them how many ads were delivered, because we’re charging them. How many of those ads got clicks? … That sort of data generated from the process of serving ads will be kept and used to generate reports for advertisers that indicate what happened in the ad-serving process.

They also would need to go back and figure out false clicks. You have advertisers and others who might try to generate fake traffic in order to get paid per click, and you might learn that you’re getting lots of strange clicks from a particular foreign jurisdiction and they seem to be coming from a whole collection of interesting machines there. You learn that that’s actually a bot network, and then you may want to go back and pull those back out and not charge for what are fake clicks.
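The fraud check Polonetsky sketches, spotting implausible bursts of clicks from a small set of machines and pulling them back out of the advertiser's bill, can be approximated as a simple threshold filter over click logs. The sketch below is a toy under stated assumptions: the record fields, threshold and report format are invented for illustration and are not how Advertising.com or Platform A actually worked.

```python
from collections import Counter

# Each click record is assumed to look like {"ip": str, "campaign": str, "charged": float}.
MAX_CLICKS_PER_IP = 20  # hypothetical threshold for "suspiciously many clicks"

def filter_suspected_fraud(clicks):
    """Split clicks into billable and suspected-fraud buckets by per-IP volume."""
    per_ip = Counter(c["ip"] for c in clicks)
    suspect_ips = {ip for ip, n in per_ip.items() if n > MAX_CLICKS_PER_IP}
    billable = [c for c in clicks if c["ip"] not in suspect_ips]
    suspected = [c for c in clicks if c["ip"] in suspect_ips]
    return billable, suspected

def advertiser_report(impressions, clicks):
    """Summary an advertiser might receive: impressions, clicks and adjusted charges."""
    billable, suspected = filter_suspected_fraud(clicks)
    return {
        "impressions": len(impressions),
        "clicks_billed": len(billable),
        "clicks_removed_as_suspect": len(suspected),
        "amount_charged": sum(c["charged"] for c in billable),
    }
```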

ECT: What’s your take on regulatory efforts to limit the time period during which user data is retained?

Polonetsky:

My general view is that this is an area where businesses have first started getting really serious and are taking a hard look at the reasons why they need data to provide services, and [are evaluating] some of the secondary reasons and where to draw those limits. It took us quite a while to figure out exactly what we thought the time we needed for search data was, and we’re taking a hard look at other areas. …

I think it’s early in the U.S., and I’m very optimistic that businesses are putting this on their radar screens for the first time. … If I were sitting down and trying to create hard-and-fast rules, I don’t know exactly where I would draw the lines in every particular business model. I think it’s a bit early for regulatory efforts because so many of the uses aren’t defined well enough yet.

ECT: Why is online privacy protection a controversial issue? What do the companies want, as opposed to what the regulators want?

Polonetsky:

I think there’s sometimes a very knee-jerk reaction to any sort of restrictions. And I think the business folks [will] tell you they want certainty. Tell them what the rules are, and they will compete. When there’s noise and Sturm und Drang, that’s not good for consumers and that’s not good for the business environment. It just creates confusion in the marketplace. I think it’s important that there are good standards about how to handle information and how to give users control online. But I do think that there are so many business models, so many different nuances of what consumers actually want, that it can be easy to rush to simple solutions.

Who knew? I’m a social networking addict. I have hundreds of friends on Facebook, and I love them all and I love pushing out information to them and hearing what they’re up to. AOL recently bought Bebo, and so now I’m on Bebo as well. And I love it. I’m an extreme extrovert. Instead of spamming all of my friends, telling them, “hey, I’m doing a fund-raiser here, I’m organizing this event,” I can just push it out.

But who would have known years ago that people wanted to [do that], and that there would be an entire generation that expected to be easily broadcasting information about themselves to hundreds of people that they were connected with? You easily could have said, “Well, that would be crazy. That’s illegal. Nobody should be allowed to do that.”

AOL is one of the inventors of the Buddy List, and the patent holder — for many years, when we would be having privacy discussions and I would be insisting on notices, controls or options to be turned off on products that he was proposing — said to me, “You would have told me when we were rolling out the Buddy List that that ought to be opt-in.” And I said, “You know, I guess so. I mean, people know that you’re online. Why should someone know you’re online just because they know your screen name?”

And of course, the whole point at the end of the day is that I can IM you because I see that you’re online. It works nicely, and it is why people use instant messaging — because they can instantly ping you online.

Understanding the evolving consumer interest and concerns here is, I think, the challenge. And I think you’re seeing businesses starting to step up and give users more control in this area. You’re seeing people establish different limitations and adding different options, different defaults and tweaking things — and sometimes you see the pushbacks. …

We’re constantly testing and tweaking. What we’re learning is that everyone cares about privacy, but what that means to each person can be very different. What it means to the 20-something who lives in a semi-public interconnected way, and what that means to me and what that means to my mom are very different. And we also have learned that communicating about privacy is interesting. We did some interesting research a little while ago. We said, “How can we properly even talk to people?” An average person wants to get their mail or they’re looking to check the weather. They’re not looking to click and read a 12-page privacy policy with lots of very careful legalese and technical considerations. How can we actually even have a meaningful conversation with a user?

[This month], we launched an animated penguin that tells users that behavioral targeting is happening, and in a cute but informative way lets them understand how it works and how they can opt out of it. You could have all kinds of privacy policies and all kinds of opt-in and opt-out requirements and no one would look at it — and if they did, they probably wouldn’t understand it. Here, we’re using an educational device to capture people’s interest and inform them. We found in the research that we did that a substantial number of people were very interested in different modes — they wanted cartoons, they wanted video or they wanted diagrams in addition to a paragraph in a privacy policy. We found that the younger audience under 35 was particularly interested in experimenting with multi-modal forms of communication. So, we’re really just starting to learn how to talk to users about privacy in a way that’s meaningful.

We launched banners that invite people to learn about how advertising works online and what people’s choices are. When you click on the banner, you see this penguin who goes to AnchovyGourmet.com. Visually, you see that he gets a tailored ad: he went to AnchovyGourmet, [an ad company sends a cookie to his computer], and then he goes to the next Web site, the ad network reads the cookie and realizes that this is someone who likes anchovies, and even though he’s now checking the weather at PenguinTimes, he gets an anchovy ad.

It’s a little general. It doesn’t have every technological point in there and every legal point. But one of the things I think we have done in the industry with privacy policies is that we’ve been so concerned about the legalese and the technical accuracy and exactly what the regulator wants to say, that we’ve lost the average user.

This is our attempt to try to talk to regular people. AOL’s success was in large part based on an ability to make the Internet and the Web easy for beginning users. Now that we are on the advertising and content side of the business, we’re trying to capture at least some of that DNA to talk to consumers about advertising and data. Hopefully, we will be successful, and we’ll see users feeling more in control. I think that it will address a lot of the privacy concerns.

In our campaign … we have [created] general consumer education, catching the user who is looking for the quick, easy, less-than-a-minute gist. We also think that there is a technical advocate expert audience — the watchdogs — [that] craves greater detail. They want to know how cookies and IP addresses work, and what’s routine. They get deeper into the weeds. So we also launched a privacy blog connected to that campaign called “PrivacyGourmet,” and the motto is “Add Transparency,” like a gourmet cook might add an ingredient. And we started the process of putting out the nuts-and-bolts details for the person who wants to understand what’s happening behind the scenes.
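For readers who want a bit more than the cartoon, the cross-site flow the penguin illustrates amounts to an ad network writing its own cookie on one site and reading it back on the next. The code below is a toy simulation under assumptions: the site names, interest labels and in-memory "cookie jar" standing in for the browser are invented for illustration, and this is not AOL's or Platform A's implementation.

```python
# Toy simulation of cookie-based behavioral targeting across two sites.
# The "cookie jar" stands in for the browser's cookie store; names are invented.

cookie_jar = {}  # cookies the ad network has set in this browser

def serve_ad(site: str, interests_on_site: list) -> str:
    """Ad network logic run when a page on `site` requests an ad."""
    profile = cookie_jar.setdefault("adnetwork_interests", set())
    # Record interests implied by the current site (e.g. visiting AnchovyGourmet.com).
    profile.update(interests_on_site)
    # Pick a tailored ad if the profile says anything useful, otherwise a generic one.
    if "anchovies" in profile:
        return f"[{site}] showing anchovy ad (cookie says this user likes anchovies)"
    return f"[{site}] showing generic ad"

# The penguin visits AnchovyGourmet, then checks the weather at PenguinTimes.
print(serve_ad("AnchovyGourmet.com", ["anchovies"]))
print(serve_ad("PenguinTimes.com", []))  # the tailored ad follows him via the cookie
```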

ECT: Do you think this is a trend with other online companies?

Polonetsky:

I do. You see Google launching video channels to have their engineers lecture about how cookies work. You see eBay delivering ads that have a little ad choice notice in them. I think you’re seeing the leading companies recognizing that if they don’t do their advertising in a trustworthy manner, it’s not going to be accepted by consumers, and then it won’t work for advertisers either.

ECT: What can companies offer users who suffer privacy breaches besides an apology?

Polonetsky:

I think that the key thing for businesses to do is figure out how to properly communicate with users so that you can tailor what you provide to their expectations. … If you overreach, you’re going to quickly lose your customer. And so, [it comes down to] hitting that sweet spot where the information exchange is one that the consumer sees as valuable.

I love it when I’m at a shopping site and they recommend something based on my past purchase. I get why they’re doing it. It indeed does surface books or products that I’m interested in, and it doesn’t scare me. If I don’t understand what’s happening and I’m feeling targeted and tracked and intruded on and interrupted, that’s not going to work for advertisers, that’s not going to work for consumers, and the model fails. So, getting the right balance — having the data that is useful for advertisers but that does provide a relevant benefit to users that they control — that’s what we’re all looking to accomplish.
