
FTC Issues Regulatory Warning on Big Data Use

By John K. Higgins
Jan 20, 2016 5:00 AM PT

The U.S. Federal Trade Commission is extending its regulatory reach to the e-commerce impact of big data.

For years, the FTC has vigorously asserted its authority to apply existing consumer protection laws to emerging developments in information technology.

Now it is signaling that it will apply that same vigor to big data under the regulatory authority it possesses through the Federal Trade Commission Act and other laws.

The commission "will continue to monitor areas where big data practices" could violate those laws "and will bring enforcement actions where appropriate," it said in a report issued this month.

The title of the report, "Big Data: A Tool for Inclusion or Exclusion? Understanding the Issues," and the rationale for its publication seem to indicate that the scope of the document is limited to the FTC's concerns about imbalanced outcomes of big data. Those concerns include potential negative impacts on low-income and underserved segments of society, such as discrimination in lending and job opportunities.

Legal Standard

The commission will not hesitate to enforce FTC Act prohibitions against unfair and deceptive practices related to big data in all applications, not just those affecting a particular segment of the population, it said in the report.

It will utilize the same legal basis for big data situations that it uses for cases involving the hacking of consumer records, identity theft, and fraudulent misrepresentations in e-commerce transactions. That legal leverage "is not confined to particular market sectors but is generally applicable to most companies acting in commerce," the FTC said.

In a clear warning to businesses, the FTC cited two significant e-commerce enforcement cases with implications for big data regulation:

Disclosure: One case involved charges of misrepresentation and deception against a credit card marketing company for failure to disclose all the conditions affecting cardholders. While the key issue centered on disclosure, an underlying condition involved the use of behavioral scoring data affecting credit availability to customers. Generating increasingly detailed information on consumer behavior, of course, is a major element in the commercial use of big data. In a settlement, the company was prohibited from repeating the misrepresentation practices.

Identity theft: In another enforcement action, the commission cited a company for selling personal information to third-party identity thieves posing as legitimate subscribers. The FTC contended that the company neglected to adequately vet the third party's status, despite the presence of red flags. Such cases "show, at a minimum, companies must not sell their big data analytics products to customers if they know or have reason to know that those customers will use the products for fraudulent purposes," the commission said.

The situations outlined in the two cases fall under the basic FTC Act but do not cover all possible scenarios. The commission's legal authority regarding unfair and deceptive practices covers activities that involve "a material statement or omission that is likely to mislead a consumer acting reasonably under the circumstances," the FTC said.

"If a company violates a material promise -- whether that promise is to refrain from sharing data with third parties, to provide consumers choices about sharing, or to safeguard consumers' personal information -- it will likely be engaged in a deceptive practice," it noted.

"Companies that maintain big data on consumers should take care to reasonably secure that data commensurate with the amount and sensitivity of the data at issue, the size and complexity of the company's operations, and the cost of available security measures," the commission said.

Regulatory Road Map

The report underscored the FTC's commitment to vigilant regulatory monitoring of the consumer protection and security impacts of e-commerce advancements.

"The regulated community would be wise to pay attention to the FTC's big data report. As with any formal FTC pronouncement, it provides a road map as to the FTC's thinking and likely future enforcement agenda," said Lisa Sotto, a privacy law specialist at Hunton & Williams.

"It is particularly important to remember that existing laws apply to the use of big data. Too often, companies fail to consider how existing legal regimes might pose an obstacle to the manner in which data collected as part of a big data initiative may be used. The FTC's reminder of this point should serve as a cautionary word to companies that are overly aggressive in their use of big data," she told the E-Commerce Times.

"The report does not offer the FTC's thoughts as to what companies can do to avoid violating existing laws when using big data," said Scott Talbott, senior vice president for government affairs at the Electronic Transactions Association.

"The report sounds an ominous tone, which could have a chilling effect on the use of big data," he told the E-Commerce Times.

Other Laws

The FTC's enforcement of two other laws -- the Fair Credit Reporting Act, or FCRA, and the Equal Credit Opportunity Act, or ECOA -- could affect companies that collect, analyze, market and use big data, especially data related to consumers.

FCRA applies to companies known as consumer reporting agencies, or CRAs, that compile and sell consumer reports containing information that can be used for credit, employment, insurance, housing, or other decisions about eligibility for certain benefits and transactions, the FTC noted. Conventional tools CRAs use include debt payment histories, rental payments, liens and even bankruptcy filings.

Enhanced consumer data generated by big data collection and analytics also will be subject to FTC enforcement under FCRA, the commission said, citing examples such as ZIP code identification, shopping history and social media usage.

Whether conventional data or enhanced big data resources are used, "the standards applied to determine the applicability of the FCRA in a commission enforcement action, however, are the same," the FTC said.

ECOA prohibits credit discrimination on the basis of race, color, religion, national origin, sex, marital status, age, or because a person receives public assistance. A violation may arise from "disparate treatment," in which a creditor treats an applicant differently based on a protected characteristic, or from a facially neutral practice that causes a "disparate impact" on a protected group.

For example, "a lender cannot refuse to lend to single persons or offer less favorable terms to them than married persons even if big data analytics show that single persons are less likely to repay loans than married persons," the FTC said.

Another example would be the use of ZIP code data that could trigger a violation if it could be linked to ethnic group discrimination, the commission noted.

Big data can provide huge economic and social benefits -- as well as potentially negative outcomes, the FTC acknowledged. The report made abundantly clear that major advances in technology will meet with a commensurate regulatory response.

The day after the report was released, FTC Commissioner Julie Brill emphasized that point.

"Eighty years ago, Congress gave the FTC authority to protect consumers from a broad range of unfair or deceptive acts or practices," she said at a Privacy Summit sponsored by the governor of Washington.

Under that authority, the FTC has initiated nearly 100 privacy and data security enforcement actions, Brill noted.

"The flexibility and breadth of our authority to obtain remedies that protect consumers has allowed us to keep up with rapid changes in technology," she said.


John K. Higgins is a career business writer, with broad experience for a major publisher in a wide range of topics including energy, finance, environment and government policy. In his current freelance role, he reports mainly on government information technology issues for ECT News Network.

