FTC Warns Targeted Ads and Big Data Can Be Civil Rights Issues


Using big data to target ads for financial products could mean “low-income consumers who may otherwise be eligible for better offers may never receive them.”

The same data analysis techniques that help businesses send ads to their target market could unfairly exclude other consumers, a new Federal Trade Commission report warns.

FTC this week issued a 50-page document outlining potential downsides to big data techniques. It also recommended that companies using these techniques ensure they aren't violating federal laws, including the Equal Credit Opportunity Act and the Fair Credit Reporting Act.

The report was compiled from discussions at FTC's 2014 workshop, titled "Big Data: A Tool for Inclusion or Exclusion?" During the workshop, participants raised several concerns about ways common big data practices, if misused, could marginalize individuals.

For instance, the report said, using big data to target ads for financial products could mean “low-income consumers who may otherwise be eligible for better offers may never receive them.”

Some applications might directly violate the Equal Credit Opportunity Act, which prohibits credit discrimination because of race, religion, color, national origin, sex, age, marital status or public assistance, according to FTC, which enforces it. Violations of that law usually involve the “disparate treatment” or “disparate impact” of an individual because of those factors.

“[S]uppose big data analytics show that single women are more likely to apply for subprime credit products. Would targeting advertisements for these products to single women violate ECOA?” the FTC report asked. Prohibiting single women from applying simply because they’re single women would be a violation, but “what if a single woman would qualify for the prime product, but because of big data analytics, the subprime product with a higher interest rate is the only one advertised to her?”

“[C]ompanies should proceed with caution in this area,” FTC advised.

Concerns about big data and discrimination aren’t new; the federal government has been discussing the potential civil rights implications for years. In 2014, a White House working group posed similar questions, and suggested the government continue exploring how to “harness big data for the public good while guarding against unacceptable uses against citizens.”

A Senate committee in 2013 reviewed data brokers, who sell data by grouping consumers into “buckets” for targeted marketing. (While targeted ads can sometimes help consumers by showing them products they might need, “it can become a different story when buckets describing consumers using financial characteristics end up in the hands of predatory businesses seeking to identify vulnerable consumers, or when marketers use consumers’ data to engage in differential pricing,” that report said.)

In its report, FTC also noted that while traditional scoring relies on credit indicators such as late payments, and compares them to historical data from other consumers with the same credit characteristics, predictive analytics might compare one particular characteristic of a consumer to others who share that same characteristic to predict his or her ability to meet credit obligations.

“The difference is that, rather than comparing a traditional credit characteristic, such as debt payment history, these products may use nontraditional characteristics—such as a consumer’s zip code, social media usage, or shopping history—to create a report about the creditworthiness of consumers that share those nontraditional characteristics,” the report said.
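To make that distinction concrete, here is a minimal, hypothetical Python sketch contrasting a traditional score built from a consumer's own payment history with a score inferred from how other consumers sharing a nontraditional characteristic, such as a zip code, repaid. The data, field names and scoring rules below are illustrative assumptions, not anything specified in the FTC report.

```python
# Hypothetical sketch: traditional vs. characteristic-based credit scoring.
# All data, names and scoring rules are illustrative assumptions.

from collections import defaultdict
from statistics import mean

# Each record: (consumer_id, zip_code, late_payments, repaid_on_time)
HISTORY = [
    ("a1", "20001", 0, True),
    ("a2", "20001", 3, False),
    ("a3", "20001", 1, True),
    ("b1", "20002", 0, True),
    ("b2", "20002", 0, True),
]

def traditional_score(late_payments: int) -> float:
    """Score the individual on her own credit history (fewer late payments -> higher score)."""
    return max(0.0, 1.0 - 0.2 * late_payments)

def zip_based_score(zip_code: str) -> float:
    """Score the individual on how *others* in the same zip code repaid --
    the 'nontraditional characteristic' pattern the report describes."""
    by_zip = defaultdict(list)
    for _, z, _, repaid in HISTORY:
        by_zip[z].append(1.0 if repaid else 0.0)
    return mean(by_zip[zip_code]) if by_zip[zip_code] else 0.5

# A consumer with a spotless personal record who happens to live in zip 20001:
print(traditional_score(0))      # 1.0   -- judged on her own history
print(zip_based_score("20001"))  # ~0.67 -- judged on her neighbors' history
```

The point of the sketch is simply that the second score can fall even when the individual's own record is perfect, because it is driven by the behavior of everyone else who shares the characteristic.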

A company could then make a credit decision based on a zip code; if it were to do so, it could “have a disparate impact on particular ethnic groups because certain ethnic groups are concentrated in particular zip codes,” the report said.
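The report does not prescribe a test for spotting that effect, but one common way practitioners screen for disparate impact is to compare favorable-outcome rates across groups. The sketch below uses the "four-fifths" ratio, a heuristic borrowed from employment-law guidance, purely as an illustrative threshold; the groups and counts are invented.

```python
# Illustrative disparate-impact check: compare favorable-outcome rates by group.
# The 80% (four-fifths) threshold is a heuristic borrowed from employment
# guidance, used here only as an example; the groups and counts are invented.

def selection_rates(outcomes_by_group):
    """outcomes_by_group: {group: (num_favorable, num_total)} -> {group: rate}"""
    return {g: fav / total for g, (fav, total) in outcomes_by_group.items()}

def disparate_impact_flags(outcomes_by_group, threshold=0.8):
    rates = selection_rates(outcomes_by_group)
    best = max(rates.values())
    # Flag any group whose rate falls below `threshold` of the best-off group's rate.
    return {g: rate / best < threshold for g, rate in rates.items()}

# Hypothetical offer rates for a credit product targeted by zip code:
offers = {
    "group_a": (450, 500),   # 90% receive the prime offer
    "group_b": (300, 500),   # 60% receive the prime offer
}
print(selection_rates(offers))        # {'group_a': 0.9, 'group_b': 0.6}
print(disparate_impact_flags(offers)) # {'group_a': False, 'group_b': True}
```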

These incidents would have to be analyzed on a fact-specific, case-by-case basis, according to FTC.

FTC recommended companies think carefully about their big data practices and consider factors such as how complete their databases are, how fully their data reflect certain populations, how accurate their predictions really are, and whether their practices raise ethical concerns.
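As a rough sketch of what such a self-check could look like in practice, a company might routinely measure how fully its data cover the populations it serves and how accurate its predictions are within each group. The metrics, field names and sample data below are assumptions for illustration, not FTC requirements.

```python
# Hypothetical audit helpers for two of the questions the report raises:
# how fully the data reflect certain populations, and how accurate the
# predictions really are for each of them. Field names are assumptions.

def coverage_by_group(records, population_counts):
    """Share of each group's real-world population that appears in the dataset."""
    seen = {}
    for r in records:
        seen[r["group"]] = seen.get(r["group"], 0) + 1
    return {g: seen.get(g, 0) / n for g, n in population_counts.items()}

def accuracy_by_group(records):
    """Fraction of correct predictions within each group."""
    correct, total = {}, {}
    for r in records:
        g = r["group"]
        total[g] = total.get(g, 0) + 1
        correct[g] = correct.get(g, 0) + (r["predicted"] == r["actual"])
    return {g: correct[g] / total[g] for g in total}

records = [
    {"group": "urban", "predicted": True,  "actual": True},
    {"group": "urban", "predicted": False, "actual": True},
    {"group": "rural", "predicted": True,  "actual": True},
]
print(coverage_by_group(records, {"urban": 4, "rural": 10}))  # {'urban': 0.5, 'rural': 0.1}
print(accuracy_by_group(records))                             # {'urban': 0.5, 'rural': 1.0}
```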

In the meantime, FTC will continue to monitor big data practices, the report said.

(Image via Bloomua/Shutterstock.com)