
Apple Card Algorithm May Tilt Favorably Toward Men

New York’s Department of Financial Services has initiated an investigation into a tech entrepreneur’s complaint that credit limits for the new Apple Card are based on gender-biased algorithms.

 

Thanks @QuestCNN and @PaulaNewtonCNN for having me on. Intentional or not, discrimination against protected classes of individuals is prohibited under NY law. https://t.co/CbRgUuDBMI

— Linda Lacewell (@LindaLacewell) November 11, 2019

The investigation, which DFS Superintendent Linda Lacewell announced Saturday, apparently stems from a series of tweets David Heinemeier Hansson posted starting last Thursday, revealing that his Apple Card credit limit was 20 times higher than his wife’s.

 

The @AppleCard is such a fucking sexist program. My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does. No appeals work.

— DHH (@dhh) November 7, 2019

Hansson is the creator of the web application framework Ruby on Rails and a cofounder of Basecamp, the project management and team communication tool.

The tweets quickly gained momentum, with commenters focusing on the gender disparity in particular and on ill-conceived algorithms automating credit decisions in general, and calling for corrective action.

Although Hansson did not provide income-related details for himself or his wife, he noted that they filed joint tax returns and that his wife’s credit score surpassed his.

Apple cofounder Steve Wozniak joined the furor, tweeting in response to Hansson that he had been granted a credit limit 10 times higher than his wife’s. Wozniak added that he and his wife do not have separate bank or credit card accounts or any separate assets.

 

The same thing happened to us. I got 10x the credit limit. We have no separate bank or credit card accounts or any separate assets. Hard to get to a human for a correction though. It’s big tech in 2019.

— Steve Wozniak (@stevewoz) November 10, 2019

Unintended Consequences?

The Apple Card, which debuted earlier this year, is a partnership between Apple Inc. and the New York-based Goldman Sachs Bank USA, which is responsible for all credit decisions on the card.

The controversy may stem from unintended results of lenders' use of algorithms to make credit decisions. Reports of the Apple Card irregularities are just the latest in a series of complaints about algorithmic decision-making that have drawn congressional attention.

Researchers have found cases of algorithms unfairly targeting specific groups even when there was no intent to discriminate. Some lawmakers already have demanded a federal response.

Goldman Sachs Bank USA posted an explanation on Twitter clarifying its Apple Card credit decision process. The statement denied that the bank makes decisions based on gender.

The points stated in the post:

  • Apple Card accounts are individual, with no sharing of a cardholder’s credit line with family members;
  • Each credit application is evaluated independently, based on the applicant’s income and creditworthiness;
  • Other factors include personal credit scores, how much debt the applicant carries, and how the applicant has managed that debt; and
  • It is possible for two family members to receive significantly different credit decisions.

Missing Piece: Human Review

Credit limits should be determined by income and debt-to-income ratios. If they were based solely on that information, then the alleged discrimination might have less to do with the Apple Card and more to do with workplace discrimination, suggested Shayne Sherman, CEO of TechLoris.

“Women generally earn less than their male counterparts and are less likely to earn promotions,” he told the E-Commerce Times.

“This results not only in lower current wages, but also in lower prospective wages, and ultimately in lower credit limits,” Sherman pointed out. “It’s the reason there are robo-investing accounts like Ellevest, which are designed around the fact that women are less likely to earn an income comparable to a man’s over the course of their working careers.”
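Sherman’s point can be made concrete with a rough sketch of how income and a debt-to-income (DTI) ratio might feed into a credit limit. The formula, thresholds and figures below are hypothetical illustrations, not Goldman Sachs’ actual Apple Card underwriting logic; they simply show how a wage gap alone can produce different limits for two applicants carrying identical household debts.

def debt_to_income_ratio(monthly_debt_payments: float, gross_monthly_income: float) -> float:
    # DTI is simply monthly debt payments divided by gross monthly income.
    if gross_monthly_income <= 0:
        raise ValueError("income must be positive")
    return monthly_debt_payments / gross_monthly_income

def illustrative_credit_limit(gross_monthly_income: float, monthly_debt_payments: float) -> float:
    # Toy rule: start from a multiple of income and scale it down as DTI rises.
    # The multiple and thresholds are assumptions for illustration only.
    dti = debt_to_income_ratio(monthly_debt_payments, gross_monthly_income)
    base = gross_monthly_income * 3
    if dti < 0.20:
        return base
    if dti < 0.36:
        return base * 0.6
    return base * 0.3

# Same household debts, different incomes: the lower earner gets a far lower limit.
print(illustrative_credit_limit(8000.0, 1500.0))  # 24000.0 for the higher earner
print(illustrative_credit_limit(6000.0, 1500.0))  # 10800.0 for the lower earner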

It is flatly illegal to discriminate based on gender, said Nancy J. Hite, president and CEO of The Strategic Wealth Advisor.

“The systems in the 21st century are too often determined by those that construct the algorithms and are infrequently, if ever, reviewed by knowledgeable humans,” she told the E-Commerce Times.

Regulators likely will include all credit card issuers in their investigation of this issue so as not to single out one company, in this case Goldman Sachs and its Apple Card, Hite said.

Automated Growing Pains

Credit issuers’ algorithms worked differently in determining female creditworthiness in the past, Hite noted. For example, a shared checking account gave a wife equal access to the couple’s wealth, so her credit rating would, and should, be comparable to her husband’s.

Those algorithms did not include data about which person owned the accessible assets; they considered only business ownership.

Newer automated systems have become more intelligent and business-specific. Automation will remain the industry’s first approach because it reduces salary and benefit costs, which are always at the top of the list for any business, she noted.

“This will be corrected in short order by regulation,” said Hite. “It is already in the law, and we await the next variable to be discovered. Changing systems is always messy.”

Cleaning Up the Mess

The suspicions surrounding the Apple Card are, at minimum, disturbing. In the weeks to come, both Apple and Goldman Sachs likely will go back to the drawing board, tweak the algorithm, and announce that they have fixed the problem, suggested Reid Blackman, founder of Virtue Consultants. This will hold only until the next one rears its head, of course.

“What we really have here is a biased or discriminatory outcome, and it is frankly shocking that Apple and Goldman Sachs did not do anything to uncover this kind of bias on their own,” he told the E-Commerce Times.

Having a due diligence process in place with operationalized artificial intelligence ethics would have provided multiple points at which this error could have been caught, said Blackman, adding that nobody should be developing AI now without an ethical safety net.

What Apple and Goldman Sachs failed to do was implement ethics quality control, including bias identification and mitigation, he said. They should already have been actively assessing who was getting what kind of credit and tracking disparate outcomes in their ongoing analysis.

“Having good quality control in place means they would have had a plan for something like this. Bias in AI and data is a well-documented problem. At this point, if you are not prepared, it is negligence,” said Blackman.
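The kind of ongoing outcome tracking Blackman describes can be sketched in a few lines: aggregate credit decisions by group, compare the results, and flag large gaps for human review. The records, group labels and review threshold below are hypothetical, and a real monitoring program would use far richer data and controls.

from statistics import mean

# Hypothetical decision log; a real pipeline would pull this from production data.
decisions = [
    {"group": "men",   "approved": True,  "limit": 12000},
    {"group": "men",   "approved": True,  "limit": 9000},
    {"group": "women", "approved": True,  "limit": 1500},
    {"group": "women", "approved": False, "limit": 0},
]

def mean_limit(group: str) -> float:
    # Average credit limit among approved applicants in a group.
    limits = [d["limit"] for d in decisions if d["group"] == group and d["approved"]]
    return mean(limits) if limits else 0.0

ratio = mean_limit("women") / mean_limit("men")
if ratio < 0.8:  # assumed review threshold, not a regulatory rule for credit limits
    print(f"Flag for review: women's mean limit is {ratio:.0%} of men's")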

Balance and Control Needed

This situation underscores the need for a balanced approach toward automation. Can algorithm-based decision-making take the noise out of the system? Yes. Is it a silver bullet? No, said Cisco Liquido, senior vice president for business strategy at Exela Technologies.

“When we work with a client, we always advocate for a balanced approach towards automation because the known unknown — in this case potential unconscious gender biases — cannot be kept in check by computer alone,” he told the E-Commerce Times.

Some of the reporting around the Goldman Sachs/Apple Card issue points a finger at accidental bias in the credit decisioning algorithms that might have discriminated against women. However, there was no field for gender in the Apple Card credit application, according to Jay Budzik, CTO of Zest AI.

“AI is not to blame. Unintended discrimination can happen whether or not a lender uses AI,” he told the E-Commerce Times. “Age-old loan scoring methods based on only a dozen or so backward-looking measures can be biased and discriminatory, too.”

The essential point is that lenders must be able to interpret both the inputs and outputs of any model to ensure it does not perpetuate bias. AI models make this analysis slightly more complicated because they use more data and churn through millions of data interactions, explained Budzik.

However, AI models have been shown to provide wider access for communities that have been locked out of housing, credit and other opportunities because of discriminatory barriers. AI also saves banks from making bad loans, he added.

Possible Solution

In an effort to make AI safe and fair to all consumers, Zest AI created ZAML Fair, software that allows companies to upload their current lending model and spot any instances of disparate impact.

“Disparate impact” refers to any practices that adversely affect a protected group of people more than another, even though the rules may be formally neutral.
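One common way to quantify disparate impact is the adverse impact ratio, which compares outcome rates between a protected group and a reference group; ratios below roughly 0.8 (the “four-fifths rule” familiar from U.S. anti-discrimination guidance) are often treated as a warning sign. The sketch below uses made-up approval numbers for illustration and is not Zest AI’s ZAML Fair implementation.

def approval_rate(approved: int, applicants: int) -> float:
    # Share of applicants in a group who were approved.
    return approved / applicants

def adverse_impact_ratio(protected_rate: float, reference_rate: float) -> float:
    # Ratio of the protected group's approval rate to the reference group's.
    return protected_rate / reference_rate

# Made-up numbers for illustration only.
women_rate = approval_rate(approved=300, applicants=500)  # 0.60
men_rate = approval_rate(approved=450, applicants=500)    # 0.90

air = adverse_impact_ratio(women_rate, men_rate)
print(f"Adverse impact ratio: {air:.2f}")  # 0.67, below the 0.8 rule of thumb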

ZAML Fair tunes the model to achieve a set of less discriminatory alternatives, said Budzik. The goal is to preserve the economics of a lending portfolio while providing all the model governance documentation and fair lending analyses necessary to give lenders the confidence they need to put new, fairer models into production.

Jack M. Germain

Jack M. Germain has been an ECT News Network reporter since 2003. His main areas of focus are enterprise IT, Linux and open source technologies. He has written numerous reviews of Linux distros and other open source software.
