Apple Card, Goldman Sachs and sexist artificial intelligence

Apple may have inadvertently exposed sexism in the credit card business. Because Apple Card offers no joint accounts, married couples must apply for two separate accounts rather than the single joint account they might have opened with another card company. Comparing the results has exposed a significant issue: Apple awards some men credit limits 10x-20x those of their female spouses.

The issue was initially exposed by David Heinemeier Hansson on Twitter:

Hansson’s prominence as the creator of Ruby on Rails and a founder of Basecamp helped ensure that his tweet storm wasn’t ignored.

We joined in: 

But, more importantly, Steve Wozniak did as well:

There are two major issues to highlight:

This is a problem.

We can’t think of a credible reason for the Hanssons’ or the Wozniaks’ experiences. If two people share tax returns, accounts and assets, they should have the same creditworthiness. Clearly that isn’t the way the credit scoring algorithm behind Apple Card works. We think it’s unlikely that humans intentionally coded this disparity into the algorithm. But humans may still have created it indirectly: the algorithm learns from historical credit scoring data, and that historical data may itself be sexist. Either way, it is clearly the responsibility of humans to discover bias in the system, and they failed to do that.
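Discovering this kind of bias does not require anything exotic. A minimal sketch of the sort of audit that could have caught it: compare credit limits assigned to matched applicants across a protected attribute before launch. All data, group splits, and the tolerance threshold below are hypothetical illustrations, not anything Apple or Goldman has published.

```python
def disparity_ratio(limits_a, limits_b):
    """Ratio of mean credit limits between two groups (always >= 1.0)."""
    mean_a = sum(limits_a) / len(limits_a)
    mean_b = sum(limits_b) / len(limits_b)
    return max(mean_a, mean_b) / min(mean_a, mean_b)

# Hypothetical limits assigned by a model to matched applicants
# (same income, same assets, shared tax returns), split by gender.
limits_men = [20000, 15000, 18000]
limits_women = [1000, 1500, 2000]

ratio = disparity_ratio(limits_men, limits_women)
if ratio > 1.5:  # tolerance threshold chosen purely for illustration
    print(f"Audit flag: {ratio:.1f}x disparity between matched groups")
```

A real audit would control for far more variables, but the point stands: if matched couples can run this comparison from their kitchen table, the institutions holding the full dataset could have run it first.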

Goldman Sachs states that: “Our credit decisions are based on a customer’s creditworthiness and not on factors like gender, race, age, sexual orientation or any other basis prohibited by law.” That’s clearly not factual given these examples. It may be true, though, that Goldman’s credit decisions are unintentionally based on gender because the algorithm is inferring gender implicitly rather than explicitly. 
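How can a model discriminate by gender without ever receiving gender as an input? Through proxy features: innocuous-looking variables that correlate with gender in the historical data. The toy sketch below (all features and records are invented for illustration) shows a proxy that predicts gender perfectly, so removing the gender column removes nothing.

```python
# Gender is never an input to the model, yet decisions can still split
# along gender lines because a proxy feature carries the same signal.
# Hypothetical proxy: whose name appears first on shared accounts.
history = [
    ("name_first", "M"), ("name_first", "M"), ("name_first", "M"),
    ("name_second", "F"), ("name_second", "F"), ("name_second", "F"),
]

def proxy_gender_correlation(records):
    """Fraction of records where the proxy alone predicts gender."""
    hits = sum(
        1 for proxy, gender in records
        if (proxy == "name_first") == (gender == "M")
    )
    return hits / len(records)

print(proxy_gender_correlation(history))  # → 1.0 in this toy data
```

When the correlation is this strong, a model trained on historical outcomes will treat the proxy exactly as it would have treated gender itself, which is why "we don't use gender as an input" is not the same claim as "our decisions are independent of gender."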

It’s possible that Goldman’s credit decisions mimic the rest of the industry: the whole industry could be sexist, and it may only have been discovered through Apple Card. Because Apple treats each customer individually, every applicant must sign up for their own Apple Card rather than a joint account, as is common elsewhere in the credit card industry. That structure has allowed married couples like the Hanssons and the Wozniaks to compare credit limits. Of course this raises the question: has no one within the banking system evaluated credit decisions for unintentional bias before?

This is Apple’s problem.

Some commenters on Twitter highlighted that Goldman Sachs, as the bank behind the Apple Card, is responsible for credit scoring. This is likely true. But it is Apple that has created the brand contract with its customers. And that means it is Apple’s issue to fix. Full stop.

When the Apple brand is on a product, it is an Apple product. That’s true for hardware, software, ads, schwag—everything. And for good reason. There’s nothing more valuable at Apple than the brand itself. So Apple needs to own this issue itself. Apple never says “Well, that’s as good as our partners can deliver” with a hardware product—it pushes its component suppliers to be better. It needs to do the same with its services like Apple Card. If Goldman can’t provide credit scoring that upholds the Apple brand contract with its customers, it needs to find a new banking partner.

The problem isn’t the algorithm; it is doing what it is told to do. The problem is the people who manage the algorithm: they aren’t telling it to do the right thing. Apple and Goldman Sachs need a new system that sets the right boundaries for AI design and provides management processes to govern it and hold the right people accountable.

Managing machine employees is new and different. Even the best companies in the world with the best experience with AI will stumble as Apple and Goldman Sachs are now. The question for all is: how will you create an AI governance program to manage your machine employees?
