What started with a viral Twitter thread metastasized into a regulatory investigation of Goldman Sachs's credit card practices after a prominent software developer called attention to differences in Apple Card credit lines for male and female customers.
David Heinemeier Hansson, a Danish entrepreneur and developer, said in a series of tweets last week that his wife, Jamie Hansson, was denied a credit line increase for the Apple Card, despite having a higher credit score than he did.
“My wife and I filed joint tax returns, live in a community-property state, and have been married for a long time. Yet Apple’s black box algorithm thinks I deserve 20x the credit limit she does,” Hansson tweeted.
Hansson detailed the couple’s efforts to raise the issue with Apple’s customer service, which resulted in a formal internal complaint. Representatives repeatedly assured the couple there was no discrimination, citing the algorithm that makes Apple Card’s credit assessments. Jamie Hansson’s credit limit was ultimately increased to match her husband’s, but David Hansson said this failed to address the root of the problem.
Hansson’s tweets caught the attention of Linda Lacewell, superintendent of the New York State Department of Financial Services, who announced Saturday that her office would investigate the Apple Card algorithm over claims of discrimination.
“This is not just about looking into one algorithm,” she wrote in a Medium post. “DFS wants to work with the tech community to make sure consumers nationwide can have confidence that the algorithms that increasingly impact their ability to access financial services do not discriminate and instead treat all individuals equally and fairly.”
Apple did not immediately respond to a request for comment from The Washington Post.
With the spread of automation, more and more decisions about our lives are made by computers, from credit approval to medical care to hiring choices. The algorithms — formulas for processing information or completing tasks — that make these judgments are programmed by people and thus often reproduce human biases, unintentionally or otherwise, resulting in less favorable outcomes for women and people of color. But the public, and even companies themselves, often have little visibility into how algorithms operate.
Past iterations of Google Translate have struggled with gender bias in translations. Amazon was forced to jettison an experimental recruiting tool in 2017 that used artificial intelligence to score candidates because the prevalence of male applicants resulted in the algorithm penalizing resumes that included the word “women’s” and downgrading candidates who attended women’s colleges. A study published last month in Science found that racial bias in a widely used health-care risk-prediction algorithm made black patients significantly less likely than white patients to receive important medical treatment.
“It does not matter what the intent of the individual Apple reps are, it matters what the algorithm they’ve placed their complete faith in does,” Hansson tweeted. “And what it does is discriminate.”
Dozens of people shared similar experiences after Hansson’s tweets went viral, including Apple cofounder Steve Wozniak, who indicated his credit limit is 10 times greater than his wife’s. The outcry prompted Goldman Sachs to issue a response Sunday stressing that credit assessments are made based on individual income and creditworthiness, which could result in family members having “significantly different credit decisions.”
“In all cases, we have not and will not make decisions based on factors like gender,” Andrew Williams, a spokesman for Goldman Sachs, said in a statement.