Is an Algorithm Less Racist Than a Loan Officer?

In 2015, Melany Anderson’s 6-year-old daughter came home from a play date and asked her mother a heartbreaking question: Why did all her friends have their own bedrooms?

Ms. Anderson, 41, a pharmaceutical benefits consultant, was recently divorced, living with her parents in West Orange, N.J., and sharing a room with her daughter. She longed to buy a home, but the divorce had emptied her bank account and wrecked her credit. She was working hard to improve her financial profile, but she couldn’t imagine submitting herself to the scrutiny of a mortgage broker.

“I found the idea of going to a bank completely intimidating and impossible,” she said. “I was a divorced woman and a Black woman. And also being a contractor — I know it’s frowned upon, because it’s looked at as unstable. There were so many negatives against me.”

Then, last year, Ms. Anderson was checking her credit score online when a pop-up ad announced that she was eligible for a mortgage, listing several options. She ended up at a digital lending platform that promised to help her secure a mortgage without ever setting foot in a bank or, if she so desired, even talking to another human.

In the end, she estimated, she conducted about 70 percent of the mortgage application and approval process online. Her fees totaled $4,000, about half the national average. In November 2019, she and her daughter moved into a two-bedroom home not far from her parents with a modern kitchen, a deck and a backyard. “We adapted to the whole Covid thing in a much easier way than if we were still living with my parents,” Ms. Anderson said this summer. “We had a sense of calm, made our own rules.”

Getting a mortgage can be a harrowing experience for anyone, but for those who don’t fit the middle-of-last-century stereotype of homeownership — white, married, heterosexual — the stress is amplified by the heightened probability of getting an unfair deal. In 2019, African Americans were denied mortgages at a rate of 16 percent and Hispanics were denied at 11.6 percent, compared with just 7 percent for white Americans, according to data from the Consumer Financial Protection Bureau. An Iowa State University study published the same year found that L.G.B.T.Q. couples were 73 percent more likely to be denied a mortgage than heterosexual couples with comparable financial credentials.

Digital mortgage websites and apps represent a potential improvement. Without showing their faces, prospective borrowers can upload their financial information, get a letter of pre-approval, customize loan criteria (like the size of the down payment) and search for interest rates. Software processes the data and, if the numbers check out, approves a loan. Most of the companies offer customer service via phone or chat, and some require that applicants speak with a loan officer at least once. But often the process is fully automated.

Last year, 98 percent of mortgages originated by Quicken Loans, the country’s largest lender, used the company’s digital platform, Rocket Mortgage. Bank of America recently adopted its own digital platform. And so-called fintech start-ups like Roostify and Blend have licensed their software to some of the nation’s other large banks.

Reducing — or even removing — human brokers from the mortgage underwriting process could democratize the industry. From 2018 to 2019, Quicken reported a rise in first-time and millennial home buyers. Last year, the digital lender Ms. Anderson used said it saw significant increases in traditionally underrepresented home buyers, including people of color, single women, L.G.B.T.Q. couples and customers with student loan debt.

“Discrimination is definitely falling, and it corresponds to the rise in competition between fintech lenders and regular lenders,” said Nancy Wallace, chair in real estate capital markets at Berkeley’s Haas School of Business. A study that Dr. Wallace co-authored in 2019 found that fintech algorithms discriminated 40 percent less on average than face-to-face lenders in loan pricing and did not discriminate at all in accepting and rejecting loans.

If algorithmic lending does reduce discrimination in home lending in the long term, it would cut against a troubling trend of automated systems — such as A.I.-based hiring platforms and facial recognition software — that turn out to perpetuate bias. Faulty data sources, software engineers’ unfamiliarity with lending law, profit motives and industry conventions can all influence whether an algorithm ends up discriminating where humans left off. Digital mortgage software is far from perfect; the Berkeley study found that fintech lenders still charged Black and Hispanic borrowers higher interest rates than white ones. (Lending law requires mortgage brokers to collect borrowers’ race as a way to identify possible discrimination.)

“The differential is smaller,” Dr. Wallace said. “But it should be zero.”


The platform Ms. Anderson used started in 2016 and is licensed to underwrite mortgages in 44 states. This year, the company has underwritten about 40,000 mortgages and funds roughly $2.5 billion in loans each month. After a Covid-19 slump in the spring, its funded volume for June was five times what it was a year ago.

With $270 million in venture funding, the company generates revenue by selling mortgages to about 30 investors in the secondary loan market, like Fannie Mae and Wells Fargo. The company attracts customers as it did Ms. Anderson: buying leads from sites like Credit Karma and NerdWallet and then marketing to those customers through ads and targeted emails.

In 2019, the company saw a 532 percent increase in Hispanic clients between the ages of 30 and 40 and a 411 percent increase in African-American clients in the same age bracket. Its married L.G.B.T.Q. client base increased tenfold. “With a traditional mortgage, customers feel really powerless,” said Sarah Pierce, the company’s head of operations. “You’ve found a home you love, and you’ve found a rate that’s good, and somebody else is making the judgment. They’re the gatekeeper or roadblock to accessing financing.” Of course, the platform is making a judgment too, but it’s a numerical one. There’s no gut reaction, based on a borrower’s skin color or whether they live with a same-sex partner.

Trevor McIntosh, 35, and Brennan Johnson, 31, secured a mortgage for their Wheat Ridge, Colo., home through the platform in 2018. “We’re both millennials and we need to immediately go online for anything,” said Mr. Johnson, a data analyst. “It seemed more modern and progressive, especially with the tech behind it.”

Previously, the couple had negative home buying experiences. One homeowner, they said, outright refused to sell to them. A loan officer also dropped a bunch of surprise fees just before closing. The couple wasn’t sure whether prejudice — unconscious or otherwise — was to blame, but they couldn’t rule it out. “Trevor and I have experienced discrimination in a variety of forms in the past, and it becomes ingrained in your psyche when interacting with any institution,” said Mr. Johnson. “So starting with digital, it seemed like fewer obstacles, at least the ones we were afraid of, like human bias.” (The company introduced me to Ms. Anderson, Mr. McIntosh and Mr. Johnson, and I interviewed them independently.)

Digital lenders say that they assess risk using the same financial criteria as traditional banks: borrower income, assets, credit score, debt, liabilities, cash reserves and the like. These guidelines were laid out by the Consumer Financial Protection Bureau after the last recession to protect consumers against predatory lending or risky products.

These lenders could theoretically use additional variables to assess whether borrowers can repay a loan, such as rental or utility payment history, or even assets held by extended family. But generally, they don’t. To fund their loans, they rely on the secondary mortgage market, which includes the government-backed entities Freddie Mac and Fannie Mae, and which became more conservative after the 2008 crash. With some exceptions, if you don’t meet the standard C.F.P.B. criteria, you are likely to be considered a risk.

Fair housing advocates say that’s a problem, because the standard financial information puts minorities at a disadvantage. Take credit scores — a number between 300 and 850 that assesses how likely a person is to repay a loan on time. Credit scores are calculated based on a person’s spending and payment habits. But landlords often don’t report rental payments to credit bureaus, even though these are the largest payments that millions of people make on a regular basis, including more than half of Black Americans.

For mortgage lending, most banks rely on the credit scoring model invented by the Fair Isaac Corporation, or FICO. Newer FICO models can include rental payment history, but the secondary mortgage market doesn’t require them. Neither does the Federal Housing Administration, which specializes in loans for low- and moderate-income borrowers. What’s more, systemic inequality has created significant salary disparities between Black and white Americans.

“We know the wealth gap is incredibly large between white households and households of color,” said Alanna McCargo, the vice president of housing finance policy at the Urban Institute. “If you are looking at income, assets and credit — your three drivers — you are excluding millions of potential Black, Latino and, in some cases, Asian minorities and immigrants from getting access to credit through your system. You are perpetuating the wealth gap.”

For now, many fintech lenders have largely affluent customers. The company’s average client earns over $160,000 a year and has a FICO score of 773. As of 2017, the median household income among Black Americans was just over $38,000, and only 20.6 percent of Black households had a credit score above 700, according to the Urban Institute. This discrepancy makes it harder for fintech companies to boast about improving access for the most underrepresented borrowers.

Software has the potential to reduce lending disparities by processing enormous amounts of personal information — far more than the C.F.P.B. guidelines require. Looking more holistically at a person’s financials as well as their spending habits and preferences, banks can make a more nuanced decision about who is likely to repay their loan. On the other hand, broadening the data set could introduce more bias. How to navigate this quandary, said Ms. McCargo, is “the big A.I. machine learning issue of our time.”

Under the Fair Housing Act of 1968, lenders cannot consider race, religion, sex, or marital status in mortgage underwriting. But many factors that appear neutral can stand in for race. “How quickly you pay your bills, or where you took vacations, or where you shop or your social media profile — some large number of those variables are proxying for things that are protected,” Dr. Wallace said.

She said she didn’t know how often fintech lenders ventured into such territory, but it happens. She knew of one company whose platform used the high schools clients attended as a variable to forecast consumers’ long-term income. “If that had implications in terms of race,” she said, “you could litigate, and you’d win.”

Lisa Rice, the president and chief executive of the National Fair Housing Alliance, said she was skeptical when mortgage lenders said their algorithms considered only federally sanctioned variables like credit score, income and assets. “Data scientists will say, if you’ve got 1,000 bits of information going into an algorithm, you’re not possibly only looking at three things,” she said. “If the objective is to predict how well this person will perform on a loan and to maximize profit, the algorithm is looking at every single piece of data to achieve those objectives.”

Fintech start-ups and the banks that use their software dispute this. “The use of creepy data is not something we consider as a business,” said Mike de Vere, the chief executive of Zest AI, a start-up that helps lenders create credit models. “Social media or educational background? Oh, lord no. You shouldn’t have to go to Harvard to get a good interest rate.”

In 2019, ZestFinance, an earlier iteration of Zest AI, was named a defendant in a class-action lawsuit accusing it of evading payday lending regulations. In February, Douglas Merrill, the former chief executive of ZestFinance, and his co-defendant, BlueChip Financial, a North Dakota lender, settled for $18.5 million. Mr. Merrill denied wrongdoing, according to the settlement, and no longer has any affiliation with Zest AI. Fair housing advocates say they are cautiously optimistic about the company’s current mission: to look more holistically at a person’s trustworthiness, while simultaneously reducing bias.

By entering many more data points into a credit model, Zest AI can observe millions of interactions between these data points and how those relationships might inject bias into a credit score. For instance, if a person is charged more for an auto loan — which Black Americans often are, according to a 2018 study by the National Fair Housing Alliance — they could be charged more for a mortgage.

“The algorithm doesn’t say, ‘Let’s overcharge Lisa because of discrimination,’” said Ms. Rice. “It says, ‘If she’ll pay more for auto loans, she’ll very likely pay more for mortgage loans.’”

Zest AI says its system can pinpoint these relationships and then “tune down” the influences of the offending variables. Freddie Mac is currently evaluating the start-up’s software in trials.
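The “tune down” step can be illustrated with a toy linear scoring model. The weights, features and shrink factor below are invented, and this is a sketch of the general idea rather than Zest AI’s actual system: measure the score gap a variable creates between groups, then shrink that variable’s weight.

```python
# Toy illustration of "tuning down" an offending variable's influence.
# Weights, features and data are invented for this sketch.

# Simple linear score: sum of weight * feature value
weights = {"income": 0.5, "auto_loan_rate": -2.0}  # a higher auto rate hurts

applicants = [
    {"group": "A", "income": 60.0, "auto_loan_rate": 6.5},
    {"group": "A", "income": 62.0, "auto_loan_rate": 6.8},
    {"group": "B", "income": 61.0, "auto_loan_rate": 4.0},
    {"group": "B", "income": 59.0, "auto_loan_rate": 4.2},
]

def score(person, w):
    return sum(w[k] * person[k] for k in w)

def group_gap(w):
    """Average score of group B minus average score of group A."""
    means = {}
    for g in ("A", "B"):
        vals = [score(p, w) for p in applicants if p["group"] == g]
        means[g] = sum(vals) / len(vals)
    return means["B"] - means["A"]

print("gap before:", round(group_gap(weights), 2))

# "Tune down": shrink the weight on the variable driving the gap.
tuned = dict(weights, auto_loan_rate=weights["auto_loan_rate"] * 0.25)
print("gap after: ", round(group_gap(tuned), 2))
```

Shrinking the coefficient narrows the score gap between the groups at the cost of some predictive signal from that variable, the trade-off lenders and regulators would have to weigh.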

Fair housing advocates worry that a proposed rule from the Department of Housing and Urban Development could discourage lenders from adopting anti-bias measures. A cornerstone of the Fair Housing Act is the concept of “disparate impact,” which says lending policies without a business necessity cannot have a negative or “disparate” impact on a protected group. H.U.D.’s proposed rule could make it much harder to prove disparate impact, especially stemming from algorithmic bias, in court.

“It creates huge loopholes that would make the use of discriminatory algorithmic-based systems legal,” Ms. Rice said.

H.U.D. says its proposed rule aligns the disparate impact standard with a 2015 Supreme Court ruling and that it does not give algorithms greater latitude to discriminate.

A year ago, the corporate lending community, including the Mortgage Bankers Association, supported H.U.D.’s proposed rule. After Covid-19 and Black Lives Matter forced a national reckoning on race, the association and many of its members wrote new letters expressing concern.

“Our colleagues in the lending industry understand that disparate impact is one of the most effective civil rights tools for addressing systemic and structural racism and inequality,” Ms. Rice said. “They don’t want to be responsible for ending that.”

The proposed H.U.D. rule on disparate impact is expected to be published this month and go into effect shortly thereafter.

Many loan officers, of course, do their work equitably, Ms. Rice said. “Humans understand how bias is working,” she said. “There are so many examples of loan officers who make the right decisions and know how to work the system to get that borrower who really is qualified through the door.”

But as Zest AI’s former executive vice president, Kareem Saleh, put it, “humans are the ultimate black box.” Intentionally or unintentionally, they discriminate. When the National Community Reinvestment Coalition sent Black and white “mystery shoppers” to apply for Paycheck Protection Program funds at 17 different banks, including community lenders, Black shoppers with better financial profiles frequently received worse treatment.

Since many clients still choose to talk with a loan officer, the company says it has prioritized staff diversity. Half of its employees are female, 54 percent identify as people of color and most loan officers are in their 20s, compared with the industry average age of 54. Unlike many of their competitors, the loan officers don’t work on commission. They say this eliminates a conflict of interest: When they tell you how much house you can afford, they have no incentive to sell you the most expensive loan.

These are positive steps. But fair housing advocates say government regulators and banks in the secondary mortgage market must rethink risk assessment: accept alternative credit scoring models, consider factors like rental payment history and ferret out algorithmic bias. “What lenders need is for Fannie Mae and Freddie Mac to come out with clear guidance on what they will accept,” Ms. McCargo said.

For now, digital mortgages might be less about systemic change than borrowers’ peace of mind. Ms. Anderson in New Jersey said that police violence against Black Americans this summer had deepened her pessimism about receiving equal treatment.

“Walking into a bank now,” she said, “I would have the same apprehension — if not more than ever.”

Pandemic’s Costs Stagger the Nursing Home Industry

Even before they became deadly petri dishes for the worst pandemic in generations, many nursing homes were struggling to stay afloat and provide quality care.

But since the start of the coronavirus outbreak, nursing home operators have had to spend more money on protective equipment for staff and technology to connect residents with relatives who are no longer allowed to visit. Their revenues have shrunk because they are admitting fewer new residents in hopes of reducing the risk of infection.

The result is that some nursing homes, which often run on razor-thin profit margins, may be unable to pay their rent and other bills without government help.

“It could be a huge economic mess,” said Charlene Harrington, a professor emerita of nursing at the University of California, San Francisco. “It is possible that many nursing home chains could go bankrupt with the virus.”

Presbyterian Homes and Services, a Minnesota-based nonprofit operator of 16 nursing homes, estimates that the average 72-bed nursing home is spending an additional $2,265 a day on personal protective gear and an additional $1,500 a day on extra nursing staff.

At Meadow Ridge, a retirement community in Redding, Conn., with 62 nursing-home beds, executives have been forced to use Amazon or outside vendors to buy protective gear, said Kimberly Held, the community’s director of nursing.

“The pricing is unbelievable,” Ms. Held said. “I did have to order ponchos as a backup plan if we ran out of gowns. When we received our N95s, it felt like Christmas,” she added, referring to a type of mask that has been in short supply.

Nursing homes care for about 1.5 million people in the United States, and 70 percent of the 15,400 facilities are run for profit. While the financial picture for the industry, which also includes homes run by government agencies and nonprofits, was hardly rosy before the virus struck, it was especially precarious for many for-profit nursing homes.

Reimbursements from government programs like Medicaid are a main source of revenue for nursing homes, but operators have long complained those payments have not kept pace with the cost of care.

The industry is increasingly relying on the government for another form of support: The Department of Housing and Urban Development guarantees $20 billion in mortgages to more than 2,300 nursing homes — about 15 percent of the country’s total, up from about 5 percent a quarter-century ago. (Last year, the $146 million collapse of Rosewood Care Centers was the biggest default in the history of the mortgage-guarantee program.)

It can be pricey just to keep the doors open.

For-profit nursing homes often rent their properties under long-term leases from real estate investment trusts, known as REITs; investment firms; or private equity shops. A review of regulatory filings found that six major health care REITs — Sabra, Welltower, National Health Investors, Omega Healthcare Investors, LTC and CareTrust — had a business interest in more than 1,500 nursing homes.

The ownership structure has proved lucrative to investors in major health care REITs, which typically own a mix of nursing homes, elder care facilities and medical buildings. But those long-term leases can be problematic during an economic slowdown, because many include clauses to increase their rent every year, according to regulatory filings.

“There wasn’t a lot of wiggle room in these lease deals,” said David Stevenson, a professor of health policy at the Vanderbilt University School of Medicine who has studied the nursing home industry. Mr. Stevenson was talking broadly about the industry and not any specific company.

Advocates for the elderly say care inevitably suffers when nursing homes face financial trouble. More than a half-million nursing home residents lived in facilities rated below average or much below average in the federal government’s five-star rating system.

Some of the details in the current crisis have been grim: An anonymous tip led to the discovery of 17 bodies in the on-site morgue of one complex in New Jersey, and the death toll has since risen to 70 people. That facility, the Andover Subacute and Rehabilitation Center I and II nursing homes, is part of a small chain in New Jersey and Pennsylvania operated by Alliance Healthcare, which rents the properties from affiliates of a Chicago-area firm, Altitude Investments, according to regulatory filings.

Altitude has had “more frequent communication” with the operator over the past month and has offered to provide assistance, said William Rothner, the firm’s president. He said Alliance had not requested any.

The lowest-rated homes were disproportionately operated for profit. Nearly half the residents of for-profit nursing homes lived in ones where the federal government found below-average staffing levels, compared with 23 percent of the residents of government or nonprofit facilities, according to a New York Times analysis of government data.

Genesis HealthCare, one of the country’s largest for-profit operators, exemplifies many of the pressures.

It rents the property for more than 70 percent of the 357 nursing homes it operates in the United States. Genesis’ shares trade for under $1, in part because of investor concern over its $1.6 billion in debt and the $5 billion outstanding on its long-term leases. And nearly half the properties operated by Genesis scored two stars or lower in the government rankings.

Lori Mayer, a company spokeswoman, did not address Genesis’ financial situation and said any update would come when Genesis reports earnings next month. The company, she said, has taken a number of precautions to limit the spread of the highly contagious virus, which has torn through nursing homes from Washington State to New York.

The Times has identified more than 4,000 nursing homes and other long-term care or elder facilities across the United States with coronavirus cases, based on reports by states and counties. More than 36,500 residents and staff members at those facilities have contracted the virus, and more than 7,000 have died. The Times has been able to identify death tolls, however, at only a portion of those facilities: at least 1,700 nursing homes on the list have reported a combined 4,000 deaths.

In hopes of avoiding devastating outbreaks, nursing home operators have had to admit fewer residents. The pandemic has rippled in other ways, too: Many hospitals are postponing elective procedures that usually require short rehabilitation stints, taking away another source of clients.

“A very large fraction of their most profitable business is post-surgical rehabilitation,” said Howard Gleckman, a senior fellow at the Urban Institute. “And most of that business is gone.”

Nursing home advocates have said the industry may need $15 billion from the federal government to ride out the crisis. The Centers for Medicare and Medicaid Services, a federal regulator, filled in some of that gap by advancing payments to nursing homes and providing up to $1.5 billion in aid. But industry executives said it was not enough.

“I do think there will need to be more money,” said Rick Matros, chief executive officer of Sabra Health Care REIT, which holds the leases on 296 nursing homes and several hundred other senior living centers. “The financial stress is real.”

Mr. Matros said his firm was prepared to provide rent relief to operators if necessary. Most of Sabra’s tenants should not need that help, but that could change if the crisis continues for several more months.

“We are looking at each tenant individually, and some are stronger than others,” Mr. Matros said.

But Ms. Harrington, the professor who has written about the impact of for-profit nursing homes on resident care, is concerned about operators’ taking federal money to stabilize their balance sheets or pay Wall Street landlords at the expense of taking better care of residents and staff.

“I am hoping that nursing homes will use some of their money for hazard pay to workers and also to bring in and train a lot of additional care providers,” she said.

Reporting was contributed by Mitch Smith, Karen Yourish, Sarah Almukhtar and Danielle Ivory.

Racing to Head Off Evictions and Foreclosures

The financial shock from the coronavirus pandemic threatens the housing security of millions of Americans, prompting federal, state and local officials — and even judges and the police — to move quickly to ward off foreclosures and evictions. On Wednesday, the federal agency overseeing Fannie Mae and Freddie Mac, the giant government-run …
