When Chris Robinson applied to move into a California senior living community five years ago, the property manager ran his name through an automated screening program that reportedly used artificial intelligence to detect “higher-risk renters.” Robinson, then 75, was denied after the program assigned him a low score — one that he later learned was based on a past conviction for littering.
Not only did the crime have little bearing on whether Robinson would be a good tenant, it wasn’t even one that he’d committed. The program had turned up the case of a 33-year-old man with the same name in Texas — where Robinson had never lived. He eventually corrected the error but lost the apartment and his application fee nonetheless, according to a federal class-action lawsuit that moved towards settlement this month. The credit bureau TransUnion, one of the largest actors in the multi-billion-dollar tenant screening industry, agreed to pay $11.5 million to resolve claims that its programs violated fair credit reporting laws.
Landlords are increasingly turning to private equity-backed artificial intelligence (AI) screening programs to help them select tenants, and resulting cases like Robinson’s are just the tip of the iceberg. The prevalence of incorrect, outdated, or misleading information in such reports is increasing costs and barriers to housing, according to a recent report from federal consumer regulators.
Even when screening programs turn up real data, housing and privacy advocates warn that opaque algorithms are enshrining high-tech discrimination in an already unequal housing market — the latest example of how AI can end up amplifying existing biases.
Beyond the TransUnion case, at least four other tenant screening companies, many of which purport to predict “rental risk” through the use of AI, are currently facing more than 90 federal civil rights and consumer lawsuits, according to a Lever review of court records. The outcomes of those cases, along with potential new rules from federal agencies, could help set the tone for coming regulatory battles over AI, as concerns mount over its proliferating uses.
Last month, 15 state attorneys general submitted a letter urging regulators to ensure that “applicants for housing have access to all the data that is being used to make determinations of their tenant ‘worthiness’” — and that screening companies are complying with civil rights law. Federal regulators are currently considering additional regulations on tenant screening programs.
But such measures are staunchly opposed by lobbyists for the real estate, property management, and consumer data industries; the last of these has also fought state legislation to rein in the use of Big Data in housing, employment, and other high-stakes decisions.
The Consumer Data Industry Association, a lobbying group for screening and credit reporting companies, has reported spending more than $400,000 so far this year lobbying in states considering legislation to increase transparency in the development and use of AI.
In a June 2021 letter to federal consumer regulators, the industry group argued against the need for additional oversight of AI in financial technologies. By incentivizing accurate and predictive tools that create profit-making opportunities, the letter said, “the marketplace itself inherently regulates AI systems.”
“They’re Not Neutral At All”
The explosion of automated tenant screening programs has gone hand in hand with the consolidation of housing in the wake of the 2008 foreclosure crisis. The average American now spends more than a third of their income on housing, a trend driven in part by Wall Street landlords that hike rents, collect fees, and increasingly turn to automated systems to manage their sprawling, nationwide portfolios.
An estimated 2,000 third-party screening companies offer mega-landlords, who often lack staff on the ground, a faster alternative to traditional background checks.
The technology has also attracted the interest of private equity and venture capital, with billions of dollars pouring into companies with names like Turbo Tenant, RentSpree, and LandlordStation, the last of which proclaims, “We work hard to make your life a little bit easier!”
The costs of screening reports vary, but they’re often paid for by tenants, and those who receive scores low enough for a “conditional acceptance” are often forced to pay higher deposits, according to reporting by ProPublica.
Most screening companies say their algorithms rely on the same types of records that many landlords would otherwise check themselves, including credit reports and criminal histories. That longtime practice has already increased barriers to high-quality housing for many people of color. Available research has found that criminal records are generally not a good predictor of how someone will behave as a tenant, whereas housing instability is closely associated with recidivism.
Many cities already limit landlords’ use of background checks in housing applications. But when those decisions are outsourced to unregulated algorithms, “It moves existing problems with access to housing even further out of reach of accountability and transparency,” said Hannah Holloway, director of policy and research at the TechEquity Collaborative, a nonprofit surveying the impact of tenant screening programs.
Holloway gives the example of the screening company Naborly, which says it compares “a tenant’s unique characteristics” to the characteristics of the landlord’s rental property. A sample report available online evaluates applicants in categories such as “income and employment stability” and “consumer behavior analysis,” which are used to produce a series of ratings predicting the likelihood of outcomes such as a tenant paying late or moving out early.
“We don’t know what their data sources are, or how often they’re scrubbing that information and updating it,” Holloway said. And while some characteristics may be fairly objective, she noted, “If I’m a tenant, I have no idea how they’re using that information to come up with a prediction about whether I’ll damage the property or miss a payment.”
Naborly did not respond to a request for comment.
Screening companies argue that the decision about whether to accept a tenant ultimately lies with the landlord. But a recent behavioral study using simulated screening reports found that landlords relied primarily on the scores returned rather than on the underlying data, even though that data often contained critical context, such as when a criminal charge or eviction lawsuit had ultimately been dismissed.
Wonyoung So, the author of the study and a doctoral candidate in MIT’s Department of Urban Studies and Planning, calls this “automation bias.” According to So, “Automated decision-making systems seem to offer these neutral recommendations, but they’re not neutral at all.”
“Not An Excuse For Lawbreaking Behavior”
In April, the Consumer Financial Protection Bureau (CFPB) and three other federal agencies released a joint statement asserting that automated systems are “not an excuse for lawbreaking behavior” — and that they would enforce civil rights, consumer protection, and fair competition laws in relation to these technologies.
The CFPB has received thousands of consumer complaints about screening reports and, along with the Federal Trade Commission, recently finished collecting public input on the programs, a possible first step toward further rulemaking.
While consumer groups and data analytics experts weighed in to urge greater oversight and algorithmic audits from federal agencies, some industry lobbying groups painted the move as regulatory overreach.
The National Multifamily Housing Council, which represents large landlords and screening companies, cautioned against “reporting measures that unduly interrupt necessary operational and property management practices.”
The Consumer Data Industry Association, which earlier warned federal regulators against “allowing states free rein to restrict use of AI,” has also repeatedly opposed states’ initial efforts to rein in AI, including a California bill that would have required landlords and property management companies to conduct annual impact assessments of the tools and submit them to state regulators. The bill died in committee earlier this year.
In response to questions from The Lever, a spokesperson for the group provided a statement that said, “To preserve the existing housing stock and continue to build up supply, property owners must be able to assess the reliability of a prospective resident to pay rent.”
“The Statutes Just Aren’t Keeping Up”
Ironically, screening companies have long marketed their services as helping landlords reduce the risk of lawsuits by basing decisions on objective data.
For more than a decade, industry groups have argued that algorithms don’t discriminate. In 2013, the Consumer Data Industry Association submitted an amicus brief in a Supreme Court case pertaining to the so-called disparate impact standard under fair housing law — which holds that apparently neutral policies can still have discriminatory effects.
The lobbying group argued in the brief that making companies liable for disparate impact would harm its members, who provide “race-neutral predictive information,” and end up forcing landlords into “a Hobson’s choice” between forgoing that information and facing lawsuits.
That case settled, but in a subsequent case that the industry group also weighed in on, the Supreme Court held that proof of intentional discrimination isn’t required to bring fair housing claims; what matters is the impact of an action or policy.
Now, screening companies themselves face a host of federal lawsuits alleging that, far from being race-neutral, they are flouting civil rights law through criteria that disproportionately deny housing to people of color.
According to one suit filed in 2018, a Connecticut man unable to walk or care for himself following an accident had his housing application denied due to a screening report from CoreLogic Rental Solutions, now known as SafeRent Solutions. The man, Mikhail Arroyo, had sought to move in with his mother, who was told only that he had “disqualifying” criminal records. She could get no further information about what those records were — an alleged violation of federal fair credit reporting laws.
Arroyo’s record consists of a single charge for retail theft when he was 20 years old, and it was ultimately withdrawn, according to the complaint. His attorneys also argue that disqualifying applicants based on an arrest record is a violation of fair housing law, because people of color are more likely to be arrested in Connecticut and nationwide.
Another pending federal lawsuit against SafeRent claims that the company is discriminating against Black and Hispanic rental applicants who use federally funded housing vouchers.
SafeRent’s algorithm is proprietary, but the company says that it relies on factors including bankruptcy records, eviction histories, and credit scores. The complaint notes that about 45 percent of Black consumers and 32 percent of Hispanic consumers have subprime credit scores, compared to 18 percent of white consumers.
In its marketing, the company asserts that applicants awarded high scores generally “pay on time, treat the property with care, and stay for longer periods, all of which help management maximize net operating income.”
But SafeRent does not consider whether applicants have housing vouchers, according to the lawsuit, even though those vouchers typically cover most of a tenant’s rent, dramatically increasing their ability to make payments.
Attorneys for SafeRent and CoreLogic did not respond to a request for comment on the litigation.
Eric Dunn, litigation director at the National Housing Law Project and one of the attorneys representing Arroyo, said these scenarios underscore the need for updated regulation and enforcement.
Federal fair credit reporting law, which governs the information collected by credit reporting agencies and provides consumers with periodic free access, “still talks about going into an office and looking through a manila folder,” he said. “The statutes just aren’t keeping up with the way that the industry operates.”
As faceless mega-landlords and proprietary algorithms gain greater control over access to housing, Dunn and other advocates are calling on regulators to crack down on predatory fees, ensure tenants have the right to review and correct their files, or even pause use of the programs until they can be evaluated.
“It cannot be overstated that much of the technology fueling this rise of digitized tenant screening services is a black box,” wrote MIT’s So and four colleagues in a comment to regulators last month. “We recommend that regulators establish a federal moratorium on tenant screening services until such services can be proven safe, fair, and non-discriminatory.”