Two years ago, Mary Louis submitted an application to rent an apartment at Granada Highlands in Malden, Massachusetts. She liked that the unit had two full bathrooms and that there was a pool on the premises. But the landlord denied her the apartment, allegedly due to a score assigned to her by a tenant-screening algorithm made by SafeRent.
Louis responded with references to prove 16 years of punctual rent payments, to no avail. Instead she took a different apartment that cost $200 more a month in an area with a higher crime rate. But a class-action lawsuit filed by Louis and others last May argues that SafeRent scores based in part on information in a credit report amounted to discrimination against Black and Hispanic renters in violation of the Fair Housing Act. The groundbreaking legislation prohibits discrimination on the basis of race, disability, religion, or national origin and was passed by Congress in 1968, a week after the assassination of Martin Luther King Jr.
That case is still pending, but last week the US Department of Justice used a brief filed with the court to send a warning to landlords and the makers of tenant-screening algorithms. SafeRent had argued that algorithms used to screen tenants aren't subject to the Fair Housing Act, because its scores only advise landlords and don't make decisions. The DOJ's brief, filed jointly with the Department of Housing and Urban Development, dismisses that claim, saying the act and related case law leave no ambiguity.
“Housing providers and tenant screening companies that use algorithms and data to screen tenants are not absolved from liability when their practices disproportionately deny people of color access to fair housing opportunities,” Department of Justice civil rights division head Kristen Clarke said in a statement.
As in many areas of business and government, algorithms that assign scores to people have become more common in the housing industry. But although claimed to improve efficiency or identify “better tenants,” as SafeRent marketing material suggests, tenant-screening algorithms could be contributing to historically persistent housing discrimination, despite decades of civil rights law. A 2021 study by the US National Bureau of Economic Research that used bots with names associated with different groups to apply to more than 8,000 landlords found significant discrimination against renters of color, particularly African Americans.
“It’s a relief that this is being taken seriously—there’s an understanding that algorithms aren’t inherently neutral or objective and deserve the same level of scrutiny as human decisionmakers,” says Michele Gilman, a law professor at the University of Baltimore and former civil rights attorney at the Department of Justice. “Just the fact that the DOJ is in on this I think is a big move.”
A 2020 investigation by The Markup and ProPublica found that tenant-screening algorithms often encounter obstacles like mistaken identity, especially for people of color with common last names. A ProPublica analysis last year of algorithms made by the Texas-based company RealPage suggested they can drive up rents.