AI tenant tool SafeRent settles $2.3M housing bias lawsuit

By Suswati Basu

Experts call the case a landmark for addressing AI bias in housing and civil rights advocacy.

SafeRent Solutions, an AI-driven tenant screening platform, has settled a class action lawsuit filed in Massachusetts.

On Wednesday (Nov. 20), U.S. District Judge Angel Kelley gave final approval to a $2.3 million settlement agreement.

The lawsuit alleged that SafeRent's AI algorithm unfairly assigned lower scores to Black and Hispanic applicants, as well as individuals using housing vouchers, compared to other prospective tenants.

However, the settlement does not include any admission of wrongdoing by SafeRent Solutions. In a statement, the company said that while it "continues to believe the SRS Scores comply with all applicable laws, litigation is time-consuming and expensive."

According to the agreement, cited by Bloomberg, SafeRent is barred from including in its reports "a SafeRent Score or an accept/decline recommendation based on a tenant screening score."

Consequently, SafeRent is now required to ensure that its customers certify that "the rental applicant for whom they are requesting a SafeRent Score is not currently a recipient of any publicly-funded federal or state housing voucher."

AI bias 'gravely concerning to fair housing'

Brian Corman, a partner at Cohen Milstein and head of the firm's fair housing litigation efforts, played a key role in negotiating the settlement. He said in a statement: "Federal and state housing voucher programs were established to give recipients, who are disproportionately Black and Hispanic renters, more choice in where they live.

"The changes SafeRent has agreed to make are key to ensuring the original intention of the nation's voucher programs, helping to erase historic discrimination in the housing markets."

Christine E. Webber, co-chair of Cohen Milstein's civil rights and employment practice, added that the court's decision represents a landmark case for the home rental and property management industries.

She said: "Decision-making algorithms, such as the ones at issue here, are often opaque.

"Vendors who develop these algorithms are not willing to disclose all the data they consider or how the data is weighted in score modeling. This is gravely concerning to fair housing, employment, and civil rights advocates as potentially discriminatory bias can be easily coded into automated decision-making platforms."

Webber explained that the ability to hold vendors accountable is essential to the full enforcement of civil rights laws.
