Nov 27, 2024 Story by: Editor
Mary Louis was excited to move into a new apartment in Massachusetts in the spring of 2021, but her hopes were dashed when she received an email stating that a “third-party service” had denied her application. The decision was based on an algorithm designed to assess rental applicants. This algorithm later became the subject of a class-action lawsuit, led by Louis, alleging discrimination based on race and income.
A federal judge approved a settlement in the case, marking a significant development in the fight against algorithmic bias. The company behind the algorithm, SafeRent Solutions, agreed to pay over $2.2 million and adjust parts of its screening system that the lawsuit claimed were discriminatory. While the settlement does not admit fault, SafeRent stated that it “continues to believe the SRS Scores comply with all applicable laws” but acknowledged that litigation is costly and time-consuming.
The case sheds light on the growing use of artificial intelligence and algorithms in important life decisions, including housing, employment, and healthcare. While these systems have been in use for years, they often operate without significant oversight, raising concerns about potential biases.
Louis’ attorneys argued that SafeRent’s algorithm unfairly penalized applicants like her by failing to account for housing vouchers, which demonstrate an ability to pay rent. They also alleged that the system relied too heavily on credit scores, disadvantaging low-income renters, particularly Black and Hispanic applicants, who often face systemic barriers that lead to lower median credit scores. Christine Webber, one of the attorneys, emphasized that even when algorithms are not intentionally programmed to discriminate, the data they use can still produce biased outcomes, saying, “The same effect [can occur] as if you told it to discriminate intentionally.”
When her application was denied, Louis tried to appeal the decision, citing her 16-year history of timely rent payments and references from two landlords. However, her appeal was rejected, with the management company stating, “We do not accept appeals and cannot override the outcome of the Tenant Screening.” Louis expressed her frustration, saying, “Everything is based on numbers. You don’t get the individual empathy from them. There is no beating the system. The system is always going to beat us.”
While some state lawmakers have pushed for stronger regulations on AI systems, such efforts have largely failed to gain traction, leaving lawsuits like Louis’ as critical steps in holding companies accountable. In this case, Louis’ attorneys, supported by the U.S. Department of Justice, argued that SafeRent’s algorithm played a significant role in limiting access to housing, even though it did not make the final decisions. The court agreed, denying SafeRent’s motion to dismiss.
The settlement requires SafeRent to remove its scoring feature from tenant screening reports in certain cases, such as when applicants use housing vouchers. Any future scoring systems developed by the company must be validated by a third party approved by the plaintiffs.
Following the ordeal, Louis’ son found her an apartment through Facebook Marketplace. Although the rent was $200 higher and the neighborhood less desirable, she moved in and has continued to support her family. Reflecting on her experience, Louis said, “I’m not optimistic that I’m going to catch a break, but I have to keep on keeping on, that’s it. I have too many people who rely on me.”
This case highlights the broader issue of algorithmic bias and its impact on marginalized communities, demonstrating the importance of ensuring fairness and accountability in AI-driven systems.

Source: AP News