April 20, 2022
By: Arielle Seidman and Anthony Martin
Automated and algorithmic decision-making tools have become commonplace in everything from loan and apartment applications to employment searches and university admissions. For banks and financial institutions, among many others, such tools offer increased efficiency, accuracy, and customer satisfaction. These innovations have sped up personal, business, and mortgage loan processes while eliminating many of the inefficiencies of manual practices, including inaccuracies, security risks, and long wait times.
However, as algorithmic loan origination software has developed, its increasing reliance on data purchased from third parties and on inferred user data has caught the attention of law enforcement, both as a potential vehicle for discriminatory practices and as a threat to consumer privacy. The problem is not a new one: the EU’s General Data Protection Regulation (GDPR) and Canada’s proposed Consumer Privacy Protection Act include notification, consent, and explanation requirements for loan providers. Yet in the United States, the newly developed digital privacy regimes in states like California and Colorado have so far avoided directly regulating these software solutions.
That is about to change. In late 2021, the FTC released its Statement of Regulatory Priorities for 2022, which emphasized two main areas in which it anticipates a renewed focus of its rulemaking authority: (1) defining unfair and deceptive practices, including under Section 5 of the FTC Act; and (2) addressing “abuses stemming from surveillance-based business models,” including “limiting intrusive surveillance, and ensuring that algorithmic decision-making does not result in unlawful discrimination.” This follows statements from earlier in 2021 in which the FTC declared that automated decision-making programs with discriminatory effects could violate Section 5 of the FTC Act.
The day before the FTC released its priorities, District of Columbia Attorney General Karl A. Racine proposed sweeping new legislation aimed at prohibiting the use of “algorithms that produce biased or discriminatory results and lock individuals . . . out of critical opportunities.” Among the algorithmic processes Attorney General Racine specifically identified are those used in “obtaining a mortgage, automobile financing, student loans, [and] any application for credit.” The proposed legislation would, among other things, require companies that use predictive algorithms to
- Audit those programs on an annual basis to identify discriminatory patterns (a sketch of one such disparity check follows this list);
- Send annual reports to the Attorney General’s office;
- Disclose the basis of their algorithmic programming, including the sources of all third-party data;
- Provide to all potential loan applicants a notice that details the use of personal information in any algorithmic decision-making;
- Provide an in-depth explanation upon rejection of a request for credit; and
- Give applicants that have been denied credit certain rights to correct information used by the algorithm.
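By way of illustration only, the following is a minimal sketch of one disparity metric an annual audit of this kind might compute: the “four-fifths” (80%) rule long used as a screen in adverse-impact analysis. The data shape, group labels, and function names are hypothetical, and any real audit would involve counsel and far more rigorous statistical testing.

```python
from collections import defaultdict

def approval_rates(decisions):
    """decisions: iterable of (group, approved) pairs.
    Returns the approval rate observed for each group."""
    totals, approved = defaultdict(int), defaultdict(int)
    for group, ok in decisions:
        totals[group] += 1
        approved[group] += int(ok)
    return {g: approved[g] / totals[g] for g in totals}

def four_fifths_flags(rates):
    """Flag any group whose approval rate falls below 80% of the
    most-favored group's rate, a traditional adverse-impact screen."""
    benchmark = max(rates.values())
    return {g: rate / benchmark < 0.8 for g, rate in rates.items()}

# Illustrative audit of a year of loan decisions (hypothetical data).
decisions = ([("group_a", True)] * 80 + [("group_a", False)] * 20
             + [("group_b", True)] * 55 + [("group_b", False)] * 45)
rates = approval_rates(decisions)
print(rates)                     # {'group_a': 0.8, 'group_b': 0.55}
print(four_fifths_flags(rates))  # {'group_a': False, 'group_b': True}
```

A flagged group would not by itself establish unlawful discrimination, but it is the kind of pattern an annual report to a regulator would likely need to surface and explain.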
Violations of the proposed law would carry civil penalties of up to $10,000 per violation, which, when applied across the systemic use of an algorithmic program, could quickly become prohibitively costly. The D.C. bill has been referred to the Committee on Government Operations and Facilities of the Council of the District of Columbia.
D.C. is not alone. In February, New Jersey revived a Senate bill that, as written, would specifically target providers of loans, credit, financial assistance, or insurance who employ automated decision-making programs that have a discriminatory effect. While the bill does not contain a civil remedy, as written it would allow criminal fines to be assessed against a violator for an act of “unlawful discrimination.” The draft bill has been referred to the Senate Commerce Committee.
The D.C. and New Jersey proposals, while the most sweeping we have seen yet, are not the only ones on the horizon. While most states that have entered the fray have so far focused on the use of automated decision-making programs by government bodies, Colorado, Illinois, and New York have all proposed more limited measures to curtail their use by commercial entities. And rulemaking under existing statutes will continue to expand.
For example, the California Privacy Rights Act (CPRA), which created the California Privacy Protection Agency (CPPA), empowers the Agency to issue new regulations pertaining to twenty-two topics, including:
Issuing regulations governing access and opt-out rights with respect to businesses’ use of automated decision-making technology, including profiling and requiring businesses’ response to access requests to include meaningful information about the logic involved in such decision-making processes, as well as a description of the likely outcome of the process with respect to the consumer.
(Cal. Civ. Code Sec. 1798.185(a)(16)). While the CPPA’s formal rulemaking activities have not yet begun, preliminary rulemaking is well underway: the initial comment period closed in 2021, and stakeholder sessions are set to begin in May. A calendar for the formal rulemaking process has not yet been announced, but final rules are expected to come into force in 2023.
It is clear that automated decision-making tools are not going anywhere; the efficiency, speed, and security they provide would be impossible to abandon. But neither is the focus of the FTC and state agencies on their potential negative impacts. Lenders should be proactive in anticipating and minimizing the likely effects of these new laws and regulations. This should include:
- Participation in rulemaking at all levels. Whether before state Attorneys General or the FTC, active participation in comment periods, stakeholder sessions, and informal communications will be critical to ensuring that the practicalities and costs of complying with any new regulations remain manageable.
- Staying abreast of changes in international law. The EU and UK have repeatedly been out in front on issues of both digital privacy and competition. The regulatory schemes employed abroad offer a useful roadmap to the types of regulation companies in the US can expect to see soon.
- Developing robust record-keeping practices that capture inputs from third-party data providers, and considering internal audits of any automated decision-making programs (a hypothetical decision-record sketch follows this list). Whether through rulemaking like that contemplated by the FTC and California or legislation like that proposed in D.C. and New Jersey, heightened reporting requirements are coming. Preparing now can avert a compliance disaster later.
- Demanding audits and transparency from third-party data providers. Lenders will likely be held accountable under any new compliance regime for the actions of their data and software providers. Recent claims that certain data brokers have skirted regulations and supplied falsified data mean that uncovering any such inaccuracies now can prevent a costly enforcement and compliance battle later.
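As one hypothetical illustration of the record-keeping point above, a lender might capture a per-decision audit record that ties each input the model used to its provider of record, so that later reports, adverse-action explanations, and correction requests can be reconstructed. Every field name below is illustrative and not drawn from any statute or regulation.

```python
from dataclasses import dataclass, field, asdict
from datetime import datetime, timezone
import json

@dataclass
class DecisionRecord:
    """One audit-log entry per automated decision (all fields hypothetical)."""
    applicant_id: str
    model_version: str
    inputs: dict          # feature name -> value the model actually used
    data_sources: dict    # feature name -> provider of record, incl. third parties
    outcome: str          # e.g., "approved" or "denied"
    reasons: list = field(default_factory=list)  # adverse-action reasons, if any
    timestamp: str = field(
        default_factory=lambda: datetime.now(timezone.utc).isoformat())

# Example entry, serialized for an append-only audit log.
record = DecisionRecord(
    applicant_id="12345",
    model_version="credit-model-2022.04",
    inputs={"credit_score": 640, "dti_ratio": 0.42},
    data_sources={"credit_score": "ExampleBureau", "dti_ratio": "applicant"},
    outcome="denied",
    reasons=["dti_ratio above threshold"],
)
print(json.dumps(asdict(record), indent=2))
```

Records of this kind, retained per decision, would also simplify responding to the notice, explanation, and correction obligations contemplated in the D.C. proposal.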
Buchalter’s experienced attorneys can assist clients in engaging directly with state Attorney General offices and can provide guidance on any of the proposed courses of action above. For more information about the upcoming changes to the regulations and laws described above, or for questions regarding formal or informal participation in rulemaking processes, please contact one of the attorneys listed below.
About the authors:
Arielle A. Seidman is a member of the White Collar & Investigations and Litigation Practices in the firm’s Orange County office with experience in antitrust investigations and compliance counseling.
Anthony Martin is a member of the White Collar & Investigations and Litigation Practices in the firm’s Scottsdale office. Prior to joining Buchalter, he served as the Acting United States Attorney for the District of Arizona and as the Chief Deputy West Virginia Attorney General.
This communication is not intended to create or constitute, nor does it create or constitute, an attorney-client or any other legal relationship. No statement in this communication constitutes legal advice nor should any communication herein be construed, relied upon, or interpreted as legal advice. This communication is for general information purposes only regarding recent legal developments of interest, and is not a substitute for legal counsel on any subject matter. No reader should act or refrain from acting on the basis of any information included herein without seeking appropriate legal advice on the particular facts and circumstances affecting that reader. For more information, visit www.buchalter.com.