These two items were independently released on the same day.

1) Consumer Protection: Congress Should Consider Enhancing Protections around Scores Used to Rank Consumers
GAO-22-104527

Consumer scores are indicators that group consumers based on their past actions and characteristics. Score creators use public records and nonpublic information such as purchase histories to create scores. Businesses and other entities use these scores to segment or rank individuals to predict how they will behave in the future. For example, businesses use certain scores to target advertising toward consumers most likely to purchase a particular product or service. Consumers may benefit from such scores by receiving targeted discounts or deals that they might not have otherwise received. Consumer scores have a variety of uses, although the full range of uses is unknown. GAO identified a number of score uses [e.g., marketing, health care administration, higher education, debt collection, insurance, fraud prevention, identity verification, legal due diligence, criminal justice].
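To make the "segment or rank" idea concrete, here is a minimal, hypothetical sketch of a marketing-style consumer score: a toy recency/frequency/monetary (RFM) calculation over purchase histories, used to rank consumers for targeted advertising. The field names, weights, and normalizers are illustrative assumptions, not anything described in the GAO report.

```python
from dataclasses import dataclass

@dataclass
class Consumer:
    consumer_id: str
    days_since_last_purchase: int   # recency
    purchases_last_year: int        # frequency
    total_spend_last_year: float    # monetary value

def marketing_score(c: Consumer) -> float:
    """Toy recency/frequency/monetary (RFM) score in [0, 100].
    Weights and normalizers are illustrative assumptions, not taken
    from any real score creator's methodology."""
    recency = max(0.0, 1.0 - c.days_since_last_purchase / 365.0)
    frequency = min(1.0, c.purchases_last_year / 24.0)
    monetary = min(1.0, c.total_spend_last_year / 5000.0)
    return 100.0 * (0.3 * recency + 0.3 * frequency + 0.4 * monetary)

consumers = [
    Consumer("A", days_since_last_purchase=10, purchases_last_year=18,
             total_spend_last_year=2400.0),
    Consumer("B", days_since_last_purchase=200, purchases_last_year=2,
             total_spend_last_year=150.0),
]

# Rank consumers so an advertiser can target the top segment first.
for c in sorted(consumers, key=marketing_score, reverse=True):
    print(c.consumer_id, round(marketing_score(c), 1))
```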

The risks that consumer scores can pose include potential bias and adverse effects, and the scores generally lack transparency. The data used to create scores may contain racial biases—for example, one study found Black patients were assigned lower risk scores than White patients with the same health care needs, predicting less of a need for a care management program. The use of consumer scores can also have potential negative outcomes for some consumers, who may be charged higher prices or targeted for less desirable financial products. Further, consumers are generally unaware of the ways in which they are scored—which prevents them from knowing how their personal information is being used and responding to negative consequences.
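The health care example above is a classic proxy-label problem: a score trained to predict spending rather than need. A small simulation, under assumptions made up purely for illustration (the group labels and the 0.6 access factor are not figures from the underlying study), shows how a cost-based score can under-rank a group that converts less of its true need into billed care:

```python
import random

random.seed(0)

def simulate_patient(group: str) -> dict:
    """Hypothetical patient: the model never sees true need, only cost."""
    need = random.uniform(0, 10)                 # true care need (unobserved)
    access = 1.0 if group == "A" else 0.6        # group B bills less per unit of need
    cost = need * access + random.gauss(0, 0.5)  # observed spending, the proxy label
    return {"group": group, "need": need, "score": cost}

patients = [simulate_patient(g) for g in ("A", "B") for _ in range(5000)]

# Among patients with the SAME high need, the cost-based score
# systematically differs by group.
high_need = [p for p in patients if p["need"] > 8]
for g in ("A", "B"):
    scores = [p["score"] for p in high_need if p["group"] == g]
    print(g, round(sum(scores) / len(scores), 2))
```

Group B's average score comes out well below group A's even though both groups were filtered to the same true need, which is the mechanism the study identified.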

No federal law expressly governs the creation, sale, and use of all consumer scores. Federal consumer protection laws can help to ensure that consumer scores are based on accurate information and used in a fair and transparent manner, but these laws only apply in certain circumstances. For example, whether a law applies to a particular score may depend on the information used to create the score, the source of the score, or the purpose for which the score is used. Without congressional consideration of whether consumer scores should be subject to additional consumer protections, consumers may continue to be at risk of being adversely affected by the use of these scores and may have limited options for recourse.

The growing use of consumer scores to make decisions affecting consumers has raised questions among some in Congress and others about how the scores are used and the risks they may pose. Scores are generated from various pieces of information about consumers, which can include public data. Some may be derived through complex methodologies using technologies such as artificial intelligence.

GAO was asked to review how predictive consumer scores are used and regulated. This report examines (1) how such scores are used, (2) the potential risks to consumers, and (3) federal consumer protections for scores. The review is focused on selected types of scores, some of which may fall outside of the Fair Credit Reporting Act. GAO analyzed publicly available information from the websites of a nongeneralizable sample of 49 consumer scores, selected based on literature reviews and stakeholder interviews; reviewed studies by academics and consumer advocates; interviewed score creators, industry organizations, consumer advocates, and federal officials; and reviewed applicable laws and regulations.

Congress should consider implementing appropriate consumer protections for consumer scores beyond those currently afforded under existing federal laws. Among the issues that should be considered are the rights of consumers to view and correct data used in the creation of scores and to be informed of scores' uses and potential effects.

https://www.gao.gov/products/gao-22-104527

2) CFPB Acts to Protect the Public from Black-Box Credit Models Using Complex Algorithms
Companies relying on complex algorithms must provide specific and accurate explanations for denying applications

Today, the Consumer Financial Protection Bureau (CFPB) confirmed that federal anti-discrimination law requires companies to explain to applicants the specific reasons for denying an application for credit or taking other adverse actions, even if the creditor is relying on credit models using complex algorithms. The CFPB published a Consumer Financial Protection Circular to remind the public, including those responsible for enforcing federal consumer financial protection law, of creditors’ adverse action notice requirements under the Equal Credit Opportunity Act (ECOA).

“Companies are not absolved of their legal responsibilities when they let a black-box model make lending decisions,” said CFPB Director Rohit Chopra. “The law gives every applicant the right to a specific explanation if their application for credit was denied, and that right is not diminished simply because a company uses a complex algorithm that it doesn’t understand.”

Data harvesting on Americans has become voluminous and ubiquitous, giving firms the ability to know highly detailed information about their customers before they ever interact with them. Many firms across the economy rely on these detailed datasets to power their algorithmic decision-making, which is sometimes marketed as “artificial intelligence.” The information gleaned from data analytics has a broad range of commercial uses by financial firms, including for targeted advertising and in credit decision-making.

Law-abiding financial companies have long used advanced computational methods as part of their credit decision-making processes, and they have been able to provide the rationales for their credit decisions. However, some creditors may make credit decisions based on the outputs from complex algorithms, sometimes called “black-box” models. The reasoning behind some of these models’ outputs may be unknown to the model’s users, including the model’s developers. With such models, adverse action notices that meet ECOA’s requirements may not be possible.
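For contrast with a black-box model, here is a minimal sketch of one way a creditor with an interpretable model could generate the "specific, principal reasons" for a denial: score the application with a hand-written logistic scorecard and report the features that pulled the score down the most. The features, weights, threshold, and reason texts are all hypothetical; the Circular does not prescribe any particular method.

```python
import math

WEIGHTS = {  # positive weight -> raises approval odds (illustrative values)
    "years_of_credit_history": 0.30,
    "on_time_payment_rate": 4.00,
    "utilization_ratio": -3.00,
    "recent_delinquencies": -1.20,
}
INTERCEPT = -1.0
APPROVAL_THRESHOLD = 0.5

REASON_TEXT = {  # hypothetical adverse action reason codes
    "years_of_credit_history": "Length of credit history too short",
    "on_time_payment_rate": "Insufficient record of on-time payments",
    "utilization_ratio": "Credit utilization too high",
    "recent_delinquencies": "Recent delinquency on an account",
}

def decide(applicant: dict) -> tuple[bool, list[str]]:
    # Per-feature contributions make the decision auditable.
    contributions = {f: WEIGHTS[f] * applicant[f] for f in WEIGHTS}
    score = 1 / (1 + math.exp(-(INTERCEPT + sum(contributions.values()))))
    if score >= APPROVAL_THRESHOLD:
        return True, []
    # Principal reasons: the features that contributed most negatively.
    worst = sorted(contributions, key=contributions.get)[:2]
    return False, [REASON_TEXT[f] for f in worst]

approved, reasons = decide({
    "years_of_credit_history": 1.5,
    "on_time_payment_rate": 0.80,
    "utilization_ratio": 0.90,
    "recent_delinquencies": 2,
})
print("approved:", approved)
print("reasons:", reasons)
```

With a deep or opaque model, the per-feature contributions in this sketch have no direct analogue, which is exactly the compliance gap the Circular describes.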

ECOA protects individuals and businesses against discrimination when seeking, applying for, and using credit. To help ensure a creditor does not discriminate, ECOA requires that a creditor provide a notice when it takes an adverse action against an applicant, which must contain the specific and accurate reasons for that adverse action. Creditors cannot lawfully use technologies in their decision-making processes if using them means that they are unable to provide these required explanations.

Today’s Circular makes clear that:

-- Federal consumer financial protection laws and adverse action requirements should be enforced regardless of the technology used by creditors. For example, ECOA does not permit creditors to use technology that prevents them from providing specific and accurate reasons for adverse actions. Creditors’ use of complex algorithms should not limit enforcement of ECOA or other federal consumer financial protection laws.

-- Creditors cannot justify noncompliance with ECOA based on the mere fact that the technology they use to evaluate credit applications is too complicated, too opaque in its decision-making, or too new. Creditors who use complex algorithms, including artificial intelligence or machine learning technologies, to make credit decisions must still provide a notice that discloses the specific, principal reasons for taking adverse actions. The law provides no exception for creditors that use technology they have not adequately designed, tested, or understood.

-- Whistleblowers play a central role in uncovering information about companies using technologies, like black-box models, in ways that violate ECOA and other federal consumer financial protection laws. Having clear, actionable information is critical for the CFPB and other consumer protection enforcers. The CFPB encourages tech workers to provide the agency with information, and they can visit the CFPB’s Whistleblower Program webpage to learn more.

Along with whistleblowers, government partners are also vital to the CFPB’s enforcement efforts. For example, the CFPB is closely monitoring the work of the National Institute of Standards and Technology, within the U.S. Department of Commerce, and other governmental bodies around the world, to assess the benefits and risks associated with emerging technologies.

The risks associated with decision-making technologies extend beyond adverse action notices and ECOA. Recently, the CFPB began taking a close look at the use of automated valuation models within the home appraisal process to ensure home valuations are accurate and fair.

News release: https://www.consumerfinance.gov/about-us/newsroom/cfpb-acts-to-protect-the-public-from-black-box-credit-models-using-complex-algorithms/
Circular: https://www.consumerfinance.gov/compliance/circulars/circular-2022-03-adverse-action-notification-requirements-in-connection-with-credit-decisions-based-on-complex-algorithms/
