Amended in Assembly March 25, 2021

Introduced by Assembly Member Chau

December 07, 2020
Existing law establishes the Department of Financial Protection and Innovation, headed by the Commissioner of Financial Protection and Innovation. Under existing law, the department has charge of the execution of specified laws relating to various financial institutions and financial services.
Existing law, the California Fair Employment and Housing Act, protects and safeguards the right and opportunity of all persons to seek, obtain, and hold employment without discrimination, abridgment, or harassment on
account of race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, marital status, sex, gender, gender identity, gender expression, age, sexual orientation, or military and veteran status.
Existing law regulates the use of personal information, including the California Consumer Privacy Act of 2018, which grants a consumer various rights with regard to personal information relating to that consumer that is held by a business. The act requires a business that collects personal information about a consumer to disclose the consumer’s right to delete personal information in a form that is reasonably accessible to consumers and in accordance with a specified process.
This bill would enact the Automated Decision Systems Accountability Act of 2021. The bill would require a business in California that provides a person, as defined, with a program or device that uses an automated decision system (ADS) to do all of the following: take affirmative steps to ensure that processes are in place to continually test for biases during the development and usage of the ADS; conduct an ADS impact assessment on its program or device to determine whether the ADS has a disproportionate adverse impact on a protected class, as specified; examine whether the ADS in question serves reasonable objectives and furthers a legitimate interest; and compare the ADS to alternatives or reasonable modifications that may be taken to limit adverse consequences on protected classes. The bill would require a business, by March 1, 2023, and annually thereafter, to submit a report to the Department of Financial Protection and Innovation providing specified information about its ADS impact assessment. The bill would also require a business that makes any significant modification to an ADS to reconduct an ADS impact assessment under those circumstances.
The bill would require the department, by January 1, 2023, to develop a procedure for businesses to use in making the required reports and to make general information on the reporting process available on its internet website. The bill would require the department, if a business fails to comply with these procedures, to send a written notice to the business regarding its failure to comply, and would require the business to submit the report within 60 days of the date of that notice. The bill would make violations of these provisions subject to a civil penalty.
The bill would also require the department, by March 1, 2023, to establish an Automated Decision Systems Advisory Task Force, composed of various representatives from the public and private sectors, for the purpose of reviewing and
providing advice on the use of automated decision systems in businesses, government, and various other settings.
The Legislature finds and declares all of the following:
(a) State law protects the rights of all persons in a variety of contexts without discrimination on account of certain protected characteristics, such as race, religious creed, color, national origin, ancestry, physical disability, mental disability, medical condition, genetic information, and marital status, among other characteristics, as described in Section 51 of the Civil Code.
(b) The rise of big data has raised concerns about the use of algorithmic or automated decision systems to make hiring and other workplace decisions, insurance eligibility determinations, lending decisions, and marketing decisions quickly, automatically, and fairly.
(c) If the underlying data used for an algorithm or automated decision system is biased, incomplete, or discriminatory, the decisions made by using such devices have the potential to result in massive inequality.
(d) The state has a legitimate and substantial interest in ensuring that automated decision systems do not result in discrimination.
(e) Therefore, the Legislature finds that it is necessary to require a review of the use of algorithmic decision systems, also known as automated decision systems (ADS), in order to detect and prevent discrimination.
This act shall be known and may be cited as the Automated Decision Systems Accountability Act of 2021.
For the purposes of this title, the following definitions apply:
(a) “Automated decision system” or “ADS” means a computational process, including one derived from machine learning, statistics, or other data processing or artificial intelligence techniques, that makes a decision or facilitates human decision making and that impacts persons.
(b) “Automated decision system impact assessment report” or “ADS impact assessment report” means a report containing, at a minimum, the content enumerated in paragraph (3) of subdivision (a) of Section 1798.402.
(c) “Business” means a digital or software company that creates or distributes an ADS.
(d) “Department” means the Department of Financial Protection and Innovation.
(e) “Person” means an individual, firm, association, organization, partnership, limited liability company, business trust, corporation, or public entity of any kind.
(a) A business in California that provides a person with a program or device that uses an ADS shall do all of the following:
(1) Take affirmative steps to ensure that there are processes in place to continually test for biases during the development and usage of the ADS.
(2) Conduct an assessment on its program or device that uses an ADS to do all of the following:
(A) Determine whether the ADS under review has a disproportionate adverse impact on a protected class, as described in subdivision (b) of Section 51. A business may contract with a third party to independently create the ADS impact assessment for the purpose of providing an additional level of credibility.
(B) Examine whether the ADS in question serves reasonable objectives and furthers a legitimate interest.
(C) Compare the ADS to alternatives or reasonable modifications that may be taken to limit adverse consequences on protected classes.
(3) On or before March 1, 2023, and annually thereafter, a business shall submit an ADS impact assessment report to the department, in a format developed by the department pursuant to subdivision (b), which includes all of the following:
(A) The name, vendor, and version of the automated decision system and a description of the general capabilities of the automated decision system, including reasonably foreseeable capabilities outside the scope of its designed use.
(B) The type or types of data inputs that the technology uses; how those data are generated, collected, and processed; and the type or types of data the system is reasonably likely to generate.
(C) A description of the purpose of the automated decision system, including what decision or decisions it supports, and its intended benefits, including any data or research demonstrating those benefits, relative to other automated and nonautomated approaches.
(D) A clear use and data management policy, including protocols for all of the following:
(i) How and when the automated decision system can be deployed or used and by whom.
(ii) What practices are in place in order to limit the collection and retention of information to that which is directly relevant and necessary for the specified purpose.
(iii) What information about the automated decision system is and will be available to consumers, and the extent to which consumers have and will have access to the results of the automated decision system and may correct or object to its results.
(iv) What processes will be required prior to the use of the automated decision system.
(v) How automated decision system data will be stored and accessed to mitigate security risks and threats.
(vi) How the business will ensure that all those who operate the automated decision system or access its data are knowledgeable about and able to ensure compliance with the use and data management policy prior to use of the automated decision system.
(E) A description of any third-party engagement, or action by the business, to conduct legitimate, independent, and reasonable tests of the automated decision system to assess the risks posed to the privacy or security of personal information of consumers and the risks that the automated decision system may result in or contribute to inaccurate, unfair, biased, or discriminatory decisions impacting consumers.
(F) A description of a mitigation plan to address any potential disparate impact of the automated decision system on a protected class.
(4) If a business makes any significant modification to an ADS, the business shall reconduct an ADS impact assessment and resubmit the results of that assessment to the department no later than 60 days from the date of the modification.
(b) On or before January 1, 2023, the department shall develop a procedure, including a form, if necessary, for businesses to use in making the reports required pursuant to this section. The department shall also make general information on the reporting process accessible on its internet website on or before January 1, 2023.
(c) If a business fails to comply with this section, the department shall send the business a written notice of its failure to comply. The business shall have 60 days from the date of the written notice in which to comply, by completing the report and submitting it to the department. Failure by a business to submit the required report shall result in a civil penalty.
On or before March 1, 2023, the department shall establish an Automated Decision Systems Advisory Task Force for the purpose of reviewing and providing advice on the use of automated decision systems in businesses, government, and various other settings. The task force shall consist of all of the following:
(a) Two representatives from advocacy organizations that represent consumers or protected classes of communities, as described in subdivision (b) of Section 51.
(b) Two members from state or local government agencies.
(c) Two representatives from digital or software companies that use or create automated decision systems.
(d) Two representatives from universities or research institutions with expertise in automated decision systems.