Current through the 2024 Legislative Session.
Section 11547.5 - [Effective 1/1/2025] [Effective until 1/1/2025] Deepfakes

(a) For the purpose of this section:
(1) "Artificial intelligence" has the same definition as in Section 11546.45.5 of the Government Code.
(2) "Deepfake" means audio or visual content that has been generated or manipulated by artificial intelligence which would falsely appear to be authentic or truthful and which features depictions of people appearing to say or do things they did not say or do without their consent.
(3) "Digital content forgery" means the use of technologies, including artificial intelligence and machine learning techniques, to fabricate or manipulate audio, visual, or text content with the intent to mislead.
(4) "Digital content provenance" means the verifiable chronology of the original piece of digital content, such as an image, video, audio recording, or electronic document.
(5) "Secretary" means the Secretary of Government Operations.

(b) For purposes of informing the coordinated plan, as described in subdivision (c), and upon appropriation by the Legislature, the Secretary of Government Operations shall evaluate all of the following:
(1) The impact of the proliferation of deepfakes on state government, California-based businesses, and residents of the state.
(2) The risks, including privacy risks, associated with the deployment of digital content forgery technologies and deepfakes on state and local government, California-based businesses, and residents of the state.
(3) Potential privacy impacts of technologies allowing public verification of digital content provenance.
(4) The impact of digital content forgery technologies and deepfakes on civic engagement, including voters.
(5) The legal implications associated with the use of digital content forgery technologies, deepfakes, and technologies allowing public verification of digital content provenance.
(6) The best practices for preventing digital content forgery and deepfake technology to benefit the state, California-based businesses, and California residents, including exploring whether and how the adoption of a digital content provenance standard could assist with reducing the proliferation of digital content forgeries and deepfakes.

(c) The secretary shall develop a coordinated plan to accomplish all of the following:
(1) Investigate the feasibility of, and obstacles to, developing standards and technologies for state departments for determining digital content provenance.
(2) Increase the ability of internet companies, journalists, watchdog organizations, other relevant entities, and members of the public to meaningfully scrutinize and identify digital content forgeries and relay trust and information about digital content provenance to content consumers.
(3) Develop or identify mechanisms for content creators to cryptographically certify authenticity of original media and nondeceptive manipulations.
(4) Develop or identify mechanisms for content creators to enable the public to validate the authenticity of original media and nondeceptive manipulations to establish digital content provenance without materially compromising personal privacy or civil liberties.

(d) On or before October 1, 2024, the secretary shall report to the Legislature on the potential uses and risks of deepfake technology to the state government and California-based businesses.
(1) The secretary's report shall include the coordinated plan required by subdivision (c), including recommendations for modifications to the definitions of digital content forgery and deepfakes.
(2) A report submitted pursuant to this subdivision shall be submitted in compliance with Section 9795.

(e) This section shall remain in effect only until January 1, 2025, and as of that date is repealed, unless a later enacted statute, that is enacted before January 1, 2025, deletes or extends that date.

Amended by Stats 2024 ch 843 (AB 2885), s 4, eff. 1/1/2025.
Added by Stats 2022 ch 885 (SB 1216), s 1, eff. 1/1/2023.

This section is set out more than once due to postponed, multiple, or conflicting amendments.
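Subdivisions (c)(3) and (c)(4) contemplate mechanisms by which a content creator cryptographically certifies original media and the public later validates it. The statute does not prescribe any particular technology; the sketch below is purely illustrative of the general idea, using a SHA-256 hash bound to an HMAC tag as a simplified stand-in for the asymmetric digital signatures a real provenance standard (such as the C2PA specification) would use. The function names, the shared key, and the record format are all hypothetical.

```python
# Illustrative only: HMAC over a content hash stands in for the asymmetric
# signatures a production provenance system would employ.
import hashlib
import hmac


def certify(media_bytes: bytes, key: bytes) -> dict:
    """Creator side: bind a provenance record to the exact bytes of the media."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    tag = hmac.new(key, digest.encode(), hashlib.sha256).hexdigest()
    return {"sha256": digest, "tag": tag}


def validate(media_bytes: bytes, record: dict, key: bytes) -> bool:
    """Public side: recompute the hash and verify it matches the certified record."""
    digest = hashlib.sha256(media_bytes).hexdigest()
    expected = hmac.new(key, digest.encode(), hashlib.sha256).hexdigest()
    return digest == record["sha256"] and hmac.compare_digest(expected, record["tag"])


key = b"demo-key"  # hypothetical; a real system would not use a shared secret
original = b"original video bytes"
record = certify(original, key)
assert validate(original, record, key)                  # unaltered media passes
assert not validate(b"manipulated bytes", record, key)  # any alteration fails
```

The design point mirrors the statute's pairing: certification (c)(3) happens once at creation, while validation (c)(4) can be performed by anyone holding the media and its provenance record, without revealing anything beyond the content's hash.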