
California Governor Gavin Newsom has signed an executive order introducing a series of meaningful regulations on artificial intelligence companies operating within the state, positioning California as one of the most assertive voices in a national debate that is growing louder by the day. The move places Newsom in direct opposition to the Trump administration, which has taken a hands-off approach to AI oversight and has proposed penalizing states that attempt to implement their own regulatory frameworks.
Newsom framed the order as a necessary response to the risks that come with rapid, unregulated technological advancement, pointing to California’s long identity as a hub of innovation while arguing that leadership in technology must be accompanied by accountability. The contrast with Washington’s posture could hardly be more stark.
What the order actually does
The executive order centers on four specific regulatory measures that collectively aim to bring greater transparency, accountability and safety to how AI companies interact with the state and the public.
1. Enhanced vetting for government contracts. AI companies seeking state contracts will now face a more rigorous review process before being approved to work with California’s government. The state is making clear that access to public contracts is contingent on demonstrating that a company’s technology meets meaningful standards of responsibility.
2. Mandatory safety disclosures. Alongside the vetting process, AI companies will be required to disclose what safeguards they have in place to prevent their technology from being used to generate harmful content. The urgency of this requirement was underscored by a recent legal case in which three teenagers in Tennessee filed a lawsuit against Elon Musk’s xAI after an AI application produced nonconsensual, sexually explicit images of them — a stark illustration of what can happen when protective measures are absent.
3. Independent contracting standards. The order grants California the authority to conduct its own assessments of AI companies, independent of federal guidance. If the federal government designates a company as a supply chain risk, California retains the right to review that determination and reach its own conclusion about whether to continue or discontinue a contracting relationship. This provision is a direct assertion of state autonomy in an area where the federal government has shown little appetite for oversight.
4. Watermarking of AI-generated content. State officials will be required to watermark content produced using generative AI tools, creating a visible distinction between material created by humans and that generated by artificial intelligence. The regulation is aimed at reducing the spread of disinformation at a moment when AI-generated content is increasingly difficult to distinguish from authentic human output.
Why the timing matters
The executive order arrives during a period of intensifying national debate over who should govern AI and how. The Trump administration has not only declined to impose federal regulations but has actively sought to constrain states from filling that gap, raising concerns among technologists, civil rights advocates and public officials who believe the current pace of AI deployment outstrips existing protections.
The risks that motivate California’s action are well documented. The unchecked spread of AI-generated misinformation, the potential for the technology to displace workers at scale and the documented cases of AI being used to produce exploitative content have all contributed to a growing consensus that the status quo is not sustainable. OpenAI’s own leadership has publicly raised the possibility that AI could eventually develop capabilities that surpass human control — a scenario that, however distant, adds weight to the argument that regulatory frameworks should be established before they are urgently needed rather than after.
For California, the message is deliberate: innovation and responsibility are not mutually exclusive, and the state intends to demonstrate that both can coexist.