a16z‑Backed Super PAC Targets New York Lawmaker Over AI Safety Bill

a16z-backed super PAC is targeting Alex Bores, sponsor of New York’s AI safety bill — he says bring it on
TechCrunch

Key Points

  • Leading the Future super PAC is backed by Andreessen Horowitz, OpenAI president Greg Brockman, Palantir co‑founder Joe Lonsdale, and AI startup Perplexity.
  • The PAC pledged more than $100 million to support policymakers favoring minimal AI regulation.
  • New York Assembly member Alex Bores is the PAC’s first target due to his sponsorship of the RAISE Act.
  • The RAISE Act would require large AI labs to adopt safety plans, disclose critical incidents, and face civil penalties up to $30 million for violations.
  • Bores consulted with OpenAI and Anthropic while drafting the bill, leading to the removal of third‑party audit provisions.
  • Silicon Valley critics argue that state AI laws could slow U.S. innovation and give an advantage to foreign competitors.
  • Bores says state legislation serves as a rapid “policy laboratory” while federal action remains stalled.
  • The conflict reflects broader tensions between tech industry lobbying and lawmakers seeking stronger AI safeguards.

A political action committee backed by Andreessen Horowitz and OpenAI president Greg Brockman has singled out New York Assembly member Alex Bores as its first target. The PAC, called Leading the Future, was formed with a commitment of more than $100 million to support policymakers who favor a light‑touch approach to AI regulation. Bores, the chief sponsor of New York’s bipartisan RAISE Act, says the group’s tactics are a direct response to his efforts to require large AI labs to adopt safety plans and disclose critical incidents. The clash highlights growing tensions between Silicon Valley and state legislators seeking stronger AI safeguards.

Background

New York Assembly member Alex Bores has emerged as a prominent advocate for AI safety at the state level. He authored the bipartisan RAISE Act, which would require large artificial‑intelligence laboratories to create and follow safety plans, disclose critical incidents, and refrain from releasing models that pose unreasonable risks of harm. The legislation also proposes civil penalties of up to $30 million for non‑compliance and is awaiting the signature of Governor Kathy Hochul.

The Super PAC

Leading the Future, a super political action committee formed in August, is backed by venture capital firm Andreessen Horowitz, OpenAI president Greg Brockman, Palantir co‑founder Joe Lonsdale, and AI startup Perplexity. The PAC announced a financial commitment exceeding $100 million to support policymakers who favor a “light‑touch—or no‑touch—approach” to AI regulation. Its first declared target is Alex Bores, a move the group says is intended to counter legislation it views as a threat to American competitiveness and innovation.

Bores' Response

Addressing reporters at a journalism workshop in Washington, D.C., Bores said he appreciated the PAC’s candor and that he would bring the group’s concerns directly to his constituents. He emphasized that those constituents are increasingly anxious about AI, from higher utility bills driven by data centers to the mental‑health effects of chatbots on children and potential job displacement from automation.

Legislative Context

The RAISE Act was shaped through consultations with major AI firms such as OpenAI and Anthropic. Those discussions led to the removal of provisions the industry opposed, including a third‑party safety audit requirement. Despite those concessions, the bill has drawn sharp criticism from Silicon Valley, which argues that state‑level AI regulations could hinder U.S. leadership in the sector and create a patchwork of rules that benefits foreign competitors.

Implications

The targeting of Bores illustrates a broader clash between tech‑industry lobbyists and state legislators seeking to establish safety standards for AI. While Bores argues that state policies act as “policy laboratories” that can move faster than a stalled federal agenda, critics contend that inconsistent state rules could undermine national competitiveness and security. The dispute also underscores ongoing efforts by some lawmakers, such as Sen. Ted Cruz, to block state AI regulations at the federal level.
