Why Congress Should Protect AI Startups
A provision in the "Big Beautiful Bill" would do just that.
In our Republic, there are sometimes conflicts between the desires of the States and the Congress. When we go to the polls, we elect leaders for both levels of government, knowing that they have different jobs. Sometimes, the responsibilities are clearly spelled out: the Congress gets to set national defense policy, not Massachusetts. On the other hand, Texas gets to set taxes on property, not Congress, and so on.
One such disagreement today is over whether we should have a patchwork regime of regulations for artificial intelligence, or a coherent national standard. Sen. Ted Cruz of Texas (our Senator) has introduced a provision in the “Big Beautiful Bill” to preempt a large number of state AI laws now in the works. The provision’s mechanism is to condition federal modernization funds on states declining to regulate.
In the last year, we have seen more than 1,000 AI bills introduced in state legislatures across the country. Dozens of these are based on EU regulations that have all but killed the sector in Europe. They’ve been pushed by a network of crony NGOs, some of which were until recently funded by the American government.
As a person who is generally wary of increased power in D.C., I support the measure to preempt this cascade of new laws, and I think you should, too. It is not that we have total faith in the wisdom of Congress; it is that we see the huge problems of a state-by-state system for new technology, and don’t want to see innovation sabotaged. Luckily, the Constitution is clearly on our side, too.
The most important fact in this argument is that Congress, as explicitly enumerated in Article I, has the power to regulate “interstate commerce.” At its most basic level, that means that Congress has the power to establish a national market of goods and services. And while the criminal codes, civil codes, and regulatory codes of the states may differ, the states may not generally restrict commerce from other states.
Here is a very practical example from technology. In the 1990s, when the Internet emerged, we could not have tolerated a state-by-state regime where the Internet was unavailable to residents of North Dakota by law but available in South Dakota. The Union rests on the power of Congress to make these decisions. And as it happens, the United States became the leader in the Internet because of a small-l liberal regime at the federal level. So there is real merit in using the carrot of federal funding to do the same with AI.
States should and will retain their right to regulate AI through the usual avenues, namely their criminal and civil codes. Nothing in the BBB would constrain states’ ability to prosecute crimes, or create causes of action, that emerge because of AI. Take Tennessee’s ELVIS Act, which protects an artist’s voice as a personal attribute: that law lets individuals sue over the misuse of their voice in AI-generated content, and the language in the BBB has been updated to make clear that it does not interfere with such statutes. Nor would it preempt laws that extend criminal codes to cover the use of AI in illegal acts like creating child sexual abuse material. In general, laws that are “outcome based” would not be preempted.
It is the idea of 50 different systems regulating the math behind AI that we are against.
There are forces that want to slow down innovation due to misconceptions about the technology itself. And contrary to what the authors of the 1,000+ bills might tell you, the largest “Big Tech” enterprises are thrilled at the idea of a patchwork system that only they will be able to afford to navigate. A startup looking to save lives in medicine might not be able to retain counsel in dozens of states; Amazon and Google will. It’s a tale as old as time: regulation that ostensibly targets massive enterprises actually helps cement their position. In America, the innovation that takes place in small and medium-sized firms is critical. Many of the breakthroughs happen that way. It would be suicidal to turn them off.
Our adversaries are innovating rapidly, and make no mistake that Beijing would love to see an AI mess in the United States. But even without the China factor, this would be worth doing for our own sake. What are we fighting for in this country, for our fellow citizens?
We’re fighting for medical breakthroughs that can free tens of millions of our fellow citizens from diseases that have eluded cures for decades. We want to see a revolution in personalized education that will help many more people realize their potential. We are fighting for productivity gains that will make the American people more prosperous. We are fighting against politically connected lobbies that want to stop innovation to protect their own interests. Because the promise of AI is not about generating images or writing college essays with ChatGPT; AI can make us a much wealthier nation when we apply it to industry.
For all the times that Congress has come down on the side of the cronies, this is a real opportunity for Congress to support the technologists who want to make our whole country better off. Congress should take the lead.
Joe Lonsdale is a founder of Palantir and other companies, and is involved as an entrepreneur and investor in many mission-driven AI companies. He also chairs the Cicero Institute; Cicero has been advising state lawmakers on how to protect innovation in their states as they consider draft bills on AI.
Foreign countries are not obligated to respect U.S. IP laws. They can do what they want.
As such, when it comes to innovation such as AI, overregulation in the U.S. only serves to hobble U.S.-based competition (and further innovation), stifling our competitive advantage over other countries, where entrepreneurs can simply copy what we’ve done and accelerate from there.
Protecting startups means investing in infrastructure, education, and startup development organizations, to give our innovators and founders as much support as possible and keep the country on the leading edge.
Joe, this debate deserves national focus, not because tech is new, but because power is shifting fast, and quietly. The proposal to bar states from regulating AI for a decade isn’t just a policy maneuver; it’s an attempt to freeze out local, democratic input during the most important technological transformation in history.
Let’s not rewrite the past to fit the argument. Yes, the internet thrived with federal support, but it was the states that often moved first to protect citizens. Illinois led on biometric privacy before anyone else took it seriously. Other states stepped in to manage early fraud and digital harm while Congress lagged. That’s the history, and it’s worth remembering as we face down something far more disruptive than the early web.
AI isn’t just another industry. It’s a force that will rewire medicine, education, surveillance, labor, and law enforcement. Handing over regulatory authority entirely to Washington while this unfolds assumes a level of competence, impartiality, and foresight that no serious student of American governance should blindly trust. And worse, it strips from the states the power to respond when something starts to go wrong; that’s not a strategy, it’s surrender.
Powerful interests don’t fear a patchwork. They fear public accountability. Tech giants can navigate fragmented rules that startups, communities, and individuals cannot. That’s the irony: locking down regulation at the federal level doesn’t promote innovation, it insulates incumbents, and it muffles the voices of people closest to the impact.
Even LLMs, trained on thousands of human perspectives, regularly converge on the same core insight: AI demands layered, responsive governance. Centralized regulation alone will not keep up. That doesn’t mean chaos, it means balance. States should be partners, not bystanders.
This shouldn’t divide us along partisan lines. It should unify us in the belief that no generation gets a free pass on responsibility just because the stakes are high. We need to act nationally, but we also need local wisdom, democratic guardrails, and the humility to admit we don’t know everything yet. That’s not obstruction, that’s America.