
The main focus of the special session that begins Thursday is to close a roughly $1 billion budget shortfall, but the unseasonable return to the Capitol also gives lawmakers a chance to work on an unrelated issue: what to do about a controversial AI law set to take effect next February.
Colorado’s first-in-the-nation AI anti-discrimination law sets rules for businesses and governments that use AI systems in some of their decision-making. But with the business community worried about implementation, the impending law is dividing Democrats who control the statehouse.
Gov. Jared Polis added the AI law to the list of issues lawmakers can take up in their special session, saying it will give them “space to make adjustments.”
“Whether they want to improve it, make it work better. I think there's a number of different ideas,” said the governor. “The danger in waiting is that it's scheduled to take effect in February.”
When Polis signed the law back in 2024, he did so with the expectation that lawmakers would revise the policy before it could take effect. The governor created an AI impact task force to develop recommendations for the 2025 legislative session. But lawmakers failed to find a compromise that would pass at the statehouse and a business-led effort to delay the law’s implementation by roughly a year also failed.
For a while this summer, it looked like Colorado’s whole discussion could be moot. Republicans in Congress tried to put a ten-year time-out on AI laws in their tax-cuts-and-spending bill, blocking states from regulating any “artificial intelligence models, artificial intelligence systems, or automated decision systems.” That effort also ultimately failed.
Fast forward to the upcoming special legislative session. The biggest battle lines continue to run between consumer rights advocates and labor groups, who want robust consumer protections, and the business community and tech developers, who worry the law is unworkable and will stifle innovation. Before Polis called the special session, a large coalition of interests, including Frontier Airlines and other businesses, education groups like the community college system, and the Colorado Medical Society, was urging him to bring lawmakers back to the Capitol.
In a letter to the governor, the coalition said the legislature needs to address the significant risks it says the law poses, and not just to the tech industry.
“This law creates unexpected and costly problems for organizations simply using everyday AI-enabled software, from K-12 schools and universities to hospitals, banks, and local governments. These institutions face heavy and costly burdens for compliance and increased liability concerns for the use of common AI-powered platforms in basic operational functions. These impacts represent a substantial and unnecessary strain on core public services and institutions that were never the intended targets of this legislation.”
Advocates for the policy say they have always been open to changes, but don’t want the law’s fundamental priorities watered down.
“Making sure that workers and consumers get information about when (algorithmic decision systems are) being used and they get information about what data of theirs is being processed and analyzed by these systems,” said Matt Scherer, head of the workers' rights project at the Center for Democracy and Technology, a nonprofit focused on technology policy. CDT supports provisions in the current law that allow people to correct their data if it’s wrong, and requirements that developers include tools to ensure systems don’t violate people’s rights.
Lawmakers will have a broad scope of choices for what to do on AI during the session. As of now, four bills are in the works. They range from one that would keep the current law largely intact, but narrow its scope, to others that would trade its specifics for more general policy statements.
For his part, Gov. Polis is not saying publicly which approach he favors.
In a statement to CPR News, he said he would work with anyone to find the right path forward on a new policy framework that “addresses bias while also spurring innovation,” and that he also would consider an implementation delay.
“There is clear motivation in the legislature to take action now to protect consumers and promote innovation, all without creating new costs for the state or unworkable burdens for Colorado businesses and local governments, and I thank legislators for taking the issue so seriously, and applaud the work that’s being done in both chambers,” said the governor.
Here’s what we know about the competing bills:
‘AI Sunshine Act’
This bill comes from the same Democratic lawmakers who wrote the original law, SB24-205. Senate Majority Leader Robert Rodriguez has taken the lead on the policy.
The proposal would narrow the original law by focusing on the disclosures AI developers must give to the companies and other entities that use their systems, including warning them about risks that their programs could violate the Civil Rights Act or consumer protection laws. It also holds developers and deployers jointly responsible if an AI system does violate existing law, unless a developer can prove that the entity using their system misused it.
The act also aims to ensure Coloradans know when and how an AI algorithm is used for decisions in areas like housing, loans, or college acceptance. And people would need to have an avenue to correct flawed data.
“This will align the incentives to encourage developers to work with the deployers to proactively combat bias in the systems. So it's going to ensure that Coloradans know when these systems are impacting decisions in their lives,” said Democratic Rep. Brianna Titone of Arvada, who will be one of the main sponsors.
Titone said the goal is to have a foundation the state can build on for future regulations as the technology evolves, and that AI developers should design their systems with safety in mind.
“Why should we treat AI differently than a physical product like a car or a mattress or a baby crib? We have standards for these kinds of things to protect people using them from being hurt,” she said.
A more limited approach, with bipartisan support
Democratic Rep. William Lindstedt and Democratic Sen. Judy Amabile have teamed up for another proposal that has attracted some Republicans as well. Amabile voted for SB24-205 but said it’s clear now that the policy goes too far.
“It’s creating technical requirements that are un-doable and … compliance requirements that are going to be almost impossible for these companies to comply with.”
Their proposal would require that, in certain cases, consumers are notified when they’re interacting with an AI system. They would also get information about the technology’s developer and the deployer so they could contact them if needed. It also states that existing civil rights laws and consumer protections apply to AI technology.
The bill does not require the developer to disclose information about the AI system to the deployer, nor does it give people the right to correct data used by algorithmic systems.
“I think our bill does a lot to protect consumers,” said Amabile. “But also, I would say any time we pass legislation, we want to try and prevent unintended consequences and that’s a duty of care we owe to the state of Colorado.”
With two competing Democratic-backed bills, there will be a lot of wrangling to see which, if any, crosses the finish line and gets to Polis’ desk. The short span of a special session, which will last a minimum of three days, also leaves less time for deal-making.
This pared-back approach to AI regulation from some Democrats doesn’t sit well with supporters of the current law. Titone likened the proposal to putting a tea bag in a swimming pool: “They're gonna try to call that tea. But I know better.”
One Republican’s vision: delay, and dial back
Republican Rep. Ron Weinberg of Loveland, who voted against the current law and served on the governor’s AI impact task force, will introduce a bill to push back its effective date by a year and a half, to August 2027. It would also exempt businesses from complying if they have fewer than 250 employees or less than $5 million in annual revenue. Local governments with fewer than 100,000 residents would also be exempt.
Weinberg’s proposal also narrows the law’s definition of consequential decisions to just employment and public safety, dropping all of the other categories it currently covers, including banking, healthcare, housing, insurance, essential government services, legal services and education.
A totally different approach: instead of regulating AI itself, add it to existing civil rights laws
Republican Sen. Mark Baisley, who also served on the governor’s AI impact task force, wants his colleagues to fully repeal the 2024 law and instead update the state’s anti-discrimination laws.
Baisley, who’s running for governor, said discrimination against protected classes is unacceptable, whatever the source.
His bill would add “simple language to existing non-discriminatory language in state statute, that says you can't use technology to discriminate against people either. And that's all.”
Baisley said he submitted his proposal over the summer to the conservative American Legislative Exchange Council to share as model language with lawmakers in other states. He said if Colorado’s current law does go into effect this February, it’ll have a negative impact on AI developers and innovators in the tech industry.
“It requires that they turn over their intellectual property to the Attorney General's office to review their software and make sure it does not have the ability to discriminate. I understand that, but I think it's the wrong approach. I've long believed it's the wrong approach.”
Baisley voted against SB24-205 and said he thinks fears of how AI might distort decisions and harm people’s lives are overblown.
“All of the concerns that are being directed at AI very readily should simply be directed at all software technologies,” he said. “It’s because AI is new and mysterious to people that it has them shook up.”