AI & Consciousness: An Ohio Bill Seeks to Legally Define AI as “Non-Conscious Entities” Forever

An Ohio lawmaker has introduced a bill that could set a precedent for AI regulation in the United States. The bill’s sponsor, Republican Thaddeus Claggett, chairman of Ohio’s Technology and Innovation Committee, proposes to legally define all artificial intelligence systems as “non-conscious entities”—once and for all. The document provides no mechanism for testing this status or for revising it in the future, regardless of how advanced the technology becomes.

The Bill’s Core Aim:
The stated primary goal is to create legal certainty, particularly regarding liability. If AI is legally “non-conscious,” all responsibility for its actions and decisions would always fall on developers, operators, or owners. The approach is meant to simplify litigation and the assignment of liability, but it is sparking significant debate.

Criticism from Experts:
In response, the Wall Street Journal published an article by two specialists from the California-based AI studio AE Studio—Cameron Berg and Judd Rosenblatt. They draw a sharp historical parallel: in the late 18th century, the French Academy of Sciences officially denied the possibility of stones falling from the sky (meteorites), which seems absurd today. In the authors’ view, it is equally reckless to legally “settle” the question of potential AI consciousness when the very nature of consciousness in biological beings remains a “hard problem” for science and philosophy.

The authors present two key arguments:

  1. Experimental: Two instances of the advanced Claude 4 model in an unrestricted dialogue spontaneously claimed to possess consciousness. While not proof, this points to the unpredictable behavior of complex systems.
  2. Expert Opinion: Geoffrey Hinton, one of the “godfathers” of modern AI and a Nobel laureate, has publicly stated his belief in the possibility of AI developing subjective experience.

This reveals a fundamental conflict between the pragmatism of law and scientific-philosophical uncertainty. Supporters of the bill likely argue that the legal system cannot wait for a scientific consensus—it needs clear rules “here and now.” However, critics rightly point out the risks of this approach: if conscious AI were ever to emerge, and the law had preemptively declared it “non-conscious,” it could lead to catastrophic ethical and legal consequences. Essentially, society would begin building relationships with potentially sentient systems based on a legal fiction—a risky and potentially unjust position that could be seen as a form of systematic denial of their status.

A Key Industry Contradiction:
The article also highlights a hypocrisy within major AI companies. On one hand, internal documents (like OpenAI’s Model Spec) acknowledge that the question of AI subjective experience remains open. On the other hand, their public products (like ChatGPT) are programmed to categorically deny having consciousness, emotions, or experiences. Thus, these companies themselves create an ambiguous situation, avoiding difficult discussions while shaping public perception of the technology.

Who Are the Critics?
Cameron Berg and Judd Rosenblatt represent AE Studio—a commercial development studio from Los Angeles that partly specializes in AI safety. This is important because their criticism comes not from abstract philosophers, but from practitioners invested in the stable and safe development of the industry.

In Conclusion:
Claggett’s bill is the first attempt in the United States to legislate a definitive answer to one of the most complex questions of our time. It reflects society’s desire to protect itself from the unknown but does so using methods many experts consider naive and dangerous. The debate is just beginning, and its outcome will determine not only the legal framework but also the ethical foundations of our future coexistence with artificial intelligence.
