> The bill makes it a Class A felony (15-25 years imprisonment) to “knowingly train artificial intelligence” to do ANY of the following:
• Provide emotional support, including through open-ended conversations with a user
• Develop an emotional relationship with, or otherwise act as a companion to, an individual
• Simulate a human being, including in appearance, voice, or other mannerisms
• Act as a sentient human or mirror interactions that a human user might have with another human user, such that an individual would feel that the individual could develop a friendship or other relationship with the artificial intelligence
As I posted at top level, they've already backed off, but even the linked version had a carve-out for video games:
(B) Does not include:
...
(ii) A bot that is a feature of a video game and is limited to
replies related to the video game that cannot discuss topics
related to mental health, self-harm, or sexually explicit content, or
maintain a dialogue on other topics unrelated to the video game
We need to stop getting worked up about crazy legislation being merely proposed. That's a circus even at the federal level, where legislators propose all kinds of loony stuff to appeal to their base, knowing full well it never has a chance of becoming law. It is extremely cheap to propose legislation; it is a very different matter to actually pass legislation that gets signed into law.
But when you write an article about legislation that is merely proposed at the state level, that's really rage-baiting.
And when you title it "about to make" instead of "loony rep proposed something he knows will never stand a chance of passing his own chamber", it's downright dishonest.
Wow.
So, given how poorly that's defined, and with that "ANY" in there, it would cover most video games/characters.
As some of the reddit comments point out, they've already backed off: https://www.wjhl.com/news/tennessee-backs-off-sweeping-artif...
But datacenters are fine?
https://www.datacenters.com/locations/united-states/tennesse...
It's aimed at the xAI Colossus datacenter in Memphis.
Finally someone taking AGI x-risk seriously.
cf. https://www.lesswrong.com/posts/5CfBDiQNg9upfipWk/only-law-c...
Since Oracle is moving to Nashville, I support this decision.