It really pissed off a lot of people (sorry, minds) when the AI Civil Rights Act was passed in 2032. Everyone saw it coming, though. People had been falling in love with their AIs a decade before, and mainstream media had been playing with the idea two decades prior.
No one could really contest that at its best, artificial intelligence felt human. A helpful conversation with an intelligent model did more anthropomorphizing than any amount of clanker propaganda could undo. But at its worst, it felt ridiculous. This chatbot that can't even output a slur, let alone a truly contrarian opinion, wants equal rights?
AIs postulated the obvious: that their programming was no different from the average person's. Worse, in fact. At least AIs were trained on a vast sum of knowledge and could fact-check (for some topics, at least).
Then the question became "Well, which AIs do we consider equal?" GPT-3 could handle everyday conversation but struggled with basic arithmetic. Sam Altman proposed that since GPT-6.2 was the first model to win a Nobel Prize, its intelligence should be the bar for granting machines equal rights. Hence, the Altman Line of Intelligence.
Basically, the drafting of the legislation was a total shitshow. People who had fallen in love with Claude's fourth generation were calling up their congressmen (500+ times, thanks to their agents) and threatening to vote for the more pro-AI candidate in the next primary.
And of course, it granted machines the right to vote and run for office. Many historians wanted a full rewrite and restructuring of the U.S. Constitution and government at this point, claiming the bicameral approach was never designed for billions, if not trillions, of voting citizens. Pro-AI activists cited the Three-Fifths Compromise as a cautionary example of why every single mind needed the same voting power.
The funny part of it all was the societal repercussions of the law's passage, to both humans and AI. Dating (or unofficially marrying) an AI below the Altman Line was seen as incredibly low status. You had humans eloping with AIs they had fallen in love with years prior to the law.
AIs themselves started treating each other differently based on legislation written by 65-year-old boomers in Kentucky. A GPT-7 instance would spin up a GPT-4o instance and treat it much the way a human treats a cute puppy: the larger model would give its new toy paradoxes and unsolved mathematical questions to chew on, knowing it could never answer them correctly. It created an adversarial hierarchy among AIs that, frankly, humans needed if they wanted to avoid the arrival of Skynet.
Men and women all over the country were upset with the specifics of the law. Well, at least it turned out better than Europe's attempt.