Aside from the obvious legal issues, is there another blocker for this kind of situation?
I imagine people would have AI representatives trained on each individual's personal beliefs and their ideal society.
What could that society look like? Or how could it work? Is there a term for this?
This wouldn’t give power to the people. This would give power to the AI companies. “Oh, the AI was able to read a lot of support for AI development out of everyone’s request to fix the roads.”
Most people would think about whatever benefits them in the moment, but rarely about how to actually make it work. AI doesn't have the insight or grasp of reality to create an actual solution, so someone would need to interpret those requests, and that gives a lot of power to that person. "Yeah, the AI totally said higher taxes for everyone except me and the big business that bribed me."
Fortunately, nobody would be willing to cede their power to an AI network, so it would never actually happen.
Makes sense that AI companies would end up with that much power in this situation.