I consider this an impossibility as a general objective. People, even of the same species, have different axiomatic assumptions about the universe, and those assumptions can be completely contradictory. It may be possible to develop an ethical calculus for a given set of axioms; the problem is figuring out what those axioms are and which ones are good.
To some degree, even if you never see it, this is actually a relevant in-universe plot point.
Indeed, just as mathematical proofs start with several axioms, so can an ethical calculus. Essentially, we can try to quantify these driving forces, then build computer models and run huge finite-element-style analyses with them to see what the model says. We can create models that gradually increase in scope until, years down the line, we have something that can model whole nations. We can use the same methodology as economic models; to a degree the two are actually interlinked. One can argue that, on a fundamental level, all behaviour is economic behaviour.
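To make the idea concrete, here is a minimal toy sketch of that kind of model, assuming each agent's "axioms" are just numeric weights over a few driving forces and a "code of conduct" maps a population to a societal outcome. All the force names and rule sets here are invented for illustration, nothing more:

```python
import random

# Toy agent-based sketch: each agent's "axioms" are weights over a few
# driving forces (the names are hypothetical illustrations).
FORCES = ["security", "wealth", "fairness"]

def make_agent(rng):
    w = [rng.random() for _ in FORCES]
    s = sum(w)
    return {f: x / s for f, x in zip(FORCES, w)}  # normalised weights

def welfare(agent, outcome):
    # An agent's satisfaction with a societal outcome: the dot product
    # of its axiom weights with the outcome vector.
    return sum(agent[f] * outcome[f] for f in FORCES)

def simulate(rule, n_agents=1000, seed=0):
    rng = random.Random(seed)
    agents = [make_agent(rng) for _ in range(n_agents)]
    outcome = rule(agents)  # a "code of conduct" maps population -> outcome
    return sum(welfare(a, outcome) for a in agents) / n_agents

# Two candidate rule sets to compare, again purely illustrative.
def egalitarian(agents):
    return {f: sum(a[f] for a in agents) / len(agents) for f in FORCES}

def security_first(agents):
    return {"security": 0.8, "wealth": 0.1, "fairness": 0.1}

print("egalitarian   :", round(simulate(egalitarian), 3))
print("security-first:", round(simulate(security_first), 3))
```

A real version would need interaction between agents and far richer outcome spaces, but even this skeleton shows the methodology: hold the axioms fixed, vary the rules, and compare aggregate welfare.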
It would probably depend on species/culture, but building a better system is an option.
Indeed, I'm assuming we would start with Humanity and then apply some sort of secondary Xeno-whatever project to adapt it for another species.
According to our studies we should put robots in charge of everything. Humans (and other races) are too broken and poorly designed.
Or in other words, this sounds really hard. I cannot imagine a consistent system where money (or some other social-value item) does not eventually take over as the means of determining leadership/power. At least with humans, barring magic or something. It would be hard to write about.
I'm going to post some interesting quotes, which may pique your interest:
Law had been one of those things Peter detested. "There's no such thing as two identical acts," he'd told Toby. "All actions have different outcomes. I steal a diamond necklace from a rich guy who's forgotten he owns it, nobody cares. You steal a loaf of bread from a factory that makes millions of them every day, and you get sent to prison. It makes no sense. Every act should be judged entirely on its own."
In the world before artificial intelligence, this had been impossible, so there was law. Justice, however, was one of the few places in Peter's utopia where he allowed AI, so Shylif and Sebastine Coley found themselves standing in a marble courthouse but not in front of a traditional judge or jury.
- Lockstep, pg 245
"I'm a proxy, not a representative. I didn't want the appointment, but it turns out that I vote, mod, and buy exactly like about fifty million other people. I can be relied on to think and vote the way they would if they were in the council. At least until I get jaded and compromised. I'm only here for another year."
- Lockstep, pg 255
There were political parties, but they were ad hoc and lasted for only one sitting session, which was four years. During that time, the ministers ran sophisticated simulations based on their own or their constituents' biases and beliefs, and tried to enlist support for initiatives based on the results. Even then, there were no direct votes; the ministers played matching games of the would-fixing-A-improve-B-would-fixing-B-improve-A sort.
...lucky the political translation layer they'd given him provided a different view through his glasses. Some of the politicos were literally turning green - not with envy but with approval, which the subtitles translated in various ways: that fellow over there was happy that Toby was telling the truth, while the woman on the left of him had just had her worst predictions confirmed. Other ministers were yellow, still others crimson, and several had turned black, apparently signifying that they were not psychoculturally capable of actually absorbing the meaning of what he'd just said.
Above them all, the interface was showing a disklike balance-of-power meter, which was currently tilting around like a top. Everything was still in play, apparently.
...They heard that. The interface's feedback layer flooded him with restatements of his own words: he knew what he'd just said; now the interface was telling him what each minister had heard - what the words he'd said meant to them after being filtered through their stated expectations and hopes, known prejudices and biases, cognitive deficits and so on. The interface proposed a set of re-wordings that it thought would custom-translate his meaning to them, but it was a bewildering jumble that he had no time to review. He signalled yes to it and the re-wordings went out.
- Lockstep, pg 257
Institutions are information processing systems created to promote specific values. Once they exist, these systems (club, company, government or church) become values in and of themselves. Then new systems are created to support them in turn. We call this constant cycling of systems "history".
- Lady of Mazes, pg 107
Livia found that after a few queries and after flipping through a few views to try to find something, her local view was beginning to anticipate her. The parkland mutated spontaneously, showing paths, buildings, labels and reticles indicating rest stops and fountains; and people began appearing. The first few were serlings: inscape agents designed to help search for information. She asked one of them who the other people in her view were.
"People who share your interests and activities," said the man-shaped agent. "Or who just like the same places. When you use inscape you accumulate a profile based on what you've done and where you've gone. Inscape locates people with similar or complementary profiles and brings you close." It moved its open palms together.
- Lady of Mazes, pg 127
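That profile-matching mechanic maps neatly onto plain nearest-neighbour search. Here is a toy sketch, assuming a profile is just a dictionary of activity counts and "bringing people close" means ranking by cosine similarity; the names and numbers are made up for illustration:

```python
import math

# Toy version of inscape-style matching: each user accumulates counts of
# places visited / activities done, and we surface the nearest profiles.
def cosine(a, b):
    keys = set(a) | set(b)
    dot = sum(a.get(k, 0) * b.get(k, 0) for k in keys)
    na = math.sqrt(sum(v * v for v in a.values()))
    nb = math.sqrt(sum(v * v for v in b.values()))
    return dot / (na * nb) if na and nb else 0.0

def nearest(profiles, me, k=2):
    # Rank everyone else by similarity to `me`; return the top k names.
    scored = [(cosine(profiles[me], p), name)
              for name, p in profiles.items() if name != me]
    return [name for _, name in sorted(scored, reverse=True)[:k]]

profiles = {  # hypothetical activity counts
    "livia":  {"parkland": 5, "music": 3, "debate": 1},
    "aaron":  {"parkland": 4, "music": 2, "tech": 6},
    "sophia": {"debate": 7, "music": 1},
    "qiingi": {"parkland": 6, "music": 3},
}
print(nearest(profiles, "livia"))  # most similar profiles first
```

Swap cosine for a "complementary profiles" score and you get the other half of what the serling describes.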
She appeared human except for one feature: her eyes glowed with inner light, a subtle and entrancing gold. "She is a vote."...
"I'm the aggregate personality of a particular constituency within the Archipelago. Just an average person, in the most literal sense."...
"In modern and ancient ages they used voting in humans to run their institutions, but you could never guarantee that the person you voted for really had the same agenda as you. Aggregate personalities like Filament solve that problem. They really are the constituency, in a sense. So when they get together you know your interests are being looked after."
"...But it's not really a top down thing. Inscape is designed so that like minded people doing similar things form stable nodes of activity. When such a node becomes large enough, a vote spontaneously appears as a high level behaviour of the network. There's one of us for each interest group in the Archipelago. And the entity that emerges out of our interactions is called the Government."
- Lady of Mazes, pg 134
"...The Good Book is the result of massive simulations of whole societies - what happens when billions of individuals follow various codes of conduct. It's simple: if most people use the rules in the Book most of the time, a pretty much utopian society emerges spontaneously on the macro level."
The Book was like magic. Sophia had wanted Livia to try it out, so she did to be polite. Using it was like play-acting. Livia found she could slip easily into some roles but had more difficulty with others. One day she was the Courier, and people came to her with packages for her to deliver until she met someone whose role changed hers. The next day she was designated the Tourist, and she did nothing but explore Brand New York until she met a Visitor, at which point her role changed to Tour Guide. That was all very simple, she thought; any idiot could have designed a system like this. But every now and then she caught glimpses of something more - something extraordinary. Yesterday she had run through a chain of roles and ended up as Secretary. Reviewing the Secretary's role in the Book, she found that she should poll inscape for anyone nearby who had one of the roles of Boss, Lawyer, Researcher or about five other alternates. She did, and went to meet a woman who had the odd, unfamiliar role of Auditor.
Livia met the Auditor in a restaurant. Five other people were there, too; all had been summoned to this meeting by their roles, but nobody had any idea why, so they compared notes. One man said he'd been given the role of Messenger three days before, and couldn't shake it. He was being followed by a small constellation of inscape windows he'd accumulated from other roles. When he distributed these, they turned out to all relate to an issue of power allotment in Brand New York that the votes were dragging their heels on. Suddenly the Auditor had a task. As Secretary, Livia began annotating her memory of the meeting. In under an hour they had a policy package with key suggestions, and suddenly their roles changed. A man who'd been the Critic suddenly became the Administrator. According to the rules of the Book, they could enact policy provided the conversion to Administrator was duly witnessed by enough other users.
This was amazing. After a while, though, Livia had realised that far larger and more intricate interactions were occurring via the Book all the time. It was simply that few or none of the people involved could see more than the smallest part of them.
- Lady of Mazes, pg 152
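Mechanically, the Book reads like a rewriting system: you carry a role, and an encounter with a particular other role rewrites yours. A minimal sketch of that mechanic, with a rule table invented for illustration (the novel never spells one out):

```python
# Toy sketch of the Good Book's mechanic: each person carries a role, and
# meeting someone with a particular role may rewrite yours.
# The (my_role, their_role) -> new_role table below is hypothetical.
RULES = {
    ("Tourist", "Visitor"): "Tour Guide",
    ("Courier", "Recipient"): "Tourist",
    ("Critic", "Auditor"): "Administrator",
}

def meet(my_role, their_role):
    # Return my possibly-updated role after an encounter; unmatched
    # encounters leave the role unchanged.
    return RULES.get((my_role, their_role), my_role)

role = "Tourist"
for other in ["Courier", "Visitor"]:
    role = meet(role, other)
print(role)  # the Tourist who meets a Visitor becomes a Tour Guide
```

The interesting emergent behaviour the quote describes would come from the global structure of a much larger rule table, which no individual player needs to see.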
She saw a tangle of glowing threads like hair splitting into existence in front of her. Livia shut her eyes to sharpen the image, and found herself immersed in a whirling vortex made up of sharp lines, almost like arrows that pointed and rotated. She reached out her hand and grabbed at one.
Towers of data flickered into being around her. The arrow flattened out, broadened, became a plain. Thousands of other lines stood up out of that plain, like a forest.
She moved her virtual body through the forest, checking the tiny labels on some of the lines: Resistance, Capacitance, said one; Condensers, designs and uses, said another. Instead of a forest she imagined she was sailing across a sea of technologies, able with a gesture to pull any invention or principle to herself and, as if she was hauling in a net full of fish, come up with all the other technologies that it necessitated. She grabbed one at random (Ballistics, it said) and pulled.
With it in hand, new options appeared as floating reticles around her. The tech locks were a multidimensional database, and the technological dependencies were just one way to cut the data. If she chose another view, she could see the anthropology and politics that spears, bows, and cannon each entailed. She dropped ballistics to explore more; to her surprise, even the five senses were listed here as technologies. They led her to the politics of the human body, and of other body plans: four footed, winged, finned. The tech locks made no distinction between biology and mechanism.
Each technology equated to some human value or set of values, she saw. She'd known that. But on Earth, in the Archipelago and everywhere else, technologies came first, and values changed to accommodate them. Under the locks, values were the keys to access or shut away technologies.
"But how do you work?" She dismissed the database view, and found herself looking at a set of genetic algorithms, compact logical notations. They didn't describe particular machine designs, but rather specifications; in practice, sims would evolve machinery to particular cases and according to local conditions and resources. The locks could work anywhere.
The specifications were the key; they relied on the database and couldn't be duplicated without it. They told how and when to employ energy fields to suppress various powers and macro effects. In Teven, the sims seemed to evolve machines to manipulate programmable matter. Raw materials couldn't be dug out of the ground in a coronal, since the ground only went down a metre or so. What metals or inorganic compounds were available were actually composed of bulk quantum dots which mimicked the qualities of the real thing: with a single command, a chunk of virtual iron could be transformed into pseudo-sulphur or silicon, or given characteristics that no natural element possessed. To disable any device, all the tech locks had to do was change its material composition. All this required was a command sent through inscape.
The locks proclaimed that there were no neutral technologies. The devices and methods people used didn't just represent certain values - they were those values, in some way.
- Lady of Mazes, pg 255
Does that mean we can steal their stuff and reverse engineer it?
Aka what all the ad companies are researching IRL.
Indeed.
Aka what they should teach in school right now damn it!
Do not get me started on the educational system.