
Setting riff - A multi-sector utopian world

Adam Reynolds

Registered User
Validated User
So this underlying idea is not mine. It is largely taken from Max Tegmark's book Life 3.0, which discusses possibilities for a future involving the creation of artificial superintelligences. One of the possible future scenarios he envisions is what he calls the benevolent dictator model. In addition to eliminating poverty, disease, and any kind of violence, freeing humanity to enjoy a life of leisure, the AI also creates a range of different sectors that allow people to live in different ways. This would allow, for example, a dedicated knowledge and education sector, a hedonistic party sector, a traditional living sector, and a wildlife preservation sector. It could also include ideological sectors with a range of local rules, from an Islamic sector to a libertarian sector. The only universal rules are that violence and all weapons are prohibited, and that developing your own artificial superintelligence is prohibited. Beyond this, every local sector is free to have whatever rules it wishes, with the caveat that punishment is optional, as the other option is banishment. People are also absolutely free to come and go to any sector they choose, with no restrictions on leaving (other than social ones). In all cases the overall standard of living is higher because of better technology, allowing a traditionalist to live as a hunter-gatherer without the downsides.

While this would allow a great deal of human enjoyment in a superficial sense, it also has more than a few downsides. The largest problem is obvious: humans are effectively animals in a zoo and have absolutely nothing they need to do for any reason. It would also wind up like the Twilight Zone episode "A Nice Place to Visit," in which you always get what you desire. All of human existence is ultimately meaningless, as all challenges are artificial. Beyond this, because the rules are absolute in each sector, large parts of humanity could be upset that other sectors have rules they strongly disagree with, and they would be unable to do anything about them in the face of an ASI.

While this version of the setting doesn't work at all, with a few tweaks it could be a potentially interesting campaign setting. Obviously the AI elements would need to be reduced so that there is actually something for humans to do, mainly enforcing the laws that allow the system to function. The primary job would be handling cases like a person committing a murder in one sector before fleeing to another. There are also issues like custody disputes: if one parent leaves a religious sector and lives in a hedonistic sector, where does the child live? How could human agents of a system like this ever be unbiased?

The problem of moral outrage within this system would also mean agents trying to work against the system from within, as well as various groups trying to burn the whole system down. As technology becomes more advanced, even relatively small groups could be a threat, especially if they have nearly unlimited free time in which to work on this. Which side the PCs wind up on would be an interesting part of the setting creation, though you would have to explore what has changed that has suddenly made a revolution possible.

Without an AI, one possible origin of this system could be climate change, as mass migrations have upended almost all of the traditional national societies around the world. As things have largely settled down, there are those questioning whether the system is still necessary, or whether it was ever a good idea in the first place.

On the smaller scale, it would be an interesting way to explore what different ideologies really work in terms of properly organizing society. As people are free to leave, it would allow all sorts of social systems to be tested.
 

starkllr

Bureau 13
Validated User
The Glitter Band in Alastair Reynolds' Revelation Space series is a lot like this setting.

10,000 independent orbital habitats, each operating under its own laws and social systems, with the agents of Panoply monitoring them to ensure that the only truly iron-clad rules (that every citizen can vote on every issue, and that the security and validity of the vote is absolutely sacred) are enforced.

And yet there's still crime both within and between habitats, and external threats to the whole system as well.
 

mindstalk

Does the math.
Validated User
Ada Palmer's Too Like the Lightning is kind of this, on Earth (mostly, though the Utopian hive has space activity). I don't remember what the history is, but people can move around the world very quickly in superfast aircars. Somehow this has led to people being in one of various voluntary groupings, each with its own laws, though there are some minimal overarching laws, regarding pollution or WMDs or the rights of minors. For Blacklaws, even murder of other Blacklaws is 'legal'.
 

Psalmanazar

A Friend And Boy
Validated User
developing your own artificial superintelligence is prohibited.
That's going to be a huge source of conflict: imagine knowing that there are these transcendent beings out there, and that you're not allowed to join them. You'd very quickly end up with something like transhumanism as a civil rights movement.
 

Cold Steel

Registered User
Validated User
The only universal rules are that violence and all weapons are prohibited, and that developing your own artificial superintelligence is prohibited. Beyond this, every local sector is free to have whatever rules it wishes, with the caveat that punishment is optional, as the other option is banishment. People are also absolutely free to come and go to any sector they choose, with no restrictions on leaving (other than social ones).
How are these restrictions enforced?

If I'm living as a hunter-gatherer and decide to shoot Ugg with my bow and arrow, then what are the consequences? And who steps in if a particular society decides to opt out from the rules (e.g. if they decide that a particularly heinous crime deserves a death penalty, and they aren't willing to allow the criminal to "opt out")? If the criminal's already dead when the enforcers turn up, then what happens next?

It might also be worth considering how society responds to unreformable offenders (e.g. child molesters and their ilk). Do they just get shunted from one sector to another? Do you really have no choice but to allow dangerous criminals to move in, or is there something that can be done to restrict access?

Edit: Another angle of criticism is that the choice people are being given in this society is largely cosmetic. And that the super-AI is forcing them all to accept rules that are entirely at odds with the way most societies choose to operate. That doesn’t sound utopian at all: in fact, I’d suggest it’d get very dystopian very quickly.
 

mindstalk

Does the math.
Validated User
I haven't read this Tegmark book, but using the Culture as a model, the AI would have universal surveillance. Try to shoot someone and it intervenes. Succeed somehow in harming someone (maybe the AI isn't as godlike as Culture Minds who have bullshit physics to work with) and you get a drone-escort for the rest of your life to really keep you from hurting someone. As a bonus, it can also keep them from hurting you (as might be needed for a child molester).

I don't see how the rules are dystopian. Freedom to leave is pretty basic, assuming there's somewhere for you to go. Preventing violence is a good thing.

Ooh, though I just remembered a similar thread from some time ago. The hard question isn't violent criminals, it's stuff like taxation. If the local AI-god is preventing all violence and coercion, how do you force people to contribute to the common good, or control things like overfishing? It might be simpler here; I think the old thread simply posited a blanket prohibition on violence, while here we have an AI that can respond intelligently to problems.
 

Adam Reynolds

Registered User
Validated User
I haven't read this Tegmark book, but using the Culture as a model, the AI would have universal surveillance. Try to shoot someone and it intervenes. Succeed somehow in harming someone (maybe the AI isn't as godlike as Culture Minds who have bullshit physics to work with) and you get a drone-escort for the rest of your life to really keep you from hurting someone. As a bonus, it can also keep them from hurting you (as might be needed for a child molester).

I don't see how the rules are dystopian. Freedom to leave is pretty basic, assuming there's somewhere for you to go. Preventing violence is a good thing.

Ooh, though I just remembered a similar thread from some time ago. The hard question isn't violent criminals, it's stuff like taxation. If the local AI-god is preventing all violence and coercion, how do you force people to contribute to the common good, or control things like overfishing? It might be simpler here; I think the old thread simply posited a blanket prohibition on violence, while here we have an AI that can respond intelligently to problems.
Taxation wouldn't really matter, because a setting like this is essentially post-scarcity, in which AIs are so efficient that they can provide literally anything for humanity as desired. Though if you're taking out the AI as I was sort of thinking, that does indeed become a problem. This is also where you'd need some kind of authority to enforce things between sectors, and that kind of authority would need some kind of power that puts it above problems of taxation. One possibility that comes to mind is that they essentially control energy production via something like helium-3, which can only really be found in Lunar regolith or around the gas giants. Not sure what would prevent revolutionary pressures under such circumstances, but you can't have everything.

As for violence in the original version of the scenario, I didn't go into this part all that much, but the Tegmark scenario includes the idea that the AI could use some kind of device that allows it to monitor everyone, like an Apple Watch on steroids, but which could also prevent violence. Most ASI scenarios also assume the same sort of thing as Person of Interest: that almost all violent crimes can be predicted, because there is premeditation.
 

Matt Sheridan

Minus 10 horse points.
Validated User
Man, I'd really love to watch a police procedural in a setting like this. Lots of Star-Trek-style moral-dilemma-of-the-week plots structured around mysterious crimes and cross-zone disputes.

But, yeah, in order to make sure there's room for humans to strive and struggle and have the kind of problems player characters love, maybe you set the game shortly after a revolt that shut down the AI which used to run (and largely built) the whole society. So now there's a lot of infrastructure that doesn't work right anymore, and a shitload of people who aren't happy about the revolution, and surely no end of assholes who want to replace the AI with their own, old-fashioned, very human tyranny. Maybe the big setting secret is that the revolt was fueled by outright lies about the AI, or maybe it's that the whole thing was a sham and the AI is still in charge behind the scenes.

So do you picture this civilization as encompassing the entire population of Earth, or maybe a colony world, or just a single city or space station?

Anyway, one thing you might wanna check out—if you can possibly find it—is Luke Crane and Jared Sorensen's post-scarcity RPG, FreeMarket. Really interesting idea, wedded to a bunch of card-focused mechanics that I found really confusing.
 

Adam Reynolds

Registered User
Validated User
Man, I'd really love to watch a police procedural in a setting like this. Lots of Star-Trek-style moral-dilemma-of-the-week plots structured around mysterious crimes and cross-zone disputes.

But, yeah, in order to make sure there's room for humans to strive and struggle and have the kind of problems player characters love, maybe you set the game shortly after a revolt that shut down the AI which used to run (and largely built) the whole society. So now there's a lot of infrastructure that doesn't work right anymore, and a shitload of people who aren't happy about the revolution, and surely no end of assholes who want to replace the AI with their own, old-fashioned, very human tyranny. Maybe the big setting secret is that the revolt was fueled by outright lies about the AI, or maybe it's that the whole thing was a sham and the AI is still in charge behind the scenes.

So do you picture this civilization as encompassing the entire population of Earth, or maybe a colony world, or just a single city or space station?

Anyway, one thing you might wanna check out—if you can possibly find it—is Luke Crane and Jared Sorensen's post-scarcity RPG, FreeMarket. Really interesting idea, wedded to a bunch of card-focused mechanics that I found really confusing.
In terms of scale I'm thinking all of Earth, with perhaps a few space stations that are slowly being settled, which would give a nice dynamic in terms of different zones. This would indeed be best as a procedural type setting but with a larger plot, similar to Person of Interest. It has also occurred to me that this world has more than a bit in common with that of Psycho-Pass.

I like the idea of the AI allowing the revolt to occur so that it can take over behind the scenes, transitioning from a benevolent dictator to what is effectively a hidden God. This also makes me think of an interesting suggestion that was made about The Matrix, that a better version of the premise is that The Matrix is a simulation designed to give people purpose in a post-scarcity society. Most people are content within The Matrix itself, but those who have a desire to rebel also have a purpose within a second layer of the simulation in Zion. The AI could be doing the same thing, trying to maximize human happiness by allowing humans control over their own destinies, or at least the illusion of it.

Here is a very brief summary of the different possible futures suggested in Tegmark's book.
 