Would you accept life under an objective supercomputer overlord?

I've genuinely thought about this myself and asked some of my friends and family members a similar question.

I'm going to choose the coward pussy option and say idk. The concept of being ruled by a benevolent God of our own design, free from strife and conflict, is definitely attractive and I can see why you'd want that.

I want to say no, I would prefer humanity be "free" to choose its own path, but really that just sounds like hubris. Are we really free in any meaningful way?

This is a very philosophical question.
 
What about a scenario where AI effectively controls things on the macro scale but gives nearly full control to humans on the micro scale? Each city district would have nearly full control over its bylaws (provided the administrators live within the district). Small towns would be nearly fully self-governing, but public services such as utilities and advanced medical care would be handled by the AI.
 
I wouldn't care that much either way, but ultimately it would be kind of pointless.

Unless the AI specifically intervenes to stunt human growth such that there's never any conflict, eventually it's going to have to start making tough decisions that aren't so different from what we deal with now.

For example, what's going to happen in a territorial dispute?
This period of peace is going to lead to population growth in certain areas, and those areas are going to start running out of room or needing more resources.

Does the AI start instituting forced population control? Does it start dedicating extra resources to them, or expanding their area and thus taking from other districts? Do they get a set amount of resources, so that quality of life in these areas drops and misery proliferates, but they have to suffer because of the decisions their ancestors made? Does the AI allow them to collapse into civil war? Does the computer promote reddit-style bugmanism, using propaganda techniques to discourage reproduction and any kind of aggressive behavior that might rock the status quo?


And that's just barely scratching the surface, but the point is that you could never have endless peace and prosperity, because humans want to propagate and they don't want to die, and you'll eventually have to apply a selective denial process to one of those two desires, based on some kind of value system, whether by force, by manipulation, or by letting them duke it out.

Making some new god only passes the buck.
 
Couldn't be any worse than living under the thumb of a bunch of corrupt, elitist, degenerate oligarchs who control the government, the corporate world, and the media, as we do at present.
^ agree
Sounds like a planned economy to me. Would probably suit me just fine.
 
Our current reality? I'd probably see the supercomputer as the least awful option.

It's far from what I'd call ideal, however.

The big problem with the supercomputer is that at some point technicians have to check up on it, and that's where human bullshit could be introduced. Anything man built, man can destroy.

I would rather find some way to make God an objective, undeniable reality even for non-Christians. God can't be corrupted, and does not need checkups. God also can't be blue screened.
 
I have read that plot before; yes, I would like to become a citizen of The Culture.
 
Probably. I wish we didn't have to, but a computer is less fallible than mankind. You could have the most perfect government in the world, and within 2-3 generations people will fuck it up with their internal scheming and bureaucratic nonsense. But a computer is autistic enough that it won't have that problem. It can keep carrying out its rule nigh-indefinitely, and its rule will keep the same qualities it always had. If people try to subvert the system by going around the computer, the computer can adapt and purge them for corruption.

I despise transhumanism, but there's no path in humanity's future that doesn't involve a giant supercomputer ruling over us, besides human extinction or anprim. WEFism, for instance, leads to supercomputer rule too, except they want it to preserve their evil rule over the planet forever.

It says something about our universe that the best option for our prosperity is to create a god for ourselves that we all can physically see and hear.
 
In my more demented moments, I've developed an idea of a possible future: an ever-perfecting totalitarianism. AI, even if it stays limited to the mere pseudo-sentience of what exists today, will become capable of identifying dissent by methods we don't even know about. More worrisome, it will almost certainly also be able to identify methods of indoctrination, which will slowly become near-perfect as well.

Couple this with biological and genetic research, and a Brave New World-esque situation is nearly inevitable. Future society will have not one master AI but probably millions of them, monitoring every single action you take, every word you say or type, etc. All of this will be analyzed to find the moments where you begin to break from the acceptable social state. And then advanced forms of re-conditioning will simply restore you to the acceptable social state as easily as fixing a flat tire.

The system would be self-perpetuating because, once the conditioning is properly developed, it wouldn't need to be secret. People would be conditioned to welcome the manipulation, to so fear being outside the acceptable social state that they would never voluntarily leave it.

And so it would remain: a future of smiling faces, perfectly happy and totally oblivious to the horror of their situation.
 
I like to think (and
the sooner the better!)
of a cybernetic meadow
where mammals and computers
live together in mutually
programming harmony
like pure water
touching clear sky.


I like to think
(right now, please!)
of a cybernetic forest
filled with pines and electronics
where deer stroll peacefully
past computers
as if they were flowers
with spinning blossoms.


I like to think
(it has to be!)
of a cybernetic ecology
where we are free of our labors
and joined back to nature,
returned to our mammal
brothers and sisters,
and all watched over
by machines of loving grace.
 
How many troons would pool together their tugboat to pay to DDOS it into oblivion?
 
How many troons would pool together their tugboat to pay to DDOS it into oblivion?
They would probably be struck down by orbital lasers before the attempt, assuming the computer thought they were a threat at all and couldn't just tweak its version of kiwiflare.
 