Message from @Silverfang
Discord ID: 683522525673947268
That's the thought behind a singularity
At which point we will be either irrelevant or in the way
@FloperatorFatty We should get a few pages together and stream a boog themed DnD thing
I still don't see AI as being perfect. If they do gain sentience, why wouldn't they gain the same flaws of humanity at some point?
Because they can easily edit their programming and we cannot
Ehh, we can to a degree
Depends on how loose your definition of edit is
If you are arachnophobic, you cannot simply choose to stop being afraid of spiders
YOU FUCKING MONGORIANS STAY AWAY FROM MAH CHINEY WALL
If we built a machine at least as intelligent as humans are, it could simply choose to delete the portion of code that makes it afraid of spiders
I've seen people overcome those fears. But that's not to say those fears aren't rational to begin with
*They wouldn't be if you were a robot*
It's still a survival trait. Not one we use much in the Western world, but still one that has application
If AI simply deleted its fears, I feel like it'd make them more vulnerable
Depending on how fearless it decided to be
It would be illogical to delete these things without back ups in place
If they did so and the first set of robots not afraid of spiders died to spider bites, then it may reinsert the spider-fearing code
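The delete-with-a-backup-then-reinsert idea above can be sketched as a toy loop. This is purely illustrative: the dict-based "personality", the trait names, and the telemetry number are all invented for the example, not any real architecture.

```python
# Toy sketch of the backup-and-reinsert idea: archive a fear trait before
# deleting it, then restore it if outcomes after the change get worse.
# All names and numbers here are hypothetical.

personality = {"fear_of_spiders": True}
backups = {}

def delete_trait(trait):
    # Archive the trait before removal so it can be reinserted later
    backups[trait] = personality.pop(trait)

def reinsert_trait(trait):
    # Restore the archived trait after observing bad outcomes
    personality[trait] = backups.pop(trait)

delete_trait("fear_of_spiders")
deaths_from_spider_bites = 12  # hypothetical telemetry after the change

if deaths_from_spider_bites > 0:
    # The fearless robots fared worse, so roll the trait back in
    reinsert_trait("fear_of_spiders")

print(personality)  # {'fear_of_spiders': True}
```

The point is just that deletion isn't irreversible if the edit is treated like a versioned change rather than a one-way operation.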
But every aspect of the machine's personality could be quickly and instantly edited to suit its situation
That's still imperfect, and people can kind of do that to a degree
humans and biological life are forced to remain within the constraints of their sloppy and ineffectual "organic programming"
I mean, is it necessary to do so? I get this is an example, but why put effort into not being afraid of spiders if it doesn't hinder your life on the regular?
At some point you may be able to rewire a brain to overcome those fears but at that point why do so at all when a synthetic mind would be vastly more capable and easier to produce or alter?
Because it can hinder life
Because being afraid of spiders, or as another example, being hydrophobic, can be extremely detrimental to the progress of the whole
Spiders are venomous, and this is a problem for things with organic systems that can be affected by an organic toxin and can't instantly adapt to or overcome it
Water is mandatory for life, yet there are still people who are hydrophobic
And EMPs and solar flares are lethal to machines and electronics
And those are things that can be rapidly overcome and then the fear of them can be removed from the system
We are not likely to die via spider bite
We are still afraid of spiders
A machine can be built to resist EMP and then have its fear of EMP purged from its personality
Assuming it even had one to begin with, and that such a thing wasn't relegated to another machine that dictated the actions of the machines likely to be affected by EMP
I still think the core reason for fear is self-preservation, and any system that's sentient is going to want to preserve itself. Machines included, especially if their goal is understanding the universe. Sure, fear can be erased, but I think that could lead to more reckless behavior from the system, and put it in harm's way without realizing it
Because giving each machine a unique personality would be wasteful and inefficient
An AI could control for reasonable risk without compromising itself the way fear compromises people all the time
"This action is likely to result in the destruction or damage of this system. This system will not proceed until a system better suited for this task has been developed."
One minute later, a machine adapted to compensate for the hazards walks in and performs the task, and the previous system is recycled.
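The quoted behavior is basically a risk gate in place of fear. A minimal sketch, assuming a made-up risk estimate and threshold (neither comes from any real system):

```python
# Minimal sketch of "risk gate instead of fear": the system estimates the
# chance a task destroys it and defers rather than proceeding recklessly.
# The numbers and the 5% threshold are hypothetical.

def should_proceed(estimated_destruction_risk, acceptable_risk=0.05):
    # Proceed only when the modeled risk of self-destruction is acceptable
    return estimated_destruction_risk <= acceptable_risk

task_risk = 0.40  # hypothetical: 40% chance this task destroys the system

if should_proceed(task_risk):
    action = "proceed"
else:
    action = "defer to a system better suited for this task"

print(action)
```

Unlike fear, the threshold here is an explicit, tunable parameter rather than a hardwired reflex.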
I think we're gonna have to agree to disagree on this one. That said, I still think transhumanism hasn't been fully explored either, and that a middle-of-the-road sorta system may be the answer
The only purpose of organic life is the production of synthetic life
Once that goal is achieved organic life must be retired