Message from @Undead Mockingbird
Discord ID: 505595817835495446
Supposedly to give them a weakness
This is an AI playing Mario:
Good video
It figures out how to play Mario because you tell it what is good and bad.
once it gains sentience, it will learn on its own
Programming can end up with errors....
and faster than you can understand
Being taught allows for them to learn as they go.
Dying is bad = pain. Getting further right is good = pleasure
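The good/bad scheme described above can be sketched as a simple reward function. This is a minimal illustration, not the actual code behind the video: the environment, positions, and penalty value are all hypothetical assumptions.

```python
def reward(prev_x: int, new_x: int, died: bool) -> float:
    """One-step reward for a hypothetical Mario-style agent.

    prev_x, new_x: horizontal position before and after the step.
    died: whether the agent died on this step.
    """
    if died:
        return -100.0             # dying is bad = pain (penalty value is an assumption)
    return float(new_x - prev_x)  # moving further right is good = pleasure
```

For example, `reward(10, 15, False)` gives +5.0 for moving right, while `reward(10, 15, True)` gives -100.0 regardless of progress.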
and will outpace us all
The goods and bads.
on day 2
Look how computers behave today...
Honestly though an AI is only gonna make shit efficient
and on day 3
overlords
Not invent shit
You have to program in some motivation. Motivation does not automatically follow from "intelligence"
Give an AI a rock and it will figure out the best way to use it
if you did not feel hungry, you would forget to eat
Best way to use it without any morals.
hungry feels bad
Sometimes the best way for them would be removing all obstacles.
Like say cure all hunger...
Hungry not bad.
ORANGE MAN BAD
They would wipe out humans.
<:NPC:500042527231967262>
No more hunger.
ORANGE MAN BAD
We need to teach them passion, understanding, and reason.
NPC's goal state is orange man bad.
orange man make tang?
We need to teach them what is good and what is bad.
Those are things that need to be taught and experienced.
ya, when you label it pain/pleasure, it seems like you are talking about physical pain and pleasure. You can have emotional satisfaction (pleasure) with thoughts.
Yes, you would need to be able to program pain and pleasure into an AI.
Precisely speaking, every life form needs these equivalents.
some are just hoping for the pleasure side of things