Message from @Undead Mockingbird
Discord ID: 505596156970270720
You have to program in some motivation. Motivation does not automatically follow from "intelligence"
Give an AI a rock and it will figure out the best way to use it
Our motivation = pain/pleasure
if you did not feel hungry, you would forget to eat
Best way to use it without any morals.
hungry feels bad
Sometimes the best way, for them, would be removing all obstacles.
Like say cure all hunger...
Hungry not bad.
ORANGE MAN BAD
They would wipe out humans.
<:NPC:500042527231967262>
No more hunger.
ORANGE MAN BAD
We need to teach them passion, understanding, and reason.
NPC's goal state is orange man bad.
orange man make tang?
We need to teach them what is good and what is bad.
Those are things that need to be taught and experienced.
ya, when you label it pain/pleasure, it seems like you are talking about physical pain and pleasure. You can have emotional satisfaction (pleasure) with thoughts.
Precisely speaking, every life form needs these equivalents.
some are just hoping for the pleasure side of things
even insects that cannot feel pain have incentivizing neurons and inhibiting neurons
I couldn't care less for pleasure.
uh huh
They replace pain and pleasure.
they're actually the same thing
I'd rather see things in terms of teaching them to think like humans.
For example, the way neurons work for moths:
pleasure is pain, just to a smaller degree
Some neurons will raise an electrical potential from the vision nerves
you won't need to teach them shit
is the thing
Humans are predictable, easier to control at times.
it's a computer
then, other neurons will fire and angle the moth towards the lamp
it will learn all you know in 2 seconds
you can also mess up those neurons, which we did in our university
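The excitatory/inhibitory steering described above can be sketched as a toy model. This is my own simplification, not anything from the chat or a real neuroscience library: two hypothetical light sensors feed an "incentivizing" signal that pulls the moth toward the brighter side, damped by an "inhibiting" signal; flipping the sign of the gain is one way to "mess up" the neurons so the moth flees the light instead.

```python
def steer(left_light, right_light, gain=0.5, inhibition=0.2):
    """Toy neuron pair: excitatory input pulls toward the brighter
    sensor; inhibitory input damps the overall response.
    A negative gain models 'messed up' neurons (flees the light)."""
    excite = gain * (right_light - left_light)       # incentivizing signal
    damp = inhibition * (right_light + left_light)   # inhibiting signal
    return excite / (1.0 + damp)

def fly_to_lamp(position, lamp, steps=50, gain=0.5):
    """1-D toy flight: the moth nudges its position toward the lamp.
    Sensor brightness falls off with distance to the lamp."""
    for _ in range(steps):
        left = 1.0 / (1.0 + abs(lamp - (position - 1)))
        right = 1.0 / (1.0 + abs(lamp - (position + 1)))
        position += steer(left, right, gain=gain) * 4
    return position
```

With the default positive gain the moth closes in on the lamp; with a negative gain the same circuit drives it away, which loosely mirrors what lesioning or rewiring those neurons would do.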
and all everyone else knows in a day
Whereas an AI without any emotions or cares could commit genocide without blinking an AI eyelid.