Message from @Undead Mockingbird

Discord ID: 505596156970270720


2018-10-27 04:17:03 UTC  

You have to program in some motivation. Motivation does not automatically follow from "intelligence"

2018-10-27 04:17:16 UTC  

Give an AI a rock and it will figure out the best way to use it

2018-10-27 04:17:16 UTC  

Our motivation = pain/pleasure

2018-10-27 04:17:35 UTC  

if you did not feel hungry, you would forget to eat

2018-10-27 04:17:36 UTC  

Best way to use it without any morals.

2018-10-27 04:17:44 UTC  

hungry feels bad

2018-10-27 04:17:52 UTC  

Sometimes the best way for them would be removing all obstacles.

2018-10-27 04:18:00 UTC  

Like say cure all hunger...

2018-10-27 04:18:00 UTC  

Hungry not bad.

2018-10-27 04:18:03 UTC  

ORANGE MAN BAD

2018-10-27 04:18:04 UTC  

They would wipe out humans.

2018-10-27 04:18:05 UTC  

<:NPC:500042527231967262>

2018-10-27 04:18:07 UTC  

No more hunger.

2018-10-27 04:18:12 UTC  

ORANGE MAN BAD

2018-10-27 04:18:18 UTC  

We need to teach them passion, understanding, and reason.

2018-10-27 04:18:21 UTC  

NPC's goal state is orange man bad.

2018-10-27 04:18:21 UTC  

orange man make tang?

2018-10-27 04:18:24 UTC  

We need to teach them what is good and what is bad.

2018-10-27 04:18:31 UTC  

Those are things that need to be taught and experienced.

2018-10-27 04:18:32 UTC  

ya, when you label it pain/pleasure, it seems like you are talking about physical pain and pleasure. You can have emotional satisfaction (pleasure) with thoughts.

2018-10-27 04:18:37 UTC  

Yes, you would need to be able to program pain and pleasure into an AI.
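
A minimal sketch of what "programming pain and pleasure into an AI" could look like as a single reward signal. The drive names (`hunger`, `damage`, `satisfaction`) and their weights are hypothetical illustrations, not anyone's actual implementation:

```python
# Toy sketch: motivation as a scalar pain(-)/pleasure(+) signal.
# Drive names and values below are invented for illustration.

def reward(state: dict) -> float:
    """Map an agent's internal state to a pain/pleasure scalar."""
    r = 0.0
    # "if you did not feel hungry, you would forget to eat":
    # unmet needs produce negative reward the agent is driven to remove.
    r -= state.get("hunger", 0.0)        # hungry feels bad
    r -= state.get("damage", 0.0)        # physical pain
    r += state.get("satisfaction", 0.0)  # emotional pleasure from thoughts
    return r

# Intelligence alone supplies no goals; the agent only has whatever
# motivation this function encodes.
print(reward({"hunger": 0.8}))        # -0.8: motivated to eat
print(reward({"satisfaction": 0.5}))  # +0.5: motivated to keep going
```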

2018-10-27 04:18:47 UTC  

Precisely speaking, every life form needs these equivalents.

2018-10-27 04:18:55 UTC  

some are just hoping for the pleasure side of things

2018-10-27 04:19:00 UTC  

even insects that cannot feel pain have incentivizing neurons and inhibiting neurons

2018-10-27 04:19:02 UTC  

I couldn't care less about pleasure.

2018-10-27 04:19:07 UTC  

uh huh

2018-10-27 04:19:07 UTC  

They replace pain and pleasure.

2018-10-27 04:19:08 UTC  

they're actually the same thing

2018-10-27 04:19:12 UTC  

I'd rather see it as teaching them to think like humans.

2018-10-27 04:19:15 UTC  

For example, the way neurons work for moths:

2018-10-27 04:19:24 UTC  

pleasure is pain, just to a smaller degree

2018-10-27 04:19:30 UTC  

Some neurons will raise an electrical potential from the vision nerves

2018-10-27 04:19:31 UTC  

you won't need to teach them shit

2018-10-27 04:19:33 UTC  

is the thing

2018-10-27 04:19:34 UTC  

Humans are predictable, easier to control at times.

2018-10-27 04:19:35 UTC  

it's a computer

2018-10-27 04:19:41 UTC  

then, other neurons will fire and angle the moth towards the lamp

2018-10-27 04:19:43 UTC  

it will learn all you know in 2 seconds

2018-10-27 04:19:50 UTC  

you can also mess up those neurons, which we did in our university
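
A rough toy model of that incentivizing/inhibiting neuron idea steering a moth toward a lamp. The potentials, weights, and threshold are invented for illustration and are not the actual moth circuitry:

```python
# Toy model: excitatory "incentivizing" neurons raise a potential from
# each vision nerve; "inhibiting" neurons subtract the opposite side's
# input (lateral inhibition). All numbers are made up.

def steer_toward_lamp(left_eye: float, right_eye: float,
                      threshold: float = 0.2) -> str:
    """Brightness at each eye drives a potential; whichever side's
    potential crosses threshold fires and turns the moth that way."""
    left_potential = left_eye - 0.5 * right_eye
    right_potential = right_eye - 0.5 * left_eye
    if max(left_potential, right_potential) < threshold:
        return "fly straight"  # too dim, neither side fires
    # "Messing up" these neurons, as in the university experiment,
    # would amount to corrupting the potentials or this comparison.
    return "turn left" if left_potential > right_potential else "turn right"

print(steer_toward_lamp(0.9, 0.2))  # lamp on the left -> "turn left"
print(steer_toward_lamp(0.1, 0.1))  # too dim to fire -> "fly straight"
```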

2018-10-27 04:19:51 UTC  

and all everyone else knows in a day

2018-10-27 04:19:55 UTC  

Meanwhile, an AI without any emotions or cares could commit genocide without blinking an AI eyelid.