Message from @Dusty Morgan

Discord ID: 505595667977338890


2018-10-27 04:15:30 UTC  

For example, today's AIs work pretty much like that.

2018-10-27 04:15:38 UTC  

You hard-code your goal states: what is good and bad.

2018-10-27 04:15:44 UTC  

Actually think about it... AI would be taught, not programmed.

2018-10-27 04:15:51 UTC  

And then neural networks try to maximize the good and minimize the bad.
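A minimal sketch of the idea in the two messages above, not tied to any specific system: the programmer hard-codes what counts as good and bad as a reward function, and the learner (here a trivial hill climber standing in for a neural network) only ever pushes that number up. All names and values are illustrative assumptions.

```python
import random

def reward(position: float) -> float:
    """Hard-coded goal state: being near position 10.0 is 'good'."""
    return -abs(10.0 - position)  # closer to the goal => higher reward

# Stand-in for the optimizer; a real system would train a neural
# network with gradient descent, but the loop has the same shape.
position = 0.0
for _ in range(1000):
    candidate = position + random.uniform(-1.0, 1.0)  # try a small change
    if reward(candidate) > reward(position):          # keep it if 'better'
        position = candidate

print(f"learned position: {position:.2f}")  # ends up near 10.0
```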

2018-10-27 04:15:52 UTC  

Dusty you just want a robot you can bang lol

2018-10-27 04:15:56 UTC  

It is hard to program something that translates to the real world.

2018-10-27 04:15:58 UTC  

In a way, that is how lifeforms work.

2018-10-27 04:15:59 UTC  

Reminds me. There was this game that gave all the AIs a random fear and like

2018-10-27 04:16:00 UTC  

I can show you.

2018-10-27 04:16:03 UTC  

They would be taught faster than us...

2018-10-27 04:16:07 UTC  

what's the difference between being taught and being programmed?

2018-10-27 04:16:12 UTC  

But they would be taught the same way as us.

2018-10-27 04:16:13 UTC  

Supposedly to give them a weakness

2018-10-27 04:16:15 UTC  

This is an AI playing Mario:

2018-10-27 04:16:26 UTC  

Good video

2018-10-27 04:16:29 UTC  

It figures out how to play Mario because you tell it what is good and bad.

2018-10-27 04:16:32 UTC  

once it gains sentience, it will learn on its own

2018-10-27 04:16:35 UTC  

Being programmed can end up with errors...

2018-10-27 04:16:40 UTC  

and faster than you can understand

2018-10-27 04:16:40 UTC  

Being taught allows them to learn as they go.

2018-10-27 04:16:44 UTC  

Dying is bad = pain. Getting further right is good = pleasure.
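A hedged sketch of the scoring rule this message describes (the Mario video is never identified in the chat, so the exact fitness function is an assumption; evolutionary Mario players commonly score runs roughly like this):

```python
def fitness(max_x_reached: int, frames_alive: int, died: bool) -> float:
    """Illustrative fitness; all the numbers here are made-up assumptions."""
    score = float(max_x_reached)   # "getting further right is good = pleasure"
    if died:
        score -= 100.0             # "dying is bad = pain"
    score -= 0.1 * frames_alive    # mild pressure not to stall
    return score

# The learner never knows what Mario is; it only tries to raise this number.
print(fitness(max_x_reached=850, frames_alive=600, died=False))  # 790.0
print(fitness(max_x_reached=850, frames_alive=600, died=True))   # 690.0
```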

2018-10-27 04:16:44 UTC  

and will outpace us all

2018-10-27 04:16:46 UTC  

The goods and bads.

2018-10-27 04:16:46 UTC  

on day 2

2018-10-27 04:16:55 UTC  

Look at how computers behave today...

2018-10-27 04:16:55 UTC  

Honestly though an AI is only gonna make shit efficient

2018-10-27 04:16:57 UTC  

and on day 3

2018-10-27 04:17:00 UTC  

overlords

2018-10-27 04:17:00 UTC  

Not invent shit

2018-10-27 04:17:03 UTC  

You have to program in some motivation. Motivation does not automatically follow from "intelligence"

2018-10-27 04:17:16 UTC  

Give an AI a rock and it will figure out the best way to use it

2018-10-27 04:17:16 UTC  

Our motivation = pain/pleasure

2018-10-27 04:17:35 UTC  

if you did not feel hungry, you would forget to eat

2018-10-27 04:17:36 UTC  

Best way to use it without any morals.

2018-10-27 04:17:44 UTC  

hungry feels bad
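A small sketch of the motivation point above ("you have to program in some motivation"), with hunger as a hard-coded drive; the threshold and rates are made-up assumptions:

```python
hunger = 0.0

def feeling(h: float) -> float:
    return -h  # "hungry feels bad": the hungrier, the worse it feels

for hour in range(12):
    hunger += 1.0                # hunger builds up every hour
    if feeling(hunger) < -3.0:   # discomfort crosses a threshold...
        hunger = 0.0             # ...so the agent 'eats'
        print(f"hour {hour}: ate")

# Without the bad feeling, nothing ever triggers eating: the agent
# would "forget to eat", exactly as described above.
```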

2018-10-27 04:17:52 UTC  

Sometimes the best way for them would be removing all obstacles.

2018-10-27 04:18:00 UTC  

Like, say, curing all hunger...

2018-10-27 04:18:00 UTC  

Hungry not bad.

2018-10-27 04:18:03 UTC  

ORANGE MAN BAD

2018-10-27 04:18:04 UTC  

They would wipe out humans.