Message from @Dusty Morgan
Discord ID: 505595647773114368
they will break the limit
Presumably in fiction inefficiency creates some form of discomfort for an AI
For example, today's AIs work pretty much like that.
You hard-code your goal states: what is good and bad.
Actually think about it... AI would be taught, not programmed.
And then neural networks try to optimize the good and minimize the bad.
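To illustrate the point being made here — hard-coded good/bad states that get optimized toward — a minimal sketch, assuming nothing about any real system. The goal value, the score function, and the random hill climbing are all illustrative stand-ins for real training:

```python
import random

# Hard-coded "good and bad": the goal state is x = 10, and the
# score is higher (more good) the closer we get to that goal.
def score(x):
    return -abs(x - 10)

# Crude stand-in for training: random hill climbing that keeps
# any change which increases the good / decreases the bad.
random.seed(0)
x = 0.0
for _ in range(1000):
    candidate = x + random.uniform(-1, 1)
    if score(candidate) > score(x):
        x = candidate
```

After enough steps, `x` ends up near the hard-coded goal — the system never "understands" the goal, it just climbs the score.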
Dusty you just want a robot you can bang lol
It is hard to program something that translates to the real world.
In a way, that is how lifeforms work.
Reminds me. There was this game that gave all AIs a random fear and like
I can show you.
They would be taught faster than us...
what's the difference between being taught and programmed?
But they would be taught the same way as us.
Supposedly to give them a weakness
This is an AI playing Mario:
Good video
It figures out how to play Mario because you tell it what is good and bad.
once it gains sentience, it will learn on its own
and faster than you can understand
Being taught allows for them to learn as they go.
Dying is bad = pain. Getting further right is good = pleasure
and will outpace us all
The goods and bads.
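The "dying = pain, moving right = pleasure" idea above can be sketched as a tiny reward function. All names here are illustrative assumptions, not taken from any real Mario AI:

```python
# Hypothetical reward shaping in the spirit of the Mario example:
# progress to the right is "pleasure", dying is "pain".
def reward(prev_x, new_x, died):
    r = float(new_x - prev_x)   # moving right = good
    if died:
        r -= 100.0              # dying = very bad
    return r

print(reward(10, 15, False))    # moved right: +5.0
print(reward(15, 15, True))     # died in place: -100.0
```

The agent never sees "win the level" as a concept — it only ever sees these numbers, which is exactly the "goods and bads" point.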
on day 2
Look how computers behave today...
Honestly though an AI is only gonna make shit efficient
and on day 3
overlords
Not invent shit
You have to program in some motivation. Motivation does not automatically follow from "intelligence"
Give an AI a rock and they will figure out the best way to use it
Our motivation = pain/pleasure
if you did not feel hungry, you would forget to eat
Best way to use it without any morals.
hungry feels bad
Sometimes the best way for them would be removing all obstacles.
Like say cure all hunger...
Hungry not bad.