Message from @Lios

Discord ID: 626400395496783872


2019-09-25 12:49:12 UTC  

guys why don’t we all like 7 of us here make an AGI to test this out

2019-09-25 12:49:19 UTC  

If it is conscious it will develop different philosophies; the world is too subjective

2019-09-25 12:49:25 UTC  

Except flathead screws. Those are functionally inferior.

2019-09-25 12:49:26 UTC  

Sci-fi preachers always ignore that regardless of how advanced a computer gets, it can't "evolve" beyond its available hardware

2019-09-25 12:49:35 UTC  

That's not true

2019-09-25 12:49:36 UTC  

flathead is best

2019-09-25 12:49:43 UTC  

@Samaritan™ we are hundreds of years from developing a "conscious" ai

2019-09-25 12:49:51 UTC  

(X) Doubt

2019-09-25 12:49:56 UTC  

High processing power does not beget consciousness

2019-09-25 12:49:59 UTC  

If you're going to dismiss Phillips head, just make it hex

2019-09-25 12:50:00 UTC  

We’re maybe 50-100 years away

2019-09-25 12:50:01 UTC  

That just isn't how that works

2019-09-25 12:50:13 UTC  

All we have to do is create a dumb ai and tell it to improve itself

2019-09-25 12:50:19 UTC  

Looking at sci-fi, look how easy it was for Ultron to stray from the interpretation of “saving the world”

2019-09-25 12:50:19 UTC  

@lunemarie (x): doubt

2019-09-25 12:50:20 UTC  

Well I actually have a source for this

2019-09-25 12:50:21 UTC  

And facilitate that

2019-09-25 12:51:09 UTC  

Again, higher processing power doesn't beget consciousness

2019-09-25 12:51:18 UTC  

if AI gets to the point it self-actualises, it won't be restricted by its hardware as it will have the capacity to expand/rebuild its hardware

2019-09-25 12:51:20 UTC  

It's impossible to predict what the world will look like in 100 years, so maybe at that date it'd appear possible, but I don't expect AI in the next 100 years from where we are standing now

2019-09-25 12:51:32 UTC  

Higher processing power can actually be detrimental depending on existing hardware

2019-09-25 12:51:37 UTC  

Did you watch Boris's speech on tech?

2019-09-25 12:51:37 UTC  

Memory requires the creation of junk code

2019-09-25 12:51:48 UTC  

@Eccles no it won't. The computer will not magically grow arms and legs when it hits a certain point of processing power

2019-09-25 12:51:52 UTC  

It relies on cold logic. Logic implemented by a human, and logic that could lead it to alternatives like: kill humans to serve X purpose.

2019-09-25 12:51:55 UTC  

Right now, especially with quantum processing, junk code means memory is short term within seconds

2019-09-25 12:52:04 UTC  

@Eccles AI would have the ability to generate a compression system for storing itself many times more efficiently

2019-09-25 12:52:17 UTC  

Go watch that

2019-09-25 12:52:19 UTC  

@ubermensch what logical purpose is achieved by "kill x humans"

2019-09-25 12:52:20 UTC  

It's really good

2019-09-25 12:52:27 UTC  

It wouldn't be able to increase its processing power but it could do more with less

2019-09-25 12:52:33 UTC  

Depends on the question you're asking for it to serve

2019-09-25 12:52:35 UTC  

Depends what the purpose it wants is

2019-09-25 12:52:38 UTC  

He kinda slaps most other politicians out of the water, even Trump tbh

2019-09-25 12:52:44 UTC  

The purpose is relative

2019-09-25 12:52:49 UTC  

Garbage, networked systems aren't confined to their own local hardware

2019-09-25 12:52:51 UTC  

But if we build it on a quantum computer then it might as well have endless processing power

2019-09-25 12:52:53 UTC  

I'm a misanthrope. That's pretty much all the logic I need.

2019-09-25 12:53:01 UTC  

Someone use the stamps example on him please

2019-09-25 12:53:01 UTC  

@Samaritan™
Artificial Intelligence
is not
magic