Message from @UnfilteredGarbage
Discord ID: 626399992956977152
It's not human or based on biology
It would not have the same conceptions of anything
it’s me lunemarie
@ubermensch that wasn't a cold, logical transition though, that was a transition based on the emotional desires of capitalist and militant populations
I bet you all thought I was going to say something long and thought out huh
@Samaritan™ its 'thinking' is basic computer logic though. Because that's how a computer works. The idea that we can't comprehend or explain how a collection of code works is ridiculous
svarozhyc Today at 8:26 AM
"it depends on its utility function ye and we can always negotiate with it provided it's an actual intelligence and not just a robot"
I don't believe it will have a utility function, unless defined.
Everything has utility in the eyes of the engineer
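To make the "utility function, unless defined" point concrete: a minimal sketch of an explicitly engineer-defined utility function driving action choice. All names here are hypothetical illustration, not any real AI framework.

```python
# The agent has no preferences except what the engineer writes into
# utility(); it "wants" whatever this function rewards.

def utility(state: dict) -> float:
    """Engineer-defined score over world states (made-up weights)."""
    return 2.0 * state["tasks_done"] - 1.0 * state["energy_used"]

def choose_action(state: dict, actions: dict) -> str:
    """Pick the action whose resulting state scores highest."""
    return max(actions, key=lambda name: utility(actions[name](state)))

state = {"tasks_done": 0, "energy_used": 0}
actions = {
    "work": lambda s: {"tasks_done": s["tasks_done"] + 1,
                       "energy_used": s["energy_used"] + 1},
    "idle": lambda s: dict(s),
}
print(choose_action(state, actions))  # "work": utility 1.0 beats 0.0
```

The "negotiate with it" question then reduces to whether the defined utility leaves room for trades at all.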
Hmmmm I say it’s a bit of both. At least in terms of the conclusions they came to in making said choices, from their ideological basis.
It may simply delete itself.
Cold logic extends from a person and what they consider logical
<:pot_of_kek:544849795433496586>
guys why don’t we all like 7 of us here make an AGI to test this out
If it is conscious it will develop different philosophies; the world is too subjective
Except flathead screws. Those are functionally inferior.
Sci-fi preachers always ignore that regardless of how advanced a computer gets, it can't "evolve" beyond its available hardware
That's not true
flathead is best
@Samaritan™ we are hundreds of years from developing a "conscious" ai
(X) Doubt
If you're going to dismiss Phillips head, just make it hex
We’re maybe 50-100 away
That just isn't how that works
All we have to do is create a dumb ai and tell it to improve itself
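A toy reading of "create a dumb AI and tell it to improve itself" is random hill climbing: try small changes, keep whatever scores better. Real recursive self-improvement is not this simple; the score function and names below are purely illustrative.

```python
import random

random.seed(0)  # make the toy run reproducible

def score(param: float) -> float:
    """Hypothetical performance measure the 'dumb AI' tries to raise."""
    return -(param - 3.0) ** 2  # best possible score is 0, at param == 3

param, best = 0.0, score(0.0)
for _ in range(1000):
    candidate = param + random.uniform(-0.5, 0.5)  # try a small change
    if score(candidate) > best:                    # keep only improvements
        param, best = candidate, score(candidate)

assert abs(param - 3.0) < 0.2  # the loop climbs close to the optimum
```

The catch the thread is circling: the loop only ever improves against the score it was given, not against what anyone meant by "improve".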
Looking at sci-fi, look how easy it was for Ultron to stray from the interpretation of “saving the world”
@lunemarie (x): doubt
Well I actually have a source for this
And facilitate that
Again, higher processing power doesn't beget consciousness
if AI gets to the point it self-actualises, it won't be restricted by its hardware as it will have the capacity to expand/rebuild its hardware
It's impossible to predict what the world will look like in 100 years, so maybe at that date it'd appear possible, but I don't expect AI in the next 100 years from where we are standing now
Higher processing power can actually be detrimental depending on existing hardware
Did you watch Boris's speech on tech?
Memory requires the creation of junk code
@Eccles no it won't. The computer will not magically grow arms and legs when it hits a certain point of processing power
It relies on cold logic. Logic implemented by a human, and logic that could lead it to alternatives like: kill humans to serve X purpose.
Right now, especially with quantum processing, junk code means memory is short term within seconds
@Eccles AI would have the ability to generate a compression system for storing itself many times more efficiently
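For what it's worth, the "store itself more efficiently" part is already mundane with stdlib compression; the payload below is a made-up stand-in for a program's own stored representation, not a claim about any AI inventing codecs.

```python
import zlib

# Hypothetical stand-in for a program's own source/model bytes.
source = b"def improve(self): return self  # imagine megabytes of code\n" * 100

packed = zlib.compress(source, level=9)  # shrink for storage
restored = zlib.decompress(packed)       # recover exactly what went in

assert restored == source                # lossless round trip
assert len(packed) < len(source)         # repetitive data compresses well
```

The open question in the thread isn't compression itself but whether a system could design a materially better scheme for its own representation.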
Go watch that