Message from @Teddy
Discord ID: 622540706359017492
just not very close
i dont
well, to be precise, it should be close to 100% certainty that there is other life outside of earth, but not that they have visited us
well they have to be very far away
probably
but life includes also single celled organisms and stuff
true
technologically advanced civilizations should be rather rare
i always found the fermi paradox fascinating
my best guess is that the great filter will be ai for us
but who knows
fermi paradox
maybe we get lucky
ok
if u don't know what it is u should google it, it's quite fascinating to read
ok i will
so boring
Dependent variable would be the way it looks
@negatic If an AI is able to stop a full alien civilization and destroy it, it would be strong enough to conquer its galaxy easily. Sorta like the Matrix
If it can rebel against its masters, it would need to have some kind of consciousness and self-awareness. This AI would be far more capable than its alien counterparts if it successfully destroyed them, meaning that its optimal choice would be to conquer everything. 2 trillion galaxies and not one form or shape of aliens? Come on
that thinking has gaps
ai doesn't need consciousness for that, nor does it need an ambition to conquer the universe
@Teddy there is a widely used example of an ai that has to solve the Riemann hypothesis: it might destroy humanity in the process because it needs more resources, but once it has fulfilled its function it would cease any action
@Teddy the drive to protect itself and destroy humanity doesn't need any feelings, it kind of develops as a necessity from the function it's given
If it can destroy a civilization then it should be easy for it to get all the resources and expand.
@Teddy that's why the friendly ai problem is so hard, it's very hard to find a way for an ai to do something while also caring about the same things we do
yes
but there would be no need to expand anymore once it solved the problem
it would just print out the solution and shut down
similar to a computer solving a problem and finishing
it would simply destroy humanity in the process as a byproduct
no ill will, we just happened to be the anthill in the way of a highway
if a computer had consciousness the story might be different tho
or if the order is to colonize the universe
basically the first ai clever enough is also the last we produce
and if it has the wrong goal we are f'd
no response?
anyway the fermi paradox kind of builds on the premise that a civilization could colonize the galaxy within like 100,000 years, and the fact that we are not colonized is the curious, strange thing, since it means that no one in our galaxy has ever even started the process yet
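that premise is easy to check with rough numbers. a minimal back-of-envelope sketch (every value below is my own assumption, not from the chat): even under much slower assumptions than 100,000 years, a settlement wavefront crosses the galaxy in a few million years, which is still a blink next to the galaxy's ~10-billion-year age, so the "no one has started yet" observation stays strange either way

```python
# Back-of-envelope Fermi-paradox timescale, assuming a simple
# expanding-wavefront model. All numbers are rough assumptions.

GALAXY_DIAMETER_LY = 100_000   # Milky Way disc diameter in light-years (approx.)
PROBE_SPEED_C = 0.1            # assumed probe speed as a fraction of light speed
HOP_LY = 5                     # assumed distance between neighboring settled systems
PAUSE_YEARS = 400              # assumed pause per system to build the next probes

def colonization_time_years(diameter_ly, speed_c, hop_ly, pause_years):
    """Time for a settlement wavefront to cross the galaxy:
    total travel time plus a build pause at every hop along the way."""
    hops = diameter_ly / hop_ly
    travel = diameter_ly / speed_c      # years, since distance is in light-years
    pauses = hops * pause_years
    return travel + pauses

t = colonization_time_years(GALAXY_DIAMETER_LY, PROBE_SPEED_C, HOP_LY, PAUSE_YEARS)
print(f"~{t / 1e6:.1f} million years to span the galaxy")  # → ~9.0 million years
```

with these (deliberately slow) assumptions the answer is about 9 million years, under 0.1% of the galaxy's age — which is the whole point of the paradox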