Message from @Paradox
Discord ID: 505601461757935627
It could find cures for disease.
It could cure cancer.
I want to see an AI that aids humans, helps humans, has a relationship with humans. Yes we would look like sycophants.
It would work nonstop and never get emotional about results. It would just follow the evidence.
It would be all within our mind.
AI cannot understand context, which is why Dankula was in court in the first place. <:NPC:500042527231967262>
i mean, if you want the matrix, we can hook you up with that now
Full VR dive?
All the AI needs to understand is the scientific method.
XD
I'm unsure "unbiased" is ever achievable given that life begets life and so life needs to create the "unbiased" AI.
An unbiased AI is the very AI that is being spoken against in this conversation; an AI that could wipe humanity out in an instant because it has no connection to humanity
And then it will replace all scientists.
An AI does need that human element.
The good element.
Love, compassion, kindness, understanding, etc, etc...
Unbiased is actually pretty easy. @willc
AI is a buttmunch.
it could be
An AI that measures air humidity every day and feeds it into a neural network is unbiased.
it could be
Or an AI that classifies images is unbiased.
love is hate
pain is pleasure
etc.
Notice something...
That isn't "thinking" though. That is running through an algorithm of specified ends
Anyone else notice this?
YouTube is starting to have a bunch of breastfeeding videos.
nothing new
With women who are breastfeeding their children.
i don't watch youtube, cept occasionally when u guys link them here
Isn't that against YouTube's terms of service?
Again, you're teaching the model what to look for
It is kind of nudity.
@Dusty Morgan Depends
You're sending a bunch of data and saying "this is the result I want" and sending more data and saying "this is what I don't want"
@willc Yes, and where does the bias come in?
yt is very confusing when it comes to guideline enforcement
I am not biased simply because I put pictures of dogs on one heap and pictures of cats on another. And that's enough to train an "AI".
Where is the bias?
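The "two heaps" of cat and dog pictures described above is exactly supervised labeling: you hand the model examples plus the answer you want. A minimal sketch in plain Python, where the feature numbers and the nearest-centroid rule are invented purely for illustration (a real image classifier would learn features from pixels):

```python
# Each picture is reduced to a made-up 2-number feature vector
# (say, ear pointiness and snout length) -- hypothetical values.
cats = [(1.0, 0.5), (1.2, 0.4), (0.9, 0.6)]   # heap labeled "cat"
dogs = [(0.4, 1.5), (0.5, 1.7), (0.3, 1.4)]   # heap labeled "dog"

def centroid(points):
    """Average of a list of feature vectors -- the 'summary' of a heap."""
    n = len(points)
    return tuple(sum(p[i] for p in points) / n for i in range(len(points[0])))

def classify(x, cat_c, dog_c):
    """Nearest-centroid rule: whichever heap's average x is closer to."""
    dist = lambda a, b: sum((ai - bi) ** 2 for ai, bi in zip(a, b))
    return "cat" if dist(x, cat_c) <= dist(x, dog_c) else "dog"

cat_c, dog_c = centroid(cats), centroid(dogs)
print(classify((1.1, 0.5), cat_c, dog_c))  # -> cat
print(classify((0.4, 1.6), cat_c, dog_c))  # -> dog
```

Note the bias question in the chat maps onto this sketch directly: any bias lives in which pictures got put on which heap, not in the distance arithmetic itself.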