the-temple-of-veethena-nike_general

Discord ID: 633966934622208031


547,842 total messages. Viewing 100 per page.
Prev | Page 1493/5479 | Next

2019-11-05 11:11:06 UTC

an artificial intelligence will run on different hardware

2019-11-05 11:11:20 UTC

no shit

2019-11-05 11:11:23 UTC

but if the algorithm is a FUNCTION of specific hardware

2019-11-05 11:11:30 UTC

it can't self-seed

2019-11-05 11:11:37 UTC

NNs will also not have an empathy mechanism like we do

2019-11-05 11:11:37 UTC

bio-neurons have a workaround for that to achieve prioritization, but in an atomized sense yeah.

2019-11-05 11:11:54 UTC

NNs lack the gut bacteria influencing their decisions with serotonin

2019-11-05 11:11:56 UTC

that is a meta detail, way out of scope

2019-11-05 11:11:57 UTC

please define self-seed

2019-11-05 11:12:02 UTC

of course it won't have empathy

2019-11-05 11:12:14 UTC

what is nns

2019-11-05 11:12:15 UTC

nanodes?

2019-11-05 11:12:22 UTC

Neural Networks

2019-11-05 11:12:25 UTC

o

2019-11-05 11:12:28 UTC

What if the next step in AI is to make it run on human bodies and brains?

2019-11-05 11:12:47 UTC

@Irreversal Neural Networks. A simulated neuron network of an actual brain function

2019-11-05 11:13:04 UTC

neural networks could have empathy, it depends on how we seed them

2019-11-05 11:13:07 UTC

heard of it before i think

2019-11-05 11:13:12 UTC

self-seeding means that a mechanism or algorithm for evolving intelligence lacks the boundary elements and convention to evolve that algorithm

2019-11-05 11:13:34 UTC

to split hairs on that and say it is illusory empathy is an odd choice, because you could do the exact same thing with humans

2019-11-05 11:14:23 UTC

@Kealor the underlying reason for that empathy though will be a set of rules in a file, not the complicated instinct that needs an interplay between a limbic system, several other glands, and gut bacteria giving you extra chemicals...

2019-11-05 11:14:24 UTC

the boundary elements and convention for intelligence are simply an extension of the brain's need to organize and compress stimulus information without the processing cost blowing up combinatorially with every new input

2019-11-05 11:14:31 UTC

do you mean "a lack of self-seeding --"?

2019-11-05 11:14:31 UTC

Damnit, I'm trying to hold on, but need to shut down for at least a few hours. >.<; Cheers.

2019-11-05 11:15:07 UTC

@ManAnimal you can make a NN that is self-correcting and/or self-evolving

2019-11-05 11:15:27 UTC

@Uksio yes, but it could still be manifested without chemical imbalances. the mechanism would be different but it would result in a behaviour pattern rather similar to empathy

2019-11-05 11:15:47 UTC

self-seeding involves two factors: 1) optimization criteria and their selection and 2) the ability of any algorithm to self-organize new input which lacks an initial schema

2019-11-05 11:15:52 UTC

Poor lauci. I feel bad for a person that is left out because of ignorance on the topic

2019-11-05 11:15:53 UTC

in the same way that other animals engage in hierarchies but lack a system for serotonin

2019-11-05 11:16:16 UTC

True

2019-11-05 11:16:50 UTC

and it is conceptually possible for machines to achieve that phenomena

2019-11-05 11:16:57 UTC

But that is not an impossibility, merely systems are not there yet

2019-11-05 11:16:57 UTC

Hahaha. No. But even though you brought up a good counterpoint, MMI has the risk of overtaxing the biological components.

2019-11-05 11:17:12 UTC

it is far easier to optimize a specific task; it is another thing entirely to select a goal and then decide what task to pursue

2019-11-05 11:17:12 UTC

Anyhow, now I actually sleep.

2019-11-05 11:17:32 UTC

Sweet dreams.

2019-11-05 11:17:40 UTC

it's a mathematical limitation though

2019-11-05 11:17:42 UTC

Let your brain sort your memories in peace

2019-11-05 11:17:53 UTC

but do you acknowledge that you are essentially claiming that because we don't know how to make true AI now, true AI is impossible

2019-11-05 11:18:05 UTC

do you know Big-O notation in regards to relating number of inputs to processing time?
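(For context: a minimal sketch of what Big-O notation captures here, counting operations for a linear scan versus a pairwise comparison over the same input; the function names are illustrative, not from the conversation.)

```python
# Toy illustration of Big-O growth: how the number of operations
# scales with the number of inputs n.

def linear_ops(items):
    """O(n): touch each input once."""
    count = 0
    for _ in items:
        count += 1
    return count

def pairwise_ops(items):
    """O(n^2): compare every input with every other input."""
    count = 0
    for a in items:
        for b in items:
            count += 1
    return count

n = 1000
print(linear_ops(range(n)))    # 1000
print(pairwise_ops(range(n)))  # 1000000
```

Doubling the input doubles the work in the first case but quadruples it in the second, which is the kind of blow-up being discussed.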

2019-11-05 11:18:20 UTC

It is a mathematical limit for your *current* hardware

2019-11-05 11:18:23 UTC

that is not what i am implying at all

2019-11-05 11:18:34 UTC

yes

2019-11-05 11:18:36 UTC

it isn't a limitation

2019-11-05 11:18:42 UTC

New and much more capable hardware will not be affected by said limit

2019-11-05 11:18:43 UTC

it's a design decision

2019-11-05 11:19:07 UTC

evolution suffers from the phrase, "you cannot get there from here"

2019-11-05 11:19:14 UTC

Google recently built and demonstrated a freaking quantum chip

2019-11-05 11:19:24 UTC

a NN models an intermediate step

2019-11-05 11:19:30 UTC

It did 10 000 computational years of work in a few minutes

2019-11-05 11:19:35 UTC

yes computational self-development will be much more powerful when it is achieved

2019-11-05 11:19:49 UTC

preselected optimized criteria

2019-11-05 11:19:58 UTC

i haven't actually looked into that yet

2019-11-05 11:20:05 UTC

has google actually proven it sufficiently?

2019-11-05 11:20:14 UTC

10 000 years. That chip can crack your high-security encryption by brute force

2019-11-05 11:20:20 UTC

we can ALREADY process discrete info in superior fashion to the human brain

2019-11-05 11:20:51 UTC

further pursuing the route won't arrive at intelligence; only faster computation of pre-selected criteria

2019-11-05 11:20:52 UTC

It is a very specialised set of tasks for the computers

2019-11-05 11:21:00 UTC

Our brains are the opposite

2019-11-05 11:21:04 UTC

intelligence is the SELECTION of criteria

2019-11-05 11:21:05 UTC

Very generalised

2019-11-05 11:21:07 UTC

not application

2019-11-05 11:21:09 UTC

the human mind "can" process discrete information in an efficient manner, we just don't have nearly as direct an access to those mechanisms as a computer does

2019-11-05 11:21:21 UTC

sure

2019-11-05 11:21:28 UTC

it manifests in how autists "visualize" mathematical phenomena

2019-11-05 11:21:39 UTC

but a computer can't come CLOSE to the parallel processing ability of the human brain

2019-11-05 11:21:46 UTC

no contest

2019-11-05 11:22:03 UTC

a neural network is the result of intelligence

2019-11-05 11:22:07 UTC

the parts of the brain wired for optical calculations are extremely sophisticated, and it's suspected that their brains bypass the normal mental methods to work through mathematics using those same processes

2019-11-05 11:22:07 UTC

not the source

2019-11-05 11:22:19 UTC

that is a massive assertion

2019-11-05 11:22:24 UTC

and that's the disagreement

2019-11-05 11:22:25 UTC

my fucking god this _name in winders kills me

2019-11-05 11:22:55 UTC

most see the brain as being like a computer, each portion having specific hardware

2019-11-05 11:23:03 UTC

The machine just does not do searches with normal spaces and blaNAMEbla formats

2019-11-05 11:23:05 UTC

rather than the brain being like a memory register

2019-11-05 11:23:10 UTC

winders is dumb this way

2019-11-05 11:23:56 UTC

where the hardware is the same, but the parts of memory which handle certain functions are because of a running kernel

2019-11-05 11:24:01 UTC

that is an empirically defined system though

2019-11-05 11:24:20 UTC

there is no way to tell these apart empirically

2019-11-05 11:24:28 UTC

bridging that with the constructed system of computational processing is likely what will lead to the development of ai, rather than sophisticated illusions

2019-11-05 11:24:37 UTC

agreed

2019-11-05 11:24:45 UTC

singularity

2019-11-05 11:24:47 UTC

just because it hasn't been bridged yet doesn't mean a bridge is impossible, as you seem to be claiming

2019-11-05 11:24:57 UTC

why can't a NN be the source and a result of intelligence?
That's literally a description of an evolutionary lööp

2019-11-05 11:25:01 UTC

which is one of the key reasons i think they are greatly exaggerating

2019-11-05 11:25:27 UTC

because a NN doesn't self-seed, nor can it handle selection of optimization criteria

2019-11-05 11:25:31 UTC

or at least, by virtue of them being different, to assert that they fundamentally cannot be merged requires more than just the assertion

2019-11-05 11:25:55 UTC

how does it not self-seed?
There are NNs that create other NNs and evaluate them
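(For context: a minimal sketch of the kind of loop being described, in the spirit of neuroevolution. The "networks" are reduced to plain weight vectors and the objective is a toy function, so every name and number here is illustrative, not any company's actual system.)

```python
import random

random.seed(0)

def fitness(weights, target=(0.5, -0.3, 0.8)):
    """Toy objective: negative squared distance to a fixed target."""
    return -sum((w - t) ** 2 for w, t in zip(weights, target))

def mutate(weights, scale=0.1):
    """Spawn a child by perturbing each weight slightly."""
    return [w + random.gauss(0, scale) for w in weights]

# One "network" proposes variants of itself; the best-scoring variant
# replaces it whenever it improves (a simple (1+10) evolution strategy).
parent = [0.0, 0.0, 0.0]
for generation in range(200):
    children = [mutate(parent) for _ in range(10)]
    best = max(children, key=fitness)
    if fitness(best) > fitness(parent):
        parent = best

print(fitness(parent) > fitness([0.0, 0.0, 0.0]))  # True: the loop improved itself
```

Note the catch raised elsewhere in the thread: the fitness function (the optimization criterion) is still chosen by a human up front; the loop evolves solutions, not the criterion itself.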

2019-11-05 11:26:00 UTC

oh 100% everytime any company uses the term "AI" they are full of shit

2019-11-05 11:26:25 UTC

Neural networking is a phenomenon relatively detached from its namesake

2019-11-05 11:26:27 UTC

Constantly improving NNs that build a better version of themselves is literally what Amazon spends billions on

2019-11-05 11:26:28 UTC

because these are two separate processes and the input schema is always pre-defined

2019-11-05 11:27:06 UTC

the brain doesn't require you to know what you are looking for when you first encounter an instance

2019-11-05 11:27:18 UTC

btw Amazon throws out algorithms that are starting to make too many decisions

2019-11-05 11:27:19 UTC

lol

2019-11-05 11:27:41 UTC

you create a representation in its mind, and until you associate that with a definition, you only have its identity

2019-11-05 11:27:48 UTC

Imagine a poor algorithm being "I am self-sentient"
Amazon: delet

2019-11-05 11:28:23 UTC

machines are capable of the same behaviour

2019-11-05 11:28:43 UTC

behaviour?
