Message from @Czepa

Discord ID: 551247276895633422


2019-03-02 03:26:23 UTC  

this is the real trusth

2019-03-02 03:26:34 UTC  

corn is a weapon for geopolitics

2019-03-02 03:26:39 UTC  

follow the money

2019-03-02 03:27:19 UTC  

GMO food is toxic and causes lots of different problems, so its marginal economic benefits are simply not worth it

2019-03-02 03:27:48 UTC  

@brocks Did you say trusth on purpose? ;P

2019-03-02 03:27:57 UTC  

cutbolt's speaking through the pinhole camera hole <:doom:532377533509926912>

2019-03-02 03:28:26 UTC  

“We’ve really built a system where we’re exporting grains in large numbers in order to have some influence around the world, not just with our friends but also with our enemies, by controlling the food supply,”

2019-03-02 03:28:30 UTC  

lol no typo

2019-03-02 03:28:58 UTC  

reeeaaaaalllllyyyyy <:doom2:534340313091670016>

2019-03-02 03:29:26 UTC  

i have crippling depression <:sadcat:511590629856378901>

2019-03-02 03:31:53 UTC  

_drinks a tall, cool glass of climate change_

2019-03-02 03:38:15 UTC  

gesundheit

2019-03-02 03:38:25 UTC  

*"Bees use abstract thought and symbolic language."* - Jon Lieff, M.D. - *<http://jonlieffmd.com/blog/the-remarkable-bee-brain-2>*

2019-03-02 03:39:10 UTC  

i like this topic <:wave:511590630384992257>

2019-03-02 03:39:52 UTC  

stay away from here woz. take your trolls with you dickhead

2019-03-02 03:40:50 UTC  

**Roko's Basilisk**:
*"In July 2010, LessWrong contributor Roko posted a thought experiment similar to Pascal's Wager to the site in which an otherwise benevolent future AI system tortures simulations of those who did not work to bring the system into existence. This idea came to be known as "Roko's basilisk," based on Roko's idea that merely hearing about the idea would give the hypothetical AI system stronger incentives to employ blackmail."* - *<https://en.wikipedia.org/wiki/LessWrong#Roko's_basilisk>*

2019-03-02 03:42:02 UTC  

topic of roundtable 10:40 - 11:20ish or longer, Hyperintelligent AI and Roko's Basilisk

2019-03-02 03:46:37 UTC  

An excerpt from Vernor Vinge's (originator of the concept of the Technological Singularity) Essay, "The Coming Technological Singularity: How to Survive in the Post-Human Era" (source: *<http://www.aleph.se/Trans/Global/Singularity/sing.html>*):

https://cdn.discordapp.com/attachments/550157224782331914/551248977283252234/Screen_Shot_2019-03-01_at_7.44.49_PM.png

2019-03-02 03:47:07 UTC  

https://cdn.discordapp.com/attachments/550157224782331914/551249100440469517/STROVIA.png

2019-03-02 03:48:09 UTC  

https://cdn.discordapp.com/attachments/550157224782331914/551249359023505409/20190223_034822-1.jpg

2019-03-02 03:48:14 UTC  

URL to essay added ^

2019-03-02 03:56:38 UTC  

you guys are speculating way too much about AI. nobody fucking knows right now

2019-03-02 03:57:11 UTC  

Tehe

2019-03-02 03:57:25 UTC  

the greatest experts in AI would think you guys are way too naive to be making such strong postulations

2019-03-02 03:57:40 UTC  

none of it is predictable atm

2019-03-02 03:59:48 UTC  

Moore's law is irrelevant to machine learning

2019-03-02 04:01:00 UTC  

the problem is data, more than it is the transistor density

2019-03-02 04:01:34 UTC  

FPGAs are huge for the workloads they're designed for

2019-03-02 04:02:47 UTC  

CPUs are stalling in terms of raw processing power

2019-03-02 04:03:15 UTC  

trinary is not quantum computing

2019-03-02 04:03:23 UTC  

yea, that is correct

2019-03-02 04:04:15 UTC  

oops lmao

2019-03-02 04:05:24 UTC  

https://cdn.discordapp.com/attachments/550157224782331914/551253702334480404/image0-3.jpg

2019-03-02 04:05:42 UTC  

quantum computing is useful for a very select number of problems