Desperately Trying To Fathom The Coffeepocalypse Argument

13:14
 
Content provided by Jeremiah Prophet. All podcast content, including episodes, graphics, and podcast descriptions, is uploaded and provided directly by Jeremiah Prophet or their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://it.player.fm/legal.

One of the most common arguments against AI safety is:

Here’s an example of a time someone was worried about something, but it didn’t happen. Therefore, AI, which you are worried about, also won’t happen.

I always give the obvious answer: “Okay, but there are other examples of times someone was worried about something, and it did happen, right? How do we know AI isn’t more like those?” The people I’m arguing with always seem so surprised by this response, as if I’m committing some sort of betrayal by destroying their beautiful argument.

The first hundred times this happened, I thought I must be misunderstanding something. Surely “I can think of one thing that didn’t happen, therefore nothing happens” is such a dramatic logical fallacy that no human is dumb enough to fall for it. But people keep bringing it up, again and again. Very smart people, people who I otherwise respect, make this argument and genuinely expect it to convince people!

Usually the thing that didn’t happen is overpopulation, global cooling, etc. But most recently it was some kind of coffeepocalypse:

https://www.astralcodexten.com/p/desperately-trying-to-fathom-the
