
Content provided by EPIIPLUS 1 Ltd / Azeem Azhar and Azeem Azhar. All podcast content, including episodes, artwork and podcast descriptions, is uploaded and provided directly by EPIIPLUS 1 Ltd / Azeem Azhar and Azeem Azhar or by their podcast platform partner. If you believe someone is using your copyrighted work without your permission, you can follow the process described here: https://it.player.fm/legal.

AI in 2025 – A global perspective, with Kai-Fu Lee

50:23
 

Kai-Fu Lee joins me to discuss AI in 2025. Kai-Fu is a storied AI researcher, investor, inventor and entrepreneur based in Taiwan. As he is one of the leading AI experts based in Asia, I wanted to get his take on this particular market.

Key insights:

  • Kai-Fu noted that unlike the singular “ChatGPT moment” that stunned Western audiences, the Chinese market encountered generative AI in a more “incremental and distributed” fashion.
  • A particularly fascinating shift is how Chinese enterprises are adopting generative AI. Without the entrenched SaaS layers common in the US, Chinese companies are “rolling their own” solutions. This deep integration might be tougher and messier, but it encourages thorough, domain-specific implementations.
  • We reflected on a structural shift in how we think about productivity software. With AI “conceptualizing” the document and the user providing strategic nudges, it’s akin to reversing the traditional creative process.
  • We’re moving from a training-centric world to an inference-centric one. Models need to be cheaper, faster and less resource-intensive to run, not just to train. For instance, his team at ZeroOne.ai managed to train a top-tier model on “just” 2,000 H100 GPUs and bring inference costs down to 10 cents per million tokens, a fraction of GPT-4’s early costs (a rough cost comparison is sketched after this list).
  • In 2025, Kai-Fu predicts, we’ll see fewer “demos” and more “AI-first” applications deploying text, image and video generation tools into real-world workflows.
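
To make the inference figure concrete, here is a back-of-envelope comparison. It is a minimal illustrative sketch: the 10-cents-per-million-tokens rate comes from the episode, while the roughly $30 per million input tokens used for early GPT-4 pricing and the 1-billion-token workload are assumptions made here purely for comparison.

    # Back-of-envelope inference cost comparison (illustrative only).
    # The $0.10 per million tokens rate is the figure cited in the episode;
    # the early GPT-4 price and the workload size below are assumptions.

    ZERO_ONE_COST_PER_M_TOKENS = 0.10    # USD per million tokens (from the episode)
    GPT4_EARLY_COST_PER_M_TOKENS = 30.0  # USD per million input tokens (assumed)

    def workload_cost(total_tokens: int, cost_per_million: float) -> float:
        """Return the USD cost of running total_tokens at the given per-million rate."""
        return total_tokens / 1_000_000 * cost_per_million

    tokens = 1_000_000_000  # hypothetical 1-billion-token workload
    cheap = workload_cost(tokens, ZERO_ONE_COST_PER_M_TOKENS)        # $100
    expensive = workload_cost(tokens, GPT4_EARLY_COST_PER_M_TOKENS)  # $30,000
    print(f"${cheap:,.0f} vs ${expensive:,.0f} ({expensive / cheap:.0f}x cheaper)")

Under these assumptions the same workload costs about $100 instead of about $30,000, roughly a 300x difference, which is why cheaper inference matters more than cheaper training once models are deployed at scale.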

Connect with us:

