LLM Lecture: A Deep Dive into Transformers, Prompts, and Human Feedback

The first 500 people to use my link will receive a one-month free trial of Skillshare! Get started today! https://skl.sh/aicoffeebreakwithletitia01251
This video is an educational and historical deep dive into LLM research. We want to show you that while ChatGPT seemingly emerged overnight, it is built on years of foundational work and innovation in LLM technology.

AI Coffee Break Merch! https://aicoffeebreak.creator-spring.com/

Thanks to our Patrons who support us in Tiers 2, 3, and 4:
Dres. Trost GbR, Siltax, Vignesh Valliappan, Michael, Sunny Dhiana, Andy Ma

Outline:
00:00 Lecture contents
01:30 Skillshare (Sponsor)
02:54 The Transformer
05:49 Tokenization
10:11 The Transformer layer
14:04 Attention
22:31 Position embeddings
27:42 Residual connections
29:55 How transformers learn language
34:40 Training Decoders (e.g., GPT)
40:46 Decoder inference / test time
43:55 Encoders (e.g., BERT)
47:45 Encoder-Decoders (e.g., T5)
53:24 Why LLMs need prompting
57:00 Vanilla prompting
01:01:35 Prompt tuning
01:03:03 In context (few-shot) learning
01:07:50 Chain-of-Thought
01:10:25 Retrieval-Augmented Generation (RAG)
01:12:14 Beyond pre-training: Post-training techniques
01:13:50 Instruction Tuning
01:16:00 Preference Tuning with Human Feedback via DPO
01:19:20 RLHF key idea
01:21:14 Benchmarking LLMs: Emergent capabilities and how (not) to measure them
01:24:50 Multimodal extensions of LLMs
01:28:04 Gopnik’s Parable of Stone Soup AI

Alison Gopnik's Stone Soup AI: https://simons.berkeley.edu/news/stone-soup-ai
Alexei Efros's talk at the @LaureateForum : https://youtu.be/K0gavpe0uDo?si=FdsURxkV39fyU1NV

▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀
Optionally, pay us a coffee to help with our Coffee Bean production! ☕
Patreon: https://www.patreon.com/AICoffeeBreak
Ko-fi: https://ko-fi.com/aicoffeebreak
Join this channel as a Bean Member to get access to perks:
https://www.youtube.com/channel/UCobqgqE4i5Kf7wrxRxhToQA/join
▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀▀

Links:
AICoffeeBreakQuiz: https://www.youtube.com/c/AICoffeeBreak/community
Twitter / X: https://twitter.com/AICoffeeBreak
LinkedIn: https://www.linkedin.com/in/letitia-parcalabescu/
Threads: https://www.threads.net/@ai.coffee.break
Bluesky: https://bsky.app/profile/aicoffeebreak.bsky.social
Reddit: https://www.reddit.com/r/AICoffeeBreak/
YouTube: https://www.youtube.com/AICoffeeBreak
Substack: https://aicoffeebreakwl.substack.com/
Web: https://explanationmark.de/letitia
https://aicoffeebreak.com

#AICoffeeBreak #MsCoffeeBean #MachineLearning #AI #research

Video editing: Nils Trost