
LLM Foundations (LLM Bootcamp)



In this video, Sergey covers the foundational ideas for large language models: core ML, the Transformer architecture, notable LLMs, and pretraining dataset composition.

Download slides from the bootcamp website here: https://fullstackdeeplearning.com/llm-bootcamp/spring-2023/llm-foundations/

Intro and outro music made with Riffusion: https://github.com/riffusion/riffusion

Watch the rest of the LLM Bootcamp videos here: https://www.youtube.com/playlist?list=PL1T8fO7ArWleyIqOy37OVXsP4hFXymdOZ

00:00 Intro
00:47 Foundations of Machine Learning
12:11 The Transformer Architecture
12:57 Transformer Decoder Overview
14:27 Inputs
15:29 Input Embedding
16:51 Masked Multi-Head Attention
24:26 Positional Encoding
25:32 Skip Connections and Layer Norm
27:05 Feed-forward Layer
27:43 Transformer hyperparameters and why they work so well
31:06 Notable LLM: BERT
32:28 Notable LLM: T5
34:29 Notable LLM: GPT
38:18 Notable LLM: Chinchilla and Scaling Laws
40:23 Notable LLM: LLaMA
41:18 Why include code in LLM training data?
42:07 Instruction Tuning
46:34 Notable LLM: RETRO
