Learning techniques
Large Language Models (LLMs) Concepts
Vidhi Chugh
AI strategist and ethicist
Where are we?
Getting beyond data constraints
Fine-tuning: training a pre-trained model for a specific task
But what if there is little to no labeled data?
N-shot learning: zero-shot, few-shot, and multi-shot
Transfer learning
Learn from one task and transfer that knowledge to a related task
Transferring knowledge from piano to guitar
Reading musical notes
Understanding rhythm
Grasping musical concepts
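The piano-to-guitar idea can be sketched in code: a "pretrained" feature extractor is kept frozen, and only a small linear head is trained on the target task. This is a minimal toy illustration; the extractor, the data, and the learning rate are all assumptions for the demo, not a real pretrained model.

```python
# Transfer learning sketch: freeze the pretrained base, train only the head.

def pretrained_features(x):
    """Frozen feature extractor 'learned' on a source task (hypothetical)."""
    return [x, x * x]  # e.g., raw value and its square

def train_head(data, lr=0.01, epochs=2000):
    """Fit only the head parameters on the target task; the base stays frozen."""
    w = [0.0, 0.0]
    b = 0.0
    for _ in range(epochs):
        for x, y in data:
            feats = pretrained_features(x)
            pred = sum(wi * fi for wi, fi in zip(w, feats)) + b
            err = pred - y
            # gradient step on head parameters only -- the extractor is untouched
            w = [wi - lr * err * fi for wi, fi in zip(w, feats)]
            b -= lr * err
    return w, b

# Target task: y = 2*x + 1, learned from a handful of labeled examples
data = [(0, 1), (1, 3), (2, 5), (3, 7)]
w, b = train_head(data)
pred = sum(wi * fi for wi, fi in zip(w, pretrained_features(4))) + b
```

Because only the head is updated, the target task needs far fewer examples than training the whole model from scratch.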
N-shot learning
Zero-shot - no task-specific data
Few-shot - little task-specific data
Multi-shot - relatively more training data
Zero-shot learning
No explicit training
Uses language understanding and context
Generalizes without any prior examples
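In practice, zero-shot use of an LLM often comes down to how the request is phrased: the prompt states the task but contains no solved examples. A minimal sketch of building such a prompt (the `zero_shot_prompt` helper and the wording are assumptions for illustration):

```python
def zero_shot_prompt(task, text):
    """Build a prompt with no examples; the model must rely on
    its pretrained language understanding alone."""
    return f"{task}\n\nText: {text}\nAnswer:"

prompt = zero_shot_prompt(
    "Classify the sentiment of the text as positive or negative.",
    "The concert was absolutely wonderful.",
)
```

The model sees only the instruction and the input, so any correct answer comes purely from generalization, not from task-specific training data.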
Few-shot learning
Learn a new task with a few examples
One-shot learning: fine-tuning from one example
Prior knowledge to answer new question
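A few-shot prompt simply prepends the handful of labeled examples before the new query, so the model can infer the task format from them. A sketch under the same assumptions as above (the `few_shot_prompt` helper and example texts are made up for illustration):

```python
def few_shot_prompt(task, examples, query):
    """Show a few labeled examples, then ask the model to label a new input."""
    lines = [task, ""]
    for text, label in examples:
        lines.append(f"Text: {text}\nAnswer: {label}")
    lines.append(f"Text: {query}\nAnswer:")
    return "\n".join(lines)

examples = [
    ("The movie was fantastic.", "positive"),
    ("The service was terribly slow.", "negative"),
]
prompt = few_shot_prompt(
    "Classify the sentiment as positive or negative.",
    examples,
    "I loved the food.",
)
```

With a single entry in `examples`, the same helper produces a one-shot prompt.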
Multi-shot learning
Requires more examples than few-shot
Previous tasks, plus new examples
For example, a model trained on Golden Retrievers is given new examples of a related breed
Multi-shot learning
Model output: Labrador Retriever
Saves time in collecting and labeling data
No compromise on accuracy
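To see why extra examples help, here is a toy illustration (deliberately not an LLM): a 1-nearest-neighbour classifier whose accuracy improves when each class gets a few more labeled points. All coordinates and labels are invented for the demo.

```python
# Toy multi-shot demo: more labeled examples per class -> better accuracy.

def nearest_label(point, labeled):
    """Return the label of the closest labeled point (1-nearest-neighbour)."""
    def dist2(a, b):
        return (a[0] - b[0]) ** 2 + (a[1] - b[1]) ** 2
    return min(labeled, key=lambda lp: dist2(point, lp[0]))[1]

# Class A sits near (0, 0), class B near (4, 4).
# One-shot: a single example per class; the B example is atypical.
one_shot = [((0.0, 0.0), "A"), ((2.0, 2.0), "B")]
# Multi-shot: the same examples plus a few more per class.
multi_shot = one_shot + [((1.0, 0.5), "A"),
                         ((4.0, 4.0), "B"), ((4.5, 3.5), "B")]

test_points = [((1.5, 1.0), "A"), ((0.3, 0.4), "A"),
               ((3.8, 4.2), "B"), ((2.8, 3.0), "B")]

def accuracy(labeled):
    hits = sum(nearest_label(p, labeled) == y for p, y in test_points)
    return hits / len(test_points)
```

With only one (poorly placed) example per class the classifier mislabels a borderline point; the extra multi-shot examples fix it, which is the intuition behind "more examples, better generalization."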
Building blocks so far
Data preparation workflow
Fine-tuning
N-shot learning techniques
Next up: pre-training
Let's practice!