
How LLMs Are Built: From Neural Networks to ChatGPT

An AI-generated course for the Machine Learning & AI track covering eight progressive modules: neural network fundamentals, recurrent neural networks, attention mechanisms, the Transformer architecture, language model pre-training and fine-tuning, scaling from GPT-1 to GPT-3 and beyond, aligning LLMs into ChatGPT, and ethical considerations and future directions.

Beginner · 30 lessons · 1,717 questions
Download Tomadora to start →

What you'll learn

This course is part of the Machine Learning & AI track on Tomadora. It covers 8 progressive modules with 30 bite-sized lessons, totalling 1,717 interactive questions, including flashcards, multiple choice, true/false, typing, matching, and fill-in-the-blank.

Course syllabus

Module 1: The Building Blocks - Neural Network Fundamentals

Introduction to artificial neurons, perceptrons, multi-layer perceptrons, activation functions, and the backpropagation algorithm. Understanding how neural networks learn from data.
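To make the idea concrete, here is a minimal sketch of a single artificial neuron in plain Python (the function names and weight values are illustrative, not taken from the course material):

```python
import math

def sigmoid(z: float) -> float:
    """Squash any real number into the range (0, 1)."""
    return 1.0 / (1.0 + math.exp(-z))

def neuron(inputs, weights, bias):
    """One artificial neuron: a weighted sum of its inputs plus a bias,
    passed through a non-linear activation function."""
    z = sum(x * w for x, w in zip(inputs, weights)) + bias
    return sigmoid(z)

# A neuron with two inputs produces one activation between 0 and 1.
print(neuron([1.0, 0.5], [0.4, -0.2], 0.1))
```

Training with backpropagation then amounts to nudging `weights` and `bias` to reduce the error of this output, layer by layer.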

Module 2: Processing Sequences - Recurrent Neural Networks

Exploring the challenges of sequential data. Introduction to Recurrent Neural Networks (RNNs), their limitations, and more advanced architectures like LSTMs and GRUs for capturing dependencies in sequences.
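The defining feature of an RNN, a hidden state that carries information from earlier steps forward, can be sketched with a single recurrent unit (weights here are arbitrary example values):

```python
import math

def rnn_step(x, h_prev, w_x, w_h, b):
    """One step of a minimal single-unit RNN: the new hidden state
    depends on the current input AND the previous hidden state."""
    return math.tanh(w_x * x + w_h * h_prev + b)

def run_sequence(xs, w_x=0.5, w_h=0.8, b=0.0):
    h = 0.0  # initial hidden state
    for x in xs:
        h = rnn_step(x, h, w_x, w_h, b)
    return h  # one number summarising the whole sequence

# Order matters: the same inputs in a different order give a different state.
print(run_sequence([1.0, 0.0, 0.0]))
print(run_sequence([0.0, 0.0, 1.0]))
```

LSTMs and GRUs keep this same recurrent structure but add gates that control what the hidden state remembers and forgets over long sequences.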

Module 3: The Power of Focus - Attention Mechanisms

Understanding the bottleneck of fixed-size context in traditional sequence models. Introduction to attention mechanisms and their role in allowing models to weigh the importance of different parts of the input sequence dynamically.
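At its core, attention turns a set of relevance scores into weights that sum to 1, then takes a weighted average. A minimal sketch in plain Python (scores and values are made-up numbers for illustration):

```python
import math

def softmax(scores):
    """Turn raw scores into weights that are positive and sum to 1."""
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def attend(scores, values):
    """Weighted average of the values, weighted by softmaxed scores:
    the position with the highest score dominates the output."""
    weights = softmax(scores)
    return sum(w * v for w, v in zip(weights, values))

# The first position has the highest score, so the output is pulled
# towards the first value.
print(attend([2.0, 0.1, 0.1], [10.0, 20.0, 30.0]))
```

In a real model, the scores are computed from the data itself, which is what lets the model decide dynamically which parts of the input to focus on.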

Module 4: The Transformer Architecture - A Paradigm Shift

Deep dive into the Transformer architecture, its self-attention mechanism, multi-head attention, positional encoding, and the encoder-decoder structure that revolutionized sequence modeling and laid the groundwork for modern LLMs.
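Self-attention can be sketched in a few lines if we simplify and let queries, keys, and values all equal the input vectors (a real Transformer learns separate projection matrices for each, which this sketch omits):

```python
import math

def softmax(scores):
    exps = [math.exp(s) for s in scores]
    total = sum(exps)
    return [e / total for e in exps]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def self_attention(xs):
    """Scaled dot-product self-attention with Q = K = V = inputs.
    Each output position is a weighted mix of ALL input positions,
    so every token can attend to every other token in one step."""
    d = len(xs[0])
    out = []
    for q in xs:
        scores = [dot(q, k) / math.sqrt(d) for k in xs]  # scaled dot products
        weights = softmax(scores)
        mixed = [sum(w * v[i] for w, v in zip(weights, xs)) for i in range(d)]
        out.append(mixed)
    return out

seq = [[1.0, 0.0], [0.0, 1.0], [1.0, 1.0]]
print(self_attention(seq))
```

Multi-head attention runs several copies of this in parallel with different learned projections, and positional encoding injects word-order information that this permutation-invariant operation would otherwise lose.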

Module 5: Language Model Training - Pre-training and Fine-tuning

How massive language models are built. Concepts of pre-training (e.g., masked language modeling, next-token prediction) on vast text corpora and subsequent fine-tuning for specific downstream tasks and domain adaptation.
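The next-token-prediction objective can be illustrated with a toy character-level model built from bigram counts (real LLMs learn these probabilities with a neural network over subword tokens, but the training signal, predict what comes next, is the same idea):

```python
from collections import Counter, defaultdict

def train_bigram(text):
    """A toy 'language model': count which character follows which."""
    counts = defaultdict(Counter)
    for a, b in zip(text, text[1:]):
        counts[a][b] += 1
    return counts

def predict_next(counts, ch):
    """Return the most frequently observed character after ch."""
    return counts[ch].most_common(1)[0][0]

model = train_bigram("hello hello hello")
print(predict_next(model, "h"))  # 'e'
```

Pre-training does this at enormous scale over vast text corpora; fine-tuning then continues training on a smaller, task-specific dataset.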

Module 6: Scaling Up - From GPT-1 to GPT-3 and Beyond

Exploring the impact of scale on LLM capabilities. Discussion of model sizes, data volume, and the emergent abilities observed in larger models, leading towards the GPT series and other large language models.

Module 7: Aligning LLMs - The Evolution to ChatGPT

Understanding the techniques used to align LLMs with human intent and values. Focus on Reinforcement Learning from Human Feedback (RLHF) and other methods crucial for developing conversational AI like InstructGPT and ChatGPT.

Module 8: Ethical Considerations and Future Directions

Addressing bias, fairness, transparency, and safety concerns in LLMs. A look into the current limitations and future research directions, including multimodal LLMs, agentic AI, and responsible AI development.

Frequently asked questions

What is the How LLMs Are Built: From Neural Networks to ChatGPT course?
How LLMs Are Built: From Neural Networks to ChatGPT is a beginner course on Tomadora covering 8 modules and 30 lessons. It is designed to be completed in 5-minute bursts during your work breaks, using a Pomodoro-style focus + learn cycle.
How long does How LLMs Are Built: From Neural Networks to ChatGPT take to finish?
Each lesson takes about 5 minutes. With 30 lessons, you can finish the course in roughly 2.5 hours of total learning time, spread across as many breaks as you like.
Is How LLMs Are Built: From Neural Networks to ChatGPT free?
Yes. Tomadora is free to download and the entire Machine Learning & AI track — including How LLMs Are Built: From Neural Networks to ChatGPT — is free to learn.
What level is How LLMs Are Built: From Neural Networks to ChatGPT?
How LLMs Are Built: From Neural Networks to ChatGPT is rated Beginner. No prior knowledge is required.
What language is How LLMs Are Built: From Neural Networks to ChatGPT taught in?
How LLMs Are Built: From Neural Networks to ChatGPT is taught in English.

More courses in Machine Learning & AI

Deep Learning from Scratch
Beginner · 11 lessons
Building AI Apps with LLMs
Beginner · 30 lessons
AI Ethics, Safety & Alignment
Beginner · 31 lessons
Research Papers That Shaped Modern AI
Beginner · 30 lessons