Episode 8: How Do Large Language Models Learn? Training the Brain Behind ChatGPT

Image: computer server racks overlaid with a digital brain and library books.

Welcome back to Mr. Fred’s Tech Talks! In this episode, we continue our Artificial Intelligence series and dive into the big question: How do Large Language Models actually learn?

If Episode 6 explained what AI is and Episode 7 showed us what Large Language Models are, Episode 8 is all about the training process, the hardware behind it, and what can go wrong when training goes sideways.


🔑 What You’ll Learn in This Episode:

  • How Large Language Models are trained step by step
  • Why tokens are the LEGO bricks of AI
  • The role of GPUs, servers, and massive data centers in powering AI
  • A library analogy that makes servers easy to understand
  • What happens when training goes bad: bias, hallucinations, and overfitting
  • A fun Tech Tip experiment you can try with ChatGPT to see it in action

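To make the "LEGO bricks" idea concrete, here's a toy sketch in Python. It's a deliberate simplification: real LLMs use subword tokenizers (like Byte Pair Encoding) rather than splitting on spaces, but the core idea is the same — text gets chopped into pieces, and each piece gets a number.

```python
# Toy illustration: breaking text into "tokens" (the LEGO bricks of AI).
# Real LLMs use subword tokenizers, so a word like "walking" may split
# into pieces like "walk" + "ing". Splitting on whitespace here is a
# deliberate simplification for teaching purposes.
sentence = "ChatGPT predicts the next token"
tokens = sentence.split()
print(tokens)  # ['ChatGPT', 'predicts', 'the', 'next', 'token']

# Before training, each token is mapped to a number (a token ID).
vocab = {tok: i for i, tok in enumerate(sorted(set(tokens)))}
ids = [vocab[tok] for tok in tokens]
print(ids)
```

The model never sees letters or words directly — only these numbers, which is why tokens are the real building blocks of everything an LLM does.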
🖥️ Key Highlights:

  • Training requires billions of practice rounds of “guess the next word.”
  • GPUs and servers in racks work together inside warehouse-sized data centers, using as much electricity as a small town.
  • Data quality matters—bad training leads to biased answers, made-up facts, or brittle models.
  • ChatGPT isn’t “thinking”—it’s predicting tokens, one after another.

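Here's a tiny sketch of "guess the next word" in action. A real LLM does this with a neural network trained on billions of tokens; this toy version just counts which word follows which in a small text, then predicts the most common follower — but the spirit of the prediction game is the same.

```python
from collections import Counter, defaultdict

# Toy "guess the next word" trainer: count which word follows which
# in a tiny text. Real LLMs learn these patterns with neural networks
# over billions of tokens; counting here is only a sketch of the idea.
text = "the cat sat on the mat the cat ran on the grass"
words = text.split()

followers = defaultdict(Counter)
for current, nxt in zip(words, words[1:]):
    followers[current][nxt] += 1

def predict_next(word):
    # Pick the word most frequently observed after this one.
    return followers[word].most_common(1)[0][0]

print(predict_next("the"))  # 'cat' (it follows "the" most often)
print(predict_next("on"))   # 'the'
```

Notice the model isn't "thinking" about cats or grass — it's just picking the statistically likely next token, which is exactly why it can sound confident while being wrong.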
💡 Tech Tip of the Episode:

Ask ChatGPT a simple question you know the answer to, then give it a twist.

Examples:

  • “Who was the first person to walk on the moon?” → Neil Armstrong
  • “Who was the first person to walk on the sun?” → Watch how it tries to “make sense” of nonsense.

This experiment shows how ChatGPT predicts patterns—not truth.

🌐 Join the Conversation

What’s the wildest or funniest “hallucination” you’ve seen from ChatGPT? Drop a comment below or connect with me on social media.
