How Neural Networks Learn: Exploring Architecture, Gradient Descent, and Backpropagation

Neural networks drive many artificial intelligence applications today. This course will teach you what’s behind the magic—the dynamics of training neural networks, including backpropagation, gradient descent, and techniques for optimizing network performance.

  • Last updated 6/2024
  • English
  • Published 2025-04-24
  • Location Online
  • Duration 32m 24s
What you'll learn

So, you understand neural networks conceptually—what they are and generally how they work. But you might still be wondering about all the details that actually make them work.

In this course, How Neural Networks Learn: Exploring Architecture, Gradient Descent, and Backpropagation, you’ll gain an understanding of the details required to build and train a neural network.

First, you’ll explore network architecture—made up of layers, nodes, and activation functions—and compare architecture types.
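
As a preview of the activation functions covered in that module, here is a minimal sketch of how a single node combines its inputs and applies an activation. The specific inputs, weights, and bias values are illustrative, not from the course.

```python
import numpy as np

def relu(x):
    """Rectified linear unit: passes positive values through, zeroes out negatives."""
    return np.maximum(0.0, x)

def sigmoid(x):
    """Squashes any real-valued input into the range (0, 1)."""
    return 1.0 / (1.0 + np.exp(-x))

# A node's output is its activation function applied to a
# weighted sum of its inputs plus a bias term.
inputs = np.array([0.5, -1.2, 3.0])   # illustrative values
weights = np.array([0.4, 0.1, -0.6])  # illustrative values
bias = 0.2

z = np.dot(inputs, weights) + bias
print(relu(z), sigmoid(z))
```

Different activations suit different layers: ReLU is a common default for hidden layers, while sigmoid is typically reserved for outputs that represent probabilities.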

Next, you’ll discover how neural networks adjust and learn, using backpropagation, gradient descent, loss functions, and learning rates.
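
The core idea of gradient descent can be sketched in a few lines: repeatedly step a parameter against the gradient of the loss, scaled by the learning rate. The quadratic loss below is a toy example chosen for illustration, not one used in the course.

```python
# Toy loss: loss(w) = (w - 3)**2, minimized at w = 3.
# Its gradient with respect to w is 2 * (w - 3).
def loss_gradient(w):
    return 2.0 * (w - 3.0)

w = 0.0              # initial parameter guess
learning_rate = 0.1  # too large can diverge; too small learns slowly

for _ in range(50):
    w -= learning_rate * loss_gradient(w)  # step against the gradient

print(w)  # approaches the minimum at w = 3
```

In a real network, backpropagation computes this gradient for every weight via the chain rule, and the same update rule is applied to all of them at once.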

Finally, you’ll learn how to implement backpropagation and gradient descent using Python.
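
As a rough sketch of what such an implementation involves, the snippet below trains a single sigmoid neuron with backpropagation and gradient descent on a tiny made-up dataset (learn to output 1 when the input is positive). The data, learning rate, and iteration count are assumptions for illustration, not the course's own code.

```python
import numpy as np

# Toy training set: target is 1 when x > 0, else 0.
X = np.array([[-2.0], [-1.0], [1.0], [2.0]])
y = np.array([[0.0], [0.0], [1.0], [1.0]])

rng = np.random.default_rng(0)
w = rng.normal(size=(1, 1))  # randomly initialized weight
b = 0.0
lr = 0.5

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

for _ in range(2000):
    # Forward pass
    z = X @ w + b
    a = sigmoid(z)
    # Backward pass (chain rule): dL/dw = dL/da * da/dz * dz/dw
    # for mean squared error loss L = mean((a - y)**2)
    dL_da = 2.0 * (a - y) / len(X)
    da_dz = a * (1.0 - a)      # derivative of the sigmoid
    delta = dL_da * da_dz
    dw = X.T @ delta
    db = delta.sum()
    # Gradient descent update
    w -= lr * dw
    b -= lr * db

print(sigmoid(X @ w + b).round(2))  # predictions after training
```

The course's implementation covers the full picture; this sketch only shows the forward pass, chain-rule backward pass, and parameter update that any such implementation shares.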

When you’re finished with this course, you’ll have the knowledge of neural network architectures and learning dynamics needed to build and train your own neural network.

This course includes:


Course Overview

1m 51s

  • Course Overview | 1m 51s


Network Architecture: Layers, Nodes, and Activation Functions

11m 10s

  •  A Brief Look at Neural Network Architectures | 7m 37s
  •  Understanding How Activation Functions Work | 3m 33s


Backpropagation and Learning: How Neural Networks Adjust and Learn

19m 22s

  •  What Exactly Is Backpropagation? | 3m 15s
  •  The Role of Loss Functions | 2m 50s
  •  Mathematical Principles: Error Propagation and Gradient Descent | 4m 5s
  •  Learning Rates and Their Impact | 3m 28s
  •  Implementing Backpropagation and Gradient Descent Using Python | 5m 42s