Episodes
-
Make an AI sound like a YouTuber (LAB) #8
S1 E8 - 15m 17s
We're going to code a program that takes a one-word prompt and completes it with a sentence that sounds like something John Green would say. We’re going to collect transcript files from Vlogbrothers episodes, do some preprocessing, then set up a recurrent neural network (RNN), train our model, and test it!
Follow along: https://colab.research.google.com/drive/1f8ik5kSPEvDCcM7R_-Wb3AjifizVEs…
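The full lab lives in the linked Colab notebook. As a rough sketch of the recurrence at the heart of an RNN, here is a single forward step in plain NumPy; the tiny random weights are stand-ins for a trained model, so the "prediction" is meaningless — the point is just the shape of the computation:

```python
import numpy as np

# Minimal character-level RNN step (illustrative only; the lab itself uses
# a deep-learning framework and trained weights, not these random ones).
rng = np.random.default_rng(0)
vocab = list("abcdefghijklmnopqrstuvwxyz ")
V, H = len(vocab), 16                  # vocabulary size and hidden-state size

Wxh = rng.normal(0, 0.1, (H, V))       # input  -> hidden
Whh = rng.normal(0, 0.1, (H, H))       # hidden -> hidden (the "recurrent" part)
Why = rng.normal(0, 0.1, (V, H))       # hidden -> next-character scores

def step(char, h):
    """Consume one character, update the hidden state, score the next char."""
    x = np.zeros(V)
    x[vocab.index(char)] = 1.0         # one-hot encode the input character
    h = np.tanh(Wxh @ x + Whh @ h)     # new hidden state remembers the past
    y = Why @ h                        # unnormalized scores for each character
    p = np.exp(y) / np.exp(y).sum()    # softmax: scores -> probabilities
    return h, p

h = np.zeros(H)
for c in "hello":                      # feed a prompt one character at a time
    h, p = step(c, h)
print(vocab[int(p.argmax())])          # the model's guess for the next character
```

Because the hidden state `h` is carried from step to step, the network's guess depends on the whole prompt so far, not just the last character — that is what makes it "recurrent."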
-
Natural Language Processing #7
S1 E7 - 12m 31s
We're going to talk about Natural Language Processing, or NLP, show you some strategies computers can use to better understand language, like distributional semantics, and then we'll introduce you to a type of neural network called a Recurrent Neural Network, or RNN, that can build sentences.
-
Unsupervised Learning #6
S1 E6 - 11m 41s
We’re moving on from artificial intelligence that needs training labels, called Supervised Learning, to Unsupervised Learning, which is learning by finding patterns in the world. We’ll focus on performing unsupervised clustering, specifically K-means clustering, and show you how we can extract meaningful patterns from data even when you don't know where those patterns are.
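The K-means loop the episode describes alternates between two steps: assign every point to its nearest cluster center, then move each center to the mean of its points. A bare-bones sketch (a real project would likely reach for scikit-learn's `KMeans`; the toy data here is invented for the demo):

```python
import numpy as np

def kmeans(points, k, iters=20):
    # farthest-point initialization: deterministic and good enough for a demo
    centers = [points[0]]
    for _ in range(k - 1):
        dists = ((points[:, None, :] - np.array(centers)[None, :, :]) ** 2).sum(-1).min(axis=1)
        centers.append(points[dists.argmax()])
    centers = np.array(centers, dtype=float)

    for _ in range(iters):
        # step 1: assign every point to its nearest center
        labels = ((points[:, None, :] - centers[None, :, :]) ** 2).sum(-1).argmin(axis=1)
        # step 2: move each center to the mean of its assigned points
        for j in range(k):
            if (labels == j).any():
                centers[j] = points[labels == j].mean(axis=0)
    return centers, labels

# two obvious clusters: 20 points at (0, 0) and 20 points at (10, 10)
pts = np.vstack([np.zeros((20, 2)), np.full((20, 2), 10.0)])
centers, labels = kmeans(pts, k=2)
```

Notice that no training labels appear anywhere — the algorithm discovers the two groups purely from the geometry of the data, which is exactly the "finding patterns" idea of unsupervised learning.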
-
How to Make an AI Read Your Handwriting (LAB) #5
S1 E5 - 16m 56s
John Green Bot wrote his first novel! We’re going to program a neural network to recognize handwritten letters to convert the first part of John Green Bot’s novel into typed text.
Follow along: https://colab.research.google.com/drive/1NyYH1EPpaJlMBLK0fcKYz4icaD1SNS…
-
Training Neural Networks #4
S1 E4 - 12m 8s
We’re going to talk about how neurons in a neural network learn by getting their math adjusted, called backpropagation, and how we can optimize networks by finding the best combinations of weights to minimize error.
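The "math adjustment" the episode describes can be seen on a single sigmoid neuron: make a prediction, measure the error, use the chain rule to get a gradient, and nudge the weights downhill. A toy sketch (learning logical AND; full backpropagation repeats this chain rule layer by layer):

```python
import numpy as np

rng = np.random.default_rng(1)
X = np.array([[0., 0.], [0., 1.], [1., 0.], [1., 1.]])
y = np.array([0., 0., 0., 1.])            # target: logical AND

w, b = rng.normal(size=2), 0.0            # weights start out random

def sigmoid(z):
    return 1 / (1 + np.exp(-z))

errors = []
for _ in range(5000):
    p = sigmoid(X @ w + b)                # forward pass: current predictions
    err = p - y                           # how wrong each prediction is
    grad = err * p * (1 - p)              # chain rule through the sigmoid
    w -= 0.5 * (X.T @ grad)               # nudge weights against the gradient
    b -= 0.5 * grad.sum()
    errors.append((err ** 2).mean())      # track squared error over time

preds = sigmoid(X @ w + b)                # predictions after training
```

After training, the squared error has dropped and the neuron classifies all four inputs correctly — the "best combination of weights" was found by repeatedly stepping in the direction that reduces error.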
-
Neural Networks and Deep Learning #3
S1 E3 - 11m 29s
Artificial neural networks are better than other methods for more complicated tasks like image recognition, and the key to their success is their hidden layers. We'll talk about how the math of these networks works and how using many hidden layers allows us to do deep learning.
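A classic way to see why hidden layers matter: no single neuron can compute XOR, but one hidden layer of two neurons can. The weights below are set by hand purely to show the math; training would find values that do the same job:

```python
import numpy as np

def step_fn(z):
    return (z > 0).astype(float)   # simple threshold activation

X = np.array([[0, 0], [0, 1], [1, 0], [1, 1]], dtype=float)

W1 = np.array([[1., 1.],           # hidden neuron 1 fires for OR
               [1., 1.]])          # hidden neuron 2 fires for AND
b1 = np.array([-0.5, -1.5])
W2 = np.array([1., -1.])           # output: OR and not-AND  ->  XOR
b2 = -0.5

hidden = step_fn(X @ W1.T + b1)    # hidden-layer activations
output = step_fn(hidden @ W2 + b2) # final prediction for each input pair
```

The hidden layer re-describes each input as "is it OR?" and "is it AND?", and in that new space XOR becomes an easy one-neuron problem — stacking many such layers is what "deep" learning means.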
-
Supervised Learning #2
S1 E2 - 14m 57s
Supervised learning is the process of learning WITH training labels, and is the most widely used kind of learning when it comes to AI - helping with stuff like tagging photos on Facebook and filtering spam from your email. We’re going to start small today and show how just a single neuron (or perceptron) is constructed, and explain the difference between precision and recall.
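Precision and recall answer two different questions about a classifier like a spam filter. A quick sketch with made-up labels (1 = spam, 0 = not spam; the numbers are invented purely for illustration):

```python
actual    = [1, 1, 1, 1, 0, 0, 0, 0]   # the true labels
predicted = [1, 1, 0, 0, 1, 0, 0, 0]   # what our filter guessed

tp = sum(a == 1 and p == 1 for a, p in zip(actual, predicted))  # true positives
fp = sum(a == 0 and p == 1 for a, p in zip(actual, predicted))  # false positives
fn = sum(a == 1 and p == 0 for a, p in zip(actual, predicted))  # false negatives

precision = tp / (tp + fp)   # of the mail we flagged, how much was really spam?
recall    = tp / (tp + fn)   # of the real spam, how much did we catch?
```

Here precision is 2/3 but recall is only 1/2: the filter is fairly trustworthy when it does flag something, yet it misses half the real spam — exactly the trade-off the episode explains.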
-
What Is Artificial Intelligence? #1
S1 E1 - 11m 25s
Artificial intelligence is everywhere and it's already making a huge impact on our lives. It's autocompleting texts on our cellphones, telling us which videos to watch on YouTube, beating us at video games, recognizing us in photos, ordering products in stores, driving cars, scheduling appointments, you get the idea.
Extras + Features
-
Crash Course Artificial Intelligence
3m 30s
Host Jabril Ashe will teach you the logic behind AI by tracing its history and examining how it’s being used today. We’ll even show you how to create some of your own AI systems with the help of co-host John Green Bot! We’ll also spend several episodes on an area of AI known as machine learning, which has skyrocketed in popularity in recent years.
Similar Shows
Counting from Infinity: Yitang Zhang and the Twin Prime Conjecture
Science and Nature
Butterfly Town, USA
Science and Nature
The Himalaya Connection
Science and Nature
Hiding in Plain Sight: Youth Mental Illness
Science and Nature
Saving the Ocean
Science and Nature
Crazy
Science and Nature
The Green Planet
Science and Nature
Human Footprint
Science and Nature
Genius by Stephen Hawking
Science and Nature
Ozone Hole: How We Saved the Planet
Science and Nature