
Aug 12, 2022 50m
What’s the massive breakthrough for Natural Language Processing (NLP) that has dramatically advanced machine learning into deep learning? What makes these transformer models unique, and what defines “attention”? This week on the show, Jodie Burchell, developer advocate for data science at JetBrains, continues our talk about how machine learning (ML) models understand and generate text.
This episode is a continuation of the conversation in episode #119. Jodie builds on the concepts of bag-of-words, word2vec, and simple embedding models. We talk about the breakthrough mechanism called “attention,” which allows for parallelization in building models.
We also discuss the two major transformer models, BERT and GPT-3. Jodie continues to share multiple resources to help you continue exploring modeling and NLP with Python.
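The attention mechanism discussed in the episode can be sketched in a few lines of NumPy. This is a minimal, illustrative scaled dot-product attention (the core operation inside transformers like BERT and GPT); the function name, shapes, and toy inputs are this sketch's own choices, not code from the show.

```python
import numpy as np

def scaled_dot_product_attention(Q, K, V):
    """Return attention outputs and the weight matrix."""
    d_k = Q.shape[-1]
    # Similarity of every query to every key, scaled to keep softmax stable
    scores = Q @ K.T / np.sqrt(d_k)
    # Softmax over keys: each row becomes weights that sum to 1
    weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
    weights /= weights.sum(axis=-1, keepdims=True)
    # Each output vector is a context-aware mix of all value vectors --
    # every position attends to every other position at once, which is
    # what makes the computation parallelizable
    return weights @ V, weights

# Three toy token embeddings of dimension 4
rng = np.random.default_rng(0)
tokens = rng.normal(size=(3, 4))
output, weights = scaled_dot_product_attention(tokens, tokens, tokens)
print(weights.shape)  # each token gets a weight over every token: (3, 3)
```

Because the whole sequence is processed as one matrix product rather than token by token, attention-based models train far more efficiently than recurrent ones.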
Course Spotlight: Building a Neural Network & Making Predictions With Python AI
In this step-by-step course, you’ll build a neural network from scratch as an introduction to the world of artificial intelligence (AI) in Python. You’ll learn how to train your neural network and make predictions based on a given dataset.
Topics:
- 00:00:00 – Introduction
- 00:02:20 – Where we left off with word2vec…
- 00:03:35 – Example of losing context
- 00:06:50 – Working at scale and adding attention
- 00:12:34 – Multiple levels of training for the model
- 00:14:10 – Attention is the basis for transformer models
- 00:15:07 – BERT (Bidirectional Encoder Representations from Transformers)
- 00:16:29 – GPT (Generative Pre-trained Transformer)
- 00:19:08 – Video Course Spotlight
- 00:20:08 – How far have we moved forward?
- 00:20:41 – Access to GPT-2 via Hugging Face
- 00:23:56 – How to access and use these models?
- 00:30:42 – Cost of training GPT-3
- 00:35:01 – Resources to practice and learn with BERT
- 00:38:19 – GPT-3 and GitHub Copilot
- 00:44:35 – DALL-E is a transformer
- 00:46:13 – Help yourself to the show notes!
- 00:49:19 – How can people follow your work?
- 00:50:03 – Thanks and goodbye
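The episode mentions that GPT-2 is freely available through Hugging Face. As a hedged sketch (the prompt and generation parameters here are illustrative, and the `transformers` package plus a one-time model download are assumed), generating text with GPT-2 looks like this:

```python
# Requires: pip install transformers torch
# Downloads the GPT-2 weights from the Hugging Face Hub on first run.
from transformers import pipeline

generator = pipeline("text-generation", model="gpt2")
result = generator(
    "Attention is the basis for",  # illustrative prompt
    max_new_tokens=20,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```

The same `pipeline` interface also exposes BERT-based models for tasks such as fill-mask and classification, which is one way to start experimenting with the models discussed in the episode.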
Show Links: