Explore The Technologies Leading To ChatGPT, GPT4 & LLMs
Written by Sue Gee
Friday, 14 July 2023
The Udemy portfolio includes a course that claims to be the only one you'll need to understand large language models. We put it through its paces.
Disclosure: When you make a purchase having followed a link from this article, we may earn an affiliate commission.
With over 100 million active users, ChatGPT is getting a great deal of attention and is making waves in the wider world, as well as on the world wide web, as the new AI-powered technology that is threatening the livelihoods of authors and screenwriters. So I was attracted by the specific reference to it in Exploring The Technologies Behind ChatGPT, GPT4 & LLMs on the Udemy platform, imagining that it would complement other courses that we have covered recently, in particular Generative AI with Large Language Models and Prompt Engineering for Developers on Coursera and ChatGPT Prompt Engineering for Developers offered by DeepLearning.AI and OpenAI.
Exploring The Technologies Behind ChatGPT, GPT4 & LLMs consists of 8.5 hours of video and comes from two instructors, Justin Coleman and Tim Reynolds. A notice that appeared on the course this week states:
As of July 10, 2023, this course has undergone a comprehensive revision. Please be assured that it will continue to receive regular updates to maintain its relevance and effectiveness.
According to its latest description the course offers:
an in-depth understanding of the transformative impact that GPT-4 and ChatGPT have on modern NLP
and its Course Highlights have also been refreshed.
Some new material has certainly been added recently. At the beginning of the course there's a reading explaining what Large Language Models are and a list of the factors that have enabled them to be so successful, including the growth in computing power that has allowed the amount of training data to expand exponentially. And at the end of the course the sections on Reinforcement and Deep Learning have been updated with a new reading, "Exploring the Relationship between RLHF and ChatGPT", and revised Jupyter Notebooks which you can work with using OpenAI Gym.
The bulk of the course is in Section 2 and, while it is titled "Getting Started with LLMs", this isn't really a hands-on approach. The original start of the course was evidently "Exploring the Evolution of NLP", which is followed up with lessons on attention mechanisms, encoder-decoder architecture and transformers. Then come lessons on Scaled Dot Product Attention, Multi-Headed Attention and Transfer Learning Potential.
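For readers unfamiliar with the mechanism those lectures cover, scaled dot-product attention can be sketched in a few lines. This is a minimal NumPy illustration of the standard formula softmax(QKᵀ/√d_k)V, written for this review rather than taken from the course materials:

```python
import numpy as np

def scaled_dot_product_attention(q, k, v):
    """Scaled dot-product attention: softmax(QK^T / sqrt(d_k)) V."""
    d_k = q.shape[-1]
    # Similarity scores between each query and each key, scaled by sqrt(d_k)
    scores = q @ k.swapaxes(-2, -1) / np.sqrt(d_k)
    # Numerically stable softmax over the key dimension
    scores = scores - scores.max(axis=-1, keepdims=True)
    weights = np.exp(scores)
    weights = weights / weights.sum(axis=-1, keepdims=True)
    # Each output is a weighted average of the value vectors
    return weights @ v, weights

# Toy example: batch of 1, sequence of 3 tokens, embedding dimension 4
q = np.random.rand(1, 3, 4)
k = np.random.rand(1, 3, 4)
v = np.random.rand(1, 3, 4)
out, attn = scaled_dot_product_attention(q, k, v)
```

The course implements this with PyTorch tensors, but the arithmetic is identical; multi-headed attention simply runs several such computations in parallel over projected slices of the input.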
Next, in one of the longest videos (10 mins), comes "PyTorch Essentials for Beginners", which is followed up with "Streamlining Transformer Fine Tuning with PyTorch". After this, Hugging Face models working with BERT and the GPT models are used, but rather than being able to access the Jupyter notebooks themselves, you can only see their content. However, enough information is provided for you to follow along.
Comparing Section 2 with the original course description, it certainly achieves several of the stated objectives.
The remaining points are covered in the final three sections which, as already mentioned, have been updated. They are, however, short: whereas Section 2 has 57 lectures totaling 7 hours 37 minutes, the final three between them have 15 lectures totaling 52 minutes.
In conclusion, this course does provide a good insight into the foundations and fundamentals of NLP and LLMs and perhaps its title would be better formulated as "Exploring The Technologies Behind LLMs That Led Up to ChatGPT and GPT4".
Last Updated ( Friday, 14 July 2023 )