Explore The Technologies Leading To ChatGPT, GPT4 & LLMs
Written by Sue Gee   
Friday, 14 July 2023

The Udemy portfolio includes a course that claims to be the only one you'll need to understand large language models. We put it through its paces.

Disclosure: When you make a purchase having followed a link from this article, we may earn an affiliate commission.

With over 100 million active users, ChatGPT is getting a great deal of attention and is making waves in the wider world as well as on the world wide web as the new AI-powered technology that is threatening the livelihoods of authors and screenwriters. So I was attracted by the specific reference to it in Exploring The Technologies Behind ChatGPT, GPT4 & LLMs on the Udemy platform, imagining that it would complement other courses we have covered recently, in particular Generative AI with Large Language Models and Prompt Engineering for Developers on Coursera, and ChatGPT Prompt Engineering for Developers offered by DeepLearning.AI and OpenAI.

Exploring The Technologies Behind ChatGPT, GPT4 & LLMs consists of 8.5 hours of video and comes from two instructors, Justin Coleman and Tim Reynolds. A notice that appeared on the course this week states:

As of July 10, 2023, this course has undergone a comprehensive revision. Please be assured that it will continue to receive regular updates to maintain its relevance and effectiveness.

According to its latest description the course offers:

an in-depth understanding of the transformative impact that GPT-4 and ChatGPT have on modern NLP

and the Course Highlights refer to:

  • Comprehensive coverage of GPT-4 and ChatGPT concepts

Some new material has certainly been added recently. At the beginning of the course there's a reading explaining what Large Language Models are and listing the factors that have enabled them to be so successful, including the increase in computing power that has allowed the amount of training data used to be expanded exponentially. And at the end of the course the sections on Reinforcement Learning and Deep Learning have been updated with a new reading, "Exploring the Relationship between RLHF and ChatGPT", and revised Jupyter Notebooks which you can work with using OpenAI Gym.
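For anyone who hasn't used OpenAI Gym before, the interaction loop the notebooks rely on is very simple. Here's a minimal sketch of my own, using the standard CartPole environment and a random policy rather than anything taken from the course:

import gym

# Minimal OpenAI Gym interaction loop (gym >= 0.26 API).
# CartPole and the random policy are illustrative choices, not the course's.
env = gym.make("CartPole-v1")
obs, info = env.reset(seed=0)

total_reward = 0.0
done = False
while not done:
    action = env.action_space.sample()   # a random action stands in for a learned policy
    obs, reward, terminated, truncated, info = env.step(action)
    total_reward += reward
    done = terminated or truncated

env.close()
print("Episode return:", total_reward)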


The bulk of the course is in Section 2 and, while it is titled "Getting Started with LLMs", this isn't really a hands-on approach. The original starting point was obviously "Exploring the Evolution of NLP", which is followed up with lessons on attention mechanisms, encoder-decoder architecture and transformers. Then come lessons on Scaled Dot Product Attention, Multi-Headed Attention and Transfer Learning Potential.
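To give a flavour of what the Scaled Dot Product Attention lesson covers, the operation itself is only a few lines of PyTorch. This is the standard textbook formulation rather than the course's own notebook code:

import math
import torch
import torch.nn.functional as F

def scaled_dot_product_attention(q, k, v, mask=None):
    # q, k, v have shape (batch, heads, seq_len, d_k)
    d_k = q.size(-1)
    scores = q @ k.transpose(-2, -1) / math.sqrt(d_k)         # query-key similarity, scaled
    if mask is not None:
        scores = scores.masked_fill(mask == 0, float("-inf"))
    weights = F.softmax(scores, dim=-1)                       # attention weights sum to 1
    return weights @ v                                        # weighted sum of the values

q = k = v = torch.randn(1, 8, 10, 64)   # toy input: 8 heads, 10 tokens, d_k = 64
print(scaled_dot_product_attention(q, k, v).shape)   # torch.Size([1, 8, 10, 64])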

Next, in one of the longest videos (10 mins), comes "PyTorch Essentials for Beginners", which is followed up with "Streamlining Transformer Fine Tuning with PyTorch". After this, Hugging Face models including BERT and the GPT models are used, but rather than being able to access the Jupyter notebooks you can only see their content. However, enough information is provided for you to follow along.
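To follow along you really only need the transformers library. Something along these lines, using the standard public checkpoints rather than the exact ones used on screen, is representative of the kind of code the lectures step through:

import torch
from transformers import AutoModel, AutoModelForCausalLM, AutoTokenizer

# BERT as a feature extractor
bert_tok = AutoTokenizer.from_pretrained("bert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")
inputs = bert_tok("Transformers changed NLP.", return_tensors="pt")
with torch.no_grad():
    hidden = bert(**inputs).last_hidden_state    # shape (1, seq_len, 768)
print(hidden.shape)

# GPT-2 for text generation
gpt_tok = AutoTokenizer.from_pretrained("gpt2")
gpt = AutoModelForCausalLM.from_pretrained("gpt2")
prompt = gpt_tok("Large language models are", return_tensors="pt")
output = gpt.generate(**prompt, max_new_tokens=20, do_sample=False)
print(gpt_tok.decode(output[0], skip_special_tokens=True))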

Comparing Section 2 with the original course description, it certainly achieves the following:

  • Comprehend the historical evolution of NLP and its transition towards transformer-based models
  • Apply fundamental concepts of transfer learning and its significance in training and fine-tuning transformer models for diverse NLP tasks
  • Utilize PyTorch to effectively implement, customize, and optimize state-of-the-art transformer models
  • Master essential NLP tasks like Masked Language Modeling and Next Sentence Prediction 
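
The last of those, Masked Language Modeling, is the easiest to see in action. A Hugging Face fill-mask pipeline, in a minimal example of my own rather than one from the course, shows BERT predicting a blanked-out token:

from transformers import pipeline

# Masked Language Modeling with a pretrained BERT checkpoint
fill_mask = pipeline("fill-mask", model="bert-base-uncased")
for prediction in fill_mask("ChatGPT is built on a [MASK] language model."):
    print(prediction["token_str"], round(prediction["score"], 3))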

The remaining points:

  • Develop a solid foundation in the practical applications of ChatGPT and other LLMs, along with the strategies and best practices
  • Identify innovative software development use cases for ChatGPT that can significantly improve efficiency

are covered in the remaining three sections which, as already mentioned, have been updated. They are, however, short: whereas Section 2 has 57 lectures totaling 7 hours 37 minutes, the final three between them have 15 lectures totaling 52 minutes.

In conclusion, this course does provide a good insight into the foundations and fundamentals of NLP and LLMs, and perhaps its title would be better formulated as "Exploring The Technologies Behind LLMs That Led Up to ChatGPT and GPT4".

More Information

Exploring The Technologies Behind ChatGPT, GPT4 & LLMs 

Generative AI with Large Language Models 

Related Articles

Get Hands-On With Generative AI On Coursera

Take Vanderbilt's Prompt Engineering for ChatGPT For Free

Free Course On ChatGPT Prompt Engineering

The Hugging Face NLP Course

 


 
