
Pretraining LLMs

California, United States

Pretraining LLMs at DeepLearning.AI is a non-degree program for international students, taught in English.

📖 Introduction

DeepLearning.AI is an online education platform founded in 2017 by Andrew Ng, a leading AI expert and co-founder of Coursera. As a private organization, DeepLearning.AI specializes in AI and machine learning education, offering high-quality courses, specializations, and professional certifications in collaboration with top institutions and industry leaders. The platform is known for its practical, hands-on approach to teaching AI concepts and its focus on making cutting-edge AI knowledge accessible to learners worldwide.

📚 About the Program

In Pretraining LLMs, you’ll explore the first step of training large language models: pretraining. You’ll learn the essential steps to pretrain an LLM, understand the associated costs, and discover how starting from smaller, existing open-source models can be more cost-effective. Pretraining teaches an LLM to predict the next token on vast text datasets, producing a base model; that base model still requires further fine-tuning for optimal performance and safety. In this course, you’ll learn to pretrain a model from scratch and to take a model that has already been pretrained and continue its pretraining on your own data.

In detail, you will:

1. Explore scenarios where pretraining is the optimal choice for model performance, and compare text generation across different versions of the same model to understand the differences between base, fine-tuned, and specialized pretrained models.
2. Learn how to create a high-quality training dataset from web text and existing datasets, which is crucial for effective model pretraining.
3. Prepare your cleaned dataset for training and learn how to package it for use with the Hugging Face library.
4. Explore ways to configure and initialize a model for training, and see how these choices impact the speed of pretraining.
5. Learn how to configure and execute a training run, enabling you to train your own model.
6. Learn how to assess your trained model’s performance and explore common evaluation strategies for LLMs, including important benchmark tasks used to compare different models’ performance.

After taking this course, you’ll be equipped with the skills to pretrain a model, from data preparation and model configuration to performance evaluation. Illustrative sketches of the data-packaging, training, and evaluation steps follow below.
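The data-preparation and packaging steps above (points 2 and 3) can be illustrated with the Hugging Face datasets library. This is a minimal sketch, not the course’s actual notebook code: the cleaned_corpus folder name, the GPT-2 tokenizer, and the 512-token block size are illustrative assumptions.

```python
# Minimal sketch: package cleaned text files into fixed-length training examples.
# Paths, tokenizer choice, and block size are assumptions, not course specifics.
from datasets import load_dataset
from transformers import AutoTokenizer

# Load raw text files into a Hugging Face dataset (one example per line).
raw = load_dataset("text", data_files={"train": "cleaned_corpus/*.txt"})

# Any causal-LM tokenizer works the same way; GPT-2's is used here for illustration.
tokenizer = AutoTokenizer.from_pretrained("gpt2")

def tokenize(batch):
    return tokenizer(batch["text"])

tokenized = raw.map(tokenize, batched=True, remove_columns=["text"])

# Pack token ids into fixed-length blocks so every training example is full.
block_size = 512

def group_texts(examples):
    concatenated = sum(examples["input_ids"], [])
    total = (len(concatenated) // block_size) * block_size
    chunks = [concatenated[i:i + block_size] for i in range(0, total, block_size)]
    # For causal-LM pretraining the labels are simply a copy of the inputs.
    return {"input_ids": chunks, "labels": [list(c) for c in chunks]}

packed = tokenized.map(
    group_texts, batched=True, remove_columns=tokenized["train"].column_names
)
packed.save_to_disk("packed_pretraining_data")
```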

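Points 4 and 5 (model configuration, initialization, and the training run) might look roughly like the sketch below, using the Hugging Face transformers Trainer. The tiny Llama-style configuration, the hyperparameters, and the packed_pretraining_data path are assumptions for illustration; the course uses its own model sizes and settings.

```python
# Minimal sketch: configure a small causal LM and run a short pretraining pass.
# Model size, hyperparameters, and paths are illustrative assumptions.
from datasets import load_from_disk
from transformers import (
    AutoTokenizer,
    LlamaConfig,
    LlamaForCausalLM,
    Trainer,
    TrainingArguments,
    default_data_collator,
)

tokenizer = AutoTokenizer.from_pretrained("gpt2")

# A deliberately tiny configuration; pretraining cost scales with these choices.
config = LlamaConfig(
    vocab_size=tokenizer.vocab_size,
    hidden_size=256,
    intermediate_size=1024,
    num_hidden_layers=4,
    num_attention_heads=4,
    max_position_embeddings=512,
)
model = LlamaForCausalLM(config)  # random weights: pretraining from scratch

# To continue pretraining an existing open-source model instead, load its weights:
# model = LlamaForCausalLM.from_pretrained("<existing-checkpoint>")

dataset = load_from_disk("packed_pretraining_data")

args = TrainingArguments(
    output_dir="pretrain-run",
    per_device_train_batch_size=8,
    learning_rate=3e-4,
    num_train_epochs=1,
    logging_steps=50,
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=dataset["train"],
    data_collator=default_data_collator,
)
trainer.train()
trainer.save_model("pretrain-run/final")
```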
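For point 6, one common first check of a trained model is perplexity on held-out text; benchmark tasks are typically run with dedicated evaluation harnesses. The checkpoint path below is the assumed output of the previous sketch.

```python
# Minimal sketch: evaluate a trained causal LM by computing perplexity on held-out text.
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("gpt2")
model = AutoModelForCausalLM.from_pretrained("pretrain-run/final")  # assumed path
model.eval()

held_out = "Large language models are trained to predict the next token."
inputs = tokenizer(held_out, return_tensors="pt")

with torch.no_grad():
    # Passing the input ids as labels returns the average next-token
    # cross-entropy loss; perplexity is its exponential.
    outputs = model(**inputs, labels=inputs["input_ids"])

loss = outputs.loss.item()
print(f"loss = {loss:.3f}, perplexity = {math.exp(loss):.1f}")
```

Lower perplexity on representative held-out data generally indicates a better base model, but published comparisons rely on standardized benchmark tasks rather than a single perplexity number.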

🏫 About the Institution

DeepLearning.AI is dedicated to advancing artificial intelligence education and empowering individuals to build careers in AI and machine learning. The platform offers a range of courses, including the renowned "Deep Learning Specialization" and "AI for Everyone," designed to cater to beginners, professionals, and researchers. By collaborating with leading experts and institutions, DeepLearning.AI provides industry-relevant content that bridges the gap between theoretical knowledge and real-world applications. Through its online courses, research initiatives, and community-driven projects, DeepLearning.AI plays a crucial role in shaping the future of AI education and innovation.

💰 Fees

Application Fee: $0 USD

Service Fee: $0 USD

Tuition Fee: $120 USD per year

✅ Entry Requirements

The minimum age is 18 and the maximum age is 50.

Fluent English is required.

Minimum education level: Bachelor's degree

Students from all countries are eligible to apply to this program.

📬 Admissions Process


1. Choose programs
2. Apply online
3. Enroll



