Quick start: Tune a foundation model
There are two main reasons to tune a foundation model. By tuning a model on many labeled examples, you can improve its performance compared to prompt engineering alone. And by tuning a smaller base model to perform similarly to a larger model in the same model family, you can reduce costs by deploying the smaller model.
- Required services
- Watson Studio
- Watson Machine Learning
- watsonx.ai
Your basic workflow includes these tasks:
- Open a project. Projects are where you can collaborate with others to work with data.
- Add your data to the project. You can upload data files, or add data from a remote data source through a connection.
- Create a Tuning experiment in the project. The tuning experiment uses the Tuning Studio experiment builder.
- Review the results of the experiment and the tuned model. The results include a Loss Function chart and the details of the tuned model.
- Deploy and test your tuned model. Test your model in the Prompt Lab.
Read about tuning a foundation model
Prompt tuning adjusts the content of the prompt that is passed to the model. The underlying foundation model and its parameters are not edited; only the prompt input is altered. You use the Tuning Studio to tune a model so that it returns the output you want.
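Conceptually, prompt tuning learns a small set of "soft prompt" vectors that are prepended to the embedded input while the foundation model's weights stay frozen. The following minimal NumPy sketch illustrates the idea only; the shapes and names are illustrative and are not the watsonx implementation:

```python
import numpy as np

rng = np.random.default_rng(0)

embedding_dim = 16      # illustrative model embedding size
num_prompt_vectors = 4  # number of trainable soft-prompt vectors

# Frozen: the embedded tokens of the user's input (never updated).
input_embeddings = rng.normal(size=(10, embedding_dim))

# Trainable: the soft prompt, initialized from text or randomly
# (the Text vs. Random initialization options in Tuning Studio).
soft_prompt = rng.normal(size=(num_prompt_vectors, embedding_dim))

# During tuning, the soft prompt is prepended to every input and only
# the soft prompt receives gradient updates; model weights stay frozen.
model_input = np.concatenate([soft_prompt, input_embeddings], axis=0)
print(model_input.shape)  # (14, 16): 4 prompt vectors + 10 input tokens
```

Because only the small soft prompt is trained, tuning is far cheaper than retraining the full model.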
Watch this video to see when and why you should tune a foundation model.
This video provides a visual method to learn the concepts and tasks in this documentation.
Watch a video about tuning a foundation model
Watch this video to preview the steps in this tutorial. There might be slight differences in the user interface shown in the video. The video is intended to be a companion to the written tutorial.
Try a tutorial to tune a foundation model
In this tutorial, you will complete these tasks:
- Task 1: Open a project
- Task 2: Test your base model
- Task 3: Add your data to the project
- Task 4: Create a Tuning experiment in the project
- Task 5: Configure the Tuning experiment
- Task 6: Deploy your tuned model to a deployment space
- Task 7: Test your tuned model
Tips for completing this tutorial
Here are some tips for successfully completing this tutorial.
Get help in the community
If you need help with this tutorial, you can ask a question or find an answer in the watsonx Community discussion forum.
Set up your browser windows
For the optimal experience completing this tutorial, open Cloud Pak for Data in one browser window, and keep this tutorial page open in another browser window to switch easily between the two applications. Consider arranging the two browser windows side-by-side to make it easier to follow along.
Task 1: Open a project
You need a project to store the tuning experiment. Watch a video to see how to create a sandbox project and associate a service. Then follow the steps to verify that you have an existing project or create a project.
Verify an existing project or create a new project
- From the Quick navigation, click All projects.
- Open an existing project, or create a new project:
- Click New project on the Projects page.
- Select Create an empty project.
- On the Create a project screen, type a name and optional description for the project.
- Click Create.
For more information or to watch a video, see Creating a project.
Check your progress
The following image shows the project overview page. You are now ready to test the base model.
Task 2: Test your base model
You can test the base model in the Prompt Lab. Follow these steps to test the base model:

- Click the Assets tab in your project.
- Click New asset > Chat and build prompts with foundation models.
- Select the base model:
    - Click the model drop-down list, and select View all foundation models.
    - Select the flan-t5-xl-3b model.
    - Click Select model.
- On the Structured mode page, type the instruction:
  Summarize customer complaints
- Provide the examples and test input:

  | Example input | Example output |
  |---|---|
  | I forgot in my initial date I was using Capital One and this debt was in their hands and never was done. | Debt collection, sub-product: credit card debt, issue: took or threatened to take negative or legal action, sub-issue |
  | I am a victim of identity theft and this debt does not belong to me. Please see the identity theft report and legal affidavit. | Debt collection, sub-product: I do not know, issue: attempts to collect debt not owed, sub-issue: debt was a result of identity theft |

- In the Try text field, copy and paste the following prompt:
  After I reviewed my credit report, I am still seeing information that is reporting on my credit file that is not mine. please help me in getting these items removed from my credit file.
- Click Generate, and review the results.
- Click Save work > Save as.
- Select Prompt template.
- For the name, type:
  Base model prompt
- Select View in project after saving.
- Click Save.
Check your progress
The following image shows results in the Prompt Lab.
Task 3: Add your data to the project
You need to add the training data to your project. Follow these steps to download the data set:
- Download the Customer complaints training data (150 KB) and extract it.
- From your project, click the Upload asset to project icon.
- In the side panel that opens, browse to select the customer-complaints-training-data.json file, and click Open. Stay on the page until the upload completes.

The customer-complaints-training-data.json file is added to your project as a data asset.
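The training data pairs each complaint with its expected summary. As a quick sanity check before uploading, you can inspect the records with a few lines of Python. This sketch assumes the file is a JSON array of objects with input and output fields, which is the shape of this tutorial's data set; it writes a tiny stand-in file so the snippet is self-contained:

```python
import json

# Tiny stand-in file in the same shape as the tutorial's data set.
records = [
    {
        "input": "I am a victim of identity theft and this debt does not belong to me.",
        "output": "Debt collection, issue: attempts to collect debt not owed.",
    },
]
with open("customer-complaints-sample.json", "w") as f:
    json.dump(records, f)

# Validate that every record has the two fields labeled tuning expects.
with open("customer-complaints-sample.json") as f:
    data = json.load(f)
for record in data:
    assert "input" in record and "output" in record
print(len(data), "records OK")
```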
Check your progress
The following image shows the data asset added to the project. The next step is to create the Tuning experiment.
Task 4: Create a Tuning experiment in the project
Now you are ready to create a tuning experiment in your sandbox project that uses the data set you just added to the project. Follow these steps to create a Tuning experiment:
- From the Assets tab, click New asset > Tune a foundation model with labeled data.
- For the name, type:
  Summarize customer complaints tuned model
- For the description, type:
  Tuning Studio experiment to tune a foundation model to handle customer complaints.
- Click Create. The Tuning Studio displays.
Check your progress
The following image shows the Tuning experiment open in Tuning Studio. Now you are ready to configure the tuning experiment.
Task 5: Configure the Tuning experiment
In the Tuning Studio, you configure the tuning experiment. Follow these steps to configure the tuning experiment:
- For the foundation model to tune, select flan-t5-xl-3b.
- Select Text for the method to initialize the prompt. There are two options:
    - Text: Uses text that you specify.
    - Random: Uses values that are generated for you as part of the tuning experiment.
- For the Text field, type:
  Summarize the complaint provided into one sentence.

  The following table shows example text for each task type:

  | Task type | Example |
  |---|---|
  | Classification | Classify whether the sentiment of each comment is Positive or Negative |
  | Generation | Make the case for allowing employees to work from home a few days a week |
  | Summarization | Summarize the main points from a meeting transcript |

- Select Summarization for the task type that most closely matches what you want the model to do. There are three task types:
    - Summarization generates text that describes the main ideas that are expressed in a body of text.
    - Generation generates text, such as a promotional email.
    - Classification predicts categorical labels from features. For example, given a set of customer comments, you might want to label each statement as a question or a problem. When you use the classification task, you must list the class labels that you want the model to use. Specify the same labels that are used in your tuning training data.
- Select your training data from the project:
    - Click Select from project.
    - Click Data asset.
    - Select the customer-complaints-training-data.json file.
    - Click Select asset.
- Click Start tuning.
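For the classification task type mentioned above, the training data pairs each input with one of the class labels that you also list in the experiment configuration. This illustrative sketch (the labels and examples are hypothetical, not part of this tutorial's data set) shows that relationship:

```python
import json

# Hypothetical class labels, as they would be listed in the experiment.
class_labels = ["Question", "Problem"]

# Each training output must be one of the listed class labels.
examples = [
    {"input": "How do I reset my password?", "output": "Question"},
    {"input": "The app crashes on startup.", "output": "Problem"},
]
assert all(ex["output"] in class_labels for ex in examples)
print(json.dumps(examples, indent=2))
```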
Check your progress
The following image shows the configured tuning experiment. Next, you review the results and deploy the tuned model.
Task 6: Deploy your tuned model to a deployment space
When the experiment run is complete, you see the tuned model and the Loss function chart. The loss function measures the difference between the predicted and the actual results for each training run. Follow these steps to view the loss function chart and the tuned model:
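To build intuition for reading the chart, the loss for a language model is typically a cross-entropy: the negative log of the probability the model assigns to the expected token. This toy sketch (a generic illustration, not the watsonx internals) shows why the curve slopes downward as the model improves:

```python
import math

def cross_entropy(predicted_probs, target_index):
    """Negative log-likelihood of the expected token."""
    return -math.log(predicted_probs[target_index])

# As training assigns more probability to the expected token
# (index 0 here), the loss decreases from run to run.
runs = [
    [0.2, 0.5, 0.3],  # early run: expected token gets probability 0.2
    [0.5, 0.3, 0.2],  # middle run: 0.5
    [0.8, 0.1, 0.1],  # late run: 0.8
]
losses = [cross_entropy(p, target_index=0) for p in runs]
print([round(x, 3) for x in losses])  # [1.609, 0.693, 0.223]
```

A flat or rising curve, by contrast, suggests the model is not learning the expected outputs from the training data.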
- Review the Loss function chart. A downward-sloping curve means that the model is getting better at generating the expected output.
- Below the chart, click the Summarize customer complaints tuned model.
- Scroll through the model details.
- Click Deploy.
- For the name, type:
  Summarize customer complaints tuned model
- For the Deployment container, select Deployment space.
- For the Target deployment space, select an existing deployment space. If you don't have an existing deployment space, follow these steps:
    - For the Target deployment space, select Create a new deployment space.
    - For the deployment space name, type:
      Foundation models deployment space
    - Click Create.
    - Click Close.
    - For the Target deployment space, verify that Foundation models deployment space is selected.
- Check the View deployment in deployment space after creating option.
- Click Create.
- On the Deployments page, click the Summarize customer complaints tuned model deployment to view the details.
Check your progress
The following image shows the deployment in the deployment space. You are now ready to test the deployed model.
Task 7: Test your tuned model
You can test your tuned model in the Prompt Lab. Follow these steps to test your tuned model:
- From the model deployment page, click Open in prompt lab, and then select your sandbox project. The Prompt Lab displays.
- Select your tuned model:
    - Click the model drop-down list, and select View all foundation models.
    - Select the Summarize customer complaints tuned model.
    - Click Select model.
- On the Structured mode page, type the instruction:
  Summarize customer complaints
- On the Structured mode page, provide the examples and test input:

  | Example input | Example output |
  |---|---|
  | I forgot in my initial date I was using Capital One and this debt was in their hands and never was done. | Debt collection, sub-product: credit card debt, issue: took or threatened to take negative or legal action, sub-issue |
  | I am a victim of identity theft and this debt does not belong to me. Please see the identity theft report and legal affidavit. | Debt collection, sub-product: I do not know, issue: attempts to collect debt not owed, sub-issue: debt was a result of identity theft |

- In the Try text field, copy and paste the following prompt:
  After I reviewed my credit report, I am still seeing information that is reporting on my credit file that is not mine. please help me in getting these items removed from my credit file.
- Click Generate, and review the results.
Check your progress
The following image shows results in the Prompt Lab.
Next steps
Try these other tutorials:
Additional resources
- View more videos.
Parent topic: Quick start tutorials