Time Series Forecasting
Overview
Time series data often holds valuable information for understanding the past and predicting the future. For forecasting use cases, time series foundation models can generate the predictions that power predictive analytics. These models are typically pre-trained on large amounts of time series data, and some can be further trained on your specific data to improve results.
Example
The IBM watsonx.ai Time Series Forecasting API and SDK aim to provide developers with advanced forecasting capabilities built on IBM's Granite time series models (TinyTimeMixers), a family of pre-trained, lightweight models based on a novel architecture. Granite time series models are well suited to forecasting use cases involving IoT sensor data, stock market prices, energy demand, and other datasets with small time intervals. The models dynamically adjust to data irregularities, seasonality, and trends, enabling zero-shot forecasting.
Granite time series models currently support multiple input context lengths (512, 1024, and 1536 data points), and are capable of multi-variate, multi-time series predictions across a number of channels and IDs. Please refer to the documentation for more details.
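The fixed context lengths above mean that, regardless of how long your series is, only a trailing window of that many points is sent to the model as input. A minimal sketch of that slicing, using plain Python with a stand-in list of hourly readings (all names here are illustrative, not part of the SDK):

```python
# Stand-in for a long series of hourly readings.
readings = list(range(1000))

CONTEXT_LENGTH = 512     # input window accepted by the 512-point models
FORECAST_HORIZON = 96    # e.g. the 512-96 models predict 96 future steps

# Only the most recent CONTEXT_LENGTH points are used as model input.
context_window = readings[-CONTEXT_LENGTH:]
print(len(context_window))   # 512
```

The same idea generalizes to the multivariate case: each channel (column) contributes its own trailing window of the same length.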
The following example uses a time series foundation model to forecast energy demand, and demonstrates the core functionality available in Time Series Foundation Models.
1. Visit Developer Access page
To begin using the API and SDK, you will need 3 values: a project or space id, an endpoint for your region, and an API key. You can visit the Developer Access page to receive these values. Make sure to also follow the link on the Developer Access page to get your IBM Cloud API key separately.
2. Set up your environment
Create a watsonx.ai Runtime Service instance (a free plan is offered, and information about how to create the instance can be found in the documentation).
Install the package ibm-watsonx-ai and its dependencies:
```shell
!pip install wget | tail -n 1
!pip install -U matplotlib | tail -n 1
!pip install -U ibm-watsonx-ai | tail -n 1
```
Note: The full documentation for ibm-watsonx-ai can be found here.
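The later steps call methods on an API client object. A minimal configuration sketch for creating one, assuming the endpoint URL, API key, and project ID are the values you obtained from the Developer Access page (the placeholder strings must be replaced with your own values):

```python
from ibm_watsonx_ai import APIClient, Credentials

credentials = Credentials(
    url="https://us-south.ml.cloud.ibm.com",  # endpoint for your region
    api_key="YOUR_API_KEY",                   # your IBM Cloud API key
)

client = APIClient(credentials, project_id="YOUR_PROJECT_ID")
```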
3. Interact with training dataset
This example uses the hourly energy demand dataset as input to the time series model. For simplicity, the dataset has been prepared in advance: missing values and irrelevant columns were removed.
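If you were working with your own raw data, a similar preparation might look like the following sketch. The frame, column names, and values here are purely illustrative, not part of the tutorial dataset:

```python
import pandas as pd

# Illustrative raw frame with one missing value and one unused column.
raw = pd.DataFrame({
    "time": pd.date_range("2024-01-01", periods=5, freq="h"),
    "total load actual": [100.0, None, 102.0, 101.0, 103.0],
    "notes": ["", "", "", "", ""],   # irrelevant column to drop
})

clean = raw.drop(columns=["notes"])                  # remove irrelevant columns
clean["total load actual"] = clean["total load actual"].interpolate()  # fill gaps
print(clean["total load actual"].isna().sum())       # 0
```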
Import the dataset
The following commands download the dataset and read it as a CSV file.

```python
import os, wget
import pandas as pd

filename = 'energy_dataset.csv'
base_url = 'https://github.com/IBM/watson-machine-learning-samples/raw/refs/heads/master/cloud/data/energy/'

if not os.path.isfile(filename):
    wget.download(base_url + filename)

df = pd.read_csv(filename)
```
To get more information about the dataset, execute:
```python
df.describe()  # Describe the data

df.tail()      # Show the last few rows of the dataset
```
Split the data
The selected model ibm/granite-ttm-512-96-r2 accepts an input context of exactly 512 data points. The dataset will therefore be split into a historical slice of 512 rows used as model input, while the final 96 rows are held out to check the predictions against actual values.
To split the data:
```python
timestamp_column = "time"
target_column = "total load actual"
context_length = 608
future_context = 96

# Hold-out: the last 96 rows, used to check the predictions.
future_data = df.iloc[-future_context:]
# Historical input: the 512 rows immediately before the final 96.
data = df.iloc[-context_length:-future_context]
```
Visualize the data
You can visualize the data using the following code:
```python
import matplotlib.pyplot as plt
import numpy as np

plt.figure(figsize=(10, 2))
plt.plot(np.asarray(data[timestamp_column], dtype='datetime64[s]'), data[target_column])
plt.title("Actual Total Load")
plt.show()
```
4. Use with Time Series Foundation Models in watsonx.ai
To use this dataset with the available Time Series Foundation Models in watsonx.ai, follow these instructions:
List available models
You can list the available models as follows:
```python
for model in client.foundation_models.get_time_series_model_specs()["resources"]:
    print('--------------------------------------------------')
    print(f'model_id: {model["model_id"]}')
    print(f'functions: {model["functions"]}')
    print(f'long_description: {model["long_description"]}')
    print(f'label: {model["label"]}')
```
Defining the model
Specify the model_id that will be used for inferencing:
```python
ts_model_id = client.foundation_models.TimeSeriesModels.GRANITE_TTM_512_96_R2
```
Initialize the TSModelInference class. TSModelInference wraps a time series foundation model in watsonx.ai and exposes its forecasting interface.
```python
from ibm_watsonx_ai.foundation_models import TSModelInference

ts_model = TSModelInference(
    model_id=ts_model_id,
    api_client=client
)
```
Defining the model parameters
Provide a set of model parameters that will influence the result:
```python
from ibm_watsonx_ai.foundation_models.schema import TSForecastParameters

forecasting_params = TSForecastParameters(
    timestamp_column=timestamp_column,
    freq="1h",
    target_columns=[target_column],
)
```
Forecasting
Call the forecast() method to predict electricity usage.
```python
results = ts_model.forecast(data=data, params=forecasting_params)['results'][0]
```
Plot predictions along with the historical data
To compare the historical data with the predictions, plot them together:
```python
plt.figure(figsize=(10, 2))
plt.plot(np.asarray(data[timestamp_column], dtype='datetime64[s]'), data[target_column], label="Historical data")
plt.plot(np.asarray(results[timestamp_column], dtype='datetime64[s]'), results[target_column], label="Predicted")
plt.plot(np.asarray(future_data[timestamp_column], dtype='datetime64[s]'), future_data[target_column], label="True", linestyle='dashed')
plt.legend(loc='center left', bbox_to_anchor=(1, 0.5))
plt.show()
```
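Beyond the visual comparison, the held-out values allow a simple numeric check of forecast accuracy. A sketch using NumPy with stand-in arrays; in the notebook you would pass the 96 predicted values (results[target_column]) and the 96 actual values (future_data[target_column]) instead:

```python
import numpy as np

# Stand-in arrays; replace with the predicted and held-out actual values.
predicted = np.array([25100.0, 25400.0, 25900.0, 26300.0])
actual    = np.array([25000.0, 25500.0, 26000.0, 26200.0])

mae = np.mean(np.abs(predicted - actual))                     # mean absolute error
mape = np.mean(np.abs((predicted - actual) / actual)) * 100   # mean absolute % error
print(f"MAE: {mae:.1f}  MAPE: {mape:.2f}%")
```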
Next steps
Now that you have successfully made a simple request with the watsonx.ai Time Series API and SDK, learn about the other features and capabilities of the product.
Experiment with combining the time series capabilities with Text generation, Chat, or Tool calling to create an end-to-end solution.