PostgreSQL connection
To access your data in PostgreSQL, create a connection asset for it.
PostgreSQL is an open-source, customizable object-relational database.
Supported versions
- PostgreSQL 15.0 and later
- PostgreSQL 14.0 and later
- PostgreSQL 13.0 and later
- PostgreSQL 12.0 and later
- PostgreSQL 11.0 and later
- PostgreSQL 10.1 and later
- PostgreSQL 9.6 and later
Create a connection to PostgreSQL
To create the connection asset, you need the following connection details:
- Database name
- Hostname or IP address
- Port number
- Username and password
- SSL certificate (if required by the database server)
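After you gather these details, you can optionally verify them outside the platform before you create the connection asset. The following is a minimal sketch in Python using the psycopg2 driver; the driver choice, hostname, port, database name, credentials, and certificate path are all placeholder assumptions rather than values supplied by the platform.

```python
# Minimal connectivity check for a PostgreSQL database.
# Every value below is a placeholder; substitute your own connection details.
import psycopg2

conn = psycopg2.connect(
    host="db.example.com",         # Hostname or IP address
    port=5432,                     # Port number
    dbname="mydatabase",           # Database name
    user="myuser",                 # Username
    password="mypassword",         # Password
    sslmode="verify-full",         # Only if the database server requires SSL
    sslrootcert="/path/to/ca.crt"  # SSL certificate, if required
)

with conn.cursor() as cur:
    cur.execute("SELECT version();")
    print(cur.fetchone()[0])

conn.close()
```

If the database server does not require SSL, omit the sslmode and sslrootcert arguments.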
Select Server proxy to access the PostgreSQL data source through a server proxy. Depending on its setup, a server proxy can provide load balancing, increased security, and privacy. The server proxy settings are independent of the authentication credentials and the personal or shared credentials selection. The server proxy settings cannot be stored in a vault.
- Proxy hostname or IP address: The proxy URL. For example, https://proxy.example.com.
- Server proxy port: The port number to connect to the proxy server. For example, 8080 or 8443.
- The Proxy username and Proxy password fields are optional.
For Credentials and Certificates, you can use secrets if a vault is configured for the platform and the service supports vaults. For information, see Using secrets from vaults in connections.
Choose the method for creating a connection based on where you are in the platform:
- In a project
- Click Assets > New asset > Prepare data > Connect to a data source. See Adding a connection to a project.
- In a catalog
- Click Add to catalog > Connection. See Adding a connection asset to a catalog.
- In a deployment space
- Click Import assets > Data access > Connection. See Adding data assets to a deployment space.
- In the Platform assets catalog
- Click New connection. See Adding platform connections.
Next step: Add data assets from the connection
Where you can use this connection
You can use PostgreSQL connections in the following workspaces and tools:
Projects
- AutoAI (Watson Machine Learning)
- Cognos Dashboards (Cognos Dashboards service)
- Decision Optimization (Watson Studio and Watson Machine Learning)
- Data quality rules (IBM Knowledge Catalog, IBM Knowledge Catalog Premium). See Supported data sources for curation and data quality.
- Data Refinery (Watson Studio, IBM Knowledge Catalog any edition)
- DataStage (DataStage service). For more information, see Connecting to a data source in DataStage.
- Metadata enrichment (IBM Knowledge Catalog any edition). See Supported data sources for curation and data quality.
- Metadata import (IBM Knowledge Catalog any edition). See Supported data sources for curation and data quality. For information about the supported product versions and other prerequisites when connections are based on MANTA Automated Data Lineage for IBM Cloud Pak for Data scanners, see the Lineage Scanner Configuration section in the MANTA Automated Data Lineage on IBM Cloud Pak for Data Installation and Usage Manual. This documentation is available at https://www.ibm.com/support/pages/node/6597457. For metadata import (lineage), MANTA Automated Data Lineage for IBM Cloud Pak for Data and a corresponding license key must be installed. See Installing MANTA Automated Data Lineage and Enabling lineage import.
- Notebooks (Watson Studio). Click Read data on the Code snippets pane to get the connection credentials and load the data into a data structure. For more information, see Load data from data source connections. An illustrative example follows this list.
- SPSS Modeler (SPSS Modeler service)
- Synthetic Data Generator (Synthetic Data Generator service)
- Watson Machine Learning Accelerator (Watson Machine Learning Accelerator service)
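For the Notebooks entry above, the Read data snippet loads the connection credentials and reads data into a data structure such as a pandas DataFrame. The sketch below is only a hand-written illustration of that pattern, assuming pandas and SQLAlchemy are available in the notebook environment; it is not the code that the tool generates, and the connection URL, schema, and table name are placeholders.

```python
# Illustrative only: load a table from a PostgreSQL connection into pandas.
# In a notebook, the Read data code snippet produces equivalent code for you;
# the values below are placeholders.
import pandas as pd
from sqlalchemy import create_engine

# Placeholder connection details (normally supplied by the connection asset)
engine = create_engine(
    "postgresql+psycopg2://myuser:mypassword@db.example.com:5432/mydatabase"
)

# Read a sample of rows into a DataFrame
df = pd.read_sql_query("SELECT * FROM myschema.mytable LIMIT 100", engine)
print(df.head())
```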
Catalogs
- Platform assets catalog
- Other catalogs (IBM Knowledge Catalog)
Data Product Hub
You can connect to this data source from Data Product Hub. For instructions, see Connectors for Data Product Hub.
Data Virtualization service
You can connect to this data source from Data Virtualization.
Federal Information Processing Standards (FIPS) compliance
This connection can be used on a FIPS-enabled cluster (FIPS tolerant); however, it is not FIPS-compliant.
PostgreSQL setup
Running SQL statements
To ensure that your SQL statements run correctly, refer to the SQL syntax reference in the PostgreSQL documentation.
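As a brief illustration, the following sketch runs one parameterized statement that uses the PostgreSQL-specific RETURNING clause through psycopg2. It assumes an open connection object like the one in the earlier example, and the schema, table, and column names are placeholders.

```python
# Illustrative sketch: run a parameterized PostgreSQL statement.
# Assumes "conn" is an open psycopg2 connection; all names are placeholders.
with conn.cursor() as cur:
    cur.execute(
        "INSERT INTO myschema.events (name, payload) VALUES (%s, %s) RETURNING id;",
        ("example-event", "example payload"),
    )
    new_id = cur.fetchone()[0]  # RETURNING makes the generated id available immediately
    conn.commit()
    print("Inserted row with id", new_id)
```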
Parent topic: Supported connections