Setting up a connection to Db2 Big SQL
You must set up a connection to Db2® Big SQL before users can connect to Db2 Big SQL from a client application. You can set up the connection by using either the NodePort or the Passthrough Secure Route method.
NodePort method
You can use the Db2 Big SQL NodePort service resource to connect JDBC or ODBC applications to Db2 Big SQL on IBM Cloud Pak® for Data. A NodePort-type Service resource exposes a service on a specific port on all nodes in the cluster. The port is specified in the Service resource's .spec.ports[*].nodePort field.
After you deploy Db2 Big SQL, you can find the NodePort by running the following commands.
Ensure that you source the environment variables before you run the commands in this task.
- Log in to Red Hat OpenShift Container Platform as a cluster administrator:
${OC_LOGIN}
Remember: OC_LOGIN is an alias for the oc login command, for example: oc login <OpenShift_URL>:<port>
- Change to the project where the Cloud Pak for Data control plane is installed:
oc project ${PROJECT_CPD_INST_OPERANDS}
- Obtain the NodePort number:
  - Identify the Db2 Big SQL instance ID:
    oc get cm -l component=db2bigsql -o custom-columns="Instance Id:{.data.instance_id},Instance Name:{.data.instance_name},Created:{.metadata.creationTimestamp}"
  - For a non-TLS port, run the following command:
    oc get -n ${PROJECT_CPD_INST_OPERANDS} services c-bigsql-<instance-id>-db2u-engn-svc -o jsonpath="{.spec.ports[?(@.name=='legacy-server')].nodePort}"
  - For a TLS port, run the following command:
    oc get -n ${PROJECT_CPD_INST_OPERANDS} services c-bigsql-<instance-id>-db2u-engn-svc -o jsonpath="{.spec.ports[?(@.name=='ssl-server')].nodePort}"
Users can now connect to Db2 Big SQL by using the NodePort from a client application. For example, the following JDBC URL shows how to connect to Db2 Big SQL on the non-TLS port:
jdbc:db2://<publicly exposed or load balancer node>:<non-TLS NodePort>/bigsql:user=<username>;password=<password>;
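As an illustration, a minimal Java client using this URL might look like the following sketch. The host name, NodePort value, class name, and credentials are hypothetical placeholders, and the IBM Data Server Driver for JDBC and SQLJ (for example, db2jcc4.jar) is assumed to be on the classpath.
// Minimal JDBC sketch for the non-TLS NodePort; all values are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.ResultSet;
import java.sql.Statement;

public class BigSqlNodePortClient {
    public static void main(String[] args) throws Exception {
        // <publicly exposed or load balancer node>:<non-TLS NodePort>
        String url = "jdbc:db2://worker1.example.com:32051/bigsql";
        try (Connection con = DriverManager.getConnection(url, "<username>", "<password>");
             Statement stmt = con.createStatement();
             ResultSet rs = stmt.executeQuery("SELECT CURRENT SERVER FROM SYSIBM.SYSDUMMY1")) {
            while (rs.next()) {
                System.out.println("Connected to " + rs.getString(1));
            }
        }
    }
}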
To connect a JDBC client using a TLS NodePort, you must copy the certificate into a truststore file and provide its details to the JDBC connection by completing the following steps:
- Download the SSL certificate from the Db2 Big SQL instance page in the IBM Cloud Pak for Data web interface.
- Copy the certificate to the client host and then import it into a truststore file:
keytool -import -keystore /var/jdbc-client.jks -file bigsql-ssl.cert -noprompt -storepass <truststore-password>
- Add the following properties to the JDBC connection URL:
  sslConnection=true
  sslTrustStoreLocation=/var/jdbc-client.jks
  sslTrustStorePassword=<truststore-password>
  For example:
  jdbc:db2://<FQDN>:<port>/bigsql:user=<username>;password=<password>;sslConnection=true;sslTrustStoreLocation=<path-to-truststore-file>;sslTrustStorePassword=<truststore-password>;
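As a sketch of how these properties are commonly supplied programmatically, the following hypothetical Java client passes them in a Properties object instead of appending them to the URL. The host name, TLS NodePort, class name, credentials, and truststore password are placeholders, and the IBM Data Server Driver for JDBC is assumed to be on the classpath.
// Minimal JDBC sketch for the TLS NodePort; all values are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class BigSqlTlsNodePortClient {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.setProperty("user", "<username>");
        props.setProperty("password", "<password>");
        props.setProperty("sslConnection", "true");
        props.setProperty("sslTrustStoreLocation", "/var/jdbc-client.jks");
        props.setProperty("sslTrustStorePassword", "<truststore-password>");

        // <FQDN>:<TLS NodePort>
        String url = "jdbc:db2://worker1.example.com:32052/bigsql";
        try (Connection con = DriverManager.getConnection(url, props)) {
            System.out.println("TLS connection established: " + !con.isClosed());
        }
    }
}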
Passthrough secure route method
A secure route has an advantage over a NodePort: it automatically uses the router that is deployed by default to your cluster and that functions as the ingress point for external network traffic. The router listens on the public host network interface, unlike the application pods, which listen only on private IPs. The router uses a service selector to find the service and the endpoints that back the service, and creates routes that expose services as hostnames to be used by external clients. For more information on how applications are exposed with routes, see Exposing apps with routes in the IBM® Cloud documentation.
To use this method, the connection must be made over SSL, by using the certificate that you download from the Db2 Big SQL instance page in the Cloud Pak for Data web interface.
Ensure that you source the environment variables before you run the commands in this task.
- Log in to Red Hat OpenShift Container Platform as a cluster administrator:
${OC_LOGIN}
Remember: OC_LOGIN is an alias for the oc login command.
- Change to the project where the Cloud Pak for Data control plane is installed:
oc project ${PROJECT_CPD_INST_OPERANDS}
- Identify the Db2 Big SQL instance ID:
oc get cm -l component=db2bigsql -o custom-columns="Instance Id:{.data.instance_id},Instance Name:{.data.instance_name},Created:{.metadata.creationTimestamp}"
- Create a new service for the secure route by running the following command:
cat << EOF | oc apply -f -
apiVersion: v1
kind: Service
metadata:
  name: bigsql-<instance-id>-jdbc-tls
spec:
  ports:
  - name: jdbc-tls
    port: 50001
    protocol: TCP
    targetPort: 50001
  selector:
    app: bigsql-<instance-id>
    name: dashmpp-head-0
  type: ClusterIP
EOF
- Create the secure route by running the following command:
oc create route passthrough bigsql-<instance-id>-jdbc-tls --service=bigsql-<instance-id>-jdbc-tls
- Confirm that the route was created:
oc get route bigsql-<instance-id>-jdbc-tls
This command should return a result such as:
NAME                            HOST/PORT                                                                                PATH   SERVICES                        PORT       TERMINATION   WILDCARD
bigsql-<instance-id>-jdbc-tls   bigsql-<instance-id>-jdbc-tls-<Project>.apps.<publicly exposed or load balancer node>          bigsql-<instance-id>-jdbc-tls   jdbc-tls   passthrough   None
Users can now connect to Db2 Big SQL by using the secure route from a client application. For example, the following JDBC URL shows how to connect to Db2 Big SQL by using the route host name that is shown in the previous step:
jdbc:db2://bigsql-<instance-id>-jdbc-tls-<Project>.apps.<publicly exposed or load balancer node>:<port>/bigsql:user=<username>;password=<password>;sslConnection=true;sslTrustStoreLocation=<path-to-truststore>;sslTrustStorePassword=<truststore-password>;enableClientAffinitiesList=1;enableSeamlessFailover=true;clientRerouteAlternatePortNumber=443;clientRerouteAlternateServerName=bigsql-<instance-id>-jdbc-tls-<Project>.apps.<publicly exposed or load balancer node>;
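To illustrate, a hypothetical Java client that connects through the passthrough route might set the same properties programmatically. The route host name, class name, credentials, and truststore values are placeholders; the route is assumed to be reached on port 443, and the IBM Data Server Driver for JDBC is assumed to be on the classpath.
// Minimal JDBC sketch for the passthrough secure route; all values are placeholders.
import java.sql.Connection;
import java.sql.DriverManager;
import java.util.Properties;

public class BigSqlSecureRouteClient {
    public static void main(String[] args) throws Exception {
        // Route host name from the HOST/PORT column of the oc get route output
        String host = "bigsql-<instance-id>-jdbc-tls-<Project>.apps.example.com";

        Properties props = new Properties();
        props.setProperty("user", "<username>");
        props.setProperty("password", "<password>");
        props.setProperty("sslConnection", "true");
        props.setProperty("sslTrustStoreLocation", "/var/jdbc-client.jks");
        props.setProperty("sslTrustStorePassword", "<truststore-password>");
        props.setProperty("enableClientAffinitiesList", "1");
        props.setProperty("enableSeamlessFailover", "true");
        props.setProperty("clientRerouteAlternateServerName", host);
        props.setProperty("clientRerouteAlternatePortNumber", "443");

        try (Connection con = DriverManager.getConnection("jdbc:db2://" + host + ":443/bigsql", props)) {
            System.out.println("Connected through the secure route.");
        }
    }
}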
In addition to the router-wide timeouts, the timeout on the Db2 Big SQL JDBC secure route can be customized by using a route annotation, as described in Configuring route timeouts in the Red Hat OpenShift documentation.
Although they are alternative ingress mechanisms, the NodePort and Passthrough Secure Route methods can coexist and be used interchangeably.