Securing Destinations with TLS in Bluemix Secure Gateway


In our previous tutorial, we showed how you can connect to an on-premises MySQL database using the Secure Gateway Service. We created a destination with No TLS. While this is acceptable for public destinations, we may want to secure our destination so that only we can access it. In this tutorial, we will set up a destination with TLS: Mutual Auth so that a private key and certificate are required to connect to our destination. We will also provide an example of how to use this destination in a Node.js Bluemix app.

We will use the same setup we previously did, connecting to a MySQL database on the backend of our hypothetical company, ACME.

Creating Our Destination

First, begin by creating a gateway and getting the client connected as shown in the previous tutorial, or use the previously created gateway. We can then create a new mutual auth destination by inputting a name for this destination, the IP or hostname and port of our database, selecting TLS: Mutual Auth, and checking the “Auto generate cert and private key” box. Instead of auto generating the certificate and private key, we could upload our own certificate.

Once we have added this destination, we will be provided the cloud host and port. We can also now download a zip file containing the certificates and keys we need to connect to our destination by clicking the gear icon and selecting “Download Keys”.

Now that we have connected our gateway, created our mutual auth destination, and downloaded the certificate package, we are ready to connect to the destination.

Connecting to Our Destination via a Bluemix App

We can start by creating a Bluemix Node.js runtime app and downloading the starter code. This will give us an Express app with one route that points to the sample index page. We will convert this route into an API call that returns the result of some SQL statement. To accomplish this, we will create a module that creates our MySQL connection, a module that creates a no-TLS to TLS tunnel to attach our certificates and keys to our connection, and the route that will utilize these modules upon a request. An example of the app we are building can be found in the project securegateway-mutualauth-sample.

Creating our MyQuery Module

Our MyQuery module will be a simple module with one exported function, run. Run will take in a SQL statement, the port to connect to, and a callback function. The callback function will get executed once the query is complete, passing either an error or the result of the SQL query. The code below uses the mysql node module to connect using the database credentials and the host and port of the tunnel we will create in the next section.

var mysql = require("mysql");

exports.run = function(sql, port, callback) {
  var connection = mysql.createConnection({
    host: "127.0.0.1", // connect through the local tunnel
    port: port,
    database: "dbName",
    user: "dbUser",
    password: "dbPassword"
  });
  connection.query({sql: sql, timeout: 5000}, function(err, rows, fields) {
    if (err) {
      callback(err);
    } else {
      callback(null, rows);
    }
    connection.end();
  });
};

Creating a No TLS to TLS Tunnel

Since our destination is configured to use mutual auth, we need to attach the certificates and key when creating a connection or our request will be refused. In order to convert our request to TLS and attach our certificates, we create a tunnel that will attach the certificates for us. The code shown below allows us to create a tunnel at a specific port using the exported create function, and allows us to close the tunnel when we are done with it using the exported close function. It also keeps a reference count so that we do not close the tunnel while another connection is still using it.

The tunnel options that we need to include are the cloud host and port of our destination and the key, cert, and ca files. These files are included in the certificate package we downloaded earlier, so we can just extract the zip file and point to these files. The key file should be named <destination_id>_key.pem and the cert file should be named <destination_id>_cert.pem.
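As a quick sketch of that naming convention, the file names can be derived from the destination id (the id below is made up for illustration; yours appears in the Secure Gateway dashboard):

```javascript
// Hypothetical destination id for illustration only.
var destinationId = "AB1cd2EF";

// The downloaded package names the key and cert files after the destination id.
var keyFile = destinationId + "_key.pem";
var certFile = destinationId + "_cert.pem";

console.log(keyFile);  // AB1cd2EF_key.pem
console.log(certFile); // AB1cd2EF_cert.pem
```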

var tls = require('tls');
var fs = require('fs');
var net = require('net');

var options = {
  host: 'cloudHost',
  port: 'cloudPort',
  key: fs.readFileSync('keyFile'),
  cert: fs.readFileSync('certFile'),
  ca: fs.readFileSync('DigiCertCA2.pem'),
  rejectUnauthorized: true
};

var creations = 0;
var server;

// Create the tunnel server at the given port, or reuse it if it already exists.
exports.create = function(port, callback) {
  if (creations == 0) {
    server = net.createServer(function(conn) {
      connectFarside(conn, function(err, socket) {
        if (err) {
          conn.destroy();
          return;
        }
        // Pipe the plain connection through the TLS socket and back.
        socket.pipe(conn);
        conn.pipe(socket);
      });
    });
    server.listen(port, function() {
      creations++;
      callback();
    });
  } else {
    creations++;
    callback();
  }
};

// Open a TLS connection to the destination with our certificates attached.
function connectFarside(conn, callback) {
  try {
    var socket = tls.connect(options, function() {
      callback(null, socket);
    });
    socket.on('error', function(err) {
      callback(err);
    });
  } catch (err) {
    callback(err);
  }
}

// Close the tunnel only when no connection is still using it.
exports.close = function() {
  creations--;
  if (creations == 0) {
    server.close();
  }
};
Using These Modules in Our Route

To modify our route to use these modules, we can open app.js in the starter code. We then need to require the two modules we created so that we can use their exported functions. We can create a variable that will be our SQL statement to pass to our MyQuery module or we can get this statement from the request.

On each request, we will first create our tunnel at port 3001. Once the creation is complete, its callback function will be called, so we can then run our MyQuery module's run function to execute our SQL statement. We pass the run function the same port as the tunnel so that it knows where to connect. Once the run function is complete, we will send either the rows returned from the SQL query or the error that it hit, and close the tunnel. The code below shows how we can require our two modules and create a route that will execute their functions.

var mysqlScript = require('./myquery.js');
var tunnel = require('./tunnel.js');

app.get('/', function(req, res) {
  var sql = 'SELECT * FROM customers';
  tunnel.create(3001, function() {
    mysqlScript.run(sql, 3001, function(err, rows) {
      if (err) {
        res.send(err);
      } else {
        res.send(rows);
      }
      tunnel.close();
    });
  });
});
Now we should have everything we need to connect to our database. If we run our app locally, we can hit the route through our browser since it is a simple GET request. A successful call will return the rows of our SQL query as the response.

Creating a User Provided Service

Now we have an app that will allow us to connect to a mutual auth destination and query our database, but what if we wanted to externalize our database credentials, destination information, and certs/keys so that they are not hard coded in our app? We can utilize a user provided service in Bluemix to hold all this information.

To create a user provided service with the destination and database's information, we can run the cf cups command shown below and pipe in a JSON file that contains our key-value pairs. Our JSON file should look something like {"key":"<keyFileContents>", "cert":"<certFileContents>",...}. The sample includes userProvidedService.json as an example of what this JSON file should look like.
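For illustration, a minimal version of this JSON file might look like the following. All values here are placeholders; key, cert, and cacert hold the contents of the PEM files as strings, and the remaining fields carry the destination and database details our modules expect:

```json
{
  "host": "cloudHost",
  "port": "cloudPort",
  "database": "dbName",
  "user": "dbUser",
  "password": "dbPassword",
  "key": "<keyFileContents>",
  "cert": "<certFileContents>",
  "cacert": "<caFileContents>"
}
```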

cat <json_file> | xargs cf cups <service_name> -p

Then, we can bind the user provided service to our Bluemix app so we can access our credentials from VCAP_SERVICES. Once bound, the service will appear on our app's dashboard.

We can access VCAP_SERVICES from our app and use the credentials as shown below. If we had more than one user provided service bound to this app, we would need to check the service name to make sure we are grabbing the correct one.

var userProvided = {};

var vcap = JSON.parse(process.env.VCAP_SERVICES);
userProvided = vcap["user-provided"][0].credentials;

var options = {
  host: userProvided.host,
  port: userProvided.port,
  key: userProvided.key,
  cert: userProvided.cert,
  ca: userProvided.cacert,
  rejectUnauthorized: true
};
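With multiple bound services, we could select by name rather than by position. A minimal sketch, where the service name "mutual-auth-creds" and the sample payload are made up for illustration:

```javascript
// Sketch: pick a user-provided service's credentials by name.
// "mutual-auth-creds" and the sample VCAP structure below are illustrative only.
function findCredentials(vcapJson, serviceName) {
  var services = JSON.parse(vcapJson)["user-provided"] || [];
  for (var i = 0; i < services.length; i++) {
    if (services[i].name === serviceName) {
      return services[i].credentials;
    }
  }
  return null;
}

// Example with a made-up VCAP_SERVICES payload:
var sampleVcap = JSON.stringify({
  "user-provided": [
    { "name": "other-service", "credentials": { "port": 1234 } },
    { "name": "mutual-auth-creds", "credentials": { "host": "cloudHost", "port": 3306 } }
  ]
});

var creds = findCredentials(sampleVcap, "mutual-auth-creds");
console.log(creds.port); // 3306
```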

Once our app has been deployed and we have bound our user provided service, we can check that it is connecting to our database by going to the route we created where we should see the result of our SQL query. We have now successfully connected to our MySQL database in Bluemix through our TLS: Mutual Auth Destination.
