My colleague Praveen Devarao has posted a blog on programming in Node.js with DB2.
Data Application Development
Mario Briggs | Tags: ruby-on-rails, golang, sqlalchemy, node.js, db2, django, python, go, ruby
Jan 1, 2015 marked a release of the ibm_db Python module that completed the simplification of the developer experience for DB2 from Node.js, Ruby and Python. For example, if you used MySQL from these languages, a single npm install/gem install/pip install command was all that was required, whereas for DB2 (and our other enterprise friends) it was prerequisites first, including things like setting LD_LIBRARY_PATH, ugh! With the latest version of the ibm_db module in these 3 languages, it is just as simple now, and one does not have to bother about prerequisites and linking stuff.
Give these packages a try; any feedback is welcome. Most satisfying was the work on the ibm_db Node.js package on Windows: with the package manager having no way to ship binaries and no MinGW support, things get hard, since users are not going to have a VC++ compiler on their machine.
Golang: the db2cli package has been out there for a while thanks to great work from Patrick Higgins, and it is what we will collaborate on for DB2 connectivity from Go. With Bluemix now having a Go runtime by default, here's a dashDB sample with Go. Finally, Avinash will be looking into whether we can do the same cool enhancement described above for Go. It seems challenging, but we are looking, and any pointers from folks with experience with Go in this area are welcome.
- If you are using Mac OS, our support for that platform is still awaiting this cool enhancement (you still need to install DB2 Express-C), and the same goes for the ibm_db gem on Windows.
- I did just now see node-pre-gyp, so that is something to check out in case we need to consolidate on it.
Auto-reconfiguration, shading, buildpack configuration extensions, and the lack of control with PaaS runtimes.
That's quite a mouthful, and before I get into that story, let me provide the useful piece of information up front: spring-cloud now supports dashDB & SQLDB out of the box. Here are a couple of end-user tutorials on using it.
With that out of the way: a few weeks back I got the itch to add spring-cloud support for dashDB and SQLDB. It evolved like this. I wanted to see how a Scala application could connect to dashDB on IBM Bluemix. It turned out I can't push Scala code directly into a Bluemix runtime; rather, I needed to generate Java bytecode and push that up. Looking around, the Play framework seemed a good bet: 'activator dist' produced a jar that could be pushed up to IBM Bluemix, and the java_buildpack knows how to handle this jar. That seemed cool enough that I decided to try it out. Now, how does Play handle the database configuration? It lives in conf/application.conf as properties, the most important of all being 'db.default.url'. So how do I get my Bluemix application's VCAP_SERVICES values into that? The java_buildpack pulls in a few spring-cloud jar files (I learned this later) that provide 'auto-reconfiguration': it turns your VCAP_SERVICES database-related configuration into environment variables named like 'cloud.services.<serviceInstance_name>.connection.jdbcurl' that you can reference inside your application.conf, and then your Play app is connected to your database. Spring-cloud did not yet have this support for dashDB, and that's where the itch originated.
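To make that concrete, here is a minimal Java sketch of the end state this enables: look up the jdbcurl value that auto-reconfiguration exposes and open a connection with it. The service instance name ('dashdb-instance') and the URL shape in the comment are illustrative assumptions, and since the value may be surfaced as a Java system property or as an environment variable, the sketch checks both. In a Play app you would normally just reference the same key from application.conf rather than look it up in code.

```java
import java.sql.Connection;
import java.sql.DriverManager;

public class DashDbLookupSketch {
    public static void main(String[] args) throws Exception {
        // Hypothetical service instance name; substitute your own.
        String key = "cloud.services.dashdb-instance.connection.jdbcurl";

        // Auto-reconfiguration is expected to expose the value under this key;
        // check both system properties and the process environment.
        String url = System.getProperty(key);
        if (url == null) {
            url = System.getenv(key);
        }

        // A dashDB URL would look roughly like:
        // jdbc:db2://<host>:50000/BLUDB:user=<user>;password=<password>;
        System.out.println("Resolved jdbcurl: " + url);

        try (Connection conn = DriverManager.getConnection(url)) {
            System.out.println("Connected to: " + conn.getMetaData().getDatabaseProductName());
        }
    }
}
```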
Now, on to figuring out where this code needed to be added. The helpful folks on the java_buildpack told me it belonged in spring-cloud and not in their code, which left me scratching my head at first. @cgfrost provided some useful links, following which I ended up at the spring-cloud-connectors GitHub repo and could see files there related to mysql/postgresql, so I knew I had reached the place where I needed to add the code. I issued a PR that was accepted, and so on to the next question: how do I test these changes in their real home, i.e. a Play app on Bluemix? @scottfrederick made me aware of the spring-snapshot repo, and it was cool that a build containing my changes was already available. Now, how do I get the Play app or the java_buildpack to use these hot-off-the-oven snapshot jars? @cgfrost pointed me to using my own index.yml and environment variables... scratching my head again. I needed something simpler to start with, so I searched cloudfoundry-samples on GitHub and found this hello-world app and its accompanying blog: just what I needed. In this one the Maven build simply adds the spring-cloud jars to the lib folder of the war, and I was on my way. So I cloned the repo, pointed the pom.xml to the spring-snapshot repo, produced the war (oh, I forgot: I needed to add db2jcc.jar to the lib folder, since it is not yet in Maven), and pushed the app. I exercised the UI and found that a DataSource had been injected and was connected to dashDB. I print out the Java env variables to see if 'cloud.services.<dashDB_Instance_name>.connection.jdbcurl' is there so I can call it game, set and match, but it isn't there.
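The "print out the Java env variables" check above amounts to something like the following sketch: dump every entry visible to the JVM whose name starts with 'cloud.services.', from both system properties and the process environment, so the logs show whether the expected jdbcurl key ever appeared. The prefix filter and the decision to also echo VCAP_SERVICES are the only assumptions here.

```java
import java.util.Map;
import java.util.Properties;

public class CloudPropertyDump {
    public static void main(String[] args) {
        // System properties the auto-reconfiguration machinery may have set.
        Properties props = System.getProperties();
        for (String name : props.stringPropertyNames()) {
            if (name.startsWith("cloud.services.")) {
                System.out.println("property  " + name + " = " + props.getProperty(name));
            }
        }

        // Environment variables, in case the values are surfaced that way instead.
        for (Map.Entry<String, String> e : System.getenv().entrySet()) {
            if (e.getKey().startsWith("cloud.services.") || e.getKey().equals("VCAP_SERVICES")) {
                System.out.println("env       " + e.getKey() + " = " + e.getValue());
            }
        }
    }
}
```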
This meant I needed to go back and dive into extending the java_buildpack using index.yml, as @cgfrost had mentioned. Looking at the logs of the java_buildpack staging process, I saw this message: 'Downloading Play Framework Auto Reconfiguration 1.7.0_RELEASE from https://download.run.pivotal.io/auto-reconfiguration/auto-reconfiguration-1.7.0_RELEASE.jar'. So I pulled down that jar and saw that org.springframework.cloud.xxx exists inside it (I noticed, though, that the package structure inside is org.cloudfoundry.reconfiguration): this was the jar, and I needed to produce my own version of it. Now I understood @cgfrost's comments better, and re-reading the docs I found 'play_framework_auto_reconfiguration.yml' and understood that I needed to create a 'JBP_CONFIG_PLAY_FRAMEWORK_AUTO_RECONFIGURATION' environment variable in my app and point it at a web URL where my index.yml and my version of auto-reconfiguration-xxx.jar are uploaded. I then found the java_buildpack_auto-reconfiguration project on GitHub, cloned it, edited its pom.xml to pull the spring-cloud jars from the spring-snapshot repo, built my version of auto-reconfiguration-xxx.jar, and pushed that up as a Bluemix app along with an index.yml. I then pushed my original Play app to Bluemix with the JBP_CONFIG_PLAY_FRAMEWORK_AUTO_RECONFIGURATION variable defined appropriately, and the staging logs did provide proof that it was now using my auto-reconfiguration jar. Great. However, my app was still not working; the dashDB connection was failing with this message:
'com.ibm.db2.jcc.am.DisconnectNonTransientConnectionException: [jcc][t4][4.20.27] A communication error occurred during operations on the connection's underlying socket, socket input stream, or socket output stream. Error location: Reply.fill() - socketInputStream.read (-1). Message: Connection reset. ERRORCODE=-4499, SQLSTATE=08001'.
Obviously the 'cloud.services.<dashDB_Instance_name>.connection.jdbcurl' env var's value was not right.
So how do I debug this, or in other words, find out what value was actually being used? Believe it or not, in this stack there isn't a way to do it. The Play framework does not print out the value it is using. I could turn on dashDB JDBC driver logging and it would log the value it received, but I would not be able to access or view that driver log file on Bluemix, since log files on Bluemix are not accessible until the app has started properly, and in this case the app does not start properly because of the failed database connection. I guess this problem needs fixing, else one is stuck (this is also a reason one might want to consider Bluemix Containers or VMs over runtimes). Fortunately for me, I sit alongside the dashDB JDBC driver team in the office, and I asked them to provide a version of the jar that wrote the value of the URL it received to the console. Without this, I wouldn't have been able to find out what the problem was.
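I was lucky to have the driver team nearby; for anyone stuck at the same point, the same effect can be approximated generically with a thin java.sql.Driver wrapper that prints every URL it is asked to open before delegating to the real DB2 JCC driver. This is only a sketch of the idea, not the debug jar the team built for me: it assumes com.ibm.db2.jcc.DB2Driver is on the classpath and that the connection pool can be made to go through the wrapper (for example via Play's db.default.driver setting).

```java
import java.sql.Connection;
import java.sql.Driver;
import java.sql.DriverManager;
import java.sql.DriverPropertyInfo;
import java.sql.SQLException;
import java.util.Properties;
import java.util.logging.Logger;

/**
 * Thin wrapper around the DB2 JCC driver that writes every JDBC URL it
 * receives to stdout before delegating. Because the vendor driver also
 * registers itself with DriverManager, configure the pool/framework to use
 * this class directly so the wrapper is the one that gets called.
 */
public class UrlLoggingDb2Driver implements Driver {

    static {
        try {
            DriverManager.registerDriver(new UrlLoggingDb2Driver());
        } catch (SQLException e) {
            throw new ExceptionInInitializerError(e);
        }
    }

    private final Driver delegate = new com.ibm.db2.jcc.DB2Driver();

    @Override
    public Connection connect(String url, Properties info) throws SQLException {
        // Stdout ends up in the Bluemix staging/app logs even when the
        // connection attempt itself fails, which is the whole point here.
        System.out.println("JDBC url received by driver: " + url);
        return delegate.connect(url, info);
    }

    @Override
    public boolean acceptsURL(String url) throws SQLException {
        return delegate.acceptsURL(url);
    }

    @Override
    public DriverPropertyInfo[] getPropertyInfo(String url, Properties info) throws SQLException {
        return delegate.getPropertyInfo(url, info);
    }

    @Override
    public int getMajorVersion() {
        return delegate.getMajorVersion();
    }

    @Override
    public int getMinorVersion() {
        return delegate.getMinorVersion();
    }

    @Override
    public boolean jdbcCompliant() {
        return delegate.jdbcCompliant();
    }

    @Override
    public Logger getParentLogger() {
        return Logger.getLogger(UrlLoggingDb2Driver.class.getName());
    }
}
```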
It turned out I needed this. But before that, how do I make that change and test it on Bluemix without waiting for a new jar in the spring-snapshot repo? I produced my local version of spring-cloud-core, pushed it to my local Maven repo, and changed java_buildpack_auto-reconfiguration's pom.xml to pull from my local repo. You might ask why all this trouble; why not simply drop the updated .class file into the existing jar? Problem again: java_buildpack_auto-reconfiguration produces a jar in which org.springframework.cloud is rewritten into org.cloudfoundry.reconfiguration.org.springframework.cloud via some magic called shading. My build only ever produces classes in the former package, never the latter, and since a class's fully qualified name includes its package, an unshaded class is simply a different class from its relocated copy, so I can't just drop it into the existing jar. I learned this the hard way, through many trial-and-error experiments. Finally, for whatever reason (maybe an issue on my side, maybe not), the shading process always pulled the spring-cloud-core jars from the spring-snapshot repo and not my local repo. This was another dead end, which I worked around by creating a new project in Eclipse with the package structure org.cloudfoundry.reconfiguration, putting all the spring-cloud-core .java files under it, changing their package names to reflect the shaded package name, and compiling them. Then I copied the required class file from this project into the reconfiguration jar. The app worked correctly now.
Phew, what an exercise scratching this itch turned out to be: learning about auto-reconfiguration from an insider point of view, shading, and java_buildpack configuration extension; then being foxed by the lack of control on Bluemix runtimes when there is an error before the app starts; and finally the mess I had to go through to produce shaded jars with my local changes. It all seems exactly as distributed as the world of cloud. Thanks to @cgfrost, @scottfrederick and my dashDB JDBC driver developer colleague Kollol Mishra, for without their help spring-cloud support for dashDB/SQLDB would still not be available.
I still didn't get why the env vars are not available when I print them out from inside my app.