DB2 Administration and Application Development on Cloud (BlueMix)
Finally, after 3 months of preparation, I have completed the DB2 9 Advanced DBA certification. It was my 2nd attempt. The first time I took this exam, I was under the impression that it would be similar to the exams I took earlier (DBA and Application Development), where I could just read the certification guide and clear it, but that was not the case here. When I attempted it the first time and failed by 4 marks, I realized I did need to read some concepts in much more detail, specifically HA and performance. HA and performance make up 52% of the exam, and you need a good hold on these topics to get a good score. So I started reading the complete performance and HA guides. While reading, I really felt that DB2 is not just SQL and some monitoring stuff; there is a lot more to it. It also gave me a feeling for how hard it can be to tune a database, and how difficult it would be for a DBA to tune a database which is huge in size. This encouraged me to read some more of the administration guides and go for the DB2 problem determination certification. And yes, that is my next goal for this year: another certification and a new way to see database tuning and problem determination.
In case you need some tips and questions, let me know. I will be happy to help anyone who likes DB2 administration.
As promised, I am putting up the questions the customer asked about DB2, so here are some of them.
1. One of the questions was on GTTs (Global Temporary Tables). As documented, GTTs are at the session level and will be flushed out once the session is closed. The question was: can we have 2 GTTs with the same name in 2 different stored procedures? I think it's possible, but it seems it may create a conflict when we try to call both stored procedures using the same connection. As GTTs are at the session/connection level, the GTT created in the second stored procedure may conflict with the existing GTT created in the first. I wonder if GTTs are flushed out as soon as we come out of the stored procedure execution. I still need to play and find out the correct answer. Your feedback is welcome.
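A minimal sketch of the scenario, with hypothetical procedure, table, and column names. My understanding is that a declared GTT lives in the SESSION schema for the whole connection, so a second DECLARE of the same name on the same connection fails unless WITH REPLACE is specified:

```sql
-- Hypothetical sketch: both procedures declare a GTT with the same name.
-- WITH REPLACE lets the second DECLARE on the same connection succeed;
-- without it, the DECLARE fails because SESSION.TEMP_ORDERS already exists.
CREATE PROCEDURE PROC_A ()
BEGIN
  DECLARE GLOBAL TEMPORARY TABLE SESSION.TEMP_ORDERS
    (ORDER_ID INT, AMOUNT DECIMAL(10,2))
    ON COMMIT PRESERVE ROWS NOT LOGGED WITH REPLACE;
END

CREATE PROCEDURE PROC_B ()
BEGIN
  DECLARE GLOBAL TEMPORARY TABLE SESSION.TEMP_ORDERS
    (ORDER_ID INT, AMOUNT DECIMAL(10,2))
    ON COMMIT PRESERVE ROWS NOT LOGGED WITH REPLACE;
END
```

Notably, the declared table is not dropped when the procedure returns: it lives until the connection closes (or an explicit DROP), which is why two procedures on the same connection can step on each other's data.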
2. The second question was on stored procedures. As we know, execute permission on the package is enough to call a stored procedure. Now, if one stored procedure calls another inside it, do we need to grant explicit execute permission on the inner stored procedure, or will giving permission on the outer stored procedure implicitly grant it on the inner one too? The concern here was that in their scenario they have a lot of nesting of stored procedures, and granting permission explicitly on each of the nested procedures is really cumbersome.
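If I understand DB2's model correctly (worth verifying against the documentation), for static CALLs the privilege check on the inner procedure is made against the definer of the outer one rather than the end user, so the grants would look roughly like this (schema, procedure, and user names are all hypothetical):

```sql
-- End users only need EXECUTE on the outermost procedure:
GRANT EXECUTE ON PROCEDURE APP.OUTER_PROC TO USER ENDUSER;

-- The owner/definer of OUTER_PROC needs EXECUTE on the procedure it calls:
GRANT EXECUTE ON PROCEDURE APP.INNER_PROC TO USER PROC_OWNER;
```

If that holds, granting EXECUTE on just the outer procedures (ideally to a role or group) would avoid the per-procedure grants the customer finds cumbersome.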
3. Oracle gives the flexibility to provide external hints to a SQL statement for optimization. According to them, these hints are really useful as they can force a query to use certain indexes. Their question was whether we have something similar. Yes, we do (optimization profiles and guidelines), but we never encourage using them: the DB2 optimizer is intelligent enough to decide on the indexes to be used, and user-provided hints may force the optimizer down a path that degrades performance.
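For completeness, DB2's equivalent is an optimization profile: an XML document registered with the database and referenced when the statement is compiled. A rough sketch only (the statement, table, and index names are made up, and the exact elements should be checked against the DB2 documentation):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<OPTPROFILE VERSION="9.1.0.0">
  <STMTPROFILE ID="force region index">
    <STMTKEY><![CDATA[SELECT * FROM SALES WHERE REGION = ?]]></STMTKEY>
    <OPTGUIDELINES>
      <!-- Ask the optimizer to use an index scan on this index -->
      <IXSCAN TABLE="SALES" INDEX="IDX_REGION"/>
    </OPTGUIDELINES>
  </STMTPROFILE>
</OPTPROFILE>
```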
4. Do we have compiler directives? I am not sure what they mean by this. There might be something like that in Oracle.
Your comments on these questions are welcome.
XML, XML and XML... most of the data in this world can be represented in XML format. But sometimes it is not beneficial to do that, especially when there is tight coupling with the data and the data really obeys strict rules. But when the data needs to be very flexible and you would like to store only the things which have some logical value, XML is the way. It will save you storage if there are a lot of properties/columns which have null values. In an RDBMS, NULL is a logical entity, while in the XML world there is nothing like NULL; a value either exists or it does not. While in an RDBMS the metadata is stored only once (one column name), in XML the metadata is repeated for every record. Then what is the benefit? There is no need to store metadata in a different location, which means no catalog tables. There are schemas which can do that job, but while an RDBMS can't work without catalog tables, XML can work without schemas.
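To illustrate with a made-up pair of customer records: an optional property is simply absent from the document instead of being stored as a NULL column.

```xml
<!-- Hypothetical records: the second customer has no fax, so the element
     is simply omitted; no NULL marker of any kind is stored. -->
<customer><name>Asha</name><fax>011-2345678</fax></customer>
<customer><name>Ravi</name></customer>
```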
Now coming to how XML fits within an RDBMS. An RDBMS stores data in a table format, and each column has strict data typing. So to store XML, either we need to split it into parts which have strong data types, or treat the full XML document as a binary or character stream. In the first case we need to create the metadata first and then split the data; in the second, we completely ignore the metadata associated with the XML document. We need a way to save the metadata while having no need to split it out of the XML, so the document retains its flexibility. That means storing the XML in such a way that it preserves its structure and at the same time fits into the RDBMS model. And here comes the innovative solution: pureXML.
DB2 pureXML is a technology which works with both relational and XML data under one umbrella. It can query both XML and relational data at the same time using the SQL/XML language. The benefit is that applications can now concentrate on business logic rather than worrying about how to handle the XML data coming their way. DB2 will take care of your XML data, from storing it and querying it to transforming and presenting it. DB2 treats XML as just another data type in its repository and provides you the functions to work with that data type.
XQuery is the language to query XML data, and for this new XML data type DB2 gives you the flexibility to query it using XQuery alone. At the same time, it integrates XQuery and SQL together through SQL/XML, so you can query a table and select relational columns where some condition in the XML is met. So far so good.
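As a small SQL/XML sketch (the table and the element names inside the XML are made up, not from any real schema): select relational columns where a condition inside the XML column holds, using XMLEXISTS with an embedded XPath expression:

```sql
-- Hypothetical table: CUSTOMERS(ID INT, NAME VARCHAR(50), INFO XML)
SELECT id, name
  FROM customers
 WHERE XMLEXISTS('$doc/customerinfo[addr/city = "Delhi"]'
                 PASSING info AS "doc");
```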
Last week I got a request from one of my colleagues to help her in deploying one of their applications to Bluemix. This application is an enterprise application, and I was given an EAR file and a readme file which just provided step-by-step instructions for installing the application on a local Liberty profile.
To start with, the task seemed easy, as we already had step-by-step instructions for a local Liberty profile and Bluemix uses the Liberty runtime. My understanding was that there wouldn't be many differences, except that I wouldn't need the Liberty installation (as I would get it on Bluemix). But as I moved on, I started seeing some challenges which are very particular to a cloud environment. These challenges changed my view and clarified some differences in how an application should be developed and deployed in a cloud environment vs. on premise.
The first step I took was to push the EAR file to Bluemix using cf push, and that worked smoothly. My web module was up and running within a couple of minutes. The next task was to have the database up and running and configured with the application. I added an SQLDB service and bound it to the application. Till now it was smooth sailing, but the real challenge came next: how to make sure the application is configured for and connects to the database, literally without changing the application code?
If I think of how we do it in a local environment, we put the details of the database (database name, user name, password, host name, port number, etc.) in the server.xml file of the Liberty profile and give the data source a JNDI name. This JNDI name is then referred to in the application to connect to the database. This server.xml file is part of the Liberty server, not the application. In a cloud environment we generally don't get access to the server, and hence have no control over the server.xml file of the default Liberty runtime on Bluemix. The step-by-step instructions given to me say to copy the server.xml file, zipped along with the application, to my server profile, but this is not possible when working with the cloud. The same problem goes for the database driver JARs: in a local installation we generally copy them to the server library, while in the cloud we would need to make them part of the application library (under WEB-INF/lib). Neither of these alternatives is possible with the standard Liberty buildpack and an EAR which cannot be changed.
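For reference, the kind of server.xml fragment involved looks roughly like this (the JNDI name, paths, and credentials are placeholders, not the real application's values):

```xml
<!-- Hypothetical fragment: a DB2 JDBC data source in Liberty's server.xml -->
<library id="DB2Lib">
  <fileset dir="${shared.resource.dir}/db2"
           includes="db2jcc4.jar db2jcc_license_cu.jar"/>
</library>

<dataSource id="db2ds" jndiName="jdbc/mydb">
  <jdbcDriver libraryRef="DB2Lib"/>
  <properties.db2.jcc databaseName="MYDB" serverName="db2host.example.com"
                      portNumber="50000" user="dbuser" password="dbpassword"/>
</dataSource>
```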
After gaining some understanding of buildpacks and how I can push changes to them, I found the solution to the problem. So here is what I did.
So what is this solution actually doing? When you run the server package command with the usr directory included, it zips the entire server runtime configuration and the application into one package. When you push this to Bluemix, Bluemix detects the server.xml file inside the zip, understands that the zip contains both the application and the runtime configuration, and overwrites the existing configuration file (server.xml) with the one you created and zipped.
The same solution works for the driver JAR files too. You can add the driver files to your local Liberty profile under the shared resources directory (LIBERTY_HOME/usr/shared/resources), and they will be zipped as part of your server package. Upon cf push, they become part of your customized buildpack, accessible to the application. When you do this, make sure that your server.xml file actually has the correct path for these JDBC driver JARs, otherwise Liberty will not be able to find them at the time of the connection request.
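The packaging and push steps above can be sketched as follows (the server and application names are placeholders, and the exact paths depend on your Liberty installation):

```shell
# Package the server configuration, shared resources, and application into one zip
cd $LIBERTY_HOME/bin
./server package myServer --include=usr

# Push the packaged server to Bluemix; the bundled server.xml is detected there
cf push myapp -p $LIBERTY_HOME/usr/servers/myServer/myServer.zip
```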
So this solved my problem: the application was deployed successfully, using my customized Liberty buildpack and the SQLDB service on the cloud.
This process can be used for anything that requires you to change your buildpack, including adding new Liberty features to your cloud Liberty buildpack.
So happy developing :)
In the last week, I had a chance to visit one academic institution and one corporate to evangelize Bluemix. I was really thrilled to see the kind of interest it is generating among the developer and student communities.
At ABES, around 60 faculty members, mostly from the IT and Computer Science departments, attended the session. One of my colleagues, Nihar, covered the basics of Bluemix, its architecture, and its benefits. I covered the practical session, where I showcased the various user interface components and how to develop a basic web application using the Eclipse plugin for Cloud Foundry. I also covered how a developer can move a local database to the cloud before deploying a web-based, database-centric application.
At HCL Technologies, around 50 developers attended the session. Another of my colleagues, Neeraj, covered the basic concepts, while I covered the hands-on part: Cloud Foundry concepts, how to develop the application, and how to deploy the database.
Some of the questions that came from these visits are listed below. While I am only listing the questions here, I will try to answer some of them in my future posts.
Overall the sessions were really good and interactive. There is a lot of interest in trying out Bluemix, and I am sure many of the attendees have already started exploring it.
Just wanted to share the link for my first Facebook application deployed on Bluemix. It just has a login facility using a Facebook account and a small welcome page.
It took me just a couple of hours to build it, and that's when I had never written any FB app in the past and have only very basic knowledge of JavaScript.
Soon I will share my experience of developing this application with you. So stay tuned, as always.
As I promised, today I will talk about how to reverse engineer your existing database to extract the required DDL and deploy it on the cloud. For the folks who have worked with InfoSphere Data Architect, this is just a 10-minute job, and for the people who have not, add another 10 minutes for learning :). Yes, only another 10 minutes.
So let's get started. As a prerequisite, download and install the InfoSphere Data Architect product.
Below is the step-by-step process.
So now you know how you can reverse engineer the local database on your system and deploy it on the cloud. So happy reverse engineering, and do let me know if you face any issues following the steps.
The next experiment I am planning is to create a login page for my application which takes advantage of Facebook login instead of creating my own implementation. So stay tuned and enjoy BlueMixing :).
Today I started reading the book "Database Administration: The Complete Guide to Practices and Procedures". I started with the 3rd chapter, which talks about data modeling. I am not sure if data modeling falls into the tasks a DBA should perform. At the same time, as it is a one-time activity, having an independent team do the data modeling may leave that team with no work once the database is implemented. With so many tools available to model data, is data modeling really that difficult? The book says an efficient data model will allow you to minimize data redundancy, maintain integrity and consistency, and enable better data sharing and access, and that while modeling you should consider how the data may be used in the future instead of only its current usage. But predicting how data will be used in the future is a really difficult task; with new technologies coming every day, the usage of the data can go any way. Predicting the future use correctly is what really makes a data model a good one (provided, of course, that the prediction is right). I think now I need to start looking for some articles which will give me insight into how to predict the different ways data may be accessed in the future. Do let me know if you know anything related.
Yesterday, I read a small article on DBA traits on Craig Mullins' blog. I agreed with what he mentioned in the article. Some traits he mentioned: organized and capable of succinct planning, adaptable, insatiably curious, and having some people skills too. He says, "DBAs are expected to know everything about everything -- at least in terms of how it works with databases". When I shared this article with some of my colleagues, I was a little surprised by the responses. One question that came out of the discussion was, "Do DBA certifications help in becoming a DBA?" Some say no, as a certification will never give you a feel for real-world problems. In my view, it depends on who is doing the certification. A person who has real work experience may see no value in doing the certification; however, for a person who never got a chance to see the real world, a DBA certification may help by at least giving a little insight into the kinds of problems DBAs face. It will at least enable them to see a problem in the right manner and approach it in the right direction. I will say certification is the first step to becoming a DBA, and a DBA will never want to go back to the first step. So the value of certification depends on the step where you are on the DBA ladder.
While I was in discussion with my colleagues, I was looking for a book which would give me a complete list of the tasks a DBA needs to perform. I found the book "Database Administration: The Complete Guide to Practices and Procedures", again written by Craig Mullins. Hoping I will find what I am looking for in this book.
Today while browsing developerWorks, I came across this tutorial, which explains how to set up your system to create a web application from the free software suite from IBM, i.e., DB2 Express-C, Eclipse, and WAS Community Edition. The tutorial explains how to install, configure, and integrate these components and start creating your application. I think this will be beneficial for students working on their projects, and at the same time for people who would like to learn, create a sample web application, and see the power of this suite. Here is what this tutorial covers in its two parts.
# Downloading and installing DB2 Express-C 9.5
# Creating databases and manipulating data with tools in DB2 Express-C 9.5
# Downloading and installing Application Server 2.0
# Managing Application Server through the Web console
# Connecting Application Server to DB2 Express-C 9.5 using a JCA 1.5 connector
# Downloading and installing Eclipse
# Installing the Eclipse Web Tools Platform (WTP) server adapter for Application Server (formerly called the Application Server plug-in for Eclipse)
# Managing, browsing, and editing DB2 Express-C 9.5 data through the Eclipse IDE
# Testing Web applications in Eclipse using existing Application Server installation
# Rapidly developing and testing a JSP/JSTL Web application in Eclipse, with data access to DB2 Express-C 9.5, and deploying it to Application Server
# Configuring Application Server as a general Web server on the Internet
And here is the link to the first part of the tutorial.
So enjoy reading, and create your web application for free.
Finally, I was able to successfully install the IBM Mashup starter kit with the help of Lauren. I came to know about this kit from Sreekanth's blog entry "Get it done quickly - Mashup". In that entry he mentioned that this kit helps in creating a dashboard with data from various sources with zero coding, so this looked interesting and I thought of giving it a try. After 2-3 days of struggle, and with the help of Stephen and Lauren, I was able to successfully install both QEDWiki and Mashup Hub, which are part of this kit. I still need to spend some time to play and experiment with it. The README provided with the kit says that you need to use the Express-C version of DB2 and Zend Core version 2.0.4 or later, but it works fine even with DB2 V9 ESE, though not with earlier versions of Zend Core.
I will write about my experience using this kit soon in another blog entry. Till then, enjoy reading.
Earlier today, I was preparing a presentation on DB2 pureXML. At the end of the presentation I wanted to put in some references, and came across these DB2 games. A nice idea to teach DB2 to people who are new to it. I find the detective game more interesting than the business one; it is easier, but interesting too. I think these will be very useful for teaching people DB2 and SQL/XQuery. Here are the links to download the games.
DB2 Business game
DB2 Detective game: http://www.ibm.com/developerworks/edu/dm-dw-dm-0402kubasta-i.html
Have fun learning DB2.
Recently I was facing a problem in one of my experiments with Bluemix, where I was trying to connect to a local database on my system from a Bluemix application. I was failing to do so, the reason being that my system was behind my organization's firewall. Bluemix, being on the open network (the internet), cannot access a network behind a firewall. Of course, this is exactly what most organizations intend, for security reasons and to make sure no unwanted access happens on their network.
Now that we are in the cloud world, where organizations are planning to run their applications on the cloud, one of the requirements will be communication between their applications on the cloud and the applications, tools, databases, etc. inside their premises. Organizations cannot do away with firewall security either. So what is the solution here?
The Cast Iron Live cloud integration add-on in Bluemix helps you achieve exactly this: communication in a secure manner without compromising your firewall security. Lately I have been playing with this service, and found it really worthwhile for the kind of integration most organizations require.
Cast Iron Live allows you to create a secure connector between a system behind a firewall and the Cast Iron service on Bluemix; this service then communicates with the application. So, in summary, Cast Iron Live creates a secure channel between the application and the system behind the firewall, in such a way that security is not compromised.
Overall, the process includes creating a Cast Iron service on Bluemix, installing the secure connector on the system and configuring it, creating endpoints and defining the inputs/outputs of the various endpoints using Cast Iron Studio, deploying it, and using the endpoints to access the system from your application. Cast Iron Studio supports various endpoints, including database (JDBC access), HTTP, HTTPS, etc. I am planning to cover each of these steps in my next blog entry. Till then, keep reading and experimenting with Bluemix and the various services available.
Lately I have been trying one of the services on Bluemix, Twilio, which allows you to send and receive SMS through your application. For my on-demand car pooling application, I am planning to implement OTP authentication by sending a one-time password to the user to validate their mobile number.
Have you ever done such a thing, or have you worked with Twilio in the past? Do share your experience; it will surely help me implement the same quickly.
Waiting to hear from you. In case I make some progress or learn things on this topic, I will surely share it with you in my next blog entry.
At present I am following this article to make use of Twilio.
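As a rough sketch of what sending the OTP over Twilio's REST API looks like (the account SID, auth token, phone numbers, and OTP below are all placeholders; check Twilio's documentation for the current API details):

```shell
# Send a one-time password by SMS via Twilio's Messages endpoint
curl -X POST "https://api.twilio.com/2010-04-01/Accounts/$TWILIO_SID/Messages.json" \
     -u "$TWILIO_SID:$TWILIO_AUTH_TOKEN" \
     --data-urlencode "From=+15005550006" \
     --data-urlencode "To=+919800000000" \
     --data-urlencode "Body=Your one-time password is 482913"
```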
I am sure you might have been wondering where I disappeared to after writing so much on exploring Bluemix. My answer: I was making the transition from exploring Bluemix to developing with Bluemix. After so much learning with Bluemix, I decided to develop one real application to see how fast I could do it. The last month or so was spent on this; alongside my office work, I was spending around 4-5 hours a week on it. The application I was developing is for on-demand car pooling. So here is my experience.
So overall, it took me only about a week's worth of work to put this application together and make it live.
I am sure you can guess now how easy it is to put up a live website/portal using the Bluemix platform. Have you given it a try? If not, do try it out.
Any problem, do let me know and we can work together to solve it.