Writing to you now from outside of Rome at IOD EMEA, but I wanted to take some time to focus on some of my impressions of this year's IDUG NA in Tampa:
The first one is branding. It’s been a year since we renamed many of our Data Studio tools to Optim (and InfoSphere), partly as a result of acquiring Princeton Softech, which was using the name Optim. But there is still confusion. I spend time on this topic in my presentations, explaining that Data Studio is no longer the product family name. I explain that the term Data Studio now refers to the no-charge offering that supports DB2 (on LUW, z/OS, and i5) as well as IDS and that there is no such thing as a "set of Data Studio Tools." Branding is (obviously) a very powerful thing, and a branding change can take a long time to absorb. Kathy tells me that we recently updated our packaging article on developerWorks, so if you are still confused, you may want to check it out. It lays out the high level functionality, names, etc. for the ‘portfolio previously known as Data Studio.’
Secondly, I was surprised to hear the number of customers that are using .NET to access DB2 for z/OS. I attended a Speed-SIG focused on development in Java and .NET, and all of the customers attending said that they have .NET apps accessing DB2 for z/OS. I knew that this existed, but the high frequency did surprise me. Of course, it gave us a good chance to talk about pureQuery Runtime and how it can help .NET applications in much the same way as Java applications.
Of course, it was great to see many familiar faces and lots of new ones in Tampa. The IDUG board of directors was happy to see a 20% increase in attendance over last year's North America conference. And don’t forget, IDUG Europe is planned for November 8-12 in Vienna.
I'm excited to be co-presenting with one of our customers at IOD EMEA in May. Roberta Barnabe, a DB2 Specialist for UGF (Unipol Gruppo Finanziario) Assicurazioni, will be sharing her experiences with IBM Optim Query Workload Tuner for DB2 for z/OS. If you're interested in z/OS query tuning and want to hear from someone besides us IBMers, I encourage you to come hear what she has to say about:
- Statistics Advisor
- Dynamic statements analysis
- Access Plan Graph and Query Format
- What-if analysis
- Workload analysis
- Query Environment Capture
She ends with a summary statement that the tool provided an easier way to perform tuning analysis, and "BIG TIME SAVING !!!!!!"
I hope to see you there. Here are the particulars:
Session TSB-3313
Thursday, May 20
11:45 AM - 12:35 PM
Earlier this week, I was invited to attend and present at the Inaugural Iowa DB2 Users Group Semi-Annual Event in Des Moines. It was a very well-organized event and had great attendance. During one of the breaks, I had someone tell me, "I understand the vision of heterogeneous database tooling, but please don't divert your attention away from further work on the tooling in support of DB2, in exchange for, say, Oracle." It was a valid concern, since we spend a generous amount of time in our presentations talking about support for Oracle, SQL Server, Sybase... on the horizon. I tried to reassure her that we are still very focused on DB2, and that much of our support extending to other database platforms is done when we can leverage things like common SQL interfaces to the other platforms. I referred her to Curt Cotner's slide (included below – I also added Guardium for completeness) that he showed during the kickoff presentation as further evidence of this. I think she was convinced, or at least felt that I understood her concern.
Technorati Tags: optim
Well, I'm glad I got your attention... One of the things that I've been pondering lately is what percentage of DBAs manage more than one database platform. Since this is an IBM blog, how many DBAs manage both DB2 for Linux, UNIX, and Windows and DB2 for z/OS? I've heard that they exist, but maybe I haven't been asking, and y'all haven't been telling. Does your shop have any of these elusive creatures? If so, what platforms do they support? If not, why not? Does this model work well for you? Why or why not? Is this something that you'd like to see happen?
One of our objectives in creating heterogeneous database tooling is to help reduce the learning curve when using another database platform. While this is still a work in progress for parts of our portfolio, I'd sure like to know how important this is to DBAs.
So, please tell... Post your answer here, or on the IBM Optim Facebook fan page (click on the Fans tab if you don’t see it), or, if you’re shy, you can just send me an email at bfsmith at us.ibm.com.
Technorati Tags: smith, ibm
I'm on the plane today returning from two customer visits in the D.C. area where I was talking about the advantages of pureQuery in DB2 for z/OS environments for DBAs (as opposed to application developers). One uses WAS on z/OS and the other uses WAS on Linux for System z. Both customers were very familiar with how to manage and monitor CICS applications accessing DB2 data, and they felt out of control with Java application access to their mainframe. One customer even said, "we tried to tell our application teams to stop using Hibernate" because they couldn't see/review the SQL or know which application was running the dynamic SQL in their system. They said the application developers ignored them(!). I told them (hopefully, I wasn't too preachy) that they needed to stop resisting these contemporary application architectures (Java, .NET) and instead take back control by transforming those dynamic SQL statements to static SQL statements via pureQuery. I told them that if they continue the "no Java will access MY mainframe" stance, they are headed down a path of getting rid of their mainframe. The application development teams don't care what DBMS is used for their application, and if met with resistance from the z/OS team, they will simply deploy the applications on some other DBMS.
Related to this, I just watched this new short 7-minute video on pureQuery for DB2 for z/OS, and it's a great quick summary of the value you can get from pureQuery. In my mind, it's a no-brainer to implement pureQuery when you have Java or .NET applications accessing DB2 for z/OS.
If you want to get a good idea of what the process is like in more detail, check out the upcoming virtual tech briefing on this very topic. It's entitled pureQuery Deep Dive Part 1: Client optimization, scheduled for February 4th. Patrick Titzler, one of the authors of the original tutorial on this topic, and Chris Farrar, client optimization technical lead, will be presenting. It should be good.
My kids told me the other day that they can tell it's Fall because my calendar is full of travel for work. The good part about this travel is that there are lots of opportunities to meet with customers as well as other IBMers at two big conferences: IDUG Europe and Information on Demand (IOD).
IDUG Europe is in Rome this year, and the only bad part is that I can't stay in Italy longer! I have to leave Thursday morning so I can attend some Boy Scout leader training Fri-Sun. Hopefully, I can sneak out in the early morning and see some of the cool sights again, like the Forum. IOD is in... SURPRISE, Beautiful Las Vegas! OK, I guess that's no surprise. It's interesting to hear the reactions from folks on Las Vegas. It seems like a love-or-hate relationship. As artificial and overindulgent as Vegas is, I enjoy it. I enjoy the food (ask me for restaurant suggestions), the Fall weather there, the safety of walking along The Strip any time of the day or night, and most of all, the people watching. There's nothing better than getting a good seat with my favorite beverage and watching the crowd parade past.
OK, back to the topic of technical conferences. I have a friend who works as a developer for a DBMS competitor (no need to ask ;-)), and he's always jealous when he hears how much my fellow developers and I get to go and present at these conferences. At his company, they mostly let only product managers present to customers and leave the geeky developers back at the lab. There is very little that can substitute for face to face contact with the people who are working on the products. I'm glad that IBM lets us geeky folks out of our cages for these events. Conferences seem to be suffering from lower attendance (no surprise, considering the economy), but it's really unfortunate when you consider the value that can be gained. The conference web sites usually have an ROI justification to help with this, and I really think that the focus, the proximity to knowledgeable experts, and the hands-on labs really do justify the costs.
I just plotted out the sessions that I plan to attend at IDUG. I always like how IDUG has a high number of customer presentations; I enjoy listening to those. IOD also has a lot of customer talks this year. Curt mentioned some of them in his blog post.
Well, I hope to see you in Rome or in Vegas. In both places, I have a "new" session called, "Why are DB2 for z/OS DBAs interested in Data Studio and Optim Tooling?" (IDUG Rome: on Monday, Oct 5th, and IOD on Monday, Oct 26th -- as always, check the monitors for last minute gate changes when you arrive at the airport). I felt this tailoring of a session for DB2 for z/OS DBAs was needed because we have a lot of stuff labeled Optim, and not all of it is of interest to a DB2 for z/OS DBA as of today. I also take a no-glitz look at the free Data Studio offering and show exactly what it can be used for on a DB2 for z/OS system. Holger Karn will join me at the IOD session to talk about some performance monitoring futures that I know will get you excited. At IOD, I have another session with Jay Bruce called, "Query Tuning on IBM DB2 for z/OS" that discusses the task and shows tooling that can help you do it.
Hope to see you there!
Today's entry is inspired by a recent Dilbert cartoon where the pointy-haired boss tells Dilbert that he needs to get better at anticipating problems. While we'd all like to see problems before they happen, we need a little help here, and inspiring words from the pointy-haired boss just don't cut it.
Today's DBAs have a lot of responsibility; arguably more than they have had in the past in terms of the number of systems and the complexity of those systems. Most DBAs have implemented early detection mechanisms for production systems, but what about non-production or less-critical systems like development or test systems? These are often called "non-critical systems" until a severe issue occurs with them, and then they suddenly become critical because they are preventing new work from being implemented on schedule. Sometimes it may be difficult to justify the cost of robust monitoring software like DB2 Performance Expert, Tivoli OMEGAMON for DB2, or IBM Tivoli Monitoring for these so-labeled "less-critical" systems, so what's a proactive DBA to do?
One solution is the Data Studio Administration Console (DSAC). It is a no-charge offering with your data server license that supports DB2 for z/OS and DB2 for Linux, UNIX, and Windows with an "at-a-glance" view to see the health and availability of these systems. It is not a full-blown performance monitor, but it does show several key indicators like whether the system is up/down, locking rates, resource utilization, etc. In new news, although DSAC used to be the delivery vehicle for the Q Replication Dashboard, we have just made available a new and improved Q Replication Dashboard. One of our Gold Consultants, Frank Fillmore, will be discussing this dashboard in a webcast (two sessions to accommodate different timezones) with IBM on September 15. Get the details from his blog.
With this change, you might be asking what other changes are in store for DSAC. You may have heard us talking about our next generation performance manager. It has a new architecture along with a web browser interface that will support DB2 and eventually other DBMSs. Once we roll out this performance manager (be sure to attend IOD to find out more), we plan to use this new architecture for the next release of DSAC. It will still provide the same high-level health and availability capabilities that DSAC 1.2 provides today, but the Web user interface will be refreshed and have consistency with our other Web UI offerings.
So, don't let the pointy-haired boss get you down the next time they ask you to anticipate problems better -- just smile, thank them for their leadership, and go take a look at DSAC to prevent those critical situations.
I had two recent visits with customers where I was explaining pureQuery. When I finished what I thought was a nice polished presentation on the subject, both times someone said, "So, I have to use those pureQuery APIs in order to turn my dynamic SQL into static SQL." Ugh. You know that feeling where it seems like you must be speaking in a foreign language because the words just aren't being understood? I felt some relief when Rafael Coss told me that he gets this every time he explains pureQuery, and he has a great knack for making the complex seem simple.
Just in case you are also of the impression that the pureQuery APIs must replace existing JDBC, Spring, Hibernate, etc. calls to the database, the answer is no. The conversion from dynamic to static SQL using client optimization does not require any changes to your application. Plain old JDBC calls can remain in your programs, and with pureQuery Runtime, we can capture the SQL and statically bind it to DB2 (z/OS or LUW). This explanation usually creates the "Ah ha" moment.
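To picture how "capture without changing the application" can work at all, here's a toy sketch of the idea. This is purely my illustration, not the actual pureQuery Runtime internals, and every class name in it is made up: a wrapper layer records each SQL statement as it passes through, while the code that issues the SQL stays exactly as it was.

```java
import java.util.ArrayList;
import java.util.List;

// A stand-in for whatever actually runs the SQL (think: the JDBC driver).
interface SqlExecutor {
    void execute(String sql);
}

// The application's "plain old JDBC" path -- nothing here changes.
class PlainExecutor implements SqlExecutor {
    public void execute(String sql) {
        // In real life this would prepare and execute the statement.
    }
}

// A capture layer wraps the original executor: it records each SQL
// statement as it flows through, so the statement set can later be
// reviewed and bound statically -- without touching application code.
class CapturingExecutor implements SqlExecutor {
    private final SqlExecutor inner;
    final List<String> captured = new ArrayList<>();

    CapturingExecutor(SqlExecutor inner) { this.inner = inner; }

    public void execute(String sql) {
        captured.add(sql);   // record the statement text
        inner.execute(sql);  // then run it exactly as before
    }
}

public class CaptureSketch {
    public static void main(String[] args) {
        CapturingExecutor executor = new CapturingExecutor(new PlainExecutor());
        // The application issues SQL exactly as it always has:
        executor.execute("SELECT NAME FROM SYSIBM.SYSTABLES WHERE CREATOR = ?");
        System.out.println("Captured " + executor.captured.size() + " statement(s)");
        System.out.println(executor.captured.get(0));
    }
}
```

The point of the sketch is just the shape: because the capture layer sits under the same interface the application already calls, the application doesn't know or care that capture is happening.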
So, while pondering this, I have come up with a new way to explain pureQuery. I now plan to hold off introducing the APIs until after I finish talking about client optimization and the great capabilities you get when you use client optimization:
- How Optim Development Studio tooling provides the pureQuery Outline to visualize the relationships among Java code, SQL statements and database objects
- How SQL injection can be reduced/eliminated
- How framework-generated SQL can be reviewed and possibly tuned
- How SQL can be revised
- How non-captured SQL can be blocked from reaching the data server
(By the way, Patrick Titzler's tutorial, although a little dated, is still the best source I know of for understanding the process of client optimization.)
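As a concrete illustration of the SQL injection point above, here's a small sketch (my own example, not pureQuery code; the class and table names are invented) contrasting SQL built by string concatenation with SQL that uses a parameter marker:

```java
public class InjectionSketch {
    // Dynamic SQL built by concatenation: whatever the user types becomes
    // part of the statement text, so crafted input can change the query.
    static String concatenated(String userInput) {
        return "SELECT * FROM ORDERS WHERE CUSTOMER = '" + userInput + "'";
    }

    // With a parameter marker, the statement text is fixed; the input is
    // only ever treated as data, never as SQL.
    static String parameterized() {
        return "SELECT * FROM ORDERS WHERE CUSTOMER = ?";
    }

    public static void main(String[] args) {
        String hostile = "x' OR '1'='1";
        System.out.println(concatenated(hostile)); // statement text mutated
        System.out.println(parameterized());       // statement text fixed
    }
}
```

Once the statement set has been captured and bound statically, only the fixed, reviewed statement text can run, which is what closes the door on injected statement text reaching the server.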
Then, after a deep breath and a new title slide, I plan to talk more about ORM frameworks and our pureQuery APIs. (By the way, if you're curious about how pureQuery relates to ORM frameworks, check out Rafael's ChannelDB2 video.)
Hopefully the "Ah ha" moments won't get delayed any more. :-)
Thanks to those of you who responded to my previous blog asking for feedback on using Java on z/OS for your database tools. It was really helpful.
I need your help again.
We are having internal discussions about plans for the Data Studio administration console, a no-charge download that includes both a replication dashboard and high-level monitoring of database health and availability. It is the database health and availability monitor that I need your feedback on. I need to hear from both DB2 for LUW and DB2 for z/OS users, so don’t be shy!
If you aren’t familiar with the health and availability monitor, there's a good tutorial here. Just as a reminder, health and availability monitoring enables you to easily assess the high-level health of DB2 for LUW and DB2 for z/OS systems. It includes a health overview, which lets you look over a landscape of database servers, and a dashboard that lets you focus on a single server. In addition, there is a time slider that lets you view changes over time in both the dashboard and an alert history. Health and availability monitoring also includes quick analysis and suggests possible resolutions for many database server conditions and scenarios.
The intended purpose of the health and availability monitor is to serve as a first level, “at a glance” type monitor. It’s not intended to provide the deep diagnostics that a monitor like DB2 Performance Expert or Tivoli OMEGAMON XE Performance Expert for DB2 provides, but it can allow you to quickly glance over your systems and immediately spot whether one of your databases needs attention. Our thought was that customers would most likely want to use this in their test environments, as they may not want to invest heavily in monitoring non-production servers.
OK, so here are my questions (these aren’t formal survey-style questions, so feel free to improvise):
- If you have not installed Data Studio administration console for database health and availability monitoring, we’d like to know why not. Is it because you were not aware of it? Do you already own a monitor for all your environments, including test? If you are willing, it would be great if you could install it locally and give it a try. Then you can also be one of the people to give us feedback on it. :-)
- If you have installed it:
- Was it easy to install? Do you like the overall usability of it? What about quality? Were there features you found useful? Were there features you felt were lacking?
- Does your organization have a need for this type of "at-a-glance" monitor and have people who would benefit from using it? Or do you typically install a more sophisticated monitor on all your database servers, including test?
- Are you still using it? If not, why not?
Please take a few minutes to dash off an email with as much information as you can. Don’t forget to tell us a bit about your environment and if and how that influences your answers. I really appreciate your help.
You can send your feedback to dstudio at us.ibm.com, and I’ll be sure to get your responses.
-- Bryan Smith
Edited on 2/6/2009 to correct the query and to acknowledge that the existence of Java stored procedures may not necessarily mean you have the SDK. Thanks for keeping me honest, folks.
We are investigating implementing some server-side functions in our data tools that would run in a Java runtime on z/OS, and I would appreciate getting your feedback to help us with this planning work.
- Do you have the "IBM SDK for z/OS Java 2" installed on all of your z/OS systems where you would be running your IBM DB2 tools? If you're a DBA, you may need to ask your system programmer; or, if you know you have Java stored procedures, you are more likely to have it. If you're not sure whether you have any Java stored procedures, you can run this query:
SELECT SCHEMA, NAME, CREATEDBY, LANGUAGE, ROUTINETYPE, SPECIFICNAME, WLM_ENVIRONMENT
FROM SYSIBM.SYSROUTINES
WHERE LANGUAGE = 'JAVA';
- If yes, what level(s) of the SDK do you have installed?
- Do you have zAAPs on the LPARs you would run the tools on?
- If no (that is, you don't have the SDK installed), do you foresee any problems installing that as a prereq for a DB2 for z/OS tool product?
- Any comments about our using Java on z/OS?
You can send your feedback directly to me at bfsmith at us.ibm.com.
Thanks a lot!
--- Bryan Smith
Howdy! In case you missed it, we just announced a new release of HPU (High Performance Unload) for DB2 for LUW... V4.1. In case you've never looked at our HPU products (for DB2 for z/OS and DB2 for LUW), they can be great productivity enhancers and possibly even save you some resources.
One of the great things about HPU is that it has both a utility-like interface and an SQL interface. The SQL interface is perfect for application developers, since they aren't used to invoking utilities. Once invoked, HPU can access the underlying table space or backup/image copy directly, performing data type conversions and producing unload file formats suitable for almost any target data store. When extracting a high volume of data in this way, or by sampling the source, the elapsed time and CPU savings are humongous versus using SQL (or Export or DSNTIAUL).
HPU for DB2 for LUW is also partition-aware, allowing you to unload from multiple partitions with a single execution of HPU into a single output file/pipe or multiple files/pipes. It also provides a re-partitioning capability that unloads and re-partitions the output for new data distribution on the same or different system.
The hot new feature in the 4.1 release for DB2 for LUW adds the ability to migrate data directly (unloading, transferring, and loading) from one database to another without the need for intermediate disk storage. This capability delivers the fastest way to migrate your data. The new release also has other usability improvements and now supports Windows 64-bit platforms.
For more information on HPU, visit http://www.ibm.com/software/data/studio/high-performance-unload/
-- Bryan Smith
We had the Data Studio Customer Advisory Council today. One of our toughest customers gave Torsten a standing ovation upon completing his demo of the E2E database monitoring that is planned for delivery soon. I never saw Torsten blush before today.
One of the Toronto user-centered design guys, Rick, came up with a clever way (aka Vegas style) for the CAC members to vote on future functions... They were given poker chips to put into feature function cups. It was really clever.
I finally won tonight at the tables after the Rock the Mainframe party. -- Bryan Smith
Here are my impressions from Monday through Wednesday.
-- Holger Karn
- There is a lot of interest in the new end-to-end database technology we are adding to DB2 Performance Expert. We had 133 people in our session on Monday. That's more than we ever had in a PE session before.
- Torsten has been asked to do many demos of this functionality for customers during this week.
- As usual, you have to walk a lot during IOD. The conference building is huge, and the hotels are sometimes a good walking distance away. :-)
Curt almost missed his session yesterday. Someone called him 10 mins before the session to remind him -- he recovered quickly and was only 5 mins late. It was a full room, with heads bobbing up and down as he talked about problems with supporting Java applications.
Most folks liked Dana Carvey last year much better than Martin Short this year.
Listening now to Jim Pickel on DB2 Security.
Lost more money at craps and blackjack table. I'm now starting to feel this financial crisis that I keep hearing about on the news. -- Bryan Smith
I am sitting in the awesome developer den, where there is a roomful of colorful bean bags, along with 2 Wiis that they will be giving away at the end of the week! IOD attendees should stop by to enter the Wii contest, and there are many laptops set up with Data Studio developerWorks articles. Visitors can check out articles on Data Studio and its family of products and also ask the experts about Data Studio. We are located in Breakers G from 10am - 5pm.
Bad news: we had connectivity problems in the Data Studio for DB2 for z/OS labs. But we have a re-do on Thursday, and I hope everyone who really wanted to do this lab with a DB2 for z/OS server will come:
Session: HOL-2670B Data Studio and DB2 for z/OS
Time: Thu, 30/Oct, 02:00 PM - 05:00 PM
Location: Mandalay Bay South Convention Center - Lagoon F
-- Tina Chen
This is actually day 4 for me, as I spent the weekend with DB2 LUW customers attending the DB2 Customer Advisory Council, a group of DB2 customers that provides feedback to the DB2 Toronto team on upcoming releases and future strategies to help shape DB2 LUW. We had a three-hour window with these customers on Data Studio, and the feedback was tremendous. The big change I'm seeing from 6 months ago at IDUG is that customers are now downloading and using Data Studio throughout their developer communities. They're seeing the value Data Studio adds over and above what they get from either Developer Workbench or other 3rd-party development tools. Also, not only are they using it for their DB2 LUW environments, but several users indicated that they're using it for DB2 z/OS, as it enables their developers to easily build and debug SQL stored procedures for their DB2 z/OS environments. This is a big change from 6 months ago, when customers didn't even know Data Studio existed.
Yesterday I held a session called "Empowering DBAs with Data Studio"; the room was full, with standing room only. Goes to show that DBAs are always looking for the latest and greatest technology to manage their databases. I demonstrated the new Data Studio coming soon. This upcoming release has added a ton of functionality for the DBA, including utilities, commands, and more DDL management. The audience was very excited and really wants to see Data Studio become their tool for managing both DB2 LUW and DB2 z/OS databases. There's definitely a buzz around Data Studio at this conference.
Tonight I arranged a podcast with YL&A consultants and Curt Cotner on Data Studio. I'll let you know how that goes... -- Deb Jenson
Just a reminder that I'm here to take input on our Data Studio administration story.
Through the grapevine, I've been hearing of some commonly asked questions ever since we announced IBM Data Studio Administrator. I'll answer a few of the questions here, and will let Jeff Ruggles, our lead architect on Data Studio Administrator, tackle some of the more in-depth ones later.
Q: Will Data Studio tooling replace the DB2 Control Center?
A: Yes, eventually. DB2 Control Center has evolved over many releases, so it will take a while for the Data Studio tooling to fully encompass the functionality in DB2 Control Center, but it is certainly our intention to eventually replace DB2 Control Center with Data Studio tooling. The next major release of Data Studio Administrator, when combined with IBM Data Studio Developer, should provide for DB2 for Linux, UNIX, and Windows a majority of the functions that DB2 Control Center provides today.
Q: When will Data Studio Administrator support other databases (especially DB2 for z/OS)?
A: Data Studio's mission is to provide uniform tooling across all popular RDBMSs, and we want Data Studio Administrator to be the preferred tool for DBAs for general administration. Providing support for DB2 for z/OS is high on the priority list, but it isn't the highest, because we (IBM) already provide a very comprehensive solution with the IBM DB2 Administration Tool for z/OS and the IBM DB2 Object Comparison Tool for z/OS, products that have comparable function to Data Studio Administrator. So, stay tuned... and know that we're working hard to deliver Data Studio across all of your RDBMSs.
Late last month, I asked you DBAs to give me some feedback on our tools for administrators. I'm still looking for more! Thanks to Fred and Rahul both for your comments. Rahul, I believe your issues with DDL generation in Data Studio Developer are being addressed through the forum.
Anyway, since both comments were talking about DDL, it seemed to me like this was the perfect opportunity to talk about Data Studio Administrator, which we just announced on July 8th. Some of you may be familiar with this product under its previous name – DB2 Change Management Expert.
Data Studio Administrator is really designed from the ground up to handle complex database changes. Let’s face it, database changes can be tough. Understanding the dependencies, the authorizations, the effect on applications…. It’s a really big deal to get it right because the consequences of getting it wrong can range from annoying to devastating. The goal of Data Studio Administrator is to make it much easier for you to model the target database, compare two sets of objects to see where they differ, migrate a set of objects to the target, or redefine the target objects to be like the source. Changes automatically roll through all related objects, streamlining the entire change management process.
And, because we all make mistakes, you can automatically undo all changes, if necessary.
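To make the compare-and-migrate idea concrete, here's a toy sketch. This is purely my own illustration of the concept, not how Data Studio Administrator is implemented: it diffs two simple table definitions (column name to type) and generates the ALTER statements needed to bring the current table up to the target.

```java
import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;

public class SchemaDiffSketch {
    // Compare a current and a target table definition and generate the
    // ALTER statements needed to add any columns missing from current.
    static List<String> diff(String table,
                             Map<String, String> current,
                             Map<String, String> target) {
        List<String> ddl = new ArrayList<>();
        for (Map.Entry<String, String> col : target.entrySet()) {
            if (!current.containsKey(col.getKey())) {
                ddl.add("ALTER TABLE " + table + " ADD COLUMN "
                        + col.getKey() + " " + col.getValue());
            }
        }
        return ddl;
    }

    public static void main(String[] args) {
        Map<String, String> current = new LinkedHashMap<>();
        current.put("ID", "INTEGER");

        Map<String, String> target = new LinkedHashMap<>();
        target.put("ID", "INTEGER");
        target.put("NAME", "VARCHAR(40)");

        for (String stmt : diff("ORDERS", current, target)) {
            System.out.println(stmt);
        }
    }
}
```

The real product, of course, goes far beyond a column diff: it understands dependent objects, changes that require drop-and-recreate with data preservation, authorizations, and undo.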
There are a lot of other cool things, like reporting on the impact of proposed changes to identify dependencies and/or mitigate risk. You can also publish an HTML change report as part of the DDL generation.
So, when you need to make complex, or even simple, schema/DDL changes, Data Studio Administrator acts as a great process control flow engine to keep you out of hot water. I strongly encourage you to check out the webcast on July 23rd by our lead engineer on this project to learn more. If you miss it, there will be a replay available.
Data Studio is all about collaboration, and Data Studio Administrator helps you achieve that with your team. It can integrate input from multiple group members who are participating in the change process. It integrates with Rational Data Architect making it easy to move from logical modeling to physical implementation.
Data Studio Administrator currently supports DB2 for LUW, and we plan to extend it to other DBMS platforms (both IBM and non-IBM) in the future. Stay tuned…
-- Bryan Smith