As I've been visiting various customers and presenting at conferences, I'm finding that there is a lot of interest in our new performance monitoring capabilities in Optim Performance Manager. One of the hottest topics in that area is the new statement-level performance metrics. These were significantly enhanced in DB2 LUW 9.7 and are planned for DB2 for z/OS version 10, currently in beta. There are two aspects of this technology that people find exciting:
- You get details on the cost of individual SQL statements rather than seeing a rollup of the costs for an entire package or plan.
- The cost of collecting this data is very low -- in the range of 3% overhead or less.
That last bullet is really the part that excites people. In the past, you had to run an expensive SQL trace to get this kind of data, and most customers found the overhead was too high to have the trace on all the time. The new DB2 technology gives us statement cost histograms for short time intervals during the day (typically 60 seconds or so). Armed with this data, Optim Performance Manager can show us how the cost of an individual SQL statement changes during the day, week, or month.
The histograms also make it easy to identify statements whose cost is volatile due to data skew. Combining this function with Optim Performance Manager's end-to-end monitoring, which makes it possible to attribute each SQL statement to the workload it originates from (end user, application, client machine, etc.), provides a pretty powerful tool. We believe this will be an important new capability in DB2 and our tools, since it holds the promise of allowing us to review performance problems after the fact without having to recreate the problem scenario. That will save all of us a lot of time, since in many cases it isn't easy to reconstruct the conditions that caused the performance problem.
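To make the idea concrete, here is a minimal, hypothetical sketch (not OPM's or DB2's actual implementation) of how per-interval statement costs could be scanned for volatility: a statement whose cost swings wildly between monitoring intervals, as happens with data skew, shows a high coefficient of variation.

```python
from statistics import mean, pstdev

def volatile_statements(interval_costs, threshold=0.5):
    """Flag statements whose cost swings widely across intervals.

    interval_costs maps a statement's text to a list of its average
    costs (say, elapsed milliseconds) in successive monitoring
    intervals. A coefficient of variation (stdev / mean) above the
    threshold suggests volatile cost, often a symptom of data skew.
    (Hypothetical helper; not an OPM or DB2 API.)
    """
    flagged = {}
    for stmt, costs in interval_costs.items():
        avg = mean(costs)
        if avg == 0:
            continue
        cv = pstdev(costs) / avg
        if cv > threshold:
            flagged[stmt] = round(cv, 2)
    return flagged

# Hypothetical per-interval costs for two statements
costs = {
    "SELECT ... WHERE cust_id = ?": [12, 11, 13, 12],    # stable cost
    "SELECT ... WHERE region = ?":  [10, 250, 12, 300],  # skew-driven swings
}
print(volatile_statements(costs))
```

With interval data like this on hand, the second statement stands out immediately, which is exactly the kind of triage the histograms enable without an expensive trace.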
Hi, everyone. It seems as if lately I am always on the road. Today, I am actually back here at SVL, where, fortunately or unfortunately, depending on your point of view, it is raining, and raining hard. Tomorrow I head off to the UK for regional user group meetings. I also have Rome to look forward to, and that's why I'm writing today: to give you the highlights of IOD EMEA for products in my portfolio.
When I look at the sessions, one major theme I see is around performance. Regardless of how ‘cool’ new applications and new technologies are, the issue of performance never goes away and often becomes even more critical as users become even more intolerant of slow response times or unavailability. So, although performance may not have the “coolness” factor of some topics, it’s bread and butter for most of our customers. The key is to make performance management more streamlined, less costly people-wise, and more focused on prevention rather than constant reaction.
If you’re going to IOD EMEA, I invite you to make a point of joining us at the following sessions to learn more about what we’re doing to help realize those goals. The sessions here are listed in chronological order, but you will of course need to check for changes at the conference venue itself. Some of the speakers may change as well. I look forward to seeing you there. If you haven’t signed up for an executive one on one with me through your IBM sales rep and are interested in doing so, here is a link to where you can find more information on how to do that.
| Date and time | Session | Title | Speaker |
|---|---|---|---|
| Wednesday, May 19 | TSB-3181 | Java Developer Best Practices for DB2 Performance | Dave Beulke |
| Wednesday, May 19 | BLD-3375 | How British Petroleum Manages Enterprise Data Growth with IBM Optim | Jim Lee, David Sohl |
| Thursday, May 20 | TSB-3350 | End-to-End Monitoring and Problem Determination with Optim Performance Solution | Torsten Steinbach, Holger Karn |
| Thursday, May 20 | TSB-2947 | Query Optimization and Query Tuning with Optim Query Tuner | Gene Fuh |
| Thursday, May 20, 10:30 AM - 1:30 PM | 3259 | Hands-on Lab! Optim Performance Manager – Live! | Ute Baumbach, Michael Skubowius |
| Thursday, May 20, 11:45 AM - 12:35 PM | TSB-3347 | Venedim: A first-hand experience with Optim Performance Management solutions | Jean-Marc Blaise, Torsten Steinbach, Holger Karn |
| Thursday, May 20, 11:45 AM - 12:35 PM | TSB-3313 | Unipol Slashes Costs and Improves Application Performance with Optim (features Query Workload Tuner) | Client speaker, Bryan Smith |
| Friday, May 21 | TSB-2760 | Why are DB2 DBAs (z/OS & LUW) Interested in Data Studio and Optim Tooling? | Bryan Smith |
| Friday, May 21 | TSB-3341 | Optim and Data Studio Portfolio Strategy: Optimize performance and availability while lowering costs | Curt Cotner |
| Friday, May 21 | TSB-2890 | Improve Database Archive Performance: Optim Data Growth Best Practices | Pamela Hoffman |
| Wednesday through Friday (repeated session) | TSB-3870 | Usability lab: Optim Performance Manager User Experience for DB2 Performance Management | Dirk Willuhn, Ute Baumbach |
Today we announced a major enhancement to our performance monitoring and management solution for DB2, with the 4.1 release of Optim Performance Manager for DB2 for Linux, UNIX, and Windows (I'll use 'OPM' in the rest of this blog entry). This major new version of OPM includes a significantly improved up-and-running experience and quicker problem resolution.
The biggest change you'll see out of the box is the new Web-based user interface and redesigned problem resolution workflow. Our beta customers gave us great feedback during the development and refinement of this interface, and the result seems to be pretty well received. One of our beta clients states that "The browser interface is easy to use, with intuitive dashboard displays and easy to understand presentation of information." Even better, since it is Web-based, you can monitor databases from anywhere without having to install software on various PCs.
The repository server collects performance metrics from the monitored database and stores them in a DB2 database. You can navigate through the stored data by time and see reports or dashboard data from the chosen time period. This allows for post-mortem problem detection and resolution, as well as proactive monitoring and trend analysis. There are also interactive reports, such as table space disk growth and top-n SQL statements, that you can generate from this stored information.
The team has done a lot of work on getting up and running with the monitoring solution much faster. There is an integrated installer, and there are predefined monitoring profiles for a variety of workloads, such as BI, OLTP, SAP, QA, and development. I'm really happy with the reports coming from the beta that installation and configuration are "easy."
Finally, you can launch Optim Query Tuner from several of the dashboards, including the Active SQL and Extended Insight Dashboards, to do in-context query tuning on individual problem queries.
To realize the full power of the new integrations and lifecycle capabilities of this release, you should definitely check out the new package, Optim Performance Manager Extended Edition (OPM EE). It builds on the base capabilities in OPM by including Extended Insight (previously a separately orderable feature), integration with Tivoli monitoring solutions, and configuration tooling for DB2 Workload Manager.
If you like the value of Extended Insight, which provides key metrics and visualizations of SQL as it travels through the software stack for dynamic Java applications, you'll really like that we've extended the capabilities in this release of OPM EE to include CLI applications. We also include out-of-the-box, customizable workload views for SAP, Cognos, DataStage, and InfoSphere SQL Warehouse to help get you going.
To round out our monitoring story and support the strong message we tell with static SQL, OPM EE now includes monitoring support for static SQL from Java applications. So if you want to take advantage of static SQL from Java, either by using the pureQuery API or by using client optimization for any JDBC application, you can get the Extended Insight information that you could previously only get for dynamic SQL.
We’ve also made it possible to import pureQuery application metadata into OPM so that detailed information about the application source (Java package name, method name, line number) can be displayed on the Extended Insight Dashboard for any individual SQL statement. This particular feature will require a pureQuery Runtime license.
Integration with Tivoli monitoring solutions smooths the handoff between system operators and the detailed database performance analysis performed by DBAs. The integration lets you drill into the deep database diagnostic capabilities of OPM EE directly from the Tivoli Enterprise Portal. One of our beta clients who does extensive work with clients using Tivoli found this integration very useful, pointing out that "Outsourced operations will love the Tivoli integration as it allows them to monitor multiple WAS and DB2 instances from a single point of control."
Finally, OPM EE provides new tooling to significantly ease the configuration of DB2 Workload Manager. Although the existing WLM configuration tooling is still shipped with InfoSphere SQL Warehouse, this new tooling is integrated into OPM EE. Key monitoring information vital to workload management is presented in context so that you can do related configuration and validation within a single tool.
There is really way more here than I can possibly cover in a blog entry. Here are links where you can find more information and see the user interface in action.
Our team blog coordinator has been getting on me about posting to kick off the new year and to reflect a bit on how things went last year. So let me try to do that as concisely as possible.
Reflecting on 2009, I have to say that I am pleased to see that people are really starting to understand the business value of integrated data management (IDM) to help reduce inefficiencies in communications and to enable enforcement of rules across IT roles. And there has been even stronger interest, particularly from the z/OS community, around potential cost reductions with use of pureQuery in the application infrastructure. I know that our tech sales and enablement teams have been very busy with pureQuery savings assessments (briefly introduced in this video demo) and are constantly on the road talking to customers about pureQuery.
We’ve had some good customer successes this year that reflect the realization of the benefits of integrated data management. Here are a couple of notable ones:
Collaboration, innovation, and data governance with data modeling at La Caixa bank
La Caixa was challenged with control over an increasing number of database entities. They wanted to set up a way to enable administrative control over created entities that would help both their administrators and developers. They decided to go with InfoSphere Data Architect, in no small part because of its integration with Rational Software Development Platform and the rest of the Optim portfolio, as well as its extensibility. They are taking advantage of the support for source control to store physical data models in a central repository and allowing only approved objects to be promoted to production. And they’ve done some customization on the DDL files to enable the SCM to parse the files in order to decide where to install the other database objects. Data governance is improved, and the integration with Rational enables the administrators to grant access to data models to appropriate developers for improved collaboration.
Greater security and performance for openJPA applications using pureQuery at Payment Business Services (PBS)
PBS has developed a new Payment Services system programmed in Java using openJPA. The developers liked working with openJPA, but the database administrators were concerned about performance and security with dynamic SQL. So they decided to take advantage of the client optimization capabilities in pureQuery to leverage static SQL for that application. They run Optim pureQuery Runtime for z/OS under WebSphere Application Server to capture the dynamic SQL, and then they bind that captured SQL into DB2 packages for static execution. They use Optim Development Studio to view and work with the captured SQL for tuning, problem determination, and communicating with developers. This marriage of pureQuery with new application infrastructures in the z/OS environment is really critical to helping enterprises grow technologically while maintaining appropriate levels of security and performance.
What’s coming in 2010?
Looking ahead, I’m driving the team toward delivering on specific aspects of the IDM vision. It is my belief that this year we will be able to deliver even more integration among the portfolio members (and other IBM products) to fully realize a database performance management scenario that crosses roles from system operator all the way through to modifying and tuning the database access layer. You will likely see more integration with Cognos, SAP, and DataStage, and we’ll be delivering betas and new products to help DBAs do their jobs better. There is also a good deal of work being done to support z/OS this year. And, before you ask: yes, Windows 7 support is coming soon.
I hope to see you at our next big IOD EMEA event in Rome where I will go into more detail on some of the capabilities I can only hint at here.
While I was in Rome for IDUG EMEA, I had a chance to talk to a lot of customers about Data Studio and Optim products. One of the customers expressed some concern about the difficulty of maintaining the Optim software on each desktop that runs our Eclipse-based software offerings. I'm sure a lot of other customers are worried about this topic, and it caused me to realize that this is something we don't discuss much in our conference presentations or in our blog/Web content.
The good news is that we've actually done quite a lot in this area and are continuing to do more. Our install process for the Optim products allows you to operate in one of three modes:
- An install image can be uniquely installed on each desktop. This does require each user to install the product, but we support the Rational Installer update process, which allows users to easily check whether updates are available on the Internet and automatically install them without having to manually download updates or feed CDs into their PCs.
- You can also choose to install the Optim products on a file server that acts as the production image of the product. A small launcher file is invoked on the end user's desktop, which will load the Optim product into the end user's PC from the file server. With this approach, you only have to maintain the copy of the product that is stored on the file server, and individual users are upgraded instantly each time you apply maintenance to the file server image.
- You can also choose to package the installer using tools like Microsoft's Systems Management Server (SMS) product to deploy the installation to multiple end user desktops. Here is a link to the documentation on this mode of installation in the Installation Manager Information Center. We’re working on getting this information into the Information Management installation documentation as well. We’re also working on creating sample scripts to aid in the packaging and deployment, reducing the development time that is typical when using these tools.
All three modes require less manual labor than the traditional technique of using CDs or DVDs to install and upgrade products. The last two modes fully automate the process for the end users, so that an organization only has to perform maintenance on the centralized file server image (mode 2) or maintenance occurs automatically as a deployable update (mode 3).
In my dual role as CTO and VP of Data Servers, I spend a fair amount of time on the road talking to people about what's going on both with database server technology and with Integrated Data Management, from both a business and a technical perspective. Anyway, I like these road trips because they help me validate the business reasons behind the technical offerings we make. Because I meet with so many different people in different job roles (CTOs, CIOs, DBAs, architects, developers...), it helps me to calibrate the tradeoffs and priorities to get a product or new functionality out the door and to place that new capability in the context of real customer problems. In addition, no matter how many times I give the "IDM vision" pitch, I almost always get feedback that helps me to make it just a little better the next time, such as finding out what part of the strategy might not be clear when people hear it for the first time. To be honest, these road trips also help me keep my technical chops, since I have to make sure I'm up to date on the latest functionality and be prepared to answer some tough questions from the audience.
For this reason, I'm really looking forward to my all-day appearance at the upcoming Tridex DB2 Users Group on October 15. I start off with the vision pitch, because I find that helps to place the more technical talks in the correct perspective. Then I'll go into detail on the integration work we are doing with WebSphere. This work is critically important in making the whole database/application server layer more efficient and easier to manage, as well as providing DBAs with more control. After that, I'll do a deep dive on Query Workload Tuner, which is our follow-on product to DB2 Optimization Expert. Finally, I'll close with a talk on using pureQuery for high-volume applications, in which I discuss the performance aspects of using pureQuery such as heterogeneous batching, static support, and more.
If you're in the New York area, I think you would get a lot of value from attending this free event. Note that they do require you to register ahead of time, so go to the web site, download the invitation, fill out the registration form attached to that invitation, and email it in.
Hope to see you there.
I know you all are probably getting bombarded with news about the upcoming Information on Demand NA conference in Las Vegas, but I wanted to make sure you were aware of some highlights from an Integrated Data Management perspective. I'll be joining my colleague Al Smith, originally from Princeton Softech, to present our strategy and vision (I need to earn my flight there, after all), but I think what's even more interesting is the customer speakers that are lined up. There's broad representation across industries of companies focused on different aspects of our solutions. For example:
- Scotiabank Speeds Application Testing and Protects Data Privacy with IBM Optim (BFM-2345)
- State Farm Optimizes App Performance with Optim Development Studio, pureQuery (TDM-1718)
- Efficient Test Data Management: An ROI Success Story (Travelport) (TDM-2841)
- Evaluating Java Data Access Technologies at Handelsbanken (TDM-2177)
- A Day in the life of a DBA - how to keep your sanity (Blue Cross Blue Shield) (TDM-1499)
Also, if you want to get your hands dirty, please reserve your seat in some of the hands-on labs. I know our technical enablement team has some great labs lined up that include integration of some of the products into solutions, such as:
- Accelerating Java Applications (HOL-1276) which talks about using Optim Development Studio, Optim Query Tuner, Optim pureQuery Runtime, and Performance Expert Extended Insight together. This is a great use case that DBAs should be aware of. (It actually reflects pretty closely the Java Acceleration Solution demo.)
- Model-driven data governance using InfoSphere Data Architect and Optim (HOL-1277), which includes Optim Test Data Management and Data Privacy Solutions together. If any of you are responsible for data privacy or are looking for ways to help 'bake in' privacy safeguards starting at design time, then this is the lab for you.
If you want to talk to me, be sure to get your seat reserved at the Meet the Experts on Tuesday afternoon (you can enroll through the SmartSite link below). It will be a very busy week, but I think you'll find it worthwhile.
To help you with planning, here are the key links:
- Integrated Data Management roadmap to the conference (print it out and take it along with you)
- IOD Conference SmartSite, to help you plan your agenda and enroll
- IBM Optim on Twitter (already active; we'll use it to keep you informed of updates at the conference)
- IOD 2009 on Twitter (for the bigger picture on conference activities)
Look forward to seeing you.
It was just over a month ago that I posted the information about our new releases under the Optim name. Today we announce the z/OS versions of some of those products.
Optim Query Tuner is designed for single-query tuning, and Optim Query Workload Tuner provides both single-query and workload tuning capability. Both offer seamless integration with Optim Development Studio. Optim Query Workload Tuner is a renamed and enhanced version of IBM Optimization Expert for z/OS and is the upgrade path included with your subscription and support when you are ready to move to the next release. Note that future query tuning enhancements will be made to these products. OSC is still available but will not be enhanced; it will be replaced by similar capabilities in Data Studio that are under development today.
And a reminder: we did add new discussion forums, including the Optim Query Tuning solution discussion forum.
This product is deployed natively on z/OS systems (for example, with a WebSphere Application Server for z/OS deployment). Some of the capabilities we added were in response to requests from z/OS customers, including the ability to replace literals with parameter markers, making more statements eligible for static execution. You can find out more about this capability in Sonali's article on developerWorks.
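To illustrate why literal replacement matters, here is a deliberately simplified sketch (not the actual pureQuery implementation, which uses a full SQL parser): replacing literals with `?` markers collapses many textually different statements into one reusable, statically bindable form.

```python
import re

# Simplified illustration only: the real feature uses a full SQL parser,
# while this regex handles just single-quoted strings and plain numbers.
_LITERAL = re.compile(r"'(?:[^']|'')*'|\b\d+(?:\.\d+)?\b")

def parameterize(sql):
    """Replace literals with '?' markers, returning the values removed."""
    values = []
    def repl(match):
        values.append(match.group(0))
        return "?"
    return _LITERAL.sub(repl, sql), values

stmt, vals = parameterize(
    "SELECT name FROM emp WHERE dept = 'A01' AND salary > 50000"
)
print(stmt)  # the normalized statement with ? markers
print(vals)  # the literal values lifted out of the text
```

Every department or salary variant of this query now maps to the same parameterized statement, which is what makes it a candidate for a single static DB2 package rather than an ever-growing set of dynamic statements.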
Thanks for reading.
Greetings from Berlin! I’m very pleased today to tell you about several new announcements we are making in the portfolio previously known as Data Studio. With these new releases, we are taking great strides toward the vision of Integrated Data Management -- An integrated, modular environment to manage enterprise application data and optimize data-driven applications, from requirements to retirement across heterogeneous environments.
While delivering on an Integrated Data Management vision is a very broad value proposition, and one that will involve all aspects of IBM Software Group, you’ll see with this release that we are adopting the Optim name as a rallying point for this technology, emphasizing our focus on optimizing the value of your data assets by managing them across their lifecycle. This announcement represents another major step in delivering on our vision, focusing both on extending heterogeneity and on portfolio integration that provides the basis for cross-role, cross-lifecycle collaboration, efficiency, and alignment.
In future posts, we’ll take you through a couple of scenarios enabled by the new releases, but to get you started, here is a summary of the announcement and links to announcement letters and web pages:
- Enhancements to and renaming of Optim Development Studio, Optim pureQuery Runtime, and Optim Database Administrator (formerly Data Studio Developer, Data Studio pureQuery Runtime, and Data Studio Administrator). You’ll find lots of new functionality for both developers and DBAs in these releases. Most notably:
- pureQuery and development support for Oracle databases (including PL/SQL) for a common integrated development environment across DB2, Informix, and Oracle
- A host of fantastic new pureQuery capabilities that customers have been asking for, including translation of literals to host variables to improve application performance
- More complete DB2 for Linux, UNIX, and Windows administration capabilities in Optim Database Administrator including support for large warehousing environments
- Better governance of privacy attributes across design, development, and test environments
- Better support for DB2 package management so that DBAs can restrict changes and rebinds to just those packages affected by a change
Many of you may be wondering about the no-charge capabilities. We heard loud and clear that our customers want a stand-alone download package for the no-charge capability. We are reverting to that packaging model with this release. More on that as it becomes available in the near future.
- New product: Optim Query Tuner for DB2 for Linux, UNIX, and Windows brings single-query tuning advice and formatting to the DB2 developer. I think Ray Willoughby will be blogging more on this, but this product provides a great first step toward enabling developers to more effectively tune queries during development. Its integration with Optim Development Studio provides a seamless environment for crafting queries, identifying SQL hot spots, and optimizing queries, all pre-production, to help produce enterprise-ready code and facilitate collaboration between developers and DBAs. See the announcement letter and Web page for more detail.
- New release of InfoSphere Data Architect. This product has always been a shining star in the portfolio for its heterogeneous database capabilities. This release includes improvements to the data governance value proposition around data privacy and the ability to maintain volumetric data for capacity planning. Most notably, users can choose from pre-defined privacy templates and share privacy definitions with developers and testers via Optim Development Studio. See the announcement letter or Web page for more detail.
I’m asking some of my technical leaders and product managers to go into more detail on these announcements in future blog posts. In the meantime, I must go back to the conference. Lots to talk about…
I just got back from IDUG in Denver and wanted to dash off a quick blog about it. I sat in on several sessions where Data Studio and pureQuery were being presented (some by our team, some by consultants, some as BOF sessions). We really couldn't have asked for things to go better:
- I saw many of the same customers attending a series of sessions on these topics, so several of the customers were basically "majoring" in Data Studio and pureQuery.
- There were several external consultants presenting their findings of how these technologies worked in their field engagements. All reports were in line with what we found in our own lab measurements.
- I sat in one presentation where the consultant presenter reported 25% savings, but the folks in the audience reported that their savings was closer to 40%.
It was really great to hear these success stories. I look forward to meeting more of you and hearing your stories at IOD EMEA in Berlin and IOD NA in Las Vegas. By the way, the deadline for submitting talks for IOD North America is May 29th. I know there is a strong emphasis on getting customer speakers, so if you have a story to tell, you should submit an abstract.
See you at both conferences.
-- Curt Cotner
As you may recall, when we announced Data Studio pureQuery Runtime 2.1 for LUW, one of the new features in the release was the ability to use pureQuery with .NET applications. This support is available for all of the DB2 servers; however, I want to focus a bit on this from a z/OS perspective, since it is primarily these customers who let us know that they heard and liked the pureQuery for Java story we were telling, but that they needed something like this for .NET as well. They wanted the advantages of static SQL for security, manageability, and performance reasons.
So we did add that support for .NET in the 2.1 release of the Data Studio pureQuery Runtime and in the latest IBM ADO .NET provider. Although it doesn't have all the rich tools support that Java does, it provides many of the key benefits that Java shops can get - static SQL performance and consistency, static SQL authorization model, and the ability to create uniquely named packages that can help DBAs and system programmers isolate performance problems to a particular application and particular SQL statements. And, since it's using client optimization, that means your applications can get these benefits without having to change source code.
To validate the performance benefit, I'm very happy to announce that we've published the results of our performance study (using the IRWW benchmark) of the pureQuery support for .NET. I don't want to spoil the surprise, but the numbers are very impressive, with huge increases in throughput and dramatic reductions in CPU per transaction.
Also, be sure to see this developerWorks tutorial. It's a good step-by-step guide to the process of enabling .NET applications to use pureQuery.
It's been a while since I've blogged. I've been spending a lot of time talking to customers about what they want and need from our integrated data management portfolio, and now the whole team is working on some great new capabilities and offerings to help address some of the key pain points I've been hearing about. More on that later.
Right now, I just wanted to draw your attention to our recent announcement of the pureQuery Runtime 2.1 on z/OS. This version of the product has been available on LUW since December and we are happy to announce its availability on z/OS for those shops who run their apps natively on z/OS. There are some excellent capabilities in the 2.1 Data Studio pureQuery Runtime and Data Studio Developer releases -- you can read this What's New article for a good overview, and these videos also show you many of these enhancements through Data Studio Developer. And don't forget Jeff Sullivan's blog entry which gives a lot of good reasons from a z/OS system programmer perspective why he likes pureQuery.
Use pureQuery Runtime for z/OS with stand-alone applications, applications deployed on WebSphere Application Server or other application servers, or with DB2 stored procedures. Data Studio pureQuery runtime supports both type 2 database drivers and type 4 database drivers for DB2 for z/OS V8 and V9.
It's been less than 5 months since we announced our 1.2 releases of Data Studio, which I blogged about back in July.
Since then, we have talked to thousands of people, provided demonstrations to hundreds, and visited dozens of customers. People are starting to understand Data Studio and the value of Integrated Data Management better.
With this latest release, announced today, we are really targeting the DBA with enhancements across the portfolio to help DBAs improve application performance, security, manageability, and TCO. In this release, the enhancements are particularly targeting Java applications that access DB2 data, but you'll see we're starting to branch into .NET as well.
The announcements today are for:
- Data Studio Administrator 2.1, in which we've really focused on both usability and functionality. We've done lots of usability testing with DBAs and have provided a more natural approach for doing many tasks, including copy and paste of database changes, flatter traversal of the data source explorer, better sorting and filtering of objects, and new task assistants for utilities, commands, and configuration parameters, so you won't have to leave your environment to go out to the command line or Control Center to perform those tasks.
- Data Studio Developer and Data Studio pureQuery Runtime 2.1, which extend the power of pureQuery for developers and DBAs to collaborate to:
- Eliminate SQL injection risk for Java database applications by giving you the ability to indicate that only SQL that has been captured and approved may be executed.
- Optimize SQL performance by providing developers with the ability to profile the SQL to see immediately how many times a SQL statement is executed, and how long it takes to run (elapsed time), giving developers an easy way to start identifying potential hot spots in the application before coming to the DBA.
- Improve quality of service for OpenJPA and .NET applications. Steve Brodsky blogged about the integration of pureQuery with OpenJPA, which was actually available with the 1.2 pureQuery release and WebSphere Application Server v7. For the many people who ask when they can see the benefits of static SQL with .NET applications, we have taken an initial step in this release by enabling client optimization for .NET applications; in other words, the ability to capture dynamically executed SQL and bind it into packages.
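To make the first bullet concrete, the "only captured and approved SQL may run" idea can be sketched in plain Java. This is purely a conceptual illustration, not the pureQuery API: the class names and methods here are invented for the sketch. The mechanism is simply that SQL text collected during capture becomes an allowlist, and anything outside it is rejected at run time, so injected SQL that was never seen during testing can never execute.

```java
import java.util.Set;

// Conceptual sketch only -- NOT the pureQuery API. Illustrates how an
// approved-SQL allowlist blocks injection: only statement text that was
// captured and approved during testing is allowed through.
public class ApprovedSqlGate {
    private final Set<String> approved;

    public ApprovedSqlGate(Set<String> capturedStatements) {
        this.approved = capturedStatements;
    }

    /** Returns the SQL if approved; throws if it was never captured. */
    public String check(String sql) {
        if (!approved.contains(sql)) {
            throw new SecurityException("SQL not in approved set: " + sql);
        }
        return sql;
    }

    public static void main(String[] args) {
        ApprovedSqlGate gate = new ApprovedSqlGate(
                Set.of("SELECT NAME FROM EMP WHERE ID = ?"));

        // The parameterized statement captured during unit test passes.
        System.out.println(gate.check("SELECT NAME FROM EMP WHERE ID = ?"));

        // Injected SQL built by string concatenation was never captured,
        // so it is rejected before it ever reaches the database.
        try {
            gate.check("SELECT NAME FROM EMP WHERE ID = 1 OR 1=1");
        } catch (SecurityException e) {
            System.out.println("rejected");
        }
    }
}
```

Note that this is also why the capture step pairs naturally with parameter markers: the approved text is the statement shape, not the user-supplied values.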
- Last but not least, DB2 Performance Expert for Linux, UNIX, and Windows 3.2 and the new DB2 Performance Expert Extended Insight Feature 3.2. This announcement is particularly close to my heart, as many of you who have sat in on my talks probably know. Whenever I sit down with DBAs and talk about the problems of diagnosing performance issues in a Java application environment, they always nod their heads in agreement. There is a real pain point in not having the same diagnostic capabilities for Java that many DBAs are familiar with for COBOL/CICS applications.
If you extend DB2 Performance Expert with the Extended Insight feature (a separate PID, separately priced, with DB2 PE as a prerequisite), you can enable new end-to-end database monitoring for Java applications on DB2 servers for Linux, UNIX, and Windows. This monitoring capability will really help improve the availability of mission-critical database applications by making it much easier to detect performance issues and determine whether the problem is in the database or somewhere else in the software stack.
Also, you can set thresholds (your SLAs, so to speak) so you can easily see how the application is performing against those targets. If you haven't read it yet, I encourage you to see the article that the German team who develops this feature wrote. It's a great introduction to this new capability, and it's really just our first step. This whole concept of providing greater insight to DBAs and developers is planned to roll out across more databases and more data access environments.
Just a heads up: we're not done. We have more announcements coming soon!
(Updated 10/15/2008 with an updated Guide to IOD.)
These Information on Demand conferences that we have every year really keep our lab and sales people busy. But all the work leading up to them is really worth it in the end, because it gives us an opportunity to talk directly to so many of our customers and get insight into the problems they face (which hopefully we can help solve). Last year’s IOD is where Data Studio was launched, and I’m very pleased that at this year’s IOD we have so much more to share with you – lots of sessions, demos, labs, community events, usability sandboxes, BOFs, etc. It’s also a great opportunity to get a sneak peek at upcoming enhancements.
I’m very honored that State Farm Insurance will be co-presenting with me on their experiences using pureQuery and Data Studio software, and I hope you can make that session.
We like working with customers on early releases to help us drive the product vision forward. This is the same basic pattern that we’ve used in DB2 for our entire 25-year history for each of the major steps in our technology journey (the initial release of DB2, sysplex technology, DRDA and DDF, stored procedures, and now Data Studio and pureQuery). At a high level, the pattern is pretty simple:
- Design and implement the next technology leap based on the best information available to our product developers
- Find some key customers that want to push the envelope in this technology area, and partner closely with them
- Be responsive to these customers, and adapt the technology as needed to make sure the early customers are successful and extremely happy with the solution. Sometimes, this will mean that we will drive the technology in new directions as we learn more about the problems the customer is trying to solve.
- Finally, release the product for general availability and hope that we picked the early customers wisely… Here, I mean that we want to partner with customers that are representative of the broader customer population. If we picked the right customers, whatever direction they help us choose will end up being the solution that the other customers need as well. I’m happy to report that so far, we’ve had a stellar record of choosing the right customers, and have never ended up with a solution that was great for one customer, but had little value for the rest of the customer population.
It really helps our team deliver products that will fulfill your needs if they actually spend time talking to you. This is why I encourage you to come to our sessions, our birds of a feather sessions, our demos and tell us what you think about what we’re saying and showing. Effective product management and development relies on conversations, not on lectures.
For those of you who can’t make the conference in person, our team will be blogging from the conference. Hopefully that will give you the opportunity to get some idea of what is happening and provide you with the opportunity in this blog to add your comments and feedback.
I’m attaching a document here that highlights key sessions around Data Studio and Integrated Data Management and provides a pretty comprehensive list of sessions by day and product/topic area. I hope you find it useful, and I hope to see you there.
-- Curt Cotner
I announced our latest Data Studio offerings in this blog about 4 weeks ago. Since that time, I've been getting feedback that many people are confused about how we are packaging the Data Studio offerings, and they aren't sure how these new offerings fit into the existing Data Studio development and administration tools. In a nutshell, we now have three offerings:
- The Data Studio "base" offering is a no-charge product that contains the tools that customers get when they buy a DB2 LUW, DB2 Connect, or IDS product from IBM. The base offering is made up of the application development and database administration tools that shipped with Data Studio 1.1.x (Visual Explain, stored procedure builder and debugger, SQL builder, simple ER diagram support, table editor, etc.). This same no-charge product can also be downloaded without purchasing another IBM product here: http://www.ibm.com/software/data/studio/
- Data Studio Developer V1.2 consists of the Data Studio base offering above, with a number of additional tools that help customers build pureQuery applications. This includes tools that allow the developer to capture JDBC queries during unit test, and bind those queries as static SQL using the pureQuery runtime. This allows you to benefit from the pureQuery technology advantages, without changing your application source code to use the pureQuery APIs directly.
- Data Studio Administrator V1.2 consists of the Data Studio base offering with many additional improvements in database administration for DB2 LUW that greatly reduce the human labor associated with making complex database schema changes:
- Synchronizes, copies, clones, or merges database schema definitions from a source DB2 system to a target DB2 system
- Remotely runs DB2 commands, such as utilities, application rebind, and data import/export
- All generated commands can be viewed, added, deleted, and validated before running them
- Graphical wizards lead novice users through the DB2 schema change management process:
- Manages side-effects and dependencies
- Generates Data Definition Language (DDL) to change the database
- Preserves data
- Preserves authorizations
- Preserves application binding
- Automatically propagates changes to related objects, streamlining change management
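The Data Studio Developer capture-then-bind flow described above (record the JDBC SQL during unit test, then bind it as static SQL) is driven by runtime configuration rather than code changes. As a rough sketch of what that configuration looks like -- the property names here (`captureMode`, `executionMode`, `pureQueryXml`) reflect the pureQuery client-optimization documentation of this era and should be treated as illustrative; consult the release documentation for the exact spelling and values:

```properties
# Phase 1 - capture (run during unit test):
# record the SQL the unchanged JDBC application issues
pdq.captureMode=ON
pdq.executionMode=DYNAMIC
pdq.pureQueryXml=capture.pdqxml

# Phase 2 - after binding the captured SQL into DB2 packages,
# point the same application at the capture file and go static:
# pdq.captureMode=OFF
# pdq.executionMode=STATIC
# pdq.pureQueryXml=capture.pdqxml
```

The point of the two phases is exactly what the bullet above promises: the application source never changes; only the properties (and a bind step against the captured file) move it from dynamic to static SQL.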
The IBM Data Studio Administrator v1.2 and IBM Data Studio Developer v1.2 are now available as 30-day trial packages from the following websites:
Data Studio Administrator 1.2: http://www14.software.ibm.com/webapp/download/search.jsp?go=y&rs=swg-dsa12
Data Studio Developer 1.2: http://www14.software.ibm.com/webapp/download/search.jsp?go=y&rs=swg-dsd12
The trial packages will allow you to use the new features of the Data Studio Administrator and Data Studio Developer v1.2 products for 30 days. They will revert to the existing Data Studio v1.1.x no-charge functionality on day 31.
In addition, please check the following link for a video overview of the Data Studio Developer v1.2 features: http://channeldb2.ning.com/video/video/show?id=807741%3AVideo%3A10108
I hope you have a chance to download these new Data Studio offerings and see what they have to offer. If you have any feedback about this blog entry, either add a comment to this blog or send an email to firstname.lastname@example.org.
-- Curt Cotner