Managing the data lifecycle
As part of our Control Center replacement work, we are gathering information about customer usage of Task Center. If you are using Task Center, please complete our survey about your usage and what you value most. We would love to have your input by the end of November. Link to survey here.
Happy IOD Season! Kimberly Madia here from the Optim product marketing team, and I am excited to be blogging for the first time. I wanted to share some details about key activities for the Optim tools at this year's IOD Global Conference.
You will also not want to miss our big-tent events. These are fantastic opportunities to hear from the executive team and other thought leaders, including Arvind Krishna and Steven Adler.
• Expo Grand Opening Reception, Sunday, 6:00 – 8:00 PM
• InfoSphere Community Reception, Sunday, 8:00 – 10:00 PM
• Data Management Keynote Session, Monday 2:45 – 3:45 PM
• InfoSphere Keynote, Tuesday, 3:00 – 4:00 PM
• IOD 2010 Networking Party, Tuesday, 7:00 – 11:00 PM
• Information Governance Keynote, Wednesday, 10:00 – 11:00 AM
Next, you will have the opportunity to meet Optim tools product experts, including several Distinguished Engineers and, of course, Curt Cotner, IBM Fellow and VP & CTO for Database Servers. Here are a few of the experts, along with the session numbers and times when they will be available to you.
• MTE-3490, Bryan Smith, Tue, 26/Oct 1:00-2:00 PM
• MTE-3683, Curt Cotner, Tue, 26/Oct 4:30-5:30 PM
As for technical presentations, we have lots. Certainly too many to mention here, so I will just point out a few highlights. Look for the Optim tools sessions in both DB2 and Information Governance tracks.
We have hands-on labs covering the following products:
We also have usability labs for the following products:
Finally, at IOD you will have numerous opportunities to see live product demos. Optim tools will be available in the InfoSphere demo area as well as all Data Management demo suites.
Well that wraps up the highlights. We in product marketing are certainly excited about this conference. Looking forward to seeing you there! Have a great weekend.
Big news last week on the Optim tools front.
Let me take you through the highlights.
Administration and Availability
I talk to many, many people about pureQuery and the benefits it can bring to performance, security, and problem determination of database applications. A key part of being able to deliver these enhancements to the wide variety of existing Java applications is the ability to layer pureQuery optimization onto existing applications without making changes to the application code. This process is called client optimization.
Several enterprise customers are now reaping the benefits of client optimization, and their requirements drove this delivery, which further improves security, manageability, and scalability to support complex enterprise environments. The main result is a shared repository that can be used to securely store pureQuery artifacts. Not only does this provide added security, it gives the DBA centralized control of the client optimization process. In other words, after the administrator's initial pureQuery Runtime setup within the application server, you, the DBA, can modify the application's pureQuery Runtime configuration without requiring further changes to the application server.
In addition, the centralized repository makes it possible to do continuous capturing, providing a step toward the vision of a more automated capture, configure, and bind process.
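At a high level, client optimization is driven by pureQuery Runtime configuration properties rather than by application code changes. Here is a minimal sketch of the capture phase using documented pdq.* property names; the file paths are hypothetical, and the repository-backed variants of these settings are described in the client optimization documentation:

```properties
# Phase 1: capture the SQL an existing JDBC application issues.
# These are driver-level properties; no application code changes.
pdq.captureMode=ON
# Location where captured SQL metadata is stored (illustrative path;
# with the shared repository, this can live centrally instead of on disk).
pdq.pureQueryXml=/opt/app/capture/orders.pdqxml

# Phase 2 (after capture): the DBA configures and binds the captured SQL
# into DB2 packages, then switches the application to static execution:
# pdq.captureMode=OFF
# pdq.executionMode=STATIC
```

Because these properties live outside the application, the DBA can iterate on the capture-configure-bind cycle without touching the application itself.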
Another enhancement based on customer requests is the ability to group SQL into packages based on string tokens or special register values, to streamline and simplify package management. An example of where this can be useful is if you are currently using special registers to set the schema qualifier during dynamic execution. Now you can group the appropriate SQL statements into separate packages (one per schema, for example) and do the same thing statically using the QUALIFIER bind option. To help make sense of these enhancements, the product documentation for client optimization has been updated with scenario-based guidance on running, deploying, managing, and troubleshooting the client optimization process.
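To make the dynamic-versus-static contrast concrete, here is a hedged sketch; the schema name and SQL are hypothetical, and the exact bind invocation depends on your environment and tooling:

```sql
-- Dynamic today: the qualifier is resolved at run time from a special register.
SET CURRENT SCHEMA = 'TENANT_A';
SELECT ORDER_ID, TOTAL FROM ORDERS WHERE STATUS = 'OPEN';

-- Static equivalent: bind the same captured statements into one package
-- per schema, fixing the qualifier at bind time rather than execution time.
-- (Illustrative DB2 bind option, passed through the pureQuery bind tooling
-- when the per-schema packages are created.)
--   ... QUALIFIER TENANT_A ...
```

The run-time behavior is the same, but the static form gets the security and stability benefits of packages, with one package per qualifier instead of one dynamic statement per register setting.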
If you are interested in hearing more about the capabilities in this release from product experts, I invite you to join us on June 24th for the Virtual Tech Briefing entitled pureQuery Deep Dive Part 3: Client optimization administration. My colleagues Patrick Titzler and Chris Farrar, whom some of you may remember from the first pureQuery deep dive tech briefing, will provide an overview of the new capabilities and how they fit into our overall direction, and will be available to answer questions. Patrick also plans to demo the tooling support for these features delivered in Optim Development Studio Fix Pack 3.
As I've been visiting various customers and presenting at conferences, I'm finding that there is a lot of interest in our new performance monitoring capabilities in Optim Performance Manager. One of the hottest topics in that area is the new statement-level performance metrics. These were significantly enhanced in DB2 LUW 9.7 and are planned for DB2 for z/OS version 10, currently in beta. There are two aspects of this technology that people find exciting:
The histograms also let us easily identify statements whose cost is volatile due to data skew. Combining this function with Optim Performance Manager's end-to-end monitoring, which makes it possible to attribute each SQL statement to the individual workload it originates from (end user, application, client machine, and so on), provides a pretty powerful tool. We believe this will be an important new capability in DB2 and our tools, since it holds the promise of letting us review performance problems after the fact without having to recreate the problem scenario. That will save all of us a lot of time, since in many cases it isn't easy to reconstruct the conditions that caused the performance problem.
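To illustrate the idea, here is a small self-contained sketch of bucketing per-execution elapsed times for one statement into a histogram and flagging volatile cost. The bucket bounds and the coefficient-of-variation heuristic are illustrative only, not OPM's actual algorithm:

```java
import java.util.Arrays;

/**
 * Sketch: histogram per-execution elapsed times for a single SQL statement
 * and flag statements whose cost is volatile (e.g. due to data skew).
 * Bucket bounds and the volatility threshold are illustrative choices.
 */
public class StatementHistogram {

    /**
     * Count executions falling into buckets [bounds[i], bounds[i+1]);
     * the last bucket catches everything at or above the final bound.
     */
    public static int[] histogram(double[] elapsedMs, double[] bounds) {
        int[] counts = new int[bounds.length];
        for (double t : elapsedMs) {
            int i = bounds.length - 1;          // start at the top bucket
            while (i > 0 && t < bounds[i]) i--; // walk down to the right one
            counts[i]++;
        }
        return counts;
    }

    /** Simple volatility flag: coefficient of variation above a threshold. */
    public static boolean isVolatile(double[] elapsedMs, double cvThreshold) {
        double mean = Arrays.stream(elapsedMs).average().orElse(0);
        if (mean == 0) return false;
        double var = Arrays.stream(elapsedMs)
                .map(t -> (t - mean) * (t - mean)).average().orElse(0);
        return Math.sqrt(var) / mean > cvThreshold;
    }
}
```

A statement that usually runs in a few milliseconds but occasionally takes seconds lands in widely separated buckets and trips the volatility flag, which is exactly the skew pattern the histograms help surface.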
Writing to you now from outside of Rome at IOD EMEA, but I wanted to take some time to focus on some of my impressions of this year's IDUG NA in Tampa:
The first one is branding. It’s been a year since we renamed many of our Data Studio tools to Optim (and InfoSphere), partly as a result of acquiring Princeton Softech, who was using the name Optim. But there is still confusion. I spend time on this topic in my presentations, explaining that Data Studio is no longer the product family name. I explain that the term Data Studio now refers to the no-charge offering that supports DB2 (on LUW, z/OS, and i5) as well as IDS, and that there is no such thing as a "set of Data Studio tools." Branding is (obviously) a very powerful thing, and a branding change can take a long time to absorb. Kathy tells me that we recently updated our packaging article on developerWorks, so if you are still confused, you may want to check it out. It lays out the high-level functionality, names, etc. for the ‘portfolio previously known as Data Studio.’
Secondly, I was surprised to hear the number of customers that are using .NET to access DB2 for z/OS. I attended a Speed-SIG focused on development in Java and .NET, and all of the customers attending said that they have .NET apps accessing DB2 for z/OS. I knew that this existed, but the high frequency did surprise me. Of course, it gave us a good chance to talk about pureQuery Runtime and how it can help .NET applications in much the same way as Java applications.
Of course, it was great to see many familiar faces and lots of new ones in Tampa. The IDUG board of directors was happy to see a 20% increase in attendance over last year's North America conference. And don’t forget, IDUG Europe is planned for November 8-12 in Vienna.
I'm excited to be co-presenting with one of our customers at IOD EMEA in May. Roberta Barnabe, a DB2 Specialist for UGF (Unipol Gruppo Finanziario) Assicurazioni, will be sharing her experiences with IBM Optim Query Workload Tuner for DB2 for z/OS. If you are interested in z/OS query tuning and want to hear from someone other than us IBMers, I encourage you to come hear her talk about:
She ends with a summary that the tool made tuning analysis easier and, in her words, delivered "BIG TIME SAVING !!!!!!"
I hope to see you there. Here are the particulars:
Thursday, May 20
11:45 AM – 12:35 PM
Hi, everyone. It seems as if lately I am always on the road. Today, I am actually back here at SVL, where, fortunately or unfortunately, depending on your point of view, it is raining, and raining hard. Tomorrow I head off to the UK for regional user group meetings. I also have Rome to look forward to, and that’s why I’m writing today, to give you the highlights at IOD EMEA for products in my portfolio.
When I look at the sessions, one major theme I see is around performance. Regardless of how ‘cool’ new applications and new technologies are, the issue of performance never goes away and often becomes even more critical as users become even more intolerant of slow response times or unavailability. So, although performance may not have the “coolness” factor of some topics, it’s bread and butter for most of our customers. The key is to make performance management more streamlined, less costly people-wise, and more focused on prevention rather than constant reaction.
If you’re going to IOD EMEA, I invite you to make a point of joining us at the following sessions to learn more about what we’re doing to help realize those goals. The sessions here are listed in chronological order, but you will of course need to check for changes at the conference venue itself; some of the speakers may change as well. I look forward to seeing you there. If you haven’t signed up for an executive one-on-one with me through your IBM sales rep and are interested in doing so, here is a link to more information on how to do that.
We recently uploaded Fix Packs for many of our products. The major theme for these Fix Packs is support for Windows 7. Here are the product Fix Packs that include the Windows 7 support with links to the appropriate download documentation.
You may notice that High Performance Unload is not in that list. I will post again when that fix pack is ready.
Remember, you can find help from volunteers and other users at discussion forums that are available for all of these products. See the Integrated Data Management community space for links to all discussion forums on developerWorks.
Today we announced a major enhancement to our performance monitoring and management solution for DB2, with the 4.1 release of Optim Performance Manager for DB2 for Linux, UNIX, and Windows (I’ll use ‘OPM’ in the rest of this blog entry). This is a major new version of OPM that delivers a significantly improved up-and-running experience and quicker problem resolution.
The biggest change you’ll see out of the box is the new Web-based user interface and redesigned problem resolution workflow. Our beta customers gave us great feedback during the development and refinement of this interface, and the result seems to be well received. One of our beta clients states that “The browser interface is easy to use, with intuitive dashboard displays and easy to understand presentation of information.” Even better, since it is Web-based, you can monitor databases anywhere without having to install software on various PCs.
The repository server collects performance metrics from the monitored database and stores them in a DB2 database. You can navigate through the stored data by time and see reports or dashboard data from the chosen time period. This allows for post-mortem problem detection and resolution, as well as proactive monitoring and trend analysis. There are also interactive reports, such as table space disk growth and Top n SQL statements, that you can generate from this stored information.
The team has done a lot of work on making it much faster to get up and running with the monitoring solution. There is an integrated installer, and there are predefined monitoring profiles for a variety of workloads, such as BI, OLTP, SAP, QA, and Development. I’m really happy with the reports coming from the beta that installation and configuration is “easy.”
Finally, you can launch Optim Query Tuner from several of the dashboards, including the Active SQL and Extended Insight Dashboards, to do in-context query tuning on individual problem queries.
To realize the full power of the new integrations and lifecycle capabilities of this release, you should definitely check out the new package available in this release, Optim Performance Manager Extended Edition (OPM EE). It builds on the base capabilities in OPM by including Extended Insight (previously a separately orderable feature), integration with Tivoli monitoring solutions, and configuration tooling for DB2 Workload Manager.
If you like the value of Extended Insight, which provides key metrics and visualizations of SQL as it travels through the software stack for dynamic Java applications, you’ll really like that we’ve extended the capabilities in this release of OPM EE to include CLI applications. We also include out-of-the-box, customizable workload views for SAP, Cognos, DataStage, and InfoSphere SQL Warehouse to help get you going.
To round out our monitoring story and support the strong message we tell with static SQL, OPM EE now includes monitoring support for static SQL from Java applications. So if you want to take advantage of static SQL from Java, either by using the pureQuery API or by using client optimization for any JDBC application, you can get the Extended Insight information that was previously available only for dynamic SQL.
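As a sketch of what this looks like on the application side: once the SQL has been captured and bound, switching a JDBC application to static execution is a pureQuery Runtime configuration change (property names per the pureQuery documentation; the path is hypothetical):

```properties
# Run previously captured-and-bound SQL statically; Extended Insight can
# then surface the same metrics it provides for dynamic SQL.
pdq.executionMode=STATIC
pdq.captureMode=OFF
# Captured SQL metadata produced during the capture/bind phase
# (illustrative path).
pdq.pureQueryXml=/opt/app/capture/orders.pdqxml
```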
We’ve also made it possible to import pureQuery application metadata into OPM so that detailed information about the application source (Java package name, method name, line number) can be displayed on the Extended Insight Dashboard for any individual SQL statement. This particular feature will require a pureQuery Runtime license.
Integration with Tivoli monitoring solutions smooths the handoff between system operators and the detailed database performance analysis performed by DBAs. The integration lets you drill into the deep database diagnostic capabilities of OPM EE directly from the Tivoli Enterprise Portal. One of our beta clients, who does extensive work with clients using Tivoli, found this integration very useful and points out that “Outsourced operations will love the Tivoli integration as it allows them to monitor multiple WAS and DB2 instances from a single point of control.”
Finally, OPM EE provides new tooling to significantly ease the configuration of DB2 Workload Manager. Although the existing WLM configuration tooling still ships with InfoSphere SQL Warehouse, this new tooling is integrated into OPM EE. Key monitoring information vital to workload management is presented in context, so that you can do the related configuration and validation within a single tool.
There is really way more than I can possibly cover in a blog entry. Here are links where you can find more information and see the user interface in action.