With the holidays just around the corner, this is the right time to add to the spirit by announcing the new release of Data Studio (the complimentary tooling for DB2 and IDS). Along with improved quality, here are the two significant pieces of functionality we added:
- Administration of DB2 pureScale: If you aren't already aware, DB2's new version provides continuous availability through the use of highly reliable IBM PowerHA technology on IBM Power systems and a cluster-based shared-disk architecture. DB2 pureScale provides practically unlimited capacity for any transactional workload. Thanks to a proven, scalable architecture, you can grow your application to meet the most demanding business requirements. Using Data Studio, you can do the following administrative tasks on DB2 pureScale:
- Cluster Members: Start, Stop, Configure, Quiesce, Unquiesce
- PowerHA pureScale Server (CF): Start, Stop, Configure
- Basic single query tuning features: The following features have been incorporated from the deprecated Optimization Service Center for DB2 for z/OS:
- Capture queries from all data sources that Optimization Service Center for DB2 for z/OS supports and from XML files that are exported from DB2 Query Monitor for z/OS.
- View formatted queries.
- View access plan graphs.
- Capture information about the data server that queries run against, a feature which corresponds to Service SQL in Optimization Service Center for DB2 for z/OS.
- Generate reports on the performance of queries.
- Run the Query Statistics Advisor to analyze the statistics that are available for the data that a query accesses, check for inaccurate, outdated, or conflicting statistics, and look for additional statistics that you might capture to improve how the data server processes the query.
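To give a feel for the kinds of checks a statistics advisor performs, here is a toy sketch over invented catalog rows. This is purely illustrative; the actual Query Statistics Advisor works against the real DB2 catalog and is far more thorough.

```python
# Illustrative sketch (NOT the actual Statistics Advisor): flag missing,
# outdated, or conflicting statistics in a toy catalog snapshot.
from datetime import date

def check_statistics(stats, today, max_age_days=30):
    """Return (table, issue) findings over a dict of toy catalog stats."""
    findings = []
    for table, s in stats.items():
        if s["card"] is None:
            findings.append((table, "missing table cardinality"))
            continue
        if (today - s["stats_time"]).days > max_age_days:
            findings.append((table, "statistics may be outdated"))
        for col, colcard in s["colcard"].items():
            # A column cannot have more distinct values than the table has rows.
            if colcard > s["card"]:
                findings.append((table, f"conflicting statistics on {col}"))
    return findings

catalog = {
    "ORDERS":   {"card": 1_000_000, "stats_time": date(2009, 6, 1),
                 "colcard": {"ORDER_ID": 1_000_000}},
    "CUSTOMER": {"card": 50_000, "stats_time": date(2009, 12, 1),
                 "colcard": {"CUST_ID": 75_000}},  # colcard > card: conflict
}
for table, issue in check_statistics(catalog, today=date(2009, 12, 15)):
    print(table, "->", issue)
```

The advisor's value is exactly this kind of cross-checking, done automatically against everything the optimizer relies on.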
Note that although the above features have their legacy in Optimization Service Center for DB2 for z/OS, you also get the same functionality for DB2 for Linux, UNIX, and Windows in Data Studio. For integration with Optim Development Studio, or for more advisors, such as Query Advisor, Access Path Advisor, and Index Advisor, you need Optim Query Tuner.
The above new features are only available in the stand-alone package of Data Studio, which is being refreshed. Here is a link to the download document
that includes links to both the Data Studio stand-alone refresh and IDE Fix Pack.
Or, if you want to go ahead and download Data Studio stand-alone right now, here is a direct link. And don't forget the new free e-book, Getting Started with Data Studio for DB2, which you can download here.
We have a lot of exciting things planned for 2010. So stay tuned and happy holidays!!
Updated December 15 to correct a typo.
I am the software architect for InfoSphere Data Architect
, and I wanted to spend a few minutes telling you what we’ve been cooking at the Lab over the past few months to deliver Fix Pack 1
for IDA 7.5.2, which was made available on December 11.
In this Fix Pack, we've added new features and improvements in a number of key areas, which I'll highlight here.
Diagramming improvements: We are excited to have started incorporating the ILOG diagramming technology into InfoSphere Data Architect to provide enhanced diagram layout. The new diagramming capability offers a choice of layouts as well as the option to specify the spacing of objects, all very important steps toward offering greater control and flexibility of visualization.
Import DB2 physical objects from other tools: For years, InfoSphere Data Architect has offered the capability to import models from other tools. Significantly, in Fix Pack 1 we have provided a unique capability to import DB2-specific properties for physical database objects, such as index and storage, faithfully into IDA from other tools like CA ERwin and CA Gen. Whereas generic export/import capabilities may cause you to lose this information, this enhancement in IDA enables you to preserve your existing data design efforts.
Import from COBOL source files and copybooks: Although this capability was on a temporary leave, our z/OS friends will be happy to know that it is back with this Fix Pack. This "legacy data" often contains critical information, so it needs to be included in the data modeling process.
Filtering improvements for productivity and performance: You can now specify filters for model comparison and synchronization in two ways: at the workspace level, to streamline and improve overall comparison performance, and at each individual comparison invocation, to improve the ease of use of the comparison editor.
Integration with InfoSphere Discovery (formerly Exeros): Those familiar with our history and approach know that we have a strong focus on building linkage points across IBM products. The ability to easily share and collaborate on metadata is crucial to accelerating projects. In this Fix Pack, we continue that focus with the introduction of integration with InfoSphere Discovery
. With this new capability, you can import the discovered metadata from InfoSphere Discovery directly into InfoSphere Data Architect and share and use it in a wide range of scenarios including the Optim Data Archiving
and Data Privacy
solutions, as well as InfoSphere Foundation Tools
For more details on the contents of this Fix Pack, be sure to read the Release Notes
Let me close by saying that the InfoSphere Data Architect team loves getting your input, so please keep it coming. Feel free to post your comments and questions on the IDA forum
. We are excited about what we are going to do in 2010!
I was at the Lab in San Jose this week doing some planning and meeting with Business Partners. The folks here say the cold is pretty unreasonable, but I’m heading back to the upper Midwest, so I don’t think they have much room to complain.
Anyway, today we are shipping a set of Fix Packs and refreshes to our portfolio of database admin and tuning products. Some individual product architects will be blogging in more detail about what is in the Fix Packs, but I wanted to give you a head start on downloading by giving you a summary list of all the download documents that tell you how to find what you need. Reminder:
two product releases that Holly Hayes blogged
about in November are also electronically available today. Links to their download documents are here:
Have a great holiday.
I wanted to let you know that today we published a new e-book called Getting Started with Data Studio (for DB2).
It's part of the DB2 on Campus
effort, but we wrote it with the idea that anyone (not just college students) learning DB2 and Data Studio
could use this free book to get comfortable with the user interface, perform everyday database administration tasks, and do routine Data Web Services development. You can use this with any edition of DB2, including Express-C
, which you can also download for free. And although the book was written with DB2 for Linux, UNIX, and Windows in mind, those of you who use DB2 for z/OS can use this as well since there is significant overlap in capability.
Anyway, I hope you like the book. And, more importantly, I hope it encourages more people to get familiar with and try out the broader range of offerings for integrated data management. Several of these other offerings are also available for download on a limited trial basis. You can find links to all the Optim trials (and to Data Studio) on the Integrated Data Management community space
In one of my last blogs, I wrote about the untold story of data privacy, focusing on non-production systems. This past week, IBM made a significant acquisition of Guardium to improve support for compliance and the protection of privacy across all systems. As discussed in the previous blog, we often forget about the systems not in production and think that the current security in place is enough, yet that is not the case.
The addition of Guardium to the mission of privacy, protection, and compliance continues IBM's mission of helping organizations at the enterprise level. Just as with data masking, organizations often don't think of monitoring use of, access to, and changes to archives, but it is still a risk. The combination of Guardium and Optim will provide the capability to monitor SQL access and usage of data not only in production data sources, but also in archives and in development, test, and training environments.
With Guardium comes an enterprise solution supporting the most popular databases and application frameworks. As with the Princeton Softech and Ascential acquisitions of years past, IBM plans to continue support for databases beyond those that are "Blue," and given its track record, now four years after Ascential and two years after Princeton Softech, I believe it will keep that promise. Most enterprises contain multiple technologies, and enterprise solutions require support across them. Guardium continues IBM Software's vision of meeting enterprise needs.
Hey, DB2 for Linux, UNIX, and Windows DBAs! We’ve enhanced the Optim Database Management Professional Edition
with new editions of:
If you missed the summer announcement, this solution combines database administration, change management, performance monitoring, and high-speed unload capabilities into a conveniently packaged and attractively priced solution. Manas blogged
about the new release of Optim High Performance Unload 4.1.2, which includes extended offline capabilities to help you reduce the risk of business disruption. Check out the full announcement here
Optim Database Administrator V2.2.2 helps prevent errors and data loss when upgrading databases to support new applications. It helps you:
- Automate and script structural database changes
- Perform extended alters that require dropping and re-creating tables and managing data preservation
- Perform schema compare and synchronization with custom mapping features
- Migrate database objects, data, and privileges and generate maintenance utility commands
- Manage database objects, privileges, and utilities with embedded components of Data Studio software
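At its core, schema compare and synchronization comes down to diffing two catalog snapshots and generating the DDL that brings the target in line with the source. The toy sketch below illustrates the idea; Optim Database Administrator, of course, does this with full dependency analysis, custom mapping, and data preservation, and only handles real DB2 catalogs, not dicts.

```python
# Toy illustration of schema compare: diff two schema snapshots (dicts of
# table -> {column: type}) and emit the ALTERs needed to synchronize them.
# This only handles added and dropped columns; it is not ODA's algorithm.

def diff_schemas(source, target):
    """Return ALTER TABLE statements to make target match source."""
    ddl = []
    for table, cols in source.items():
        tgt_cols = target.get(table, {})
        for col, coltype in cols.items():
            if col not in tgt_cols:
                ddl.append(f"ALTER TABLE {table} ADD COLUMN {col} {coltype}")
        for col in tgt_cols:
            if col not in cols:
                ddl.append(f"ALTER TABLE {table} DROP COLUMN {col}")
    return ddl

source = {"EMP": {"ID": "INTEGER", "NAME": "VARCHAR(64)", "HIRED": "DATE"}}
target = {"EMP": {"ID": "INTEGER", "NAME": "VARCHAR(64)", "DEPT": "CHAR(3)"}}
for stmt in diff_schemas(source, target):
    print(stmt)
```

The hard parts that the product adds on top of this skeleton are exactly the bullet items above: extended alters that need drop-and-recreate, data preservation, and utility command generation.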
New and enhanced capabilities in Version 2.2.2 improve overall administrative tasks by supporting federated objects, enable off-peak task scheduling by producing scripts that can run with the DB2 command line processor, add basic pureScale support, and improve analysis performance for large DB2 environments (think thousands of tables). The announcement letter is here
We recently returned from IOD, which, for those of you who have never been, is quite an intense experience. We both gave lots of demos and talked to lots of customers about the Optim query tuning solutions. And as you can imagine, any session on query tuning, with or without a tools focus, was really packed. It seems as if people can never hear enough about query tuning, because it's actually pretty interesting to do, and because it can have such an impact on the day-to-day life of a DBA (or whoever in your organization is tasked with reviewing and tuning queries and query workloads). Ray blogged earlier
about how important it is that developers and DBAs collaborate more in the query tuning process. Not only can developers build up their skills, they can hopefully come to the DBA with some of the basic stuff taken care of, or at least a better understanding of what the issues are. The earlier in the cycle that issues are discovered, the less expensive and labor-intensive the tuning process is.
Anyway, we wanted to share with everyone who could not make the conference some scenario-based demonstrations of how query tuning solutions can work together. We are co-presenting at an upcoming Virtual Tech briefing (complimentary!) on November 19th.
The focus of this is on z/OS, so we’ll talk about query tuning from both a development and DBA perspective, and discuss how to use Optim (such as Optim Development Studio
, Optim Query Tuner
and Optim Query Workload Tuner
) and other z/OS tools to work through some typical scenarios. The scenarios we are planning to cover include:
- A development time scenario in which query tuning capabilities and reporting are available directly from the development environment.
- Invoking query tuning capabilities from a monitoring environment such as OMEGAMON and Query Monitor
- How you can use Query Workload Tuning in a version to version migration scenario
- Workload analysis capabilities and an index advisor walkthrough.
We hope you can join us. Register today
Ray and Saghi
IOD was a busy time for the Optim team. I hosted three sessions and also was in the pedestal area and had a chance to interact with several DB2 and IDS customers. While all the solutions certainly got their attention, two solutions seemed to bubble up to the top in terms of customer discussion and questions. Optim pureQuery
was one of them, mainly from the DB2 for z/OS crowd. There is definitely a trend in the market of customers wanting to take advantage of the wealth of data stored on the DB2 for z/OS platform rather than transport that data to another platform. For these new applications, Java is the preferred language; however, finding a data access layer that is acceptable both in terms of ease of use and high performance has been a challenge. Many companies are looking at things like Hibernate to address the ease-of-use challenge, but unfortunately Hibernate doesn't address the high-performance requirement. Several customers wanted to understand how pureQuery could help them in this situation, and they were very excited once we talked to them not only about the benefits of Static SQL, but also about the client optimization features of pureQuery, and most importantly about how we could make the SQL that Hibernate generates visible to the developer and the DBA. It seems to be a perfect fit for providing the high performance that DB2 for z/OS customers expect for these new Java applications.
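The client optimization idea can be summarized as: capture the SQL an application (or a framework like Hibernate) issues at run time, then on later runs match incoming statements against the captured set so they can execute on a pre-bound, static path. The toy model below is invented for illustration; the real feature works inside the pureQuery runtime and JDBC driver with DB2 packages, not Python classes.

```python
# Toy model of capture-then-bind client optimization. Class names and the
# two-phase flow are illustrative only, not pureQuery's actual API.

class CaptureRunner:
    """Phase 1: run dynamically while recording every statement seen."""
    def __init__(self):
        self.captured = set()

    def execute(self, sql):
        self.captured.add(sql)   # record the statement for later binding
        return "dynamic"

class StaticRunner:
    """Phase 2: previously captured statements run on the 'static' path."""
    def __init__(self, captured):
        self.bound = set(captured)   # statements "bound" into packages

    def execute(self, sql):
        # Known statements take the pre-bound path; new ones fall back.
        return "static" if sql in self.bound else "dynamic"

cap = CaptureRunner()
cap.execute("SELECT NAME FROM EMP WHERE ID = ?")
run = StaticRunner(cap.captured)
print(run.execute("SELECT NAME FROM EMP WHERE ID = ?"))  # static
print(run.execute("SELECT * FROM DEPT"))                 # dynamic
```

The capture set is also what makes the Hibernate-generated SQL visible: once it is recorded, the developer and DBA can inspect exactly what the framework sends to the data server.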
The second solution that seemed to get a lot of attention was our Performance Expert
solution. When it comes to problem diagnosis, everyone is always looking for a better mousetrap, and it seems that solutions in this area get hot and then run their course. Well, Performance Expert seems to be the up-and-coming hot product. I believe this is due to the sheer wealth of information this product collects. If, or should I say when, a problem occurs in production, the most difficult challenge is gathering that information, especially because that moment in time has passed and you really need historical data to determine what went wrong. Performance Expert collects information and stores it in intervals, making it incredibly easy to get the information you need, all synchronized to the right time. You can see a short video of this capability here
. The other thing I found interesting was all the various reports and consoles customers were writing based on the data found in the Performance Warehouse. From capacity planning to SLA reporting, it seems to me that there is a lot of customization going on that would probably make a great birds-of-a-feather session in future conferences.
As I mentioned earlier, all the solutions got their attention. I hosted a joint session with Randy Wilson from Blue Cross Blue Shield of Tennessee where Randy described a very interesting outage situation (not fun!) and how Optim High Performance Unload
saved the day for them in terms of being able to supplement their recovery and provide the historical data they needed. Interesting how customers really think outside the box and come up with creative uses for our solution. After that session I had quite a few discussions with customers on how they could use High Performance Unload for situations like Randy's where back-level DB2 or dropped tables caused challenges in terms of recovery.
Well it was a busy week and now comes all the customer follow-up that happens after IOD!
Since all the action is in Las Vegas this week, the halls are quiet (or so I hear, since I am working from home today) and meetings are cancelled. This gives me time to provide you all with a little commercial for a little spot on the "internets" called the Integrated Data Management community space
. This space started out as the Data Studio community space by Grant Hutchinson
. When I joined the team, I took over management of the space including the move to IDM to be more inclusive of the whole Optim portfolio. If you've never worked with developerWorks spaces
, they are kind of cool. Anyone (internal or external) who gets approval can self-publish their own web space using the templates provided by developerWorks.
Anyway, I feel the IDM space needs a commercial because so many people still don't know about it (even people on my own team!), and I must send out the link several times a day. The intended purpose of the space is to put in one place links to all the blogs, discussion forums, articles, product documentation, classes, downloads, videos, demos, upcoming events, etc., related (mostly) to the Optim portfolio.
Admittedly, the site isn't perfect as I am limited by the dw templates, and I still have work to do to get more of the heritage Optim materials integrated. But most people who discover it do think it's very useful. There is even a place to post messages, but that has not been used thus far. I think using our existing discussion forums is probably the best way to get questions answered anyway. Not sure where the forums are? Well, the links to all the forums are on the space!
As of today, the space is divided into three tabs: Home, Articles and tutorials, and Downloads. Here is the Home tab:
As you can see in the screenshot above, you can add any of the portlets to your My Yahoo! or iGoogle page (or as an RSS feed) by clicking on the orange plus icon. The screenshot below shows my Yahoo! page with the 'latest news' and forums portlets added.
Here is the downloads tab. It also includes some additional related software such as DB2 Express-C and the Replication Dashboard.
Anyway, I hope you like the space. Let me know if there's anything you want to see added, changed, or removed. And spread the word, please?
Let me start by introducing myself. My name is Manas Dadarkar and I am the technical lead for Data Studio (the complimentary tooling for DB2 and IDS) and for High Performance Unload (HPU) for DB2 for Linux, UNIX, and Windows. I took on this role recently and am right in the middle of planning a lot of exciting things for Data Studio and HPU.
However, I will leave Data Studio for another day (blog posting), so today let’s talk about HPU, including the announcement today of the new release, 4.1.2, under the new name, Optim High Performance Unload for DB2 for Linux, UNIX and Windows.
For people not familiar with HPU, it's a product that offers high-speed unload of DB2 data, thereby allowing DBAs to work with very large quantities of data easily and efficiently. In limited lab testing using a single large table, we have seen a 6 to 10 times performance improvement using HPU as compared to the DB2 EXPORT utility. The usual disclaimers apply: your results will vary.
Since version 4.1, HPU has also supported automatic migrations. You can now migrate data directly from one database to another, including unloading, transferring, and loading of the data between the source database and target database without the need for intermediate storage. You can get more information about HPU here. It's also featured in our Day in the Life of a DBA video demo.
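The "no intermediate storage" point is worth a small illustration: instead of unloading to a file and loading that file afterwards, rows stream from source to target directly. The sketch below models this with plain Python lists and a generator; it is a conceptual toy, not HPU's actual mechanics, which operate on DB2 table spaces and backup images.

```python
# Conceptual sketch: streaming migration with no staging file. Rows flow
# from the source straight into the target, one at a time.

def unload(source_table):
    """Generator that yields rows one at a time (nothing staged on disk)."""
    for row in source_table:
        yield row

def migrate(source_table, target_table):
    """Pipe unloaded rows directly into the target; return rows loaded."""
    loaded = 0
    for row in unload(source_table):
        target_table.append(row)
        loaded += 1
    return loaded

source = [(1, "Anna"), (2, "Ben"), (3, "Chao")]
target = []
print(migrate(source, target))  # 3
```

With a real unload at HPU speeds, cutting out the intermediate file removes both the staging I/O and the disk space a full copy of the data would otherwise need.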
With Version 4.1.2, what's new is being able to unload or restore using incremental backup images. If you use incremental backups, it’s likely that those backups will provide you with a more current version of a dropped table. Check out the full announcement here.
For those of you who have moved to DB2 LUW 9.7 (or will be soon), you don't need to wait for the 4.1.2 release, which is planned to be electronically available on November 20. We'll shortly be releasing the support for 9.7, including the ability to work with DB2's industry-leading XML data support. I'll post a short blog with the link once that is available, and we'll also put the word out on Twitter.
Also, if you are going to IOD2009, you can hear directly from a customer about HPU by attending session 1499, "A day in the life of a DBA: How to keep your sanity."