DB2 for z/OS Planning Your Upgrade: Reduce Costs. Improve
by Cristian Molaro, John Campbell, and Surekha Parekh
About the book:
In the current economic climate, businesses are under significant pressure to control costs and increase efficiency in order to improve their bottom line. DB2 for z/OS customers around the world are trying to gain competitive advantage by doing more with less: more business insight, more performance, more operational efficiency, more functionality, and more productivity, with less cost, quicker time to market, and a lower TCO.
With support for DB2 Version 8 scheduled to end in April 2012, there has never been a better time to start planning your DB2 10 upgrade. There are many reasons to upgrade to DB2 10, and users will see significant business benefits. Here are the top 10 reasons why we feel you should start planning your upgrade now.
1) Improved performance, reduced software license costs
2) Increase the number of concurrent users by a factor of ten
3) Reduce contention in database administration
4) More administrative capabilities while database is online
5) Improved security and auditing
6) Maintain “snapshots” of changing data – Temporal Data
7) Improved portability via enhanced SQL
8) Enhanced pureXML performance and usability
9) Improved productivity for database/systems administrators and application programmers
10) Better online transaction processing performance – Hash Access
About the authors:
The authors are very well known and hardly need an introduction, but nevertheless, here are some details:
John Campbell is an IBM Distinguished Engineer working at the IBM Silicon Valley Lab. John is one of IBM’s foremost authorities on implementing high-end database and transaction-processing applications. Over the past 12 months he has been working closely with DB2 10 Beta customers.
Cristian Molaro is an independent consultant based in Belgium, working in Europe for customers with some of the largest DB2 for z/OS installations. He has presented papers at several international conferences and local user groups in Europe and North America. He holds a degree in Chemical Engineering and a Master’s in Management Sciences, and is coauthor of five IBM Redbooks related to DB2, including the recent “DB2 10 for z/OS Performance Topics”. He was recognized by IBM as an Information Champion in 2009, 2010, and 2011. Cristian is part of the IDUG EMEA Conference Planning Committee, where he serves as the Presentations Team Leader, and he is Chairman of the DB2 LUW User Group BeLux.
Another book that is launching at the IBM Information on Demand Conference this year is by Sunil Soares, “Selling Information Governance to the Business: Best Practices by Industry and Job Function”. You may remember Sunil from last year when we launched his very popular Flashbook, “The IBM Data Governance Unified Process: Driving Business Value with IBM Software”. Based on the feedback and enormous success of last year’s book, Sunil was offered a book contract with MC Press for a new book.
The new book, “Selling Information Governance to the Business: Best Practices by Industry and Job Function”, is so amazing, I’m not sure where to start. First of all, the publisher and the copy editor were completely wowed by both Sunil and the content he created for the book. I haven’t read the book yet, but Sunil has just sent me a draft, and from a simple browse I can completely understand why they are so excited.
About the book:
One of the major challenges with any information governance program is explaining the value to the business. Most information governance programs deal with certain themes that are common across every enterprise, including poor data quality, inconsistent business terms, fragmented data, high storage costs, regulatory compliance, and security and privacy issues. However, these themes present themselves differently across industries and job functions. For example, poor data quality manifests itself in the form of duplicate customer records in a bank, which affects the ability of the credit risk group to establish the overall exposure to an individual customer across product lines. In retail, poor data quality results in duplicate mailings of multiple catalogs by the marketing department to the same household. I have spoken to hundreds of organizations across multiple industries and geographies about their information governance programs. The conversation quickly proceeds along the following lines: “I get the value of information governance. However, it is very hard for me to convince the business about the value of an information governance program. What best practices do you have to help me do this?” That is why I am writing this book: to help you apply best practices in your organization based on what I have learned, heard, and observed through my industry experience.
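To make the duplicate-customer-record example above concrete, here is a minimal sketch of the matching idea behind that kind of data-quality work: normalize name and address fields into a match key, then group records that collide on the same key. The sample data and normalization rules are invented for illustration; real matching engines use far more sophisticated fuzzy logic.

```python
# Toy illustration of detecting duplicate customer records
# (invented sample data and matching rules).
from collections import defaultdict

def normalize(record):
    """Build a simple match key from name and address fields."""
    name = record["name"].lower().replace(".", "").strip()
    addr = record["address"].lower().replace("street", "st").strip()
    return (name, addr)

def find_duplicates(records):
    """Return groups of record ids that normalize to the same key."""
    groups = defaultdict(list)
    for rec in records:
        groups[normalize(rec)].append(rec["id"])
    return [ids for ids in groups.values() if len(ids) > 1]

customers = [
    {"id": 1, "name": "J. Smith", "address": "12 Main Street"},
    {"id": 2, "name": "j smith",  "address": "12 Main St"},
    {"id": 3, "name": "A. Jones", "address": "9 Oak Street"},
]
print(find_duplicates(customers))  # → [[1, 2]]
```

Records 1 and 2 collide on the same key even though their raw text differs, which is exactly the situation that inflates a bank's view of per-customer exposure or triggers duplicate catalog mailings in retail.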
This book discusses the best practices to sell the value of information governance, and is divided into four parts:
- Best practices by industry that deal with the application of information governance principles within banking and financial markets, insurance, healthcare, manufacturing, retail, travel and transportation, government, oil and gas, telecommunications, and utilities.
- Best practices by job function that deal with the application of information governance principles within critical job functions such as sales and marketing, finance, information technology operations, information security and privacy, human resources, legal and compliance, operations, supply chain, and product management.
- Cross-industry best practices that deal with horizontal topics such as roles and responsibilities, metrics, metadata, maturity assessments, and business cases. These themes appear consistently within information governance programs across job functions, industries, and geographies.
- Information-centric applications and information governance software tools that cover the applications that benefit from information governance and the actual software tools that facilitate information governance.
The objective of the book is to provide a representative sample, rather than an exhaustive list, of best practices to sell the value of information governance within an organization. You should use these best practices as inspiration for what might work within your organization. It is important that you read chapters from industries and functions outside of your own because there are a number of case studies that you might find useful for your specific situation.
Another unique thing about this book is the number of case studies and forewords: there are more than 50 case studies and 16 forewords. I think what I’ll do is list the people writing the forewords for the book. This might give you an idea of how well connected Sunil is and the depth of coverage in this book:
Partner and Global Leader
Business Analytics and Optimization – Information Management Foundation
IBM Global Business Services
Global Vice President
IBM Software Group
Business Analytics and Optimization
IBM North America
From a banking perspective: David W. Bailey
Director of Enterprise Product Management
From a telecommunications perspective: Komalin Chetty
Head – Data Governance Office
Telkom South Africa
From a financial markets perspective: Paul Ranaldo
Senior Vice President
Master Data Management/Data Governance
Brown Brothers Harriman
From an analyst perspective: Aaron Zornes
Chief Research Officer, The MDM Institute
& Conference Chairman, The “MDM & Data Governance Summit” Global Conferences Series
(London, Mumbai, NYC, San Francisco, Shanghai, Singapore, Sydney, Tokyo, & Toronto)
From a consumer packaged goods perspective: Jay Yusko, Ph.D.
VP Technology Research
From a European banking perspective: Banu Ekiz
Vice President – Business Intelligence
IBM Information Champion
Akbank Information Technologies
From a retail perspective: Michel Boudrias
Director of Enterprise Architecture
SAQ – Société des alcools du Québec
From a manufacturing perspective: Anthony Harris
Enterprise Information Architect
Air Products Gases and Chemicals Inc.
From a distribution perspective: Cengiz Barlas
Vice President & Global Head of Data Management
From an information services perspective: Gustavo Tadao Okida
Chief Enterprise Architect - LATAM
From an analyst perspective: Claudia Imhoff
President of Intelligent Solutions, Inc.
Founder of the Boulder BI Brain Trust
From an analyst perspective: Andy Hayler
The Information Difference
From a retail perspective: Charles Hunsinger
Chief Information Officer
Harry & David
About the Author:
Sunil Soares is the director of information governance within IBM Software Group. Sunil has worked with hundreds of clients across multiple industries, including banking, insurance, life sciences, retail, telecommunications, media and entertainment, energy and utilities, manufacturing, healthcare, and government. Sunil helps clients establish information governance programs that align IT and the business around common business objectives. Sunil’s first book, The IBM Data Governance Unified Process, detailed the 14 steps and almost 100 sub-steps needed to implement an information governance program. The book is already in its second printing and has been translated into Chinese.
I’ve just learned that Sunil is also blogging now! You can find his entries on the Mastering Data Management website.
Congratulations Sunil for finishing your second book!
At this time there are 82 Web-Based Training (WBT) Courses available for you to take to sharpen your skills on IBM Information Management (IM) products and technologies. Eighty-two is a LOT of courses! More are being created and being updated, so if this is the way you like to learn, make sure you stay in touch with all the latest as they are released.
IBM’s Education is known for its high quality content and excellent instructors. Although with WBT courses you won’t be in a classroom environment, you’ll get the same quality of content that you expect. WBT courses allow you to learn at your own pace and on your schedule with training that is completely self-contained from your desktop. Most courses provide introductory and overview training that serves as the foundation for more technical and follow-on training. Courses include expert instruction but do not include hands-on labs.
How do you get there?
Access this site to see all course offering types: http://www.ibm.com/software/data/education/elearning.html. Under the Web Based section, click on "Browse the complete Information Management WBT portfolio". I’ve included the 82 courses that I know of right now and have put the newest ones at the top of the list. If you’re an IBM employee, contact me offline as there is a different procedure that must be used to take these classes.
Using IBM Optim Designer and Optim Manager - Web Based – New
Connect Information Server to SAP R/3 ECC (DataStage Pack for SAP) – New
DataStage Essentials - Web Based – Updated
IBM InfoSphere Information Server Metadata in Practice – New
InfoSphere BigInsights Essentials (Web) – New
InfoSphere MDM Server Workbench for MDM Server 9 - Web Based – Updated
Programming for InfoSphere Streams V2 - Web Based - New
5 Ways to Apply Data Quality with Information Analyzer WBT
A Blueprint for Implementing MDM Server Using V8.5 - Web Based Training
Advanced DataStage - Web-Based
Advanced DB2 SAP for Administrators - Web-Based
Advanced Programming InfoSphere Streams - Web Based
Basic Programming for InfoSphere Streams - Web Based
Business Glossary Essentials - Web Based
Changing Business with Data Insight (Web Based)
Configure ODBC for IBM DataStage on UNIX - Web Based Training
DB2 9 for z/OS Database Administration Workshop Part 1 (Web Based)
DB2 9.5 for Linux, UNIX, and Windows Transition from DB2 9.1 (Web-Based)
DB2 9.5 for Linux, UNIX, and Windows Transition from DB2 V8.2 (Web-Based)
DB2 9.7 for Linux, UNIX, and Windows New Features (Web-Based)
DB2 Everyplace V9 Tutorial – Online
DB2 Express V9 - Developer in a Day Workshop – Online
DB2 Family Fundamentals - Web-Based
DB2 for LUW 9.5 Fundamentals – Online
DB2 for LUW pureXML
DB2 LUW Advanced Topics – Online
Exploring IBM solidDB Universal Cache - Web-Based
Exploring Initiate Workbench V8.x - Web Based
Exploring Initiate Workbench V9.x - Web Based
FastTrack Essentials - Web Based
Federation Server Essentials - Web Based
How to Analyze Data (Using InfoSphere Information Analyzer) - Web Based Training
How to Install IBM DataStage on UNIX - Web Based Training
How to Install IBM QualityStage on UNIX - Web Based Training
How to Install Information Server 8.5 - Basic and Advanced Option
IBM Data Studio Basics - Web Based
IBM InfoSphere Change Data Capture Essentials V6.3 - Web Based
Implementing IBM InfoSphere MDM Server for PIM V9 (Web Based)
Information Analysis - Web Based
Informix 11 New Features (Web Based Training)
Informix Version 10 – Online
InfoSphere Information Server Administration - Web Based
InfoSphere Master Data Management Server Overview (MDM Server) - Web Based Training
InfoSphere MDM Server Architecture – WBT
InfoSphere MDM Server Domains for MDM Server 9 - Web Based
InfoSphere MDM Server User Interface Generator - Web Based
InfoSphere Warehouse 9 - Cubing Services - Web Based Training
InfoSphere Warehouse 9 - Data Mining and Unstructured Text Analysis – WBT
InfoSphere Warehouse 9 - SQL Warehousing Tool and Admin Console – WBT
InfoSphere Warehouse 9 Components - Web Based
Initiate Self-Paced Bundle V8.x - Web Based
Initiate Self-Paced Bundle V9.x - Web Based
Install, Configure, and Use DataStage SOA Edition (RTI) - Web Based Training
Installing and Configuring IBM Optim Data Privacy Solution - Web-Based
Introduction to IBM InfoSphere Information Analyzer - Web Based Training
Introduction to IBM Optim Data Privacy Solution - Web-Based
Introduction to IBM QualityStage - Web Based Training
Introduction to InfoSphere Change Data Capture How to Use CDC WBT
Introduction to InfoSphere DataStage Using Version 8 - Web Based Training
Introduction to InfoSphere QualityStage Using Version 8 - Web Based Training
Introduction to Initiate Master Data Service - Web Based
Introduction to New Features in Informix Version 11 – Online
Introduction to TSA in IBM Smart Analytics Systems - Web Based
Managing Workloads for DB2 LUW and InfoSphere Warehouse - Web Based
Metadata Workbench Essentials - Web Based
Overview of (IISD) InfoSphere Information Services Director - Web Based Training
Overview of IBM DataStage Enterprise Edition for ETL Developers - Web Based Training
Overview of IBM Information Server and DataStage QualityStage Version 8.0 - Web Based Training
Production Automation of IBM DataStage- Web Based Training
Programming for InfoSphere Streams - Web Based
Programming with IBM InfoSphere MDM Server for PIM V9 (Web Based)
QualityStage Advanced Concepts - Web Based
QualityStage V8 Essentials - Web Based
SQL Replication: Advanced Topics (Web Based)
Troubleshooting IBM InfoSphere Information Analyzer - Web Based Training
Using InfoSphere MDM Server for PIM V9 (Web Based)
Using Initiate Inspector V8.x - Web Based
Using Initiate Inspector V9.x - Web Based
Using Queue Replication - Web Based
Using SQL Replication (Web Based)
Validating Data with Rules in Information Analyzer 8.1.1 - Web Based Training
XML and DataStage: Convert Relational and XML Data - Web Based Training
I hope you’ve found one or two in this massive list that interest you! Take the course and let me know what you thought of it.
And it looks like I missed another DB2Night Show episode! This one was on the z/OS side and featured Julian Stuhler. If you missed it as well, catch the replay.
Details about the episode:
In episode #Z08, special guest Julian Stuhler (IBM Gold Consultant, IBM Champion, and former IDUG president) gave an excellent overview of the many ways you can exploit DB2 using the Java programming language, varying from simple JDBC to pureQuery. He also discussed the impedance mismatch between Java and relational databases, and the new features in DB2 9 and DB2 10 that Java can benefit from. If you are into Java, then this DB2Night Show episode is a must!
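The "impedance mismatch" the episode mentions is the gap between the flat rows a database returns and the typed objects an application wants to work with. Here is a tiny sketch of that gap (in Python rather than Java, and with an invented row and class standing in for a JDBC result row and POJO):

```python
# Toy sketch of the object-relational impedance mismatch: the database
# hands back flat row tuples, the application wants typed objects, and
# someone has to write (or generate) the mapping in between.
from dataclasses import dataclass

@dataclass
class Employee:
    emp_no: str
    last_name: str
    salary: float

def row_to_employee(row):
    """Hand-written mapping from an (emp_no, last_name, salary) row."""
    emp_no, last_name, salary = row
    return Employee(emp_no=emp_no, last_name=last_name, salary=float(salary))

# A row as a cursor might return it (values are invented):
row = ("000010", "HAAS", "52750.00")
emp = row_to_employee(row)
print(emp.last_name, emp.salary)  # → HAAS 52750.0
```

Frameworks such as pureQuery exist precisely to generate this kind of mapping code for you instead of leaving it to hand-written boilerplate.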
Don’t miss another episode! Here are a few of the upcoming shows:
Date & Time: 20-SEP-2011 at 11am EDT
Topic: Distributed access to DB2 for z/OS - Successful administration
Speaker: Cristian Molaro
Date & Time: 23-SEP-2011 at 11am EDT
Topic: Using and Tuning IBM OPM - What you NEED to KNOW
Speaker: Scott Hayes
Date & Time: 27-SEP-2011 at 11am EDT
Topic: Smart Analytics Optimizer - Technology Update
Speaker: Namik Hrle
Date & Time: 30-SEP-2011 at 11am EDT
Topic: DB2 LUW Index Design and Lessons Learned - "I put my kids through college by dropping indexes!"
Speaker: Martin Hubel
Date & Time: 6-OCT-2011 at 11am EDT
Topic: First Look at Changes to DSNZPARMs by DB2 10
Speaker: Willie Favero
The schedule, registration, and replays can be found at http://www.dbisoftware.com/db2nightshow/
I got this information from a newsletter that I subscribe to and thought that you may be interested. IBM continues to invest in creating education that is timely, well-designed, and highly rated, so you can continue to expand your skills and perform at your optimal level.
Here are the IBM Information Management courses that have recently been updated:
You can view all IM training classes from this link: View all Information Management training
The newsletter I subscribe to is: IBM Training News.
To subscribe to this newsletter and many others, click here: Subscribe. Once on this site, you can choose the product and technology areas that you are interested in seeing in your newsletter and the newsletter is custom-built to your requirements.
Keep your skills as sharp as they can be!
Last year at the IOD Conference, one of the more popular events was handing out printed copies of the IBM Redbooks for Cognos 10. We had only a limited number of copies, and we went through them very quickly; many people came and left empty-handed, unfortunately. I pointed out that the book is available for free online at the ibm.com/redbooks site, but clearly there are many people who still prefer printed copies!
This year at the IBM Information on Demand 2011 Conference, we’re announcing up front which IBM Redbooks we’ll be handing out, and when. If you want one of these books, make sure you come to the Conference Bookstore at the planned times to get your copy. If you miss it, you miss it… but you can still get the content online!
Get one of the limited copies… and meet the author team behind the book. Maybe you can even find out how to become an IBM Redbooks author yourself.
Monday, October 24, from 10:45 a.m. – 12:30 p.m.
IBM Production Imaging Edition and Datacap Taskmaster
Metadata Management with IBM InfoSphere Information Server
Tuesday, October 25, from 3:00 – 4:00 p.m.
IBM Redbooks DB2 10 for z/OS Performance Topics
Wednesday, October 26, from 11:30 a.m. – 1:30 p.m.
IBM InfoSphere Streams and the Streams Processing Language
IBM solidDB: Delivering Data with Extreme Speed
The Conference bookstore will be a busy place this year. Here are the other signings and giveaways that are taking place at the event:
- Meet Author Sandy Carter at IOD2011 - Get Bold: Using Social Media
- Meet Netezza Expert & Author David Birmingham at IOD11
- Meet Data Warehouse expert & author Bob Laberge
- Meet author Tony Giordano, “Data Integration Blueprint and Modeling”
- Meet FileNet expert and author Bill Carpenter at IOD11
- Meet author & decision management expert James Taylor at IBM’s Information on Demand Conference
- From Idea to Print by Roger Sanders
- Flashbook: Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data
- Meet author and DB2 expert Roger Sanders at IOD11
- Meet Flashbook author Arvind Sathi - Customer Experience Analytics
- Meet DB2 authors and experts Lawson & Luksetich at IOD: DB2 10 for z/OS DBA Cert Guide
Keep this in mind as you build your schedule for the conference. Note: soon I’ll have session numbers for all of these events, so that you can add them to your SmartSite schedule.
Join Burt Vialpando and Rav Ahuja on Thursday, September 29, 2011, when they will compare the autonomic computing capabilities of DB2 and Oracle:
Many database professionals ask how DB2 and Oracle compare regarding their autonomic capabilities. That is, how do they make things easier for the DBA with smart, automatically implemented features? They also wonder whether one has a clear advantage over the other in this respect. This DB2 Chat with the Lab webinar will answer those questions and include a comparison of the following autonomic capabilities in DB2 and Oracle:
- Memory Management
- Storage Management
- Utility throttling
- Automatic Configuration
- Automatic Maintenance
DB2 and Oracle - An Autonomic Computing Comparison
Date: Thursday, September 29, 2011 (29.9.2011)
Time: 12:30 PM – 2:00 PM Eastern (ET); 11:30 AM Central; 9:30 AM Pacific; 17:30 London; 18:30 Frankfurt and Paris; 10:00 PM India
Speaker: Burt Vialpando
Host: Rav Ahuja
The session will be recorded, but attend live to get your questions answered.
Presentation charts will be available just before the webcast starts.
To register and receive instructions for attending this webcast: Register
Thanks to Rav for this information.
Sadly I missed John Hornibrook’s fourth appearance on the DB2Night Show that took place on Friday, Sept 9th. Did you miss it as well? There is a replay available (Thanks Scott!). You can find it here:
To download a recorded replay of Episode #57, right click on the link below and choose "Save As..."
Episode 57, 9 September 2011, DB2 LUW Vital Statistics with John Hornibrook
What was John talking about? DB2 LUW Vital Statistics. 94% of the audience said that they learned something new during the talk, so you can expect it to be very informative.
This show, #57 in the DB2 for LUW series, marks the kickoff of Season #3. Since the show's inception, there have been over 90,000 downloads of replays to date. The library of FREE DB2 LUW Education is enormous and growing! Check out these valuable educational resources by watching REPLAYS whenever and as often as you can.
Also, never miss another show. They take place pretty much every Friday from 11:00 – 12:00 ET, so block your calendar to make time for a weekly learning opportunity. Take a look at the schedule of upcoming shows. As always, it’s a very impressive lineup.
Thanks to host Scott Hayes for Engaging, Entertaining and Educating us!
Note: as of today, October 12, 2011, I’ve been informed that this book has been delayed. I am so sorry for the inconvenience and the trouble this delay is causing for the authors, the bookstore, and especially the test takers! – Susan
Susan Lawson and Dan Luksetich have updated their popular certification guide, and it is being published just in time for the IBM Information on Demand 2011 Conference: DB2 10 for z/OS Database Administration: Certification Study Guide. This book will be sold at the Conference Bookstore at a 20% discount. Remember that certification exams are free for attendees at IOD.
You can meet Susan and Dan at the Conference Bookstore and have them sign a copy of your book:
Once again, Susan is teaching the Pre-Conference Certification Crammer Course: (9Q001) DB2 10 for z/OS Database Administration Certification Crammer for Exams 730 & 612
Dan is speaking at the following session:
Explaining the IBM DB2 for z/OS Dynamic Statement Cache: Tuesday 11:15 – 12:15.
IBM DB2 for z/OS does a good job of caching dynamic SQL statements to avoid recompiling (also known as binding) the statements on the fly, which can save significant system resources. However, users still must determine what dynamic statements are coming into DB2 and how they are performing. DB2 for z/OS has the ability to capture and report on the statements in the dynamic statement cache. The session will focus on the use and management of this facility, and on how to report, understand, and act on the available performance information.
What’s in the book?
- Read this book to prepare to pass Exam 612 – DB2 10 for z/OS Database Administration.
- All test objectives for this exam are covered, to the same degree that they are covered on the exam.
- Susan and Dan were members of the exam development team, so they know exactly what is on the exam!
- Susan and Dan use DB2 every day and have 22 years of experience.
While the book does cover all topics on the test, it also covers much more than that. It covers the new features of DB2 10 for both database and application development.
What do you need to know to pass the exam?
This is an outline of the Exam Objectives. You can see the details on the website: Exam 612 Objectives
730 Exam Objectives – Family Fundamentals
Section 1 - Planning (14%)
• Knowledge of restricting data access
• Knowledge of the features or functions available in DB2 tools (just tools that come with product - distributed space - i.e., configuration advisor, configuration assistant, command line processor)
• Knowledge of database workloads (OLTP vs. warehousing)
• Knowledge of non-relational data concepts (extenders)
• Knowledge of XML data implications (non-shredding)
Section 2 - Security (11%)
• Knowledge of DB2 products (client, server, etc.)
• Knowledge of different privileges and authorities
• Knowledge of encryption options (data and network)
• Given a DDL SQL statement, knowledge to identify results (grant/revoke/connect statements)
Section 3 - Databases and Database Objects (17%)
• Ability to identify and connect to DB2 servers and databases
• Ability to identify DB2 objects
• Knowledge of basic characteristics and properties of DB2 objects
• Given a DDL SQL statement, knowledge to identify results (ability to create objects)
Section 4 – Using SQL (23.5%)
• Given a DML SQL statement, knowledge to identify results
• Ability to use SQL to SELECT data from tables
• Ability to use SQL to SORT or GROUP data
• Ability to use SQL to UPDATE, DELETE, or INSERT data
• Knowledge of transactions (i.e., commit/rollback and transaction boundaries)
• Ability to call a procedure or invoke a user defined function
• Given an XQuery statement, knowledge to identify results
Section 5 - Tables, Views and Indexes (23.5%)
• Ability to demonstrate usage of DB2 data types
• Given a situation, ability to create table
• Knowledge to identify when referential integrity should be used
• Knowledge to identify methods of data constraint
• Knowledge to identify characteristics of a table, view or index
• Knowledge to identify when triggers should be used
• Knowledge of schemas
• Knowledge of data type options for storing XML data
Section 6 - Data Concurrency (11%)
• Knowledge to identify factors that influence locking
• Ability to list objects on which locks can be obtained
• Knowledge to identify characteristics of DB2 locks
• Given a situation, knowledge to identify the isolation levels that should be used
612 Exam Objectives
Section 1 - Database Design and Implementation (30.5%)
• Design tables and views
• Explain the different performance implications of
• Design indexes
• Create and alter objects
• Perform table space and index partitioning
• Normalize data and translate data model into physical model
• Implement user-defined integrity rules
• Use the appropriate method to create and alter DB2 objects
• Encoding schemes
Section 2 - Operation and Recovery (29%)
• Issue database-oriented commands for normal operational conditions
• Issue database-oriented commands and utility control statements for use in abnormal conditions
• Identify and perform actions that are needed to protect databases from planned and unplanned outages and ensure that timely image copies are taken periodically
• Load and unload data into and from the created tables
• Reorganize objects when necessary
• Monitor the object by collecting statistics
• Monitor threads
• Identify and respond to advisory/restrictive statuses on objects
• Establish timely checkpoints
• Identify and perform problem determination
• Perform health checks
• Develop backup scenarios
• Describe the special considerations for availability in a data sharing environment
Section 3 - Security and Auditing (7%)
• Protect DB2 objects
• Protect connection to DB2
Section 4 - Performance (29%)
• Plan for performance monitoring by setting up and running monitoring procedures
• Analyze performance
• Analyze and respond to RUNSTATS statistics analysis
• Determine when and how to run the REORG utility
• Understand and implement Real-Time Statistics & DSNACCOR(X)
• Analyze cache
• Evaluate and set appropriately the performance parameters for different utilities
• Describe the performance concerns for the distributed environment
• Describe DB2 interaction with WLM
• Interpret traces (statistics, accounting, performance) and explain the performance impact of different DB2 traces
• Identify and respond to critical performance thresholds
• Excessive I/O wait times
• Identifying lock-latch waits and CPU waits
• Identifying and resolving deadlocks and timeouts
• Review and tune SQL
• Dynamic SQL Performance
• Performance Features
Section 5 - Installation and Migration/Upgrade (4.5%)
• Run catalog health checks using queries and utilities
• Identify the critical DSNZPARMs
• Identify the migration/upgrade modes
• Identify and explain data sharing components and concepts such as:
For other IOD sessions related to DB2 10 for z/OS, see the Information Management Roadmaps.
Other blog entries about IOD Events:
Customer Experience Analytics: Fast, Intelligent and Action-Packed
by Arvind Sathi
- Book information on blog
- Electronic Version (6.98 MB)
Today’s customers can use a variety of media (including Facebook and Yelp) to broadcast their good and bad experiences in real time. You must match their velocity! This book will help your organization create the capabilities to sense, isolate, and alter the customer experience to your competitive advantage, creating a real-time, adaptive relationship with your customers.
Analytics is one of the hottest IT topics among organizations worldwide. It has attracted not only the interest of IT organizations but has also grown considerably in the minds of sales and marketing professionals. Applying analytics to customer experience provides the highest business value and is often the most sought-after application area for analytics. IBM has declared analytics to be one of the four most important areas of growth toward our 2015 plan.
The role and business capabilities offered through analytics are rapidly changing. Customers are increasingly connected with their suppliers through a variety of electronic touch points (Web browsers, interactive voice response, wireless devices, kiosks, and so on), and the customer experience can be altered through these touch points in real time. Customers can also use a variety of ways to express their satisfaction, or lack thereof, publicly, in real time, worldwide. The associated data accumulates in supplier organizations in petabytes, and it must be sorted, correlated, and analyzed rapidly and intelligently to make a positive, long-lasting impact on the customer. Last but not least, it is no longer sufficient to report customer problems to an analyst who routes them to business owners on a monthly basis. The actions must be inserted into the appropriate customer-facing service, sales, billing, or operational functions to alter the customer experience, often in near real time. While the Internet has given enormous power to consumers, business markets have also gained a great deal of sophistication in multi-supplier management, electronic gateways, and the use of customer and product data across the supply chain.
The book includes four major segments. The first segment introduces the concept of Customer Experience Analytics using a series of customer experience scenarios. It also establishes the scope and focus of the book in the types of customer experiences covered, linking them to the customer life cycle and a variety of communication touch points. The second segment introduces the business value of Customer Experience Analytics. It frames the measurement dimensions, the sensitivity across geographies, organizational life cycles, and industries, and ways to quantify the business value. Customer Experience Analytics is a journey across maturity levels, and we establish here the levels of maturity and what an organization can expect at each level. The third segment covers the solution architecture for a CEA solution, with components offering real-time processing, intelligence, and autonomics. It includes components for data collection, storage, modeling, reporting, and integration with action, and it discusses the choices that must be made to keep the solution simple and easy to implement. The fourth segment concludes our findings and discusses some of the changes we see in the architecture as new disruptive technologies evolve.
This book is intended for a semi-technical audience. It uses a series of scenarios (real as well as imaginary), case studies, and allegories to illustrate the CEA business opportunity, solution, and program to senior and mid-level business and IT management. The technical terms are defined and indexed for easy reference. Hopefully, sales executives will also use this book to make their audience aware of the opportunities and to spark their interest in exploring a CEA solution.
About the Author:
Dr. Arvind Sathi is the Global Communication Sector Lead Architect for the Information Agenda team. He is a seasoned professional with more than twenty years of leadership in Information Management architecture and delivery. His primary focus has been the delivery and architecture oversight of IT projects for Communications organizations. He has extensive experience with many domestic and international Communications service providers, as well as with other services industries. At Carnegie Group, he was a pioneer in developing knowledge-based solutions for CRM. At BearingPoint, he led the development of Enterprise Integration, Master Data Management, and OSS/BSS (Operations Support Systems / Business Support Systems) solutions for the Communications market and also developed horizontal solutions for Communications, Financial Services, and Public Services. At IBM, Dr. Sathi has led several Information Management programs in MDM, Data Security, Business Intelligence, and related areas, and has provided strategic architecture oversight for IBM’s strategic accounts. He has also delivered a number of workshops and presentations at industry conferences on technical subjects including Master Data Management and Data Architecture, and he holds patents in data masking. Dr. Sathi received his Ph.D. in Business Administration from Carnegie Mellon University, where he worked under Nobel Prize winner Herbert Simon.
Roger is taking part in three highly accessible events at IOD this year:
What’s in the book?
- Read this book to prepare to pass Exam 541 – DB2 9.7 for Linux, UNIX, and Windows Database Administration.
- All test objectives for this exam are covered to the same degree that they are covered on the exam.
- Roger was a member of the exam development team, so he is fully aware of what questions are on the exam!
- This book isn't like Roger’s previous books; rather, it is a copy of the slides he uses for his Crammer Courses to teach people at conferences what they need to know to pass the exam. The material has been tested multiple times and updated based on feedback from people after they’ve taken the exam.
What do you need to know to pass the exam?
This is an outline of the Exam Objectives. You can see the details on the website: Exam 541 Objectives
Section 1 - DB2 Server Management (10%)
- Demonstrate the ability to configure/manage DB2 servers, instances, and databases (e.g. scope)
- Demonstrate the ability to schedule jobs
- Demonstrate the ability to use autonomic features
Section 2 - Physical Design (20%)
- Demonstrate the ability to create a database and manipulate various DB2 objects
- Demonstrate knowledge of partitioning capabilities
- Demonstrate knowledge of XML data objects
- Demonstrate knowledge of compression
Section 3 - Business Rules Implementation (5%)
- Demonstrate the ability to create constraints on tables
- Demonstrate the ability to create views WITH CHECK OPTION
Section 4 - Monitoring DB2 Activity (15%)
- Demonstrate the ability to use Admin views and SQL functions for monitoring
- Demonstrate the ability to use Workload Manager
- Demonstrate the ability to use additional auto-monitoring tools
- Demonstrate the ability to identify the functions of Problem Determination Tools
- Demonstrate the ability to capture and analyze EXPLAIN/VISUAL EXPLAIN information
Section 5 - Utilities (15%)
- Demonstrate the ability to use the data movement utilities
- Demonstrate the ability to use the REORG, REORGCHK, REBIND and RUNSTATS utilities
- Demonstrate the ability to use the DB2 Advisor utility
- Describe the purpose of Data Studio Administrator (basic concepts - replacing existing DB2 tools)
Section 6 - High Availability (20%)
- Describe DB2 data integrity actions
- Demonstrate the ability to perform database-level and table space level backup and recovery operations
- Demonstrate the ability to configure and manage HADR
Section 7 - Security (10%)
- Describe DB2 authentication
- Describe DB2 authorizations and privileges
- Describe the ability to set and remove user, group, and role authorities and privileges
- Demonstrate general knowledge of the Audit facility
- Demonstrate knowledge of the DB2 Security Infrastructure
Section 8 - Connectivity and Networking (5%)
- Demonstrate the ability to configure database connectivity
- Demonstrate knowledge of using JDBC/SQLJ connections with DB2
Other blog entries about IOD Events:
More information about Roger & Certification Exams:
Roger Sanders was featured in the latest edition of the DM Magazine - The Man to See About Certification - “The guru of DB2 certification tests talks about how he puts them together—and how they can help your career” by Howard Baldwin
I wrote an entry on my blog some time ago that I still recommend: Certification 101. I do need to update the entry to include the DB2 9.7 exams such as Exam 541, but the content is mostly still accurate:
Benefits of Earning a Certification
New in Certifications
How to Prepare to Pass a DB2 Certification
Details about DB2 Certification Program
Once you are Certified... What Next?
Study Materials Electronically
Roger Sanders Books, Videos, and Crammer Course
Another book that will be published just in time to be launched at the IBM Information on Demand 2011 Conference is
By Roger Sanders
Roger will be at IBM’s Information on Demand Conference this year and will be available for two book signings at the Conference bookstore:
Monday October 24 from 3:00 – 4:00 pm
Tuesday October 25 from 2:00 – 3:00 pm
Soon you’ll be able to add this signing to your Smart Site Schedule. Both this book and another of Roger’s books that will launch at the conference (DB2 9.7 for Linux, UNIX, and Windows Database Administration: Certification Study Notes) will be sold at the Conference Bookstore at a 20% discount.
About the book:
I’ve already read the book and highly recommend it. It covers everything that you need to know and do in order to get an article or book published. What makes this book so different from others of the same sort is, first, Roger’s extensive experience writing technical books and articles, and second, his explanation of an author contract. I personally find legal contracts very difficult to understand, and I have little interest in putting in the effort to understand all the terms that appear in an author contract. Roger put the effort in, and it shows. He goes through each clause and term that you’ll encounter and explains it in a very understandable way. In addition, Roger gives examples of what he’s encountered throughout his career and how the various clauses in the contract affected the situation.
Getting the skills to write well isn’t nearly as hard as you think. Here are the basics that Roger covers in his book:
1. Schedule time to write. If you wait until you’re “in the mood to write”, you’ll never get anything done! Set goals for how much you want to accomplish and move to another section if one is causing you grief. Reward yourself as targets are reached.
2. Have a strong outline before you start to write. I know it sounds cliché, but the more up front planning you do, the easier the writing will be. Even for technical documents, you should “tell a story”. Have a beginning, say a problem that needs to be solved; a middle, the search for a solution; and an end, a strong conclusion.
3. Let some personality show through in the writing. There are some cases where dry, factual writing is required, but where it’s not, let the writing be conversational or slightly casual to be of interest to the reader. Always think of your reader. Even if the writing is just for a school paper, the last thing you want to do is to bore the reader so that the ending is never reached.
4. Diagrams and tables are useful, but ONLY if they are tied tightly with the text. Don’t put them there just for filler because they’ll never be looked at. The best idea is to add reference numbers to the diagrams and have text to lead the reader from one point to the next. If that sounds like too much work, maybe the diagram isn’t really needed.
5. No one’s writing is perfect… every author needs to review and revise their work many times. Most authors get quite tired of reading what they’ve written by the time it is “finished”.
To make revision as easy as possible, Roger suggests that each time you go through your draft, look for one specific thing at a time. For instance, the first time through, check that you are using the active voice instead of passive. Next, go through and look to make sure headings and lists use parallel wording. Next, look for words that are commonly spelled incorrectly that will not be caught by a spell checker. And so on.
6. For everyone, but especially if English is your second language, consider reading the text out loud or having the computer read it to you. You may be able to hear problems in the wording more easily than you can see them when reading. Also, look at past comments you’ve received on writing assignments. You likely make the same errors every time you write, so pay close attention to how your previous errors were corrected, and go through your document specifically focusing on those problem areas.
7. One last piece of advice: if you’re writing a technical document, your goal is not to make it “beautiful”… your goal is clarity. You want to ensure that anyone who reads what you’ve written understands your technical messages.
There are a lot more details that will help you, so I encourage you to get a copy of Roger’s book.
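The pass-by-pass revision idea above can even be partly automated. Here is a minimal sketch in Python of two focused "passes" over a draft – one flagging likely passive voice, one flagging commonly confused words a spell checker won't catch. The regex and the word lists are illustrative examples, not from Roger's book:

```python
import re

# One revision "pass" = one focused check, following the advice above.
# Both lists below are illustrative, not exhaustive.
PASSIVE_MARKERS = re.compile(
    r"\b(?:is|are|was|were|been|being|be)\s+\w+ed\b", re.IGNORECASE
)
CONFUSABLES = {"it's", "its", "their", "there", "affect", "effect"}

def passive_voice_pass(text):
    """Flag likely passive-voice constructions (e.g. 'was tested')."""
    return PASSIVE_MARKERS.findall(text)

def confusables_pass(text):
    """Flag commonly confused words that spell checkers miss."""
    words = re.findall(r"[A-Za-z']+", text.lower())
    return sorted(set(words) & CONFUSABLES)

draft = "The feature was tested by the team. Its unclear if their happy."
print(passive_voice_pass(draft))   # ['was tested']
print(confusables_pass(draft))     # ['its', 'their']
```

A simple checker like this only points at candidates; each flagged phrase still needs a human decision, which is exactly why each pass focuses on one thing.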
Roger is also attending the IOD conference to share his expertise on DB2 Certification. He helped develop many DB2 exams and is the instructor of a highly rated Pre-Conference Certification Crammer Course: (9Q020) DB2 9.7 for Linux, UNIX and Windows DBA Certification Crammer for Exam 541. His other book, DB2 9.7 for Linux, UNIX, and Windows Database Administration: Certification Study Notes, will help you pass one of the free exams being offered to attendees.
Other blog entries about IOD Events:
IBM Cognos 10 Report Studio: Practical Examples
by Roger Johnson and Filip Draskovic
About the book:
IBM Cognos 10 is the next generation of the leading performance management, analysis, and reporting standard for mid- to large-sized companies. One of the most exciting and useful aspects of IBM Cognos software is its powerful custom report creation capabilities. After learning the basics, report authors in the enterprise need to apply the technology to reports in their actual, complex work environment.
This book provides that advanced know-how. Using practical examples based on years of teaching experience as IBM Cognos instructors, the authors provide you with examples of typical advanced reporting designs and complex queries in reports. The reporting solutions in this book can be used directly in a variety of real-world scenarios to provide answers to your business problems today. The complexity of the queries and the application of design principles go well beyond basic course content or introductory books. IBM Cognos 10 Report Studio: Practical Examples will help you find the answers to specific questions based on your data and your business model. It uses a combined tutorial and cookbook approach to show real-world IBM Cognos 10 Report Studio solutions. If you are still using IBM Cognos 8 BI Report Studio, many of the examples have been tested against this platform as well. The final chapter is dedicated to the features that are unique to the latest version of this powerful reporting solution.
About the authors:
Roger Johnson is a learning consultant on IBM Cognos technologies delivering a wide variety of courses focusing on the needs of his learners. His education experience has been honed over years of work in training software professionals, college students, and many types of technology users. After he started his career as a computer programmer, a co-worker said, "Hey, you do community theater productions. You would make a good trainer." With those words, his career took a different direction. Over the next 20 years, he never moved too far from either technology or education.
As a learner, he has master's degrees in Systems Management and Education. Currently, he is researching the end-user adoption of technology as his doctoral dissertation at Capella University.
He calls Orlando home, but is regularly seen around North America delivering any number of IBM Cognos courses.
Filip Draskovic has spent his entire 11-year professional career living and breathing IBM Cognos. For the first 8 years, he was a Cognos consultant and developed his skills applying Cognos Business Intelligence and Planning solutions in multiple industries. Wanting to do something different, he spent the next 3 years as a Cognos trainer, teaching public and private IBM Cognos courses in IBM's offices around North America. Following his desire to constantly gain new experiences and knowledge, he currently fills the role of a Cognos client technical professional. You can find him today in Toronto's financial district. At home, with his wife, he is enjoying raising their son and daughter.
Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data
by Dirk deRoos, Chris Eaton, George Lapis, Paul Zikopoulos, and Tom Deutsch
Big Data represents a new era in data exploration and utilization, and IBM is uniquely positioned to help clients navigate this transformation. This Flashbook reveals how IBM is leveraging open source Big Data technology to deliver a robust, secure, highly available, enterprise-class Big Data platform.
The three defining characteristics of Big Data—volume, variety, and velocity—are discussed. You’ll get a primer on Hadoop and how IBM is 'hardening' it for the enterprise, and learn when to leverage IBM InfoSphere BigInsights (Big Data at rest) and IBM InfoSphere Streams (Big Data in motion) technologies. Deployment and scaling strategies plus industry use cases are also included in this practical guide.
- Learn how IBM hardens Hadoop for enterprise-class scalability and
- Gain insight into IBM's unique in-motion and at-rest Big Data analytics
- Learn tips and tricks for Big Data use cases and solutions
- Get a quick Hadoop primer
This book is about Big Data: but you already knew that. Big Data is a Big Deal! This book’s authoring team is well seasoned in traditional database technologies, and all recognized one thing: Big Data is an inflection point when it comes to information management technologies. In fact, Big Data is going to change the way you do things in the future: how you gain insight and how you make decisions (the change isn’t going to be a replacement, but rather a synergy and extension). Recognizing this inflection point, the author team decided to write this book to help you get up to speed quickly on this technology and to show you the unique things IBM is doing to turn the freely available open source Big Data technology into a Big Data Platform. There’s a major difference: the platform combines the open source technologies (never forking them) with enterprise capabilities provided by a technology leader that understands the benefits a platform can provide.
By the time you are done reading this book, you’ll have a good handle on the Big Data opportunity that lies ahead, a better understanding of the requirements that ensure you have the right Big Data platform (as opposed to just technology), and a strong foundational knowledge of the business opportunities that lie ahead with Big Data and some of the technologies.
PART I: The Big Deal about Big Data
Chapter 1 – What is Big Data? Hint: You’re a Part of it Every Day
Chapter 2 – Why Big Data is Important
Chapter 3 – Why IBM for Big Data
PART II: Big Data: From the Technology Perspective
Chapter 4 – All About Hadoop: The Big Data Lingo
Chapter 5 – IBM InfoSphere BigInsights – Analytics for “At Rest” Big Data
Chapter 6 – IBM InfoSphere Streams – Analytics for “In Motion” Big Data
Chris Eaton, B.Sc., is a worldwide technical specialist for IBM’s Information Management products focused on Database Technology, Big Data, and Workload Optimization. Chris is also an international award-winning speaker, having presented at data management conferences across the globe, and has one of the most popular DB2 blogs, located on IT Toolbox at http://it.toolbox.com/blogs/db2luw.
Dirk deRoos, B.Sc., B.A., is a member of the IBM World-Wide Technical Sales Team, specializing in the IBM Big Data Platform. Dirk joined IBM eleven years ago, and has a Bachelor of Computer Science and a Bachelor of Arts (Honors English) from the University of New Brunswick.
Thomas Deutsch, B.A., M.B.A., serves as a Program Director in IBM’s Big Data business. Tom has spent the past couple of years helping customers with Apache Hadoop, identifying architecture fit, and managing early stage projects in 200+
George Lapis, MS CS, is a Big Data Solutions Architect at IBM's Silicon Valley Lab. He has worked in the database software area for more than 30 years. He was a founding member of the R* and Starburst research projects at IBM's Almaden Research Center in the valley, as well as a member of the compiler development team for several releases of DB2.
Paul C. Zikopoulos, B.A., M.B.A., is the Director of Technical Professionals for IBM Software Group’s Information Management division and additionally leads the World Wide Database Competitive and Big Data SWAT teams. Paul has written more than 300 magazine articles and 14 books on DB2 and can be reached at: email@example.com.
Two of the publishers I work with have sent me information on sales taking place this weekend, through Sept 6. I hope you’re able to take advantage of these sales to stock up on books that will help you keep your skills as sharp as they can be:
Save 50% on all ebooks through Sept 6, 2011:
Enter coupon code LABORIBM at step 3 of checkout to save 50% off IBM Press eBooks in your shopping cart.
Books you may be interested in:
Making the World Work Better: The Ideas That Shaped a Century and a Company
by Kevin Maney, Steve Hamm, Jeffrey O'Brien
IBM Cognos 10 Report Studio: Practical Examples, Rough Cuts
By Filip Draskovic, Roger Johnson
The IBM Style Guide: Conventions for Writers and Editors
By Francis DeRespinis, Peter Hayward, Jana Jenkins, Amy Laird, Leslie McDonald, Eric Radzinski
Data Integration Blueprint and Modeling: Techniques for a Scalable and Sustainable Architecture
By Anthony David Giordano
DITA Best Practices: A Roadmap for Writing, Editing, and Architecting in DITA, Rough Cuts
By Laura Bellamy, Michelle Carey, Jenifer Schlotfeldt
Save an additional 10% off the price of all their books. Books you’ll be interested in:
DB2 9.7 for Linux, UNIX, and Windows Database Administration (Exam 541) by Roger Sanders
Here is everything you need to know to pass the DB2 9.7 for Linux, UNIX, and Windows DBA Certification exam (Exam 541)!
List Price $21.95
Our Price $19.76
(You Save 10%)
Viral Data in SOA
An Enterprise Pandemic
Author: Neal A. Fishman
Happy Holiday & Happy Reading!