At this time there are 82 Web-Based Training (WBT) courses available to help you sharpen your skills on IBM Information Management (IM) products and technologies. Eighty-two is a LOT of courses! More are being created and updated, so if this is the way you like to learn, make sure you stay in touch with all the latest as they are released.
IBM Education is known for its high-quality content and excellent instructors. Although with WBT courses you won’t be in a classroom environment, you’ll get the same quality of content you expect. WBT courses allow you to learn at your own pace and on your own schedule, with training that is completely self-contained on your desktop. Most courses provide introductory and overview training that serves as the foundation for more technical, follow-on training. Courses include expert instruction but do not include hands-on labs.
How do you get there?
Access this site to see all course offering types: http://www.ibm.com/software/data/education/elearning.html. Under the Web Based section, click on "Browse the complete Information Management WBT portfolio". Below, I’ve included the 82 courses that I know of right now, with the newest ones at the top of the list. If you’re an IBM employee, contact me offline, as there is a different procedure for taking these classes.
Using IBM Optim Designer and Optim Manager - Web Based – New
Connect Information Server to SAP R/3 ECC (DataStage Pack for SAP) – New
DataStage Essentials - Web Based – Updated
IBM InfoSphere Information Server Metadata in Practice – New
InfoSphere BigInsights Essentials (Web) – New
InfoSphere MDM Server Workbench for MDM Server 9 - Web Based – Updated
Programming for InfoSphere Streams V2 - Web Based – New
5 Ways to Apply Data Quality with Information Analyzer WBT
A Blueprint for Implementing MDM Server Using V8.5 - Web Based Training
Advanced DataStage - Web-Based
Advanced DB2 SAP for Administrators - Web-Based
Advanced Programming InfoSphere Streams - Web Based
Basic Programming for InfoSphere Streams - Web Based
Business Glossary Essentials - Web Based
Changing Business with Data Insight (Web Based)
Configure ODBC for IBM DataStage on UNIX - Web Based Training
DB2 9 for z/OS Database Administration Workshop Part 1 (Web Based)
DB2 9.5 for Linux, UNIX, and Windows Transition from DB2 9.1 (Web-Based)
DB2 9.5 for Linux, UNIX, and Windows Transition from DB2 V8.2 (Web-Based)
DB2 9.7 for Linux, UNIX, and Windows New Features (Web-Based)
DB2 Everyplace V9 Tutorial – Online
DB2 Express V9 - Developer in a Day Workshop – Online
DB2 Family Fundamentals - Web-Based
DB2 for LUW 9.5 Fundamentals – Online
DB2 for LUW pureXML
DB2 LUW Advanced Topics – Online
Exploring IBM solidDB Universal Cache - Web-Based
Exploring Initiate Workbench V8.x - Web Based
Exploring Initiate Workbench V9.x - Web Based
FastTrack Essentials - Web Based
Federation Server Essentials - Web Based
How to Analyze Data (Using InfoSphere Information Analyzer) - Web Based Training
How to Install IBM DataStage on UNIX - Web Based Training
How to Install IBM QualityStage on UNIX - Web Based Training
How to Install Information Server 8.5 - Basic and Advanced Option
IBM Data Studio Basics - Web Based
IBM InfoSphere Change Data Capture Essentials V6.3 - Web Based
Implementing IBM InfoSphere MDM Server for PIM V9 (Web Based)
Information Analysis - Web Based
Informix 11 New Features (Web Based Training)
Informix Version 10 – Online
InfoSphere Information Server Administration - Web Based
InfoSphere Master Data Management Server Overview (MDM Server) - Web Based Training
InfoSphere MDM Server Architecture – WBT
InfoSphere MDM Server Domains for MDM Server 9 - Web Based
InfoSphere MDM Server User Interface Generator - Web Based
InfoSphere Warehouse 9 - Cubing Services - Web Based Training
InfoSphere Warehouse 9 - Data Mining and Unstructured Text Analysis – WBT
InfoSphere Warehouse 9 - SQL Warehousing Tool and Admin Console – WBT
InfoSphere Warehouse 9 Components - Web Based
Initiate Self-Paced Bundle V8.x - Web Based
Initiate Self-Paced Bundle V9.x - Web Based
Install, Configure, and Use DataStage SOA Edition (RTI) - Web Based Training
Installing and Configuring IBM Optim Data Privacy Solution - Web-Based
Introduction to IBM InfoSphere Information Analyzer - Web Based Training
Introduction to IBM Optim Data Privacy Solution - Web-Based
Introduction to IBM QualityStage - Web Based Training
Introduction to InfoSphere Change Data Capture How to Use CDC WBT
Introduction to InfoSphere DataStage Using Version 8 - Web Based Training
Introduction to InfoSphere QualityStage Using Version 8 - Web Based Training
Introduction to Initiate Master Data Service - Web Based
Introduction to New Features in Informix Version 11 – Online
Introduction to TSA in IBM Smart Analytics Systems - Web Based
Managing Workloads for DB2 LUW and InfoSphere Warehouse - Web Based
Metadata Workbench Essentials - Web Based
Overview of (IISD) InfoSphere Information Services Director - Web Based Training
Overview of IBM DataStage Enterprise Edition for ETL Developers - Web Based Training
Overview of IBM Information Server and DataStage QualityStage Version 8.0 - Web Based Training
Production Automation of IBM DataStage - Web Based Training
Programming for InfoSphere Streams - Web Based
Programming with IBM InfoSphere MDM Server for PIM V9 (Web Based)
QualityStage Advanced Concepts - Web Based
QualityStage V8 Essentials - Web Based
SQL Replication: Advanced Topics (Web Based)
Troubleshooting IBM InfoSphere Information Analyzer - Web Based Training
Using InfoSphere MDM Server for PIM V9 (Web Based)
Using Initiate Inspector V8.x - Web Based
Using Initiate Inspector V9.x - Web Based
Using Queue Replication - Web Based
Using SQL Replication (Web Based)
Validating Data with Rules in Information Analyzer 8.1.1 - Web Based Training
XML and DataStage: Convert Relational and XML Data - Web Based Training
I hope you’ve found one or two in this massive list that interest you! Take the course and let me know what you thought of it.
And it looks like I missed another DB2Night Show episode! This one was for the z/OS side and featured Julian Stuhler. If you missed it as well, get the replay.
Details about the episode:
In episode #Z08, special guest Julian Stuhler (IBM Gold Consultant, IBM Champion, and former IDUG president) gave an excellent overview of the many ways you can exploit DB2 from the Java programming language, ranging from simple JDBC to pureQuery. He also discussed the impedance mismatch between Java and relational databases, as well as the new features in DB2 9 and DB2 10 that Java can benefit from. If you are into Java, then this DB2Night Show episode is a must!
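To give a flavor of the "impedance mismatch" Julian talks about: the database hands back flat rows, while the application wants objects, so somebody (you, or a tool like pureQuery or an ORM) has to write the mapping. Here's a tiny sketch, using Python and SQLite purely as stand-ins for Java and DB2 — the table and column names are invented for illustration:

```python
import sqlite3
from dataclasses import dataclass

@dataclass
class Employee:
    empno: str
    lastname: str
    salary: float

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE emp (empno TEXT, lastname TEXT, salary REAL)")
conn.executemany("INSERT INTO emp VALUES (?, ?, ?)",
                 [("000010", "HAAS", 52750.0), ("000020", "THOMPSON", 41250.0)])

# The "impedance mismatch": the database returns flat tuples, and the
# application must map every row into an object by hand (or generate
# this mapping with a tool).
def fetch_employees(conn):
    cur = conn.execute("SELECT empno, lastname, salary FROM emp ORDER BY empno")
    return [Employee(*row) for row in cur.fetchall()]

emps = fetch_employees(conn)
print(emps[0].lastname)  # HAAS
```

Every query in the application needs a mapping like this, which is exactly the boilerplate that the tools Julian covers are designed to eliminate.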
Don’t miss another episode! Here are a few of the upcoming shows:
Date & Time: 20-SEP-2011 at 11am EDT
Topic: Distributed access to DB2 for z/OS - Successful administration
Speaker: Cristian Molaro
Date & Time: 23-SEP-2011 at 11am EDT
Topic: Using and Tuning IBM OPM - What you NEED to KNOW
Speaker: Scott Hayes
Date & Time: 27-SEP-2011 at 11am EDT
Topic: Smart Analytics Optimizer - Technology Update
Speaker: Namik Hrle
Date & Time: 30-SEP-2011 at 11am EDT
Topic: DB2 LUW Index Design and Lessons Learned - "I put my kids through college by dropping indexes!"
Speaker: Martin Hubel
Date & Time: 6-OCT-2011 at 11am EDT
Topic: First Look at Changes to DSNZPARMs by DB2 10
Speaker: Willie Favero
The schedule, registration, and replays can be found at http://www.dbisoftware.com/db2nightshow/
I got this information from a newsletter that I subscribe to and thought you might be interested. IBM continues to invest in creating education that is timely, well designed, and highly rated, so you can continue to expand your skills and perform at your optimal level.
Here are the IBM Information Management courses that have recently been updated:
You can view all IM training classes from this link: View all Information Management training
The newsletter I subscribe to is: IBM Training News.
To subscribe to this newsletter and many others, click here: Subscribe. Once on this site, you can choose the product and technology areas that you are interested in seeing in your newsletter and the newsletter is custom-built to your requirements.
Keep your skills as sharp as they can be!
Last year at the IOD Conference, one of the more popular events was handing out printed copies of the IBM Redbooks for Cognos 10. We had only a limited number of copies to hand out, and we went through them very quickly; many people, unfortunately, came by and left empty-handed. I pointed out that the book was available for free online at the ibm.com/redbooks site. Clearly there are many people who still prefer to have printed copies!
This year at the IBM Information on Demand 2011 Conference, we’re announcing up front which IBM Redbooks we’ll be handing out, and when. If you want one of these books, make sure you come to the Conference Bookstore at the planned times to get your copy. If you miss it, you miss it… but you can still get the content online!
Get one of the limited copies… and meet the author team behind the book. Maybe you can even find out how to become an IBM Redbooks author yourself.
Monday October 24 from 10:45 a.m. – 12:30 p.m.
IBM Production Imaging Edition and Datacap Taskmaster
Metadata Management with IBM InfoSphere Information Server
Tuesday October 25 from 3:00 – 4:00 p.m.
IBM Redbooks DB2 10 for z/OS Performance Topics
Wednesday October 26 from 11:30 a.m. – 1:30 p.m.
IBM InfoSphere Streams and the Streams Processing Language
IBM solidDB: Delivering Data with Extreme Speed
The Conference bookstore will be a busy place this year. Here are the other signings and giveaways that are taking place at the event:
Meet Author Sandy Carter at IOD2011 - Get Bold: Using Social Media
Meet Netezza Expert & Author David Birmingham at IOD11
Meet Data Warehouse expert & author Bob Laberge
Meet author Tony Giordano, “Data Integration Blueprint and Modeling”
Meet FileNet expert and author Bill Carpenter at IOD11
Meet author & decision management expert James Taylor at IBM’s Information on Demand Conference
From Idea to Print by Roger Sanders
Flashbook: Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data
Meet author and DB2 expert Roger Sanders at IOD11
Meet Flashbook author Arvind Sathi - Customer Experience Analytics
Meet DB2 authors and experts Lawson & Luksetich at IOD: DB2 10 for z/OS DBA Cert Guide
Keep this in mind as you build your schedule for the conference. Note: soon I’ll have session numbers for all of these events so you can add them to your SmartSite schedule.
Join Burt Vialpando and Rav Ahuja on Thursday, September 29, 2011, when they will compare the autonomic computing capabilities of DB2 and Oracle:
Many database professionals ask how DB2 and Oracle compare regarding their autonomic capabilities; that is, how each makes things easier for the DBA with smart, automatically implemented features. They also wonder whether one has a clear advantage over the other in this respect. This DB2 Chat with the Lab webinar will answer those questions and include a comparison of the following autonomic capabilities in DB2 and Oracle:
- Memory Management
- Storage Management
- Utility throttling
- Automatic Configuration
- Automatic Maintenance
DB2 and Oracle - An Autonomic Computing Comparison
Date: Thursday, September 29, 2011 (29.9.2011)
Time: 12:30 PM - 2:00 PM Eastern Time (ET) ; 11:30 AM Central / 9:30 AM Pacific / 17:30hrs London / 18:30hrs Frankfurt, Paris / India 10 PM
Speaker: Burt Vialpando
Host: Rav Ahuja
The session will be recorded, but attend live to get your questions answered.
Presentation charts will be available just before the webcast starts.
To register and receive instructions for attending this webcast: Register
Thanks to Rav for this information.
Sadly I missed John Hornibrook’s fourth appearance on the DB2Night Show that took place on Friday, Sept 9th. Did you miss it as well? There is a replay available (Thanks Scott!). You can find it here:
To download a recorded replay of Episode #57, right click on the link below and choose "Save As..."
Episode 57, 9 September 2011, DB2 LUW Vital Statistics with John Hornibrook
What was John talking about? DB2 LUW Vital Statistics. 94% of the audience said they learned something new during the talk, so you can expect it to be very informative. John’s previous visits:
This show, #57 in the DB2 for LUW series, marks the kickoff of Season #3. Since the show's inception, there have been over 90,000 downloads of replays to date. The library of FREE DB2 LUW Education is enormous and growing! Check out these valuable educational resources by watching REPLAYS whenever and as often as you can.
Also, never miss another show. Shows take place pretty much every Friday from 11:00 – 12:00 ET, so block your calendar to make time for a weekly learning opportunity. Take a look at the schedule of upcoming shows. As always, a very impressive lineup.
Thanks to host Scott Hayes for Engaging, Entertaining and Educating us!
Note, as of today, October 12, 2011..... I've been informed that this book has been delayed. I am so sorry for the inconvenience and the trouble this delay is causing for the authors, bookstore and especially the test takers! Susan
Susan Lawson and Dan Luksetich have updated their popular certification guide, and it is being published just in time for the IBM Information on Demand 2011 Conference: DB2 10 for z/OS Database Administration: Certification Study Guide. The book will be sold at the Conference Bookstore at a 20% discount. Remember that certification exams are free for attendees at IOD.
You can meet Susan and Dan at the Conference Bookstore and have them sign a copy of your book:
Once again, Susan is teaching the Pre-Conference Certification Crammer Course: (9Q001) DB2 10 for z/OS Database Administration Certification Crammer for Exam 730 & 612
Dan is speaking at the following session:
Explaining the IBM DB2 for z/OS Dynamic Statement Cache: Tuesday 11:15 – 12:15.
IBM DB2 for z/OS does a good job of caching dynamic SQL statements to avoid recompiling (also known as binding) the statements on the fly, which can save significant system resources. However, users still must determine what dynamic statements are coming into DB2 and how they are performing. DB2 for z/OS has the ability to capture and report on the statements in the dynamic statement cache. The session will focus on the use and management of this facility, and on how to report on, understand, and act on the available performance information.
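One reason this topic matters to application developers: a statement cache can only reuse a prepared statement when the incoming SQL text matches exactly, which is why parameter markers beat concatenated literals. The sketch below illustrates the idea with Python and SQLite standing in for DB2 (sqlite3 keeps a small per-connection statement cache of its own); the table and queries are invented for illustration:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE t (id INTEGER, val TEXT)")
conn.executemany("INSERT INTO t VALUES (?, ?)", [(1, "a"), (2, "b"), (3, "c")])

# Building statements with literals: every execution has different SQL text,
# so each one must be prepared ("bound") from scratch and each would get its
# own entry in a statement cache.
literal_texts = {f"SELECT val FROM t WHERE id = {i}" for i in (1, 2, 3)}

# With a parameter marker the SQL text is identical on every execution, so a
# statement cache can find a match and skip the recompile.
marker_sql = "SELECT val FROM t WHERE id = ?"
results = [conn.execute(marker_sql, (i,)).fetchone()[0] for i in (1, 2, 3)]

print(len(literal_texts), "distinct literal texts vs. 1 reusable text")
print(results)  # ['a', 'b', 'c']
```

Three literal statements produce three distinct cacheable texts; the parameterized form produces one, which is what makes cache hits possible.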
What’s in the book?
- Read this book to prepare to pass Exam 612 – DB2 10 for z/OS Database Administration.
- Every test objective for this exam is covered to the same degree that it is covered on the exam.
- Susan and Dan were members of the exam development team, so they are fully aware of what questions are on the exam!
- Susan and Dan use DB2 every day and have 22 years of experience.
While the book does cover all topics on the test, it also covers much more than that, including the new features of DB2 10 for both database administration and application development.
What do you need to know to pass the exam?
This is an outline of the Exam Objectives. You can see the details on the website: Exam 612 Objectives
730 Exam Objectives – Family Fundamentals
Section 1 - Planning (14%)
• Knowledge of restricting data access
• Knowledge of the features or functions available in DB2 tools (just tools that come with product - distributed space - i.e., configuration advisor, configuration assistant, command line processor)
• Knowledge of database workloads (OLTP vs. warehousing)
• Knowledge of non-relational data concepts (extenders)
• Knowledge of XML data implications (non-shredding)
Section 2 - Security (11%)
• Knowledge of DB2 products (client, server, etc.)
• Knowledge of different privileges and authorities
• Knowledge of encryption options (data and network)
• Given a DDL SQL statement, knowledge to identify results (grant/revoke/connect statements)
Section 3 - Databases and Database Objects (17%)
• Ability to identify and connect to DB2 servers and databases
• Ability to identify DB2 objects
• Knowledge of basic characteristics and properties of DB2 objects
• Given a DDL SQL statement, knowledge to identify results (ability to create objects)
Section 4 – Using SQL (23.5%)
• Given a DML SQL statement, knowledge to identify results
• Ability to use SQL to SELECT data from tables
• Ability to use SQL to SORT or GROUP data
• Ability to use SQL to UPDATE, DELETE, or INSERT data
• Knowledge of transactions (i.e., commit/rollback and transaction boundaries)
• Ability to call a procedure or invoke a user defined function
• Given an XQuery statement, knowledge to identify results
Section 5 - Tables, Views and Indexes (23.5%)
• Ability to demonstrate usage of DB2 data types
• Given a situation, ability to create table
• Knowledge to identify when referential integrity should be used
• Knowledge to identify methods of data constraint
• Knowledge to identify characteristics of a table, view or index
• Knowledge to identify when triggers should be used
• Knowledge of schemas
• Knowledge of data type options for storing XML data
Section 6 - Data Concurrency (11%)
• Knowledge to identify factors that influence locking
• Ability to list objects on which locks can be obtained
• Knowledge to identify characteristics of DB2 locks
• Given a situation, knowledge to identify the isolation levels that should be used
612 Exam Objectives
Section 1 - Database Design and Implementation (30.5%)
• Design tables and views
• Explain the different performance implications of
• Design indexes
• Create and alter objects
• Perform table space and index partitioning
• Normalize data and translate data model into physical model
• Implement user-defined integrity rules
• Use the appropriate method to create and alter DB2 objects
• Encoding schemes
Section 2 - Operation and Recovery (29%)
• Issue database-oriented commands for normal operational conditions
• Issue database-oriented commands and utility control statements for use in abnormal conditions
• Identify and perform actions that are needed to protect databases from planned and unplanned outages and ensure that timely image copies are taken periodically
• Load and unload data into and from the created tables
• Reorganize objects when necessary
• Monitor the object by collecting statistics
• Monitor threads
• Identify and respond to advisory/restrictive statuses on objects
• Establish timely checkpoints
• Identify and perform problem determination
• Perform health checks
• Develop backup scenarios
• Describe the special considerations for availability in a data sharing environment
Section 3 - Security and Auditing (7%)
• Protect DB2 objects
• Protect connection to DB2
Section 4 - Performance (29%)
• Plan for performance monitoring by setting up and running monitoring procedures
• Analyze performance
• Analyze and respond to RUNSTATS statistics analysis
• Determine when and how to run the REORG utility
• Understand and implement Real-Time Statistics & DSNACCOR(X)
• Analyze cache
• Evaluate and set appropriately the performance parameters for different utilities
• Describe the performance concerns for the distributed environment
• Describe DB2 interaction with WLM
• Interpret traces (statistics, accounting, performance) and explain the performance impact of different DB2 traces
• Identify and respond to critical performance thresholds
• Excessive I/O wait times
• Identifying lock-latch waits and CPU waits
• Identifying and resolving deadlocks and timeouts
• Review and tune SQL
• Dynamic SQL Performance
• Performance Features
Section 5 - Installation and Migration/Upgrade (4.5%)
• Run catalog health checks using queries and utilities
• Identify the critical DSNZPARMs
• Identify the migration/upgrade modes
• Identify and explain data sharing components and concepts such as:
For other IOD sessions related to DB2 10 for z/OS, see the Information Management Roadmaps.
Other blog entries about IOD Events:
Customer Experience Analytics: Fast, Intelligent and Action-Packed
by Arvind Sathi
- Book information on blog
- Link to an online
- Electronic Version (6.98 MB)
Today’s customers can use a variety of media (including Facebook and Yelp) to broadcast their good and bad experiences in real time. You must match their velocity! This book will help your organization create the capabilities to sense, isolate, and alter the customer experience to your competitive advantage, creating a real-time, adaptive relationship with your customers.
Analytics is one of the hottest IT topics among organizations worldwide. It has attracted not only the interest of IT organizations but has also grown considerably in the minds of sales and marketing professionals. Applying analytics to customer experience provides the highest business value and is often the most sought-after application area for analytics. IBM has declared Analytics to be one of the four most important areas of growth toward our 2015 plan.
The role and business capabilities offered through analytics are rapidly changing. Customers are increasingly connected with their suppliers through a variety of electronic touch points – Web browsers, Interactive Voice Response, wireless devices, kiosks, etc. Customer experience can be altered through these electronic touch points in real time. Customers can also express their satisfaction, or lack thereof, publicly, in real time, worldwide. The associated data is accumulated by supplier organizations in petabytes. This data must be sorted, correlated, and analyzed rapidly and intelligently to make a positive, long-lasting impact on the customer. Last but not least, it is no longer sufficient to report customer problems to an analyst who routes them to business owners on a monthly basis. The actions must be inserted into the appropriate customer-facing service, sales, billing, or operational functions to alter the customer experience – often in near real time. While the Internet has provided enormous power to consumers, business markets have also gained an enormous amount of sophistication in multi-supplier management, electronic gateways, and the use of customer and product data across the supply chain.
The book includes four major segments. The first segment introduces the concept of Customer Experience Analytics using a series of customer experience scenarios. It also establishes the scope and focus of the book – the types of customer experiences covered – linking them to the customer life cycle and a variety of communication touch points. The second segment introduces the business value of Customer Experience Analytics. It frames the measurement dimensions; sensitivity across geographies, organizational life cycles, and industries; and ways to quantify the business value. Customer Experience Analytics is a journey across maturity levels, and the book establishes those levels and what an organization can expect at each one. The third segment covers the solution architecture for a CEA solution, with components offering real-time processing, intelligence, and autonomics. It includes components for data collection, storage, modeling, reporting, and integration with action, and discusses the choices that must be made to keep the solution simple and easy to implement. The fourth segment concludes the findings and discusses some of the changes expected in the architecture as new disruptive technologies evolve.
This book is intended for a semi-technical audience. It uses a series of scenarios (real as well as imaginary), case studies, and allegories to illustrate the CEA business opportunity, solution, and program to senior and mid-level business and IT management. The technical terms are defined and indexed for easy reference. Hopefully, sales executives will also use this book to make their audiences aware of the opportunities and get them interested in exploring a solution for CEA.
About the Author:
Dr. Arvind Sathi is the Global Communication Sector Lead Architect for the Information Agenda team. He is a seasoned professional with twenty-plus years of leadership in Information Management architecture and delivery. His primary focus has been the delivery and architecture oversight of IT projects for Communications organizations. He has extensive experience with many domestic and international Communications service providers, as well as with other services industries. At Carnegie Group, he was a pioneer in developing knowledge-based solutions for CRM. At BearingPoint, he led the development of Enterprise Integration, Master Data Management, and OSS/BSS (Operations Support Systems / Business Support Systems) solutions for the Communications market and also developed horizontal solutions for Communications, Financial Services, and Public Services. At IBM, Dr. Sathi has led several Information Management programs in MDM, Data Security, Business Intelligence, and related areas and has provided strategic architecture oversight for IBM’s strategic accounts. Dr. Sathi has also delivered a number of workshops and presentations at industry conferences on technical subjects, including Master Data Management and Data Architecture, and holds patents in data masking. Dr. Sathi received his Ph.D. in Business Administration from Carnegie Mellon University, where he worked under Nobel Prize winner Herbert Simon.
Roger is taking part in three highly accessible events that are taking place at IOD this year:
What’s in the book?
- Read this book to prepare to pass Exam 541 – DB2 9.7 for Linux, UNIX, and Windows Database Administration.
- Every test objective for this exam is covered to the same degree that it is covered on the exam.
- Roger was a member of the exam development team, so he is fully aware of what questions are on the exam!
- This book isn't like Roger’s previous books; rather, it is a copy of the slides that he uses in his Crammer Courses to teach people at conferences what they need to know to pass the exam. The material has been tested multiple times and updated based on feedback from people after they’ve taken the exam.
What do you need to know to pass the exam?
This is an outline of the Exam Objectives. You can see the details on the website: Exam 541 Objectives
Section 1 - DB2 Server Management (10%)
- Demonstrate the ability to configure/manage DB2 servers, instances, and databases (e.g. scope)
- Demonstrate the ability to schedule jobs
- Demonstrate the ability to use autonomic features
Section 2 - Physical Design (20%)
- Demonstrate the ability to create a database and manipulate various DB2 objects
- Demonstrate knowledge of partitioning capabilities
- Demonstrate knowledge of XML data objects
- Demonstrate knowledge of compression
Section 3 - Business Rules Implementation (5%)
- Demonstrate the ability to create constraints on tables
- Demonstrate the ability to create views WITH CHECK OPTION
Section 4 - Monitoring DB2 Activity (15%)
- Demonstrate the ability to use Admin views and SQL functions for monitoring
- Demonstrate the ability to use Workload Manager
- Demonstrate the ability to use additional auto-monitoring tools
- Demonstrate the ability to identify the functions of Problem Determination Tools
- Demonstrate the ability to capture and analyze EXPLAIN/VISUAL EXPLAIN information
Section 5 - Utilities (15%)
- Demonstrate the ability to use the data movement utilities
- Demonstrate the ability to use the REORG, REORGCHK, REBIND and RUNSTATS utilities
- Demonstrate the ability to use the DB2 Advisor utility
- Describe the purpose of Data Studio Administrator (basic concepts - replacing existing DB2 tools)
Section 6 - High Availability (20%)
- Describe DB2 data integrity actions
- Demonstrate the ability to perform database-level and table space level backup and recovery operations
- Demonstrate the ability to configure and manage HADR
Section 7 - Security (10%)
- Describe DB2 authentication
- Describe DB2 authorizations and privileges
- Describe the ability to set and remove user, group, and role authorities and privileges
- Demonstrate general knowledge of the Audit facility
- Demonstrate knowledge of the DB2 Security Infrastructure
Section 8 - Connectivity and Networking (5%)
- Demonstrate the ability to configure database connectivity
- Demonstrate knowledge of using JDBC/SQLJ connections with DB2
Other blog entries about IOD Events:
More information about Roger & Certification Exams:
Roger Sanders was featured in the latest edition of the DM Magazine - The Man to See About Certification - “The guru of DB2 certification tests talks about how he puts them together—and how they can help your career” by Howard Baldwin
I wrote an entry on my blog some time ago that I still recommend: Certification 101. I do need to update the entry to include the DB2 9.7 exams such as Exam 541, but the content is still accurate in most ways:
Benefits of Earning a Certification
New in Certifications
How to Prepare to Pass a DB2 Certification
Details about DB2 Certification Program
Once you are Certified... What Next?
Study Materials Electronically
Roger Sanders Books, Videos, and Crammer Course
Another book that will be published just in time to be launched at the IBM Information on Demand 2011 Conference is
By Roger Sanders
Roger will be at IBM’s Information on Demand Conference this year and will be available for two book signings at the Conference bookstore:
Monday October 24 from 3:00 – 4:00 pm
Tuesday October 25 from 2:00 – 3:00 pm
Soon you’ll be able to add this signing to your Smart Site Schedule. This book and another of Roger’s books (DB2 9.7 for Linux, UNIX, and Windows Database Administration: Certification Study Notes) that will also launch at the conference will be sold at the Conference Bookstore for a 20% discount.
About the book:
I’ve already read the book and highly recommend it. It covers everything you need to know and do in order to get an article or book published. What makes this book different from others of its kind is, first, Roger’s extensive experience in writing technical books and articles and, second, his explanation of the author contract. I personally find legal contracts very difficult to understand, and I have little interest in putting in the effort to understand all the terms that appear in an author contract. Roger put the effort in, and it shows. He goes through each clause and term you’ll encounter and explains it in a very understandable way. In addition, Roger gives examples of what he’s encountered throughout his career and how the various clauses in the contract affected the situation.
Getting the skills to write well isn’t nearly as hard as you think. Here are the basics that Roger covers in his book:
1. Schedule time to write. If you wait until you’re “in the mood to write”, you’ll never get anything done! Set goals for how much you want to accomplish and move to another section if one is causing you grief. Reward yourself as targets are reached.
2. Have a strong outline before you start to write. I know it sounds cliché, but the more up front planning you do, the easier the writing will be. Even for technical documents, you should “tell a story”. Have a beginning, say a problem that needs to be solved; a middle, the search for a solution; and an end, a strong conclusion.
3. Let some personality show through in the writing. There are some cases where dry, factual writing is required, but where it’s not, let the writing be conversational or slightly casual to be of interest to the reader. Always think of your reader. Even if the writing is just for a school paper, the last thing you want to do is to bore the reader so that the ending is never reached.
4. Diagrams and tables are useful, but ONLY if they are tied tightly with the text. Don’t put them there just for filler because they’ll never be looked at. The best idea is to add reference numbers to the diagrams and have text to lead the reader from one point to the next. If that sounds like too much work, maybe the diagram isn’t really needed.
5. No one’s writing is perfect… every author needs to review and revise their work many times. Most authors get quite tired of reading what they’ve written by the time it is “finished”.
To make revision as easy as possible, Roger suggests that each time you go through your draft, look for one specific thing at a time. For instance, the first time through, check that you are using the active voice instead of passive. Next, go through and look to make sure headings and lists use parallel wording. Next, look for words that are commonly spelled incorrectly that will not be caught by a spell checker. And so on.
6. For everyone, but especially if English is your second language, consider reading the text out loud or having the computer read it to you. You may be able to hear problems in the wording more easily than you can see them. Also, look at past comments you’ve received on writing assignments. You likely make the same errors every time you write, so pay close attention to how your previous errors were corrected, and go through your document focusing specifically on those problem areas.
7. One last piece of advice: if you’re writing a technical document, your goal is not to make it “beautiful”… your goal is clarity. You want to ensure that anyone who reads what you’ve written understands your technical messages.
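Roger’s idea of checking for one thing per revision pass (tip 5 above) can even be partially automated. The following is a toy Python sketch, not from the book; the regular expression and the watchlist are my own illustrative choices. It runs two separate passes over a draft, flagging likely passive voice in one and personally troublesome words in the other:

```python
import re

# Crude passive-voice detector: an auxiliary verb followed by an "-ed" word.
PASSIVE = re.compile(r"\b(?:is|are|was|were|been|being|be)\s+\w+ed\b", re.IGNORECASE)

# Words a spell checker accepts but that you may habitually misuse.
PERSONAL_WATCHLIST = {"there", "their", "its", "it's", "affect", "effect"}

def pass_passive_voice(lines):
    """Pass 1: flag lines that look like passive voice."""
    return [(n, line) for n, line in enumerate(lines, 1) if PASSIVE.search(line)]

def pass_watchlist(lines):
    """Pass 2: flag lines containing personally troublesome words."""
    hits = []
    for n, line in enumerate(lines, 1):
        words = set(re.findall(r"[a-z']+", line.lower()))
        if words & PERSONAL_WATCHLIST:
            hits.append((n, line))
    return hits

draft = [
    "The report was reviewed by the editor.",
    "Their conclusions were clear.",
    "We revised the draft quickly.",
]

# Run each pass separately, one concern at a time.
for check in (pass_passive_voice, pass_watchlist):
    for n, line in check(draft):
        print(f"{check.__name__}: line {n}: {line}")
```

The passive-voice pattern is deliberately crude (it misses irregular participles like “written”), but it captures the spirit of the advice: each pass looks for exactly one kind of problem, so nothing competes for your attention.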
There are a lot more details that will help you, so I encourage you to get a copy of Roger’s book.
Roger is also attending the IOD conference to share his expertise on DB2 certification. He developed many of the DB2 certification exams and is the instructor of a highly rated Pre-Conference Certification Crammer Course: (9Q020) DB2 9.7 for Linux, UNIX and Windows DBA Certification Crammer for Exam 541. His other book, DB2 9.7 for Linux, UNIX, and Windows Database Administration: Certification Study Notes, will help you pass one of the free exams being offered to attendees.
Other blog entries about IOD Events:
Understanding Big Data: Analytics for Enterprise Class Hadoop and Streaming Data
by Dirk deRoos, Chris Eaton, George Lapis, Paul Zikopoulos, Tom Deutsch
Big Data represents a new era in data exploration and utilization, and IBM is uniquely positioned to help clients navigate this transformation. This Flashbook reveals how IBM is leveraging open source Big Data technology to deliver a robust, secure, highly available, enterprise-class Big Data platform.
The three defining characteristics of Big Data—volume, variety, and velocity—are discussed. You’ll get a primer on Hadoop and how IBM is 'hardening' it for the enterprise, and learn when to leverage IBM InfoSphere BigInsights (Big Data at rest) and IBM InfoSphere Streams (Big Data in motion) technologies. Deployment and scaling strategies plus industry use cases are also included in this practical guide.
- Learn how IBM hardens Hadoop for enterprise-class scalability
- Gain insight into IBM's unique in-motion and at-rest Big Data analytics
- Learn tips and tricks for Big Data use cases and solutions
- Get a quick Hadoop primer
This book is about Big Data: but you already knew that. Big Data is a Big Deal! This book’s authoring team is well seasoned in traditional database technologies, and all recognized one thing: Big Data is an inflection point when it comes to information management technologies. In fact, Big Data is going to change the way you do things in the future, how you gain insight, and how you make decisions (the change isn’t going to be a replacement, but rather a synergy and extension). Recognizing this inflection point, the author team decided to write this book to help you get up to speed quickly on this technology and to show you the unique things IBM is doing to turn the freely available open source Big Data technology into a Big Data Platform. There’s a major difference: the platform leverages the open source technologies (never forking them) and marries them to enterprise capabilities provided by a technology leader that understands the benefits a platform can provide.
By the time you are done reading this book, you’ll have a good handle on the Big Data opportunity that lies ahead, a better understanding of the requirements that ensure you have the right Big Data platform (as opposed to just technology), and a strong foundational knowledge of the business opportunities that lie ahead with Big Data and some of the technologies involved.
PART I: The Big Deal about Big Data
Chapter 1 – What is Big Data? Hint: You’re a Part of it Every Day
Chapter 2 – Why Big Data is Important
Chapter 3 – Why IBM for Big Data
PART II: Big Data: From the Technology Perspective
Chapter 4 - All About Hadoop: The Big Data Lingo
Chapter 5 – IBM InfoSphere BigInsights – Analytics for “At Rest” Big Data
Chapter 6 – IBM InfoSphere Streams – Analytics for “In Motion” Big Data
Chris Eaton, B.Sc., is a worldwide technical specialist for IBM’s Information Management products, focused on Database Technology, Big Data, and Workload Optimization. Chris is also an international award-winning speaker, having presented at data management conferences across the globe, and has one of the most popular DB2 blogs, located on IT Toolbox at: http://it.toolbox.com/blogs/db2luw.
Dirk deRoos, B.Sc., B.A., is a member of the IBM World-Wide Technical Sales Team, specializing in the IBM Big Data Platform. Dirk joined IBM eleven years ago, and has a Bachelor of Computer Science and a Bachelor of Arts (Honors English) from the University of New Brunswick.
Thomas Deutsch, B.A., M.B.A., serves as a Program Director in IBM’s Big Data business. Tom has spent the past couple of years helping customers with Apache Hadoop, identifying architecture fit, and managing early-stage projects in 200+ engagements.
George Lapis, MS CS, is a Big Data Solutions Architect at IBM’s Silicon Valley Lab. He has worked in the database software area for more than 30 years. He was a founding member of the R* and Starburst research projects at IBM’s Almaden Research Center in the valley, as well as a member of the compiler development team for several releases of DB2.
Paul C. Zikopoulos, B.A., M.B.A., is the Director of Technical Professionals for IBM Software Group’s Information Management division and additionally leads the World Wide Database Competitive and Big Data SWAT teams. Paul has written more than 300 magazine articles and 14 books on DB2 and can be reached at: email@example.com.
Two of the publishers I work with have sent me information on sales taking place this weekend, through Sept 6. I hope you’re able to take advantage of these sales to stock up on books that will keep your skills as sharp as they can be:
Save 50% on all ebooks through Sept 6, 2011:
Enter coupon code LABORIBM at step 3 of checkout to save 50% off IBM Press eBooks in your shopping cart.
Books you may be interested in:
Making the World Work Better: The Ideas That Shaped a Century and a Company
by Kevin Maney, Steve Hamm, Jeffrey O'Brien
IBM Cognos 10 Report Studio: Practical Examples, Rough Cuts
By Filip Draskovic, Roger Johnson
The IBM Style Guide: Conventions for Writers and Editors
By Francis DeRespinis, Peter Hayward, Jana Jenkins, Amy Laird, Leslie McDonald, Eric Radzinski
Data Integration Blueprint and Modeling: Techniques for a Scalable and Sustainable Architecture
By Anthony David Giordano
DITA Best Practices: A Roadmap for Writing, Editing, and Architecting in DITA, Rough Cuts
By Laura Bellamy, Michelle Carey, Jenifer Schlotfeldt
Save an additional 10% off the price of all their books. Books you’ll be interested in:
DB2 9.7 for Linux, UNIX, and Windows Database Administration (Exam 541) by Roger Sanders
Here is everything you need to know to pass the DB2 9.7 for Linux, UNIX, and Windows DBA Certification exam (Exam 541)!
List Price $21.95
Our Price $19.76
(You Save 10%)
Viral Data in SOA
An Enterprise Pandemic
Author: Neal A. Fishman
Happy Holiday & Happy Reading!
Meet author Bill Carpenter and have him sign a copy of his book at the IBM Information on Demand 2011 Conference. Come to the Conference bookstore Thursday, October 27, from 11:00 am – 12:00 pm to meet Bill. Soon you’ll be able to add this signing and others to your Smart Site schedule.
- Quickly get up to speed on all significant features and the major components of IBM FileNet P8 Content Manager
- Provides technical details that are valuable both for beginners and experienced Content Management professionals alike, without repeating product reference documentation
- Gives a big picture description of Enterprise Content Management and related IT areas to set the context for Content Manager
- Written by an IBM employee, William Carpenter, who has extensive experience in Content Manager product development, this book gives practical tips and notes with a step-by-step approach to designing real Enterprise Content Management solutions to solve your business needs
- IBM FileNet P8 Content Manager, built on top of the mature FileNet platform, is a complete, world class Enterprise Content Management platform. With its rock solid document management features, tight integration with BPM systems and other components, and rich API set, it is a highly scalable and secure solution to common and uncommon Enterprise Content Management requirements.
- Written by a FileNet insider who is an Enterprise Content Management architect and engineer, this book is a straightforward guide to effectively installing, managing, and administering FileNet P8 Content Manager. It emphasizes practical, specific, hands-on information about features for building Enterprise Content Management solutions. At every step, real-world tips and important information are called out to save you time and trouble when building customized solutions.
- Beginning with an overview of Enterprise Content Management, the book moves quickly to the matter of getting a real Content Management system up and running. You learn key Content Management applications, demonstrated to show you the major concepts that matter to you as a developer, an administrator, or an end user. Separate chapters describe major platform features, security-related features, and integrations with other commonly used software components. A realistic sample application, designed right in front of you, shows off what IBM FileNet P8 Content Manager can do. Finally, you take an in-depth look at troubleshooting, support sites, and online resources to help meet your needs.
- Master the ins and outs of the FileNet P8 platform easily
You can also see Bill in the following sessions:
Wednesday 4:30-5:45 pm
IBM FileNet Content Manager
In this session, you will have the opportunity to meet with technical architects for IBM FileNet Content Manager and discuss the future direction of the technology.
For IOD sessions related to this book, see the Enterprise Content Management Roadmaps:
- ECM Advanced Case Management
- ECM Content Analytics
- ECM Document Imaging and Capture
- ECM General
- ECM Information Lifecycle Governance
- ECM Social Content Management
Other blog entries about IOD Events:
Meet author Anthony David Giordano and have him sign a copy of his book at the IBM Information on Demand 2011 Conference. Come to the Conference bookstore Tuesday, October 25, from 11:30 am – 12:30 pm to meet Tony. Soon you’ll be able to add this signing and others to your Smart Site schedule.
Data Integration Blueprint and Modeling: Techniques for a Scalable and Sustainable Architecture
About the book:
Today’s enterprises are investing massive resources in data integration. Many possess thousands of point-to-point data integration applications that are costly, undocumented, and difficult to maintain. Data integration now accounts for a major part of the expense and risk of typical data warehousing and business intelligence projects, and, as businesses increasingly rely on analytics, the need for a blueprint for data integration is greater now than ever.
This book presents the solution: a clear, consistent approach to defining, designing, and building data integration components to reduce cost, simplify management, enhance quality, and improve effectiveness. Leading IBM data management expert Tony Giordano brings together best practices for architecture, design, and methodology, and shows how to do the disciplined work of getting data integration right.
The book begins with an overview of the “patterns” of data integration, showing how to build blueprints that smoothly handle both operational and analytic data integration. Next, Giordano walks through the entire project lifecycle, explaining each phase, activity, task, and deliverable through a complete case study. Finally, he shows how to integrate data integration with other information management disciplines, from data governance to metadata. The book’s appendices bring together key principles, detailed models, and a complete data integration glossary.
Implementing repeatable, efficient, and well-documented processes for integrating data
Lowering costs and improving quality by eliminating unnecessary or duplicative data integrations
Managing the high levels of complexity associated with integrating business and technical data
Using intuitive graphical design techniques for more effective process and data integration modeling
Building end-to-end data integration applications that bring together many complex data sources
About the Author:
Anthony Giordano is a partner in IBM’s Business Analytics and Optimization Consulting Practice and currently leads the Enterprise Information Management Service Line that focuses on data modeling, data integration, master data management, and data governance. He has more than 20 years of experience in the Information Technology field with a focus in the areas of business intelligence, data warehousing, and Information Management. In his spare time, he has taught classes in data warehousing and project management at the undergraduate and graduate levels at several local colleges and universities.
Tony is speaking at the following sessions:
Monday 10:15 am:
Implementing Information Governance Best Practices within an Organization
In this session, information governance experts describe what they have seen and done, and take your questions about how best to implement the various information governance practices within an organization.
Tuesday 10 am:
Selling Information Governance to the Business: Best Practices by Industry and Job Function
This session discusses the best practices used to generate sponsorship for an information governance program. It also describes best practices by job function, such as marketing, supply chain, risk management, finance, and legal, and discusses industry-specific aspects of information governance.
For IOD sessions related to this book, see the IM Information Governance Roadmap.
Other blog entries about IOD Events:
As you’ve noticed, I’m promoting the upcoming IBM Information on Demand Conference via my blog. Many Europeans are unable to afford the time or money to go all the way to Las Vegas for a conference, so the good news is there is an amazing conference planned for Prague in November. I’m hoping that I can attend in order to set up a bookstore for attendees.
IDUG DB2 Tech Conference in Prague, Czech Republic, 13-18 November 2011, at the Clarion Congress Hotel Prague
- Register on or before 17 October 2011 and receive a discount of EUR 275.
- Multiple Delegate Discount: For every three individuals who register from the same organization, a fourth may attend at the discounted rate of EUR 730.
- Mentor Program: If you have attended three previous IDUG DB2 Conferences, you are eligible to bring a first-time colleague to Prague for an 80% discount off the full registration fee.
How you can hone your expert DB2 skills at IDUG EMEA
- Members of IDUG get 45% off IBM Press books. New books include Data Integration Blueprint and Modeling: Techniques for a Scalable and Sustainable Architecture, The New Era of Enterprise Business Intelligence: Using Analytics to Achieve a Global Competitive Advantage, and The Art of Enterprise Information Architecture: A Systems-Based Approach for Unlocking Business Insight.
- On 13 November, receive FREE IBM certification training, DB2 migration workshops and sessions for CA customers.
- IBM Certified Database Administrator - DB2 10 DBA for z/OS Certification Preparation Seminar - Susan Lawson
- DB2 UDB V9.7 for LUW Advanced Database Administration Certification Preparation Seminar - Guy Przytula
- One-Day Educational Seminars - Friday, 18 November 2011
For paid full-conference delegates, a one-day seminar costs EUR 450; for seminar-only attendees, the cost is EUR 495. Select from the following session topics:
- DB2 10 for z/OS - In Depth, Phil Grainger, Cogito
- DB2 Intermediate and Advanced SQL, Daniel Luksetich, Yevich Lawson & Assoc Inc.
- I Didn't Know DB2 did THAT!, Bonnie Baker, Bonnie Baker Corporation
- Optimising DB2 for z/OS System Performance Using DB2 Statistics Trace, John Campbell, IBM Corporation
- Rocket Science: DB2 for LUW Performance Analysis and Tuning Workshop, Scott Hayes, DBI
For the complete agenda and a list of the top-rated speakers, see this site: http://www.idug.org/p/cm/ld/fid=102