Keep on Learning
Buy a book... get 50% off a certification exam! We only have a limited number of vouchers, so we expect them to go quickly. Be one of the first to get a voucher by buying a copy of IBM DB2 9 New Features by Paul Zikopoulos, George Baklarz, Leon Katsnelson, and Chris Eaton. You don't need to buy the book from Amazon, but you do need to send us proof of purchase.
Exams normally cost about $150, so this is a savings of about $75. For full details, read: ibm.com/software/data/education/voucher.html
The DB2Night Show Episode 14: 12 March 2010
Back by popular request, The DB2Night Show is doing another CRUNCHING THE NUMBERS episode, in which a live performance health check will be performed on a real DB2 LUW production database. If you would like to see how this works, watch the replay of Episode #9.
Obviously, to make this show successful and educational for participants, YOUR snapshot data is needed! If you would like a free performance health check analysis of your database, please send the text output (in a zip file) from these commands to db2nightshow at dbisoftware.com:
* db2 get snapshot for database on DBNAME
* db2 get snapshot for bufferpools on DBNAME
* db2 get snapshot for tablespaces on DBNAME
* db2 get snapshot for tables on DBNAME
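If you prefer to script the collection, a minimal sketch along these lines gathers all four outputs and zips them. This is only an illustration, not part of the show's instructions: DBNAME and the file names are placeholders, and it assumes the `db2` CLI and `zip` are on your PATH.

```shell
#!/bin/sh
# Sketch only: assumes a DB2 command line environment with the db2 CLI available.
# DBNAME is a placeholder - substitute your own database name.
DBNAME=DBNAME

if ! command -v db2 >/dev/null 2>&1; then
    echo "db2 CLI not found; run this from a DB2 command line environment" >&2
    exit 1
fi

# Capture each of the four requested snapshots to its own text file
for kind in database bufferpools tablespaces tables; do
    db2 "get snapshot for $kind on $DBNAME" > "snapshot_${kind}.txt"
done

# Bundle the four files into one zip for emailing
zip "snapshots_${DBNAME}.zip" snapshot_*.txt
```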
As you will see in Episode #9, care will be taken not to reveal the organization that provided the data - your contributions will remain anonymous unless you specifically request to be identified.
One studio audience member will be randomly selected to win an Amazon.com gift certificate - use this towards a new USB thumb drive, a new book, or anything you like!
Another guest post from Kate Kurtz who is attending IDUG for the first time. I’m really glad that she’s doing this for me as I can’t be there this year.
Phew! The time is flying by and my only regret is that I do not have enough time to attend all the sessions. I also should have brought more comfortable shoes! Today I was lucky enough to get to see Matt Huras (Distinguished Engineer, DB2 Development) speak. He was discussing "The latest from the Lab and Customers on DB2 pureScale".
Matt took a complicated concept and explained it in a simple, straightforward way. In his session he mapped everything you know about managing a traditional DB2 LUW environment to how it is implemented in DB2 pureScale, and highlighted the differences. The management concepts are straightforward, and wherever possible processes are automated; the complicated part (which does not affect the user) is the technology behind the scenes that makes it all work. With excellent automated build slides, Matt made all of that technology easy to understand.
Recently there have been some best practice papers published for DB2 pureScale, so if you want to read more, take a look:
In the afternoon, Kelly Rodger and I ran the session "Best Practices for Your Database Recovery Strategy". It was a lively session with lots of questions from the audience. As you can imagine, database recovery planning is something that people are very passionate about. All the concepts that we covered in the session are discussed in detail in the best practice paper "Building a Recovery Strategy for an IBM Smart Analytics System Data Warehouse". We also added some additional content about compression, which we plan to incorporate in the next update to the paper.
The presentations are amazing, and IDUG is an excellent networking opportunity. In addition, while you are at IDUG you can take the DB2 Certification exams for free. What a great opportunity to test what you know!
A note about the certifications at IDUG. Roger Sanders has a new book for DB2 9.7 Certification Exams coming soon. We were hoping it would be done for IDUG, but alas. The latest DB Magazine has an article about Roger & certification that you might want to read.
Susan Lawson and Daniel Luksetich have also updated their DBA Certification guide for DB2 10. The book will be ready in June, so check back for ordering information.
PS… speaking of Roger Sanders, I’m reviewing the last chapter for his upcoming book “From Idea to Print”. See a blog entry I wrote about part of the book: Campus Visit: How to improve technical writing skills. Anyway, the chapter I’m reading is about the author contract and how to best negotiate the terms so that they are favourable to the author. I am completely blown away by the content and think that this chapter alone will sell the book!
Not only do we get to hear from the awesome Kevin Spacey at the IBM Insight Conference this year, but we get to hear from Grant Imahara, former co-host of Discovery Channel's MythBusters. In my line of business, I’m surrounded by people who love science, and episodes of MythBusters often get discussed during lunch. “Did you see when they built a guillotine and were throwing dummies?” Etc. If you have not been lucky enough to see many episodes, I suggest you check out the top 20 list of myths busted on the show: The 20 Best Myths Tested On Mythbusters.
My son is a huge fan of the Discovery Channel show as well, so I’ve seen most of the episodes with him. What makes the show so cool? I would say it is how creative the hosts are at building ways to test a theory they are trying to prove or disprove. They focus primarily on well-known myths and urban legends of pop culture, which we all fall for, so having your biases challenged is fun.
At IBM Insight, Grant will be joining Beth Smith, General Manager, Information Management, to make the IM Keynote more informative and engaging than ever. Perhaps they will challenge some of the biases we currently have about data and analytics.
I hope you can attend the conference and if you are attending, be sure to add this keynote to your agenda: IM Keynote: Monday, October 27 from 11:30 am – 12:30 pm.
Also, check out some of the buzz in the Twitterverse this week about Grant’s keynote via Storify.
These IBM Redbooks were on the top of the download list for the year (in the IM area, at least). Do you have your copy yet?
* Redbooks, published 10 Dec 2009
* Redbooks, published 6 Nov 2009, last updated 9 Dec 2009 - rated on 7 reviews
* Redguides, published 24 Apr 2009, last updated 7 Dec 2009 - rated on 1 review
* Redbooks, published 11 Sep 2007, last updated 2 Dec 2009 - rated on 5 reviews
* Redbooks, published 1 Dec 2009 - rated on 10 reviews
I met Sanjeev last year at the IOD Conference. We had tried to set up a booksigning, but the information came in late and there were issues. So, we’re promoting this book this year.
By Sanjeev Datta
At IBM Insight:
Buy your copy of the book at the conference bookstore and have it signed by Sanjeev in his booth. Sanjeev will be available for signing copies on Tuesday October 28 from 12:30pm – 3:30pm in the Solution EXPO - Booth 802.
About the book:
Built on in-memory technology with write-back capabilities for what-if scenario planning, Cognos Insight gives spreadsheet data rich visual appeal for better analysis and planning. Leverage its collaborative features to seamlessly publish personal analysis workspaces to an enterprise-wide Cognos Business Intelligence solution.
"IBM Cognos Insight" is a fast-paced, practical hands-on guide with step-by-step instructions for building Cognos Insight workspaces. Take advantage of in-memory TM1 cubes as the underlying engine to answer business questions by analyzing data in the form of reports and dashboards. Share these insights with other Cognos services or mobile devices to empower users with real-time data.
This book introduces Cognos Insight as a personal yet powerful analytics application. It covers how decision making is applied in all domains of the business world and how data can be analyzed effectively in the form of fast in-memory cubes. Leverage the write-back functionality to build budgets, plans and forecasts.
"IBM Cognos Insight" will empower new and existing users to maximize the features of the application and analyze data by building visually rich reports and dashboards in minutes.
This book takes a practical tutorial approach to teaching users the features of Cognos Insight.
Who this book is for
New and existing users of Cognos Insight who are looking to gain more knowledge about the product and Business Analytics in general.
About the Author
Sanjeev Datta is a seasoned Consultant, passionate text and video blogger, and Business Analytics enthusiast. As Practice Director at PerformanceG2, Inc., he works extensively with executives and decision-makers across finance, manufacturing, retail, and pharmaceuticals as a trusted advisor in corporate performance management, building client relationships and managing Business Analytics implementations.
Sanjeev's work as a strong Project Manager, Pre-sales and Post-sales Consultant, trainer, and mentor has led to many successful implementations. While at Merador, LLC, Sanjeev worked as a Consultant/Architect building solutions for global organizations.
Before that, Sanjeev was a Cognos Developer/Consultant at Softpath Systems, LLC, where he led successful Cognos BI solutions and developed Cognos BI training material for numerous clients.
He is certified in numerous IBM products and is an IBM Technical Specialist and IBM Sales Mastery Professional.
Sanjeev has a degree in Computer Science from Mumbai University and a degree in Interdisciplinary Studies from The University of Texas.
Connect with Sanjeev on LinkedIn or Twitter: @1dsanjeev
One thing I neglected to mention in my blog posting yesterday about the value of Certification is the Take it Again program that is active again this year.
Earning IBM Professional Certification is a smart career move, but many of us are nervous when taking the certification exam.
By the way, I've been invited to be a guest on Episode #12 of The DB2Night Show coming up this 12 February 2010 at 10am CST. Our theme for this show is "The Wild Wonderful World of DB2 Information Resources". Claim your free seat in our virtual studio audience by registering at http://www.DB2NightShow.com. I hope you can join Scott Hayes and me!
Check out the guest blog entries that are coming fast & furious on the Official IOD Blog:
Stay tuned for more guest blogs from IBM’s Best Friends: IBM Champions.
Yesterday I focused on what an outstanding job our IBM Champions do to enhance the acceptance and usage of IBM products. Check out my posts:
These entries took a lot of research, and the chances of my missing someone were high. And of course, I did miss a few people whom I actually know quite well! My apologies to them and anyone else I have missed!
Frank Fillmore will be a panelist with IBM SVP Robert LeBlanc at the 4245A Break Free Forum on Monday, October 22 from 3:20 p.m. to 4:20 p.m. in the Four Seasons Ballroom.
Also check out Frank’s blog.
I had mentioned Cristian Molaro, but neglected to mention that he will be available to sign copies of the IBM Redbook that he was recently involved with:
Tell me if you want me to blog about you or be a guest blogger. I’m very happy to promote the activities of all IBM Champions.
May is nearly over, but before it ends, take part in these two upcoming free webinars to learn more about DB2 10.5 with BLU Acceleration. The special guest speakers for both webinars are among the authors of the chapter excerpt about BLU from an upcoming Flashbook: DB2 10.5 with BLU Acceleration
Breaking News from IBM on DB2 for Linux, UNIX and Windows!
Friday May 24, 11:00 am - 12:00 noon ET
During Episode #111, George Baklarz from IBM will share the latest news for IBM DB2 LUW. Of course this news will relate to the features that were recently announced for DB2 10.5, including BLU, pureScale enhancements, the JSON preview, and SQL enhancements.
Deep Dive on BLU Acceleration in DB2 10.5, Super Analytics Super Easy
Thursday, May 30: 12:30 – 2:00 pm ET
BLU Acceleration in DB2 10.5 for Linux, UNIX and Windows delivers results from data-intensive analytic workloads with speed and precision that is termed "speed of thought" analytics. Join IBM Distinguished Engineer and DB2 expert Sam Lightstone for an in-depth discussion of the all-new BLU Acceleration features in DB2 10.5. In this deep dive, Sam will explain capabilities such as dynamic in-memory analytics, parallel vector processing, enhanced columnar storage techniques, actionable compression, and more.
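Part of what makes BLU "super easy" is how little DDL it takes to adopt. As a rough sketch (the `DB2_WORKLOAD` registry variable and the `ORGANIZE BY COLUMN` clause are DB2 10.5 syntax; the database and table names below are hypothetical), column organization can be enabled like this:

```shell
#!/bin/sh
# Sketch only: assumes a DB2 10.5 instance; MYDB and sales are hypothetical names.

# Configure the instance for analytics workloads; among other defaults,
# new tables will then be created column-organized.
db2set DB2_WORKLOAD=ANALYTICS

db2 "CONNECT TO MYDB"

# Or request columnar storage explicitly, table by table:
db2 "CREATE TABLE sales (sale_date DATE, store_id INTEGER, amount DECIMAL(10,2)) ORGANIZE BY COLUMN"
```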
More about BLU
Read the chapter excerpt specifically about BLU from an upcoming Flashbook. The authors include Paul Zikopoulos, Matthew Huras, George Baklarz, Sam Lightstone, and Aamer Sachedina
Thanks to IBM Champions Jean-Marc Blaise, Iqbal Gorawalla, and Tony Winch who joined Rick Swagerman on a special edition of Tech Talks, called Tech Bits, to share their impressions of the new DB2 10.5 with BLU Acceleration. These are short but very informative and worth the time it takes to listen.
Listen to Berni Schiefer to find out about enhancements to DB2 pureScale that make it even better at delivering always-available transactions. You’ll also learn about updates to SQL, Oracle Database compatibility, business-ready NoSQL technology, and additional NoSQL plans.
More are added all the time. Hear directly from customers who spent time using the new technology with their own data about how it fared.
Read this article, published April 3, 2013 in IBM Data Magazine (ibmdatamag.com), by Guy Lohman, Sam Lightstone, and Berni Schiefer. “Imagine a database technology that gives you 10-20 times faster performance right out of the box, requires dramatically less storage, and nearly eliminates the need for tuning. Too good to be true? Not anymore.”
If you are a developer or DBA and are interested in DB2, take this opportunity to preview emerging technologies that are still in the lab. IBM wants to collaborate with you early in technology conception, prototyping, and iterative development cycles. These technologies may or may not be aligned with announced products. The features or technologies in the download do not represent a commitment or obligation on the part of IBM.
Enjoy the webinars! Learn as much as you can, because when this code is available, you’ll want to start using it in production.
When you’re at the IBM Information on Demand Conference next week, be sure to visit EXPO Booth #832 to learn how you can extend the value of your software investment. At the IBM Support & Subscription, Lab & Training Services Hub (Booth 832) you’ll meet knowledgeable professionals who will help you identify effective strategies to maximize the return on your IBM software solutions.
After chatting with an expert, pick up a pedometer to help track your steps during the conference.
I wore a pedometer to the conference a few years ago and tracked 99 miles over the course of the week! See if you can match me!
Also at this HUB:
Say HI to two of my friends while you’re there! Kate Dawson and Margie Beaudette!
See also my blog post about the Support team: IBM Information Management Customer Support - a team worth celebrating!
Want to know more about ECM or Filenet? Here are some IBM Redbooks that can help you learn more about these IBM solutions:
* IBM Content Manager OnDemand Web Enablement Kit Java APIs: The Basics and Beyond
This book is intended for application developers who are responsible for developing Web applications that interface with Content Manager OnDemand. It also serves as a good reference guide for developers and system administrators to fine-tune and troubleshoot Content Manager OnDemand Web applications.
* IBM Enterprise Content Management and System Storage Solutions: Working Together
This book provides the necessary information to IBMers, business partners, and customers on how to implement FileNet® ECM with IBM Storage Solutions.
* Introducing IBM FileNet Business Process Manager
This publication provides a basic introduction to IBM FileNet® Business Process Manager (BPM) V4.0. BPM enables organizations to create, modify, and manage content centric business processes. One key advantage of BPM is its ability to work with active content, which refers to the ability of content to trigger or affect business processes.
This book is useful for system architects, process analysts, and process designers who require an understanding of IBM FileNet Business Process Manager. It also serves as a practical guide for those who want detailed instructions in order to implement a BPM system.
* IBM FileNet Content Manager Implementation Best Practices and Recommendations
This publication covers the implementation best practices and recommendations for P8 Content Manager solutions. It introduces the functions and features of P8 Content Manager, common use cases of the product, and a design methodology that provides implementation guidance from requirements analysis through deployment and administration planning.
The book addresses various implementation topics including system architecture design, capacity planning, business continuity, repository design, security, and application design. Administrative topics covered include deployment, system administration and maintenance, and troubleshooting. We also discuss solution building blocks that you can specify and combine to build a solution.
This book is intended to be used in conjunction with the product manual and online help to provide guidance to architects and designers about implementing P8 Content Manager solutions.
See also Product Documentation for IBM FileNet P8 Platform.
* Content Manager OnDemand Guide
This publication provides helpful, practical advice, hints, and tips for those involved in the design, installation, configuration, system administration, and tuning of an OnDemand system. It covers key areas that are either not well known to the OnDemand community or are misunderstood.
I found this article about Athabasca University in Alberta that uses the concept of interactive virtual learning environments in some of their classes: Alberta university takes academic cue from video games.
From the article: Young people have long been able to hone physical skills through realistic video games, and the university wants to approach academics the same way, says Rory McGreal, the university's associate vice-president of research.
Hours spent hunched over a computer perfecting how to shoot a gun or explore a virtual environment could also be spent learning how the body works or understanding the universe.
“We're trying to harness the power of games and how we can use them to promote learning,” Mr. McGreal said Wednesday.
Personally I've witnessed the popularity of games with young people and I've also experienced the use of gaming to teach people to learn DB2. I was part of the pilot program that created the first "DB2 Detective Game" and loved seeing the students "get" the power of SQL while playing the game.
If you or someone you know needs to learn DB2... or if you like to volunteer to teach high school students, you need to check out the DB2 games:
IBM DB2 Detective Game
Getting started with DB2 9 and SQL can be easy with this new interactive game. Using a crime investigation theme, this introductory-level game teaches relational database concepts and shows how technology can be applied to solving real-life problems.
IBM DB2 Business Game
Put your DB2 and SQL skills to the test with this intermediate level game! Jump into a “run your own business” scenario, where your company’s future depends on a key report needed to secure funds for a critical upgrade. Play the game and you may be ready to transfer your skills into action!
Sheryl Larsen, top DB2 SQL expert and IBM GOLD Consultant, is our special guest on The DB2Night Show on 8 October 2010 at 10am Central. Learn SQL tips from the master.
Free registration and details:
This week, the DB2Night Show is going "Cross Platform DB2" presented by Sheryl M. Larsen, IBM DB2 GOLD Consultant and President of SMLSQL (www.SMLSQL.com).
The Lessons Learned from SQL Performance Review presentation divulges discoveries and recommendations from various SQL performance review assignments across platforms. Come see if you have similar SQL performance issues and get instructions on how to fix them. Issues discussed include non-optimal index design, access paths gone wild, residual predicates, misuse of SQL, excessive sorting, delayed filtering, and lack of implementation of powerful new SQL features.
Follow The DB2Night Show on Twitter: twitter.com/db2nightshow for the latest news about new episodes and interactive discussion. Use the hashtag #DB2Night in your Tweets.
If you’ve missed a show, remember that you can always get the recorded replay.
Once again, we had an active group of experts leading the discussion and were very pleased at the responses from other attendees. Thanks to all who contributed their time and expertise! Here is a summary of the discussion:
BigDataAlex A1: In-Memory Computing (IMC) utilizes RAM-DRAM for extremely fast I/O, moving us away from slow, underutilized spinning disk
Natasha_D_G A1:In-memory tech enables businesses to utilize data stored in main memory vs fragmented/siloed trad databases
jeffreyfkelly A1 in-memory refers to storing data in main memory (DRAM) rather than spinning disk
Natasha_D_G A1: In simplest form: open book exams vs memorizing answers. Time it takes to search for answers test is over!
BigDataAlex A1: IMC reduces power and storage costs, revolutionizing access.
jameskobielus A1: In-memory puts data into RAM to enable interactive visualization exploration of patterns & real-time transactions
jeffreyfkelly A1 much faster to pull data from memory than disk - response time much quicker than spinning rusty metal allows
InfoMgmtExec Info Mgmt has always been about "managing the bottlenecks". A major one has always been the database itself. In-Memory helps a lot.
BTRG_MikeMartin In-memory enables you to increases the exploration aspect of Big Data
Natasha_D_G Without adding time to equation RT In-memory enables you to increases the exploration aspect of Big Data
cristianmolaro A1 memory access is way faster than disk I/O... even against SSD
jameskobielus A1: Speed of thought is any tech that doesn't have any architectural bottlenecks that arbitrarily slow people's explorations
CuneytG A1 In-memory means fast access to data
Natasha_D_G A1: Memory makes diff! Ability to deliver accurate answer w/o pregnant pauses impacts biz agility
cristianmolaro Combine massive parallel computing with on-memory processing and you will get a super super fast bigdata machine
BigDataAlex A2: Working with streaming data to analyze audio – processing in real-time 32 petabytes a day burn rate.
jeffreyfkelly A2 anything requiring speed-of-thought response time - allows for exploration of large data sets in near real-time
zacharyjeans A2: Logistics. SAP HANA reduced a chinese bottled water company's calculation time from 24 hours to under a minute.
katsnelson Q2 Call Detail Records processing in memory. 9 billion CDRs per day. Can't think of a better case for memory
Natasha_D_G A2: In-memory tech = gold in #CX tactics and can drive proactive #custserv: up-sell, cross sell
cristianmolaro A2 I cannot think about any application that would not take advantage of faster processing...
InfoMgmtExec A2 - Apps such as High Frequency Trading and Real-time Risk/Fraud Analysis come to mind as strong users In-Memory. Many more.
cristianmolaro A2 When you remove the I/O constraints by going on-memory you will hit the next performance wall: CPU
jameskobielus A2 Killer apps of in-mem $ visual exploration, modeling & scenario exploration. Data science, "spreadsheet on stereo
jeffreyfkelly A2 smart meter analytics
katsnelson A2 many apps where data is not valuable enough to even store on disk. In Streams we process stuff in memory and discard
dfloyer The importance of Data in Memory (DRAM or FLASH) is increasing the number of DB calls and increasing the value to the App
CuneytG A2 fraud detection and investigation is a good candidate
Natasha_D_G Good for #finserv & #insurance RT @CuneytG: A2 fraud detection and investigation is a good candidate
TerraEchos Definitely has great security applications! RT @CuneytG: A2 fraud detection and investigation is a good candidate
zacharyjeans A2 - Apps such as High Frequency Trading and Real-time Risk/Fraud Analysis come to mind
BTRG_MikeMartin National security is another area in-memory is key, however we can't say more about it than that.
jeffreyfkelly A2 investigating network traffic issues, finding bottlenecks
cristianmolaro A2 on-memory allows applications to fully exploit today's more and more powerful CPUs... good news for bigdata!
cristianmolaro then scale with more CPU in parallel! :-) Then the next performance wall will be inter-CPU communication...
IBMbigdata Always a wall RT @cristianmolaro: Then the next performance wall will be inter-CPU communication...
jeffreyfkelly A2 analyzing high-velocity financial data in trading scenarios - no time to lose in this use case!
jameskobielus A2 Killer app of in-mem is ability to rapidly evaluate, iterate, & refine statistical models
dfloyer Memory in just DRAM limits the scope of the problem that can be tackled - large memory can impact L1/L2 caching performance
tomjkunkel And who are the 3 leading providers?
jeffreyfkelly A2 iterate, iterate, iterate
troycoleman Do you see any in-memory databases running on z/OS?
jameskobielus A2: Yes. Any transactional app that demands split-second response benefits from in-memory.
jeffreyfkelly A3 ad tech - analyzing user data, real-time bidding, delivering personalized content - in milliseconds
InfoMgmtExec A3 Orgs want entire Customer Base, Product Sku's & Pricing in Memory for rapid transaction processing. Customers will not wait!
katsnelson A3 many Streams apps are transactional and Streams is always in memory.
CuneytG A3 all oltp apps need to be fast. In memory is fast too. So any oltp app is in the scope of inmemory
IBMbigdata Impatient souls RT @InfoMgmtExec: A3 Orgs want entire Customer Base, Product Skus & Pricing in Memory
BigDataAlex A3:Connecting the Internet of Things - IP addressable sensors to real-time calibrate our models for better predictive analytics
jameskobielus A3: faster transactions that result from caching more frequently used data in RAM at the server and/or client
Natasha_D_G Needed in our "instant" mrkt RT @InfoMgmtExec: A3 Orgs want entire Customer Base, Product Skus & Pricing in Memory
zacharyjeans Is @Spotify an in memory application?
Ercan__Yilmaz @Spotify uses in-memory caching, so do millions of others
zacharyjeans I suspected as much. I can't imagine that @Spotify could deliver seamless streaming w/out in-memory caching.
jameskobielus A3: next best action, bridging analytics & transactions, could benefit from in-mem 4 low-latency data & execution
jeffreyfkelly A3 any transaction workload that requires real-time response in order to win/save/upsell the customer is in-memory candidate
johncrupi We have to treat in-memory as the new architectural tier for real-time analytic apps
jameskobielus A3: transactions are C (create), U (update), & delete (D) intensive...all can go faster if in-mem & no disk access
troycoleman Do customers tend to replicate data to the target platform that is running the in-memory database?
BigDataAlex A3:Working in the oil and gas industry-energy exploration requires millions of transactions a day for discovery of new resource
jasebell Think Point of Sale predicted coupons via receipts, customers don't want to wait. In memory wins.
rkeshavmurthy To increase analytics speed and reduce manual tuning. e.g custom coupon generation
furrier where are all the big data apps? they are already here. Analytics & in memory make them better
furrier A3: memory is making up for disk speed & is now becoming more important in software models-big oppty
johncrupi analytics is the killer use case for in-memory, IMO.
furrier last night I spoke with SAP execs - apps are all the ones we know about & they are all getting bigdata upgrades with new tech
rkeshavmurthy customers are using in-memory approach for simply accelerating traditional BI, recently with analysis of sensor data
InfoMgmtExec A3. In-Memory db will allow Predictive Models to be deployed into Transactional Work Flows for real-time scoring & prediction
zacharyjeans I imagine In-Memory computation could benefit @NASA's efforts to identify asteroids & their object oriented trajectories.