Q5 Once you decide what data to keep, how do you make sure it goes to the right systems and people?
jeffreyfkelly From a developer perspective, Big Data app dev tools need to improve, make it easier to deliver insight to business users.
BTRG_MikeMartin You must increase control of wasteful data even with Big Data Management, archive/retire & dispose http://t.co/Ta361ASBkP http://t.co/904FjTm2yA
Again it's a matter of information liability and asset management. Which is more critical to your organization: cutting risk or mining value?
BTRG_MikeMartin Big Data doesn’t change retention. Keep the data you need, get rid of the rest . You can’t afford to keep it all. http://t.co/Ta361ASBkP
dvellante This is a metadata problem / opportunity
craigmullins Policies, procedures, automation and education are needed to ensure that Big Data makes its way to the right systems + people.
Big Data approach needs to include improved business outcomes which
requires people process & technology working in harmony.
BTRG_MikeMartin You need to instrument processes to not only govern but make the best use of valuable data.
Data is code in the new paradigm of new apps & services - lots of issues, so developers create & data can learn & be smart.
furrier The integration of data creates new datasets - the future is smart data and learning data - data is code.
jeffreyfkelly Exactly, and new data sets could be highly sensitive - need governance RT @furrier: the integration of data create new data sets
Betharonoff Data as a commodity already exists, so economy is only a few steps down the road.
furrier Meta data practices will be impacted in this data quality and data-as-code concept.
furrier One aspect of this chat is business competitiveness in integrating data as code into business lifecycle and processes.
BigDataAlex Metadata tags are aligned to role based systems - automated systems.
If you don’t improve processes with Big Data management and create better business outcomes, your Big Data initiative isn’t a success.
Kari_Agrawal When and how do we decide to discard the extremely old data? Or do we retain it as in Data Warehouse?
craigmullins You need policies and automated procedures based on retention requirements.
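To make the idea of automated, retention-driven procedures concrete, here is a minimal sketch; the record classes and retention windows below are hypothetical illustrations, not anything a participant specified:

```python
from datetime import date, timedelta

# Hypothetical retention rules: days to keep each class of record.
RETENTION_DAYS = {
    "transaction": 7 * 365,   # e.g. financial records kept ~7 years
    "clickstream": 90,
    "temp": 7,
}

def is_expired(record_class: str, created: date, today: date) -> bool:
    """A record is expired once it outlives its class's retention window."""
    days = RETENTION_DAYS.get(record_class)
    if days is None:
        return False  # unknown class: keep until a policy is defined
    return today - created > timedelta(days=days)

def partition_for_disposal(records, today):
    """Split records into (keep, dispose) lists per the retention policy."""
    keep, dispose = [], []
    for rec in records:
        target = dispose if is_expired(rec["class"], rec["created"], today) else keep
        target.append(rec)
    return keep, dispose
```

In practice the rules table would come from legal and compliance requirements, not from code - the point is only that once the policy exists, disposal can be automated.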
PPB13 How do practitioners overcome emerging skepticism in the marketplace? http://t.co/x0pHJCDcfN
@BTRG_MikeMartin: You need to instrument processes to not only govern but make the best use of valuable data
BigDataAlex Moving "beyond search"
TheSocialPitt ALWAYS start Big Data project by thinking+planning. More data does not fix bad process.
BTRG_MikeMartin What processes to improve: ediscovery, ECM, Data Governance, Data Security, data retention, and data quality.
IBMbigdata Who decides "best"? RT @BTRG_MikeMartin: You need to instrument processes to not only govern but make best use of valuable data
joycetompsett Data quality has to take on the idea it will be moving around different sys/APIs Big Data management > critical for security #RSAC
BTRG_MikeMartin Without the right tools, data retention with Big Data could be a nightmare.
skenniston RT @furrier: We all want data retention but who owns it after it's retained..will a data marketplace economy develop?
BTRG_MikeMartin That's where determining business value, legal holds, and regulations come in - typically that covers only 30% of data.
Kari_Agrawal How exactly do we begin to classify data in case of Big Data?
craigmullins MT @PPB13: How do practitioners overcome skepticism... <-- by continuing to do work that adds value to your company
dvellante @BigDataAlex yes re: search - it's sometimes used as a 'blunt instrument'
Aarti_Borkar Deciding what data to retain needs to start with business policies defined upfront - it's not an "on the fly" decision.
praxsozi RT @jeffreyfkelly: Q5 Big Data requires rethink of business processes - this is NOT a trivial exercise
Natasha_D_G Culture also plays role RT @jeffreyfkelly: Q5 Big Data requires rethink of business processes - this is NOT a trivial exercise
TheSocialPitt Data antique dealers RT @furrier: We all want data retention but who owns it after it's retained.will data marketplace develop?
tomjkunkel @BTRG_MikeMartin Integrated effort with Legal, Finance, Sales, Marketing with IT serving through best architecture.
BTRG_TomNestor The process must lead to better data which should drive better business opportunities.
BTRG_MikeMartin Big Data is not immune to the laws of information economics: http://t.co/Ta361ASBkP #CGOC
Q6 How does Big Data affect data lifecycle management? Does big data introduce new stages to the info lifecycle?
Summary of top answers:
BigDataAlex Yes, new stages - stages we haven't even imagined yet. Data needs to update itself into authoritative sources.
craigmullins One issue that arises is "How can you create realistic test data for testing Big Data systems and applications?"
jeffreyfkelly Yes, but we are just starting to understand Big Data lifecycle mgt - need to build out best practices.
BTRG_MikeMartin Big Data might not create new stages in life cycle management, but certainly with new domains we have to extend the data lifecycle to new platforms.
I disagree - I think a new stage of LCM includes the emergence of new data sets created from the integration of other data sets, and then yet newer data sets created from integrating those, and on and on and on.
Aarti_Borkar Big Data makes handling the lifecycle of data a far more complex problem than before.
Natasha_D_G Can u say more? RT @Aarti_Borkar: Big Data makes handling the lifecycle of data a far more complex problem than before
Aarti_Borkar Big Data does not create new stages - just new ways to apply the existing stages to different use cases.
Dmattcarter What are some of those new use cases?
Aarti_Borkar Test Data and Privacy for Big Data is critical - as we bring in more data potentially creating a bigger security threat.
BigDataAlex Is there a new data management paradigm emerging?
craigmullins A new paradigm may indeed be emerging.
BTRG_MikeMartin RT perhaps a refined one
TheSocialPitt One new stage = 'ephemeral'.
craigmullins Let's not burden Big Data with things little data has not yet mastered.
craigmullins Sometimes we forget that - in practice - many orgs do not follow a lifecycle, practice data governance, ensure quality, etc.
craigmullins So yes, Big Data should do these things, but it is not failing if it does not.
Big Data Management requires identification & deletion of ROT - redundant, obsolete & trivial data - which reduces storage and cost.
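A rough illustration of how ROT identification might work - the duplicate/age/emptiness criteria here are simplified assumptions for the sketch, not any product's actual logic:

```python
import hashlib
from datetime import date

def classify_rot(files, today, obsolete_after_days=1825):
    """Tag each file dict as redundant, obsolete, trivial, or keep.

    Hypothetical criteria: empty content = trivial, duplicate content
    hash = redundant, last access older than ~5 years = obsolete.
    """
    seen = set()
    tagged = []
    for f in files:
        digest = hashlib.sha256(f["content"].encode()).hexdigest()
        if not f["content"].strip():
            tag = "trivial"
        elif digest in seen:
            tag = "redundant"
        elif (today - f["last_access"]).days > obsolete_after_days:
            tag = "obsolete"
        else:
            tag = "keep"
        seen.add(digest)
        tagged.append({**f, "rot": tag})
    return tagged
```

Anything not tagged "keep" becomes a candidate for archive/retire & dispose - subject, of course, to legal and regulatory review first.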
StevenDickens3 What role does the community see for the original Big Data system of record the mainframe?
BTRG_MikeMartin Consider impacts of eDiscovery, governance, security and #ILM on Big Data stores how do we move traditional methods to Big Data management.
BigDataAlex Many organizations can only afford to store 20 copies of the same data - they are looking for the authoritative copy to process against.
jeffreyfkelly definitely, data sprawl becomes an issue
Kari_Agrawal How do we deal with redundancy in case of Big Data?
StevenDickens3 What is the collective view of centralised data vs multiple federated copies ?
Could some Big Data mgmt stages be the elimination of stages? Using
data/ data analysis without constraint and eliminating steps.
Aarti_Borkar Masking test data is essential to Big Data development: what the enterprise considers private needs to always be privatized.
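One common way to privatize test data is deterministic pseudonymization. This sketch shows the idea; the field list and key are hypothetical, and a real deployment would use a proper masking tool with keys held in a key store:

```python
import hashlib
import hmac

# Hypothetical secret; in practice this would live in a key store.
MASK_KEY = b"test-environment-key"
PII_FIELDS = {"name", "email", "ssn"}  # fields the enterprise deems private

def mask_record(record):
    """Return a copy with PII fields replaced by keyed, repeatable tokens.

    Using HMAC keeps masking deterministic, so joins across masked
    data sets still line up, without exposing the original values.
    """
    masked = {}
    for field, value in record.items():
        if field in PII_FIELDS:
            token = hmac.new(MASK_KEY, str(value).encode(), hashlib.sha256)
            masked[field] = token.hexdigest()[:12]
        else:
            masked[field] = value
    return masked
```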
craigmullins My next two Tweets mentioned some of them. Not saying Big Data shouldn't - just that our standards should not be too high.
craigmullins @BigDataAlex Yes #littledata concepts apply to Big Data... but many orgs still struggle with managing little data
tomjkunkel @Kari_Agrawal Destroy it! I can provide insight on best practices
Dmattcarter Pretty intense data quality and Big Data conversation going on around Big Datamgmt chat!
This is already a data problem with #smalldata, carrying over to Big Data mgmt. Too costly to delete all data that has no value.
BTRG_MikeMartin You must increase control of wasteful data even w Big Datamgmt, archive/retire & dispose: http://t.co/Ta361ASBkP http://t.co/904FjTm2yA
Q7 Are new tools and platforms required to manage Big Data and the new dimensions of the data lifecycle?
Summary of top answers:
craigmullins Tools for performing advanced analytics on Big Data – though not new to the industry – will be new to many organizations.
BigDataAlex Yes. We need new tools, platforms, and systems....it is happening. Calling for massive innovation - love #DataAsCode.
BTRG_MikeMartin @craigmullins It's called defensible disposal http://t.co/Ta361ASBkP
craigmullins Hadoop-based products will need to be augmented with mission-critical DBMS capabilities to become de rigueur.
craigmullins But I think DB2 (and other RDBMS products) could be extended with Big Data capabilities before that happens.
BTRG_MikeMartin Flexibility & scalability of Big Data platforms will themselves assist in helping Big Datamgmt security & controls...
BigDataAlex We need DigitalDNA - anticipating the Internet of Things - World Wired Web.
Aarti_Borkar It’s a mix of new tools and enhancing existing tools. The core solution does not change; it morphs.
StevenDickens3 All depends where data resides today and whether the current platform/tools are fit for purpose, if yes why move or retool?
tomjkunkel Legacy storage assets can't handle high-availability, low-latency applications and need to be displaced.
jeffreyfkelly Yes, a major topic at #strataconf is making Big Data enterprise ready -need better mgt, data gov, DQ capabilities.
Aarti_Borkar Key innovation is required to ensure that both traditional and #big data are uniformly governed.
BTRG_MikeMartin InfoSphere Optim helps you get control of structured data to feed only the good into Big Datamgmt: http://t.co/Y5Jniunn6N
jeffreyfkelly And don't forget security - #RSAC - must keep Big Data secure
craigmullins Which brings up regulatory compliance... another big issue
Aarti_Borkar Big Data gov starts with a uniform set of data classification and policies that cover ALL data. Metadata is the magic here.
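The "metadata is the magic" point can be sketched as a simple lookup from a classification tag to its governing policy. The levels and policy values below are invented for illustration - the real set would come from the governance board:

```python
# Hypothetical uniform classification levels and the policies tied to them.
POLICY_BY_CLASS = {
    "public":       {"mask": False, "retain_days": 365,  "encrypt": False},
    "internal":     {"mask": False, "retain_days": 1095, "encrypt": True},
    "confidential": {"mask": True,  "retain_days": 2555, "encrypt": True},
}

def policy_for(metadata):
    """Look up the governing policy from a data set's metadata tag.

    The same lookup applies whether the data set is a relational table
    or a Hadoop directory - the classification metadata is what's uniform.
    """
    cls = metadata.get("classification", "confidential")  # default to strictest
    return POLICY_BY_CLASS.get(cls, POLICY_BY_CLASS["confidential"])
```

Note the design choice: untagged or unknown data falls back to the strictest policy, so gaps in classification fail safe.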
craigmullins If the Big Data contains PII then all the regulations that apply to PII still apply - doesn't matter how big the data set is.
BigDataAlex Does this spill over into machine learning? Can we reduce dimensionality of data through associative memory?
BTRG_MikeMartin Yes - innovation & big ideas, as well as changing our paradigms.
Betharonoff Interesting query RT @BigDataAlex A7: Can we reduce dimensionality of data through associative memory?
Q8 How does Big Data impact data stewardship? Who “owns” particular data in a big data environment?
BigDataAlex Great question - ownership is beginning to blur - standard licensing models for data are being challenged.
craigmullins All data is owned by the company, whether it is Big Data or not…
jeffreyfkelly Ah, but is it? social data, market data etc.
Internal ownership of Big Data, while beyond traditional areas, should still be based on business value, compliance, or legal hold.
craigmullins Of course proper data governance policies need to be enacted by the corp to confer #datastewardship and ensure proper treatment
BTRG_MikeMartin Without good data stewardship & Big Datamgmt it will be difficult to unlock the value of big data: http://t.co/hGJ3QkTiJf
Aarti_Borkar Ownership of replicated data is the original biz owner- governance of that data is still their problem.
jeffreyfkelly This is a really hard one, again new biz processes informed by Big Data will impact who owns the data.
Aarti_Borkar Stewardship does not change just because a new copy of the data was created.
craigmullins True, but some Big Data is all new.
craigmullins The word "own" is always so troublesome, isn't it?
BTRG_MikeMartin Yes, it needs to be well defined.
Aarti_Borkar @craigmullins - Oh so right! .. think "Responsible for".. is better than "own"...
BigDataAlex If DataAsCode, then if DataAsCode is viral, can it be controlled? Do we want it to be controlled? What does ownership mean?
BigDataAlex How does OpenSource apply to our Data?
BTRG_MikeMartin For more Big Datamgmt resources Data Privacy and Security: http://t.co/UL0VNCiivP
This is a lot of information! I hope you can follow the discussions. I tried to clean up a little bit and hope that I didn’t change any content from the participants.
Today I spent an hour taking part in the TweetChat at Big Datamgmt focused on governance to avoid a data landfill: http://t.co/j2wojSb9Hf: "Getting Control of Data in Big Data Era"
It went too fast for me to actually be a contributor, so I was participating as a reader / listener. This kept me busy enough, since by the end we had generated a fair amount of Big Data ourselves: 647 tweets, 180 users with a reach of 136,229 & 1,506,585 impressions.
Who were the experts and facilitators / moderators?
There were 8 questions posed over the hour, but I'm only posting the first 4 here.
Q1 In this Big Data era, do traditional concepts like data quality, data governance & data stewardship even apply?
A summary of the answers:
Big Data refers to datasets whose size, type and speed of creation make
it impractical to process and analyze with traditional tools. That Big
Data definition comes from Wikibon; see http://t.co/awsPyuqXjZ. So given that, definitionally then, traditional concepts are at the very least “impractical”… no?
dvellante My belief is that ingest process & analysis of data changes with big data.
BigDataAlex Yes, I think they apply. Our clients are very concerned about these issues and it does apply.
jeffreyfkelly Absolutely, but vastly more complex.
Natasha_D_G Traditional concepts are even more critical in Big Data era especially in data governance.
craigmullins But of course, data quality, data governance and data stewardship SHOULD apply in the age of Big Data Management.
You still need clean and common policies for data taxonomies, but unstructured and semi-structured data textures require some new thinking and technology - specifically, ideas around function shipping, name-value pairs, Hadoop, etc., applying traditional concepts to a new model.
Dmattcarter In order for Big Data to be enterprise-ready, it needs to include those traditional concepts.
jeffreyfkelly The challenge is applying DQ and governance to high velocity data - hard enough with "traditional" data, ie CRM, ERP.
craigmullins Failing to apply these concepts will result in poor data quality. Analytics performed on bad quality data produces bad results.
BigDataAlex I think transparency is important too in this era of Big Data and how we govern. I would suggest a Big Data Ethics manager.
BTRG_MikeMartin IG concepts apply to Big Data even more so, as the issues solved by information governance are only exacerbated.
furrier Data quality has to take on the idea that it will be moving around different systems/APIs.
Yet there are issues and adaptations that will be required as we apply data quality, data governance and data stewardship to Big Data.
BigDataAlex Love the challenge on high velocity data....algorithms in streams.
jeffreyfkelly Big Data is experimenting with data sets, while governance is applying policies that sometimes restrict experimentation.
BTRG_MikeMartin You can’t make good business decisions on bad data. http://t.co/8J1pQPy6eW
Natasha_D_G Data quality is an issue as "94% biz believe some of their customer/prospect info is inaccurate".
Data governance is critical in the Big Data management era, as it makes small problems bigger. You need data quality to enable BigInsights http://t.co/yVTA9NpXIB
furrier Data as a resource for applications; ownership of data is important to individual and/or company.
BigDataAlex In health care sector, orgs are combining medical ethics with their CIOs.
Aarti_Borkar Governance is even more important with Big Data as the security and trust is a bigger business issue now.
dvellante In part this is a discussion around the balance between data being an asset and a liability - good DQ is important for both.
searchCIO Metadata practices are gaining momentum as companies tackle Big Data. http://t.co/DSkdH4Yk6S
Q2 With data at unprecedented speed/volume, how can data quality measures be applied in time for analysis?
A summary of the answers:
With data quality, cleansing can occur as humans eyeball the data - but most raw Big Data is not eyeballed. In some cases (e.g. medical devices, automated metering, etc.) only rudimentary cleansing (if any) may be needed - at least as long as the meters are calibrated and working properly.
BigDataAlex Real-time analytics is critical. We love Streams. The right algorithm at the right time.
Natasha_D_G Trust = Word we try to avoid. @Aarti_Borkar: Governance is even more important with Big Data as security & trust bigger biz issue.
To deal with Big Data speed and volume: be proactive by starting Big Data Management across the enterprise now & maintain it http://t.co/hGJ3QkTiJf
Aarti_Borkar Data Quality for Big Data can be handled right upfront, before starting Big Data analysis.
BigDataAlex A next-generation of KPIs for quality vs. quantity are being implemented to separate quality from quantity in real-time.
furrier Data quality is about the context of the application & what users experience for each use case is not always the same.
jeffreyfkelly Machine learning is required to improve data quality for Big Data - velocity too high for human methods IMHO
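Even before machine learning enters the picture, the flavor of automated, per-record quality checks at velocity looks something like this sketch - field names and thresholds are hypothetical, and a learning system might derive such rules from history rather than hard-coding them:

```python
def validate_stream(records, rules):
    """Apply automated per-record checks; route failures to quarantine.

    `rules` maps field name -> predicate. Records missing a field or
    failing any predicate are quarantined instead of analyzed.
    """
    clean, quarantined = [], []
    for rec in records:
        ok = all(field in rec and check(rec[field]) for field, check in rules.items())
        (clean if ok else quarantined).append(rec)
    return clean, quarantined

# Hypothetical rules for a stream of sensor readings.
rules = {
    "temp_c": lambda v: -50 <= v <= 60,  # plausible sensor range
    "device": lambda v: isinstance(v, str) and v != "",
}
```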
nenshad Variety of algorithms include semantics
zacharyjeans Ask your Big Data well crafted questions. Sloppy questions lead to sloppy answers.
craigmullins Speed + volume make data quality challenging…
searchCIO Data Quality is essential to master Big Data Management http://t.co/pxZ49Xgimm
BTRG_MikeMartin Start now on data quality because if you don’t have it in now Big Data only magnifies data issues http://t.co/hGJ3QkTiJf
Natasha_D_G Excellent question, especially given social media data and its 18-minute life span.
jeffreyfkelly Also with Big Data, volume of data can sometimes smooth over anomalies in data quality.
Aarti_Borkar Data quality should also be handled as the results of the analysis are merged back into the reporting marts.
BigDataAlex The right analytics at the right time against the systems of systems integration.
dvellante Perspectives from a former CIO on the importance of data quality http://t.co/mYPfqNCCjm
nenshad It’s all about the data first
dvellante In my view you can't deal with Big Data quality unless you can automate the classification of data at the point of creation.
Kari_Agrawal How exactly do we clean the data when it has no structure?
BTRG_MikeMartin You can’t make good decisions and enable business biginsights without high data quality.
furrier Dirty data equals poor user experience. I wrote about it in 2009 re: twitter facebook & social data http://t.co/vpkfB0xS3h
Aarti_Borkar Data quality should be handled as part of data integration, as Information Server customers do - it's the same with Big Data.
Q3 How do data governance policies apply when the point of Big Data is to explore novel use cases?
A summary of the answers:
craigmullins Finding novel uses of data does not diminish the need for data governance policies.
Natasha_D_G True, but still need boundaries.
BTRG_MikeMartin Exploring Big Data still requires trusted data so you must secure and govern even more so. http://t.co/UL0VNCiivP
craigmullins The novel uses need to be documented as part of the data governance policies.
BigDataAlex The right policy at the right time. I think you can have agility with accountability.
craigmullins Keeping in mind that even under ideal circumstances data governance policies can be difficult to enact.
Big data isn't just for novel new business cases - it can also vastly improve value in existing ones, e.g. R&D and customer service.
craigmullins Consider non-intrusive data governance; see this article by my friend Bob Seiner http://t.co/GogojXCcoV
Seiner states: data governance refers to the administering
(formalizing) of discipline (behavior) around the management of data.
craigmullins And data governance is an on-going process; it should formalize what already exists + address opportunities to improve.
jeffreyfkelly There is a need to set up boundaries but give analysts freedom to explore Big Data.
furrier Innovation will not come from regulations but from creative developers playing with data - #slipperyslope
Q4 How does Big Data change data retention policies, ie, deciding what data to keep vs dispose?
A summary of the answers:
tomjkunkel Formal Data Destruction processes minimize the growing data landfill and need to be incorporated into Data Lifecycle Mgmt.
dvellante Still must be able to defensibly delete data. You may not want WIP data hanging around - too much of a risk.
BTRG_MikeMartin Big Data is not immune to the laws of information economics: http://t.co/Ta361ASBkP
BigDataAlex Focus on workflow, business process, optimization. There is no set answer. Filtration - distillation
BTRG_MikeMartin Velocity of Big Data means current best data is changing rapidly, you want decisions on the best info.
BTRG_MikeMartin It is important to have a Big Data Management framework for good business outcomes, inc. policy, security, ILM & quality.
Data is retained for internal + external reasons... Internal because
the org needs it for business – external because the law demands it.
tomjkunkel Isn't there also a need for Data Entrepreneurs (A business perspective with a knack for data)?
You may choose to retain more data for Big Data Management analytics, but be careful, because data once retained is discoverable in court proceedings.
furrier Big data complicates data retention policies - we have shadow IT and now "shadow data" or what I call "dark data".
Natasha_D_G Big Data can extend data retention esp in R&D. Pharmas can leverage old research to accelerate new research.
jeffreyfkelly This is a major issue: with hadoop you can now store all data inexpensively - not possible before and new challenge.
BTRG_MikeMartin NO still too costly.
Kari_Agrawal If we see the huge amount of IP packets flying around, can we process those packets to get something meaningful?
craigmullins There are over 150 different regulations (at the local, state, national, and international levels) that impact data retention.
Aarti_Borkar Retention is about storing what the business needs later vs everything - that core concept does not change with Big Data.
BigDataAlex Do we need to store everything? Can we, should we?
craigmullins No, no, and no to that last series of questions!
Natasha_D_G Data hoarders say keep it all! Fear of losing critical info.
jeffreyfkelly Nothing worse than looking for data you know you had only to remember you threw it away!
Aarti_Borkar Defensible disposal of data becomes harder if multiple copies are made as part of Big Data analytics.
craigmullins MT @Aarti_Borkar: Defensible disposal of data becomes harder if... hence the need for #datagovernance policies!
furrier We all want data retention but who owns it after it's retained..will a data marketplace economy develop?
TheSocialPitt Storage is a huge challenge, especially in cases with many streaming video feeds, e.g. defense.
Keep in mind regulations haven't caught up with the technology - industry needs to be proactive on this issue or the government will.
Aarti_Borkar Big Data allows for pattern searches and trends in retained data that was not easy to do earlier.
To find out more about managing big data, join IBM for a free event: http://ibm.co/BigDataEvent
* Originally published on July 30, 2012 by Crysta Anderson at MasteringDataManagement.
An IBM Champion for DB2, Bjarne Nelson has trained DB2 users around the globe,
from his home in Switzerland to multiple Korean engagements. Regardless
of location, though, Bjarne bonds with students who really want to learn
how to get more out of DB2.
Since 1985, Bjarne has consulted in multiple industries, becoming an IBM Gold
Consultant in 1996 and working closely with IBM Labs. He also serves on
the IDUG Board of Directors and has chaired the European Conference.
Many know Bjarne by his nickname, Kermit. Of course, we had to ask the
origin of the moniker. Bjarne explained that he once had long hair, a
“very green car,” and did a spot-on impression of Kermit the Frog as he
introduced his rock band to crowds.
The name stuck, as has the role of mainframes. Bjarne disagrees that
mainframes are going the way of the dinosaur. He explains, “Of course
there are huge changes in technology, but the mainframe is capable of
still surviving. It’s probably the longest living platform that we have
in IT…that continues to maintain its virtue of stability, performance,
high scalability. And at the same time, you can still build modern
programming paradigm applications,” while letting the data reside safely
on a “good old mainframe.”
But while mainframes persist, organizations face evolving compliance and
security needs that require new software and solutions. Bjarne has found
DB2 10’s new features very helpful for meeting these needs, though new
software means migrations.
“We all have to go through migrations,” Bjarne noted, explaining that
conference and web sessions dealing with migration topics and best
practices tend to stand the test of time long after their original
presentation. Such migrations – which may lag several years after a release – make it more important than ever to collaborate throughout the migration.
Each migration is an educational experience, Bjarne said, and afterwards,
many “are proud of what they know” and want to share their insights.
Hence, Bjarne has worked with IDUG to assemble a content committee that
shares information between conferences. The resulting IDUG DB2 Tech Channel on BrightTalk has proved very popular, with more than 1200 technical webcasts ready for viewing.
But that doesn’t mean conferences are going away. Face-to-face
conversations and the opportunity to ask questions are very valuable.
Bjarne explained, “If you go to an IDUG conference and pick up 3 or 5
good things you can go home and implement…that’s enough to pay for a
conference because quite often, it’s the issues you’d otherwise spend
hours and hours trying to solve yourself.”
In his spare time, Bjarne is an avid SCUBA diver, particularly in the Red
Sea, and has an impressive music collection that he uses to bridge the
generation gap with his children and granddaughter, Naia, when she gets
old enough to choose her own music.
Last year, David Pittman and Crysta Anderson interviewed many of our IBM
Champions and created a series of blog entries and podcasts on
“MasteringDataManagement". This site is going away, but I thought the
content was still very valuable.
As a result, I have permission to republish these articles on my blog. I
plan to do one every day or so until I’ve finished. Here are the
entries I plan to publish:
- Bjarne Nelson
- Phil Gunning
- Roger Sanders
- From Idea to Print
- Cuneyt Goksu
- Julian Stuhler
- Martin Hubel
- Bonnie Baker
- Iqbal Goralwalla
- Alex Philp
- Dan Luksetich
- Dave Beulke
- Isaac Yassin
- Cristian Molaro
- David Birmingham
- Kim May
- Fred Sobotka
- Sheryl Larsen
- Scott Hayes
- Mike Martin
I’ll update the entries if necessary and will add links to this page so you can easily find out about our awesome IBM Champions!
Thanks to Crysta and David for all their hard work. I hope you find this series as valuable as I do.
IBM InfoSphere: A Platform for Big Data Governance and Process Data Governance
by Sunil Soares
ISBN 978-1583473825, MC Press, February 2013
Governance has taken a backseat to the analytics and technologies associated with big data. However, as big data projects become mainstream, we anticipate that privacy, stewardship, data quality, metadata, and information lifecycle management will coalesce into an emerging imperative for big data governance.
Foreword from David Corrigan, Director, Product Marketing, InfoSphere
The importance and the role of a governance strategy are still not well understood. Information Governance is a business strategy that has a series of IT deliverables. Sunil has been one of the pioneers in this area, defining the Unified Information Governance Process several years ago. He defined several key steps, such as identifying a business problem and executive sponsor, setting up cross-functional governance boards, and measuring and communicating success. He has applied this process at hundreds of clients and has helped them achieve successful implementations. His approach can also be applied to governing big data. It has helped many organizations get the business involved in governance and establish trusted information for a key enterprise application.
In short, this process helps you move beyond an IT project toward a true business strategy. It helps by getting business executives and owners involved in the process of governing data. It helps ensure successful outcomes. Sunil, thank you for continuing to contribute to the discipline of Information Governance and move it into the new era of computing—the era of big data.
And to the readers of this book, remember that the competitive advantage you seek from insights garnered from big data has two components: big data analytics and trusted information. Information Governance creates trusted information from very uncertain sources, enabling you to trust and act upon the insights from analytics. I wish you well in your big data strategy.
Sunil’s other books:
The IBM Data Governance Unified Process (MC Press, 2010)
Details the 14 steps and almost 100 sub-steps to implement an information governance program. The book has been used by several organizations as the blueprint for their information governance programs and has been translated into Chinese.
Selling Information Governance to the Business: Best Practices by Industry and Job Function (MC Press, 2011)
Reviews the best practices to approach information governance by industry and function.
Big Data Governance: An Emerging Imperative (MC Press, 2012)
Discusses the governance of different types of big data.
Congratulations to Sunil on this latest book!
Recently, the three judges of the DB2Night Show “DB2’s Got Talent” Competition
presented along with host Scott Hayes. It was fun, and hopefully
informative to the audience.
I presented first and sadly ran out of time! It is much harder than it looks. My presentation was on the Power of Social Media. I gave tips
on how people can promote the work that they do to the DB2 community,
how to listen to what is going on in the community, and how people can
contribute via likes, comments and sharing the content provided by others.
Since I didn’t come close to finishing (did I spend too much time on my bio? give too many examples?), I’ve posted my slides on SlideShare. Here’s where you can get them: Db2 night social_promoting_susan.
Martin presented next: “How I Started Dropping DB2 Indexes & Put my Girls
Through College”. Martin is a great teacher and his presentation was
very informative. He also ran out of time and reminded us that he’ll be doing a similar presentation at IDUG this year.
Scott was next: “Breakthrough DB2 LUW Performance Every Day!”. I loved the
one quote that Scott used: “Success comes to those who focus on helping
other people be successful”. I would say that all involved in this
show have this sentiment in mind! Entertaining presentation and even
Scott ran out of time!
Then came Klaas: “Da House is on Fire: How to Fix a Performance Problem”. Very interesting graphics, and very effective as well. Klaas is a
natural speaker and if you get a chance, you should attend his talks.
Of the 4 presentations, Klaas was the only one who finished on time!
The voting was interesting, but not surprising. Clearly the audience
prefers technical content about DB2. My presentation was in a
completely different category! Congratulations to Martin for being the
overall “winner” of this show.
- Attend the shows in March to witness our finalists competing for the big prize.
- Watch the replays! There have been 220,000 downloads so far. Only 30,000 more to get to ¼ million!
- When you watch today’s replay, be sure to fill out the survey. Every week someone wins a $25 Amazon.com gift certificate.
A tweetchat happens when a group of people all tweet about the same topic
using a specific hashtag that allows the conversation to be followed on Twitter.
We have one planned on Wednesday, February 27 at noon ET, so you can give it a try! Follow #bigdatamgmt to
follow a discussion with a panel of experts on how data
governance can handle big data. Our panel includes experts from across
the spectrum, including:
and perhaps a few other people as well!
Follow along with the #bigdatamgmt hashtag to see the conversation and share your thoughts.
If you’re not yet on Twitter, you can still follow the conversation. Simply search Google for #bigdatamgmt and you’ll see what has been posted using this term.
The Value of Real-Time Decision Making with PureData Systems for Operational Analytics
Join host Serge Rielau and guest James Cho (PureData Systems architect) for
an in-depth discussion about how the PureData System supports real-time
decision making. Real time operational warehousing can help your
organization by providing immediate analytics to speed decision making,
for example, fraud detection.
They will also explain the features and advantages of this expert integrated
system, which provides simplified deployment, maintenance, and more.
The world of data management and analytics is changing very rapidly. This
talk is a great way to expand your knowledge to include some of the
emerging techniques and technologies.
Click to register.
Date: Thursday, February 28, 2013
Time: 12:30-2:00 PM Eastern / 11:30 AM Central / 9:30 AM Pacific / 17:30hrs London/ 18:30hrs Frankfurt, Paris / India 11 PM
The DB2 Tech Talk series is a free set of monthly in-depth technical
webinars from IBM technical professionals. Join the experts to learn
about DB2-related breakthrough technology and best practices, and get your
questions answered during the live Q & A session at the end of each webcast.
Replays of many of the newer webcasts are available on the DB2 Tech Talks page.
Here are some replays you may wish to check out:
Join this free SSWUG Webcast: The Big Deal about Big Data,
starring expert Paul Zikopoulos. The webinar is free only during the
live broadcast, which takes place Wednesday, February 20, 2013, at 1:00 PM
Eastern. What you’ll learn: Big
Data can mean a lot of things to a lot of people, but one thing we're
sure of: it's the hottest thing to hit the IT landscape. In this chat
you'll get a comprehensive introduction to Big Data. You'll learn how to
spot Big Data, its characteristics, and what the opportunities are.
(Hint: be prepared to be shocked on Volume and more.) You'll get a
taste of Hadoop, but realize Big Data is so much more. Paul
will also share top things to consider in the Big Data world that are
often overlooked (governance, integration, search, and more). Consider
this: if you Google search 'What is Big Data', you will get almost 1
billion hits! If you attend this session, you'll never have to Google
search this phrase again. To register and for additional information, see: Webcast Structure and Cost
Note that listening to the live broadcast is free, but ordering the replay has a charge.
We had 4 great presentations again this week! I’d really like to thank
all participants... I know it isn’t always easy to present, but it is
very important for us to share our knowledge with others!
Mikko - Summary Table Design & Monitoring
I had a little trouble following the story of the problem & solution. Summary tables are a huge topic. Well presented.
Umesha - Detailed steps to configure DB2 Distributed Data Facility (DDF) for z/OS
Good slides. Not practised? Went over time. Missing the “why”; only the “how” was covered in the presentation.
Sreeharsha - High Data Availability: solutions and techniques
Problem and solution clearly defined. Presentation was well put
together and rehearsed. This is the way it’s done!
Armando - Dare to be Different (Against all Dogmas) SQL Performance Tuning
Be open minded on how to solve problems. You may find a creative solution that wasn't expected.
Entertaining and engaging presentation with a great amount of style.
Congratulations to Sreeharsha, Armando, and Mikko for moving to the next level.
Be sure to join us next week to hear presentations from me, Martin, Klaas,
and Scott. Your turn to evaluate us! Just for fun, of course. Come
and have some fun!
Over 219,000 downloads of past DB2Night Show episodes! Congratulations Scott!!
If you missed the show, see the replay!
Be sure to fill out the survey when you do... you may be the lucky winner
of a $25 certificate to use at Amazon.com. Perhaps there is a DB2 book
you’d like to purchase?