The issue, or potential bug, I am raising is not specific to the Activities Monitor alone.
At the workload level, the following clauses are available:
COLLECT LOCK TIMEOUT DATA, COLLECT DEADLOCK DATA, COLLECT UNIT OF WORK DATA
At the DB CFG level there are equivalent parameters:
If you set the corresponding MON_ parameter in the DB CFG, there is no need to set the clause at the workload level.
Create a lock timeout event monitor and set the MON_LOCKTIMEOUT parameter, and it will capture lock timeout events even though you did not specify COLLECT LOCK TIMEOUT DATA at the workload level.
The same behavior, however, is not observed with the MON_ACT_METRICS parameter.
A piece of functionality, when introduced, should be consistent across all variations; otherwise, it should be properly documented as a limitation or exception. Neither was done in my example.
I hope this helps you understand the issue and why I am emphasizing this functionality. If it has already been fixed in some release, let me know.
My environment: DB2 V10.5 Fix Pack 0, Express-C Edition, on Windows 7.
In Terminal 1:
db2 activate db test
db2 connect to test
db2 create event monitor actevmon for activities write to table
db2 update db cfg using mon_act_metrics extended
db2 connect reset
db2 deactivate db test
db2 activate db test
In Terminal 2:
db2 connect to test
db2 +c "select * from t_tableA with cs"
db2 +c "update t_tableB set name='XD' where id=5"
db2 +c "call myproc()"
db2 connect reset
In Terminal 1:
db2 connect to test
db2 select count(*) from CONTROL_ACTEVMON
The output is 2, indicating event monitor start-up records.
db2 select count(*) from ACTIVITY_ACTEVMON
db2 select count(*) from ACTIVITYMETRICS_ACTEVMON
db2 select count(*) from ACTIVITYSTMT_ACTEVMON
db2 select count(*) from ACTIVITYVALS_ACTEVMON
All four queries return 0, indicating that the events were not captured.
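For comparison, here is a hedged sketch of the workload-level alternative described above; SYS_DEFAULT_USER_WORKLOAD is DB2's built-in default user workload, and the monitor name matches the one created in Terminal 1:

```sql
-- Enable activity collection at the workload level instead of relying
-- on MON_ACT_METRICS (run while connected to TEST):
ALTER WORKLOAD SYS_DEFAULT_USER_WORKLOAD
  COLLECT ACTIVITY DATA ON COORDINATOR WITH DETAILS;

-- Make sure the event monitor is active:
SET EVENT MONITOR actevmon STATE 1;

-- Re-run the statements from Terminal 2, then re-check:
SELECT COUNT(*) FROM ACTIVITY_ACTEVMON;
```

With this clause in place the ACTIVITY_* tables should be populated, which is what makes the MON_ACT_METRICS-only behavior look inconsistent.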
A continual stream of data reporting requests can be a time-consuming process that takes you away from key priorities. Each request may involve tuning, partitioning, creating indexes, and more. When the report needs changing or a new report is required, you have to repeat the process. This takes a lot of time.
Check out BLU Acceleration, next-generation in-memory computing, on the IBM BLU Acceleration hub to make data reporting and analysis fast and simple: no tuning, no tweaking. A recommended page for technical professionals to get started is here: http://www.ibmbluhub.com/get-technical
Bookmark this site and use the share buttons to share BLU Acceleration with your colleagues.
The 3rd KIDUG DB2 event is happening this November 15, from 9:30 AM to 3:30 PM. There is no registration fee, and it is open to IBMers as well as non-IBMers.
When: 15th Nov 2014 (Saturday), from 9:30 AM to 3:30 PM
Venue: Techno India Campus, Salt Lake, Sector V, Kolkata, India
Who can join: Anyone who has an interest in DB2 or works on DB2
How to book your seat: Send a mail from your official email ID to email@example.com with the subject line "I will attend"
Leading speakers from: Capgemini, MJunction, TCS, IBM
A few lucky KIDUG participants will win free DB2 books worth US$17.95 through a lucky draw.
The Bengaluru DB2 Users Group is excited to announce its upcoming Users Group Conference, scheduled for Saturday, June 21.
The meet, with an amazing set of speakers lined up, promises to provide a greater understanding and enablement of products using DB2 and more. So whether you are a DB2 expert or simply an eager learner, join us for the event happening at the IBM office, Embassy Golf Links, Bengaluru.
Pre-register for the event by sending an email to firstname.lastname@example.org with "I will attend" as the subject line.
We encourage you to share this information with other DB2 users in your company.
See you there.
We often convert Oracle PL/SQL stored procedures to DB2 as-is and then face issues debugging them, especially if they are nested. Right now DB2 has no easy method for this debugging other than some form of variable tracking in Data Studio. The problem grows with nesting, and you may end up rewriting the stored procedure in DB2 SQL to overcome the debugging issue and improve performance.
Another approach is to break the code into parts and add and execute it in chunks. This makes the job simpler and helps identify the error easily.
You can also create an interim table and set checkpoints by recording the execution status from within the procedure or package to narrow down the problem.
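As a minimal sketch of the interim-table checkpoint idea (the table DEBUG_LOG, the procedure name, and the checkpoint labels are all hypothetical placeholders):

```sql
-- Interim table to record how far the procedure got
CREATE TABLE debug_log (
  log_ts     TIMESTAMP     DEFAULT CURRENT TIMESTAMP,
  proc_name  VARCHAR(128),
  checkpoint VARCHAR(254)
);

CREATE OR REPLACE PROCEDURE myproc ()
LANGUAGE SQL
BEGIN
  INSERT INTO debug_log (proc_name, checkpoint)
    VALUES ('MYPROC', 'entered procedure');
  COMMIT; -- persist the checkpoint even if a later step fails

  -- ... first chunk of converted PL/SQL logic ...

  INSERT INTO debug_log (proc_name, checkpoint)
    VALUES ('MYPROC', 'after first chunk');
  COMMIT;

  -- ... next chunk ...
END
```

After a failing run, SELECT * FROM debug_log ORDER BY log_ts shows the last checkpoint reached, narrowing the problem to the chunk that follows it.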
Let's all face it: cloud computing was a fad in 2010; in 2011 it drove key IT decision makers to conduct cloud POCs in the enterprise; and in 2012, adoption will increase further if the current rate of demand continues.
Today, I delivered a webinar on DB2 and Cloud Computing with Databases in the Cloud on IBM DeveloperWorks.
It was an amazing experience talking to DB2 Users and IT Pros in general.
The focus of this article is to illustrate some of the key benefits and resources available for implementing a Cloud Solution.
I have attached the slides that outline the presentation.
So please feel free to refer to the above resources and send me an e-mail in case you have any questions.
Look out for my upcoming blog post, where I will show you how to get started with building VMs from scratch.
President, DB2 India Users Group
DB2 and Cloud Computing Evangelist
Follow me on Twitter : @dbhitman
On April 7, the Object Management Group (http://www.omg.org/), announced the Cloud Standards Customer Council (http://www.cloud-council.org/) to guide how we, as an industry, evolve cloud standards based on real world use cases and experiences.
OMG Press release - http://www.omg.org/news/releases/pr2011/04-07-11.htm
The goals of the initiative are to:
- Drive user requirements into the standards development process.
- Establish the criteria for open standards based cloud computing.
- Deliver content in the form of best practices, case studies, use cases, requirements, gap analysis and recommendations for cloud standards.
As Cloud Computing continues to evolve it is very important that the cloud remains Open and that there is a process to make cloud standards customer driven.
The membership has grown from 45 at the launch to over 100 members in just two weeks. Members represent a variety of industries and geographies, all committed to working towards an open cloud.
Mel Greer of Lockheed Martin has been appointed the interim chair of the Council’s Steering Committee. Formal elections of the Steering Committee will take place at the first face-to-face meeting on June 21st.
Your support for the Open Cloud Manifesto is appreciated and we see the Cloud Standards Customer Council as an extension of the Manifesto.
You are invited to have your company join the Cloud Standards Customer Council: add your voice and your requirements to the community, and become part of a team dedicated to accelerating the adoption of cloud computing and to ensuring the flexibility, portability, and interoperability of cloud services by keeping the cloud open.
Membership is free for qualified end-user organizations. The membership application is available at http://www.cloud-council.org/application. Vendors may join as sponsors. For membership or sponsorship questions, contact Ken Berk at email@example.com or +1-781-444 0404.
Make Cloud Standards Customer Driven.
Look forward to your participation in the Cloud Standards Customer Council.
Thank you, Dave
The first virtual meeting of the CSCC (www.cloud-council.org) has been held. Work is underway to prepare for the first face-to-face meeting in Salt Lake City on June 21st.
Several key areas were discussed, but the ones that surfaced to the top were the identification of candidate Working Groups and also the nomination process for the Steering Committee.
The working group discussion revolved around identifying areas where the members felt further discussion was needed to ensure that the members' requirements were addressed without reinventing work that has already been completed. The following is a list of the candidate working groups; the ones with the highest priority will be launched at the F2F:
+ IaaS - Evolution from Infrastructure to Workload Management
+ PaaS - Landscape and Boundary
+ SaaS - Industry Vertical ( Retail, Finance, Telco and others)
+ Mapping the Cloud Space
+ Social Business Standards for Cloud
+ Practical Guide for Cloud Computing
+ Cloud Computing Reference Architecture
+ Cloud Computing for Media
+ Cloud Computing for Legal
+ Industry requirements for Cloud Computing
   - Financial Services
At the F2F meeting, the chairs of the respective WGs will be elected by the WG participants.
To maintain order as the CSCC evolves, a Steering Committee will be put in place, with its members drawn from the membership. Elections for the Steering Committee will be held on June 21st. To take a leadership role within the CSCC as a member of the Steering Committee, one must first be a member of the Council. The process is simple, and for qualified enterprises there is no fee. If you have any questions about the membership process, please contact Ken Berk, firstname.lastname@example.org or 1.781.444.1132 ext 150.
The membership application process can be initiated at www.cloud-council.org/application.
NOTE: Membership has tripled since the announcement in April and is now over 140 companies.
Once the membership process has been completed, one can submit their nomination to be a member of the steering committee by sending an email to email@example.com.
Look forward to your participation in the CSCC and adding your voice to Open Cloud Computing.
Comparison between DB2 Universal and DB2 Server_t images on v10.5 Fix Central
Introduction to IBM Support Fix Central
Below is the homepage for IBM Support Fix Central, which provides fixes and updates for all DB2 versions and releases.
Step 1: Go to the "Select product" tab and select the product, version, and platform for which you are looking for fixes and updates. Press Continue.
Step 2: If you are looking for all the available fix pack updates, select "Browse for fixes"; otherwise, select the relevant option for a specific APAR or fix ID.
Step 3: The resulting page shows all the fix pack images available for download for DB2 V10.5 Fix Pack 4 on the linuxamd64 platform. Select the required image and download it.
Comparison between Universal and Server_t image
Universal image: DB2-linuxx64-universal_fixpack-10.5.0.4-FP004
Server_t image: DB2-linuxx64-server_t-10.5.0.4-FP004
The DB2 server_t image above is used to update binaries in cases where the user has only one product installed. For example, the server_t image can be used to update products such as DB2 Server Edition, DB2 Connect Server Edition, DB2 Express Edition, DB2 Client, and DB2 Runtime Client. The universal image, by contrast, can be used to update installations where multiple DB2 products are installed.
A new video has been uploaded on securing business logic in DB2 by obfuscating DDL.
Simple yet powerful.
Conceal/mask/hide your critical business logic in the database by obfuscating DDLs.
Method 1: db2look with WRAP Option
Method 2: Function DBMS_DDL.CREATE_WRAPPED(string)
Method 3: Function DBMS_DDL.WRAP(string)
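As a quick sketch of the three methods (GET_BONUS here is just a placeholder function, and Methods 2 and 3 assume the DBMS_DDL module is available in your DB2 release):

```sql
-- Method 1: db2look with the -wrap option emits the DDL for existing
-- objects in obfuscated form (run from the OS command line):
--   db2look -d TEST -e -wrap -o wrapped_ddl.sql

-- Method 2: create the object directly in obfuscated form
CALL DBMS_DDL.CREATE_WRAPPED(
  'CREATE FUNCTION get_bonus(salary DECIMAL(9,2))
     RETURNS DECIMAL(9,2)
     RETURN salary * 0.10');

-- Method 3: return the obfuscated statement text without executing it
VALUES DBMS_DDL.WRAP(
  'CREATE FUNCTION get_bonus(salary DECIMAL(9,2))
     RETURNS DECIMAL(9,2)
     RETURN salary * 0.10');
```

In each case the resulting DDL contains the WRAPPED keyword followed by encoded content, so it can still be replayed against the database but is no longer human-readable.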
Concealing Business logic in Database - OBFUSCATE DDLs
Happy Learning & Sharing
Did you know your experiments on Bluemix could stand you a chance to win movie tickets?
You heard it right! All you need to do is build an app on Bluemix, share it on your social network, and wait to get lucky.
Dr. Angel Diaz, VP, IBM SWG Standards, is the thought leader behind an article on the activities to ensure that the standards being developed for the open cloud are driven by the user.
An easy read; here is the link to the article -
OpenStack - http://www.openstack.org/
Cloud Standards Customer Council - http://www.cloud-council.org/
TOSCA - https://www.oasis-open.org/committees/tc_home.php?wg_abbrev=tosca
Cloud computing has become all the rage in the IT industry and among businesses today. It is transforming the way organizations create software and applications, and the way they do business. The fundamental needs of focusing on core business, controlling IT expenditures, and adapting to a changing business ecosystem are driving companies to move to the cloud.
Establishing an in-house data warehousing and business intelligence (BI) environment is not a trivial task: organizations have to spend millions of dollars to procure hardware and software, and then spend months on installation, configuration, and optimization before they can actually start using these systems. On top of this comes the investment in resources for continuous administration and the periodic hardware upgrades needed to manage growth and keep the momentum going.
All the above factors combined make it very compelling for companies to shift some of their on-premise analytical data warehouse environments to the cloud. It simplifies and speeds up analytics without the need to deploy heavyweight infrastructure and teams. On-demand resource provisioning helps accommodate real-time workload surges without much manual intervention. Imagine having a plethora of compute capacity lying idle in server rooms for once- or twice-a-week reporting versus having it on the cloud and paying only for the required usage.
However, despite the innumerable benefits the cloud can provide, it still involves certain challenges that may make businesses wary of putting their data warehouses on the cloud:
What type of data can be put on public cloud?
How secure is it to put sensitive organizational data on a public cloud?
What volume of data can a cloud environment support? Loading huge amounts of data, which is very typical of a data warehouse, requires high bandwidth; how efficiently can a cloud handle that?
Performance of a virtual machine on cloud may not match that of a bare metal server.
This may impact the complex analytics being performed on a data warehouse.
What could be the impact on business due to loss in transaction latency arising out of communication over a network (large distance between datacenter and users and/or lower bandwidth), especially in financial world?
There are several vendors who now provide data warehouse and BI as a service but everyone may not be able to handle the complexity of a data warehouse and analytics ecosystem on cloud.
IBM’s BLU Acceleration on cloud is an offering that provides self-service BI and data warehousing on the cloud, using best-in-class security and other features to support even the most complex production environments. It is powered by IBM DB2 with BLU Acceleration, a next-generation in-memory technology. Columnar data processing and high compression rates, combined with enterprise-class BI and DW tools like Data Architect and Cognos, plus compatibility with R, help customers transform their data into insights at the speed of thought. Through BLU Acceleration on Cloud, even small organizations that could not afford to establish data warehouses earlier can now access one of the most advanced analytical environments and make the best of their data at a very low cost.
BLU Acceleration on cloud is available on IBM SoftLayer and Amazon Web Services (AWS). I am really excited to invite you to get hands-on experience with the technology through a free trial (in beta). Do let us know your feedback or any queries you may have on this forum.
Although I work for IBM, the views expressed are my own and not necessarily those of IBM and its affiliates.
Please refer to the below link for this blog content
IBM DB2 combined with BLU Acceleration on IBM Power Systems helps provide the game-changing innovation needed to make you a leader in analytics. We invite you to the webinar on DB2 BLU on October 2, 2014, where we will cover the benefits of our latest Cancun release of DB2 BLU and highlight the key points for customers and partners to consider when making database platform choices.
To participate, register at http://bit.ly/1x4Rthq