The architecture of IBM Cognos Dynamic Cubes is something that many find hard to conceptualize. Cognos Dynamic Cubes is really just an extension of Dynamic Query Mode (DQM), which in turn is part of the three-tiered Cognos BI architecture familiar to authors, modellers, and administrators alike. So why does it become difficult to stay on track as we delve into the Cognos BI stack tracing these extended features?
I find that the trick is to separate the concepts of Cognos Dynamic Cubes as a cubing engine and dynamic cubes as a data asset (or reporting data source). For those familiar with TM1, a similar distinction can be found between the TM1 server and the memory-resident cubes it hosts. In the case of Cognos Dynamic Cubes, the cubing engine is presented as the Query Service and the running cube instances reside in the associated Query Service JVM.
A further challenge for those familiar with the Cognos BI architecture is that unlike other reporting data sources accessed by Cognos BI, a dynamic cube can have multiple running instances which reside entirely inside the Cognos BI software stack. So, what is created when you publish a cube model and what are you connecting to when you open a dynamic cube package from one of the Cognos BI studios?
To answer this question I like to start with the physical components in a dynamic cube. By physical components I mean the actual bits and pieces that reside in the Content Store database. Extending my comparison with TM1, here I would be talking about the data files where the cube dimensions, rules, leaf level data, and so on required to load a TM1 cube into memory are stored. Coming back to Cognos Dynamic Cubes, the corresponding component is the cube model, but wait … there is more.
First, there is the published cube model. This object contains the cube specification (for example, relational metadata, dimension structures, measure mappings, in-database aggregate specifications, virtual cube specifications, security views, and so on). The published cube model does not contain any cube data and is not accessed directly by the reporting studios.
Next is the dynamic cube data source connection. This object contains the connection information required by packages to access the dynamic cube. It also has a property, known as the access account, which stores the credentials used to access the cube's underlying database. The dynamic cube data source connection is created automatically on publish; it cannot be constructed manually.
Finally there are the cube configurations. Cube configurations specify which dispatchers can host a running instance of the cube and include properties such as size limits for the cube caches.
So putting these pieces together:
A dynamic cube, defined in IBM Cognos Cube Designer, is published to the content store in the form of a published model and a data source connection (with an access account property).
Dynamic cube configurations are created when the cube is configured against each dispatcher.
The published model, access account, and cube configurations are used by Cognos Dynamic Cubes to build the various running instances of the cube in the appropriate Query Service JVMs and populate the caches of each cube instance from the underlying data source.
The single dynamic cube data source connection is referenced by Cognos BI packages thus making the running cube instances available to the Cognos BI studios.
To find out more about where the data in a cube comes from, how it gets there, and what is involved in managing your cubes, take a look at the new 2nd edition of the IBM Redbooks publication IBM Cognos Dynamic Cubes, SG24-8064-01. For a high-level overview of IBM Cognos Dynamic Cubes, see the IBM Redbooks Solution Guide Big Data Analytics with IBM Cognos Dynamic Cubes, REDP-5265.
As both a consultant and educator, I am often asked for help with the transition from the training room to real business processes. I have found the IBM Cognos Dynamic Cubes book an invaluable resource in bridging this gap and recently had the privilege of working with a team of Cognos experts from IBM to co-author this latest edition.
MaryAlice Campbell is a Senior Consultant and Business Analytics Technical Practice Leader at ISW, Australia. She has over 20 years of experience as a business analytics specialist. MaryAlice is an IBM Cognos BI veteran having gained experience with the early, pre-web versions of IBM Cognos PowerPlay and IBM Cognos Impromptu®; she contributed to beta and training programs, and worked with all subsequent releases. MaryAlice is also an IBM Certified Solution Developer, internationally recognized educator, and a Master Instructor of the IBM Analytics curriculum. MaryAlice is one of the authors of the IBM Redbooks publication IBM Cognos Dynamic Cubes, SG24-8064-01 and the IBM Redbooks Solution Guide Big Data Analytics with IBM Cognos Dynamic Cubes, REDP-5265.
It's been 25 years since the iconic 1980s movie Top Gun hit the big screen, but the message from the movie is still true today: "We have the need, the need for speed."
If you've never seen the movie, it's about Lt. Pete "Maverick" Mitchell, played by Tom Cruise, and his adventures to overcome his shortcomings as a fighter pilot at the Top Gun Fighter Tactics Instructor program.
Maverick was dangerous: he took chances, relied on his gut, made poor decisions, and "his ego wrote checks his body couldn't cash." (Sound like anyone in your organization?)
And most importantly, he didn't buy into the classic fighter pilot methodology, the OODA (Observe, Orient, Decide, Act) Loop, developed by U.S. Air Force Col. John Boyd, and taught at the real-life Top Gun.
Boyd's philosophy was simple: those who could process this loop and react to real-time events better and faster than their adversaries could anticipate their adversaries' thought processes and decision-making to gain the upper hand.
It's actually the same strategy that commercial and government organizations are applying today with IBM SPSS Decision Management technology.
The Need for Speed
Decision Management, through the combination of predictive analytics, business rules, and optimization, enables organizations to automatically deliver high-value, high-volume decisions at the point of customer impact. Watch a demo here.
Essentially, it gives organizations the ability to ensure optimal outcomes by injecting predictive analytics directly into business processes, such as cross-sell or up-sell marketing campaigns, reducing customer churn, or minimizing the impact of fraud.
Without the combination of analytics + rules + optimization to improve a business process, an organization can effortlessly increase the velocity of bad decisions. To paraphrase the movie: "I'll hit the brakes and the competition can fly right by."
(Watch the short video below of James discussing Decision Management and his new book.)
For example, if a high-value customer dials into the call center of a retail bank to complain about a product or service, IBM SPSS Decision Management may predict, based on the customer's past behaviors and interactions, that this individual is likely to churn.
The information about the current complaint, combined with the customer's history, can then be used to create a customized retention offer on the spot to prevent churn.
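To make the retention scenario concrete, here is a toy sketch in plain Python. This is not the SPSS Decision Management product or its API; the score thresholds, offer names, and inputs are invented purely to illustrate how a churn score and simple business rules might combine into a next-best action.

```python
# Toy sketch: combine a predictive churn score with business rules to pick
# a retention offer. Thresholds and offer names are illustrative only.

def choose_retention_offer(churn_score: float, customer_value: str,
                           has_open_complaint: bool) -> str:
    """Return a retention action for a call-center agent to present."""
    if churn_score >= 0.7 and customer_value == "high":
        # High-value customer likely to leave: make the strongest offer,
        # especially if an unresolved complaint triggered the call.
        return "premium_retention_offer" if has_open_complaint else "loyalty_discount"
    if churn_score >= 0.7:
        return "standard_retention_offer"
    # Low churn risk: treat the call as a cross-sell opportunity instead.
    return "cross_sell_offer"
```

For the banking scenario above, a high churn score for a high-value customer with an open complaint would produce the strongest retention offer on the spot.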
The bank has easily removed any blind spots that had kept it from making the right decisions, every time, with its customers.
And in an indirect way, it has turned the call center into a profit center, empowering employees to become an extension of the sales team rather than just "complaint takers."
Never Leave Your Wingman
By the end of the movie, Maverick had finally realized that by trusting not only himself but also the philosophy of the OODA Loop, he could be a successful fighter pilot.
In other words, Decision Management becomes any organization's ultimate wingman, giving it the confidence to make the right decisions, at the right time, with amazing speed and agility.
Do you have the need for speed?
James Taylor, CEO and principal analyst at Decision Management Solutions, talked with us at the IBM Information on Demand conference in Las Vegas about how Decision Management works, why it's so popular, how customers are using it and best practices to get started with this technology.
Predictive maintenance involves predicting when a component will fail or require service, before that failure happens. It goes well beyond traditional approaches, such as reactive maintenance (replace components after they fail) or scheduled maintenance (replace components on a pre-determined schedule). Instead a predictive maintenance solution uses historical data to predict when a failure is likely to occur, so you can take action and avoid costly downtime.
IBM has launched an industry-leading Predictive Maintenance and Quality solution, built on IBM's extensive product portfolio and real-world predictive maintenance experience. But what products are involved? And how does it work?
Here are the six steps that make up the IBM Predictive Maintenance and Quality solution.
1. Loading master data
Before you can start making predictions you'll need to load some master data into the analytic data store (the DB2 database central to the Predictive Maintenance and Quality solution). Master data includes a list of all the devices to be monitored by the solution.
Let's use a scenario to illustrate this. Consider a manufacturing firm that's having difficulty maintaining the power transformer that feeds electricity to their production lines. If they wanted to predictively maintain the power transformer they would need to provide master data about the transformer device itself.
You can edit master data using IBM Master Data Management, or simply use a spreadsheet application to edit the data directly. Either way, you'll export .csv files which are loaded into the analytic data store.
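As a rough illustration of the loading step, here is a short Python sketch. The column names below are invented; the actual master-data .csv layouts are defined by the Predictive Maintenance and Quality solution itself.

```python
import csv
import io

# Toy sketch: parse a master-data CSV describing monitored devices.
# Column names are illustrative, not the solution's real layout.

def load_master_data(csv_text: str) -> list[dict]:
    """Return one record per device from a master-data CSV export."""
    reader = csv.DictReader(io.StringIO(csv_text))
    return list(reader)

sample = "device_id,device_type,location\nT-100,power_transformer,line_1\n"
devices = load_master_data(sample)
```

In the transformer scenario, the firm would export one such record for the transformer device before any events can be associated with it.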
2. Loading and storing events
The second thing the analytic data store needs is to receive events from monitored devices. In the power transformer scenario, the device being monitored (the transformer) might provide events with several different observations such as temperature and current load measurements.
These events are received either in real time or batch. The Predictive Maintenance and Quality solution requires these events to be stored in a particular format. To achieve that, an IBM WebSphere Message Broker message flow transforms the message format sent by each device into a standard event format.
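The kind of transformation such a message flow performs can be sketched as follows. This is plain Python rather than a WebSphere Message Broker flow, and the field names are assumptions, not the actual Predictive Maintenance and Quality event schema.

```python
# Toy sketch: map a device-specific event payload into a single standard
# (device, time, observations) shape. Field names are illustrative only.

def to_standard_event(raw: dict) -> dict:
    """Normalize a transformer event into a standard event format."""
    return {
        "device_id": raw["id"],
        "timestamp": raw["ts"],
        "observations": [
            {"measurement_type": "temperature", "value": raw["temp_c"]},
            {"measurement_type": "load", "value": raw["load_amps"]},
        ],
    }

event = to_standard_event(
    {"id": "T-100", "ts": "2013-01-15T10:00:00", "temp_c": 87.5, "load_amps": 410}
)
```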
With the events delivered, they need to be stored in the analytic data store. The event processing flow records them into the DB2 database.
3. Performing aggregation
Events are aggregated into key performance indicators (KPIs) and profiles using measurement types and profile variables. A measurement type defines how to interpret a particular device reading (so a reading of "107" is understood to be a temperature reading and not something else). Profile variables designate a specific profile calculation that should be performed on the incoming data (for example to calculate the average temperature of the transformer and its current load).
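A profile calculation like "average temperature" can be sketched in a few lines. Real profile variables are configured in the solution; this toy Python function only illustrates the idea of rolling raw readings up into a KPI per measurement type.

```python
from collections import defaultdict

# Toy sketch of a profile calculation: average each measurement type
# across a batch of standard-format events.

def average_by_measurement(events: list[dict]) -> dict:
    totals = defaultdict(lambda: [0.0, 0])  # measurement -> [sum, count]
    for event in events:
        for obs in event["observations"]:
            entry = totals[obs["measurement_type"]]
            entry[0] += obs["value"]
            entry[1] += 1
    return {m: total / count for m, (total, count) in totals.items()}

kpis = average_by_measurement([
    {"observations": [{"measurement_type": "temperature", "value": 80.0}]},
    {"observations": [{"measurement_type": "temperature", "value": 90.0}]},
])
```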
4. Scoring

Scoring is where the magic starts to happen. Predictive models are created in IBM SPSS Modeler. These predictive models use historical data to determine the probability of certain future outcomes. For example, a model could be created based on historical data regarding transformer temperature, current load, and occurrences of failure. The score that is returned can be thought of as an estimate of the likelihood that the transformer will fail within a designated period of time, based on the most recent readings.
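As a stand-in for a trained model, here is a toy logistic function of temperature and load. The coefficients are made up for illustration; a real model would be trained on historical failure data in SPSS Modeler, not hand-written like this.

```python
import math

# Toy stand-in for a predictive model: a logistic function of temperature
# and load. Coefficients are invented for illustration only.

def failure_score(temp_c: float, load_amps: float) -> float:
    """Estimate the probability of failure in the next period (0.0 to 1.0)."""
    z = 0.08 * (temp_c - 75.0) + 0.01 * (load_amps - 400.0)
    return 1.0 / (1.0 + math.exp(-z))
```

Hotter, more heavily loaded readings produce a higher score, which is the shape of relationship a real model would learn from the failure history.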
5. Decision management
With scores calculated, it's time to start making decisions. With SPSS Decision Management, rules can be authored, tested, optimized, and deployed. For example the recommended action that results from the predictive score may be to perform a detailed on-site inspection to look for early signs of trouble. When the predictive score shows a particularly high probability of failure, the action may be to transfer the load to another device and shut down the transformer for a component-level inspection and possible repair.
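The score-to-action rules described above can be sketched as a simple threshold table. The thresholds and action names are assumptions for illustration; in practice such rules are authored and optimized in SPSS Decision Management.

```python
# Toy sketch of decision rules mapping a predictive score to a
# recommended maintenance action. Thresholds are illustrative only.

def recommend_action(failure_probability: float) -> str:
    if failure_probability >= 0.8:
        # Very likely to fail: take the device out of service.
        return "transfer_load_and_shut_down_for_inspection"
    if failure_probability >= 0.5:
        # Elevated risk: look for early signs of trouble.
        return "schedule_on_site_inspection"
    return "no_action"
```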
6. Dashboards and delivering recommendations
The communication of recommended actions (such as to perform an on-site inspection) can be accomplished by the creation of work orders in IBM Maximo. The accumulated KPIs and current profile values (such as the average temperature of the transformer) can be viewed in IBM Cognos Business Intelligence reports.
How many times did you use CICS Transaction Server this year? This week? Today? Unless you're already familiar with IBM's 43-year-old transaction server, you might be scratching your head and thinking "I've never used it!".
Have you had lunch yet? If so, did you pay with a debit or credit card? Then you've used CICS. Did you pay for lunch with cash instead? CICS entered your life then too, when you went to the ATM to withdraw the money.
And you're not alone. CICS Transaction Server handles a dizzying number of transactions every day. More than 30 billion transactions a day in fact (and at least three CICS customers are exceeding one billion transactions a day each). In the course of a week, those transactions are valued at over $1,000,000,000,000 (that's one trillion dollars). Every single week.
Almost every commercial electronic transaction that you make is processed by CICS. Consider the transactions involved in taking a business trip by train. You'll search for available travel times, book a train ticket, purchase travel insurance, and check in to a hotel room. Each one of those transactions needs to be completed quickly, securely, and reliably, and it's CICS Transaction Server that's behind them all.
CICS and System z: perfect partners
So what is CICS, and how is it still so relevant after 43 years? It's a transaction server that runs primarily on the IBM System z mainframe. System z is well known for its high availability, averaging about 5 minutes of downtime per year (by combining System z mainframes, that downtime is reduced to almost zero). Of the world's 25 biggest banks, all 25 use System z. A single System z mainframe is highly scalable: it can comfortably run over a thousand virtual Linux images on a single box. CICS Transaction Server is designed to take full advantage of the System z platform, controlling the interactions between applications and users.
CICS provides applications with an extensive range of system services, such as security and transactional integrity. Application programs written for CICS use an application programming interface (API) to request these CICS services. The CICS API is provided in multiple languages, from COBOL to Java. There are APIs for presentation services (for user interfaces), data services (for retrieving and updating data), and business services (for manipulating data).
Out with the old, in with the new
The real beauty of CICS, and a reason it is still going strong today, is the ability to separate and reuse business logic. CICS applications that were designed to work with a green screen 3270 terminal 20 years ago can be modernized to support web services today, without making changes to the original business logic of the application. CICS has remained current with changing middleware technologies, embracing HTTP web servers, Enterprise JavaBeans, Java adapters, and SOAP web services in recent years.
CICS and cloud computing
Today IBM announced a new version, CICS Transaction Server V5.1. This new release addresses over 100 customer requirements, a record for a new CICS release.
One of the improvements continues the CICS tradition of adopting emerging technologies: support for cloud computing. With cloud enablement, CICS provides operational efficiency and service agility.
Adopting CICS into your architecture
To learn more about CICS Transaction Server, and how application architects can incorporate the value of CICS into their business, take a look at the newly published IBM Redbooks publication Architect's Guide to CICS on System z.
Martin Keen is an IBM Redbooks Project Leader. He leads publications on many areas of IBM software, including WebSphere, Messaging, and Business Process Management. Follow Martin on Twitter at @MartinRTP.
Watson Natural Language Understanding (NLU) is a service composed of a collection of text analysis functions that derive semantic information from content. You can input text, HTML, or a public URL, and leverage sophisticated natural language processing techniques to get a quick high-level understanding of your content and obtain detailed insights.
With Natural Language Understanding (a new rendition of the Alchemy Language API, which has been deprecated), developers can analyze semantic features of text input, including categories, concepts, emotion, entities, keywords, metadata, relations, semantic roles, and sentiment. All developers have to do is call a REST API within their application.
But organizations keep asking "What can I do with the Natural Language Understanding service?" and "What are practical applications that can leverage the capabilities of this Watson API?".
Here is a list of use cases that can be implemented with the Natural Language Understanding service:
Social sentiment analysis
Have you ever seen those dashboards that depict how well (or how poorly) your brand or a certain product is doing in the market based on Facebook posts or Tweets? To build these dashboards, you need to understand the sentiment towards your brand or product conveyed in the text of those posts or Tweets. You can use the targeted sentiment feature in the Natural Language Understanding service to accomplish this goal.
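As a sketch of what such a request might look like, the snippet below builds a JSON body asking for sentiment toward specific targets. The brand names are invented, and this only constructs the payload; check the service documentation for the current analyze endpoint URL, version date, and authentication details before sending it.

```python
import json

# Toy sketch: build the JSON body for a targeted-sentiment analysis
# request. Target names are invented for illustration.

def targeted_sentiment_request(text: str, targets: list[str]) -> str:
    body = {
        "text": text,
        "features": {"sentiment": {"targets": targets}},
    }
    return json.dumps(body)

payload = targeted_sentiment_request(
    "I love AcmePhone but the AcmeTablet is disappointing.",
    ["AcmePhone", "AcmeTablet"],
)
```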
Document categorization
Do you have a bunch of documents to order and categorize according to their content? By using the Natural Language Understanding categories feature you don't have to read them at all. The service reads each document for you and returns the appropriate categories.
Keyword extraction
If you are training a chatbot by reading emails or web pages to figure out how the bot should respond, stop wasting your time. Use the Natural Language Understanding keywords feature to read through the emails or web pages for you and extract all the relevant keywords.
Emotion analysis
If your organization is working on a virtual assistant or already has one, the use of this NLU feature is a must. The emotion feature provides the ability to understand how your interlocutor is feeling. As the user's emotions change, so should your answers, in order to keep pace or to influence the user's mood in a positive direction.
To get started, see the following resources:
- Creating a Natural Language Understanding service in Bluemix
- An example use case with working code in GitHub
Enjoy and have fun using the Natural Language Understanding service!
Sebastian Vergara is an Expert Certified Architect in IBM Sales & Distribution, IBM Uruguay. His areas of expertise include cloud computing, DevOps, Design Thinking, and cognitive computing. He has over 8 years of experience in the IT industry. Sebastian led several projects to design and build cognitive solutions, such as the development of a transactional virtual assistant for an international bank and a cognitive chatbot for a major pharmaceutical company in Latin America that uses Watson Natural Language Classifier, Text to Speech, Natural Language Understanding, Visual Recognition, and other Watson technologies. Sebastian teaches at the Engineering College in the Universidad de la República Uruguay (UdelaR) where he introduces students to architecture and design, integration, cloud computing, and trending technologies.
This blog post is written by Jackie Zhu and Ed Stonesifer.
IBM Content Manager OnDemand (CMOD) provides an enterprise-wide report management solution for computer-generated reports and many other types of content. The most recent release of CMOD added key features to help companies meet compliance requirements:
Supports individual document holds through the new Enhanced Retention Management feature
Supports integration with IBM Enterprise Records to provide full records management functions.
Enhanced Retention Management
The new Enhanced Retention Management feature is one of the most valuable features added to CMOD. It provides an immediate way to find and hold documents, preventing the normal document deletion process. It enables you to lock down individual documents within a report that is managed by time-based retention.
This is critical because any company might be forced to go through a legal inquiry. When that happens, all documents related to the inquiry must be retained and not deleted during the inquiry process.
With the new Enhanced Retention Management feature, CMOD provides document retention management in one or more of the following ways:
Time-based retention at the application group level.
Efficient document deletions for different media types.
Native support for putting individual documents on hold.
Integration with IBM Enterprise Records.
Applying hold on documents
There are many ways to hold documents captured and managed by CMOD. One way is to first search for the documents and select them; then, click the Action drop-down box and select Apply Hold to hold the documents. See the figure below. The user interface used for CMOD in this example is powered by IBM Content Navigator. When putting documents on hold, you also specify the hold reason.
Things to know about CMOD holds
You can put documents on hold through a CMOD Windows or web client.
Documents can be held based on different reasons, for example, a legal investigation.
Holds prevent documents from being expired or deleted, but they do not change or manage the document expiration policy.
You can put a large number of documents on hold at once.
A document can be put in multiple holds for multiple reasons. Only when all the holds on a document are removed can the document go through the normal expiration process.
Implied hold enables management of document retention by an external system.
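The multiple-holds rule above can be sketched as a tiny predicate. This is plain Python illustrating the semantics, not an actual CMOD API: a document with any remaining hold is exempt from deletion, and once every hold is removed, normal time-based expiration applies again.

```python
# Toy sketch of CMOD hold semantics: a document may be deleted only when
# its retention period has elapsed AND no holds remain on it.

def can_expire(holds: set, retention_period_elapsed: bool) -> bool:
    """Return True when the document is eligible for normal expiration."""
    return retention_period_elapsed and len(holds) == 0
```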
For Content Manager OnDemand related blog posts, see:
Servando Varela is the Program Director for ECM Mobile Applications and has over 15 years of experience working in the Enterprise Content Management industry. Servando first started in 2000, where he worked in the FileNet R&D group for the FileNet P8 platform. In 2008, he started working in the IBM ECM Product Management team. In his current role, Program Director, Marketing and Strategy for ECM Mobile Applications, Servando manages the ECM Mobile offerings, as ECM Mobile became a stronger imperative for IBM and IBM's customers. Servando has an MBA from Pepperdine University, and a B.S. degree in Computer Science and Engineering from California State University Long Beach.
In the industry today, "App" is a shortening of the term "Application Software". It is the de facto standard term used to describe mobile software. The word became popular after the 2008 release of the iTunes App Store, Apple's purchasing and installation mechanism for iOS. Google followed shortly after with the Google Play Store.
Mobile apps have an implied simplicity due to the constraints of mobile devices, namely reduced screen real estate, limited battery life, constrained input methods, and unreliable connectivity.
To be successful, mobile apps must aid the normal mobile device user ("normobs") in dealing with mobile device constraints. This focus on user experience (UX) gave way to a revolution in software design. Normal mobile users care more about convenience and ease of use than checklists of software features.
The mobile strategy for IBM ECM is threefold:
First, the strategy is to provide apps for the key offerings that fully complement the core solutions. For example, IBM provides a Case Manager Mobile application that works in tandem with the IBM Case Manager product and complements the desktop experience. It does not replace the desktop experience. It is similar to how business people use email today; we have a desktop where we can sit and write complicated emails, attach documents and do that with great ease when we are at our desk. However, we can also be mobile and read email, monitor for important emails we are expecting, and even respond to email if necessary. This requires that our mobile apps be highly targeted and the UX be specially focused on the intended use of the app. For example, if it is a Case Management app, then the app should be totally focused on that function and we should not cram other non-case management related functionality into the app.
Secondly, the IBM ECM mobile strategy requires that IBM deliver the best possible mobile experience for the normobs and this usually translates into delivering fully native applications for the most relevant mobile operating systems, which today are iOS and Android. This way iOS or Android normobs will easily adapt to any of the IBM ECM mobile apps, because the controls will feel second nature to them and the adoption of the app will be frictionless.
Lastly, the strategy calls for delivering reusable and extensible libraries that customers, business partners, and any mobile developer can use to easily integrate IBM ECM mobile functionality into their own custom applications. Delivering this type of value is key for offerings where IBM customers have requirements to produce bespoke mobile apps.
IBM ECM has three mobile offerings that complement the key parts of the ECM portfolio, and these are:
IBM Navigator Mobile: the UX for this app is primarily about providing users easy access to their content stored in the FileNet Content Management system. This app is currently available for iOS. It includes the ability to browse, view, search, retrieve, add, delete, and edit content that is stored in the repository. The app also provides the ability to view and add comments on documents, participate in Teamspaces, and access documents offline. This app integrates with IBM Content Navigator.
IBM Case Manager Mobile: the UX for this app is primarily focused on the Case Manager case worker who needs to have access to case information while on the road and needs to be able to update and complete tasks. Much of the functionality in this app is centered on case task execution, and it includes mapping capabilities so that case workers can see where their tasks lie on the map relative to their current position. This app integrates with the IBM Case Manager server, and any solution designed in IBM Case Manager can be leveraged by this mobile app.
IBM Datacap Mobile: the UX for this app is primarily centered on capturing or imaging documents using mobile devices. This app includes advanced imaging capabilities, such as automatic edge detection, auto capturing of images, the ability to add metadata to documents for easier indexing, on-device optical character recognition (the ability to automatically read text), and on-device barcode decoding (to populate data fields automatically), and it integrates with the IBM Datacap server. Any solution defined in IBM Datacap can be leveraged by the mobile app.
It is worthwhile noting that the current state of these apps is not their final state. IBM continues to enhance these apps.
After the conference, you can find archived content here:
Video on Demand from Information On Demand 2010: http://www.livestream.com/ibmsoftware/folder?dirId=add716e6-047a-4e49-a94f-adc1148de8c8
View all of our video content, at your leisure. Videos updated daily. http://youtube.com/iodgc
APAR PI62566 delivers significant new functionality to the LOAD utility: the ability to specify constants during the LOAD process. The constant can be a character string, an integer, null, or a date/timestamp.
The following examples show how to use CONSTANT in a LOAD job for a character field and two date/time related fields:
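Since the examples did not survive in this copy, here is a hedged sketch of what such field specifications might look like. The table and column names are invented, and the exact CONSTANT syntax should be verified against the PI62566 APAR text and the DB2 LOAD utility documentation for your DB2 level.

```sql
--  Illustrative only: invented table/column names, syntax per PI62566
LOAD DATA RESUME YES
  INTO TABLE MY.ORDERS
  ( ORDER_ID    POSITION(1:10)  CHAR(10)
  , SOURCE_SYS  CONSTANT 'BATCH_FEED'        -- character string constant
  , LOAD_DATE   CONSTANT CURRENT_DATE        -- date at the time of the LOAD
  , LOAD_TS     CONSTANT CURRENT_TIMESTAMP   -- timestamp at the time of the LOAD
  )
```

Here the character field receives a fixed string on every loaded row, while the two date/time fields receive the date and timestamp current when the LOAD runs.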
When talking about IT infrastructure nowadays, you have to think about high availability as well, especially when you work in a sector where data security and integrity are of very high importance, such as banking and insurance. Legal restrictions and competitive pressures force every business to think about how to keep their data safe. Therefore, high availability is not just a trendy word, but essential to you and your clients. Think about it: high availability not only keeps things running, it can also help you avoid spending huge amounts of time and money on disaster recovery.
High availability means that even if something happens, you and your data are still safe. To achieve exactly that, you not only need a system that notices a failure in an instant, but one that also reacts quickly and makes sure your data won't be lost. This is where the IBM MQ Appliance comes in.
The IBM MQ Appliance provides you with the opportunity to bring the messaging you know and the high availability you need together in one place. For that purpose, a pair of MQ appliances is deployed in a high availability group, which also contains high availability (HA) queue managers. While the HA queue managers take care of your messaging traffic, the appliances take care of the HA queue managers. The appliances do that by communicating with each other: if one appliance realizes the other one is not responding, it takes over, making sure you don't have to worry about data loss.
As you can see, with the IBM MQ Appliance a new high availability configuration is now possible, with a pair of appliances that mirror messages and enable seamless takeover upon failure of an identified primary appliance. This solution includes failure detection for hardware and software problems. In addition, it supports manual failover for rolling upgrades, and it is easier to set up than other high availability solutions because, among other things, shared file systems and disks are no longer needed. To ensure 100% fidelity, replication is synchronous over Ethernet.
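The failure-detection idea can be sketched very simply: each appliance tracks the last heartbeat from its partner and takes over the HA queue managers when the partner goes silent for too long. The function below is a toy illustration in Python; the timeout value and names are invented and do not reflect the appliance's actual internals.

```python
# Toy sketch of HA failure detection between a pair of appliances:
# take over when the partner has missed its heartbeat window.
# The timeout value is illustrative only.

def should_take_over(now: float, last_partner_heartbeat: float,
                     timeout_seconds: float = 10.0) -> bool:
    """Return True when the partner appliance appears to have failed."""
    return (now - last_partner_heartbeat) > timeout_seconds
```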
Doesn't that sound promising? What scenarios can you imagine that would leverage the IBM MQ Appliance's high availability? Do you want to know more about high availability on the IBM MQ Appliance? Read the IBM Redbooks publication:
Hello all. I'm Bryan Casey, and as a market manager within the IBM Security Solutions organization I have the privilege of working with some of the smartest folks in the industry. It's an exciting time of the year to be in the security business, with RSA happening this week and Pulse coming up at the end of the month. But before we all head down to Las Vegas, let's first take a quick stop in San Francisco.
RSA has been dubbed the conference where "the world talks security", and this year the world has been talking a lot about IBM. With new product announcements, several IBM-led speaking sessions, client and business partner events, and the SC Magazine Awards, IBM's agenda has been full and exciting. While RSA is coming to a close, it's worth noting some of the key conference highlights.
For the past 14 years SC Magazine has honored the achievements of the people and organizations that have made significant contributions within the field of IT security. With over 650 annual nominations across a variety of categories, these awards take a very comprehensive view of the security industry. This year, IBM Tivoli Identity and Access Assurance was selected as the Best Identity Management Application. This is a great win for IBM because the dedication to providing innovative security solutions as they relate to people and identity is one of the core elements of our overall strategy and of the IBM Security Framework.
The framework represents the way we help organizations approach security from a holistic, business-driven perspective, and IBM Tivoli Identity and Access Assurance was recognized as a leader for its ability to help organizations achieve those results. It gives the right users the right access and information, at the right time.
"It provides identity management from on-boarding users and assigning appropriate access rights, to changing user roles and modifying privileges, to terminating user access rights at the end of the user lifecycle. Access management provides secure authentication of users, including unified SSO (enterprise, web, federated), and enforces access policies once the user has been authenticated."
- SC Magazine
IBM Tivoli Identity and Access Assurance additionally reduces costs and headaches by automating several of the key processes associated with compliance and audit. To read more, see the SC Magazine article by clicking here.
It was not the only IBM Security product that was recognized. IBM Tivoli Access Manager for Enterprise Single Sign-On was a finalist for the Best Multifactor award, and IBM Security Network IPS was a finalist for the Best Web Application Firewall award.
In my role as a Mobile Solution Architect I have spoken at length with several customers about the need to tie existing business processes in with their mobile strategy. There is a wealth of value in this approach because it enables organizations to enhance the customer interaction while streamlining the overall process.
In this blog I discuss the value of combining IBM Smarter Process and IBM MobileFirst initiatives to achieve an IBM Mobile Smarter Process. To be clear, a Smarter Process is simply a business process that is invoked in response to an event (for example, a loan application or an insurance claim). The IBM MobileFirst platform enables an enterprise to support a complete mobile strategy.
Mobile Smarter Process helps customers to reinvent how their business executes by exploiting mobile technologies. The goal is to fundamentally change how an organization does business by integrating Mobile and processes. It enables a radical simplification of every customer interaction and leverages new mobile contextual opportunities for customer engagement.
Mobile provides the channel and the functionality. Process provides the rigor and the discipline. Organizations should identify the mobile interactions that add value to their processes and then redesign or reinvent those processes to harness the mobile interactions. Mobile creates numerous timely interactions that can be harnessed for goal-oriented results with Smarter Process.
Harnessing Mobile interactions with Smarter Process
By definition, IBM Mobile Smarter Process enables organizations to take advantage of mobile technologies and embed those very technologies directly into the enterprise. It promotes radical simplification of every interaction, allows organizations to use new mobile contextual opportunities for customer engagement and personalization, and it allows the clients to interact with the organization when, where, how and as much as they want to.
Additionally, IBM Mobile Smarter Process creates a "never before seen" level of personalization between the organization and the client. It allows organizations to empower their employees. No longer are employees tied to their desks in an office; now they have the ability to work more closely with their clients, make improvements, and come up with new innovations to drive greater value and efficiencies. Lastly, it allows organizations to establish their ecosystem, where new partners, new sensors, and new data lead to a completely new set of experiences and possibilities.
IBM Mobile Smarter Process is a true game changer, and it's ready for primetime now! The IBM Redbooks publication Extending IBM Business Process Manager to the Mobile Enterprise with IBM Worklight outlines an approach that organizations can use to identify where within the organization mobile technologies can offer the greatest benefits. This book discusses architectural patterns for exposing business processes to mobile environments and it includes practical implementation examples.
Steve Mirman is a Senior Certified IT Architect focusing on IBM Worklight and the MobileFirst portfolio. He has over 15 years of IT experience as an application developer, solution architect, and product specialist. In his current role, Steve leads proof of concept implementations and consults with customers throughout North America on mobile strategy, architecture, and integration. He holds numerous IT certifications and has authored several technical articles. Steve is a co-author of the IBM Redbooks publication Extending IBM Business Process Manager to the Mobile Enterprise with IBM Worklight.
In my job as an IBM Mobile Solution Architect in Europe, I frequently get in touch with organizations that are interested in the IBM MobileFirst strategy and that have a certain footprint of business processes supported by SAP systems. These clients want to leverage the strength of IBM Mobile solutions without disrupting their existing investments in the SAP domain.
Since the first day of my 17-year career with IBM, I have always had some affinity with SAP solutions. To make a long story short, there is one thing common to the many jobs and roles I have had: ensuring that clients design and implement the optimal solution for their business needs by integrating SAP implementations with IBM enterprise software.
Today I work in the IBM Mobile consulting practice in Europe and I love talking to customers that are designing their mobile strategy and introducing mobile capabilities into their business processes. Naturally, I gravitate towards projects that have SAP heritage in their landscape. It is like speaking a special IT dialect that translates IBM software value proposition for the SAP-minded clients.
To help clients and consultants to design heterogeneous solutions, IBM created the IBM Reference Architecture for SAP, a prescriptive blueprint for using IBM software in SAP solutions. The IBM® Redbooks® publication IBM Software for SAP Solutions explains the value of integrating IBM software with SAP solutions. It describes how to enhance and extend pre-built capabilities in SAP software with best-in-class IBM enterprise software, enabling clients to maximize ROI in their SAP investment and achieve a balanced enterprise architecture approach. I'm the author of the mobile chapter in this book.
With the rising popularity of mobile solutions, we also saw a need to provide more technical details and implementation scenarios that are common in the IBM and SAP mobile solutions domain. For that reason we developed the IBM Redpaper publication Extending SAP Solutions to the Mobile Enterprise with IBM MobileFirst Platform Foundation, which I authored with the support of colleagues in my consulting practice and in the IBM development teams. Working on this paper afforded me the unique opportunity to develop reusable assets that can be used as a base when implementing mobile solutions and to share the knowledge I gained by documenting the following usage scenarios:
Mobile application authentication with SAP ERP systems
Mobile application single sign-on (SSO) using the IBM MobileFirst Cast Iron adapter as the integration component.
Mobile application single sign-on (SSO) using the IBM MobileFirst SAP NetWeaver Gateway adapter as the integration component.
Offline SAP business data storage and synchronization in mobile scenarios.
In this paper, we made some assumptions for the implementation of the scenarios, which might differ for a particular client environment, but I tried to cover many combinations in the paper's chapters so that you can understand the basic concepts and customize the examples to your particular needs.
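The offline storage and synchronization scenario from the list above boils down to a store-and-forward pattern: changes made while disconnected are queued locally and pushed to the backend when connectivity returns. A minimal conceptual sketch of that pattern follows (invented names, not MobileFirst or SAP API code):

```python
# Store-and-forward sketch of offline data synchronization: while the
# device is offline, changes queue locally; when connectivity returns,
# the queue is drained to the backend in order. Names are invented for
# illustration; this is not MobileFirst or SAP API code.

class OfflineStore:
    def __init__(self, backend):
        self.backend = backend      # callable that "sends" to the server
        self.pending = []           # local queue of unsynchronized changes
        self.online = False

    def save(self, record):
        if self.online:
            self.backend(record)
        else:
            self.pending.append(record)   # keep working while disconnected

    def reconnect(self):
        self.online = True
        while self.pending:               # drain the queue in original order
            self.backend(self.pending.pop(0))

received = []                             # stands in for the SAP backend
store = OfflineStore(backend=received.append)
store.save({"order": 1})                  # offline: queued locally
store.save({"order": 2})
store.reconnect()                         # online: queue drained
print(received)
```

A real implementation also has to handle conflicts and retries, but the core value to the mobile user is the same: work continues uninterrupted while the device is offline.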
If you have further questions about IBM and SAP integration topics, especially when it comes to mobile solutions, feel free to drop me an e-mail. I'm eager to get in touch with you and discuss your specific business challenges.
Khirallah Birkler is a Certified IT Specialist and IBM/SAP Mobile Solution Architect in the IBM Software division, IBM Germany. Khirallah helps clients to identify requirements, design, and architect mobile solutions based on the IBM MobileFirst portfolio. A special focus of his work is the integration of enterprise data into mobile scenarios, which is always a key requirement for enterprise customers. Khirallah holds a Bachelor of Science degree in Information Technology Management from the University of Cooperative Education, Stuttgart. Khirallah is the author of the IBM Redpaper publication Extending SAP Solutions to the Mobile Enterprise with IBM MobileFirst Platform Foundation. He can be reached at BIRKLER@de.ibm.com.
In my various roles as an IBM level 2 support engineer and an IBM Added Value Specialist, I spend much of my time working out why things don't work in quite the way we expect them to.
I have recently published two IBM Redbooks web docs to aid anyone who is taking advantage of the exciting ability to integrate a Box repository environment with IBM Content Navigator.
The document How to Successfully Integrate IBM Content Navigator with Box walks you through the technical integration process of configuring the Box repository and configuring IBM Content Navigator. It includes step-by-step configuration instructions and screen shot examples for many of the available options.
Example Box repository configuration in IBM Content Navigator
These publications highlight some prerequisites and essential configuration details that may not be immediately obvious from the current documentation on this topic.
This guidance primarily relates to IBM Content Navigator version 2.0.3, but it can also be used with version 3.0.
Steve Cleasby is an IBM ECM Software Engineer currently working in the IBM Analytics platform level 2 (L2) support team. He specializes in technical support of the FileNet P8 product set. Steve has worked in the IT industry for over 30 years and spent 11 years with FileNet UK Ltd as a technical support consultant and has continued to work with the product after FileNet was acquired by IBM in 2007. Steve is a highly regarded senior member of the IBM ECM L2 support team. He is also an Added Value Specialist for two major global IBM ECM clients as part of the IBM Added Value Program, providing knowledge gained from years of experience in IBM ECM proof of concept projects and subsequent deployments to production.
We all know about MQ as middleware software - and now there is an appliance as well. But why? And what are the benefits? Let's take a look at current business challenges to answer these questions.
First of all, businesses are becoming not only more and more complex, but more and more geographically distributed as well. Messaging infrastructures face increasing demand, and seamless connectivity is not only expected but essential for successful business operations. That said, it becomes obvious that messaging has to perform at the highest levels. At the same time, organizations have to think about cost efficiency. So here is the dilemma: costs have to be kept under control while today's businesses have to deal with maintaining and running many messaging servers, with longer configuration times and higher maintenance charges. Additionally, the mixture of different hardware and operating system levels makes this an even more complicated task.
The answer to this dilemma is also the answer to the question we started with: let the IBM MQ Appliance help you. The IBM MQ Appliance M2000 provides an optimized version of IBM MQ V8 that runs on a hardware appliance. That way it offers rapid deployment of enterprise messaging and simpler administration, both in day-to-day operations and when applying maintenance. To make things even easier for your clients, the appliance can be deployed remotely, which means you neither have to do the setup locally nor need in-house experts. The IBM MQ Appliance provides you with even more benefits, such as:
Seamless MQ integration: The IBM MQ Appliance can easily be integrated with existing MQ environments
Simplicity, high availability and built-in-features: Paired connectivity to another appliance offers a simpler and more robust solution without external dependencies like shared file systems or disks
A new MQ console: Browser-based user interface, personalized monitoring and configuration
Reduced total cost of ownership (TCO): Decreased data center space, power costs and management burden by using fewer appliances to do more work
Simpler maintenance: Fix packs delivered as certified firmware updates onto a locked-down appliance reduce dependencies and simplify maintenance tasks
Maximum performance with built-in components: pre-optimized for the hardware with no tuning or additional configuration needed
Overall, it is safe to say that the IBM MQ Appliance's convenience, fast time to value, and low total cost of ownership make it an ideal solution to meet your clients' needs, whether you use it as a messaging hub running queue managers accessed by clients or to extend MQ connectivity to a remote location.
Knowing all this, how do you think your clients can benefit from an IBM MQ Appliance? Or do you already have some experience with it? Please share! And to find out more, read the IBM Redbooks publication:
Hassi Norlen is an advisory engineer in information development. He started his career at IBM a dozen years ago in the Enterprise Content Management (ECM) field, and has since migrated to database management and monitoring software solutions. His areas of expertise are up and running documentation and user interface development with the progressive disclosure methodology. Hassi holds degrees in physics and science journalism and works out of the IBM Washington, DC office.
Before making changes to your production database environment, it is vital to test and validate that these changes will not affect the accessibility and reliability of your databases. These tests will differ depending on the type of change that you will be implementing, such as software upgrades, database schema changes, or hardware upgrades, but common to any testing scenario is verifying that your current SQL workloads will work as expected in production after the change.
IBM InfoSphere Workload Replay is a versatile tool that lets you record the actual SQL workload in your production environment, and then analyze how this captured workload improves or degrades when replayed in a pre-production environment with the planned production changes implemented. With InfoSphere Workload Replay you do not have to create and run elaborate and error-prone test scripts to complete your database testing; instead, you use actual SQL data from the same production environment that you are upgrading.
The workload capture, replay, and reporting process consists of four main steps: Capture, Transform, Replay, and Report. Each step lets you configure how the data is captured, and then modified to fit your pre-production environment. For example, you can filter out workload data that is not relevant to your testing, capture and then replay LOB and XML data as actual data or as generated data of the correct length, map captured user IDs and schemas to pre-production user IDs and schemas as needed, replay workloads at different speeds to simulate different traffic loads, group and mask SQL statements in your reports, and so on.
With InfoSphere Workload Replay you set up a baseline workload to use in the pre-production environment, and then replay this workload after making changes to your pre-production environment.
After analyzing the effects of the changes you can remove SQL data that is not important to your workload, and then test the replay with this modified data. You can also fine-tune the changes to pre-production and run the workload again in an iterative test cycle.
When you are confident that your workload behaves correctly with the environment changes in pre-production you can apply the changes in production.
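The capture, transform, replay, and report cycle described above can be pictured as a simple data pipeline. The function and field names below are hypothetical illustrations of the workflow, not the InfoSphere Workload Replay API:

```python
# Hypothetical sketch of the four-step test cycle: capture production SQL,
# transform it for pre-production, replay it, and report differences.
# All names and structures here are illustrative, not the product's API.

def capture(production_events, keep=lambda e: True):
    """Record the SQL workload, optionally filtering out irrelevant events."""
    return [e for e in production_events if keep(e)]

def transform(workload, user_map=None):
    """Map captured user IDs to their pre-production equivalents."""
    user_map = user_map or {}
    return [dict(e, user=user_map.get(e["user"], e["user"])) for e in workload]

def replay(workload, speed=1.0):
    """Re-run the workload; 'speed' simulates different traffic loads."""
    return [{"sql": e["sql"], "user": e["user"],
             "elapsed": e["elapsed"] / speed} for e in workload]

def report(baseline, rerun):
    """Compare elapsed times between the baseline and the replayed run."""
    return [{"sql": b["sql"], "delta": r["elapsed"] - b["elapsed"]}
            for b, r in zip(baseline, rerun)]

events = [
    {"sql": "SELECT * FROM orders", "user": "PRODUSER", "elapsed": 1.2},
    {"sql": "SET CURRENT SCHEMA X", "user": "MONITOR", "elapsed": 0.1},
]
# Filter out monitoring noise, remap users, then compare two replay speeds.
workload = capture(events, keep=lambda e: e["user"] != "MONITOR")
workload = transform(workload, user_map={"PRODUSER": "TESTUSER"})
baseline = replay(workload, speed=1.0)
faster = replay(workload, speed=2.0)
print(report(baseline, faster))
```

The point of the sketch is the iteration: each step is independently configurable, so you can tighten the filter, change the mapping, or vary the replay speed and run the cycle again.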
Systems of Insight provides the ability to apply analytics and rules to real-time data as it flows within, throughout and beyond the enterprise to gain the desired insight. IBM Operational Decision Manager Advanced -- along with complementary IBM software offerings -- provides a way to deliver these systems of insight to deliver the greatest value to your customers and your business.
This residency will produce a book about how Systems of Insight can help companies improve how they identify opportunities, respond to market demands, minimize risks, and operate consistently.
To see the details about the residency, click here.
If you have an IBM Db2 database, chances are it might be using automatic storage table spaces. With Db2 automatic storage table spaces, storage is managed automatically by Db2. Because Db2 controls the table space containers, users cannot configure the containers, making them tricky to manage.
If you want to learn more about automatic storage table spaces, then the IBM Redpaper IBM Db2: Investigating Automatic Storage Table Spaces and Data Skew is for you! This paper will guide you through the process of understanding their behavior. Understanding automatic storage table space characteristics will increase your Db2 problem determination skills and general Db2 knowledge.
This IBM Redpaper will step you through:
Automatic storage table space concepts
Identifying automatic storage table spaces
Data skew and physical data imbalance concepts
Determining which automatic storage table spaces might have data imbalance
Table space maps
Db2 restore utility
Problem determination for automatic storage table space "gotchas": common problems and their solutions
The scope of this paper is limited to Db2 for Linux, UNIX, and Windows operating systems. This document will give the reader an understanding of automatic storage table spaces and how to manage them.
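To make the data-skew idea from the list above concrete, here is a small sketch of one way imbalance across table space containers might be quantified: the percentage by which the fullest container exceeds the average. This illustrates the concept only; it is not output from, or the formula used by, any Db2 utility, and the page counts are invented:

```python
# Conceptual illustration of data skew: how unevenly data pages are
# spread across the containers of a table space. Not a Db2 tool, and
# the page counts below are invented example numbers.

def skew_percent(pages_per_container):
    """Percentage by which the fullest container exceeds the average."""
    avg = sum(pages_per_container) / len(pages_per_container)
    return (max(pages_per_container) - avg) / avg * 100

balanced = [1000, 1005, 998, 1002]   # pages roughly even: healthy
skewed = [4000, 500, 450, 480]       # one container holds most of the data

print(f"balanced skew: {skew_percent(balanced):.1f}%")
print(f"skewed skew:   {skew_percent(skewed):.1f}%")
```

A perfectly even spread yields 0%, while a table space where one container dominates shows a large percentage, which is the kind of physical imbalance the paper teaches you to detect and correct.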
George Wangelien is a Senior Db2 UDB DBA for IBM Hybrid Cloud. George has over 22 years of Db2 for Linux, UNIX, and Windows experience at IBM. His interests include consulting, problem determination, high availability solutions, proof of concepts, and writing ksh scripts. His areas of expertise are production Db2 for Linux, UNIX, and Windows support and also the administration of SAP Db2 databases. George holds numerous IT and Db2 certifications, a Db2 patent and a Bachelor’s degree from Montclair State College, New Jersey, USA. George co-authored the IBM Redpaper IBM Db2: Investigating Automatic Storage Table Spaces and Data Skew.
Marcel Kostal is an ECM Solution Consultant with IBM Software Group. He has more than 12 years of experience in designing, developing, and delivering J2EE solutions. For the past five years, he has focused on ECM solutions, primarily in the banking sector. His areas of expertise include solution architecture, application design, implementation, integration, and technical enablement with the IBM FileNet P8 product suite.
IBM Content Navigator (ICN) is no longer just a unified and cool user interface for your enterprise content management (ECM) solution that allows you to extend its functionality via plug-ins. No, no -- it is more than that! It is an application framework that provides tools and interfaces that allow you to build customized user interfaces and applications to deliver value and an intelligent, business-centric experience. Huh? If you do not understand the last sentence, don't worry. Just keep reading. I will explain, and hopefully then you will understand.
First, let's take a look at IBM Content Navigator architecture, to understand the available development options and ICN interfaces.
IBM Content Navigator architecture
Here it is….
I wish the picture were self-explanatory so that I would not need to write anything more, but adding all the information into one static picture can be quite confusing and messy.
If you remove the colors from the picture, an important thing becomes clear -- the layers:
Two layers (presentation and data access layers) are on the client side. They run in your browser.
Two layers (Service and backend systems layers) are on the server side.
The mid-tier services layer provides the REST services for accessing the underlying repository, the configuration database, or your own custom backend system. Although this interface is not public for you to use directly, you can extend it with your own services, or modify and replace existing services by intercepting the request and response messages.
Now, if you put the colors back in the picture, you see the yellow boxes spread around it. They highlight the places where you can implement your own code and create new applications or customize the existing front-end application.
Let's look at them in more detail:
1. Configuring ICN
If you are searching for the yellow box for this option in the picture, stop! You will not find it. I added this option because you can customize the existing front-end application simply by configuring ICN, and it is the easiest way to change the appearance of ICN. Describing all the configuration options is beyond the scope of this blog. Make sure you understand them before you start implementing something that you could simply configure without doing any coding.
2. Implementing the EDS interface
This allows you to change the behavior of property edit fields on several screens in a lightweight manner, without the need to understand Java or implement an ICN plug-in. EDS is not a golden hammer! It is designed to cover just a few specific use cases, so do not overuse it. Typical EDS use cases include:
Pre-fill properties with values
Look up the choice list values for a property or dependent property
Set minimum and maximum property values
Set property status or controls, such as read-only, required or hidden
Implement property validation and error checking
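As a conceptual illustration of the kinds of callbacks the list above describes (pre-filling values, restricting ranges, required fields, validation), here is a small language-neutral sketch. The property names and the dict structure are invented; a real EDS implementation is a service that exchanges JSON requests and responses with ICN:

```python
# Conceptual sketch of EDS-style property handling: pre-fill a value,
# constrain a range, mark a field required, and validate input.
# Property names and the dict structure are invented for illustration;
# a real EDS is a service exchanging JSON with IBM Content Navigator.

def describe_properties(doc_class):
    """Return per-property behavior, as an EDS would for an object type."""
    if doc_class == "Invoice":
        return {
            "Department": {"prefill": "Finance"},                 # pre-fill
            "Amount": {"min": 0, "max": 1_000_000},               # range
            "ApproverId": {"required": True, "readOnly": False},  # status
        }
    return {}

def validate(doc_class, values):
    """Return a list of error messages, as an EDS would on an update."""
    errors = []
    for name, rule in describe_properties(doc_class).items():
        v = values.get(name)
        if rule.get("required") and v in (None, ""):
            errors.append(f"{name} is required")
        if v is not None and "min" in rule and v < rule["min"]:
            errors.append(f"{name} below minimum {rule['min']}")
        if v is not None and "max" in rule and v > rule["max"]:
            errors.append(f"{name} above maximum {rule['max']}")
    return errors

print(validate("Invoice", {"Amount": -5, "ApproverId": ""}))
```

Everything here is declarative data plus a small validation pass, which is exactly why EDS stays lightweight compared to a full plug-in.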
3. Implementing a plug-in
As you can see in the picture above, the plug-in goes through all the layers. Therefore, it is the most powerful and flexible approach for customizing the user experience within your solution. It enables developers to add new functionality, change the existing behavior and appearance of the application, or create a completely new application.
A plug-in consists of one or several extension points that can be split into two groups:
Server-side extension points - Allow you to add new services, or to intercept the request and response messages to the mid-tier services layer
Client-side extension points - Allow you to add a new action, feature, or viewer, or to completely change the layout of the application
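The server-side interception idea can be pictured as wrapping a service call with code that runs before the request goes in and after the response comes out. This generic sketch shows the pattern only; actual ICN plug-ins are written in Java and use the product's own classes and method names:

```python
# Generic request/response interception pattern, sketching what a
# server-side extension point does: run code before the request reaches
# the underlying service and after the response comes back.
# Pattern illustration only; real ICN plug-ins are written in Java.

def base_service(request):
    """Stands in for an existing mid-tier service."""
    return {"status": "ok", "items": request.get("query", "").split()}

def audit_interceptor(service):
    """Wrap a service: log the request, then tag the response."""
    def wrapped(request):
        print(f"audit: incoming request {request!r}")   # pre-processing
        response = service(request)
        response["audited"] = True                      # post-processing
        return response
    return wrapped

service = audit_interceptor(base_service)
result = service({"query": "invoices 2015"})
print(result)
```

Because the wrapper sees both the request and the response, the same pattern covers auditing, message rewriting, and wholesale replacement of an existing service.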
4. Developing or integrating custom application
ICN provides two main approaches for integrating custom applications:
Unbound - A lightweight integration using ICN URL API, to render a specific desktop, feature, folder, document or search template in an iframe of your application or in a new window.
5. Delivering mobile access
Everybody wants it. And not just on iOS. Everybody wants to have access to corporate systems, including ECM applications, from their mobile devices. ICN provides two options:
Develop a mobile application - Use IBM Worklight, which can be extended to your needs and builds a hybrid application that can be deployed on several platforms, such as Android and iOS.
So, that's the framework in short.
If you are interested, and need more information, check out the IBM Content Navigator Redbooks publication.
For IBM Content Navigator related blog posts, see:
ONE UI - IBM Content Navigator as an application framework
The IBM International Technical Support Organization just published a new IBM Redbooks publication called Extending IBM Business Process Manager to the Mobile Enterprise with IBM Worklight. The book provides insights regarding how to connect customers and employees to their critical business processes while on-the-go. The book also provides the sample code for the implementation of the use case scenarios described in the book.
This is a short summary of the chapters in the book:
Chapter 1 provides an overview of the market forces that push organizations to re-invent their processes with mobile in mind. It describes IBM Mobile Smarter Process and explains how the capabilities provided by the offering help organizations to mobile-enable their processes.
Chapter 2 provides an overview of the IBM Worklight platform. It describes the main components, the business value, and what's new in Worklight V6.2.
Chapter 3 provides a general overview of the IBM Business Process Manager (IBM BPM) product and includes concepts such as understanding business process management, the lifecycle of a business process, and new features in IBM BPM V8.5.5.
Chapter 4 discusses architectural patterns for exposing business processes to mobile environments. It includes an overview of IBM MobileFirst reference architecture and various deployment considerations.
Chapter 5 describes important aspects of mobile security that enterprises must consider when designing a mobile app. It also introduces features of IBM Security Access Manager and new security features in IBM Worklight V6.2.
Chapter 6 describes the solution design and implementation for a scenario that represents a cable TV installation service company. Some of the Worklight features described in this scenario are adapter-based authentication, geolocation, offline storage, push notification, and adapters for integration of the mobile app and IBM BPM. The IBM BPM features shown are business process definition, the Java integration service, and IBM Bluemix integration.
Chapter 7 builds on the implementation of the same scenario described in Chapter 6 by adding advanced features from IBM Worklight and IBM BPM, such as LTPA authentication and token propagation, device single sign-on, simple data sharing, integration with IBM Bluemix Mobile Data, IBM BPM responsive coaches, and IBM BPM case management.
Chapter 8 provides an overview of the basic analytics capabilities available for IBM BPM and IBM Worklight.
Finally, ask your doctor if reading an IBM Redbooks publication is recommended to cure your insomnia.
Joking apart, IBM customers, IBM Business Partners, and IBM consultants leverage IBM Redbooks publications every day to find solutions to real technical problems. You can download the book from http://www.redbooks.ibm.com/abstracts/sg248240.html
In the spirit of full disclosure, I am one of the authors of this book. You can share your thoughts by leaving a comment in this blog post.
Jorge Gonzalez-Orozco is a senior Mobile Solution Architect at IBM. He has experience in mobile, eCommerce, enterprise integration, and the software development life cycle. Jorge is a hands-on IT Architect who has successfully led multiple teams in the role of lead architect and project manager. Jorge has led the strategy, design, and delivery of IT solutions for the automotive, retail, insurance, and energy industries by managing large global delivery teams. Jorge is also an IBM Redbooks thought leader, and he holds the IBM IT Architect and SOA Solution Designer certifications. Jorge is currently based in Raleigh, NC, USA.