- IBM is one of the few vendors paying attention to the configuration implications of cloud.
- IBM is actively addressing Cloud capacity planning within IBM Smart Cloud Monitoring.
- IBM is one of the few vendors who have thought through monitoring and management of complex cloud transactions.
marvin_goodman 11000085U5 Tags:  cloud_cost_management cloud-monitoring smartcloudmonitoring 5,232 Views
In February, the IT industry analyst firm Enterprise Management Associates (EMA) released its first Radar™ for Application Performance Management (APM) for Cloud Services report.
IBM scored above the norm on all five axes measured – Architecture & Integration, Functionality, Deployment & Administration, Cost Advantage and Overall Vendor Strength. In addition to garnering the highest scores of any vendor, IBM was also noted for demonstrating “an in-depth understanding of the complex factors that support quality Cloud deployments.” EMA goes on to extol the virtues of IBM's APM solution, asserting that:
Echoing recent analyst briefings on the use cases and customer value proposition of SmartCloud Monitoring, EMA said: “We are seeing a new role within IT organizations called the ‘Cloud administrator’. Some companies are using an existing virtualization person, skilled in VMware or other virtualization technology. It is really important for a company to have such a role, and to have designated ‘Cloud architects’, as these skills are necessary to take a cross-platform, hybrid approach to Cloud rollouts.”
Read the full EMA report at the following URL: http://w3-03.ibm.com/software/analyst/articles/ema/emaapm.pdf
Pino 100000UGHN Tags:  servicemanagementconnect service-management virtualization ism cloud user-data 4,801 Views
I really liked the post “Rapid deployments with IBM SmartCloud Provisioning”, which explains how simple and fast it is to deploy instances using SmartCloud Provisioning.
IBM SmartCloud Provisioning provides a “user_data” text field, available both in the launch instance panel and through the CLI, that can be used for this purpose.
It is inspired by Amazon EC2 instance metadata; you can find an interesting article on it here: http://alestic.com/2009/06/ec2-user-data-scripts
The “user_data” field is free text, so for example it can contain:
The launched instance can easily retrieve the user data by invoking the predefined URL http://169.254.169.254/latest/user-data and processing it as needed.
This can be achieved by exploiting the integration between IBM SmartCloud Provisioning and the Image Construction and Composition Tool (ICCT), available in IBM SmartCloud Provisioning version 1.2: you create a new bundle, the User-Data consumer bundle, which contains a script that retrieves the “user-data” and processes it as needed.
An interesting scenario is the ability to pass one or more scripts directly, to be invoked at deployment time, for truly dynamic configuration. In this way, a new image can be configured and customized at deployment time.
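A minimal sketch of what such a user-data consumer could look like. The metadata URL is the one given in the post; the helper names (`fetch_user_data`, `process_user_data`) are illustrative, not part of the SmartCloud Provisioning or ICCT API:

```python
# Sketch of a "user-data consumer" script like the one the ICCT bundle could
# contain: retrieve the user-data text and, if it is a script, execute it.
# Helper names are illustrative assumptions, not product API calls.
import subprocess
import urllib.request

METADATA_URL = "http://169.254.169.254/latest/user-data"

def fetch_user_data(url=METADATA_URL, timeout=5):
    """Retrieve the raw user-data text from the instance metadata service."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.read().decode("utf-8")

def process_user_data(text, run=subprocess.run):
    """If the user-data starts with a shebang, run it as a script at
    deployment time; otherwise hand the text back as plain configuration."""
    if text.startswith("#!"):
        run(["/bin/sh", "-c", text], check=True)
        return "script"
    return "config"

# Example with an inline payload (no metadata service needed here):
kind = process_user_data("key=value")  # treated as plain configuration
```

In a real bundle, the deployed instance would call `fetch_user_data()` at first boot and pass the result to `process_user_data()`.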
If you want more information on user-data capabilities and examples, take a look at the Ubuntu cloud-init component described here: https://help.ubuntu.com/community/CloudInit
For further information about IBM SmartCloud Provisioning and the Image Construction and Composition Tool, see the IBM SmartCloud Provisioning Information Center.
The open beta program for the upcoming IBM SmartCloud Provisioning release has started:
Sreek Iyer 2000001K7N 4,042 Views
Among the top challenges for cloud, security is the foremost concern, and there are several aspects to securing the cloud: cloud computing tests the limits of security operations and infrastructure across the various security and privacy domains. Check out the details of this interesting topic discussed in these blog posts.
Follow or click the tag “stepbystep” once you are on Cloud Computing Central to see all the previous posts on the topic.
marcese 11000065AG Tags:  low-touch pxe cloud smartcloud isaac configuration installation image 6,186 Views
In my previous blog I talked about the speed of deployment of virtual machines when using IBM SmartCloud Provisioning. I showed that virtual machines can be started and configured in a matter of seconds, and I described in some detail how this is achieved by the internal infrastructure of the product.
Let's consider, for example, how you can manage one of the core elements of the solution: the compute node (that is, the node where the virtual machines are hosted and run).
Several variations from the basic setup described above are possible depending on the actual topology of the environment and on the kind of nodes to be installed.
If you're interested in trying IBM Smart Cloud Provisioning, you can download a demo version from here:
ChrisNero 120000EDBS 3,992 Views
A common adoption pattern for cloud computing is desktops. It's straightforward because, in general, each company has standardized desktops: only specific versions of the operating system are supported, only specific flavors, only certain applications are allowed, and typically everything is managed by the IT team.
If we consider the benefits of adopting desktop cloud, some immediately stand out: the IT team can really enforce standardization (e.g., users can select only one of the proposed flavors); hardware maintenance becomes far easier thanks to consolidation; and old, outdated PCs can be used just as thin clients to the desktop, gaining new life. From the desktop user's point of view, there is no longer any need to carry company assets around: healthier (no more heavy hardware to take home or when travelling) and safer (the data stays in the cloud).
But this is nothing new; desktop cloud solutions are already on the market, so let's see whether IBM SmartCloud Provisioning can bring additional benefits to the desktop world.
What if we start dealing with non-persistent desktop images?
Non-persistent images are the ones that disappear once you shut them down. You might be asking yourself: “Well, that's not so clever. What about my data? Is it lost?”. This is actually a very good point, and it is the keystone of the benefits that come with adopting non-persistent images.
The idea is that all user data is stored in external (persistent) volumes that can be attached to and detached from the non-persistent image on demand. If we now apply this technology to the desktop world, it sheds interesting new light on some typical and painful scenarios:
In a traditional infrastructure, when the operating system goes out of maintenance (or is about to), a massive migration campaign starts: all desktops need to be migrated. Statistically, the migration does not go smoothly for every user, so some of them will be stuck, even for days. With non-persistent images you can easily overcome this: create a new master image with the new operating system (or upgrade a single instance of the image), run your test campaign to make sure everything keeps working, deploy it in as many instances as there are desktops to upgrade, attach the volumes with the user data to the new images, and get rid of the old ones. Leveraging the remarkable deployment speed of IBM SmartCloud Provisioning, you'll have a brand new set of desktops in minutes.
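The upgrade flow just described boils down to swapping the boot image while carrying each user's data volume across. A toy sketch, where all names and data shapes are illustrative (not SmartCloud Provisioning API calls):

```python
# Toy model of the non-persistent desktop upgrade flow: deploy fresh
# instances from an upgraded master image, reattach each user's persistent
# data volume, and discard the old instances. Purely illustrative.
def upgrade_desktops(old_instances, new_master):
    """old_instances: list of dicts like {"user": ..., "volume": ...}.
    Returns a new set of instances with the same user volumes attached."""
    new_instances = []
    for inst in old_instances:
        new_instances.append({
            "user": inst["user"],
            "image": new_master,        # fresh, patched boot image
            "volume": inst["volume"],   # user data survives the swap
        })
    return new_instances

desktops = [{"user": "alice", "volume": "vol-1"},
            {"user": "bob", "volume": "vol-2"}]
upgraded = upgrade_desktops(desktops, "desktop-master-v2")
```

The point of the sketch is that the boot image is disposable while the data volumes are the only state worth migrating.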
Analogously, consider patching the operating system or software running on the desktop: the key idea is that you always patch either the operating system or a specific piece of software, never the user data, which keeps living in separate volumes.
Considering the compliance aspect, remember that users cannot save any change they make to the boot disk of the image, since nothing is ever stored on that disk. They can only write their own data to the additional volumes. This should discourage them from even trying to install new software or edit the operating system configuration, since everything will be lost at the first shutdown.
Your company may have different configuration flavors of the same operating system, tailored to the department for which each desktop is intended. For example, you may need different firewall configurations according to the security level the end user is entitled to. With IBM SmartCloud Provisioning you can leverage the User Data field at deployment time to specify these special configurations. This may even be hidden from the end user: you can instead enlarge the list of offerings, each embedding its specific configuration. Under the covers the instance is launched with the proper parameters: no master image duplication, no manual configuration; everything is automated and standardized.
What about optimizing resources? Desktops by their nature all have the same operating system and configuration (at least per department), and usually come with the same applications installed on top. With non-persistent images you avoid storing lots of duplicated, useless copies of the same operating system and software on disk. Moreover, once a desktop is shut down, its resources (i.e., cores and memory) are released, so you can better optimize your hardware by using those resources for other applications or users (they may even be server applications, or desktops for users in a different time zone).
New employees coming on board? A project outsourced to an external work-force?
You may want these people to be productive immediately. With IBM SmartCloud Provisioning, their desktops will be up and running in seconds.
See IBM SmartCloud Provisioning at work in a recorded demo
Understanding IT Costs, Cost Variance and Budgets versus Actuals with New TUAM Cognos Reports and Dashboards
DLawson 27000369MF Tags:  tuam cognos reports accounting tcr cloud_cost_management chargeback showback 12,583 Views
The TUAM team is pleased to announce the delivery of a further 17 TCR Cognos reports. These enhance the existing TUAM reporting set and allow you to:
Dashboards allow users to get an immediate, at-a-glance understanding of the situation. As an introduction to dashboards in Cognos, this report pack provides reports demonstrating two methods for creating them. Users can build pages in Cognos containing reports as well as other Cognos navigation objects, and set such a page as their home page when accessing Common Reporting:
For more information about Dashboards, log on to the IBM Integrated Service Management library.
Two reports are also provided to allow users to understand how they are doing against their budget. The Line Item Budget report compares usage against budget at service level to help identify those services using more than their allotted budget. Similarly, the Client Budget Report is a new report showing a comparison of the client level budget with the actual charges for the client, with any deviations from the budget highlighted:
The Client Budget Report can also be used to help monitor the actual costs of a Cloud project. For example, the budget for different periods can be updated when you get a charges estimate for a new Cloud project, and the report can then show the difference from what you were actually charged later. Any difference will result from server configurations or the duration of the project changing after the estimate was produced. More details can be found in the cost preview blog entry here.
The Percentage report allows users to understand the charges by sub-client and service by showing the total charges and its associated percentage of the overall costs by both client and service. For cloud users, this allows you to drilldown to understand the distribution between projects and teams. Users can expand and collapse the report to get a full understanding of all the areas being charged.
Reports to compare both the charges and usage between periods have been provided in this report pack. The Cost Variance report compares the charges from the current and previous period by client and service providing an understanding of the changes in charges over time. Similarly, the Resource Variance report compares the usage in the same way so users can see in detail how usage is changing from period to period.
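The arithmetic behind these budget and variance reports is simple to state. A minimal sketch, assuming per-service charge totals as plain dictionaries (the data shapes are illustrative, not the TUAM schema):

```python
# Sketch of the calculations behind the budget and variance reports:
# charges against budget per service, and period-over-period cost change.
def budget_variance(actuals, budgets):
    """Charges over/under budget per service (positive = over budget)."""
    return {svc: actuals.get(svc, 0.0) - budgets.get(svc, 0.0)
            for svc in set(actuals) | set(budgets)}

def cost_variance(current, previous):
    """Change in charges from the previous period to the current one."""
    return {svc: current.get(svc, 0.0) - previous.get(svc, 0.0)
            for svc in set(current) | set(previous)}

actuals = {"compute": 1200.0, "storage": 300.0}
budgets = {"compute": 1000.0, "storage": 400.0}
over_under = budget_variance(actuals, budgets)  # compute is 200 over budget
```

A Resource Variance calculation works identically, with usage quantities in place of charges.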
Drill Down Reports
Drill down through the account hierarchy and see the services being used by each client with the Application Cost Report. Cloud users can drilldown into their projects and teams to get an understanding of what resources are being used and their charges. Users can see the charges for each level in the Account hierarchy and the associated charge for each service and service group being used.
SmartCloud Provisioning limited-function version download: nice to practice on when you only have one piece of hardware
SmartCloud Provisioning has released its stand-alone, limited-function version here:
If you only have one machine, you can still play with it.
It's nice to have on hand to understand the product and learn how to use it.
The fix is downloadable from Fix Central and it is identified as 1.2.0-TIV-ISCP-IF0001
It addresses the following problems:
After the iFix installation, the IHC component will be upgraded from version 0.20.2 to 1.0.0.
For further details, read the readme file associated with the interim fix.
We're pleased to make Service Health for IBM SmartCloud Provisioning available as a beta. As this is a beta, we welcome any and all feedback.
Service Health (Beta) for IBM® SmartCloud Provisioning provides prebuilt integrations between IBM SmartCloud Provisioning and IBM SmartCloud Monitoring. This solution allows you to easily monitor your IBM SmartCloud Provisioning infrastructure to identify and react to issues in your environment.
This solution is available via the IBM Integrated Service Management Library (ISML). You can find it here -> Service Health for IBM SmartCloud Provisioning. Please use the "Comment or Review" link on that page to post feedback; you may also use the "Contact Provider" link.
There is a brand new demo for IBM SmartCloud Provisioning 1.2.
It is launchpad based, allowing you to dive into the various capabilities individually with a short, quick overview.
It covers the main IBM SmartCloud Provisioning capabilities:
Enjoy it by clicking here
Would you like to integrate Tivoli Usage and Accounting Manager (TUAM) with enterprise planning software, allowing you greater flexibility to budget for and forecast your IT usage costs?
The TUAM reporting team has developed an initial integration with IBM Cognos® TM1® to provide an environment for developing timely, reliable and personalised forecasts and budgets.
IBM Cognos® TM1® is a complete enterprise planning solution. It supports a full range of enterprise planning requirements including financial analytics and financial modelling.
It provides a facility to load data from a variety of sources and model it as OLAP cubes. Rules and calculations can then be added to these cubes before they are made available to users, who can then interact with the cubes to work with their data. A flexible modelling environment is provided in which users can perform ad hoc analysis of the data in a cube, or remodel it by amending values and seeing the effects.
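To make the OLAP-cube idea concrete, here is a toy sketch: cells keyed by coordinate tuples over named dimensions, with a simple "slice" along one dimension. Real TM1 cubes are far richer; everything here is illustrative.

```python
# Toy OLAP cube: cells keyed by (client, service, period) coordinates.
# Slicing fixes one dimension's value; the rest of the cube remains.
cube = {
    ("acme", "compute", "2012-01"): 500.0,
    ("acme", "storage", "2012-01"): 120.0,
    ("acme", "compute", "2012-02"): 550.0,
}

def slice_cube(cube, dim_index, value):
    """Keep only cells whose coordinate on dim_index equals value."""
    return {k: v for k, v in cube.items() if k[dim_index] == value}

# Slice on the period dimension (index 2) and aggregate the charges:
jan = slice_cube(cube, 2, "2012-01")
total_jan = sum(jan.values())
```

"Slice and dice" in the reporting sense is repeated application of this operation across different dimensions, with aggregation over whatever remains.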
IBM Cognos® TM1® is fully scalable and capable of handling large, sophisticated models and large data sets. Furthermore, role-based security is available that supports multiple users and ensures that users see only those portions of the plan that they need to.
There is also a choice of interfaces available including Microsoft® Excel® and Cognos TM1 Web allowing you to work with your preferred look and feel.
How will this work with TUAM?
The first integration scenario covered is to provide a way to calculate the values for rates based on usage for cost recovery.
The TUAM integration blueprint provides a set of processes to create cubes containing data from TUAM. Summary data will be available for use by forecasting and planning processes and can also be made available for reporting, providing users with the ability to slice and dice the data to get a full understanding of their costs and usage as well as monitoring how costs are being recovered. Processes will also be available to write back any relevant calculations to the database which in the initial case will be the calculated rate values.
Using and then extending these processes will help users to model what-if scenarios to understand what the effects would be of adding additional costs, changes in usage or other scenarios that can occur across a financial year. Users can amend their costs or forecasted usage and redistribute these across clients or financial periods and see immediately the effect this will have on cost recovery.
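The core of the rate-calculation scenario is that a rate is the cost to be recovered divided by forecast usage, so amending either input immediately changes the computed rate. A minimal sketch; the function and field names are illustrative, not the blueprint's actual process names:

```python
# Sketch of rate calculation for cost recovery: rate = cost / forecast
# usage, per service. Changing the forecast is the "what-if" lever.
def calculate_rates(costs, forecast_usage):
    """Per-service rate needed to recover costs at the forecast usage."""
    return {svc: costs[svc] / forecast_usage[svc]
            for svc in costs if forecast_usage.get(svc)}

costs = {"cpu_hours": 12000.0}
rates = calculate_rates(costs, {"cpu_hours": 48000.0})

# What-if: forecast usage halves, so the rate must double to recover
# the same cost across the financial year.
rates_what_if = calculate_rates(costs, {"cpu_hours": 24000.0})
```

Writing the resulting rate values back to the database is then what the blueprint's write-back processes handle.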
IBM Cognos® TM1® is compatible with Tivoli Common Reporting so reports can be written based on the cubes defined in TM1. Extending the existing capability will allow users to create reports to show Actuals against Budget and Forecast so users can stay up-to-date with their cost recovery. Users will also be able to create their own reports to show data for their own specific needs.
By working with this initial package or blueprint, all the benefits of IBM Cognos® TM1® can be utilised to forecast and plan usage and monitor the progress of cost recovery.
What can I expect to see?
The initial stages provide processes to create and incrementally update the summary data stored in TUAM, allowing users to quickly query the data and understand the usage and charges being accrued. This is created with a view to being the data source for the usage data required to calculate rate values in the modelling process.
Furthermore, processes for creating a rate cube to contain all the rates and their new values will be provided. There will also be processes to take this data and write it back to the database for use in TUAM.
If you would like more details of this and some assistance with getting started then please post an entry in the TUAM forum.
How can I learn more?
Details of the benefits and functionality of TM1 can be found here.
An example of how TM1 can be applied to an industry solution such as Banking and Insurance can be found here.
How do I get this solution?
This is an integration solution, so you will need to purchase IBM Cognos® TM1® separately. See here for details.
Note: The provided package has been installed and tested successfully by TUAM development on 32-bit systems. There are known issues with 64-bit systems, which the team will address in the future if the functionality proves popular. All feedback is welcome, but please ensure you install on a 32-bit system before asking for help with installation on the TUAM forum (link). General feedback is welcome in the comments section on this page.
We have created a dedicated SmartCloud Monitoring DeveloperWorks forum:
Please visit the IBM SmartCloud Monitoring forum, where users can discuss technical topics related to the product and its deployment. SmartCloud Monitoring is a product bundle featuring existing Tivoli infrastructure management products with new enhancements and integrations. Forum users are encouraged to share their solutions and ideas in open discussions, including customizations, custom reports, integrations with other Tivoli and third-party products, user scenarios and workarounds to product limitations.
My name is Marvin Goodman, and I'm the Product Manager for SmartCloud Monitoring, a new offering that was released very late in December. It's a new bundle of existing technology (IBM Tivoli Monitoring operating system agents and ITM for Virtual Environments hypervisor monitoring agents), so it has a substantial pedigree, despite its relatively new name. The goal was to provide a simple-to-purchase solution (one part number) to monitor both the virtual infrastructure and the virtual machines running within it.
As a new offering, we're still working feverishly to develop new product collateral, including PowerPoint presentations, white papers, demo videos and other materials. Those materials will be shared here, as well on the product's main web page: http://www-01.ibm.com/software/tivoli/products/smartcloud-monitoring/
AHUP_Gianluca_Bernardini 120000AHUP Tags:  provisioning cloud bot decentralized smartcloud scale-out hslt p2p 6,366 Views
SmartCloud Provisioning is designed to minimize the use of a centralized “command and control” approach, in favor of scale out management, where endpoints can participate in management activities and do not depend on a single configuration management database.
This allows SmartCloud Provisioning to handle multiple provisioning tasks in parallel, across an unlimited number of servers.
Cloud users can request deployments of virtual machines and gain access to the provisioned systems in just a few seconds, thanks to parallel and distributed processing that happens transparently, under the covers.
Let’s drill down into the details about this distributed management approach.
SmartCloud Provisioning internally uses a peer to peer (P2P) messaging infrastructure to pass provisioning and management messages between agents, which contribute to the decentralized control.
Agents are installed on the compute nodes (i.e. the hypervisors) as well as on the storage nodes, where images and volumes reside.
The P2P connections between agents not only allow self-monitoring of their health in order to implement a low-touch management infrastructure, but also allow orchestrating the communications to achieve an effective load distribution and decentralized management of the requests performed by cloud users.
The P2P communication overlay is backed by a distributed lock service, which is based on ZooKeeper.
ZooKeeper is a distributed, open-source coordination service for distributed applications, which exposes a simple set of primitives that distributed applications can build upon to implement higher level services for synchronization, configuration maintenance, and groups and naming. It is designed to be easy to program, and uses a data model styled after the familiar directory tree structure of file systems.
Like the distributed processes it coordinates, ZooKeeper itself is intended to be replicated over a set of servers that all know about each other. They maintain an in-memory image of the state, along with transaction logs and snapshots in a persistent store.
Each SmartCloud Provisioning agent connects to a single ZooKeeper server at a time, maintaining a TCP connection through which it sends requests, gets responses, gets watch events, and sends heartbeats. If the TCP connection to the server breaks, the agent connects to a different server.
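The fail-over behavior can be sketched as "cycle through the candidate servers until one answers". A toy simulation, with illustrative server names and a pluggable connect function standing in for the real TCP session:

```python
# Sketch of connect-and-fail-over: an agent holds a session to one server
# and tries the next candidate when the connection cannot be established.
# Server names and the try_connect callable are illustrative.
import itertools

def connect_with_failover(servers, try_connect):
    """Cycle through servers until try_connect(server) succeeds."""
    for attempt, server in enumerate(itertools.cycle(servers)):
        if attempt >= 2 * len(servers):
            raise ConnectionError("no ZooKeeper server reachable")
        if try_connect(server):
            return server

# Simulated environment: the first server is down, the second answers.
up = {"zk1:2181": False, "zk2:2181": True, "zk3:2181": True}
chosen = connect_with_failover(["zk1:2181", "zk2:2181", "zk3:2181"],
                               lambda s: up[s])
```

The real client additionally preserves session state (watches, ephemeral nodes) across the reconnect, which this sketch leaves out.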
When a deployment request is received by SmartCloud Provisioning, the request is processed by the Web Services layer, passed to the management infrastructure, and managed by the agents and the ZooKeeper services.
The following steps describe the internal communications in more detail, as depicted in figure 1 below.
This processing is transparent to the end user, who just sees the deployment request served in a few seconds.
As mentioned, all of this happens under the covers very quickly, and the user does not have to worry about any of the steps above.
This enables high levels of parallelism and decentralized management, as well as scale-out capabilities that can be reached simply by increasing the number of servers.
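The scale-out property can be illustrated in miniature: independent requests are served by whichever workers pick them up, so adding workers adds throughput. A toy sketch using a thread pool as a stand-in for the agents; none of this is SmartCloud Provisioning code:

```python
# Sketch of parallel request serving: each provisioning request is an
# independent unit of work, so a pool of workers (standing in for agents)
# can serve many requests concurrently with no central bottleneck.
from concurrent.futures import ThreadPoolExecutor

def deploy(request):
    """Stand-in for one agent serving one provisioning request."""
    return "vm-for-%s" % request

requests = ["req1", "req2", "req3", "req4"]

# More workers (agents) means more requests in flight at once.
with ThreadPoolExecutor(max_workers=4) as pool:
    vms = list(pool.map(deploy, requests))
```

Because requests share no mutable state here, throughput grows with the worker count, which mirrors how the product scales by adding servers.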
If you're interested in trying the SmartCloud Provisioning distributed management capabilities, you can download a trial version from the following link: