Brocade CTO Named to TechAmerica CLOUD(2) Commission
Commission to Provide Recommendations on Deployment of Cloud Technologies to the United States Federal Government
SAN JOSE, CA -- (MARKET WIRE) -- 04/15/11 -- Brocade (NASDAQ: BRCD) today announced that Dave Stevens, the company's chief technology officer (CTO), has been named a Commissioner on the TechAmerica Foundation's "Leadership Opportunity in U.S. Deployment of the Cloud" commission, also known as CLOUD(2).
The commission's mandate is to deliver recommendations to the U.S. government on ways it can effectively deploy cloud technologies and set specific public policies that will help drive further cloud innovation in both the private and public sectors.
Brocade has direct and highly relevant experience in the challenges and opportunities that the CLOUD(2) Commission is addressing, by virtue of its 15 years of experience in building mission-critical data center networks for some of the most demanding IT environments in the world. This experience and expertise have positioned Brocade to address the challenges of moving to more agile, flexible cloud IT models.
The Brocade approach, as defined by its Brocade One™ strategy, is to help its customers migrate smoothly from current networking architectures to a world where information and applications reside and can be accessed anywhere through open, multivendor cloud technologies.
"Brocade is an established leader in building and deploying fabric-based data center architectures, and customers continue to trust their networks to Brocade as they move to highly virtualized and cloud models," said Dave Stevens, chief technology officer at Brocade. "I am honored to serve as a commissioner for CLOUD(2), and I look forward to the opportunity to leverage our experience in this space and to play a key role in advancing the deployment of cloud architectures."
The commission will make recommendations for how government should deploy cloud technologies and address policies that might hinder U.S. leadership of the cloud in the commercial space. Recommendations for government deployment will be presented to Federal Chief Information Officer Vivek Kundra. Commercial-facing recommendations will be shared with Commerce Secretary Gary Locke and Commerce Under Secretary Pat Gallagher.
"The Obama Administration has demonstrated a clear understanding of the need to adopt cloud technologies across the government enterprise," said Dallas Advisory Partners Founder, and TechAmerica Foundation Chairman, David Sanders. "CLOUD(2) represents a broad range of companies, and is well-positioned to provide diverse insight on issues critical to the cloud. These new commissioners will be essential to the continued advancement of U.S. innovation, and we look forward to providing the Administration constructive recommendations that address these critical issues."
The commission is composed of 71 experts in the field, from both the business and academic worlds. Leading the CLOUD(2) commission are co-commissioners Salesforce.com CEO and Chairman Marc Benioff and VCE Chairman and CEO Michael D. Capellas, as well as CSC North American public sector president Jim Scheaffer and Microsoft corporate VP of technology policy and strategy Dan Reed.
Also joining co-chairmen Benioff and Capellas representing academia will be John Mallery of Massachusetts Institute of Technology (MIT) Computer Science and Artificial Intelligence Laboratory, and Michael R. Nelson, visiting professor of Internet studies in Georgetown University's Communication, Culture and Technology Program.
A full list of commissioners is available at http://www.techamericafoundation.org/cloud-commission-commissioners
To learn more about CLOUD(2), please visit http://www.techamericafoundation.org/cloud-commission
Brocade (NASDAQ: BRCD) networking solutions help the world's leading organizations transition smoothly to a world where applications and information reside anywhere. (www.brocade.com)
Brocade, the B-wing symbol, BigIron, DCFM, DCX, Fabric OS, FastIron, IronView, NetIron, SAN Health, ServerIron, TurboIron, and Wingspan are registered trademarks, and Brocade Assurance, Brocade NET Health, Brocade One, Extraordinary Networks, MyBrocade, VCS, and VDX are trademarks of Brocade Communications Systems, Inc., in the United States and/or in other countries. Other brands, products, or service names mentioned are or may be trademarks or service marks of their respective owners.
Chapter 14 - Management Platform & Managed Environments
To design a good cloud management platform we need to understand the managed environment. The workloads will include not only applications running on virtual infrastructure but also traditional infrastructure, so we need to design a management platform that can support delivery of traditional services as well as cloud services.
The advantage of using the IBM reference architecture (see the previous chapter) is that we keep the service management cost to a minimum and are able to manage multiple services (IaaS, PaaS, SaaS, traditional services) through a single management platform (the Common Cloud Management Platform).
The design of the management platform is mainly driven by
what platforms we need to manage as well as the services we have to deliver.
The core components of the management platform are determined by the amount of
service automation expected to be provided by the platform.
The cloud management platform can be thought of as a Service Delivery Platform as applied in the telecommunications industry. The term Service Delivery Platform (SDP) usually refers to a set of components that provides a service delivery architecture (such as service creation, session control & protocols) supporting multiple service delivery models.
The core components can be further classified into business support system (BSS) components and operational support system (OSS) components. The business components include ways to manage the customer, subscription, offering & catalog, contract, order, billing, and financial aspects of the platform. The OSS deals with the backend aspects of fulfilling the service request, so it includes components like service automation, provisioning, monitoring and management.
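As a concrete illustration, the BSS/OSS split above can be modeled as a simple component registry. This is only a sketch under my own assumptions; the class and function names are not part of any IBM product.

```python
from dataclasses import dataclass
from enum import Enum

class Layer(Enum):
    BSS = "business support"
    OSS = "operational support"

@dataclass(frozen=True)
class Component:
    name: str
    layer: Layer

# Component names come from the classification above; the registry
# structure itself is only an illustrative assumption.
PLATFORM = [
    Component("customer management", Layer.BSS),
    Component("subscription", Layer.BSS),
    Component("offering & catalog", Layer.BSS),
    Component("contract", Layer.BSS),
    Component("order", Layer.BSS),
    Component("billing", Layer.BSS),
    Component("financials", Layer.BSS),
    Component("service automation", Layer.OSS),
    Component("provisioning", Layer.OSS),
    Component("monitoring", Layer.OSS),
    Component("management", Layer.OSS),
]

def by_layer(layer: Layer) -> list[str]:
    """Names of all platform components in the given layer."""
    return [c.name for c in PLATFORM if c.layer is layer]
```

Keeping the classification in data like this makes the "add and remove components" flexibility discussed below a matter of editing one list rather than restructuring code.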
The IBM Tivoli suite of products addresses almost all of the OSS requirements as well as some of the key BSS components. As an architect, the key decision is to look at the capabilities required based on the client’s needs and create a platform that is extensible. This needs to be done keeping flexibility in mind, which means you have the capability to add and remove components to support different capabilities. In an established and mature data center, it is highly unlikely that all these components are delivered by a single vendor. That’s why an architecture built on open standards is critical to the success of building a good management platform.
IBM is leading the efforts for adoption of standards by different cloud providers, consumers and tools vendors. The work being done by IBM with The Open Group and the Cloud Standards Customer Council are two examples of this.
Once we have determined the functional components of our solution, we need to address the non-functional requirements. These include
aspects like security, availability, resiliency, performance, scalability,
capacity planning and sizing. We will
need to determine these aspects for the management platform based on the size
and heterogeneity of the managed environment. We will discuss these aspects in
the next chapter.
Teresa Takai, the Defense Department's chief information officer, says the "paramount" goal of effective security in a cloud computing infrastructure is best achieved using an internal "private" system, though she wouldn't rule out use of commercial providers.
In oral testimony at a hearing of the House Armed Services Subcommittee on Emerging Threats and Capabilities on April 6, Takai said Defense could opt for public cloud services offered by companies such as Google and Microsoft Corp.
In response to questions from Rep. James Langevin, D-R.I., Takai said, "There will be instances where we [can] use commercial cloud providers ... [if] they meet our standards." She did not specify what type of applications Defense would host on a commercial cloud.
Takai added the department plans to tap the Defense Information Systems Agency, which already is providing private cloud services to the Army and email service for 1.4 million personnel. The Army, Takai said, is "looking to move [its] apps to the cloud."
One of her key priorities is to secure the Pentagon's classified networks after masses of data were illicitly siphoned off last fall to the WikiLeaks website, said Takai, who took office last October. In her prepared testimony, she said Defense plans to deploy a public key infrastructure-based identity credential on a hardened smart card for use on the department's Secret classified networks. It is similar to, but stronger than, the technology in the Common Access Card on unclassified networks.
Defense also plans to use a Host-Based Security System to protect classified networks, a tool that "will allow us to know who is on the network" and detect anomalous behavior, Takai told the hearing.
Intel® Cloud Builders Reference Architecture Library
Key challenges and focus areas for IT include enhancing efficiency,
security, resource utilization, flexibility, and simplifying data center
management, among others. Intel works closely with leading systems and
solution providers to deliver proven reference architectures to address
IT challenges. This work is based on IT requirements—from a wide range
of end users—that address challenges in evolving to cloud and next-
generation data centers, including the evolving usage requirements of
the Open Data Center Alliance.
This lab-based experience is embodied in Intel® Cloud Builders
reference architectures. Each reference architecture provides detailed
instructions on how to install and configure a particular cloud software
solution using Intel® Xeon® processor-based servers.
Developed with ecosystem leaders, the following reference architectures relate to building a cloud, or Infrastructure as a Service (IaaS), and to enhancing and optimizing cloud infrastructure with a focus on security, efficiency, and simplifying your cloud environment.
Learn more about how to build and optimize your cloud infrastructure via the reference architecture guides below.
ARMONK, N.Y. - 07 Apr 2011: IBM (NYSE: IBM) has joined more than 45 leading cloud organizations to form the new Cloud Standards Customer Council, which is managed by OMG®. Organizations including Lockheed Martin, Citigroup and North Carolina State University have already joined the Council, which will help advance cloud adoption prioritizing key interoperability issues such as management, reference architectures, hybrid cloud, as well as security and compliance.
The Council will complement vendor-led cloud standards efforts and establish a core set of client–driven requirements to ensure cloud users will have the same freedom of choice, flexibility, and openness they have with traditional IT environments. The Cloud Standards Customer Council is open to all end-user organizations and further enhances customers' abilities to offer both public and private cloud offerings through a standardized platform.
IBM is inviting all of its users to participate in the CSCC and work together in addressing the challenges faced while implementing cloud computing. The group will work to lower the barriers to widespread adoption of cloud computing by helping to prioritize key interoperability issues such as cloud management, reference architecture, hybrid clouds, as well as security and compliance.
“To make Open Cloud successful and reflective of real business needs, IBM is asking for client feedback regarding their direction and priorities around cloud standards development,” said Angel Diaz, vice president, IBM Software Standards. “This council is designed to focus on the reality of what provides the greatest cloud computing benefits for clients. Ultimately, this effort is about how organizations can use what they have today and extend their business - using open standards - to get the greatest benefits from cloud.”
In our previous posts on the IT industry’s shift to the Cloud Services era, we’ve provided definitions, market context, user adoption trends, and user views about cloud services benefits and challenges. In this post, we offer our initial forecast of IT cloud services delivery across five major IT product segments that, in aggregate, represent almost
two-thirds of enterprise IT spending (excluding PCs). This forecast
sizes IT suppliers’ opportunity to deliver their own IT offerings to
customers via the cloud services model (“opportunity #1”, as described in our recent post Framing the Cloud Opportunity for IT Suppliers).
The development of this forecast involved a team of over 30 IDC analysts, led by Robert Mahowald (Business Applications/SaaS), Tim Grieser (Infrastructure Software), Steve Hendrick (Application Development & Deployment Software), Matt Eastwood (Servers) and Rick Villars (Storage), with additional contributions from David Tapper (Outsourcing/Hosted Services) and John Gantz (Global Research).
SAN FRANCISCO, CA,
07 Apr 2011:
IBM (NYSE: IBM) today
unveiled its next generation IBM SmartCloud, an enterprise-class, secure
cloud specifically created to meet the demands of businesses.
To accelerate the shift from experimentation, development and
assessment to full scale enterprise deployment of cloud, IBM is building
out its existing cloud portfolio with IBM SmartCloud, enterprise cloud
technologies and services offerings for private, public and hybrid
clouds based on IBM hardware, software, services and best practices.
As part of this announcement, IBM is demonstrating a next-generation,
enterprise cloud service delivery platform currently piloting with key
clients and available later this year. For the first time, enterprise
clients will be able to select key characteristics of a public, private
and hybrid cloud to match workload requirements from simple Web
infrastructure to complex business processes, along five dimensions:
· Security and isolation
· Availability and performance
· Technology platforms
· Management Support and Deployment
· Payment and Billing
The IBM SmartCloud includes a broad spectrum of secure managed
services, to run diverse workloads across multiple delivery methods both
public and private. It includes customer choice with the potential for
end-to-end management of service delivery from the server and operating
system to the application and process layer.
“The new IBM SmartCloud allows for the best of both worlds – the cost
savings and scalability of a shared cloud environment plus the
security, enterprise capabilities and support services of a private
environment,” said Erich Clementi, senior vice president, IBM Global
Technology Services. “In thousands of cloud engagements, we have discovered that enterprise clients want a choice of cloud deployment models that meet the requirements of their workloads and the demands of their business. This level of choice and control translates into capabilities customized to your needs and priorities, whether you’re deploying a simple web application, an ordering logistics system or a complete ERP system.”
The new IBM cloud can enable organizations, their employees and
partners, to get what they need, as they need it – from advanced
analytics and business applications to IT infrastructure like virtual
servers and storage or access to tools for testing software code - all
deployed securely across IBM’s global network of cloud data centers.
The IBM SmartCloud has two implementation options: Enterprise and Enterprise +.
- Enterprise – Available today, expanding on our existing Development and Test Cloud, this offering allows customers to extend internal development and test efforts, reducing application development tasks from days to minutes via automation and rapid provisioning, with over 30% cost reduction versus traditional application environments.
- Enterprise + -- To be made available later this year, Enterprise + will complement and expand on the value of Enterprise, offering brand new capabilities that provide a core set of multi-tenant services to manage virtual server, storage, network and security infrastructure components, including managed operational services.
Cloud computing fundamentals
Summary: A revolution is defined as a change in the way
people think and behave that is both dramatic in nature and broad in scope. By
that definition, cloud computing is indeed a revolution. Cloud computing is
creating a fundamental change in computer architecture, software and tools
development, and of course, in the way we store, distribute and consume
information. The intent of this article is to aid you in assimilating the
reality of the revolution, so you can use it for your own profit and well-being.
Last year’s acquisition policy pronouncements are starting to be felt
across the U.S. Army, with upticks in cloud computing initiatives,
increasing use of fixed-price contracts and adoption of social media.
“Army IT spending will remain stable; the goal is to optimize the IT
[spending]. Optimization will be guided by computing trends,” said Gary
Winkler, Army program executive officer for enterprise information systems.
He was one of several Army acquisition speakers at the AFCEA Belvoir
Industry Days conference at the National Harbor in Oxon Hill, Md. Winkler also recently announced he is leaving the Army.
Efforts to improve efficiency, realign spending priorities and
streamline a cumbersome acquisition process were launched during the
past year amid a tightening national budget by Defense Secretary Robert
Gates and Ashton Carter, undersecretary of defense for acquisition,
technology and logistics.
Leading the charge for the Army’s efforts to hold down spending and
become more efficient are cloud computing initiatives, mobile
technologies, data center consolidation and social collaboration, he said.
Winkler said that mobile data traffic is on track to increase by 39
times between 2009 and 2014, and the social software market is showing
40 percent growth per year through 2013 — also contributing to getting
the Pentagon’s policies rolling further down in operations.
The Army also wants to increase use of firm fixed-price and
multiple-source contracts, as directed in Carter’s Better Buying Power
initiative, and is looking to maximize broadly scoped contracts that can be
used for a variety of missions.
However, there are still plenty of challenges, and there likely will
be more to come. Winkler predicted that force reductions could still lie
ahead for DOD, citing his own experience in the 1980s when, like now,
an insourcing effort was followed by a hiring freeze — which was later
followed by layoffs.
“We can tighten our belts and squeeze a little bit [as directed by
the Pentagon] — but I think it’s going to be more than just a little
bit,” Winkler said.
Still, PEO-EIS has been involved in the development of Better Buying
Power tenets, including helping shape concepts and strategies for
improving tradecraft services, establishing common taxonomy and
reforming IT acquisition — all banner items in Carter’s 23-point
acquisition reform plan released last September.
"Provision public cloud resources or securely extend your internal
virtualized infrastructure into the public cloud with VMware and our
vCloud Powered service providers, the largest ecosystem of cloud
computing partners. Leverage secure hybrid cloud resources with
confidence while providing choice and flexibility, ensuring
interoperability and portability of workloads between cloud environments
with a VMware vCloud infrastructure built on VMware vCloud Director."
"Security often comes up as a big stopping point for cloud computing.
One of the ways around this is to build a private cloud – one that
remains within the corporate firewall and wholly controlled internally.
That was the approach taken by Los Alamos National Laboratory as it
seeks to create an infrastructure on demand (IOD) architecture to
simplify the rollout of new technology projects and to eliminate delays
in storage, server and network provisioning.
Anil Karmel, IT manager at Los Alamos National Lab noted four tenets that played a major role in the private cloud decision:
• green IT
• streamlined operations
• rapid scaleup/down
“As we deploy more virtual servers, we consume far less power and also
reduce electronic waste,” said Karmel. “We estimate eventual savings of
$1.3 million annually due to IOD.”
Server capacity on demand is now achievable in a few clicks. Instead of
30 days to provision a server, it now takes less than 30 minutes.
The organization is utilizing HP c7000 blade enclosures along with HP
Virtual Connect Fibre Channel/Flex 10 Ethernet. HP BL460c and BL490c
blades are used, with each blade containing multiple quad-core and
A NetApp SAN was brought in to add storage capacity. This is based on
the NetApp V Series with 2 PB of Tier 2 SATA storage. Tier 1 is provided by existing HP arrays.
The cloud itself consists of four elements: a web portal at the front
end; Microsoft SharePoint as the automation engine for cloud workflows,
and also as the integration point for functions such as chargeback;
VMware vCloud Director to manage and operate the cloud; and VMware
vShield to provide security at both the application level and at the
user device level.
“Any virtual environment has to be cost effective, so that means it has
to be simple while being aware of any and all changes in real time,” said Karmel.
This is especially important in the security arena. Traditional security
operates at the hardware or software layer. But the addition of a
virtualization layer, said Karmel, provides too many gray areas for such
security tools to operate effectively. Hence security itself is now
being virtualized to eliminate yet another wave of security holes
showing up in the corporate networks.
Using Infrastructure on Demand, the National Lab is creating virtual
security enclaves using vShield that prevent one desktop or client from
infecting others, and keep virtual machines (VMs) out of harm’s way.
Rules are set indicating access rights, as well as security protocols
based on threat detection. Traditional security tools interface with
this virtual security layer to keep servers and devices more protected.
Any time a threat is detected, the offending virtual computer is sent to
a remediation area, which has no network connectivity with which to infect other machines.
“This all occurs automatically based on preset policy,” said Karmel. “If
a VM is moved from one host to another, the security policy given to it
moves with it.”
To prevent VM sprawl, VMs are given an expiry date. This is one year by
default, though that can be adjusted. 30 days before the due date, an
email is automatically generated asking the VM owner about renewal.
Another similar email is relayed with 10 days left and then again the
day before expiry. As soon as the VM is turned off, the user is informed
of the fact and asked if he/she wants it back on line. Even then, 29
days later, the user is told that the VM is scheduled for deletion. The next
day it is deleted.
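The expiry policy described above reduces to a small notification schedule. The sketch below assumes, for simplicity, that the VM is switched off on its expiry date; the function and key names are illustrative, not Los Alamos code.

```python
from datetime import date, timedelta

def expiry_notifications(expiry: date) -> dict[str, date]:
    """Notification schedule for a VM approaching its expiry date.

    Mirrors the policy above: reminder emails 30, 10 and 1 day(s)
    before expiry; assuming the VM is switched off on the expiry
    date itself, a deletion warning follows 29 days later and the
    VM is deleted on day 30. Key names are assumptions.
    """
    return {
        "first_reminder": expiry - timedelta(days=30),
        "second_reminder": expiry - timedelta(days=10),
        "final_reminder": expiry - timedelta(days=1),
        "deletion_warning": expiry + timedelta(days=29),
        "deletion": expiry + timedelta(days=30),
    }
```

Driving the emails from computed dates like these, rather than ad hoc timers, is what lets the whole lifecycle run "automatically based on preset policy."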
However, a backup is retained for seven years just in case. The NetApp
storage is used to create snapshots of VMs before they are retired to
tape. For now, restores are not automated. But in the next version of
Infrastructure on Demand, users will be able to restore VMs they desire
in a few clicks.
“Lifecycle management of VMs is very important,” said Karmel.
The organization has erected a chargeback structure. Cloud resources are
priced according to CPU, RAM and disk. Users can see the total cost
before submitting a request for IT resources. Following a request, the
line manager has to approve and accept the charges to that unit.
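A chargeback model priced on CPU, RAM and disk, with the total shown before the request is submitted, can be sketched as follows. The unit prices here are hypothetical; the article quotes no actual rates.

```python
# Hypothetical monthly unit prices -- the article gives no actual figures.
PRICES = {"cpu": 25.00, "ram_gb": 10.00, "disk_gb": 0.50}

def monthly_cost(cpus: int, ram_gb: int, disk_gb: int) -> float:
    """Total monthly charge, displayed to the user before the request
    is submitted and billed to the approving manager's unit."""
    return (cpus * PRICES["cpu"]
            + ram_gb * PRICES["ram_gb"]
            + disk_gb * PRICES["disk_gb"])
```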
“You have to build best practices around our workloads,” said Karmel.
Service Level Agreements (SLAs) are set at four 9’s. If some hardware
goes down and Infrastructure on Demand doesn’t meet the SLA, it doesn’t
charge for that resource for that month. In addition, uptime and
availability metrics are regularly published so users are fully informed.
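The four-nines SLA translates into a downtime budget of roughly 4.3 minutes per 30-day month. A minimal sketch of the arithmetic, with function names of my own choosing:

```python
MINUTES_PER_MONTH = 30 * 24 * 60  # 43,200 minutes in a 30-day month

def downtime_budget(target: float = 0.9999) -> float:
    """Allowed downtime per month, in minutes, at the target availability."""
    return MINUTES_PER_MONTH * (1 - target)

def sla_met(downtime_minutes: float, target: float = 0.9999) -> bool:
    """True if the measured availability met or beat the four-nines target."""
    availability = 1 - downtime_minutes / MINUTES_PER_MONTH
    return availability >= target
```

A month with ten minutes of downtime would therefore trigger the no-charge clause for the affected resource.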
At the moment, separate network, security and virtual server teams are
being maintained to monitor the infrastructure. Over time, this may be
streamlined to one centralized unit."
Chapter 13 - Cloud Computing
One of the important things to decide when you discuss Cloud Service Strategy and Design is the consideration of a Reference Architecture. This is something that is useful to align to, as it represents the blueprint for your cloud and reduces implementation risk. The Cloud Computing Reference
Architecture (RA) is intended to be used as a blueprint / guide for
architecting cloud implementations, driven by functional and non-functional
requirements of the respective cloud implementation. The RA defines the basic
building blocks - architectural elements and their relationships which make up
the cloud. The RA also defines the basic principles which are fundamental for
delivering & managing cloud services.
A reference architecture is more than just a collection of technologies and products. It consists of several architectural models and is much like a city plan. The RA defines how your cloud platform should be constructed so that it can satisfy not only your current demands but also be extensible to support the future needs of a diverse user population. So
this blueprint should be responsive to changing business and technology
requirements and adaptable to emerging technologies. Existing “legacy” products and
technologies as well as new cloud technologies can be mapped on the Architecture Overview Diagram (AOD) to show
integration points amongst the new cloud technologies and integration points
between the cloud technologies and already existing ones. By delivering best practices in a standardized, methodical way, an RA ensures consistency and quality across development and delivery. The IBM Cloud Computing RA is structured in a modular fashion, describing each functional capability (architectural element), the user roles (that we discussed in Chapter 12) and their corresponding interactions. The IBM CCRA is created based on several
cloud engagements and incorporates all the good practices and methods
implemented across these projects. So for an end user adopting these good
practices the risk and cost of implementation of their cloud will be low. The
CC RA is built on the ELEG (Efficiency, Lightweightness, Economies-of-scale, Genericity) principles. One of the principles that I want to highlight here is the Genericity Principle –
That’s the capability to define and manage generically along the Lifecycle of
Cloud Services: Be generic across I/P/S/BPaaS & provide ‘exploitation’
mechanism to support various cloud services using a shared, common management
platform (“Genericity”). As we discussed in the cloud delivery and deployment models (Chapter 3), there can be many models for deployment and delivery of cloud services. As we know, a Cloud
Service can represent any type of (IT) capability which is provided by the Cloud
Service Provider to Cloud Service Consumers - Infrastructure, Platform,
Software or Business Process Services. The beauty and significance of the IBM
Cloud Computing Reference Architecture is that it can cater to any of these
service delivery and deployment models. So if you are building your private
cloud or a public cloud, or using cloud to deliver IaaS, PaaS or SaaS, the RA remains the same and handles all of these combinations. We have seen the capabilities that we need (Chapter 6) for implementing a common cloud management platform. IBM has recently submitted its
Cloud Computing Reference Architecture 2.0 (CC RA) (.doc) to the Cloud
Architecture Project of the Open Group,
a document based on “real-world input from many cloud implementations across
IBM” meant to provide guidelines for creating a cloud environment. Check
out this link
which has the interview with Heather Kreger, one of the authors of Cloud Computing
Reference Architecture, as well as the details of the components that make up the RA. On the same topic, there is also an article that I found in the Sys-Con Cloud Computing Journal comparing the Reference Architectures of the Big Three (IBM, HP and Microsoft), which is an interesting read. Before we get into the details of the Service Implementation / Transition phase
it is important that we understand the bigger picture. The word document IBM
Cloud Computing Reference Architecture 2.0 (CC RA) (.doc) provides a great
description of this bigger picture, going into the details as required. The
architectural principles define the fundamental principles which need to be
followed when realizing a cloud across all implementation stages (architecture,
design, and implementation). This is a must read for all - development teams
implementing the cloud delivery & management capabilities as well as
practitioners implementing private clouds for customers.
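The Genericity Principle discussed in this chapter (one management platform working through a common interface across IaaS, PaaS and SaaS) can be sketched as a shared service lifecycle. The class and method names below are illustrative assumptions, not elements of the IBM CCRA.

```python
from abc import ABC, abstractmethod

class CloudService(ABC):
    """Generic lifecycle shared by all service types (IaaS, PaaS, SaaS...).

    The management platform programs against this single interface,
    which is the essence of the genericity principle. All names here
    are illustrative assumptions.
    """

    @abstractmethod
    def provision(self, request: dict) -> str: ...

    @abstractmethod
    def deprovision(self, instance_id: str) -> None: ...

class VirtualServerService(CloudService):
    """An IaaS-style service."""
    def provision(self, request: dict) -> str:
        return f"vm-{request.get('size', 'small')}"
    def deprovision(self, instance_id: str) -> None:
        pass

class DatabaseService(CloudService):
    """A PaaS-style service."""
    def provision(self, request: dict) -> str:
        return f"db-{request.get('engine', 'sql')}"
    def deprovision(self, instance_id: str) -> None:
        pass

def manage(service: CloudService, request: dict) -> str:
    """One code path in the common management platform, whatever the type."""
    return service.provision(request)
```

Adding a new service type (say, a business-process service) then means adding one subclass, while the shared management platform stays unchanged.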
-By the End of the Decade One in Four UK Power Stations are Set to
Close and UK Gas Production is Expected to be Half of Current Levels,
yet Demand for Electricity is Expected to Increase by More than 50 Per
Cent by 2050
-This Collaboration is to Create a Flexible,
Secure and Scalable Data and Communications Hub to Support the UK
Government's Smart Meter Implementation Programme and its Strategy to
Cut Emissions by 80 Per Cent by 2050
21 Mar 2011:
IBM (NYSE: IBM)
and Cable&Wireless Worldwide (LSE: CW.L), today jointly announce
their collaboration to develop a new intelligent data and communications
solution, UK Smart Energy Cloud, to support the UK's Smart Meter
Implementation Programme, which aims to roll out more than 50 million
smart meters in the UK.
UK Smart Energy Cloud has the potential to provide a complete
overview of energy usage across the country and pave the way for easier
implementation of a smart grid. The solution will utilise the extensive
experience IBM has gained from leading and implementing smart grid
programmes around the world and its proven enabling software and
middleware. The solution will be supported by C&W Worldwide's
extensive, secure next-generation network and communications integration capabilities.
There has never been a more challenging time for the energy industry
with decisions being taken to protect the country's energy supply that
will have significant implications for everyone in the UK. Both smart
meters and the smart grid are significant steps on the journey to a new
energy future, potentially changing for the better the way we consume
and distribute energy.
- Improve performance and scalability by optimizing IT assets based on workload to ensure the ideal elasticity of your cloud.
- Enterprise quality of service (QOS) virtualization provides the best
foundation for your mission-critical applications running in the cloud.
- Automated management, provisioning and optimization of your physical
and virtual cloud resources ensure optimal utilization to meet changing demands.
- Self-service portal and standardized service catalog leverage all
the features of your cloud infrastructure to enable automated delivery
of services without IT intervention.
- Metering and billing features provide the capabilities to improve
cost transparency and offer more flexible pricing schemes for your cloud services.
The unprecedented interest and projected IT spend on cloud computing
is coming from all types of organizations, businesses and governments
that are seeking to transform the way they deliver IT services and
improve workload optimization so they can quickly respond to changing
business demands. Cloud computing can significantly reduce IT costs and
complexities while improving asset utilization, workload optimization
and service delivery.
Today's IT infrastructures face challenges on many levels:
- Are composed of silos that lead to disconnected business and IT infrastructures
- Contain static islands of computing, which result in inefficiencies and underutilized assets
- Struggle with rapid data growth, regulatory compliance, integrity and security
- Face continually rising IT administration costs
As a result of these challenges, organizations are demanding an IT
infrastructure and service delivery model that enables growth and
innovation. An effective cloud computing environment built with
IBM Power Systems™ cloud solutions helps organizations transform their
data centers to meet these challenges:
- Delivers integrated visibility, control, and automation across all business and IT assets
- Is highly optimized and scales IT up and down in line with business needs
- Addresses the information challenge by delivering flexible and secure access to data and mitigating risks
- Utilizes flexible delivery models, automation and virtualization to
greatly simplify IT service delivery and provide enterprise QoS
capabilities, including higher application availability, improved
performance, greater scalability and enterprise-class security.
Power Systems cloud solutions help customers build an effective cloud
computing environment that reduces IT costs, improves service delivery
and enables business innovation.
24 Mar 2011:
IBM (NYSE: IBM) launched new, cloud-based software designed to help
marketers gain real-time, actionable insight from data available across
social media.
(Photo: https://photos.prnewswire.com/prnh/20110314/NY64247-a )
The new software expands IBM's business analytics capabilities by
enabling organizations to develop faster, more precise social media
marketing programs that support their brand's total online presence
through a cloud-based delivery model.
The first product, IBM Coremetrics Social, helps companies analyze
the business impact of their social marketing initiatives, while IBM
Unica Pivotal Veracity Email Optimization Suite analyzes email links
that are shared across social network platforms, enabling marketers to
better capitalize on opportunities across channels.
Today's news follows IBM's recent announcement
of new software and the creation of a new consulting practice dedicated
to the emerging category of "Smarter Commerce," which is focused on
helping companies swiftly adapt to rising customer demands in today's
digitally transformed marketplace. Smarter Commerce includes new cloud
analytics software that enables companies to monitor their brand's
presence in real-time through social media channels to better assess the
effectiveness of new services and product offerings, fine tune
marketing campaigns, and create sales initiatives in real-time.
"IBM's approach to social media analytics is based on the
understanding that people interact with an organization's brand in a
number of ways—including email, social networking sites and company Web
sites—and the true measure of business impact demands a fully integrated
view of the interaction with these resources," said John Squire, chief
strategy officer, IBM Coremetrics. "The new social
media analytics software unveiled today will help marketers develop more
targeted, highly measurable and effective social media marketing programs."
IBM Coremetrics Social enables organizations across a wide range of
industries to measure the effectiveness and return on investment (ROI)
of their social marketing initiatives by gaining insight from data
that's publicly available on social media websites.
This Smarter Commerce offering delivers real-time intelligence on the
social media response to a particular brand, or the products, content
and services being offered, and enables clients to make fact-based,
accurate decisions about marketing expenditures. As a result, marketing
teams can easily attribute business impact to social referrals in the
context of other marketing programs.
Using the analytics foundation of the Coremetrics Continuous Optimization Platform™
and its complete suite of marketing optimization applications, IBM
Coremetrics Social provides cross-channel reporting and benchmark
capabilities to track and improve social marketing campaigns. With
social benchmarking, brands can evaluate the effectiveness of their
social initiatives relative to their peer companies, and understand
where they excel, and where there is opportunity for improvement.
Social networks are now routinely used to broadly share links to
special offers that companies make available via email. Well-known
brands can expect to see as much as 38 percent of
their special offer email links shared across social networks. An
average of 28 percent of these links are then 'liked' or commented on.
The new IBM Unica Pivotal Veracity Email Optimization
suite tracks and analyzes email links that are shared across social
network platforms, delivering actionable insights which marketers can
turn into recognizable profit. Unlike other technologies, this new
offering opens the door for marketers to identify, track, and improve
the perception of their brands across channels. The Social Email
Analytics software tracks all links associated with a marketer's brand
and email, not just the intended links a marketer shares. This approach
better encompasses and reflects the emerging complexities and
ramifications of consumer interactions with brands, starting with email
and ending up in the social realm. With this new software, marketers can
also hone Web pages for social networks and better identify
opportunities across channels.
For more information on IBM's Smarter Commerce initiative, please visit: http://www-03.ibm.com/press/us/en/presskit/33983.wss
For more information on Coremetrics, an IBM Company, please visit http://www.coremetrics.com/
For more information on Unica, an IBM Company, please visit http://unica.com/