The March issue of the IBM Software Newsletter has been in inboxes for just a little more than 24 hours, but already it looks like the new IBM SmartCloud offerings, announced at Pulse 2012, are getting most of the attention from our readers. Here's a quick reader's/viewer's guide you can use to ramp up on the offerings, and what they can do for your organization:
Start with the press release announcing all the new IBM SmartCloud offerings
Another SmartCloud capability, announced in advance of Pulse, is also attracting clicks: IBM SmartCloud Enterprise - Object Storage, which dramatically simplifies management of unstructured data (emails, social content, documents, and other non-row-and-column data) in the cloud. For more information, visit the SmartCloud Enterprise web page, and scroll down.
Midsize businesses may want to please customers -- but do they always know how? Paul Graham -- one of the founders of Viaweb, which was sold for a zillion dollars to Yahoo back in the nineties -- once said this: "I think most businesses fail because they don't give customers what they want."
I love that quote because it has the effect of demystifying business success. You don't have to go to Wharton to understand this root principle: Money is dear. And people won't generally spend it unless, by doing so, they expect to get something even dearer.
This being so, you might wonder why businesses don't all take Paul's advice -- just give customers what they want. I can think of three primary reasons:
1. They can't (because it's too complex, too costly or too difficult to scale -- the music industry's refusal to give away all its music for free is a good example).
2. They won't (because they think they can offer something even better -- Steve Jobs was the leading evangelist of this line of thought).
3. They don't even know what their customers want.
Of the three, by far the most common (in my opinion) is the third. This seems a little ridiculous on the face of it; how can organizations of any competence fail to know what their own customers want?
Well, the fact is that quite often they really just don't. Consider the history of Coca-Cola -- a company in one of the world's most static markets, which has been dominated by two companies, and two major products, for more than a hundred years. You'd think in this scenario it would be easy enough for those two companies to extrapolate exactly what their customers want: more of the same.
And you'd be wrong. You might recall that, based on what it considered sound empirical data, Coca-Cola tried revamping its core product in the eighties. The failure of the revamped offering was so spectacular that, 20 years later, New Coke remains one of the leading examples in the history of corporate crash-and-burn. (I myself will fake-spit if you say "New Coke" out loud.)
For enterprises like Coca-Cola, of course, cash reserves and brand momentum are saving graces in such a crisis. Sure, okay, maybe their whole strategy cratered, but they had the time, resources and customer loyalty to get their game back. Today they're doing just fine.
But what about midsize businesses? That's a little dicier. Organizations in this space are peculiarly vulnerable, to my way of thinking, because they usually don't have a giant pile of money, and a century of history and customer trust, backing them up when they take a major wrong step.
They're also, ironically, too large in certain ways. Specifically, because they have layers of management and thus need lots of authorization and collaboration to roll out major strategies, they lack the agility of nimbler startups with flatter hierarchies, who are continually sneaking up behind them with a lead pipe and a sinister smirk.
So in the midmarket, it seems to me, it's extraordinarily important to be clear about exactly who the customers are, what they want and how best to give it to them (or some reasonably close approximation of it).
Q: Where's that information going to come from? A: CRM.
CRM redux: The central source of customer insight and strategy development
Remember Customer Relationship Management (CRM)? You may have written off CRM years ago -- especially in the midmarket -- because the available solutions were seen as... hmmm, how to put this... "lame."
Even when they worked, they often required a huge initial investment, ate a lot of computational resources for lunch, weren't suitably configurable and hence didn't really pull their own weight in the IT infrastructure.
For midsize businesses, with more limited budgets and lower tolerance for risk than their enterprise brethren, no part of that sounded very appealing.
But the times are changing. So is the tech -- and the way businesses and their customers use it. As a result, CRM is experiencing something of a resurgence in the midmarket.
How and why? A few points for you to ponder:
• Modular design and open source. Current CRM solutions are almost like Legos; you can build what you need, or imagine, based on interoperable parts. Don't need a certain part? Don't buy it and don't implement it. And when those parts are based on open source code, "buying" doesn't even enter into it.
• Cloud. One of the coolest things about cloud implementations <http://www.ibm.com/innovation/us/engines/solutions_smartcloud.html> is the way they allow organizations with limited budgets to experiment safely with new ideas. Because the cost of cloud services typically reflects actual usage, the risk of creating a new service is pretty close to zip. If the service fails, it will be because it didn't see much use, so the bill will be low. But if the service succeeds, and the bill is high, it will be because the service is rocking somebody's world.
Apply that idea to CRM solutions and you begin to see how midsize businesses can dip their toes in the customer relationship waters without worrying they will get bitten off. Cloud architectures are also intrinsically scalable and self-managing, and thus help organizations increase their overall business agility -- which is exactly what midsize businesses quite often need to do.
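The pay-for-use reasoning above can be made concrete with a toy calculation. The per-request rate below is invented purely for illustration -- it is not actual IBM SmartCloud (or any vendor's) pricing:

```python
# Toy model of usage-based cloud billing; the rate is a made-up number
# for illustration, not any vendor's actual pricing.
RATE_PER_REQUEST = 0.0001  # hypothetical dollars per request

def monthly_bill(requests: int) -> float:
    """Under pure pay-per-use, the bill scales with actual demand."""
    return requests * RATE_PER_REQUEST

# A flop that nobody used costs about a dollar...
print(f"flop: ${monthly_bill(10_000):,.2f}")
# ...while a big bill only arrives alongside real customer demand.
print(f"hit:  ${monthly_bill(50_000_000):,.2f}")
```

The point of the sketch: the downside of a failed experiment is capped at roughly what it cost to serve the (few) customers who tried it.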
• Social media and the incredible new world of useful information it can deliver. Would Coca-Cola have rolled out New Coke if today's Internet had existed in 1985? I'm not at all sure of that. I think Coca-Cola would have tried a small-scale pilot project and would have tracked the customer response as comprehensively and organically as possible.
And by "organically" what I mean is not "asking blindfolded people what they think of the flavor in a taste test booth in a mall," but "aggregating and reading all available customer tweets and public Facebook posts and blog entries about this wacky new kind of Coke." That same kind of thinking, stirred into CRM solutions, is already creating a revolution in the way CRM helps midsize businesses understand who their customers are, what they want and how best to give it to them (which may be a familiar phrase if you've read this far).
• Analytics -- often applied to social media data. It's one thing to aggregate social media data on a mass scale and quite another thing to sift through it looking for meaningful patterns <http://www.ibm.com/innovation/us/engines/solutions_cognos.html> that can really take your organization from a worse place to a better one. Today, even on the scale of hundreds of thousands of tweets, midsize businesses are achieving new power to find those patterns, leverage them and ultimately get to the promised land of business success. Here, too, the opportunity to integrate with CRM is incredibly powerful and useful.
Imagine, for instance, something like a brand thermometer (my own concept). The brand thermometer gets hotter in proportion to successful new strategies and colder in proportion to the introduction of New Cokes. Using analytics-driven CRM, midsize businesses can actually create something resembling that thermometer -- tracking not just how well the IT infrastructure is fulfilling target service levels, but what people out in the real world think of their stuff.
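As a sketch of how such a thermometer might work -- purely illustrative, with a naive keyword scorer standing in for real sentiment analytics, and with invented word lists and posts -- consider:

```python
# Hypothetical "brand thermometer": average a crude keyword-based
# sentiment score over social posts. Word lists and sample posts are
# invented; a real system would use proper sentiment analytics.
POSITIVE = {"love", "great", "awesome", "delicious"}
NEGATIVE = {"hate", "awful", "gross", "terrible"}

def post_score(text: str) -> int:
    """+1 per positive word, -1 per negative word in one post."""
    words = [w.strip(".,!?").lower() for w in text.split()]
    return sum(w in POSITIVE for w in words) - sum(w in NEGATIVE for w in words)

def brand_temperature(posts: list[str]) -> float:
    """Mean score across posts: hotter = warmer public sentiment."""
    return sum(post_score(p) for p in posts) / len(posts) if posts else 0.0

posts = [
    "I love the new flavor, it is great!",
    "This is awful. I hate it.",
    "Delicious and awesome.",
]
print(round(brand_temperature(posts), 2))
```

Feed it a stream of aggregated tweets and posts instead of a hard-coded list, and the running average becomes a crude temperature gauge for a brand or a new offering.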
• Cross-integration with related business domains. What happens when all the insight I describe above is routed, in usable form, to every point in the organization that touches customers and affects customer relationships? Think of call centers, where technical specialists try to resolve problems or sales pros take orders. Those folks can really benefit from CRM-based insight and understanding, in ways ranging from messaging/positioning to knowing what kinds of problems to expect to drumming up new sales prospects. They can also, based on their experiences, generate new data, which is fed back to the CRM solution for further analysis.
Call centers are really only the beginning of this idea; other, similar possibilities would include retail presences, on-site service personnel and even corporate branding and marketing. CRM-driven clarity about customer interests could even be used to guide midsize businesses as they choose their strategic business partnerships, merge with other midsize businesses or acquire smaller players.
A few weeks ago IBM announced that it was transitioning its Symphony productivity suite development effort to Apache, to work on the OpenOffice project. This news raised questions about what IBM's strategy was in the document space. I would like to set the record straight. Let me be perfectly clear: IBM is not stepping back from its investment in Symphony. We are donating our Symphony code to the Apache Software Foundation and directing our development effort to work directly on the combined code in the Apache OpenOffice community. We plan to continue to distribute OpenOffice technology, though likely not under the Symphony brand but instead under the name "Apache OpenOffice 4.0 IBM Edition," or something like that.
I have further been asked, by many of late, to explain what IBM's document strategy is and why it has chosen to engage with the Apache Software Foundation. To properly answer that question a discussion of the back story is in order.
OpenOffice has been the leading open, multi-platform, free document editor suite in the market for a decade. Many companies distribute it, including IBM (in a version called Symphony, which adds a number of IBM enhancements). The project had been led and managed by Sun and then Oracle (by virtue of its purchase of Sun). A couple of years ago a number of community members decided to fork the code and created a derivative known as LibreOffice. A little bit later, Oracle decided to donate the OpenOffice code base and trademarks to the Apache Software Foundation and encouraged a community of interested parties to co-invest and work towards delivering new innovation and value to the market in an open community foundation. IBM and many other parties, new as well as longstanding OpenOffice community members, have been working on the Apache OpenOffice code for about eight months now.
LibreOffice and Apache OpenOffice are very similar and share much of the same code. They are licensed under different licensing regimes. LibreOffice is licensed under a copyleft regime (LGPL) and Apache OpenOffice is licensed under a permissive licensing regime (Apache 2.0).
For some background on open source, licensing and open source business models I recommend taking a look at a paper I wrote a number of years ago:
So what is IBM's document strategy and why did it decide to work with the Apache OpenOffice project instead of the LibreOffice project?
We think that documents represent a very pragmatic way to capture, record and share information. They provide a powerful and flexible substrate for bringing ideas together, refining them and structuring them in a manner that efficiently communicates those ideas.
The world is awash in unstructured data. Much, if not most, of it is in the form of documents, and their number and importance in creating value will increase quite dramatically in the next several years. Most IT storage specialists, including those managing cloud resources, are planning for unstructured data to surpass structured data in terms of storage sometime this year. More than that, the value of those documents is being magnified dramatically: multi-user co-editing tools are becoming available; documents are being enriched with semantic, linked-data capabilities; they understand more and more of the social business context in which they are created, evolve and can potentially be used; and advanced text analytics, pattern analytics and social analytics allow deeper insights to be garnered, activities to be better prioritized, and decisions to be better and more efficiently informed. All of these things represent significant value to our customers and... well... we are in the business of delivering value to our customers.
...So we care about documents.
Our interest in documents goes far beyond document composition and includes storage and content management, Big Data, various forms of insight-enhancing analytics, Deep Q&A (Watson for example), attention management, risk and trustworthiness, business intelligence, etc... and we are investing in all of them.
Of course, document composition is an important part of this value creation web, so it only makes sense that we invest in it as well. Document editing suites have long been an important fixture of the PC era, and will continue to be important for many years to come, although commoditization pressures will continue to reshape that market. Other significant document technologies have recently come to prominence, most notably mobile editing, collaborative web editing and wikis. Undoubtedly other new innovative models will emerge as well. Many of these new models address some of the shortcomings of the traditional document editing suite or leverage some new communication or social infrastructure, but they are also in many ways adjacent and complementary to traditional document editing suites. That being said, document editing suites still offer power features, functions and experiences that are very useful in the document creation and editing cycle. IBM's view is that many tools will coexist and complement each other, and as a result we need to make investments in many parts of the document ecosystem and work to extend each of them toward the others and allow them to work together. That is where you will get serious value creation acceleration.
Along with the IBM Connections social business suite, which has blogs and wikis and such, and the content management and analytic technologies mentioned above, IBM is also making a large investment in a web-based collaborative editing tool called "IBM Docs" (you can try a pre-release version at https://www.lotuslive.com/en/catalog/labs). This tool not only integrates seamlessly with team content repositories and allows multiple people to co-edit a document simultaneously, but it also integrates with the Activities workflow engine to let people assign sections, reviews and to-dos -- and, of course, complete those assignments, mark them as done and track the progress in the Activity.
That brings us to desktop editing. Apache OpenOffice is an important piece of this ecosystem, and it needs to evolve and innovate to take its rightful place among leading document tools. Oracle's decision to donate the code and trademarks to Apache opened up a wonderful opportunity to assemble companies and individuals that have a vision for the future and to focus their energies and resources. Community development of one of the core pieces of the grander document landscape, a piece that is very mature and has been rapidly commoditizing, makes sense.
So why did IBM decide to invest its resources at Apache? Apache is an open community with a well-known brand and a mature, proven governance model. This reputation allows for the efficient recruitment of co-investors and gives confidence to customers of the technology, thereby lowering barriers to consumption. Along with its well-earned reputation, the fact that Apache is a large and diverse organization means that it comes with the obvious advantages of economies of scale. Apache also promotes a licensing regime that lends itself to innovation and to the participation of well-resourced organizations, and that enjoys a higher comfort level among many corporate customers.
The question of licensing models is for some a key issue that merits examination. The LibreOffice community works with a copyleft regime. This is partly because, when that community forked the code from the OpenOffice project, it was the only licensing regime available to them, and partly because the core of that community comes from Linux vendors that are very comfortable with, and have had prior success with, copyleft licensing. This does not mean that they have failed with permissive licenses; it just means they have more experience, and a greater comfort level, with copyleft regimes.
Copyleft licenses rely on a viral mechanism to enforce disclosure of code modifications. Basically, if you are benefiting from the code, you contractually must disclose any modifications or enhancements you make. By and large, there is a significant trend towards permissive licensing and away from copyleft licenses (see: http://blogs.the451group.com/opensource/2011/06/06/the-trend-towards-permissive-licensing/). In what most people would consider counterintuitive, copyleft licenses are more prevalent among vendor-led open source projects. The reason is that some vendors choose to run a dual licensing business model, where they put the code out under a restrictive copyleft license and ship a commercial license themselves. They usually combine the licensing regime with a contributor agreement, which means the intellectual property is aggregated and owned by the sponsoring vendor. This provides the sponsoring vendor with the unique advantage of being able to distribute and package the code as they see fit under a commercial licensing regime. This is exactly the business model that Sun used with OpenOffice and, as I mentioned previously, the reason that LibreOffice could only fork the code under a copyleft license.
A permissive licensing regime uses a different mechanism. In a permissive licensing based community, individuals and companies contribute significant value to a project because they sincerely believe that the project will be stronger for their contribution and that the marketplace, and by extension their own interests and any derived code, will be healthier because of the success of the project. Apache is but one of many communities that have demonstrated they can thrive with the driving motivation of the value of co-investment. As the term �permissive� suggests, there is more flexibility to re-package and extend the code from these projects without ceding intellectual property. This makes such projects more attractive to corporate vendors, and facilitates corporate investment of resources.
Both copyleft and permissive licensing based communities have demonstrated that they can harness community creativity and create value. With the opportunity that Oracle's donation of OpenOffice to Apache presents, IBM has decided to invest its resources there, and to actively encourage others to do the same. Furthermore, IBM has decided to contribute its Symphony code to the Apache OpenOffice project. This is a very large donation, which includes an accessibility framework, bug fixes, performance enhancements, feature extensions and significant UI enhancements.
The work at Apache is progressing. The community has been working on code and licensing hygiene. They have integrated many bug fixes and added some exciting new functionality, including new SVG capabilities. The Apache OpenOffice 3.4 release will be posted shortly.
A draft of the AOO 3.4 release notes is in the wiki here:
If you want to receive an email notification when Apache OpenOffice 3.4 is available you can sign up for the project's announcement list by sending an email to email@example.com.
Now that the absorption of the OpenOffice donation has been finalized, IBM can make the Symphony code donation. The resulting merge of all of this new capability should appear in a 4.0 release later this year.
We are very encouraged by the progress that LibreOffice is making. There are many areas where the two communities do and should work together. Security and support, document interoperability and standards compliance are obvious ones. A significant amount of new technology will become available from Apache OpenOffice in the near future, especially once the Symphony code gets integrated. We encourage the LibreOffice community to leverage as much of it as they want. Apache's permissive license allows them extraordinary flexibility to do so. We would also love to see LibreOffice contributors share their work with Apache OpenOffice. While the interests of these two communities are not identical, they are most definitely aligned and complementary.
As we move towards a web of linked data, deep Q&A and analytic-driven insight, significant new value will be created. We are hopeful that a collective community effort in one of the core elements (document creation and editing) will help accelerate this.
Two sides of the proverbial coin today as two sessions explored both sides of the "content" question. The first focused on how to create content that keeps your customers coming back; the second on the strategies you need to build to manage and govern it at scale.
Video (YouTube, Hulu, Vevo) is the third wave of broadcasting. Networks were the first, cable the second.
Most brands fail to take advantage of video's two chief attributes: an instant global audience and billions of devices.
Smart and entertaining content can breathe new life into expired or archived brands. Warner revived the Mortal Kombat brand after watching a short fan-made video on YouTube and giving its director the chance to direct the feature. A low-budget feature was better advertising than actual advertising, for a lot less money.
In-game ads convert better than most other ads.
Online content can combine different ROI metrics (reach, impressions, traffic, pipeline, etc) into one package.
Brand marketing isn't about Thursday-night shows or one-time events. It's a programming model that delivers content to audiences all the time. Need an example? Think Red Bull.
Generally speaking, online content is out of control. Copywriting is not a Content Strategy.
Content Strategy includes archiving, governance, the user interface, the content and the coding that makes it come to life.
You're not writing documents. You may not even be writing. You're packaging ideas.
A content strategy has two sides, each with two distinct aspects. The content aspects are substance and structure. The people components are workflow and governance. Most content strategists pay too much attention to the former and not nearly enough to the latter (see point #1).
When building a content strategy, plan a little bit, then jump in and do. If you need signoff from other stakeholders, start the conflicts early and keep the interactions going.
Not sure if it will work? Do a pilot. Do a pilot. Do a pilot.
Content strategists absolutely deserve a seat at the executive table. If only they'd stop whining that they don't.
I've been writing quite a bit lately about the importance of perspective and context -- about why in our interconnected world we need a diversity of perspectives to properly understand a challenge and create a context that helps us understand what's coming at us.
Today it's time to turn the lens around the other way.
I'm the social business strategist for IBM Software, a multi-billion-dollar business that's changing the way the world literally works. As such, it's my task to bring you stories of that change in engaging and meaningful new ways. In other words, to provide you with the perspective and context you need to evaluate our solutions and choose a partner for your own success.
I need to be better at providing you with perspective and context. So I've spent the last two months creating a plan that will help the IBM Software social presence achieve that goal. Frankly, some of these ideas are not so much new as long overdue. Others, however, are new, and I hope to bring you more.
That's why I came to Austin. There are more than 5,000 sessions to choose from here. But rather than snack and skim across myriad topics, I'm going to concentrate on sessions and keynotes focused on these areas:
Yesterday I wrote about the importance of perspective and context in helping us understand the new. Essentially, the more varied the perspectives we can bring to a situation, the better our understanding of the challenges at hand and the more effective the solutions we devise to solve them.
Happily, I'm not the only one who thinks along these lines.
Another is IBMer Marcela Adan, a 33-year IT veteran whose career spans development, consulting, tech support, skills transfer and product management. Her post is ostensibly about today's news of new IBM expert integrated systems, but Marcela focuses on the importance of the "aha!" moment she experienced working on a life sciences project more than 10 years ago. It was a complex project that brought together IBMers with expertise in a wide range of technical and scientific skills -- a wide variety of perspectives, in other words -- to improve outcomes for a client she'd never worked with before. Each team member's CV was no doubt impressive; yet Marcela's "aha!" moment came the moment she realized the project would only succeed if the team put its collective skills together in an entirely new way.
This time, the client was in a very different business: human life. Maybe that is the reason why the lessons learned in this project hit me so hard. As humans, we all can relate to the importance of events that affect our quality of life.
The second post is by IBMer Simon Hodkin. He's also writing about expert integrated systems, and again the issue of perspective comes up. This time, though, it's the ever-present, ever-shifting dynamic between IT and business users when it comes to the technology buying cycle. Anyone who's been in this situation knows the process seems simple, but as Hodkin explains, it's anything but. In the end, there is no one size that fits all:
If it helps, the functional requirements are largely IT-dictated while the non-functional requirements are more directed by business-value. These days, having to do more with less, turning into a service-oriented organization and responding to the needs of the business are all common themes. This is a practical area where you as an IT professional can start to make a difference to your business.
As the challenges we face on our planet grow increasingly complex and increasingly interconnected, so will their solutions call for an increasingly sophisticated combination of diverse skills and experience. We cannot solve our problems with the same thinking that created them. In ushering the world toward a new era of expert integrated systems, IBM brings its decades of experience in each of those words for a powerful solution. On April 11 IBM begins this new era. Care to witness?
In 2010, a time capsule sealed in 1905 was opened in the French city of Moulins. It contained, among other things, stuffed birds, skulls, stained-glass windows, an electric chandelier and a flushing toilet. Its creator, Louis Mantin, wanted it reopened as a museum in his hometown.
In 1940, an Atlanta man named Thornwell Jacobs created his own time capsule containing -- again among other things -- papier-mache models of fruits and vegetables, 200 books of fiction, a Donald Duck doll, a Kodak camera and a flute. It won't be opened until the year 8113.
In 1977, a NASA time capsule project led by Carl Sagan was shot into the Milky Way Galaxy. Its contents include a gold-plated record with recordings of Bach, Mozart and Chuck Berry, greetings recorded in 55 languages and an X-ray of a hand.
The date of its opening is still TBD.
Life as it was lived
Each of these time capsules contains a snapshot of life as it was lived in the moment. To their credit, their creators left a testament to our collective achievements and a message to posterity about how we want to be remembered. When opened, the contents can in turn help us better understand the hopes, dreams and fears of the past, and how far we've come in the intervening years.
SXSW is a conference about understanding the present and creating the future. The buzz about it began weeks ago, and is set to continue into next week. If we captured the inevitable tidal wave of tweets, check-ins and iPhone snaps in a capsule and preserved it for posterity, what would it say about us? More importantly, what would we want it to say?
Perspective and context
Perspective and context are key to understanding any new phenomenon. And the current disruptions that mark our increasingly data-driven and socially soaked era are severely lacking in both. I've come to SXSW to broaden my own perspective on our tech and our times, and to gain the broader context that can help me assess what's really important to my job and to you, our readers.
Within its beautifully designed and perfect-bound 220 pages are the words of poets, novelists, politicians and priests stretching back into antiquity. It was a fascinating and illuminating read. Because while we struggle to grasp the privacy implications and marketing opportunities of geo-location, we'd be fooling ourselves if we thought ourselves unique in grappling with dramatic technological and social change.
What's often missing
What's often missing in the breathless coverage of the latest social app is an awareness that entire empires have come and gone without electricity or the printing press. What we need now is not another killer app. What we need is a greater perspective and a broader context to understand what that app means and why it should be considered "killer" at all. Over the next seven days, I hope to find some answers and share what I've learned with you.
And, we asked you to dream the impossible dream of a world where a single vendor might deliver a family of products, including reporting, analysis, modeling, planning and collaboration, which would also balance analytic freedom with governance and control.
This dreamland is no fairytale, and we are happy to report it does have a very happy ending.
• Individuals, who need the freedom and flexibility of personal analytics yet want to access corporate information and easily share insights across a wider community, with IBM Cognos Insight
• Workgroups and midsize organizations, which need to be up and running with a solution that is easy to install and manage, with IBM Cognos Express
• Enterprises, which require broad analytic capabilities deployed to hundreds or thousands of users, with IBM Cognos Enterprise -- an offering that brings together the integrated capabilities of IBM Cognos business intelligence and IBM Cognos TM1 performance management.
With the IBM Cognos family of Business Analytics solutions, IBM addresses the full breadth of analytics, with the family spanning reporting, analysis, modeling, planning and collaboration.
And the solutions ensure that any organization can begin its journey today, based on its specific requirements, with the confidence to expand the solution later -- without retraining, retooling or re-implementing.
Please watch the short video below that describes how we might "prescribe" the family of solutions directly to a business.
For example, if an individual user wants to work independently and quickly without waiting for corporate systems, and then share those insights or create additional reports from larger data sets, an organization can easily add server capabilities to combine insights with real-time and corporate information. Those same insights can then be shared on scorecards and dashboards and sent directly to mobile devices.
As Sister Sledge once sang, "We are family! I got all my [analytics] and me!"
We invite you to join and interact with our growing family.
We'd also love to hear how you are using IBM's business analytics solutions to address your specific business needs -- from quickly gaining insight into the business to taking action and driving results. Please tell us your story in the comments section.
Much has been made about the topic of security intelligence over the course of the last few weeks, especially in light of IBM's recent announcements around the integration of the QRadar Security Intelligence Platform with our other core security competencies such as endpoint management, threat mitigation, database activity monitoring, identity and access management and application vulnerability scanning. We've talked a lot about the technology that surrounds the security intelligence world, but to me one of the most interesting elements of this discussion is around the people who actually deploy and use the technology every day.
Yesterday I was listening to John Kindervag of Forrester Research talking about the things we need to do to "get off the reactionary hamster wheel of security." What does this mean? Well, at a high level, it means what any security professional would expect it to mean: we need to stop fighting the losing battle of reactionary security. That doesn't mean there won't be incidents you need to react to, because that will always be the case. Rather, he provided a set of ideas around how to more proactively and comprehensively address security, and an incident response plan was certainly an element of this. While incident response is certainly about reacting to events, it can still be part of a pre-established security plan.
There were five main discussion points that John had:
1) That the security leader will need to evolve from a technical leader to a more strategic player in organizations
2) That we should embrace a zero trust model
3) The need to better understand, control and even "kill" your data
4) Embrace security analytics
5) Plan for failure
What stuck out to me was the relationship between security intelligence/analytics and the way we see the role of the CISO changing. To begin with the CISO: John started much of this discussion by noting that the security team, basically by necessity, has to include some of the smartest people in your entire organization. The amount of complexity they need to deal with, using limited resources, is at best a daunting task. However, John also said that the CISO can no longer be just a technical leader; they need to be a strategic business leader, and in doing so open up new opportunities, and maybe even new budget, for their organizations.
What struck me most, though, was the sense that the security team will also need to be among the most tenacious groups in order to be successful. On the one hand that means fighting for budget and resources; on the other it means an attitude change to "we sweep nothing under the rug." John mentioned a few discussions he had where organizations didn't have the mechanisms in place to understand whether they had been breached, and in some cases the lack of insight was driven by a sentiment of "what we don't see, we don't have to spend time and money fixing." Two things jump to mind here: first, the need for measurements of security success that go beyond the breach (and, as John would add, are not about cost savings); second, the need for organizations to embrace the idea that they must see everything that potentially impacts their security posture. This is where security analytics come into play.
Earlier in the day yesterday, before John spoke, Brendan Hannigan, the GM of IBM Security Systems, also delivered a talk on security intelligence, and he joked with the audience about how they were logging more than they knew what to do with, and IBM was going to ask them to bring in more. Like John, Brendan was arguing for the need to see more. However, both men also realize that we very quickly reach a point where the volume of data is not possible for a human to analyze and act on. For that reason, organizations need security technology, such as the QRadar Security Intelligence platform, to distill, using analytics, vast amounts of data into a smaller, more manageable number of security events that require investigation.
John closed with some thoughts on how important the profession is in general. This is something I think anyone would agree with, especially as more and more of our lives are lived through various digital channels. Today's security leaders are responsible not only for their own business, but also for the sensitive personal information of their clients and customers -- or in other words, taken as a whole, basically everyone in the world.
To read more, IBM is doing a series of papers on Security Essentials for CIOs over the coming months, and that series can be viewed here.
Cloud computing has become, in certain ways, the eat-right-and-get-some-exercise of IT infrastructures.
By this I mean that everybody's heard the message, and everybody knows the potential benefits... but not everybody actually follows through to the degree they could, or should, to get the best possible results. Even in 2012, the world is full of organizations that remain cloud holdouts. (I won't go so far as to call them cloud Luddites.)
Now, there are a number of valid reasons for this reluctance -- security and compliance, for instance, are major worries for certain sensitive applications, which aren't likely to migrate outside company walls any time soon.
Guaranteed performance is another common issue. For certain particularly business-crucial applications, like ERP, many organizations are simply not willing to trust a shared architecture like cloud in which many different services execute in parallel. So instead they're sticking with a tried-and-true, dedicated architecture to play it safe.
This, however, means that the information locked away in those applications can't easily be leveraged in other ways, and for other reasons -- very awkward and unfortunate for business purposes.
Fortunately, there's a good compromise: hybrid cloud models that deliver a sort of best-of-both-worlds approach. In short, you put your cloud-friendly apps in the cloud, leave the other apps (perhaps compliance-sensitive or ERP apps) in your conventional, in-house infrastructure and then integrate them as cleanly as you can to meet your needs.
Getting this done, however, means finding clever ways to get information flowing as it should between the two architectures. And by clever, what I really mean is fast, cost-efficient and yet complete, migrating all the information you want (and none of the information you don't) into the cloud.
How to make that happen? One way would be to try and custom code the interfaces between these apps.
But anybody with IT experience is probably already cringing at that idea. It might yield complete results, but it's not likely to be either fast or cost-efficient.
Is there a pragmatic plan B? Turns out there is.
Accelerate almost any hybrid cloud initiative via fast, seamless information integration
Recently I talked with Chandar Pattabhiram, who drives go-to-market strategy for the IBM WebSphere® Cast Iron product line. And he confirmed for me that indeed hybrid cloud models are increasingly attractive -- if you can take care of your information-migration needs in a business-optimized way.
"It's a hybrid world today and will continue to be so for a long time," said Pattabhiram. "Integration has become a critical component of this hybrid world because companies need to rapidly connect the new cloud services they're adopting with the rest of the on-premise applications. And that's where IBM WebSphere Cast Iron Cloud Integration capabilities can really lend a helping hand."
Does "Cast Iron" ring a bell for you? If you're an IT pro, you may recall that in 2010 IBM acquired Cast Iron -- a leading provider of solutions designed to integrate cloud and in-house apps in an accelerated way.
The Cast Iron technology thus turns out to target the exact 2012 scenario I describe above -- a company wants to link its own apps seamlessly with cloud apps in a hybrid model, generating the least possible complexity, costs and risks along the way.
"Integration has become the 'productivity application' for cloud computing," said Pattabhiram. "Without integration, cloud users can wind up 'swivel chairing' -- trying to alternate between two completely different architectures to get access to critical business information in a rather clumsy way. But with integration, they get all the information they want in one place: the cloud. The net result is that integration helps companies maximize productivity, increase adoption and maximize the value of their cloud investment."
Drag and drop your way to cloud nirvana
How exactly does IBM WebSphere Cast Iron Cloud Integration work this magic? The answer is basically threefold: (1) Out-of-the-box templates and (2) special functions, both of which are managed via a simple drag-and-drop interface, and, if necessary, (3) custom scripting to handle the rare odd case.
Let's look at the templates first -- the heart of the solution. These have been developed based on the premise that companies struggling with integration issues are quite often dealing with the same groups of applications.
I mentioned ERP before; SAP apps are a good example along those lines. And migrating the information from SAP into the cloud really means, typically, migrating it into a particular cloud environment/application. One very common example: Salesforce.com.
So, to reflect this situation, the Cast Iron solution includes hundreds of templates to perform such jobs, each designed for a particular type of migration such as SAP-to-Salesforce. And in the majority of cases, a template will be found that (following a wizard-driven Q&A and basic validation checks) does the necessary job right out of the box.
How does that sound in terms of our previous evaluative criteria ("complete, fast and cost-efficient")? Pretty fair, I'd say.
Now, there are certainly going to be cases where not every data record lines up perfectly between the two infrastructures; a little jiggering may be required. In scenarios like that, the Cast Iron solution also provides a range of handy data modification functions. Imagine, for instance, that you need to combine two text strings from the SAP data set into a single text string in the Salesforce application. To do that, you could use the concatenation function, which glues the two strings together. Problem solved, and we still haven't left the drag-and-drop interface.
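To make that concatenation scenario concrete, here's a tiny sketch of the mapping logic in Python. This is purely illustrative: it is not Cast Iron's actual API or configuration format, and every field name here (FIRST_NAME, LAST_NAME, KUNNR, Name, AccountId) is invented for the example.

```python
# Illustrative only: not Cast Iron's API. All field names are invented
# to show the kind of transformation a concatenation function performs.

def map_record(sap_record):
    """Map a hypothetical SAP contact record onto a Salesforce-style
    record, concatenating two source fields into one target field."""
    return {
        # Two source name fields glued together into one target field
        "Name": sap_record["FIRST_NAME"] + " " + sap_record["LAST_NAME"],
        "AccountId": sap_record["KUNNR"],  # customer number carried over as-is
    }

source = {"FIRST_NAME": "Ada", "LAST_NAME": "Lovelace", "KUNNR": "0000012345"}
print(map_record(source))
# {'Name': 'Ada Lovelace', 'AccountId': '0000012345'}
```

In the Cast Iron world you'd express this kind of rule by dragging a concatenation function onto the mapping canvas rather than writing code, which is exactly the point of the "configuration, not coding" pitch.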
So when you add up the convenience and capabilities, IBM WebSphere Cast Iron Cloud Integration strikes me as a tidy solution to a very common problem. Furthermore, thanks to the way it can be tweaked and modified as needed, it works well even in cases where the in-house app is completely homegrown, and there's therefore no template available.
Pattabhiram sees things the same way. "The templates are remarkably comprehensive, but, no, they won't work for all scenarios," he said. "Still, even for home-grown applications, Cast Iron's 'configuration, not coding' approach is the way to go -- much faster and much less expensive than trying to custom code the interfaces between these apps."
The final step, once you've created the new orchestration across the two architectures, is to export it to an appropriate form factor for your needs. Specifically, we're talking about one of three options: (a) a physical server, (b) a virtual server or (c) a cloud-based service. The Cast Iron solution can be used for all three. That's a range of choices to fit any customer's requirements, and it also avoids locking them into a specific architecture or business process that they might want to change down the road.
"Integrating the cloud doesn't always really mean integration in the cloud," said Pattabhiram. "What we've seen is that customers choose amongst a variety of form factors -- physical appliances, virtual appliances or integration as a service -- for their cloud integration needs. The key is to provide this flexibility of deployment options to customers depending on their size and IT environment."
Maybe all of that sounds a little theoretical to you, and you'd like a little proof of concept? Take a look at the situation faced by Siemens Energy.
These guys faced the exact scenario I describe above -- an SAP-to-Salesforce hybrid cloud integration for significantly faster mirroring of information and key performance metrics across the two environments. And not only did the Cast Iron solution get the job done, it got it done in under two weeks.
How does your organization measure up? What's your cloud integration strategy?
Guest blogger Wes Simonds worked in IT for seven years before becoming a technology writer on topics including virtualization, cloud computing and service management. He lives in sunny Austin, Texas and believes Mexican food should always be served with queso.
Java and all Java-based trademarks and logos are trademarks or registered trademarks of Oracle and/or its affiliates.
I'm not normally one to engage in bad puns, but in this case I couldn't resist. You see, when more than 8,000 service management professionals descend upon the MGM Grand for our annual IBM Pulse conference and start tweeting, it's bound to cause some tremors in my social channels.
The theme of this year's conference is "Enabling Business Without Limits," and over the next four days attendees will learn how to fundamentally and cost-effectively change the economics of IT and speed the delivery of innovative products and services.
With a curriculum boasting top-notch education, networking and just a little bit of fun, Pulse 2012 is helping today's leaders react with agility in changing competitive landscapes, reduce vulnerability throughout the service lifecycle, and continuously improve the business impact of technology.
Pulse 2012 will address a multitude of audiences and industries with sessions that demonstrate how to apply the tools and best practices to help your organization achieve business without limits through:
Transitioning to smarter, more flexible delivery models such as cloud
Converging digital and physical infrastructures to improve economics and speed service delivery
Managing rapid growth in data, security threats and compliance requirements
Leveraging mobile, web and instrumented endpoints
The discussion started in the opening general session, in which IBM Tivoli VP Scott Hebner, SVP Software Middleware Robert LeBlanc and Tivoli GM Danny Sabbah outlined the ways in which organizations can gain the Visibility, Control and Automation they need to achieve said "Business Without Limits." Live video was available through our Livestream channel, and we'll be covering the opening keynotes for the next three days, so be sure to check the schedule over on the Pulse Web site.
Speaking of staying connected:
Here are the other ways you can feel the Pulse and even help speed it up from wherever you are:
Follow the #IBMPulse Twitter tag or any of our Twitter accounts like @ibmpulse, @ibmtivoli or @ibmcloud.
Follow our intrepid and inexhaustible blogger, "Turbo" Todd Watson for a relentless stream of insightful posts on the event in near real-time or watch him and his partner Scott Laningham interview IBM experts on Livestream.
Bookmark the IBM Pulse Web site for info on speakers, agenda topics and live video.
Pulse comes to a close with nothing less than a discussion with Apple co-founder Steve Wozniak and we're giving you the chance to ask the questions! Submit your question via Twitter with the tag #askwoz and be sure to watch if your question is selected!
If for some unimaginable reason you're not already an IBM Software Newsletter subscriber, March offers a couple of extra incentives to subscribe:
Our annual (and always popular) Pulse recap. Our March issue (which mails March 14) will connect you to all the top product announcements made at Pulse 2012 (March 4-7 in Las Vegas) -- and to video from the show, follow-up events (online and in-person), and other relevant resources. It's the perfect cheat sheet if you can't attend the conference; it's also a great reference for attendees who want to review what they saw, or what they missed.
Our first System z Edition of 2012. If you're a System z customer, just check "System z Software" on the subscription form to receive the March System z Edition -- which includes information on the latest System z software releases, PLUS the top Pulse 2012 announcements, PLUS other information important to System z users. (You'll also get three more System z Editions of the newsletter during the year as part of your subscription.)
But -- and this is important -- you must subscribe by March 8 to get either of these March issues. So subscribe NOW.
I'm a sucker for anything related to analytics and sports. I'm also not alone.
Ever since the Moneyball craze, the general populace is keenly aware of the benefits and value analytics brings to the world of baseball. It's hard to imagine baseball without thinking about Michael Lewis, Brad Pitt, the Oakland A's and analytics. Heck, even ESPN The Magazine this week created an entire issue dedicated to "Analytics."
I attended a Chicago Cubs and Bloomberg Sports event on Tuesday night here in Chicago, where Shiraz Rehman, the new Assistant General Manager of the Chicago Cubs, discussed the team's use of analytical tools to evaluate players for drafts and free-agent signings, do advance scouting, and plan for games.
I've never seen a room (of mostly men) so intently hanging on every word. Analytics is so much a part of baseball's lexicon that box scores have become an antiquated way of understanding a baseball game. The cravings have shifted from home runs and batting average to more sophisticated statistical measurements, such as UZR, WAR, VORP, OPS+, FIP and BABIP.
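For the curious, one of those stats is simple enough to compute by hand. BABIP (batting average on balls in play) comes straight from standard box-score counting stats; the formula below is the conventional one, while the sample season line is invented:

```python
def babip(hits, home_runs, at_bats, strikeouts, sac_flies):
    """Batting average on balls in play: (H - HR) / (AB - K - HR + SF).

    It measures how often a ball put in play falls for a hit, stripping
    out strikeouts and home runs (which never touch a fielder's glove).
    """
    return (hits - home_runs) / (at_bats - strikeouts - home_runs + sac_flies)

# Hypothetical season line: 180 H, 25 HR, 600 AB, 110 K, 5 SF
print(round(babip(180, 25, 600, 110, 5), 3))  # roughly .330
```

League-average BABIP hovers around .300, which is why analysts use it to flag hitters (or pitchers) whose results look luck-inflated or luck-deflated.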
Some of the interesting revelations from the event included:
• Mobile BI is running rampant in the clubhouse. Players love their iPads for consuming reports and evaluating video of themselves and the competition.
• Old-school scouts and new-school analytical thinking co-exist peacefully in the front office and rely on each other's insight to make smarter decisions.
• Even with all the analytical tools, baseball teams still miss "all the time" on prospects, especially 18-21 year olds, whose behaviors are still difficult to predict.
• Teams might be equipped with the best analytical tools and data, but if they don't have people with the right skill sets and creativity, they will fail.
If you want more insight into how a baseball team is using business analytics, register for an upcoming IBM Performance or IBM Finance Forum event to hear Paul DePodesta, VP of Player Development & Amateur Scouting for the New York Mets. He'll be speaking in various cities throughout the United States and Canada (San Francisco; Dallas; Morristown, NJ; Huntington Beach, CA; Montreal; Charlotte).
While baseball might be the analytical poster child, all sports are wildly interested in using this technology for competitive advantage on the field/hardwood/ice, and to run their business operations. In fact, the Mecca for those in this profession is the annual MIT Sloan Sports Analytics Conference taking place this weekend in Boston.
Based on IBM's experience, below are a few other ways business analytics is transforming the sports world:
• Improving the customer experience. The Miami Dolphins are integrating IBM analytics technology into Sun Life Stadium, enhancing the overall experience for fans of sports, music and media. As a result, officials can gain immediate insight into all stadium operations, including visitor traffic, fan spending preferences and weather patterns, as well as social media sentiment. Read the press release and watch the video below.
• Guiding draft selections. By assessing future player performance and then leveraging the insights from the analytics to create business rules, teams can decide whom to select based on positional needs, previous draft selections and other factors.
• Maximizing schedules and ticket pricing. Analytics helps teams optimally balance revenue generation and travel efficiencies, as well as optimize ticket prices based on days of the week and opponents.
• Optimizing performance. Hockey teams are using analytics to evaluate how players perform in specific situations, such as at the end of a close game or during a power play. This analysis helps coaches determine which players should be on the ice at certain times of the game.
• Simulating winners. Seen the IBM SlamTracker? It's a live scoring and analysis tool that applies IBM business analytics to give fans a virtual seat at the tournament. It allows millions of fans worldwide to track players' progress and see the Keys to the Match, which show the particular strategy players should take to improve their chances of winning.
As you can see, analytics is finding its way into games in many ways. For more information, visit the IBM Smarter Analytics website.
But let's not forget that it's soon to be baseball season, and hope, like analytics, springs eternal for all teams.
And now that the Chicago Cubs have Theo Epstein, Shiraz Rehman and their band of analytical savants, all I can say is "watch out!"
A hundred-plus years of misery are only an algorithm away -- or so I hope.
Guest post from Becky Smith, Product Marketing, IBM Business Analytics
In a way, doesn't everyone in business analytics have a little bit of Don Quixote in them?
We're all on a bit of a quest to find answers hidden in our never-ending, growing piles of data. And often, we find ourselves tilting at windmills and fighting futile battles with IT, with spreadsheets or with siloed applications that only do one specific task.
Close your eyes for a minute and let yourself dream about a world where there is a single vendor that delivers a family of products including reporting, analysis, modeling, planning and collaboration.
What if that family of products not only has a common set of capabilities but is also integrated and talks to other members of the family so they all live happily together? And, what if a mid-sized business could get the same self-service experience, what-if scenario modeling and the ability to discover and assemble their data across both business intelligence and performance management as a large enterprise?
This is a notion we call "right sized analytics." An enterprise solution might be too big and expensive for your organization, while an individual point solution might be too small and limiting and unable to provide the needed integration with a mid-sized or larger enterprise solution.
Right sized analytics means having an entire family of products built on a common foundation that enables individuals, workgroups and enterprises to access and analyze corporate information and easily share insights across the organization. Basically, it doesn't matter where you begin your analytic journey, as long as you have the comfort of knowing that the solution will grow with you.
For example, an organization might have an issue with customer retention. Through the analysis of support calls, an individual user might uncover a 15 percent increase in dissatisfied customers due to a product defect. This information can then be shared back into the analytics solution so the manufacturing organization can improve product quality, and the support organization can plan special promotions and increase customer service for the entire customer base.
Does this scenario only exist in your subconscious?
That one man, scorned and covered with scars,
Still strove, with his last ounce of courage,
To reach ... the unreachable star ...
In feedback from our customers, they've talked about the battle scars they've received in their quest for a common set of systems working seamlessly together. Stay tuned for part 2 of this post to hear why that business analytics star is no longer unreachable, and how a complete family of products can:
• Address the needs of your organization -- regardless of size
• Empower individual users, workgroups and enterprises
• Allow the organization to start anywhere and go everywhere
Earlier this month I presented at an IBM event entitled "The New Landscape: Social Business, Mobile Analytics, Modern Technology." It was a rather intimate affair that saw some 35 CFOs from the greater Toronto area gather at the Gardiner Museum. On the agenda was a four-course meal by top chef Jamie Kennedy, a custom wine pairing by noted wine writer David Lawrason and a discussion about social media and the CFO.
Facebook had just filed its S-1 the day before, so the timing of the event really couldn't have been better. But before the tasting notes and talk of terroir, I had been asked to present on the implications for finance professionals of "going social" and the analytics opportunities such a move would present for their companies if done well. We followed my talk with a demonstration of IBM Cognos Consumer Insight.
Here are a few highlights, and my presentation as well.
Social Media is at an inflection point in its evolution. It's an inflection driven by maturing platforms, the addition of mobile computing and increasingly sophisticated analytics.
Like analytics, social media is a transformational technology that will impact your business whether you're "doing it" or not. And like analytics, there will be leaders, learners and laggards.
Social media adoption is quickly expanding beyond Marketing and PR and into core business processes such as customer service and R&D.
Social media data is driving the need for big data strategies to manage volume, variety and velocity.
With responsibility for risk, brand reputation and IT, Finance departments need to understand what's happening in the social sphere, even if their own organizations have yet to get on board.
The time for Finance to "get into social" is now. Finance can do this in three ways: Governance (creating social computing guidelines), Partnership (funding and advising analytics and content strategies) and Information (much of the Finance press is now social as well).
Happy RSA everyone. To check out an overview of what IBM has going on at the show, click here.
In other news, if you have ever turned on a television during a sporting event in the US, you have likely seen IBM talking about data and analytics, and the increasing number of ways this technology and vision is being applied to all sorts of different challenges. Some of the most prominent applications are in our business analytics solutions, many of our Smarter Planet projects and, perhaps best known of all, Watson, the computer that famously won on Jeopardy! Hot on the heels of these topics, and perhaps offering similarly meaningful insights, is security intelligence. With the announcement of the Q1 Labs acquisition, and the formation of an entire security division, IBM took some important steps forward in making this vision around security and analytics a reality. We promised to break down more security silos and provide more insight than ever across an organization's entire security posture.
Last week IBM announced new capabilities around our QRadar Security Intelligence platform. While we have made a number of significant announcements recently, even within the last six months, there haven't been any that have been as "loaded," at least from my perspective. What do I mean by that? Well, this announcement was the first that had a very direct impact on all the major things we do in security, whether that's security information and event management, endpoint management, network security, threat research, identity and access management, application vulnerability testing or database activity monitoring. It was everything integrated with the new QRadar Security Intelligence platform -- all of our technology now understanding how to talk to all of our other technology -- and it all happened in one announcement.
Over the course of the coming weeks and months we'll talk more about what each of these integrations means, but for now I want to highlight some examples of the new capabilities these integrations will deliver.
Infrastructure Security: QRadar + IBM Endpoint Manager (powered by BigFix)
What: Detect and prevent stealthy malware infections
How: Correlate anomalous network activity with vulnerable endpoints, and determine impact
Example of New Capability: Detect when a botnet has infected a vulnerable endpoint that is missing patches, and see what data was communicated back to the command-and-control server

Data Security: QRadar + IBM Guardium Database Security + X-Force Threat Intelligence
What: Prevent data exfiltration and detect data breaches faster
How: Correlate detailed database activity with other network activity to detect anomalous and suspicious behavior
Example of New Capability: Detect when multiple failed logins to a database server are followed by a successful login and access to credit card tables, then followed by an FTP upload to a questionable site

Application Security: QRadar + IBM AppScan (Static and Dynamic Testing)
What: Apply predictive analytics to prevent application compromise and better detect breaches that occur
How: Correlate application vulnerabilities with network topologies and suspicious activity
Example of New Capability: Determine when an unpatched Web application is attacked using a known SQL injection vulnerability, and identify the potential impact of the attack

User/Identity Security: QRadar + IBM Identity Manager & Access Manager
What: Provide deeper visibility into user-driven threats and risks
How: Correlate user identities and actions with network activity to prevent and detect breaches
Example of New Capability: Detect when a contractor logs into a high-value application after hours, and then sends a large amount of data via a personal email account to a third party
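The Data Security example above is, at heart, a temporal correlation rule over an event stream: repeated failures, then a success, then an outbound transfer, all in a short window. Here's a toy Python sketch of that pattern-matching idea; it is not QRadar's rule language (real SIEM rules live in the platform's own rule engine), and the event types and thresholds are invented:

```python
from datetime import datetime, timedelta

# Toy correlation rule: flag a host when several failed DB logins are
# followed by a successful login and then an outbound FTP transfer,
# all within a short window. Purely illustrative.

WINDOW = timedelta(minutes=10)
FAILED_LOGIN_THRESHOLD = 3

def correlate(events):
    """events: time-ordered list of (timestamp, host, event_type) tuples.
    Returns a list of (timestamp, host) alerts."""
    alerts = []
    for i, (ts, host, etype) in enumerate(events):
        if etype != "ftp_upload":
            continue
        # Look back over recent events on the same host
        recent = [e for e in events[:i]
                  if e[1] == host and ts - e[0] <= WINDOW]
        failures = sum(1 for e in recent if e[2] == "db_login_failed")
        succeeded = any(e[2] == "db_login_success" for e in recent)
        if failures >= FAILED_LOGIN_THRESHOLD and succeeded:
            alerts.append((ts, host))
    return alerts

t0 = datetime(2012, 3, 1, 2, 0)
stream = [
    (t0, "db01", "db_login_failed"),
    (t0 + timedelta(minutes=1), "db01", "db_login_failed"),
    (t0 + timedelta(minutes=2), "db01", "db_login_failed"),
    (t0 + timedelta(minutes=3), "db01", "db_login_success"),
    (t0 + timedelta(minutes=5), "db01", "ftp_upload"),
]
print(correlate(stream))  # one alert for db01
```

The value of a security intelligence platform is doing this kind of correlation at scale, across millions of events and many domains at once, rather than one hand-written rule at a time.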
IBM has been working with clients all over the world for years to help them improve their security posture.
This announcement was the next step in the capabilities IBM is delivering to the market in security, a step driven by tighter integrations across technologies and by intelligence and analytics derived from correlating data and events from across all of these different domains. Stay tuned.
In a partnership with the IBM Center for Applied Insights, Kris Lovejoy, IBM's VP of IT Risk, will be publishing a series of ten whitepapers on the topic of security essentials for CIOs. To view the series, please visit us on the web.
IBM's X-Force Research and Development team called 2011 "The Year of the Security Breach." It was an unprecedented year not just for instances of data loss and theft, but also for a number of other attacks where the goal wasn't necessarily to capture information. DDoS attacks that brought down company websites played a significant role, as hacktivists used them to make politically motivated statements. We also saw new types of threats targeting things like mobile phones. What had previously been a theoretical threat became a real threat in 2011. We saw the development of attacks that took advantage of the specific capabilities and marketplace around mobile phones, such as a malicious application that, when installed, would send text messages to premium numbers to run up large phone bills. We've seen elements of this type of attack before (malicious software disguised as legitimate software, attacks designed specifically for profit), but the new capabilities and connectivity of mobile phones opened up new avenues to achieve these ends.
As the world becomes more interconnected, instrumented and intelligent, security challenges like the mobile phone threats I just described will continue to evolve. Mobile phones are a very current and telling example of these changes. Employees want to bring their personal devices into the workplace, and those devices will be used constantly for a mix of work and personal purposes. Balancing necessary security controls with all the connectivity and capabilities that modern employees want is not an easy task. And it is only a single example of the security challenges we will face around a hyper-connected workforce (and world, for that matter) in which traditional network boundaries are dissolving. Looming still are the security implications of the Internet of Things: moving forward, we will be forced to continuously consider the ramifications of plugging "things" into the Internet to provide new capabilities in areas such as data collection and management.
With both cloud and mobility, we are also seeing a world in which everything is everywhere. Your company's data is no longer just sitting in a company database, accessed from a company workstation -- it's walking all over the world in the hands of your employees. Meanwhile, your IT shop might be a blend of in-house resources, strategic outsourcing and perhaps even third-party cloud infrastructure. These changes happen because they represent opportunities to grow and become more efficient, and we all want that. But, now more than ever, they are also making security both more difficult and more important.
This new world offers a lot of promise. Security can hold you back if it's absent, or it can help you move forward with confidence if you know what you are doing. The reality is that if security teams can effectively address the way the world is changing, they could be the real key to unlocking all of these new opportunities.
IBM believes this change will ultimately shape the role and significance of security leaders. Much as the CIO has changed from someone responsible for IT maintenance to someone responsible for bringing about important strategic change within the business, so too will security leaders -- whether CIOs, CISOs or executives in a risk management function -- take a more strategic role in the business moving forward.
Over the coming months, IBM's VP of IT Risk, Kris Lovejoy, will be publishing a series of papers on "Security Essentials for CIOs," discussing what she sees as the starting considerations of an effective approach to IT security. In a recent Forbes article, Kris gave a brief overview of some of the topics she will cover.
As each new article in the series becomes available over the coming months, I will link to it in this post.
Make sure to follow IBM Security ( @ibmsecurity ) and the IBM Center for Applied Insights ( @IBMCAI ) for any updates on this series as well as other important news and updates from IBM.
You can visit the landing page for this series on the web, here.
IBM has pinned its reputation on advancing the cause of "social business," so it was appropriate for the company to make use of one of its largest annual conferences as a showcase for its new technologies. The October Information On Demand event had the usual branded presences on Facebook, LinkedIn, Twitter and YouTube, but organizers went far beyond that.
Key IBM bloggers were contacted and asked to promote highlights of the conference. Six iPad-toting "social concierges" roamed the floor inviting attendees to take polls and capturing their observations in short videos. Tweets about the event were displayed on a 90-by-20-ft. screen. Some customers carried FlipCams to record man-on-the-street interviews, which were posted to YouTube. The LiveStream service was used to broadcast eight key sessions and 42 informal interviews with attendees, speakers and IBM executives. The broadcasts were viewed more than 24,000 times during the event. The #iod11 hashtag achieved a Twitter reach of 14.3 million unique users, and more than 63,000 new "likes" were recorded on relevant Facebook pages.
Delivering a fun, engaging and valuable social experience to the small city's worth of information and analytics professionals who descend upon the Mandalay Bay each October is no mean feat, so I'm both pleased and proud to see the work of my passionate and persistent team members rewarded like this. Congratulations to the IOD Social Team of Crysta Anderson, Matt Carter, Alex Goldsmith, David Pittman and Tim Powers. Equally large thanks go to our roster of contributors and behind-the-scenes production wizards Stephanie Caputo, Beth Flood, Sanjay Kaupae, Scott Laningham, Jacqi Levy, Jessica Sharkey, Susan Visser, Eric Vonheim and Todd Watson!
One of the first things I ask people on the subject of data is "What do you want it to look like?"
I don't mean for the question to be complex, but I generally get an odd look and response in return. "What do you mean, how do I want it to look? It's data, it has to be accurate and structured."
So why is it that when we talk about data, we picture streams of numbers, columns and rows of calculations filtering through servers or built out over multiple spreadsheets? Not all business users have the time or skills to interpret large enterprise or spreadsheet data sets.
Why is it we can't be asked the question, "What do you want it to look like?" and answer with a personal perspective? I tend to believe it's because we don't think of data as having a personality. In fact, data has quite a bit of personality.
While often quite shy and reclusive, data will talk your ear off with just a little nudge. It can also be kind of a gossip, revealing interesting insights into customer behavior or an organization's financial information. And it's those individual users running the data analysis who can easily unleash that personality -- for the good of the organization.
That's why organizations are realizing the value that comes from having a complete analytics solution built not only for the enterprise but also with the individual user in mind.
The notion of personal analytics is about bringing agile analysis capabilities to people in an easy-to-use manner without having to rely on IT. Business users can now take advantage of solutions that bring exceptional capabilities to their desktop in terms of personalizing how they display local or enterprise data and how they solve individual or workgroup challenges -- all on their terms.
Analytics is definitely becoming a more personal experience. Being able to explore data and format it in a presentation layer of the user's choice, add built-in calculations, apply scenario modeling, do write-back on the fly, or create traffic lights that represent the key metrics that matter to the individual -- these are all means of answering my question, "How do you want it to look?"
Take the distribution industry, for example. What if you had the freedom to model out different scenarios based on the price of diesel fuel? An individual user can now identify the key drivers of the business, like fuel cost, then test different assumptions and identify best case, worst case and probable outcomes.
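To make that concrete, here is a minimal sketch of such a scenario model in Python. The fleet figures, diesel prices and cost formula are all made-up illustrations, not data from any real distribution business.

```python
# Hypothetical monthly cost model for a distribution fleet.
def delivery_cost(fuel_price_per_gal, miles=120_000, mpg=6.5, fixed_costs=450_000):
    """Total monthly cost: fixed overhead plus fuel burned over the miles driven."""
    return fixed_costs + (miles / mpg) * fuel_price_per_gal

# Best/worst/probable assumptions for the key driver (diesel $/gal).
scenarios = {
    "best case": 3.20,
    "probable": 3.90,
    "worst case": 4.80,
}

for name, price in scenarios.items():
    print(f"{name}: ${delivery_cost(price):,.0f}")
```

Once the key driver (fuel cost here) is isolated as a parameter, testing a new assumption is just a matter of changing one number -- which is the essence of what the scenario-modeling capability described above puts in an individual user's hands.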
One can only imagine how this could affect you personally, with regard to your line of business in a distribution center. But what if it also affected the manufacturing floor in terms of energy prices for production costs? Would the same scenario be important to other areas of the enterprise?
In a way, building the right analytics competency is like building a business from the ground up. You need to create a growth path that combines personal analytics with enterprise scope and IT values. By providing the necessary analytics foundation with few barriers to usage, you can turn your business users loose to explore, discover and grow as analytics professionals, with a personalized capability that brings meaning and life to the data.
When business users turn data exploration into action by aligning their discoveries with the enterprise, they challenge the silos of information that personal analytics has typically produced.
Business users have traditionally created and held onto spreadsheets or data files that only they maintain and which are not reflected across workgroups for greater use. Today's organizations demand a bridge between what line of business users want and what IT requires to run a smooth enterprise environment.
Personal analytics creates that bridge.
So the next time someone asks you, "What do you want your data to look like?" tell them you want it to look interesting and attractive, prescriptive and distinctive, honest and meaningful, and actionable.
But above all else, tell them you want it to have personality!
Love and marriage. Spring and allergies. Bad economies and ROI. These concepts often come in pairs, and for good reason: when the first comes along, we need to pay more attention to the second. This is certainly true for the third example I've listed above. For both individuals and organizations, the question "How can I get the best ROI from the investments I've made?" is mighty popular right now. And for both, the answers are all too elusive.
If you're an individual, perhaps you decide to talk to a stockbroker -- a guy who charges you money to tell you things you probably already knew and who probably generates no clear value over time. (It's suggestive that stockbrokers, despite claiming extensive specialized insight going back multiple decades, are practically never billionaires.)
For businesses, fortunately, the situation is a lot brighter. Particularly in the case of a portfolio of applications, there are ways to go about improving ROI that are based on sound and consistent principles. And software solutions which are built on those principles are now available.
That, in sum, is what Application Portfolio Management (APM) is all about. If you think of applications as investments -- which, for organizations, they certainly are, and on a huge scale -- it's very logical to ask: "What kind of return am I getting from my investments? Which ones are vital to my business? Is there scope for consolidation? Which applications should I consider retiring? How should I optimize my investments to dial up my total ROI and dial down my total risk?"
APM solutions are particularly attractive to organizations at the enterprise level. That's because the largest organizations have giant portfolios of applications, and getting good answers to the above list of questions is therefore much harder. Similarly, in certain industries like banking, where applications have been in use for an exceptionally long period of time, the idea of introducing change to those applications is going to encounter more cultural resistance than usual. Change may be necessary, but it's really going to have to be justified with demonstrable ROI, if it's going to happen.
When you throw in the problematic economy we continue to face, in which ROI has taken on greater significance, it becomes pretty clear that the need for effective APM has never been greater. Yet in many cases, organizations have barely begun to think about portfolio management in this context.
Recently I was very fortunate to be able to talk to a real expert in this area: Per Kroll, Chief Solution Architect for Application Portfolio Management at IBM. Kroll agreed with me that at many organizations, the time for APM and also project portfolio management capabilities is now -- and not just because the economy is bad, and ROI is a touchy topic.
"The basic problems have been around for quite some time," he said. "But they are getting worse every year and have now reached a breaking point. Companies can no longer continue with business as usual. They need to assess the value versus cost of all their current applications."
Kroll's slant on the cost-benefit ratio of current applications is particularly intriguing.
The usual approach to portfolio management in the enterprise revolves around projects -- answering the question "What is the ROI for this business project we're thinking about undertaking, or have just finished?"
Well, that question is sensible. But it has the effect of shifting the focus away from applications. It ignores the fact that a problematic application's influence can be, and often is, multiplied because it spans multiple business projects.
How do APM solutions help? They put the focus right back on the applications. And they provide a clear, logical path organizations can follow to get more value, and lower risk, from every application in the complete portfolio.
IBM solutions including IBM Rational Focal Point, System Architect and Asset Analyzer can be used to pursue the following steps:
1. Create an application inventory.
2. Provide initial information about each application.
3. Analyze applications and determine which need more investigation.
4. Make decisions, such as consolidate, modernize or move to the cloud.
5. Execute and track those decisions through project proposals and project delivery.
6. Enhance both IT development and IT operations, and receive more value from every application you have.
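The inventory-then-decide steps above can be sketched in miniature. The inventory entries, scoring scale and decision thresholds below are hypothetical illustrations, not the actual logic of Focal Point, System Architect or Asset Analyzer.

```python
# Step 1-2: a tiny application inventory with initial information per app.
# Costs are in $M per year; value and risk are on an illustrative 1-10 scale.
inventory = [
    {"name": "OrderEntry", "business_value": 9, "annual_cost": 1.2, "risk": 3},
    {"name": "LegacyHR",   "business_value": 3, "annual_cost": 2.0, "risk": 7},
    {"name": "Reporting",  "business_value": 6, "annual_cost": 0.8, "risk": 4},
]

# Steps 3-4: analyze each application and propose a decision.
def recommend(app):
    """Crude decision rule: retire or consolidate low-value, high-cost apps;
    modernize risky but valuable ones; otherwise maintain as-is."""
    if app["business_value"] <= 4 and app["annual_cost"] >= 1.5:
        return "retire or consolidate"
    if app["risk"] >= 6:
        return "modernize"
    return "maintain"

for app in inventory:
    print(f'{app["name"]}: {recommend(app)}')
```

Even this toy version shows the payoff of the approach: once value, cost and risk are captured per application, portfolio-level decisions stop being "who screams the loudest" and become a repeatable, inspectable rule.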
IBM thus helps organizations optimize their application portfolios the way a stockbroker is supposed to help an individual optimize an investment portfolio -- in a balanced, objective and data-driven way that takes full advantage of proven best practices.
This approach generates many positive effects. And the more creative the company is in using APM solutions, the more benefits it will realize. Some of them are more obvious and some more subtle, but the possibilities really are endless.
Looking for something big and obvious? Think about the 80/20 rule of IT budgets. This says that typically an organization spends 80 percent of its total budget on IT operations ("keeping the lights on") and only 20 percent on strategic innovation ("doing new stuff to make the business grow").
What IBM APM solutions do -- unlike certain alternatives -- is deliver value to both halves of that ratio. And particularly in operations, that's welcome news in the enterprise.
"Seems to me like companies have got the 80/20 rule wrong," said Kroll. "Many implement an objective and transparent portfolio management process only in development, to determine how best to spend the 20 percent of funds going to new projects. But what about the 80 percent on the operations and maintenance side? That's often decided based on a 'who screams the loudest' approach -- the squeaky wheel gets the grease whether it deserves any or not. Our APM capabilities make decisions like that a great deal more objective."
A subtler, but still powerful, improvement lies in the area of information transparency. APM solutions, once applied, have the effect of pulling key information out of the shadows and into the spotlight, where it can deliver more value through wider utilization (and/or correction or revision, if necessary).
Because it's revealed, that information also becomes more resilient -- surviving the loss of key employees, for instance, who leave the organization.
"Think about what happens when people make decisions about investment levels, modernization targets or which applications to move to the cloud," said Kroll. "Usually the relevant information is distributed in people's heads throughout the company... or hidden in spreadsheets. APM is about revealing that information (including analytical processes), prioritizing it, and making it all easily available, to anybody who needs it, at the time decisions are made."
That reference to cloud brings up yet another point. Cloud and APM turn out to be closely related areas because APM-based insights can significantly improve the odds of a cloud's success.
How? Given an application inventory in which each application's context (risks, costs, complexity, etc.) has been quantified and analyzed, that information can be very useful in deciding which applications are the best candidates for clouds and choosing specific cloud models. This is really important, because picking the right set of applications and the right model can make or break a cloud project. APM insight not only helps ensure the chosen applications will scale well in a cloud, but also addresses other factors -- security, for instance, or business criticality -- that definitely need to be taken into account as well.
Furthermore, these same APM solutions can be used to support and enhance many other kinds of initiatives as well, some of which are hot and rapidly getting hotter.
Kroll agreed. "The interest in APM is growing so rapidly right now partly because people need to make so many new kinds of application-related decisions -- not just cloud, but also in areas like mobility, regulatory compliance and outsourcing," he said. "And once you've built an application inventory that captures value, costs and risks, all these decisions are much easier to make. IT's job is not to say 'no,' but to help the business establish the constraints and trade-offs at hand."
Innovate 2012, to be held June 3-7 in Orlando, Florida, offers more on portfolio management, enterprise modernization, application lifecycle management and more -- with nearly 400 technical sessions and more than 20 tracks -- to give you insight into how software can help your organization cut costs, drive innovation and reduce risk. Be sure to register (http://www.ibm.com/software/rational/innovate/register.html) by March 14 to save US $200.
Read a commissioned study conducted by Forrester Consulting: "Measuring The Total Economic Impact Of IBM Rational Integrated Solution for Application Portfolio Management."
About the author: Guest blogger Wes Simonds worked in IT for seven years before becoming a technology writer on topics including virtualization, cloud computing and service management. He lives in sunny Austin, Texas and believes Mexican food should always be served with queso.