Use of technology by business leaders is becoming more common; IT consultants Gartner famously predicted that Chief Marketing Officers would soon have bigger budgets than heads of technology. But technology isn't much good without the thought leaders behind it, planning what the design needs to be and how to implement it. In Dr. Ines Wichert, IBM has just such a person. Ines is a senior psychologist who leads the Women in Leadership team at the IBM Smarter Workforce Institute and studies the relationship between diversity and performance in organisations. Her research shows, for example, that companies with a strong diversity policy have greater employee engagement, are more innovative and show greater loyalty. Read Ines's blog here: https://ibm.biz/BdEHMi.
The work Ines does, along with the wider Smarter Workforce team, is research-based, so companies are no longer relying on guesswork to make their HR decisions. You can hear more about this work at IBM's Business Connect event at Twickenham stadium on 5th November (remember, remember), where Ines will be hosting a panel discussion that includes key HR thinkers from Stonewall and EC Harris. Other IBM thought leaders in attendance will be Kieran Colville and James Cook, who understand how the use of analytics can identify the best recruits, ensure good career progression and effectively create a high performance team.
Ines, James and Kieran will top and tail an interactive session of information and dialogue, including clients talking about how they use IBM HR solutions. STA Travel will outline why, in today's world of self-service holidays, they need the best staff to remain competitive, and Lend Lease will talk about how an integrated HR and RPO policy delivers value to the business.
Guest speakers at the event will include Will Greenwood and Tim Henman, who I hope will regale us with stories of their hoary past adventures on the professional sports field.
Technology advertising has existed for nearly a century, continuously promising a life made easier thanks to better, cheaper and faster gadgets. The only thing that changes is the art direction.
At the time these ads announced hard-won innovations in power and speed. Now, they invoke simple amusement at the way things used to be.
Why is this?
I posed the question to Don Campbell, a longtime tech watcher and IBM Distinguished Engineer. What ensued was a fascinating discussion about the pace of innovation, our relationship with the technology we build and what we'll find funny 20 years from now. Below are the highlights:
Why do we find old technology ads so amusing?
There was always a lot of romance in these ads about what a system could do for you: it would make you better, smarter, faster, richer, all of those things. And yet they never drew the connecting point between the technology of the day and what you'd get as a result. The buyer needed to make the leap of faith that by the time you acquired the technology and figured out how it worked, the maker would have filled in the blanks between what it could actually do (which in most cases was add numbers together) and something that could make a meaningful difference in your life. It's a gap that we never got over.
What does this say about us? Why do we keep buying into the romance?
We want a partner in our lives. We want technology to be that partner: not just a tool that we use, but a partner that will take care of the boring stuff that we don't want to do. And we humanize it into being a partner.
At every leading edge of technology, there is a little more of that human element that we want to give it. I think we have some of those feelings for our iPads or our favorite mobile devices. But when we realize that a particular device didn't become our partner, then it looks like one of those old ads.
Watson wasn't human, but it used human language and was built to compete against humans using human rules. Could Watson be that partner?
What interested me about Watson wasn't the system called "Watson." I don't have a lot of applications in my business for winning on Jeopardy! What did interest me were the analytics possibilities that it opened up if you were to combine its components with other analytics capabilities. For example: it breaks down a problem and attacks it in multiple domains; it understands its confidence in each of those domains; it combines those confidence levels into an overall confidence level and a threshold that determines how to act. Sometimes it had the right answer but wasn't confident enough to buzz in.
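The decision process Campbell describes (per-domain confidence, a combined overall confidence, and a threshold that determines whether to act) can be sketched in a few lines. Watson's real scoring pipeline is far more sophisticated; the domains, weights and threshold below are invented purely for illustration:

```python
# Toy illustration of the pattern described above: score a candidate answer
# in several domains, combine the per-domain confidences into an overall
# confidence, and only "buzz in" if it clears a threshold.

def combined_confidence(domain_scores, weights):
    """Weighted average of per-domain confidence scores (each in 0..1)."""
    total_weight = sum(weights.values())
    return sum(domain_scores[d] * weights[d] for d in domain_scores) / total_weight

def should_buzz(domain_scores, weights, threshold=0.7):
    """Act only when overall confidence clears the threshold."""
    return combined_confidence(domain_scores, weights) >= threshold

scores = {"text_match": 0.9, "date_logic": 0.6, "category_fit": 0.8}
weights = {"text_match": 0.5, "date_logic": 0.2, "category_fit": 0.3}

print(round(combined_confidence(scores, weights), 2))  # 0.81
print(should_buzz(scores, weights))                    # True
```

The same shape applies to the business case in the next paragraph: each "domain" becomes a data source of varying trustworthiness, and the threshold becomes the point at which you decide to act on what the data is telling you.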
These are attributes that you can apply to your business. We're getting all kinds of unstructured content from blogs, tweets and message boards, but we're not sure how much we should trust them. Now, we can apply Watson algorithms and capabilities to this data to help us understand whether or not we should trust what's being said, whether we should respond and how.
There's also the gaming aspect, which is quite appropriate for our businesses. In some cases, I might want to make as much money as I can as quickly as possible; in others, all I might need to do is beat a specific competitor in one area at a particular time. In that case I'll want to reduce my risk and choose different tactics to drive a better outcome. So when we think about the gaming part of our business, I think Watson has a lot to teach us.
Ken Jennings said that on most nights, the top Jeopardy! players know most of the answers; winning is more a matter of who buzzes in first. Our businesses are less about the specifics of Jeopardy! and more about trying to maximize how often you're right given the information you have and how well you can pull it all together. The algorithms in Watson help it accomplish this, and that's extremely valuable in a business context. I was very impressed with how well it dealt with the nuances of the language and the complexity behind those questions to get most of its answers right.
Does Watson eliminate the disconnect between the promise and performance? Does it follow through on the romance?
Watson's success is a tremendous accomplishment. But no matter what we say today, you know that 20 years from now we'll be having this same conversation. No matter how good our technology is now, we'll all be saying, "Can you believe it took a room full of rack-mounted servers to do that? Let me just ask my watch." But the fact that Watson had more human characteristics (it spoke, it played a game that humans play in a way that felt more human, it took natural language as input and produced natural language as output, and even conversed as it chose each question) will drive it more into our comfort zone.
Technology will always change. But will what we want from technology change as well?
I think our expectations will continue to grow in the same space. We've made tremendous progress in many domains. But in some respects we've not yet lived up to the ads of 25 years ago. Until we do, there's a lofty goal out there for a system to support us in the way we want to be supported. We're getting better at allowing more people to participate in technology and benefit from what it can do for us (for example, we don't have to type in lines of code from the back pages of Byte Magazine to make our systems run), but I think in general we still want that partner in our lives that will make our world easier to manage and free us up to do more things. We've made progress, but too much technology still makes us perform tasks on its own terms.
Comparing politics and economics with information security is one of my strangest hobbies, that is for sure. But, these are basically the only two things I care about besides the basics of being a human so it winds up happening quite a bit. Not to fear though, this blog won't venture into the land of any of my political preferences. Instead, I want to look at the similarities between two things that at first glance might seem to share few of them.
I want to begin by looking at this notion of "creating a job" and what that means, or whether it's even possible. If you are following the current election cycle in the US at all, you probably hear a lot about jobs and unemployment and see lots of different graphs saying we're either doing well or we aren't. Everyone says they want to create more jobs. But do politicians really create jobs? Unless we are talking about directly increasing the number of people on the government payroll, the government doesn't create jobs. That is not to say, however, that government doesn't play a role in this conversation. Government does quite a bit to create an environment where job growth is possible. Regardless of your political preference, the balls in the air here include tax rates for businesses, crime, property taxes, educational achievement in the area, quality of the regional infrastructure, regulations, natural resources, market stability, location, and so on. Some of these represent competing interests (you can't have lower taxes and more government services), but balancing all of these factors successfully can create an overall climate where employers feel comfortable growing their business and bringing on new people. And while the job-growth number is an easy one to get your hands on, the effect of any one program or tax-law change on that number is almost impossible to accurately quantify.
While not a perfect analogy, there is a great deal shared between creating a job and detecting and remediating a sophisticated threat. Tom Cross gives a talk on the Advanced Persistent Threat, and one of the best elements of that discussion is the "kill chain" of an attack (reconnaissance, exploitation, infection, command and control, internal pivot, data preparation, data exfiltration). There are a lot of things an attacker has to do between deciding to attack a network and leaving that network with the desired data. If you approach the problem from a kill-chain perspective, the goal is to look at the entire chain of events and apply security countermeasures along the way, each capable of alerting you to an attempted intrusion. Tom suggests striving to detect an attacker at a minimum of two different points in the kill chain; detection at only one point means your defenses came too close to letting the compromise go unnoticed. Additionally, just because one security technology didn't directly detect the attacker doesn't mean it didn't play a role. Hardened defenses in one spot can force an attacker to use different tactics. For example, if an attacker wants the information in a database that sits behind a web application, but the web application was coded securely, the attacker is forced to loop around the back end and perhaps target the individuals with access to that database. Say the attacker is eventually discovered because of irregularities in database user activity: does that mean the application vulnerability scanning tools used while building the web application played no part in thwarting the attack? Of course not. Does this reality make it harder to understand the impact of any one technology? It does.
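The "detect at two or more points" rule of thumb lends itself to a simple sanity check. Below is a minimal sketch, where the stage names follow the kill chain above but the alert flags (which in reality would come from sensors such as an IPS, a SIEM or database monitoring) are invented for illustration:

```python
# Count how many distinct kill-chain stages raised an alert for an
# intrusion attempt, and check it against the two-point rule of thumb.

KILL_CHAIN = ["reconnaissance", "exploitation", "infection",
              "command_and_control", "internal_pivot",
              "data_preparation", "data_exfiltration"]

def detection_coverage(alerts):
    """Return the kill-chain stages at which the attempt was detected."""
    return [stage for stage in KILL_CHAIN if alerts.get(stage)]

def defenses_sufficient(alerts, minimum=2):
    """Rule of thumb: detect the attacker at two or more distinct stages."""
    return len(detection_coverage(alerts)) >= minimum

# Hypothetical incident: the IPS flagged the exploit attempt, and database
# monitoring flagged the internal pivot.
alerts = {"exploitation": True, "internal_pivot": True}
print(detection_coverage(alerts))   # ['exploitation', 'internal_pivot']
print(defenses_sufficient(alerts))  # True
```

A single-stage detection would fail the check, which is exactly the "too close to unnoticed compromise" situation the talk warns about.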
At this point, it's a good time to return to the job-creation analogy. In both cases the end results are tangible (you have more jobs, you caught the bad guy), but it can be difficult to quantify the impact of any one investment or decision. Success in these scenarios is often predicated on systemic strength. Just as governments try to create an environment in which job growth can take place, with advanced attackers you are trying to create an entire environment in which the attacker can be detected and defeated. In practice, creating that environment involves a whole ecosystem of different capabilities and expertise that may or may not play a part over the course of any given incident.
With that in mind, I am pleased to say that today's announcements are a reflection of IBM's belief in this notion around the strength of the system. Today was the first announcement around our Advanced Threat Protection Platform, with our new anomaly detection appliance headlining the show.
There are also integrations between X-Force and our Network Intrusion Prevention System with our recently acquired Q1 Labs technology, as well as the addition of "hybrid protection" to our Network IPS. The latter of these announcements complements the proven, ahead-of-the-threat protection found in IBM's Protocol Analysis Module (PAM) with the open source capabilities and common syntax of SNORT.
While there is always more work to be done, this announcement represents the latest example of what we are trying to do in security, which involves addressing complexity not by proclaiming simple solutions, or a one product fix-it, but by bringing together a lot of different technologies and capabilities to deliver something greater than the sum of its parts.
What follows is the third in a three-part series of blogs by Bernie Spang, Director of Strategy and Marketing, IBM Database Software and Systems.
There's no shortage of data these days. At IBM, we estimate that as much as 90 percent of the world's data has been created in the last two years. Greater use of the cloud will be a big part of how organizations manage and apply their data. Gartner, for example, estimates that by 2016, 50 percent of data will be stored on the cloud.
The challenge is to make sense of this data; no wonder more organizations are making analytics part of their processes. The problem is that no one type of analytics solution can meet everyone's needs. For years, companies have been working to fill the gaps with multiple solutions for different requirements.
If you're analyzing huge data sets for pharmaceutical research or weather modeling, for example, you might choose the IBM big data analytics solution and team it up with an IBM supercomputer capable of performing trillions of calculations per second. Or maybe you need enterprise-class analytics for your business in the form of an appliance you can drop right into your infrastructure, like the IBM Netezza® data warehouse appliance.
Need the simplicity and quick deployment of an appliance plus the flexibility and control of a custom-tuned system? IBM has now filled that gap with IBM PureSystems™. I predict you'll hear more and more about PureSystems as businesses reap the benefits of built-in expertise. We've distilled "patterns of expertise" from thousands of successful deployments and embedded them into these systems.
Take the IBM PureApplication™ System, for instance. It not only includes servers, storage and networking, but it also has preintegrated middleware and patterns of expertise that let you deploy an analytics application in just minutes, optimized for your particular needs. In fact, patterns for more than 80 applications are available through the IBM PureSystems Centre. The PureApplication System simplifies every step from procurement to deployment to data analysis. And it makes getting to the cloud faster and simpler.
If you'd like to find out how IBM PureSystems can make capturing value from your data faster, simpler and more cost-effective, you can read more at http://ibm.com/ibm/puresystems. Better yet, see them demonstrated in person and talk to IBM PureSystems experts. We're rolling out the latest member of this family of expert integrated systems at InterConnect 2012 in Singapore October 9-11 and other events around the globe and online. I think you'll be as excited about PureSystems as I am.
IBM recently conducted tests in its labs that revealed IBM Cognos BI v10.1.1 performance on IBM Power Systems to be at least on par with, and in many workloads 14 to 46 percent better than, comparable deployments on Microsoft Windows 2008 Server.
A comparison of IBM Cognos BI application performance between similarly configured IBM POWER6 and IBM POWER7 systems showed significant performance advantages for the IBM POWER7 servers.
IBM conducted a variety of tests to match the different ways of using IBM Cognos Business Intelligence services and system resources. The test systems used similar server configurations with current-generation processors. Download the free report here.
Other findings included:
- Performance improvements of as much as 41 percent for workloads such as running HTML and PDF-based reports and portal navigation
- Performance improvements of as much as 26 percent for workloads such as running large and highly formatted PDF reports, locally processed calculations, interactive analysis activities and complex queries mixed with lighter workloads
For example, an IBM customer had developed a Cognos Business Intelligence application to distribute PDF-based reports by email; as implemented and before optimization, this application was performing at a rate of 11 multi-page reports per minute.
After the customer applied recommended AIX tuning parameters, the application performance improved to 150 multi-page PDF reports per minute.
Most applications might see performance improve two- to three-fold by applying AIX-level tuning.
To provide a comprehensive view of the potential performance impact of optimizations made in Cognos Business Intelligence v 10.1.1, IBM used a broad range of tests. See the graphic that lists the performance improvements for the 20 different tests used.
For more information:
- Download the whitepaper, "Best Practices and Advantages of IBM Power Systems for Running IBM Cognos Business Intelligence," to see the full performance results.
NOTE: Performance is based on measurements and projections using standard IBM benchmarks in a controlled environment. The actual throughput or performance that any user will experience will vary depending upon many factors.
Guest post by Dr. Jean Paul Ballerini, Sales Enablement for IBM Security.
There are few industries untouched by multiple regulations; businesses contend with regulations that control their accounting, IT security, operations and so on, in addition to those that are sector-specific. But all of this is for the good of the industry and to protect consumers.
If regulations are there for our own protection, why is there so much complaining about them? I'm sure that every now and then there are companies who feel that having to be compliant gets in the way of an "easier" way to run a business, to do accounting, to "do" security. But this isn't the real issue: the true struggle is having to demonstrate compliance to auditors in order to avoid consequences ranging from steep fines to seeing your business shut down. So, the million-dollar question is: why does it have to be so complicated to demonstrate compliance?
When it comes to an IT security audit, the easiest questions only require showing that certain actions take place: that actions are logged, that logs are collected, that backups are made, that there is a disaster recovery plan (DRP), etc. Slightly more complex queries focus on proving that companies can stop events that shouldn't happen; for example, that unauthorized users actually don't have access to documents, that cyber-attacks are detected and blocked, that compromised systems are identified and cleaned, that logs cannot be manipulated or deleted without alerts triggering. The top of the chart goes to questions that require analysis of the security settings, such as showing the consistency among the access policies on all applications, among the firewall policies, and among the intrusion prevention policies.
The higher the complexity, the longer it takes a company to prove compliance to an auditor; I have seen companies spending several man-months to be able to show compliance. It is obvious that when the process takes that long it becomes very expensive and can impact the revenue margins of any company, in particular because the process of proving compliance is recurring and not a one-time expenditure.
So what can be done to make this task simpler and cheaper? Two things: automation and security intelligence.
We all know replacing manual processes with automation can be a huge money saver, not only because automation usually works much faster but also because it standardizes the steps, making it much simpler to prove compliance. For example, automating the collection of logs allows you to set triggers that will alert you when collection doesn't happen, killing two birds with one stone. Automating identity provisioning not only ensures that correctly configured access rights are assigned but also speeds up the process and enables employees to become productive much faster.
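The log-collection example can be sketched as a simple check: the automated collector records when each source last reported, and anything silent beyond a window raises an alert. The source names and the 10-minute window below are assumptions made for the sake of illustration:

```python
# Detect log sources that have gone quiet: the same automation that
# collects the logs also proves that gaps in collection are noticed.

from datetime import datetime, timedelta

MAX_SILENCE = timedelta(minutes=10)  # illustrative alerting window

def stale_sources(last_seen, now):
    """Return log sources that have not reported within the allowed window."""
    return [src for src, ts in last_seen.items() if now - ts > MAX_SILENCE]

now = datetime(2012, 10, 1, 12, 0)
last_seen = {
    "web-frontend": now - timedelta(minutes=3),   # healthy
    "db-server":    now - timedelta(minutes=45),  # silent too long: alert
}
print(stale_sources(last_seen, now))  # ['db-server']
```

The point is the two-for-one: the list of healthy sources demonstrates that collection happens, and the alert list demonstrates that failures to collect would be caught.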
Security intelligence is the glue that gives meaning to the millions or even billions of events that can be generated by an IT infrastructure. When there are billions of log entries, how can a company claim to know what is going on without a security intelligence solution that not only collects the data but also correlates it in order to identify events and incidents, enabling the security staff to inspect and understand the bigger security picture rather than just what point products can collect? The beauty of a security intelligence solution doesn't only lie in making the security staff's life easier, but also in the availability of reports that allow both the monitoring and demonstration of compliance. Monitoring compliance allows companies to quickly identify events that can prove costly, for example a configuration change, access to confidential information, etc.; demonstrating compliance allows companies to save the man-months that they are currently spending to run a manual process, repeatedly.
Do you want to take it a step further? Wouldn't it be great to test the impact of a policy change rather than applying it and monitoring events to identify incidents or loss of compliance? That is possible and is exactly what risk management can do. This new frontier brings companies to be pre-emptive rather than reactive; instead of showing the auditors that they are able to detect security issues and fix compliance issues, companies can show that they verify the policies before applying them, making it even easier and less costly to abide by regulation and prove compliance, while reducing the risk of security incidents.
Read more on meeting PCI DSS requirements, even in virtual environments.
About the author:
Jean Paul Ballerini has been a member of the World Wide Security Sales Enablement Team since January 2010. Prior to that, he was the Technical Sales Lead for IBM's South West Europe region, after having served as Senior Technology Solutions Expert for IBM Internet Security Systems for the previous six years. Since 2003, Ballerini has also served as the EMEA spokesperson for X-Force, the IBM security research and development team.
He also holds a PhD in Computer Science and Law. In 2005 Ballerini became a CISSP, and since 2007 he has also served as a Qualified Security Assessor for the Payment Card Industry. In June 2008 he was appointed an IBM-certified Senior Technical Staff Member.
Chris Wood is an Integration Architect at Visa Europe, based in the United Kingdom. Chris has a background in SOA and ETL, designing and building systems that incorporate WebSphere SOA Appliances, WebSphere Service Registry and Repository and WebSphere MQ.
I like to get stuff done. I can't help it. This is just part of my personality. I like to get to grips with a problem, work out the best way to solve it and get it done. I think this is why I was drawn into integration architecture and design. For the past ten years, I've been taking integration problems and attempting to solve them, to get them done right.
People who have worked with me will also know that eventually I like to talk. It may take a while to get going, but once I get there with a problem and how to solve it, suddenly take-off velocity is reached and there's no stopping me.
That continuing dialogue, taking a problem, working out how to solve it, and communicating the solution particularly resonates with me on the subject of Web APIs. I've only been in the API space for a couple of years but I understand the value of Web APIs and what they bring to the integration space:
Web APIs help me design and build interfaces that developers understand. Making integration easy is the key, right? So why not build a solution that the developers can pick up, use and understand with minimum fuss?
Web APIs implemented using REST are great, but what about all the other stuff the business has that could be exposed as an API? The existing databases and SOAP-based Web Services? How do I make them accessible in a way developers understand and I can manage?
For the existing assets I expose, the databases and Web Services, how do I make sure that it's business as usual for internal systems? That the new consumers of those assets don't impact the existing ones?
This is where IBM API Management can help; it provides the tools to the integration guy to make stuff happen, to get Web APIs into the marketplace and get developers working with them ASAP. With IBM API Management you can:
Expose an existing RESTful Web API that developers can pick up, use and understand using a technology that is natural to them, adding the management features of usage entitlement and security to safeguard the backend;
Expose an existing database or Web Service, defined as a Web API; all the benefits of communicating in the language developers understand but with the reassurance of usage entitlements and security;
With IBM API Management you can implement the Web APIs you want to expose, and deliver the documentation and guidelines in one package. Developers can get on and code while you get on with solving the next integration challenge.
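To illustrate the kind of translation such an API layer performs, here is a standalone sketch of a database table fronted by a developer-friendly, entitlement-checked JSON interface. This is not IBM API Management's actual implementation; the table, API key and entitlement check are all invented for the example:

```python
# Minimal sketch: expose a database table as JSON, rejecting callers
# that lack a registered API key (the "usage entitlement" idea).

import json
import sqlite3

API_KEYS = {"dev-key-123"}  # hypothetical entitlement store

def get_products(db, api_key):
    """Return rows from the products table as a JSON document,
    with an HTTP-style status code."""
    if api_key not in API_KEYS:
        return json.dumps({"error": "unauthorized"}), 401
    rows = db.execute("SELECT id, name, price FROM products").fetchall()
    body = [{"id": r[0], "name": r[1], "price": r[2]} for r in rows]
    return json.dumps(body), 200

db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE products (id INTEGER, name TEXT, price REAL)")
db.execute("INSERT INTO products VALUES (1, 'widget', 9.99)")

body, status = get_products(db, "dev-key-123")
print(status, body)  # 200 [{"id": 1, "name": "widget", "price": 9.99}]
```

The developer consuming this interface never sees SQL or connection details, which is exactly the "language developers understand" point above; a gateway product adds rate limiting, security and documentation on top of the same translation.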
So IBM API Management helps businesses, developers and the integration guy deliver Web APIs. What's not to like? Don't forget to check out the IBM Redbooks publication on IBM API Management to learn more!
Every retail manager is familiar with the idea of the Return Browser. By this I mean the guy who keeps coming back to check out something he obviously likes -- a car, a guitar, a flat-panel TV -- but just... can't... quite... justify. A similar situation seems to me to apply inside certain organizations as they ponder moving to cloud architectures.
They're familiar with the cloud story: Faster service delivery! Smart resource allocation! Increased focus on strategies, not technical details! Minimum waste, maximum business value!
But still they ponder.
Their doubt is understandable. They are afraid the promise of cloud computing won't be realized in reality -- at least in their case.
They correctly see that a cloud is going to be a much more dynamic infrastructure, meaning in part that it will be harder to predict exactly what it will do in any given context. You need to think through all the major ramifications very carefully before making such a deep commitment. (And if you happen to see a parallel to marriage here, I won't tell you you're wrong).
By offering organizations substantially enhanced power to determine and optimize how a private cloud will fulfill workloads, IBM's SmartCloud Provisioning and Monitoring solutions also inspire confidence that maybe cloud computing really can live up to the hype.
Provision and monitor your way to peace of mind -- and incredible business value
When I spoke with Marvin Goodman, Product Manager for IBM Tivoli Software, he also seemed to see things along these lines.
Actually, one of the points he made was that even when the cloud is already up and running, much the same kinds of questions will still apply as new workloads are added to it from the more conventional infrastructure -- or somebody proposes said addition.
In that scenario, in fact, a double set of worries may apply.
"Physical to virtual migration plans are a daunting challenge for both application owners and cloud administrators," said Goodman. "The application owners are under pressure to meet deadlines for the virtualization of their workloads, but face uncertainty about the ability of the cloud to service their customers. Meanwhile, cloud administrators have to quickly respond to requests from those application owners, and be able to determine, with confidence, that addition of those workloads is feasible, and won't affect the performance of existing workloads."
Fortunately, what SmartCloud Provisioning and Monitoring offers is directly applicable to both sets of worries.
Consider: both the application owners and cloud administrators are bound to like the idea that in the cloud, new virtual servers will be created and provisioned with absolutely mind-boggling speed based on business requirements (thousands of servers per hour, if need be). And once those servers are up and running, workloads can be assigned and distributed across them -- meaning that applications should indeed perform as expected, and that the cloud will simply have taken on another role with ease.
They're also going to like the idea that the cloud's assets and resources are continually and automatically monitored over time to verify that they're performing up to target levels -- or if they aren't, notifications will be sent, steps will be taken and a fix will be made. Because when things are about to take a turn for the worse, the sooner you know about it and the more comprehensive your insight, the better.
"When application owners surrender their workloads to cloud administrators, they lose the visibility into performance they've been accustomed to. Their application is now sharing server resources with lots of other workloads, many of which they know nothing about. So they're uncertain about how their applications will perform in the cloud," said Goodman. "SmartCloud Monitoring allows cloud administrators to provide assurances that workloads are, indeed, running smoothly in the cloud. It can also leverage performance data to optimize those workloads and their placement to simultaneously maximize performance and capacity."
Instead of dynamically generated virtual servers and unpredictable resource allocation being something to worry about, in other words, they are simply strengths to rely on -- strengths the cloud was supposed to have in the first place.
Future-proofed clouds generate more rain over time
Of course, not even a cloud runs on magic; ultimately there is a limit to what it can accomplish given a finite set of resources. The question is: where's that limit, and how accurately can you establish it in advance?
If you're a cloud administrator of the type Goodman is talking about, capacity planning and management is a pretty big deal for exactly these reasons. Which, no doubt, is why capacity management is one of SmartCloud Monitoring's great selling points.
"Customers trying to grow the maturity of their virtual environments into robust private clouds often grapple with the pressure to add more and more workloads to the environment, at a pace that far exceeds the growth of their cloud budget," said Goodman. "SmartCloud Monitoring's capacity analytics and planning unlock hidden capacity in the existing infrastructure by freeing up resources through virtual machine 'right-sizing' and optimization."
So rather than always buying new storage for new workloads, you can often just improve the way you're using existing storage.
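The right-sizing idea can be illustrated with a toy calculation: compare each VM's allocation with its observed peak usage plus headroom, and total up what could be reclaimed. SmartCloud Monitoring's capacity analytics are far richer than this; the fleet data and the 20 percent headroom rule here are invented for the example:

```python
# Toy VM "right-sizing" calculation: how much memory could be freed
# by shrinking over-provisioned VMs down to peak usage plus headroom?

HEADROOM = 1.2  # keep 20% above observed peak (illustrative policy)

def reclaimable_gb(vms):
    """Memory (GB) reclaimable by shrinking over-provisioned VMs."""
    total = 0.0
    for vm in vms:
        needed = vm["peak_gb"] * HEADROOM
        if vm["allocated_gb"] > needed:
            total += vm["allocated_gb"] - needed
    return total

fleet = [
    {"name": "app01", "allocated_gb": 16, "peak_gb": 4},   # heavily over-provisioned
    {"name": "db01",  "allocated_gb": 32, "peak_gb": 30},  # already tight, leave alone
]
print(round(reclaimable_gb(fleet), 1))  # 11.2
```

Even this crude version shows the "hidden capacity" point: the reclaimed 11.2 GB can host new workloads without buying anything.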
Instead of always working harder, you work smarter. Indeed, this gets right to the heart of what IBM has in mind when it talks about Smarter Computing. Maybe you really do need more/new resources or maybe you don't; why not establish as clearly as possible which situation applies, and respond accordingly? It all goes straight to the point of making sure clouds will live up to their original promise.
And if you're really going to design a cloud to be the best possible IT service delivery platform -- the one that really is as optimized as it can be -- you should probably try to future-proof your cloud to ensure it will support change of many kinds: change in workloads, certainly, but also change in critical resources and assets.
For instance, consider all those server images -- the complete software snapshots needed to create virtual servers dynamically. For many organizations, image management is a huge hassle because (a) there are way too many images, (b) more show up all the time and (c) it's not very clear what's inside them.
SmartCloud Provisioning, fortunately, includes some nifty features directly aimed at these issues. Looking for a specific image that needs a security patch, and all virtual servers based on it? You can easily conduct a search along those lines. Or suppose you're trying to drum up the closest possible match to a target image -- that, too, is a straightforward matter. This also means it's easy to discover and eliminate duplicate images, consolidate libraries down to the essentials and, in short, knock the bullet point titled "Image Sprawl" right off your Fix Now! list.
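As a minimal sketch of what that kind of image bookkeeping involves (hypothetical code, not how SmartCloud Provisioning works internally), you can index a library by content hash to catch duplicates and by package version to flag images needing a patch:

```python
import hashlib

# Hypothetical image library: each record carries the image content and
# a map of installed package versions (all names invented).
images = {
    "web-base-v1":   {"content": b"webstack-1.0", "packages": {"openssl": "1.0.1"}},
    "web-base-copy": {"content": b"webstack-1.0", "packages": {"openssl": "1.0.1"}},
    "db-base":       {"content": b"dbstack-2.2",  "packages": {"openssl": "1.0.2"}},
}

def find_duplicates(imgs):
    """Index images by content hash; report (duplicate, original) pairs."""
    seen, dupes = {}, []
    for name, meta in imgs.items():
        digest = hashlib.sha256(meta["content"]).hexdigest()
        if digest in seen:
            dupes.append((name, seen[digest]))
        else:
            seen[digest] = name
    return dupes

def needs_patch(imgs, package, fixed_version):
    """Images whose installed version of `package` predates the fix.
    (String comparison only happens to work for these toy versions.)"""
    return [name for name, meta in imgs.items()
            if meta["packages"].get(package, fixed_version) < fixed_version]
```

Real libraries compare metadata and image layers rather than raw bytes, and version strings need proper parsing. But the shape is the same: once images are indexed, "find every server built from the unpatched image" becomes a lookup rather than a hunt.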
Along similarly future-proofed lines, note that SmartCloud Monitoring offers support not for just one hypervisor, but for many. Ergo, if you want to add different hypervisors to your cloud over time, you can just go ahead and do it, and rest assured that the IBM solution has got your back.
Goodman sees this particular instance of future-proofing as a serious advantage.
�IT departments want to be able to choose hypervisor technologies based on cost-benefit analysis, and not feel compelled to stay with a particular vendor just because they've become reliant on its management tools,� he said. �As a management solution that crosses different hypervisor platforms, and indeed physical platforms as well, SmartCloud Monitoring allows customers to maintain tool continuity as they move workloads from one virtualization platform to another, focusing directly on availability, performance and total cost of ownership.�
So with all that in mind, let me ask you this:
When it comes to private clouds... what, really, are you so worried about?
Connect, learn and share with Cloud/Virtualization Management experts.

About the author: Guest blogger Wes Simonds worked in IT for seven years before becoming a technology writer on topics including virtualization, cloud computing and service management. He lives in sunny Austin, Texas and believes Mexican food should always be served with queso.
Did you Tweet today? Update your Facebook status? Check in on LinkedIn and Foursquare?
I did. All four.
Of course, it's my job to do these things.
I'm glad it is. Because even if it weren't, I'd still do it. Tweeting is fun. Facebook and LinkedIn keep me in touch with my friends. And, increasingly, they're my way of knowing what's going on in the world.
If you're reading this, you've probably done the same, probably every day for quite some time now. No big deal, right?
Today is Mashable's second annual "Social Media Day," a global celebration of the technological advancements that enable everyone to connect with real-time information, communicate from miles apart and have their voices heard.
All of these activities are expected; none is really that surprising. So why is Social Media Day such a big deal?
I can think of three reasons:
Your connections are your currency. The more you have, the richer you are. On its own this isn't new, but the ease and speed with which we can find each other, connect with each other and share with each other on a conceivably infinite number of topics certainly are. Help your connections make other connections and you'll be very rich, very quickly.
All media is social media. As a student of media history, I've never been keen on the term "social" media. All media are inherently social; why do they exist, if not to connect people? In a more pragmatic sense, just think: Where do more and more people go now for breaking news but Twitter? Where do 500 million people go to discuss politics and entertainment but Facebook? The site isn't just taking a greater share of Web visitors every day, it's taking share from other sites as well. This isn't to say there's no value in visiting CNN, BBC or CBC; but discussions about their stories often happen elsewhere. For marketers and brand managers, too, an increasing number of clicks lead to Facebook and Twitter.
Data, insights, outcomes. There was a time when it was enough to run your business knowing a little about what happened last week. That was roughly the time when newspapers came in the evening and TVs still had antennas. Now, not only must companies anticipate the future, they must analyze and shape events that haven't even happened yet. In this brave new world, historical data can help, but analytics-driven organizations need to tap into data that moves and changes a lot more quickly and tells a much more valuable story. To a great extent that data comes from Facebook and Twitter. Properly analyzed, companies can learn from this data what their customers like, what they don't and what they're likely to want next.
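Even a crude sketch shows the shape of that analysis. The example below uses invented posts and keyword lists (real systems use proper sentiment models, not word matching) to tally positive and negative mentions of each product across a stream of posts:

```python
from collections import Counter

# Toy social-stream analysis: score each product by counting simple
# positive and negative keywords in posts that mention it.
# Keyword lists and posts are invented for illustration.
POSITIVE = {"love", "great", "awesome"}
NEGATIVE = {"hate", "slow", "broken"}

def tally(posts, products):
    scores = Counter({p: 0 for p in products})
    for post in posts:
        words = set(post.lower().split())
        for product in products:
            if product in words:
                scores[product] += len(words & POSITIVE)
                scores[product] -= len(words & NEGATIVE)
    return scores

posts = [
    "I love the new gizmo",
    "gizmo is great but the widget is slow",
    "hate how broken the widget feels",
]
scores = tally(posts, ["gizmo", "widget"])  # gizmo trends up, widget down
```

The real work is in doing this continuously, at scale, and with language understanding far beyond keyword sets -- but the business question being answered is exactly this one.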
Side note: The item on smarter security relates to the BigFix acquisition news I mention on the IBM Tivoli Software blog.
I'd recommend paying close attention to another big news item not on this list: a new survey shows IBM Business Partners expect social media to drive sales. A recent IBM survey of over 1,000 Business Partners found that while 45% of IBM Business Partners are experimenting with social media, 74% are seeking out education on social media for business. See the video with IBM VP Sandy Carter below for more details. You can expect to see big things in this space in the months ahead, and I'm excited to be part of that effort. More on that later.
It was a food fest of social conversations and activity around IBM and Smarter Planet this week! It's times like these when I want to leave my day job and spend the whole time participating in conversations and activities on how to make the world a better place. Fortunately, I work at a company that gives me lots of opportunities to do just that even during my day job. While I'm on this topic, and before I get to the business at hand, quick shout out to the IBM launch of People for a Smarter Planet on Facebook this month.
According to Twazzup, the top six links shared around "ibmsoftware" in social media circles today revolve around:
The links above give you a hint of how busy next week is going to be! While all the topics are excellent, the one that will consume my passion and attention the most is the IBM Summit at Start online activity. Start is a United Kingdom initiative by The Prince's Charities Foundation to promote and celebrate sustainable living.
From September 8-16, IBM is sponsoring a nine-day summit at Start in London to convene business, industry, and academic thought leaders to discuss challenges and next steps to enable economic, environmental and societal sustainability. If true change is going to happen, it's going to take groups like this and YOU and me, of course, to make it so. Below are some of the videos and photos at the event and a few other links if you'd like to follow the online conversation.
Next week sees 8,500 of the most influential CIOs and IT professionals from around the world converge on the Swan and Dolphin hotel in Orlando for the 2013 edition of Gartner Symposium ITxpo. IBM is once again a Premier Sponsor, with senior IBM Software executives on-hand to present the latest innovations and provide their insights into the industry's most important trends. Here's a sampling of what you'll hear:
On Cloud: Cloud computing has changed the way we think about technology. It already offers new economics in IT, service delivery without boundaries, and stronger customer relationships. Now see how IBM SmartCloud® can keep your business in line with current demands.
On Big Data & Analytics: Data is on the rise. And any organization that mines this growing resource with analytics can outperform the competition. Moreover, with Smarter Analytics from IBM, you can turn information into insight, make better decisions faster, and enact changes to maintain your competitive edge.
On Mobile Enterprise: Mobile devices are facilitating greater interactions with employees, customers and partners, driving intelligent decisions and actions as a result. But they can also create management and security issues. Find out how IBM mobile services can guarantee secure performance.
On Social Business: Social is the next big movement in business. An interconnected and instrumented world has permanently altered the way people work, interact and make decisions. It's also shifted more power from the organization to the customer. Find out how you can capitalize on this evolving landscape.
We'll also be on-hand to discuss Smarter Leadership. As software moves ever closer to the forefront of business process, so, too, does the CIO move closer to the office of the CEO: by aligning IT with business objectives, forward-thinking decision-makers are building an innovation capability to support revenue-gaining initiatives.
The centerpiece of the IBM presence at Gartner Symposium will be a presentation by Steve Mills, SVP and Group Executive, IBM Software and Systems. It's called "Cost-Effective IT Strategies to Lead in a Digital World." I've seen an early draft of the presentation and while I can't give away the details, I can share with you the five key points Mills will be conveying. Here they are:
Do not confuse price with cost
Budgeting and chargeback techniques can create false economies
Technology is a tool, not a religion ... insist on fact-based analysis
TCO cannot be overlooked but neither can agility and effectiveness
zEnterprise multimedia resource library. Analysts, reporters and customers can't stop talking about the new IBM zEnterprise System -- so we've put everything they have to say on one page packed with links to videos, podcasts, demos, reports, papers and more. Take a look.
The CityOne "serious game" is ready to play! Since it was announced in April at Impact 2010, the CityOne city-sim game has been eagerly anticipated by analysts, customers and journalists alike -- in fact, nearly 8,000 people registered to be notified when the game went live. Play CityOne to learn how you can make cities and industries smarter by applying IBM technologies -- BPM, SOA, collaboration, cloud computing -- in innovative ways. Or play just to tell people you played an online game featuring content from the EPA. Register and start playing.
I'm only 50 or so pages into Kevin Kelly's new book, What Technology Wants, but I can already tell it's going to be a fascinating read.
A timely one, too, given we're only a few weeks away from the showdown between Watson and its human challengers.
More on Watson further down. First, a description of Kelly's book straight from the publisher:
This provocative book...suggests that technology as a whole is not a jumble of wires and metal but a living, evolving organism that has its own unconscious needs and tendencies. Kevin Kelly looks out through the eyes of this global technological system to discover "what it wants." He uses vivid examples from the past to trace technology's long course and then follows a dozen trajectories of technology into the near future to project where technology is headed.
Given this description and Kelly's background as co-founder and former executive editor of Wired, I was surprised to read on the first page that for most of his life he "owned very little, dropped out of college and for most of a decade wandered remote parts of Asia in cheap sneakers and worn jeans, with little time and no money."1
Not surprisingly, though, this existence helped him develop an acute sense of the organic rhythms and reality of nature:
Living close to the land, I experienced the immediacy that opens up when the buffer of technology is removed. I got colder often, hotter more frequently, soaking wet a lot, bitten by insects faster, and synchronized quicker to the rhythm of the day and seasons.2
Good technology "can lift your soul"
These experiences helped Kelly develop an appreciation for truly great technology:
If my travels in the old world had taught me anything, it was that aspirin, cotton clothing, metal pots and telephones are fantastic inventions. They are good...Anyone who has ever held a perfectly designed hand tool knows that it can lift your soul.3
Steve from the moment I met him always loved beautiful products, especially hardware. He came to my house and he was fascinated because I had special hinges and locks designed for doors. I had studied as an industrial designer and the thing that connected Steve and me was industrial design. It wasn't computing...Steve had this perspective that always started with the user's experience; and that industrial design was an incredibly important part of that user impression.
Technology evolves like we do
In chapter three, History of the Seventh Kingdom, Kelly draws a convincing parallel between the evolution of genetic organisms (amoebas, zebras, your uncle Bernie) and the evolution of technology -- the sum total of which he refers to as the technium. Consider this passage:
The two share many traits: The evolution of both systems moves from the simple to the complex, from the general to the specific, from uniformity to diversity, from individualism to mutualism, from energy waste to efficiency, and from slow change to greater evolvability.4
A few sentences later, he explains how disparate ideas within the technium often merge into entirely new entities:
Most new ideas and new inventions are disjointed ideas merged. Innovations in the design of clocks inspired better windmills, furnaces engineered to brew beer turned out to be useful to the iron industry, mechanisms invented for organ making were applied to looms, and mechanisms in looms became computer software.5
Is it me, or does this sound a lot like a mashup?
Language is technology is language
On the next page, Kelly explores the importance of language - specifically language as a technology:
A prime example would be the transformation of alphabets (strings of symbols not unlike DNA) into highly organized books, indexes, libraries and so on (not unlike cells and organisms).6
Transitions, Kelly writes, are a key part of evolution. Just as single-celled organisms transform into multi-cell organisms (outlined on page 46), technological evolution drives a transformation from oral culture to writing and mathematical notation (outlined on page 47).
Language drives transition
No transition in technology, he continues, has affected our species, or the world at large, more than the creation of language:
Language enabled information to be stored in memory greater than an individual's recall...The invention of writing systems for language and math structured this learning even more. Ideas could be indexed, retrieved, and propagated more easily. Writing allowed the organization of information to penetrate into many aspects of everyday life.6
Where Watson and business analytics come in
This, I thought, is where Watson comes in. It's also, I thought, why Watson is so important, not only to the evolution of the technium, but also to business analytics. Here are three reasons why:
First, Watson relies on those highly organized books and indexes to find the responses to the Jeopardy! clues. As a self-contained system, its algorithms must be able to quickly access and analyze every piece of data from nearly infinite perspectives. It must also trust the accuracy of the data for its confidence algorithms to be of any value. This isn't far removed from the queries you run on your structured corporate data every day, or the trust your users place in the data they receive from those queries.
Second, as odd as it may sound, you could view your data as a "living" entity in continual transformation. In nearly every organization, text now merges with audio, data cubes mesh with images to create entirely new sources of business insight. It's a much more accurate reflection of the world's activity and knowledge, and Watson must make sense of it to beat the world's best. As a BI professional, you need to make sense of this same complex mix of data types to drive better business outcomes.
Finally, Watson is driving another important transformation in what many computer scientists see as the final frontier in human-computer interaction: solving natural language.
For us humans, language codifies knowledge, enabling it to be shared down through generations and across different communities to spread ideas and drive progress. In a similar vein, the most effective business analytics deployments rest on effective communication between IT and business, in which definitions, expectations, timelines and thresholds are all commonly agreed upon and widely shared.
Should Watson beat the Jeopardy! champs, it will show us the way to reach -- and perhaps surpass -- that final frontier. In doing so, it has the potential to make a world of insights instantly available to anyone who asks. And given how many questions come up in a typical business analytics deployment, that possibility should be enough to get anyone excited.
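The confidence idea at the heart of Watson can be sketched in a few lines. This is a deliberately naive illustration, nothing like Watson's actual algorithms: score each candidate answer by its supporting evidence, normalize that into a confidence, and answer only when the confidence clears a threshold, just as Watson only buzzes in when sufficiently sure:

```python
# Naive confidence sketch (not Watson's real method): normalize each
# candidate's evidence score and answer only above a threshold.
# The candidate answers and evidence scores are invented.

def best_answer(candidates, threshold=0.5):
    """candidates: {answer: evidence_score}.
    Returns (answer or None, confidence of the top candidate)."""
    total = sum(candidates.values())
    answer, score = max(candidates.items(), key=lambda kv: kv[1])
    confidence = score / total if total else 0.0
    return (answer if confidence >= threshold else None), confidence

# Invented evidence scores for one clue:
clue_scores = {"Toronto": 3.0, "Chicago": 9.0, "Boston": 3.0}
answer, confidence = best_answer(clue_scores)
```

The same pattern matters in a BI deployment: a query result is only as useful as the trust users can place in it, so surfacing confidence alongside answers is part of the job.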
Two contrasting views of software came through my Twitter stream today and got me thinking.
I do that sometimes.
The first was a blog post by Dr. Norman Lewis. The second was by Manoj Saxena. Dr. Lewis is a co-author of Big Potatoes: The London Manifesto for Innovation as well as Chief Innovation Officer and a Managing Partner of Open-Knowledge. Mr. Saxena holds two U.S. software patents for advanced discovery and non-intrusive personalization of Web services.
Clearly, these men know a thing or two about code.
In his post, "Facebook valuation: $100 billion for what?", Dr. Lewis takes the financial media to task for celebrating what he views as a colossal waste of time. Investor excitement about Facebook, he says, is about neither innovation nor any social or economic benefit. Rather, he says, Facebook's financial success depends on continuing to enable a regressive worldview in which connections are ends, not means, and introspective narcissism is celebrated.
"[In its S1 filing] Facebook argues [...] that if it fails to retain existing users or add new users, or if its users decrease their level of engagement, then its revenues, financial results and business may be significantly harmed. In other words, what Facebook understands, and what it wants its prospective investors to understand as well, is that its future success relies upon more of the same: more people playing games, sharing photos and sending each other messages upon which Facebook can generate revenues through targeted advertising...The largest technology IPO in history, in other words, is not about a technology that can transform nature, provide new sources of energy, or new cures for illnesses or cancer, for example; it is about a platform that encourages ever more of the same unproductive, self-absorbed communications between users...Facebook is now a cultural institution, which is driven by, and nurtures, a culture of self-reflective, self-absorbed individuated entertainment and therapeutic communications. The sad truth is that the frenzy and excitement generated by Facebook's pending IPO reflects little more than its cultural significance. And this is a million miles away from where investment ought to be focused...The Facebook IPO does show how unambitious contemporary society's expectations are about technology. That Facebook could become the largest technology IPO in history is an alarming prospect."
Mr. Saxena's post, by contrast, is about Watson's potential in healthcare:
"We anticipate that the technology will next be applied to other diseases and could eventually become a ubiquitous bedside and office-visit assistant for doctors and nurses, enabling them to deliver truly personalized care to everyone they serve....Watson's potential to help transform the healthcare industry is especially meaningful to me. My mother, who was a doctor in our native India and now lives with my family in Austin, began showing symptoms of dementia two years ago. I'm witnessing close up the decline of a person who was the model of a vibrant, strong professional woman and parent. Now she's wandering into a fog. It's tragic. And it didn't have to be this way. We need new tools to help physicians understand and prevent diseases, and Watson promises to be an important new tool in that battle. I don't want others to experience what my mother is going through right now."
In other words, Watson is doing what Dr. Lewis believes technology should do.
I'm inclined to believe him, too. Yes, I'm on Facebook and enjoy trading pop-culture references with friends far and wide. But it's an amusement. One thing you learn when you work at IBM is that expectations of ourselves and of the products we build are very high.
It may be too early to know what Facebook's IPO really "means" in the grand scheme of things. Facebook users may effect real change through their thousands of connections. But for now I'm happy to work for a company that understands the importance of the role it plays in the world and the impact its products have on the way we live our lives.
Studies show that organizations that apply analytics outperform their peers. Further, those with a broad-based, analytics-driven culture perform, on average, three times better. Not only do they drive more top-line growth and control costs, they take timely corrective action to reduce risks that derail their plans.
So, with a month to go before we kick off the big show down in Vegas, why not secure your spot at the biggest conference in the IBM Software galaxy and discover what our diverse portfolio of analytics software can do for your organization? Whether your bent is predictive analytics or social media analytics, financial or operational analytics or the increasingly important arena of customer analytics, you'll find the answers -- and the insights -- you'll need to drive better outcomes for yourself and your organization. It's the smartest move you can make to get smarter about smarter analytics.
Last weekend we had a friend over for a good ol' Texas BBQ. He has been working as a database administrator at his company for over 15 years and was recently promoted to the most senior position in that role. But while we were enjoying our German Hefeweizen beer, he reported that he'd had a bad week and was really upset about some work-related issues.
The “security guys” in the company have forced everybody on his staff to specifically sign out an administrative user ID before they can access any of the production databases. “How could they undermine my trust and skills in such a way?” he complained.
Wednesday's launch of IBM PureSystems captured a lot of attention from the folks in the server room, but it was a busy week on the software front as well. Here are five big announcements you may have missed. Analytics fans, take note: the first three should be particularly interesting for you:
1. IBM to acquire Varicent
IBM added to its Smarter Analytics strategy with the acquisition of Varicent, a provider of analytics software for compensation and sales performance management. Varicent is a privately held company based in Toronto. Financial terms were not disclosed.
The software allows clients such as banks, insurance companies, retailers, information technology and telecommunications providers to more accurately determine compensation, streamline territory assignments, manage quotas, and report and analyze sales activities. The software also strengthens audit and compliance readiness and provides transparency for all aspects of incentive compensation.
Varicent has also been recognized as a leading category vendor by Ventana Research and Gartner, and was also ranked as the fastest-growing software company on Deloitte's 2010 Technology Fast 500.
2. Seven employees named IBM Fellows

IBM has elevated seven employees to IBM Fellow -- its most prestigious technical honor -- to acknowledge their important contributions and industry-leading innovations in developing some of the world's most important technologies.
The seven employees who have earned the coveted distinction of IBM Fellow this year are Luba Cherbakov (IBM Enterprise Transformation), Paul Coteus (IBM Research), Ronald Fagin (IBM Research), Vincent Hsu (IBM Systems and Technology Group), Balaram Sinharoy (IBM Systems and Technology Group), Ruchir Puri (IBM Research) and Jeff Jonas (IBM Software Group). You can read all about them and their accomplishments here.
If you were at last year's Information On Demand conference you'll no doubt remember Jeff Jonas' dynamite keynote presentation on Day 1. If not, here's a video that touches on what he's all about.
3. Groundbreaking research collaboration to drive economic development in Canada
The Governments of Canada and Ontario, with IBM (NYSE: IBM) and a consortium of seven universities led by the University of Toronto and Western University (my alma mater) announced a collaboration to establish a new Ontario-based $210 million R&D initiative that will create 145 new highly skilled jobs in Ontario and a new economic cornerstone for the country. IBM will invest up to $175 million through December 2014 in the project, thus forming the "IBM Canada Research and Development Center" to serve as a foundation for the research initiative. The Government of Ontario is investing $15 million; the Government of Canada will contribute $20 million.
This collaborative model will help university and industry researchers use high performance and cloud computing infrastructure to better manage and analyze massive data sets to solve critical world challenges including rapid urbanization, rising healthcare costs, water conservation and management and energy efficiency. Dr. Bernard Meyerson, vice-president of innovation at IBM, provides his perspective here:
4. IBM Joins OpenStack with Platinum Sponsorship
On the cloud computing front, IBM announced yesterday its Platinum Sponsorship of the newly formed OpenStack Foundation. As part of this sponsorship, IBM will assist OpenStack as it evolves into a new, independent foundation, alongside other companies committed to the goal of promoting open standards for cloud computing.
OpenStack is a global collaboration of developers and cloud computing technologists that seeks to produce a ubiquitous Infrastructure as a Service (IaaS) open source cloud computing platform for public and private clouds. OpenStack was founded jointly by Rackspace Hosting and NASA in July 2010. Over time, more than 150 companies have joined the project in varying degrees.
IBM will focus on several key areas to integrate and leverage OpenStack capabilities:
Create a robust, scalable, secure enterprise class platform for IaaS
Enable IBM capabilities in heterogeneous hardware and systems management under OpenStack
Enable workload aware and workload optimized systems building on OpenStack in the underlying IaaS
Leverage CDMI to manage data in the cloud
IBM intends to use its current storage software stack and platforms (e.g., Virtual Storage Center and SONAS) to provide high-value, end-to-end, comprehensive storage solutions
Combining IBM's current IaaS capability strengths with OpenStack will build a robust IaaS component as part of the Common Cloud stack which will be incorporated in our SmartCloud Foundation products.
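CDMI (the SNIA Cloud Data Management Interface) mentioned in the list above is a plain HTTP/JSON standard, so a management operation is just a method, a URI, headers and a body. Here's a sketch that builds -- but doesn't send -- a container-creation request; the endpoint and metadata values are invented:

```python
import json

# Build a CDMI container-creation request without sending it.
# CDMI operations are ordinary HTTP: the cdmi-container media type and
# the X-CDMI-Specification-Version header identify the protocol.

def cdmi_create_container(base_url, name, metadata):
    return {
        "method": "PUT",
        "url": f"{base_url}/{name}/",  # CDMI container URIs end in "/"
        "headers": {
            "Content-Type": "application/cdmi-container",
            "Accept": "application/cdmi-container",
            "X-CDMI-Specification-Version": "1.0.2",
        },
        "body": json.dumps({"metadata": metadata}),
    }

req = cdmi_create_container("https://storage.example.com/cdmi",
                            "backups", {"project": "analytics"})
```

Because the interface is just HTTP and JSON, any storage back end that speaks CDMI can be managed with the same client code -- which is exactly the vendor-neutrality argument being made for OpenStack here.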
5. IBM, Honda & PG&E partner for smarter charging
Finally, on the Smarter Planet front, IBM also announced that it has teamed with American Honda Motor Co., Inc. and Pacific Gas and Electric Company (PG&E) on a new pilot project that will allow communication between electric vehicles (EVs) and the power grid. This project will demonstrate and test an electric vehicle's ability to receive and respond to charge instructions based on the grid condition and the vehicle's battery state. With visibility into charging patterns, energy providers will have the ability to more effectively manage charging during peak hours and create consumer-friendly programs to encourage electric vehicle adoption.
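A toy version of that charge-instruction logic (invented thresholds and rates, not the pilot's actual control scheme) might look like this: the grid publishes its current load, and each vehicle picks a charge rate from that signal and its own battery state:

```python
# Toy smart-charging policy: back off as the grid approaches peak load.
# Thresholds and rates are illustrative, not from the actual pilot.

MAX_RATE_KW = 6.6  # illustrative Level 2 charger ceiling

def charge_rate(grid_load, battery_pct):
    """grid_load and battery_pct are fractions in [0, 1]; returns kW."""
    if battery_pct >= 1.0:
        return 0.0                 # battery full: nothing to do
    if grid_load > 0.9:
        return 0.0                 # peak: pause charging entirely
    if grid_load > 0.7:
        return MAX_RATE_KW * 0.25  # high load: trickle charge
    return MAX_RATE_KW             # off-peak: charge at full rate
```

Layer in time-of-day pricing and per-vehicle departure deadlines and you get the kind of consumer-friendly programs the announcement hints at.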
Here's a great infographic that explains the issue: