Guest post by Dr. Jean Paul Ballerini, Sales Enablement for IBM Security.
Reducing expenses is a mandate for departments in every company; this is particularly true for those departments traditionally considered a "necessary cost," like security. I won't debate here why I truly believe that security is an enabler of business and far from just a cost; instead, I will focus for a moment on the opportunity that lies ahead of us: securing heterogeneous environments.
But first, let's discuss how we got to this point. The way our environments have become heterogeneous is often the result of this cost-savings driver. We are led into the temptation of looking for an affordable point solution that satisfies the need of a new platform, a "quick and dirty" way to prove compliance, or a pure cloud security solution (whatever that means). The hidden cost is invariably the lack of integration. Every point solution brings a new management platform, new processes to be introduced in the company and, of course, additional complexity.
Every quick fix turns out to be quick only once; the second time around, things don't work out the way they did at first, or the solution doesn't meet the requirements of other regulations. A product tuned to fit a specific architecture doesn't allow us to easily secure multiple points, so there is either the larger cost of doubling the solution or a reduced level of security.
So what can or should we do? The answer is consolidation and correlation.
Consolidation tackles two aspects whose impact on the security strategy is far-reaching.
First, the security measures that are implemented need to work in heterogeneous environments. We cannot afford to have security measures dedicated to a specific architecture living in isolation from everything else; it is costly and ineffective. How can any company be satisfied with a solution that fits a cloud architecture well but not a traditional one? Not only do management cost and time double, but the capacity to correlate events is seriously stretched.
Second, companies can less and less afford the cost, time and complexity of dealing with multiple (often dozens of) vendors. Consolidating the number of vendors is, on the one hand, often a guarantee of good integration among different areas of security (e.g. identity management and threat management). On the other, it frees companies to concentrate on managing the vendor relationship and obtaining better deals.
Whereas consolidation can greatly improve any company's capacity to manage and respond to security issues, there is additional value in the increased intelligence gained when all the information from all sources and devices in your network is collected and correlated. For example, the entirety of the data output from the networking infrastructure (e.g. logs from switches and routers, all the "allows" from firewalls) often isn't considered security information; yet, when an incident is detected, it is crucial to be able to reconstruct the incident itself as well as link it to the related data flow, which might lead the analyst to victim #1 or, even better, to the source of the incident. It is then that the knowledge gained from this correlated and consolidated data becomes security intelligence.
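To make the correlation idea concrete, here is a minimal Python sketch, with entirely illustrative event records and field names, of how a SIEM-style tool might pivot on one IP address and rebuild an incident timeline from heterogeneous device logs:

```python
from datetime import datetime, timedelta

# Toy log records from heterogeneous sources; the fields are illustrative,
# not any real product's schema.
events = [
    {"time": datetime(2012, 7, 1, 9, 0), "device": "ids",
     "src": "10.0.0.5", "dst": "10.0.0.9", "info": "exploit signature"},
    {"time": datetime(2012, 7, 1, 8, 58), "device": "firewall",
     "src": "10.0.0.5", "dst": "10.0.0.9", "info": "allow tcp/445"},
    {"time": datetime(2012, 7, 1, 8, 55), "device": "router",
     "src": "203.0.113.7", "dst": "10.0.0.5", "info": "inbound flow"},
]

def correlate(events, pivot_ip, window=timedelta(minutes=10)):
    """Collect all events touching pivot_ip, sorted chronologically, and
    keep those within a time window of the latest one, so an analyst can
    walk the chain back toward victim #1 or the source of the incident."""
    touching = sorted((e for e in events if pivot_ip in (e["src"], e["dst"])),
                      key=lambda e: e["time"])
    if not touching:
        return []
    anchor = touching[-1]["time"]
    return [e for e in touching if anchor - e["time"] <= window]

timeline = correlate(events, "10.0.0.5")
for e in timeline:
    print(e["time"], e["device"], e["src"], "->", e["dst"], e["info"])
```

Even this toy version shows the payoff: the router flow that preceded the firewall "allow" and the IDS alert only becomes meaningful once all three sources land in one correlated view.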
Is this an easy path? Not necessarily; this process requires good communication among departments, it sometimes requires the replacement of existing point solutions, and it requires a proactive approach to securing an IT environment, rather than just reacting to the latest security trend. Is this the best path? I am strongly convinced it is; a consolidated security solution, one that secures heterogeneous environments yet maintains a high level of integration, thus favoring the correlation of information, will reduce complexity and, consequently, minimize costs. This high level of integration is one of the basic requirements for enabling a company to better identify emerging and advanced threats; it gives the security staff the visibility they need to identify incidents much sooner, hence preventing or limiting their impact.
About the author:
Jean Paul Ballerini has been a member of the World Wide Security Sales Enablement Team since January 2010. Prior to that, he was the Technical Sales Lead for IBM's South West Europe region, after serving as Senior Technology Solutions Expert for IBM Internet Security Systems for the previous six years. Since 2003, Ballerini has also served as the EMEA spokesperson for the X-Force, the IBM security research and development team.
He also holds a PhD in Computer Science and Law. In 2005 Ballerini became a CISSP, and since 2007 has also served as a Qualified Security Assessor for the Payment Card Industry. In June 2008 he was appointed an IBM certified Senior Technical Staff Member.
Guest post from Jing Shyr, Chief Statistician & Distinguished Engineer, IBM Business Analytics
It's the age-old question: why did the chicken cross the road?
With one chicken, the answer is easy to compute.
But, what if there were millions of chickens crossing the road? And each chicken had a mobile device and was tweeting out its opinions, desires, likes/dislikes, photos, and detailed descriptions of what it had for breakfast that morning. Oh, and what if that road was being monitored by millions of sensors?
With current statistical techniques, it's no longer easy to quickly understand why each chicken decided to cross that road and, more importantly, predict when they might cross again.
The business analytics and statistics industry faces tough data analysis challenges in the coming years, including a skills shortage, making analytics easy to consume, mobility and big data.
Having been around the analytics industry for many years, I find it refreshing to see that businesses are taking statistics and data mining results and injecting them directly into the business (and directly into the business process itself). The Catch-22 is that while more and more organizations are realizing the benefits of analytics, finding professionals who understand how to capture and analyze the tsunami of data created daily still requires training and a unique skill set.
A recent McKinsey Global Institute report indicates that over the next seven years the need for highly skilled business intelligence workers in the U.S. alone will dramatically exceed the available workforce, by as much as 60 percent.
I often imagine a business analyst presenting results to an executive the same way I present to my students. When teaching a lesson on modeling, I often ask, "Do you see what I see?" Everyone stares with blank looks on their faces and says, "No! What do you see?"
Herein lies part of the problem. To help counteract the skills shortage, we have to make the software easier to use and force the software to be consumable versus strictly scientific. Communicating results is just as important as the results themselves. I strongly believe that statistical software needs to go through a revolution of its own and become as intuitive as a smartphone.
And speaking of smartphones...
Most statistical software produces an incredible amount of very large tables and charts, making it extremely difficult to comprehend in a mobile environment. I torture my eyes every time I try to read a report on my Blackberry.
Consumability means anywhere, anytime and through any device. It's time we hold statistical software to a higher standard.
Let me get back to the chickens for a moment.
The volume, velocity and variety of data today are seemingly overwhelming traditional statistical software. Not to be cliché, but Big Data is giving the statistics industry big problems.
Previously, if we wanted to analyze any data, we would follow the same logical flow: decide what we want to predict or classify and build a model by bringing in all the predictors (independent variables). The number of predictors was often well below 100.
Today, however, we are dealing with thousands of different variables, making traditional statistical analysis a serious hurdle. Machine capacity is no longer sufficient, and many algorithms have been outpaced by data volume.
The challenge calls for a new process of data reduction before modeling, and new computational algorithms are required to handle millions of records and fields quickly in a distributed environment without passing the data back and forth multiple times.
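As an illustration of single-pass data reduction, the sketch below (my own toy example, not a specific IBM algorithm) accumulates per-feature sums and sums of squares chunk by chunk, so variances, and hence a top-k feature screen, are computed without passing over the data multiple times:

```python
import numpy as np

def streaming_variance(chunks, n_features):
    """One pass over data chunks: accumulate count, sum and sum of squares
    per feature, so variance is computed without revisiting the data."""
    n = 0
    s = np.zeros(n_features)
    sq = np.zeros(n_features)
    for chunk in chunks:
        n += chunk.shape[0]
        s += chunk.sum(axis=0)
        sq += (chunk ** 2).sum(axis=0)
    mean = s / n
    return sq / n - mean ** 2

def top_k_features(make_chunks, n_features, k):
    """Keep only the k highest-variance features as a crude pre-modeling screen."""
    var = streaming_variance(make_chunks(), n_features)
    return np.argsort(var)[::-1][:k]

# Synthetic demonstration: 1,000 rows, 50 features, one of which (index 3)
# has a much larger variance than the rest.
rng = np.random.default_rng(0)
data = rng.normal(size=(1000, 50))
data[:, 3] *= 10
make_chunks = lambda: (data[i:i + 100] for i in range(0, 1000, 100))
keep = top_k_features(make_chunks, 50, 5)
print(keep[0])  # expect feature 3 to rank first
```

In a real distributed setting the per-chunk sums would be computed on separate nodes and combined once, which is exactly what makes this a single-pass technique.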
Most importantly, we don't need to be chicken when it comes to Big Data.
Creating new statistical techniques for Big Data will get us all to the other side of the road, and you'll never have to ask why.
There is a German proverb that says, "The eyes believe themselves. The ears believe other people."
Londoners are taking this proverb very seriously. The London Eye (Ferris wheel), one of the most visited attractions in London and the spot where the New Year's fireworks are set off each year, is being transformed into a sentiment monitor.
With every Olympics, we get to see the splendor of each host nation. In 2008, China confirmed that it had an ability to stop rain, and now London is turning its busiest tourist attraction into a social media "mood ring," a partnership between EDF Energy and a group of graduates from the Massachusetts Institute of Technology (MIT).
They developed an intuitive algorithm that linguistically analyses tweets related to the Games. Tweets will be scanned for Olympic-relevant terms such as "Olympics," "Torch Relay," "London 2012," and EDF's own hashtag, "#energy2012."
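EDF has not published the algorithm's internals, but the basic mechanics of keyword relevance filtering plus lexicon-based sentiment scoring can be sketched in a few lines of Python; the term lists and tweets below are purely illustrative:

```python
# Illustrative term and sentiment lexicons; a production system would use
# far richer linguistic analysis than simple word matching.
OLYMPIC_TERMS = {"olympics", "torch relay", "london 2012", "#energy2012"}
POSITIVE = {"love", "amazing", "proud"}
NEGATIVE = {"hate", "chaos", "locked"}

def is_relevant(tweet):
    """Keep only tweets mentioning an Olympic-related term."""
    text = tweet.lower()
    return any(term in text for term in OLYMPIC_TERMS)

def sentiment(tweet):
    """Crude lexicon score: positive words minus negative words."""
    words = set(tweet.lower().split())
    score = len(words & POSITIVE) - len(words & NEGATIVE)
    return "positive" if score > 0 else "negative" if score < 0 else "neutral"

tweets = [
    "I love the Torch Relay, so proud of London 2012!",
    "Traffic chaos everywhere thanks to the Olympics.",
    "Nice weather today.",
]
relevant = [t for t in tweets if is_relevant(t)]
for t in relevant:
    print(sentiment(t), "-", t)
```

Even this toy filter shows why the questions below matter: a single-language keyword list silently drops every non-English tweet, and anything the lexicon does not cover lands in "neutral."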
This initiative rides on the social media wave that the Olympics might set off. This is really cool, but having lived in London for almost a decade, and being involved in social media, I see large loopholes. Call me a bit pessimistic, "innit."
The Olympics represent an incredibly diverse environment, from races to languages to communities. So what are the details behind the analytics on the "Eye"? Is this algorithm going to track sentiment in languages other than English? Moreover, what are the details behind neutral or ambivalent sentiment?
Let's look at a beautiful example of some noise: On Twitter, Mr. Hancox said that for Londoners, "It's as if someone else is throwing a party in our house, with a huge entry fee, and we're all locked in the basement."
How will the eye in the sky pick up that statement and how will fellow Londoners and the rest of the United Kingdom react?
I would have thought that Boris Johnson, London's mayor, would have done this in a better way.
Well, it comes down to basics: we all learn from our mistakes. So rather than just monitoring the buzz and seeing how we fared, we need to analyze this social data and take corrective measures to ensure that negative sentiment is minimal, keeping in mind that we can't please everyone!
Take for example, RTL Nederland, a Dutch entertainment company that produces the "X Factor." It analyzed the sentiment of more than 71,000 online conversations about the show to understand audience needs and preferences and, based on the online feedback, altered the show for the final episodes to increase viewer satisfaction.
In today's business environment, no business can survive without analyzing all available data, including social media. A mega event like the Olympics is no different. To call social analytics important here is an understatement.
In the case of the London Eye and the Olympic organizers, I hope their ears are open and they are really listening to the feedback, so they can take appropriate actions to improve these Games, or the 2016 Games in Rio de Janeiro.
For now, let's watch the sentiment on the wheel and then decide.
How are big data initiatives managed? Who sets the direction? How do the business and IT sides of the equation collaborate? You'll learn a lot about big data down at Information On Demand in October, but for some early answers to these questions I spoke to two IBM Champions with deep expertise in Big Data. Alex Philp is Founder and Chief Technology Officer of TerraEchos. Ivo-Paul Tummers is CEO of Jibes.
Part 1 of our discussion is below. We'll have part two next week.
It seems that with every announcement we invent a new word to describe the amount of data out there. At some point, these numbers get so big as to be abstract. Ivo-Paul, what are your clients seeing? How much more data are they working with now than two or three years ago?
Ivo-Paul Tummers: Typically our customers see their data volumes growing 20-30% per year. One reason is that large amounts of data are generated purely by keying it in or through physical transactions. But an increasing amount of data is generated by machines, by devices and by people moving through networks. For example: GEO data and automatic scanning. Also, we are attaching a lot of data to the execution of processes and transactions, on top of what used to be much more lightweight data sets. This is just the growth of the data generated by internal company systems. When we also consider data from our customers' ecosystems, this growth and diversity is even higher.
Alex, is that what you're seeing with your clients as well?
Alex Philp: It's definitely explosive growth. Our customers right now represent large U.S. government organizations. We're dealing more and more with an explosion of unstructured data. With audio and full-motion video, we're getting into petabytes a day. The big debate is: if we're collecting it, where are we storing it? What are we analyzing and how are we sharing it? Our niche in the big data problem is the filtering and distilling on the real-time side, figuring out how to operate on the data in real time and deciding what to store for analysis later.
The other aspect of big data is the variety: unstructured data is the next source of insight and analytics, video, audio, things that don't come in rows and columns. Could you go into more detail about how you handle big data for your clients? What capabilities are you using?
Alex Philp: Our customers live on networks, and so we're trying to structure the conversations with clients around the "3VI Over Network" model. They're dealing with data coming from the sensor web or "The Internet of Things." Many of our clients are trying to figure out what part of their IT infrastructure can handle this data. How do we store it? What metadata management systems do we use? How do we index and pre-index video, or elements of data coming out of video? How do we separate the stuff that's meaningful from the stuff that's not so meaningful? Then we really get into deciding on the right analytics in the right place. It's finding the right algorithm against data at rest and data in motion, both offline and online. We're trying to come up with workflows that increase our customers' efficiency so they can be proactive in using and exploiting the information. It's multi-faceted. It's hardware, software and analytics and workflow optimization.
Ivo-Paul, what's your take on the variety aspect of big data?
Ivo-Paul Tummers: First, I agree completely with Alex. We're more in the retail space. The diversity of the data is indeed a bigger challenge than the sheer volume. But the real value is in combining your internal data with as many data sets as possible. The reasoning behind this is simple; your internal data only helps you to get insight into your own performance, not your relative performance. Another factor is that markets are increasingly customer-centric (as opposed to product-centric), which is reflected in the dominance and success of brands and dominant e-tailers like Amazon and Asos. They compete on customer experience and trust. Capturing customer data when it is relevant is key. We see many companies that recognize this changed reality, but they don't always know how to compete in this new arena. Luckily, we do see at a first assessment that there is actually more data than people think there is. And you can already make a first simple step in unlocking this potential by connecting these data sets. For example: mailing lists, customer counter records, webshop logs, cash register data, Google Analytics data and geographic data can be combined. However, data quality is still lacking. This is not necessarily complex, but it is a lot of work and it slows you down. So, bottom line: the amount of data and the number of nodes (connected datasets) is a real value driver. The real challenge is to turn all this data into actionable intelligence.
The last aspect of big data is velocity. I've often said that an organization can only move as quickly as its data, and today data moves very quickly indeed. Where is the data moving most quickly in your clients' organizations? What challenges does this increased speed pose to their decision-making?
Ivo-Paul Tummers: This is a bit of a paradox: more data to process in increasingly shorter time frames. Across our customer base we see that in retail the pace at which information flows is high. And the value of the data decreases fast, so you need to act fast. This holds for retail across all verticals, from banking to clothing. The challenge is in getting the data processed into useful information and routing it to the right people at the right moment. In order to do this, connecting datasets is just a precondition; the real work is in filtering, classifying, enriching, routing and actually applying the information. We see this in many of our customers' processes, ranging from buying decisions (impacting your stock position and working capital) to driving conversion through targeted offerings to specific customers. The speed at which information travels through the value chain has a deep impact and changes the dynamics of even the most stable internal business processes.
Alex, what are you seeing?
Alex Philp: We're seeing these trends spill into the commercial sector as well. We're being challenged to come up with hardware and network configurations that can anticipate 100Gb throughput. It's not uncommon for customers to say, "We want to get 80 to 90 to 100 gigabits in and out of a machine every second. How do we do that?" We're working with IBM and others to find the right combination of hardware (FPGAs, signal processors, GPUs, CPUs, POWER architectures) and the right network configurations to get that out. We might be dealing with anywhere from 300 to 500 megabytes of data. So we need to figure out how to run computationally intensive algorithms, a Bayesian or Markov algorithm, on data in motion. How do you filter down to the core? The interesting discussion I'm having with customers now gets back to what Ivo-Paul was talking about: the interdisciplinary approach to data and analytics.
Second, a lot of customers are asking us to tunnel horizontally through their data, connect up attributes and link them to data at rest. Some people call this "perpetual analytics." We can't afford to stop, so our models are being tuned by our observations. Also, everyone wants to see into the future, so customers are trying to get into predictive analytics. We're combining databases with data warehouses with real-time analytics and trying to find that sweet spot. Not everything is in motion or high-velocity, but you need to balance both attributes in that "3VI Over Network" equation.
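The "models tuned by observations" idea can be illustrated with a generic streaming technique (my own sketch, not TerraEchos' actual Bayesian or Markov pipeline): an exponentially weighted mean and variance that flag outliers while updating on every observation, so the model never stops learning from the data in motion.

```python
class StreamingAnomalyDetector:
    """Exponentially weighted mean/variance, updated per observation,
    so the model is 'perpetually' tuned by the data in motion."""

    def __init__(self, alpha=0.05, threshold=4.0):
        self.alpha = alpha          # how quickly the model forgets the past
        self.threshold = threshold  # flag points more than N 'sigmas' away
        self.mean = None
        self.var = 1.0

    def observe(self, x):
        if self.mean is None:       # first observation seeds the model
            self.mean = x
            return False
        diff = x - self.mean
        anomalous = diff * diff > self.threshold ** 2 * self.var
        # Update the model with every observation, anomalous or not.
        self.mean += self.alpha * diff
        self.var = (1 - self.alpha) * (self.var + self.alpha * diff * diff)
        return anomalous

det = StreamingAnomalyDetector()
stream = [10.0, 10.2, 9.9, 10.1, 10.0, 50.0, 10.1]
flags = [det.observe(x) for x in stream]
print(flags)
```

The design choice here is the same trade-off Alex describes: the detector touches each record exactly once, keeps constant state, and decides in real time what is worth storing for deeper analysis later.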
The intensity of the threat landscape is constantly increasing and evolving. As a result, we have seen a lot of corresponding innovation in security technologies and services, oftentimes from small businesses. While we all want to see advances that improve the global state of information security, the realities imposed by budgets and skills shortages mean that complexity has become the enemy of security in more ways than one. This is not to say that one vendor will ever have all of the answers for you, and we certainly don't claim to be that, but there is a reason you are seeing continued consolidation around both platforms and vendors. When organizations think about long-term investments and strategy, they have to ask the question, "Is what I'm buying actually going to make me more secure?" and oftentimes the answer is very much related to the questions "Am I adding more complexity or reducing it? Am I increasing my visibility and understanding, or do I now have another thing I'll never manage well?"
Mitigating risk while lowering cost is a daunting task, and it requires deploying and managing security processes and technologies across your people, data, applications and infrastructure. Ideally, these security technologies will not only prevent attacks but also provide the central reporting environment the IT department needs to validate that its technologies are indeed performing their tasks without interfering with the day-to-day work of the company's employees. In the event of a breach, a single repository of security logs is also essential to incident response and determining root cause.
With the rise of Next Generation Firewalls (NGFW) and Next Generation Intrusion Prevention Systems (NGIPS), new options are available to consolidate protection technologies. There is promise in this area but IT departments should be cautious and ensure that the new technologies are, in fact, new and do not fall short in key requirements for keeping the business running. Furthermore, they should be confident that the products satisfy the key requirement of adding capability without complexity. Ideally, the lives of security professionals should get easier.
One of the ways to ensure that you are actually improving security without adding complexity is having good capabilities around data analysis and security intelligence. The truth is that all of these technologies can generate lots of data, and security teams might find themselves overwhelmed without automated tools to help. Your Security Information and Event Management (SIEM) product should be able to consume data from all of the security products you deploy. These technologies, working together, will help improve protection against today's and tomorrow's security risks by providing security professionals with the data they need to make critical decisions, and make them at the right time.
More about the author: Brian Fitch is Product Manager for IBM Security. He has been in the information security industry for over 12 years. Brian currently manages the GX IPS and XGS NG-IPS lines of appliances. Prior to his product manager role, Brian was an Internet Security Systems (ISS) Systems Engineer for a decade.
Needing to decide among competing job offers while an undergrad at Stanford, Mayer eschewed compensation, location, and assorted perks for a decidedly analytical approach. Manjoo writes:
Over spring break, she studied the most successful choices in her life to figure out what they had in common. "I looked across very diverse decisions, everything from deciding where to go to school, what to major in, how to spend your summers, and I realized that there were two things that were true about all of them," she said. "One was, in each case, I'd chosen the scenario where I got to work with the smartest people I could find. ... And the other thing was I always did something that I was a little not ready to do. In each of those cases, I felt a little overwhelmed by the option. I'd gotten myself in a little over my head."
Maybe I'm presumptuous, but I'd say that's career guidance we could all use.
Guest post from Burke Powers, Managing Predictive Analytics Consultant, IBM Business Analytics
Today, every company of appreciable size has some social media presence. Most companies I speak with are either just monitoring social media or are engaged in "spray-and-pray" tactics that are only loosely tied to corporate goals.
To realize the value in social media it is important to integrate social media into broader customer analytics programs and business decision making.
1) Set a Clear Objective

Too often, companies ask, "What are customers saying about us?"
An objective like this is too vague to direct an analysis and identify actions. What we really need to be able to ask is, "Product XYZ will launch in two weeks. We have done A, B, and C campaigns to create awareness and to position the product.
"What kind of buzz (as measured by D, E, and F KPIs) has this created around each of our message points?

"Are there other topics that we did not anticipate?

"Can we set up real-time reporting of the topics so that we can monitor the customer reaction to the product once customers begin using it?

"Can we monitor any emerging, unanticipated topics after the launch?"
The objective should focus on an area of the business where you are confident additional insight can lead to quick improvements. The best opportunity might be related to a product, the service level of a critical customer touch point, competitor actions, a specific brand attribute, or a customer behavior.
2) Plan the Data

The sheer volume of social data requires some planning. There are a limited number of data aggregators (major aggregators include BoardReader, Gnip and DataSift), and each comes with its own benefits and trade-offs.
To choose the aggregator that best fits your needs, weigh how important data history is, what hosting the data will cost, and whether you need access to all social media data (the full fire hose) or a sample.
Second, decide whether to integrate additional data sources. Using the same filtering and reporting for social media and survey verbatims makes them more comparable for analysis and reporting. Also consider whether to include internal social network data from Yammer or Lotus Connections.
3) Plan and Execute the Analytics
By its nature, social media data is going to be different from what most business analysts are used to analyzing. It is unsolicited and unstructured and tends to be rich in attitudinal and usage information. It is frequently strongly positive or strongly negative.
But, it provides tremendous value because it has rich customer narratives of every product feature and customer touch-point that no other data source can offer. It brings traditionally dry analysis to life for business decision makers.
Most existing social media analytics tools offer only a limited ability to search and trend terms and to view some sort of sentiment. Some allow filtering by the source metadata as well. These are necessary elements of any serious analysis, but they stop short of offering the tools needed to take the data to an actionable level.
To be truly useful across many parts of the business, the free-text data needs to be understood in context and translated into an accessible format for reporting and analysis. This capability is one of the strongest differentiators for IBM Cognos Consumer Insight.
4) Motivate Actions
Once the analysis is ready, it is time to deliver the information to the decision maker at the right time, in the appropriate context to make a decision, and in a persuasive manner.
Finally, be sure to include a rich narrative quote that illustrates the argument and provides an additional persuasive hook, one that augments the analysis and builds buy-in from the "gut" of business leaders.
For example, let's say your company recently launched the "Wonder Widget." You are preparing the first report on how the product has been received by customers. Include a positive customer quote to support the data and drive the point home.
Ideally, the quote says exactly what your analysis leads to: "I love your new 'Wonder Widget,' it is already making a difference. Except for one thing: the XYZ dial has got to be moved closer to the display so that I don't have to look away. Fix this and I can easily justify ordering more units."
There are many social metrics that could be used, from the number of followers or tweets generated to the ratio of issues resolved to issues raised via social channels.
Additionally, you could track the results via click-throughs using IBM Coremetrics or email campaign response using IBM Unica.
You also might choose to experiment through customer support channels and monitor perceptions via both social media and surveys.
Finally, the metrics and actions need to be tied back to financial metrics either as revenue-generating or cost-reducing. This may require knowing the cost of resolving an issue via a social channel versus contact center or perhaps the cost of a response via one promotional channel versus another.
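As a back-of-the-envelope illustration of tying a social metric to a financial one, the arithmetic can be as simple as the following; every figure here is hypothetical:

```python
# Hypothetical per-issue costs, purely illustrative.
contact_center_cost_per_issue = 6.50  # agent time, telephony
social_channel_cost_per_issue = 1.20  # community manager time

issues_resolved_via_social = 400      # hypothetical monthly volume

# Cost reduction attributable to shifting resolutions to the social channel.
monthly_saving = issues_resolved_via_social * (
    contact_center_cost_per_issue - social_channel_cost_per_issue)
print(f"Estimated monthly saving: ${monthly_saving:,.2f}")
```

The same pattern applies on the revenue side: multiply a social-channel conversion metric by the margin per conversion, and the social program has a line on the P&L.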
5) Identify a New Objective and Repeat
Now that we've gone through the process from beginning to end, it can be repeated with a new objective. A disciplined approach using these best practices will generate rapid returns on virtually any social media analytics endeavor.
For more information:
Read the whitepaper on techniques for gaining valuable customer insight with social media analytics
"I'm recognizing the importance of learning new behavior and beginning to question my future role in society as a result of the leadership demonstrated by your organization. The donation of these new computers has caused me to think about the need for me to reach out to other people in communities around the world and share what I have learned." ~ Roger B., Ottawa Mission Client
As "more than a shelter," The Ottawa Mission helps Ottawa's most vulnerable residents manage through difficult times and get back on their feet. As a volunteer in the Mission's kitchen for the past six years, I've met these people and heard their stories first-hand.
How fortunate, then, that during its Centennial, IBM helped me help The Mission write a different story, one that's both heartening and humbling.
"Roger is thriving"
A year ago, Roger was too shy to look anyone in the eye. Yet in a few months he'll be ready to work with others as a certified electromechanical technician, installing and testing machines with complex hydraulic, pneumatic and electric controls.
Roger's personal development has come through The Mission's Essential Skills program, which has helped him learn to listen, give feedback and hold a conversation. His education has come through The Mission's new Distance Education modules from George Brown College, made possible through a Centennial Grant from IBM.
"His story moves you no matter what"
Last June, as part of IBM's Centennial Celebration of Service activities, I helped organize a day-long Workforce Skills seminar at The Mission. With the help of its Client Services team, resources from the IBM On Demand Community and close to 30 generous colleagues and friends, we introduced Mission clients to workplace culture and behaviors, helped them brush up their resumes and let them practice their interview skills, an activity that for some was entirely new. In the kitchen, team members served hot meals and prepared hundreds of sandwiches, while in the basement, more volunteers sorted through an Everest of donated clothing.
For our efforts, The Mission was awarded a cash grant of $15,000, which it used to replace the aging machines in its computer lab.
The result? "Roger is thriving," says Jennifer Crawford, Manager of Client Services at The Ottawa Mission. "His story moves you no matter what."
"We'd never bought 13 computers before"
"We're over the moon about the new computers," says Samantha Laprade, Legacy Giving Officer with The Ottawa Mission Foundation. But, she adds, the opportunity did throw them for a loop. "We'd never bought 13 computers before. We were at a bit of a loss as to how to go about it."
Help came from an unlikely source. Tom Donohue, The Mission's Chaplain, is also a former systems administrator with telco provider Telus. With Donohue's help, The Mission outfitted its computer lab with 13 new Lenovo ThinkCentre M71e towers, each configured with enough processing power, memory and software to meet the needs of The Mission's now-expanded offerings.
A new world of learning
The new towers open a new world of learning for clients. Now they can study a wide range of topics including bookkeeping, hospitality and tourism management, robotics, geographic information systems and electromechanics, Roger's current field of study.
The computers also meet a longstanding goal of the Client Services team. "We'd had this in our education proposal for over a year, but the modules weren't compatible with our old machines," says Crawford. "The experience would have been extremely tedious to navigate."
The new experience is exactly the opposite. In a thank-you letter to IBM, Roger writes: "The new computers have transformed the way that I process information and match my preferred learning styles, which are solitary, social and visual."
A modern approach to adult education
Crawford says the new machines bring The Mission more in line with current trends in adult education. "Almost everything in adult education is through distance learning. We're bringing opportunities to our clients that the rest of the world already has," she says. "With the old machines, it felt like 'poor machines for poor people,' and that's something we want to move away from."
Roger is taking full advantage of the new technology. In his letter, he writes: "The enhanced internet browsing capability enables me to research text, capture it and paste into a speed reader in order to improve my reading recall rate. My friend and I are taking a computer-based electronics technician course that uses the DVD capabilities of the new computers to create a virtual electronics lab simulator on the computer monitors."
Faster machines mean more sessions for more clients
Crawford adds that because the machines are so fast, clients can finish their courses in half the time. Their reliability means volunteers can lead job and resume skills sessions once a week instead of once a month. "It's so much less frustrating to have computers that actually work," she observes. "Even for things as simple as email, clients were asking for help because the computer was so slow. Now, instead of waiting for pages to load, clients can use their time to send emails or look for jobs."
Roger is seeing additional benefits. In his letter, he continues: "I'm using the new IBM ThinkCentre technology to learn more effectively in school and to become more organized at home. I'm accomplishing more office work in less time and saving natural resources by creating digital notebooks utilizing the professional software that was installed on these new computers."
Reconnecting with family
The new computers also help clients beyond their education. A new ThinkCentre has been installed in the lobby of The Mission's Client Services Centre, where clients can access basic Web features like email and social networking.
Laprade says the speedier machines mean clients can make the most of their 15-minute time slots. "Fifteen minutes may not be much to me or you, but that may be the only time clients have to connect with family."
Part of a bigger picture
Throughout IBM's Centennial year, IBMers in 120 countries donated more than 3.2 million volunteer hours to more than 5,000 projects around the globe.
The Centennial Celebration of Service activities were intended not only to celebrate IBM's long heritage of community involvement, but also to take those activities to the next level, and the more than 1,000 projects started so far this year are evidence of such progress.
Last year at this time, I was proud that IBM valued my dedication to helping Ottawa's less fortunate residents. Having helped contribute to Roger's success story, I'm doubly proud that my values and IBM's vision are so closely aligned.
Guest post from Anuj Marfatia, Senior Market Manager, IBM Predictive Analytics Solutions
Usually when traveling for work or vacation, right after takeoff I undoubtedly begin to panic, much like the mother in Home Alone. I constantly worry that I left the garage door open, the iron on, my kids behind, food out for the dog, or most importantly, whether I put my vintage 1963 Issue #4 Avengers comic book back in its protective cover (don't judge).
Beyond this unnecessary distress, protecting my armrest from the chatty passenger beside me, and browsing through SkyMall, I sometimes read the passenger safety document. Have you ever looked through one of these? It's unintentional comedy.
In the past few years, almost all airlines have included an "exercises" section in the pamphlet.
As an economy class passenger, I have to laugh at such pictorials, as most of these exercises (see image) are almost impossible given that my knees are already touching the seat in front of me. Now, if only I were a contortionist…
In all seriousness, do you know why these exercises are important?
Studies have shown that many emergencies and future health issues are correlated to inactivity while flying, and one in every 20,000 passengers has an in-flight emergency (source). One serious, yet preventable, issue is venous thromboembolism (VTE) that occurs when a blood clot in a leg vein (deep vein thrombosis or DVT) travels through the body to the lung.
Based on a BBC News report, some 75 percent of air-travel cases of VTE have been linked to lack of movement while in the air. I can sleep well knowing that economy passengers, like myself, are no more likely to develop clots than the more fortunate passengers in business or first class.
While I try to do some sit-ups, lunges, and pull-ups on the plane (kidding, of course), it would be great if I knew how likely I was to develop VTE or DVT, or how much I would have to exercise to minimize that risk.
Wouldn't it be cool if, while purchasing a ticket or at check-in, you were informed of the health risks of a certain flight? That would be a red-eye opener.
While such a thought may seem like something from a science fiction movie, or something that won't arrive for another 100 years, think again: predictive disease management actually exists today!
There is a lot of information about a patient that can be used (in a HIPAA-compliant way, of course) to determine the likelihood of disease occurrence or treatment effectiveness.
Based on the study above and numerous others, many of the drugs that doctors prescribe are still relatively ineffective for a large share of the patients who take them.
Doctors do use their best knowledge and experience, in most cases, but many times they are not utilizing ALL of the information that is available to them when making a decision about the patient. (This is also why some of the IBM Watson applications in healthcare are so interesting to watch.)
This is where predictive analytics comes into play. Predictive analytics software pulls in information from all the disparate data sources, such as health information systems, Excel, and even Facebook and Twitter (for those cases where you told your friends that last night's Ethiopian food left you "indisposed").
The software enables healthcare organizations to transition to a new model and find more effective ways to treat patients and develop new treatment protocols. For example, a predictive outcome could be that Jane Doe has a 95 percent probability of positively reacting to a certain treatment, essentially increasing the quality of care and containing costs.
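To make the "95 percent probability" idea concrete, here is a minimal sketch of how such a score can be produced. Everything in it is invented for illustration: the features, weights, and patient record are hypothetical, and a real system like SPSS would learn its coefficients from thousands of historical records rather than hard-coding them. The sketch uses a logistic-regression-style calculation, one common way predictive models turn patient attributes into a probability.

```python
import math

def response_probability(patient, weights, bias):
    """Logistic-regression-style score: a weighted sum of patient
    features squashed into a probability between 0 and 1."""
    z = bias + sum(weights[f] * value for f, value in patient.items())
    return 1.0 / (1.0 + math.exp(-z))

# Hypothetical coefficients a real model would learn from historical data.
weights = {"age_over_60": -0.8, "prior_dvt": -1.2, "mobility_score": 0.9}
bias = 1.5

# A hypothetical patient record pulled from disparate sources.
jane = {"age_over_60": 1, "prior_dvt": 0, "mobility_score": 2.1}

p = response_probability(jane, weights, bias)
print(f"Probability of positive response to treatment: {p:.0%}")
```

The point of the sketch is the shape of the pipeline, not the numbers: structured and unstructured patient data become features, the model weighs them, and the clinician sees a single probability to support (not replace) a decision.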
This is why I'm happy to hear that researchers at Hospital Santa Barbara, a research and treatment center in Spain, analyzed patient records and other research data to establish a new, reliable diagnostic model for DVT, enabling earlier diagnosis and treatment in high-risk patients. (Learn more about how Santa Barbara Hospital used IBM SPSS predictive analytics.)
While Spain may have its own economic issues, I'd like to thank these researchers for helping to begin the journey toward a DVT-free flight, so I can fly the friendly skies without worrying about my health.
I love hearing from customers. I love it even more when they're willing to share their success stories with the world, like the case below, in which Florida Hospital, Deluxe Corporation, and Baldor discuss how migrating from Oracle to IBM DB2 has lowered their costs and improved system performance. Check it out below, then visit our Break Free area on IBM.com.