March 15, 2017 | Written by: Chris Nott
Categorized: Industry Insights
Governments are increasingly embracing DevOps, both its cultural change and its automation of software delivery. Cloud delivery models and practices have transformed the consumption of IT, enabling services to be introduced and changed quickly in response to business need. Contrast this with equipping business users with insight. All too often in my experience, people struggle to break away from the comfort of storing data before it is queried. We are failing to give business users and analysts timely answers to what their data can tell them.
Benefits of streaming analytics
Businesses can benefit from undertaking analytics on up-to-date data more than they realise. I have seen continuous monitoring of incoming data deliver value in these ways:
- Identifying new business opportunities by enabling action on perishable data. Early examples include outbound marketing, first in telcos: a customer service or marketing opportunity is lost if you cannot personalize a targeted action for a customer who calls in before that call has finished. Intervening in the moment has wider applications in preventing fraud, and increasingly with the Internet of Things.
- Situational awareness with a continuously current view of what is happening. Again telcos were early adopters monitoring networks and quality of service, but there are further applications in traffic monitoring, acoustic monitoring, preventative maintenance and social buzz.
- Using in-line analytics to avoid bringing data to rest, through aggregation, summarization, filtering and notifications. Such in-line processing must scale up as analysis of machine data grows with the rise of the Internet of Things. But it must also scale down, making highly optimized use of compute resources and network bandwidth in small systems and devices.
- Closing feedback mechanisms under user and analyst control. Business users are increasingly empowered to take action quickly themselves in response to changing operating circumstances, but such action is taken blindly without the capability to measure the effect of the change. This compromises an organisation's ability to optimize the use of its resources and its operations.
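The in-line aggregation, filtering and notification described above can be sketched as a simple stream pipeline. This is a minimal illustration, not a real product API: the event shape `(sensor_id, reading)`, the threshold and the window size are all assumptions chosen for the example.

```python
from collections import defaultdict

ALERT_THRESHOLD = 100.0  # illustrative threshold, an assumption

def inline_analytics(events, window_size=3):
    """Summarize a stream in-line and emit notifications,
    discarding raw data instead of bringing it to rest."""
    buffers = defaultdict(list)           # small per-key rolling buffer
    for sensor_id, reading in events:
        buf = buffers[sensor_id]
        buf.append(reading)
        if len(buf) == window_size:       # aggregate, then drop raw values
            avg = sum(buf) / window_size
            buf.clear()
            if avg > ALERT_THRESHOLD:     # notification replaces stored data
                yield (sensor_id, avg)

stream = [("s1", 90), ("s1", 120), ("s1", 110),
          ("s2", 10), ("s2", 20), ("s2", 30)]
alerts = list(inline_analytics(stream))
# only s1's window average (about 106.7) exceeds the threshold
```

Only the aggregate and the alert leave the pipeline; the raw readings are discarded as soon as each window is summarized, which is what keeps the footprint small enough for constrained devices.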
Acting on all your data in real time
Continuous analysis of incoming data offers opportunities to reduce cost and increase operational success. More data is being generated in more forms, so a technology platform for such analysis needs to be engineered for machine scale. Essential characteristics include processing efficiency, low latency, scale out, developer productivity and, perhaps less obviously, state management.
The importance of managing state
Most analytics on incoming data cannot derive the insight that businesses seek simply by ingesting data feeds: they need to manage state. Put simply, most algorithms need to hold some previous data in order to process the next.
It follows that recent developments in serverless computing, whilst offering scale out, are inadequate for continuous analytics. Custom code can be added to ingest data, analyse it and manage state. However, this is sub-optimal because it applies technology engineered for one set of design points to a different purpose. To illustrate the importance of state in continuous analytics, here are three examples:
- Predicting collisions of objects moving on a map – state is the velocity of each object.
- Aggregates – state is the values over a moving period of time.
- Deviation from an expected norm – state is the expected norm against which each new value is scored.
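Two of these examples can be sketched to show concretely what the state is. The class names and the use of Welford's running-variance algorithm for the norm are my own illustrative choices, not taken from any particular product.

```python
import math
from collections import deque

class MovingAggregate:
    """Aggregate over a moving period: the state is the
    window of recent values."""
    def __init__(self, window_size):
        self.window = deque(maxlen=window_size)  # old values fall out

    def update(self, value):
        self.window.append(value)
        return sum(self.window) / len(self.window)

class DeviationDetector:
    """Deviation from an expected norm: the state is a running
    mean and variance (Welford's algorithm) that define the norm."""
    def __init__(self):
        self.n, self.mean, self.m2 = 0, 0.0, 0.0

    def update(self, value):
        self.n += 1
        delta = value - self.mean
        self.mean += delta / self.n
        self.m2 += delta * (value - self.mean)
        std = math.sqrt(self.m2 / self.n) if self.n > 1 else 0.0
        # score the new value against the learned norm
        return abs(value - self.mean) / std if std > 0 else 0.0

agg = MovingAggregate(window_size=3)
results = [agg.update(v) for v in (10, 20, 30, 40)]
# → [10.0, 15.0, 20.0, 30.0]

det = DeviationDetector()
scores = [det.update(v) for v in (10, 10, 10, 50)]
# the final outlier scores well above the steady values before it
```

Neither class stores the full history of the stream: each keeps only the few numbers needed to process the next arrival, which is exactly the state a serverless function would have to bolt on for itself.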
Why store data?
Continuously executing analytics on incoming data enables insight to be pushed to users as soon as the organisation has sufficient data to derive it. The traditional approach of storing data first inevitably means that query results are historic. However, storing data is necessary for the following activities:
- Algorithm development, e.g. data exploration, pattern analysis, machine learning, test data.
- Knowledge exploitation, e.g. augmenting and informing human activity, enrichment, historical reference.
- Investigative analysis, e.g. forensics, audit, point in time financial and compliance reporting.
Analyse data as it arrives
Clearly organisations will continue to need to store data. But rather than accept that as the default, they should consider the value of generating insight by analysing data as it arrives. Areas of the public sector that can benefit include fraud detection and prevention, situational awareness in military and city operations including public safety, alerting applications, anomaly and deviation detection, and cyber security. Why wait?