We live in a "now" kind of world, and that is as true for analytics as for anything else. Data has been called the "new oil" and a basis of competitive advantage, so insights are needed sooner rather than later. It used to be that data was moved in overnight batch jobs from the transaction system to the analytics system, with reports following later. In a growing list of use cases, however, the insights are needed faster than that. Data volumes have also grown to the point where some batch jobs run out of night!
Enter HTAP: Hybrid Transactional/Analytical Processing. HTAP makes real-time analytics possible in the transactional or operational database, providing analytics on events as they are happening so you can influence the outcome rather than learn in hindsight what happened. HTAP can also simplify the IT environment, because you need fewer separate systems and fewer copies of the data to support both transactional and analytical processing.
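To make the contrast with batch ETL concrete, here is a minimal sketch of the HTAP idea (illustrative only, not any particular product's implementation, and all names are invented): transactional writes and analytical reads are served from the same in-memory store, so an aggregate query sees each transaction the moment it is recorded, with no overnight copy step in between.

```python
import threading

class HtapStore:
    """Toy store serving transactions and analytics from one copy of the data."""

    def __init__(self):
        self._rows = []              # the single copy of the data: no ETL, no second system
        self._lock = threading.Lock()

    def insert_order(self, customer, amount):
        """OLTP path: record a transaction."""
        with self._lock:
            self._rows.append({"customer": customer, "amount": amount})

    def revenue_by_customer(self):
        """OLAP path: aggregate over live data, including the latest inserts."""
        with self._lock:
            totals = {}
            for row in self._rows:
                totals[row["customer"]] = totals.get(row["customer"], 0) + row["amount"]
            return totals

store = HtapStore()
store.insert_order("acme", 100)
store.insert_order("acme", 50)
store.insert_order("globex", 75)
print(store.revenue_by_customer())  # reflects every transaction recorded so far
```

A real HTAP engine adds columnar in-memory formats, transaction isolation, and query optimization on top of this basic shape; the point here is simply that both workloads read the same live data.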
HTAP works with a combination of technologies to achieve these results, including:
- In-memory processing, which handles compute-intensive analytics very rapidly. Examples include open source Apache Spark and BLU Acceleration in DB2.
- Event-driven applications, which are rapidly moving to the forefront thanks to their agility and ease of updating.
- Machine learning, which helps continually improve the analytics algorithms as more and more data is analyzed; this can be done using Apache SparkML.
- Data governance, which is essential so you know you are working with the right, trusted data.
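One way the event-driven and machine-learning pieces above fit together is incremental computation: instead of recomputing an aggregate or model in a nightly batch, each event updates the result as it arrives. A small sketch of that pattern (a generic illustration using Welford's online-mean update, not code from any IBM product):

```python
class RunningStats:
    """Maintain a mean incrementally: each incoming event updates the
    statistic in O(1), so there is never a batch recompute to wait for."""

    def __init__(self):
        self.count = 0
        self.mean = 0.0

    def on_event(self, value):
        # Welford's online update: new_mean = old_mean + (x - old_mean) / n
        self.count += 1
        self.mean += (value - self.mean) / self.count

stats = RunningStats()
for amount in [10.0, 20.0, 30.0]:   # events arriving one at a time
    stats.on_event(amount)
print(stats.count, stats.mean)       # 3 20.0
```

The same idea scales up to online learning algorithms, where each event nudges model parameters rather than a simple average.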
You will see these and other technologies in, and coming to, the IBM analytics portfolio, including DB2. You can learn more about HTAP in DB2 in these blogs by our experts:
- The DB2 Journey: The Hybrid Database Evolution by George Baklarz and Matt Huras
- Analytics Now: Leveraging HTAP to deliver an analytics value chain without limits by Matthias Funke