What is massive data management, and how does one prepare for it? Big Data for AI can grow rapidly as organizations master and expand their AI, machine learning, and deep learning capabilities, climbing from tens of petabytes (PB) to hundreds of petabytes, and even to exabytes (EB), over the course of a decade. Even organizations not yet generating data at this scale should be prepared to manage unanticipated growth, and to do so without skyrocketing budgets. This presentation shows how to manage Big Data from ingest to archive with the scalable performance and capacity needed to accelerate AI throughput. It covers how to start with tens of terabytes (TB) and grow to tens of exabytes while always exceeding the required performance levels and capacities, however large they grow, including managing that growth with non-stop, seamless product lifecycle management. It also demonstrates how to manage an exabyte of Big Data for $100M to $200M less than with disk-based storage (including in clouds), save $10M to $20M at 100 petabytes, and so forth.
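The quoted savings figures are consistent with linear scaling by capacity: $100M to $200M at one exabyte (1,000 PB) implies $10M to $20M at 100 PB. A minimal sketch of that arithmetic follows; the per-exabyte figures come from the abstract, while the linear-scaling assumption and the function name are illustrative, not part of the source:

```python
EB_IN_PB = 1000  # 1 exabyte = 1,000 petabytes

def savings_range_musd(capacity_pb, low_per_eb=100, high_per_eb=200):
    """Return (low, high) estimated savings in $M for a capacity in PB,
    assuming savings scale linearly from the quoted per-exabyte figures."""
    low = low_per_eb * capacity_pb / EB_IN_PB
    high = high_per_eb * capacity_pb / EB_IN_PB
    return (low, high)

print(savings_range_musd(100))   # 100 PB -> (10.0, 20.0), matching the abstract
print(savings_range_musd(1000))  # 1 EB  -> (100.0, 200.0)
```

This is only a sanity check on the stated numbers; actual savings would depend on media pricing, cloud egress costs, and the archive tiering chosen.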
Date: Tuesday, May 12th, 2020
07 April 2021