Artificial intelligence is changing on-premises and cloud data center requirements forever.
In this TIRIAS Research paper, sponsored by IBM, learn how AI “training” and “inference” differ and why each form of AI processing requires dedicated data center resources.
“95% or more of all current AI data processed is through inference processing. However, both training and inference are required for the deployment of an AI solution and both require extensive resources.”
In this paper, you’ll learn about:
- Compute performance requirements
- Training and inference workload models
- How to scale your AI solution
- How to optimize processor and data center resources
- Watson Machine Learning Accelerator