Data flow in a cloud deployment
Learn how event data flows between the components of IBM® Netcool® Operations Insight® on Red Hat® OpenShift®.
The following figure shows a simplified data flow between the components of a containerized deployment.
Figure: Data flow
Events come into the ea-noi gateway pod (message bus) from the ObjectServer.
The ingestion service picks up the events and puts them onto Kafka, where they are consumed by the other pods within IBM Netcool Operations Insight on OpenShift.
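The following sketch illustrates the shape of this hand-off, assuming the kafka-python client. The broker address, the topic name noi-events, and the event fields are illustrative assumptions, not the product's actual implementation.

```python
# Illustrative only: publish an ObjectServer event onto Kafka.
# Broker address, topic name, and fields are assumptions.
import json

from kafka import KafkaProducer  # pip install kafka-python

producer = KafkaProducer(
    bootstrap_servers="kafka:9092",            # assumed broker address
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

event = {
    "Identifier": "node01-linkdown-1",         # example alert fields
    "Node": "node01",
    "Summary": "Link down on interface eth0",
    "Severity": 5,
    "FirstOccurrence": "2024-05-01T03:12:00",
}

producer.send("noi-events", value=event)       # assumed topic name
producer.flush()                               # ensure delivery before exit
```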
The archiving service picks up events from Kafka and puts them into storage.
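A minimal sketch of the archiving side is shown below, with SQLite standing in for the historical store; the topic name, consumer group, and schema are assumptions for illustration.

```python
# Illustrative only: consume events from Kafka and archive them.
import json
import sqlite3

from kafka import KafkaConsumer

db = sqlite3.connect("event_archive.db")
db.execute(
    "CREATE TABLE IF NOT EXISTS events ("
    "identifier TEXT, node TEXT, summary TEXT, "
    "severity INTEGER, first_occurrence TEXT)"
)

consumer = KafkaConsumer(
    "noi-events",                      # assumed topic name
    bootstrap_servers="kafka:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    group_id="archiving-service",      # assumed consumer group
)

for message in consumer:
    e = message.value
    db.execute(
        "INSERT INTO events VALUES (?, ?, ?, ?, ?)",
        (e["Identifier"], e["Node"], e["Summary"],
         e["Severity"], e["FirstOccurrence"]),
    )
    db.commit()
```

Because Kafka delivers the stream to each consumer group independently, the archiving and inference services can both read the same events without interfering with each other.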
The events in storage are used by the training component to create temporal and seasonal policies.
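As a rough illustration of what a seasonal policy captures: if the archived occurrences of an event cluster in the same hour of day, the event is a candidate for a seasonal policy. The query, thresholds, and policy shape below are assumptions for illustration, not the product's training algorithm.

```python
# Illustrative only: propose a seasonal policy when most occurrences
# of an event fall within the same hour of day.
import sqlite3
from collections import Counter, defaultdict

db = sqlite3.connect("event_archive.db")

by_hour = defaultdict(Counter)
for identifier, hour in db.execute(
    "SELECT identifier, strftime('%H', first_occurrence) FROM events"
):
    by_hour[identifier][hour] += 1

policies = []
for identifier, counts in by_hour.items():
    hour, hits = counts.most_common(1)[0]
    total = sum(counts.values())
    if total >= 10 and hits / total >= 0.8:   # illustrative thresholds
        policies.append({"identifier": identifier, "hour": int(hour)})
```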
The inference service also listens for events from Kafka and associates policies with the events. When a policy is found for an event, the inference service puts an action for the event onto Kafka.
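The sketch below shows this consume-match-produce loop. The topic names, the in-memory policy lookup, and the action fields are illustrative assumptions.

```python
# Illustrative only: match incoming events against policies and
# publish an action for each match.
import json

from kafka import KafkaConsumer, KafkaProducer

# Assumed policy store; in practice policies come from training.
policies = {"node01-linkdown-1": {"policy": "seasonal-03h",
                                  "action": "suppress"}}

consumer = KafkaConsumer(
    "noi-events",                      # assumed topic name
    bootstrap_servers="kafka:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    group_id="inference-service",      # assumed consumer group
)
producer = KafkaProducer(
    bootstrap_servers="kafka:9092",
    value_serializer=lambda v: json.dumps(v).encode("utf-8"),
)

for message in consumer:
    event = message.value
    policy = policies.get(event["Identifier"])
    if policy:
        # A policy matched: emit an action for downstream consumers.
        producer.send("noi-actions", value={    # assumed topic name
            "Identifier": event["Identifier"],
            "policy": policy["policy"],
            "action": policy["action"],
        })
```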
The actions are picked up from Kafka by the aggregation service and passed on to the NOI action service. The NOI action service sends the actions to the ObjectServer, where the action is performed on the event.
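The sketch below shows the shape of that hand-off: an action consumed from Kafka and expressed as ObjectServer SQL. In the product the update travels through the NOI action service and the gateway rather than being issued directly; the field and value used here are illustrative.

```python
# Illustrative only: consume actions and render them as ObjectServer SQL.
import json

from kafka import KafkaConsumer

consumer = KafkaConsumer(
    "noi-actions",                     # assumed topic name
    bootstrap_servers="kafka:9092",
    value_deserializer=lambda v: json.loads(v.decode("utf-8")),
    group_id="aggregation-service",    # assumed consumer group
)

for message in consumer:
    action = message.value
    if action["action"] == "suppress":
        # Example update on alerts.status; field and value are illustrative.
        sql = (
            "update alerts.status set SuppressEscl = 4 "
            f"where Identifier = '{action['Identifier']}';"
        )
        print(sql)  # a real deployment hands this to the gateway
```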
The events query service queries the events in the historical database to display policy details.
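A minimal sketch of that query path, against the illustrative archive schema used above:

```python
# Illustrative only: fetch the archived occurrences behind a policy
# so they can be displayed alongside the policy details.
import sqlite3

db = sqlite3.connect("event_archive.db")

def events_for_policy(identifier):
    """Return archived occurrences of one event identifier."""
    return db.execute(
        "SELECT first_occurrence, summary, severity FROM events "
        "WHERE identifier = ? ORDER BY first_occurrence",
        (identifier,),
    ).fetchall()

print(events_for_policy("node01-linkdown-1"))
```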