Data flow for e-comm processing

The solution provides features to process electronic communication data such as email and chat transcripts.

The following diagram shows the end-to-end data flow for e-comm processing:

Figure 1. e-comm surveillance data processing workflow
  • e-comm data is fed into a Kafka topic, which is monitored by the ActianceAdaptor Streams job.
  • The ActianceAdaptor Streams job converts the XML message into a communication tuple (see the parsing sketch after this list).
  • The CommPolicyExecution Streams job processes the communication tuple against the policies that apply to the communication, and then creates a JSON message that contains the communication data along with the extracted features and risk indicator details (see the policy-evaluation sketch after this list).
  • The JSON message is published to a Kafka topic, which is consumed by the ECommEvidence Spark job (see the Spark consumer sketch after this list).
  • The ECommEvidence Spark job saves the communication data, the extracted features, and the risk indicators to the database and Solr, and runs the inference engine to check for alertable conditions. If an alertable condition is found, an alert is created in the system.
  • Alerts are created or updated in the database, based on the outcome of the inference engine, through the Create Alert REST service (see the persistence and alerting sketch after this list).
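
As a rough illustration of the conversion step, the following Java sketch consumes XML messages from a Kafka topic and maps a few elements into a simple communication tuple. The broker address, topic name, XML element names, and the CommunicationTuple fields are assumptions for illustration only; the actual ActianceAdaptor job runs as a Streams job with its own schema, not as a standalone Java program.

import java.io.StringReader;
import java.time.Duration;
import java.util.List;
import java.util.Properties;

import javax.xml.parsers.DocumentBuilder;
import javax.xml.parsers.DocumentBuilderFactory;

import org.apache.kafka.clients.consumer.ConsumerRecord;
import org.apache.kafka.clients.consumer.ConsumerRecords;
import org.apache.kafka.clients.consumer.KafkaConsumer;
import org.w3c.dom.Document;
import org.xml.sax.InputSource;

/** Minimal stand-in for the communication tuple; field names are illustrative. */
record CommunicationTuple(String commId, String sender, String recipients, String subject, String body) {}

public class EcommIngestSketch {
    public static void main(String[] args) throws Exception {
        Properties props = new Properties();
        props.put("bootstrap.servers", "localhost:9092");   // assumption: broker address
        props.put("group.id", "ecomm-adaptor");              // assumption: consumer group
        props.put("key.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");
        props.put("value.deserializer", "org.apache.kafka.common.serialization.StringDeserializer");

        DocumentBuilder builder = DocumentBuilderFactory.newInstance().newDocumentBuilder();
        try (KafkaConsumer<String, String> consumer = new KafkaConsumer<>(props)) {
            consumer.subscribe(List.of("ecomm.in"));         // assumption: inbound topic name
            while (true) {
                ConsumerRecords<String, String> records = consumer.poll(Duration.ofSeconds(1));
                for (ConsumerRecord<String, String> record : records) {
                    // Parse the XML payload and map selected elements to a tuple.
                    Document doc = builder.parse(new InputSource(new StringReader(record.value())));
                    CommunicationTuple tuple = new CommunicationTuple(
                            text(doc, "commId"), text(doc, "from"),
                            text(doc, "to"), text(doc, "subject"), text(doc, "body"));
                    System.out.println("Converted communication " + tuple.commId());
                }
            }
        }
    }

    /** Returns the text of the first element with the given tag name, or an empty string. */
    private static String text(Document doc, String tag) {
        var nodes = doc.getElementsByTagName(tag);
        return nodes.getLength() > 0 ? nodes.item(0).getTextContent() : "";
    }
}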
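
The policy-evaluation step can be pictured as matching a set of named checks against the communication and emitting the results as JSON. The sketch below is a minimal illustration that uses Jackson; the policy names, feature names, and JSON field names are assumptions, not the product's actual policy model.

import java.util.List;
import java.util.Map;
import java.util.function.Predicate;

import com.fasterxml.jackson.databind.ObjectMapper;

/** Illustrative policy: a named predicate over the communication text. */
record Policy(String riskIndicator, Predicate<String> matches) {}

public class PolicyExecutionSketch {
    public static void main(String[] args) throws Exception {
        String body = "Let's keep this off the record and talk offline.";

        // Hypothetical policies; the real policy set is configured in the product.
        List<Policy> policies = List.of(
                new Policy("Secrecy", text -> text.toLowerCase().contains("off the record")),
                new Policy("ChannelAvoidance", text -> text.toLowerCase().contains("talk offline")));

        // Collect the risk indicators whose policy matched this communication.
        List<String> indicators = policies.stream()
                .filter(p -> p.matches().test(body))
                .map(Policy::riskIndicator)
                .toList();

        // Build the JSON message that carries the communication plus extracted details.
        Map<String, Object> message = Map.of(
                "commId", "C-1001",                                    // assumption: identifier field
                "body", body,
                "features", Map.of("wordCount", body.split("\\s+").length),
                "riskIndicators", indicators);

        System.out.println(new ObjectMapper().writeValueAsString(message));
    }
}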
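
The hand-off between the Streams side and the Spark side is a plain Kafka topic. The following Spark Structured Streaming sketch shows how a job such as ECommEvidence could subscribe to that topic and read the JSON payloads; the topic name and broker address are assumptions, and a console sink stands in for the real processing.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class ECommEvidenceReadSketch {
    public static void main(String[] args) throws Exception {
        SparkSession spark = SparkSession.builder()
                .appName("ECommEvidenceSketch")
                .getOrCreate();

        // Subscribe to the assumed topic and expose the JSON payload as a string column.
        Dataset<Row> messages = spark.readStream()
                .format("kafka")
                .option("kafka.bootstrap.servers", "localhost:9092")   // assumption
                .option("subscribe", "ecomm.scored")                   // assumption
                .load()
                .selectExpr("CAST(value AS STRING) AS json");

        // Print each micro-batch; a real job would parse the JSON and persist it.
        messages.writeStream()
                .format("console")
                .start()
                .awaitTermination();
    }
}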
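
The persistence and alerting steps can be sketched as a database insert followed by a call to the Create Alert REST service when the inference engine flags an alertable condition. The JDBC URL, table name, endpoint URL, and request body below are placeholders for illustration; the actual service contract and schema are defined by the product.

import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.PreparedStatement;

public class EvidencePersistAndAlertSketch {
    public static void main(String[] args) throws Exception {
        String commId = "C-1001";
        String riskIndicator = "Secrecy";
        boolean alertable = true;   // assumption: outcome of the inference engine

        // Persist the communication evidence; the JDBC URL and table are assumptions.
        try (Connection conn = DriverManager.getConnection(
                     "jdbc:db2://db-host:50000/SIFS", "user", "password");
             PreparedStatement stmt = conn.prepareStatement(
                     "INSERT INTO ECOMM_EVIDENCE (COMM_ID, RISK_INDICATOR) VALUES (?, ?)")) {
            stmt.setString(1, commId);
            stmt.setString(2, riskIndicator);
            stmt.executeUpdate();
        }

        // If an alertable condition was found, call the Create Alert REST service.
        // The endpoint URL and payload shape are assumptions for illustration.
        if (alertable) {
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://sifs-host/alerts/create"))
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(
                            "{\"commId\":\"" + commId + "\",\"riskIndicator\":\"" + riskIndicator + "\"}"))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            System.out.println("Create Alert returned HTTP " + response.statusCode());
        }
    }
}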

The following diagram shows the end-to-end data flow:

Figure 2. e-comm surveillance data flow