On-demand trend detection
The on-demand trend algorithm integrates with the Surveillance Insight Complaints Explore page and is also available as a stand-alone REST API.
A user specifies features of interest, such as complaint category, process, product, customer age, and geography, and sets their corresponding values. When this query is sent to the REST API, the on-demand service returns a time series of daily complaint counts for the given date range, together with other key statistics that are useful in determining whether a trend is present.
Business problem
The Surveillance Insight trend detection algorithm runs on a periodic basis to determine if trends are occurring across all dimensions and at all levels of granularity, and then displays all of the discovered trends in the trend detection UI. Aside from this, users may want to independently query the trends database to check for trends. This might be for several reasons:
- To visually determine the normal baseline level of complaints for a given set of complaint features
- To combine complaint features in arbitrary combinations that are not checked as standard
- To confirm that a previously detected trend has now ceased to trend, for example because remedial action has been taken
- To determine the effect of complaint and entity labeling between scheduled trend-detection runs
Approach to solving the business problem
A lightweight, on-demand version of the trend detection algorithm is provided via a REST API. The main difference between the trend detection algorithm and the on-demand trend detection algorithm is that the on-demand trend detection is designed to allow a user to investigate one time-series of complaint counts that may or may not show a trend. The following table summarizes the differences.
| Trend detection algorithm | On-demand trend detection |
|---|---|
| Automatically generates all combinations of available features | User manually inputs one set of features of interest |
| Algorithm finds only those combinations of features that are trending, and ignores non-trending combinations | Returned time-series may be trending or non-trending |
| Variable number of trends may be detected, depending on actual number of trends present in data, sensitivity of trend detection parameters in configuration etc. | Always returns one time-series corresponding to the selection criteria (provided there are entries in the database) |
Using the REST service
- Starting the REST service: python3 onDemandTrendDetectionRestAPI.py
- Sample input: THEME = Complaint Delay, CUSTOMER_AGE = 30-50, GEO = TX and FL
- Sample response: see the following table.

Table 2. Sample response example

| Date | Complaint count | Smooth complaint count |
|---|---|---|
| 2018-03-08 | 14 | 14.0 |
| 2018-03-09 | 16 | 15.0 |
| ... | ... | ... |
| 2018-03-15 | 27 | 18.3 |
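To make the sample input concrete, the following Python sketch (provided for illustration only, not part of the product) builds the equivalent JSON query payload in the format that is shown later in this topic. The split of the 30-50 age range into the 30-40 and 40-50 buckets, and the underscore in the theme name, are assumptions based on that later example.

```python
import json

# Sketch: translate the sample input
#   THEME = Complaint Delay, CUSTOMER_AGE = 30-50, GEO = TX and FL
# into the JSON query structure used by the on-demand trend service.
# The age buckets and the underscore-separated theme name follow the
# example payload shown later in this topic (assumptions, not product output).
payload = {
    "query": {
        "THEME": ["Complaint_Delay"],
        "CUSTOMER_AGE": ["30-40", "40-50"],  # 30-50 expressed as two buckets
        "GEO": ["TX", "FL"],
    }
}

# Write the payload to query.json so that it can be posted with the
# curl command shown in the next section.
with open("query.json", "w") as f:
    json.dump(payload, f, indent=2)
```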
Service details
The service allows users to determine if a given set of complaint features is trending.
- Method: POST
- URL: /analytics/models/v1/on_demand_trend/
- Input: JSON payload
- Output: JSON response
The following is an example CURL command to POST:
curl -k -H 'Content-Type: application/json' -X POST --data-binary @query.json https://ip_address:port/analytics/models/v1/on_demand_trend/
The following code is an example JSON payload:
{"query" : {"THEME": ["Complaint_Delay"], "CUSTOMER_AGE": ["30-40", "40-50"], "GEO": ["TX","FL"]}}
Set the start date and end date directly by including the parameters in the JSON payload, for example, "STARTDATE": "2017-05-01", "ENDDATE": "2017-06-12". Alternatively, send no dates, in which case the end date is set to the current date and the start date is set to 365 days earlier (as defined by the time periods configuration parameter).
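As an illustration, the following sketch shows a payload with an explicit date range and the equivalent of the default one-year window. Placing STARTDATE and ENDDATE at the top level of the payload, alongside the query object, is an assumption here; the query values are carried over from the earlier example.

```python
from datetime import date, timedelta

# Payload with an explicit date range, using the STARTDATE and ENDDATE
# parameters described above (their position at the top level of the
# payload is an assumption for illustration).
payload_with_dates = {
    "query": {"THEME": ["Complaint_Delay"], "GEO": ["TX", "FL"]},
    "STARTDATE": "2017-05-01",
    "ENDDATE": "2017-06-12",
}

# If no dates are sent, the service uses a default window equivalent to:
end_date = date.today()
start_date = end_date - timedelta(days=365)
print(start_date.isoformat(), end_date.isoformat())
```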
The response is a JSON structure that gives the complaint counts per day and other trend statistics. The following code is an example response:
{"trends":{"description": "CUSTOMER_AGE:30-40&THEME:Complaint_SalesPractice", "complaintcount": {"trendinglast30days": "+4.6%", "30dayavg": 21.3, "7dayavg": 19.1, "earlytrend": null, "trendinglast7days": "+0.0%", "dates": [{"date": "2018-03-08", "smoothcount": 14.0, "actualcount": 14}, {"date": "2018-03-09", "smoothcount": 15.0, "actualcount": 16}, {"date": "2018-03-10", "smoothcount": 15.3, "actualcount": 16}, {"date": "2018-03-11", "smoothcount": 15.7, "actualcount": 23}, {"date": "2018-03-12", "smoothcount": 16.1, "actualcount": 19}, {"date": "2018-03-13", "smoothcount": 16.4, "actualcount": 20}, {"date": "2018-03-14", "smoothcount": 17.3, "actualcount": 27}]}, "product": "all", "riskscore": 24.5, "timeperiod": 7, "theme": "Complaint_SalesPractice", "trendtypeid": 3, "geo": "all", "trendattributes": [{"MAX_AGE": 40}, {"THEME": "Complaint_SalesPractice"}, {"MIN_AGE": 30}], "trendstartdate": null}}
Assumptions
The NLC and NLU models that create the metadata (complaint type, process, and so on) have accurately classified complaints in the majority of cases.
Accuracy and limitations
On-demand trend detection does not work for keywords.