Manage predefined data extraction to file

Guardium provides predefined data extractions to file that are disabled by default. You can enable these export extractions by scheduling them through GuardAPI commands.

About this task

The predefined extractions to file are listed in Table 1. By default, extractions run hourly. You can modify the frequency; however, there are suggested execution times for the predefined extractions, based on internal Guardium processes. They are presented in Table 2.

The extracted file name has the format <Global Id>_<short host name of source machine>_<export job name>_<period start date time short format in UTC>.gz, for example: 1762144738_machine1_EXP_SESSION_LOG_20181028230000.gz
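As an illustration of this naming convention, the sketch below splits a sample file name into its parts using shell parameter expansion. The variable names are descriptive labels for this example, not official Guardium terms:

```shell
# Split a predefined-extraction file name into its components.
# Format: <global_id>_<host>_<job_name>_<period_start_utc>.gz
name="1762144738_machine1_EXP_SESSION_LOG_20181028230000.gz"

stem="${name%.gz}"                            # drop the .gz suffix
global_id="${stem%%_*}"; rest="${stem#*_}"    # first field never contains "_"
host="${rest%%_*}";      rest="${rest#*_}"    # second field never contains "_"
period_start="${rest##*_}"                    # last field: period start in UTC
job_name="${rest%_*}"                         # job name itself may contain "_"

echo "$global_id $host $job_name $period_start"
# 1762144738 machine1 EXP_SESSION_LOG 20181028230000
```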

If a file transfer fails for any reason, for example because the target machine is down, the transfer is retried on the next run. The backlog is kept in the /var/exportdir directory, and the backlog purge interval is twice the data extraction log purge interval. Use the CLI command show purge objects age to view purge intervals. Set the data mart extraction log purge interval using the CLI command store purge object age 31 [age], where [age] is the desired purge interval.
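For example, viewing and then setting the purge interval from the appliance CLI might look like the following. The value 45 is an illustrative number of days, not a recommendation:

```shell
# View the current purge intervals for all purge objects
show purge objects age

# Set the data mart extraction log purge interval (object 31) to 45 days
# (45 is an illustrative value; choose an interval that fits your retention needs)
store purge object age 31 45
```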

The Full SQL data mart works only if log full details or log masked details is defined and installed.

The Outlier data mart works only if outlier detection is enabled.

If a data mart's scheduler has been stopped for some time and you do not want the data to be extracted retroactively, set the correct Initial Start in the Data Mart Configuration screen before you reschedule extractions to run again.

Table 1. Predefined data mart export jobs
Datamart Name / job description / objectName | Description | Report Title | Unit Type | Datamart ID | jobname
Export:Access Log | Includes details of the connection information and the activity summary per hour. The log includes the OS and DB user, successful and failed SQLs, client and server IP, and more. | Export: Access Log | Collector | 22 | DataMartExtractionJob_22
Export:Session Log | Includes details about datasources' sessions (login to logout). The log includes session start and end timestamps, OS and DB user of the session, source program, and more. | Export: Session Log | Collector | 23 | DataMartExtractionJob_23
Export:Session Log Ended | Sessions may extend over a long period, while the extraction runs hourly. This log sends the sessions that ended later than the hour in which they started. | Export: Session Log | Collector | 24 | DataMartExtractionJob_24
Export:Exception Log | Details the exceptions/errors captured by Guardium. The log includes exception/error description, user name, source address, DB protocol, and more. | Export: Exception Log | Any | 25 | DataMartExtractionJob_25
Export:Full SQL | Includes the executed SQL details. The log includes full SQL, records affected, session ID, and more. | Export: Full SQL | Collector | 26 | DataMartExtractionJob_26
Export:Outliers List | Includes the outliers. The log includes server IP, DB user, outlier type, DB, and more. | Analytic Outliers List | Any | 27 | DataMartExtractionJob_27
Export:Outliers Summary by hour | Includes an hourly summary of outliers. The log includes server IP, DB user, DB, and more. | Analytic Outliers Summary | Any | 28 | DataMartExtractionJob_28
Export:Group Members | Includes a log of all group members. The log includes group type, group description, group member, and tuple flag. | Export:Group Members | Any | 29 | DataMartExtractionJob_29
Export:Export Extraction Log | Includes a log of data relevant to all export or copy files whose names start with "Export:". | User Defined Extraction Log | Any | 31 | DataMartExtractionJob_31
Export:Policy Violations | Includes the details about logged violations, such as DB user, source program, access rule description, full SQL string, and more. | Export:Policy Violations | Collector | 32 | DataMartExtractionJob_32
Export:Buff Usage Monitor | Provides an extensive set of sniffer buffer usage statistics. | Buff Usage Monitor | Any | 33 | DataMartExtractionJob_33
Export:VA Results | | Security Assessment Export | Any | 34 | DataMartExtractionJob_34
Export:Policy Violations - Detailed | The same as Export:Policy Violations, but includes Object/Verb tuples. It is recommended to use only one of the two. | Export:Policy Violations | Collector | 38 | DataMartExtractionJob_38
Export:Access Log - Detailed | The same as Export:Access Log, but also includes the following fields from the Application Event entity: Event User Name, Event Type, Event Value Str, Event Value Num, Event Date. It is recommended to use either Access Log or Access Log - Detailed, not both. | Export: Access Log | Collector | 39 | DataMartExtractionJob_39
Export:Discovered Instances | Provides the results of the S-TAP Discovery application, which discovers database instances. | Discovered Instances | Any | 40 | DataMartExtractionJob_40
Export:Databases Discovered | | Databases Discovered | Any | 41 | DataMartExtractionJob_41
Export:Classifier Results | | Classifier Results | Any | 42 | DataMartExtractionJob_42
Export:Datasources | | Data-Sources | Central Manager, Standalone | 43 | DataMartExtractionJob_43
Export:STAP Status | | S-TAP Status Monitor | Collector | 44 | DataMartExtractionJob_44
Export:Installed Patches | | Installed Patches | Any | 45 | DataMartExtractionJob_45
Export:System Info | | Installed Patches | Any | 46 | DataMartExtractionJob_46
Export:User - Role | | User - Role | Central Manager, Standalone | 47 | DataMartExtractionJob_47
Export:Classification Process Log | | Classification Process Log | Any | 48 | DataMartExtractionJob_48
Export:Outliers List - enhanced | | Analytic Outliers List - enhanced | Any | 49 | DataMartExtractionJob_49
Export:Outliers Summary by hour - enhanced | | Analytic Outliers Summary by Date - enhanced | Any | 50 | DataMartExtractionJob_50
Table 2. Default cronString for predefined data mart export jobs
Job description | Recommended cronString | Every hour at:
Export:Access Log | 0 40 0/1 ? * 1,2,3,4,5,6,7 | 00:40
Export:Session Log | 0 45 0/1 ? * 1,2,3,4,5,6,7 | 00:45
Export:Session Log Ended | 0 46 0/1 ? * 1,2,3,4,5,6,7 | 00:46
Export:Exception Log | 0 25 0/1 ? * 1,2,3,4,5,6,7 | 00:25
Export:Full SQL | 0 30 0/1 ? * 1,2,3,4,5,6,7 | 00:30
Export:Outliers List | 0 10 0/1 ? * 1,2,3,4,5,6,7 | 00:10
Export:Outliers Summary by hour | 0 10 0/1 ? * 1,2,3,4,5,6,7 | 00:10
Export:Export Extraction Log | 0 50 0/1 ? * 1,2,3,4,5,6,7 | 00:50
Export:Group Members | 0 15 0/1 ? * 1,2,3,4,5,6,7 | 00:15
Export:Policy Violations | 0 5 0/1 ? * 1,2,3,4,5,6,7 | 00:05
Export:Buff Usage Monitor | 0 12 0/1 ? * 1,2,3,4,5,6,7 | 00:12
Export:VA Results | 0 0 2 ? * 1,2,3,4,5,6,7 | Daily at 2 AM
Export:Policy Violations - Detailed | 0 5 0/1 ? * 1,2,3,4,5,6,7 | 00:05
Export:Access Log - Detailed | 0 40 0/1 ? * 1,2,3,4,5,6,7 | 00:40
Export:Discovered Instances | 0 20 0/1 ? * 1,2,3,4,5,6,7 | 00:20
Export:Databases Discovered | 0 20 0/1 ? * 1,2,3,4,5,6,7 | 00:20
Export:Classifier Results | 0 20 0/1 ? * 1,2,3,4,5,6,7 | 00:20
Export:Datasources | 0 0 7 ? * 1,2,3,4,5,6,7 | Daily at 7 AM
Export:STAP Status | 0 0/5 0/1 ? * 1,2,3,4,5,6,7 | Every 5 minutes
Export:Installed Patches | 0 0 5 ? * 1,2,3,4,5,6,7 | Daily at 5 AM
Export:System Info | 0 0 5 ? * 1,2,3,4,5,6,7 | Daily at 5 AM
Export:User - Role | 0 5 0/1 ? * 1,2,3,4,5,6,7 | 00:05
Export:Classification Process Log | 0 25 0/1 ? * 1,2,3,4,5,6,7 | 00:25
Export:Outliers List - enhanced | 0 10 0/1 ? * 1,2,3,4,5,6,7 | 00:10
Export:Outliers Summary by hour - enhanced | 0 10 0/1 ? * 1,2,3,4,5,6,7 | 00:10
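The cronString values in Table 2 follow a Quartz-style cron format of six space-separated fields: seconds, minutes, hours, day of month, month, and day of week. A minimal sketch that labels the fields of one recommended string (the field names are descriptive labels for this example):

```shell
# Break a Quartz-style cronString into its six labeled fields
# (seconds, minutes, hours, day of month, month, day of week).
cron="0 45 0/1 ? * 1,2,3,4,5,6,7"
read -r seconds minutes hours day_of_month month day_of_week <<< "$cron"

echo "second: $seconds"        # 0   -> at second 0
echo "minute: $minutes"        # 45  -> at minute 45
echo "hours:  $hours"          # 0/1 -> every hour, starting at hour 0
echo "days:   $day_of_week"    # 1,2,3,4,5,6,7 -> every day of the week
```

So "0 45 0/1 ? * 1,2,3,4,5,6,7" means: at second 0 of minute 45 of every hour, every day, matching the "00:45" entry for Export:Session Log.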


  1. Enable the appropriate data marts and point their output to your target server, for example:
    grdapi datamart_update_copy_file_info destinationHost=<destination server name>
    destinationPassword=<destination server password>
    destinationPath=<destination path> destinationUser=<destination server user>
    Name="Export:Session Log" transferMethod=SCP
  2. Schedule the extraction. Guardium schedulers are local to the appliance, so you must run the GuardAPI scheduling command on each appliance from which data is extracted. For example, for each collector that needs to send session data, you would run:
    grdapi schedule_job jobType=dataMartExtraction cronString="0 45 0/1 ? * 1,2,3,4,5,6,7"
    objectName="Export:Session Log"
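Putting the two steps together for several jobs on one appliance might look like the following sketch. It reuses the GuardAPI commands above with the cron strings from Table 2; the angle-bracket placeholders are values you must supply for your own environment:

```shell
# Enable the Session Log extraction and point it at the target server
grdapi datamart_update_copy_file_info Name="Export:Session Log" transferMethod=SCP destinationHost=<destination server name> destinationUser=<destination server user> destinationPassword=<destination server password> destinationPath=<destination path>

# Schedule it with the recommended hourly cronString from Table 2
grdapi schedule_job jobType=dataMartExtraction cronString="0 45 0/1 ? * 1,2,3,4,5,6,7" objectName="Export:Session Log"

# Repeat for Session Log Ended at its suggested minute
grdapi datamart_update_copy_file_info Name="Export:Session Log Ended" transferMethod=SCP destinationHost=<destination server name> destinationUser=<destination server user> destinationPassword=<destination server password> destinationPath=<destination path>
grdapi schedule_job jobType=dataMartExtraction cronString="0 46 0/1 ? * 1,2,3,4,5,6,7" objectName="Export:Session Log Ended"
```

Staggering the minutes as suggested in Table 2 avoids all extractions competing for resources at the top of the hour.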