Using automated discovery from the command line

Use the discover option to discover the database schema available for a given database or the files and folders for a given file system.

Purpose

You can use the discover option to determine the database schemas that are available for a given database, or the files and folders for a given file system. You must use an existing connection to determine the schemas or folders. You can then create an import area per schema or folder, depending on the type of connection that is passed.

Syntax

imam --action discover 
[--server server_name] [--port port_number]
[--username username] [--password password]
[--configurationFile configuration_file] [--dataSourceHost host_name]
[--reuseDC data_connection_name]
 [--help] [--silent] [--log]

Parameters

When you specify the long name of a parameter, you must type two dashes (--) before the parameter. For example, --action. When you specify the short name of a parameter, type only one dash (-). For example, -s.

Commands that use the discover option can use the following parameters.

Table 1. Parameters that can be used with the discover option
Parameter name Description
--action or -a

Required.

Use the discover option.

--server or -s

Optional.

Name of the services tier computer. If you specify a server, then you must specify a port.

--port or -p

Optional.

Port number to use on the services tier computer. The default HTTPS port is 9443. If you specify a port, you must specify a server.

--username or -u

Optional.

User name that is required for logging into InfoSphere® Information Server. The user must have the role of Common Metadata Administrator or Common Metadata Importer. You can use the --authfile parameter instead of specifying the --username and --password parameters. If you enter a user name without entering a password you are prompted for a password when you run the command.

--password or -w

Optional.

Password for the specified user name to log into InfoSphere Information Server. You can use the --authfile parameter instead of specifying the --username and --password parameters.

By default, passwords are saved. This default setting is controlled on the Import Settings page of the Administration tab of InfoSphere Metadata Asset Manager.

--dataSourceHost or -dsh

Optional.

Host name of the computer that hosts the data source. If you specify -dsh on the command line, that value is used. Otherwise, the targetHost value from the JSON configuration file is used.

--configurationFile or -cf

Specifies the JSON configuration file that supplies discovery parameters. See the sample JSON configuration files below.

--reuseDC or -redc

Optional.

Name of an existing data connection to reuse when you invoke the discover command.

--help or -h

Optional.

If you use --help with other options, the other options are ignored. Prints the list of actions and parameters. The help command is automatically issued when you issue a command that contains a syntax error, such as a typographical error, an improperly cased parameter or argument, or when the command is missing a required parameter.

--log or -lg

Optional.

Prints runtime log messages to the console while you run the command line. You can use the log to debug issues that arise when you use InfoSphere Metadata Asset Manager. The log includes the details of the HTTPS call that is made to the server, and stack trace information for any exceptions that are thrown.

Discover command format

imam --action discover -s <<server name>> -p <<port number>> -u <<username>> -w <<password>> -dsh <<target host name>>

The following command invokes discover and reuses an existing data connection:

imam --action discover -s <<server name>> -p <<port number>> -u <<username>> -w <<password>> -dsh <<target host name>> -redc <<dcName>>

The following command invokes discover with the configuration file option:

imam --action discover -u <<username>> -w <<password>> -cf <<configuration file>> -dsh <<target host name>>

The following command invokes discover with the configuration file option when a data connection already exists and you want to reuse it:

imam --action discover -redc <<reuse data connection>> -p <<port number>> -u <<username>> -w <<password>> -cf <<configuration file>> -dsh <<target host name>>

Sample XML file

<?xml version="1.0" encoding="UTF-8"?>
<ImportParameters release="11.5.0.1" bridgeVersion="9.1_1.0" bridgeId="CAS/DB2Connector__9.1" bridgeDisplayName="IBM InfoSphere DB2 Connector">
  <CompositeParameter type="DATA_CONNECTION" id="DataConnection" displayName="Data connection" isRequired="true">
    <Parameter id="dcName_" displayName="Name" isRequired="true">
      <value>DB2_For_Demo_3</value>
    </Parameter>
    <Parameter id="dcDescription_" displayName="Description">
      <value/>
    </Parameter>
    <Parameter id="Database" displayName="Database/Location" isRequired="true">
      <value>SAMPLE</value>
    </Parameter>
    <Parameter id="Username" displayName="User name">
      <value>username</value>
    </Parameter>
    <Parameter id="Password" displayName="Password">
      <value>password</value>
    </Parameter>
    <Parameter id="Instance" displayName="Instance">
      <value/>
    </Parameter>
  </CompositeParameter>
</ImportParameters>
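The data connection values in an ImportParameters file like the one above can be read programmatically. The following is a minimal sketch using Python's standard library; it is an illustration only and is not part of the imam command line. The shortened sample document repeats two of the parameters from the file above.

```python
# Sketch: extract Parameter id -> value pairs from an ImportParameters
# XML file such as the sample above (shortened to two parameters here).
import xml.etree.ElementTree as ET

SAMPLE = """<?xml version="1.0" encoding="UTF-8"?>
<ImportParameters release="11.5.0.1" bridgeVersion="9.1_1.0"
                  bridgeId="CAS/DB2Connector__9.1"
                  bridgeDisplayName="IBM InfoSphere DB2 Connector">
  <CompositeParameter type="DATA_CONNECTION" id="DataConnection"
                      displayName="Data connection" isRequired="true">
    <Parameter id="dcName_" displayName="Name" isRequired="true">
      <value>DB2_For_Demo_3</value>
    </Parameter>
    <Parameter id="Database" displayName="Database/Location" isRequired="true">
      <value>SAMPLE</value>
    </Parameter>
  </CompositeParameter>
</ImportParameters>"""

def connection_parameters(xml_text):
    """Return a dict of Parameter id -> value text ('' for empty <value/>)."""
    root = ET.fromstring(xml_text)
    params = {}
    for param in root.iter("Parameter"):
        value = param.find("value")
        params[param.get("id")] = (value.text or "") if value is not None else ""
    return params

print(connection_parameters(SAMPLE))
# → {'dcName_': 'DB2_For_Demo_3', 'Database': 'SAMPLE'}
```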

Sample JSON configuration file

{  
   "projectName":"IAProject1",
   "jobSteps":"columnAnalysis,termAssignment,qualityAnalysis",
   "dcRid":"b1c497ce.8e4c0a48.nrn5tro4h.v4diis2.no0ht3.j1q0v7vqd2mqmcm2rc77u",
   "importParameters":{  
      "isShallow":"false",
      "rootAssets":"folder[/dummy/new_file]",
      "fileTypeFilter":"",
      "Asset_description_already_exists":"Replace_existing_description",
      "targetHost":"HDFS_SAMPLE_IMPORT"
   }
}
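A configuration file with the same keys as the sample above can be generated programmatically, which helps avoid JSON syntax errors. The following is a minimal sketch; the project name is taken from the sample, and the dcRid value is a placeholder that you would replace with the RID of an existing data connection in your environment.

```python
# Sketch: build and validate a discover configuration file with the same
# keys as the sample above. The dcRid value is a placeholder; real values
# come from your environment.
import json

config = {
    "projectName": "IAProject1",
    "jobSteps": "columnAnalysis,termAssignment,qualityAnalysis",
    "dcRid": "<rid of an existing data connection>",  # placeholder
    "importParameters": {
        "isShallow": "false",
        "rootAssets": "folder[/dummy/new_file]",
        "fileTypeFilter": "",
        "Asset_description_already_exists": "Replace_existing_description",
        "targetHost": "HDFS_SAMPLE_IMPORT",
    },
}

with open("discover_config.json", "w") as fh:
    json.dump(config, fh, indent=3)

# Round-trip to confirm the file is well-formed JSON.
with open("discover_config.json") as fh:
    assert json.load(fh)["importParameters"]["targetHost"] == "HDFS_SAMPLE_IMPORT"
```

The resulting file can then be passed to the discover command with the -cf parameter.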

Sample JSON configuration file for Amazon S3 connector

{  
   "projectName":"IAProject1",
   "jobSteps":"columnAnalysis,termAssignment,qualityAnalysis",
   "dcRid":"b1c497ce.8e4c0a48.nrn5tro4h.v4diis2.no0ht3.j1q0v7vqd2mqmcm2rc77u",
   "importParameters":{  
      "isShallow":"false",
      "rootAssets":"s3bucketName",
      "S3BucketContents":"folder[bucketName/bucketContent]",
      "fileTypeFilter":"",
      "Asset_description_already_exists":"Replace_existing_description",
      "targetHost":"HDFS_SAMPLE_IMPORT"
   }
}