Commands, return codes, and session IDs

You can use commands to diagnose problems, determine the status of the different parts of the system, start and stop sessions, or start and stop the system.

In a multiple-server installation, you can run the commands from any server in your system. However, run them from the master server when possible: the master server can access information from all other servers in the system.

Most commands have the following formats:
esadmin command_name arguments
esadmin session_ID action -option

For more information about all commands, enter esadmin help. For more information about a specific command, enter esadmin action help.

Restriction: Running more than one esadmin request at the same time is not supported. The system does not block concurrent submissions of the command, but there is no mechanism to keep concurrent operations consistent if they collide: one operation might succeed while the others are likely to fail.
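
Because concurrent esadmin requests are not supported, scripts that drive esadmin are safer if they serialize their own calls. The following minimal sketch does that with an exclusive file lock; the lock path, the wrapper function, and the use of Python are illustrative assumptions, not part of the product.
import fcntl
import subprocess

LOCK_PATH = "/tmp/esadmin.lock"  # illustrative lock file, not created by the product

def run_esadmin(*args):
    # Hold an exclusive lock so that only one esadmin request runs at a time.
    with open(LOCK_PATH, "w") as lock:
        fcntl.flock(lock, fcntl.LOCK_EX)
        try:
            return subprocess.run(["esadmin", *args], capture_output=True, text=True)
        finally:
            fcntl.flock(lock, fcntl.LOCK_UN)

result = run_esadmin("system", "checkall")
print(result.stdout)
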
Command descriptions are divided into the following sections:

System-wide esadmin commands

Enter the following commands on one line.

Table 1. System-wide esadmin commands
Command Description
esadmin system startall
Starts the system components on all servers. Starts the common communication layer (CCL) on the local server only. To recycle the CCL, you must manually stop and restart the CCL on each remote server.
Sample command:
esadmin system startall
esadmin system stopall
Stops the system components on all servers. Stops the CCL on the local server only. To recycle the CCL, you must manually stop and restart the CCL on each remote server.
Sample command:
esadmin system stopall
esadmin system checkall
Checks the status of all components on all servers.
Sample command:
esadmin system checkall
esadmin database.node1 export 
[-range [from_hours_before],[to_hours_before]] 
-fname csv_file_in_absolute_path
Exports the query statistics history to a file in CSV format. If you omit the range option, all kept history records are written to the file. With the range option, records from the specified from_hours_before value to the to_hours_before value are written to the file.
Sample command: Export all records:
esadmin database.node1 export -fname /home/esadmin/query_all.csv
Sample command: Export records from the past 24 hours (from 24 hours before the current time until now):
esadmin database.node1 export -range 24, -fname /home/esadmin/query_last_24_hours.csv
Sample command: Export records up until 24 hours ago:
esadmin database.node1 export -range ,24 -fname /home/esadmin/query_before_24_hours.csv
Sample messages and return codes:
FFQC5303I Database session (node1) (sid: database.node1) CCL session exists. PID: 8357
FFQC5314I The following result occurred: 0

Component-specific esadmin commands

Enter the following commands on one line.

Table 2. Component-specific esadmin commands
Command Description
esadmin crawler_session_id start
Starts a crawler session. This command does not start any crawling activity.
Sample command:
esadmin col1.WEB1.esadmin start
Sample messages and return codes:
FFQC5310I WEBCrawler1 (sid: col1.WEB1.esadmin) 
is not running.
FFQC5314I Result: 0
esadmin crawler_session_id startCrawl 
-options option
Applicable only to data source (non-Web) crawlers. Starts the crawler to crawl new, modified, and deleted documents. Use -options 2 to start a full crawl or -options 3 to start crawling new and modified documents only.
Sample commands:
esadmin col3.DB21.esadmin startCrawl
esadmin col2.DB21.esadmin startCrawl -options 3
Sample messages and return codes:
FFQC5303I DB2Crawler1 (sid: col3.DB21.esadmin) 
is already running. PID: 23650
FFQC5314I Result: 0
esadmin crawler_session_id pause
Pauses crawling.
Sample command:
esadmin col3.DB21.esadmin pause
Sample messages and return codes:
FFQC5303I DB2Crawler1 (sid: col3.DB21.esadmin) 
is already running. PID: 23650
FFQC5314I Result: 0
esadmin crawler_session_id resume
Resumes crawling.
Sample command:
esadmin col3.DB21.esadmin resume
Sample messages and return codes:
FFQC5303I DB2Crawler1 (sid: col3.DB21.esadmin) 
is already running. PID: 23650
FFQC5314I Result: 0
esadmin crawler_session_id stopCrawl
Stops crawling.
Sample command:
esadmin col3.DB21.esadmin stopCrawl
Sample messages and return codes:
FFQC5303I DB2Crawler1 (sid: col3.DB21.esadmin) 
is already running. PID: 23650
FFQC5314I Result: 0
esadmin crawler_session_id stop
Stops a crawler session.
Sample command:
esadmin col3.DB21.esadmin stop
Sample messages and return codes:
FFQC5303I DB2Crawler1 (sid: col3.DB21.esadmin) 
is already running. PID: 23650
FFQC5314I Result: 0
esadmin crawler_session_id getCrawlerStatus
Gets the status of a crawler. The information that is returned depends on whether the crawler is a Web crawler or a crawler for any other data source.
Example for a Web crawler:
esadmin col1.WEB1.esadmin getCrawlerStatus
Possible return codes and messages for a Web crawler:
FFQC5303I WebCrawler1 (sid: col1.WEB1.esadmin) 
is already running. PID: 23650
Example for a non-Web crawler:
esadmin col3.DB21.esadmin getCrawlerStatus
Possible return codes and messages for a non-Web crawler:
FFQC5303I db2crawler (sid: db2col.DB2_96945) 
is already running. PID: 5936

For more information about returned status messages, see Web crawler status and Non-Web crawler status.

esadmin dscrawler_session_id 
getCrawlSpaceStatus

Gets general crawl space status for any crawler other than the Web crawler.

Sample command:
esadmin col3.DB21.esadmin getCrawlSpaceStatus
Sample messages and return codes:
FFQC5303I DB2Crawler1 (sid: col3.DB21.esadmin) 
is already running. PID: 23650

For more information about status messages, see Non-Web crawler status.

esadmin web_crawler_session_id 
getCrawlStatus 
-selections value

Gets general crawl space status for the Web crawler.

Sample command:
esadmin col1.WEB1.esadmin getCrawlStatus

For more information about returned status messages, see Web crawler status. For descriptions of the values that you can specify for the selections parameter, see Crawl space status for the Web crawler.

esadmin dscrawler_session_id 
getCrawlSpaceStatusDetail 
-ts target_server_id

Gets detailed crawl space status for any crawler other than a Web crawler. If you do not specify the target server option, data for all target servers is returned. For example, if the DB2 crawler crawls the FOUNTAIN and SAMPLE databases and you do not specify the target server option, the status of all tables in the FOUNTAIN and SAMPLE databases is returned.

Sample command:
esadmin col3.DB21.esadmin 
getCrawlSpaceStatusDetail -ts FOUNTAIN
Sample messages and return codes:
FFQC5303I DB2Crawler1 (sid: col3.DB21.esadmin) 
is already running. PID: 23650

For more information about returned status messages, see Non-Web crawler status.

esadmin webcrawler_session_id 
getCrawlDetailsPerSite 
-url string -selections num 
-threshold num

Gets detailed crawl space status for the Web crawler.

Sample command:
esadmin col1.WEB1.esadmin getCrawlDetailsPerSite

For more information about returned status messages, see Web crawler status. For descriptions of the values that you can specify for the selections and threshold parameters, see Detailed crawl space status for the Web crawler.

esadmin monitor 
getCollectionParserMonitorStatus 
-cid collection_ID
Gets the parser status.
Sample command:
esadmin monitor getCollectionParserMonitorStatus 
-cid col1
Sample messages and return codes:
FFQC5303I Monitor (node1) (sid: monitor) 
is already running. PID: 12543
esadmin controller startIndexBuild
-cid collection_id
Starts an index build.
Sample command:
esadmin controller startIndexBuild -cid col_1
esadmin controller stopIndexBuild
-cid collection_id
Stops an index build.
Sample command:
esadmin controller stopIndexBuild -cid col_1
esadmin startSearch -cid collection_id
Starts the search server processes.
Sample command:
esadmin startSearch -cid col1
Sample messages and return codes:
FFQC5303I Controller (node1) (sid: controller) 
is already running. PID: 25917
FFQC5314I Result: 0
esadmin stopSearch -cid collection_id
Stops the search server processes.
Sample command:
esadmin stopSearch -cid col1
Sample messages and return codes:
FFQC5303I Controller (node1) (sid: controller) 
is already running. PID: 15292
FFQC5314I Result: 0
esadmin monitor 
getCollectionSearchMonitorStatus 
-cid collection_id
esadmin collection_id.
searcher_session_id
getSearchStatusDetail
esadmin searchmanager_session_id
getStatus -cid collection_id
Gets the status of the search server.
Sample command:
esadmin monitor getCollectionSearchMonitorStatus
-cid col_1
Sample messages and return codes:
FFQC5303I Monitor (node1) (sid: monitor)
is already running. PID: 12649

Returns detailed search index status information for a collection on a given search server. There is one search manager session per search server. Each search manager session is responsible for monitoring and operating the search indexes on a specific search server.

Sample command:
esadmin col_1.searcher.node1 getSearchStatusDetail
Sample messages and return codes:
FFQC5303I Index searcher - Collection col_1 (node1)
(sid: col_1.searcher.node1)
CCL session exists. PID: 5977
FFQC5314I The following result occurred: 
<SearchStatusDetail>
<NumberOfDocuments>24</NumberOfDocuments>
<LastRefreshedTime>1331618922665</LastRefreshedTime>
<IndexVersion>82</IndexVersion>
<NumberOfQueryExecutions>40</NumberOfQueryExecutions>
<NumberOfCacheHits>0</NumberOfCacheHits>
<RecentAverageResponseTime>0</RecentAverageResponseTime>
<RecentQueryRate>0</RecentQueryRate>
Sample command:
esadmin searchmanager.node1 getStatus -cid col_1
Sample messages and return codes:
FFQC5303I Search Manager (node1) 
(sid: searchmanager.node1) 
CCL session exists. PID: 12614
FFQC5314I The following result occurred: PID=5977
CacheHits=0
RefreshedTime=1331618922665
IndexVersion=82
QueryRate=0
Port=59855
SessionId=col_1.searcher.node1
CacheHitRate=0.0
Status=1
ResponseTime=0
SessionName=col_1.searcher.node1.1

For more information about returned status messages, see Search server status.

Web crawler status

When you run the command to obtain Web crawler status, the command returns information in an XML document format. The following information can be returned by the Web crawler status command:
FFQC5314I Result: <?xml version='1.0' encoding='UTF-8'?>
<CrawlerStatus>
<CrawlerRunLevel Value="Running"/>
<CrawlerThreadStateDist Count="4" Total="200">
<CrawlerThreadState State="FETCHING" Count="100"/>
. . . 
</CrawlerThreadStateDist>
<ActiveBucketList Count="500">
<ActiveBucket URL="http://w3.ibm.com/"
                 NumActURLs="355"
                 NumProcURLs="350"
                 TimeRem="5" Duration="1195"/>
. . . 
</ActiveBucketList>
<CrawlRate Value="75"/>
<RecentlyCrawledURLList Count="40">
<RecentlyCrawledURL URL="http://example.server.com/example.html"/>
<RecentlyCrawledURL URL="http://example.server.com/example.html"/>
</RecentlyCrawledURLList>
<NumURLsThisSession Value="160000"/>
</CrawlerStatus>
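
For scripted monitoring, the status document can be parsed with any XML library. A minimal sketch that uses Python's standard xml.etree module follows; it assumes that the FFQC5314I prefix has already been stripped and that the captured XML is well formed (the abbreviated document below is hand-built for the example):
import xml.etree.ElementTree as ET

status_xml = """<CrawlerStatus>
<CrawlerRunLevel Value="Running"/>
<CrawlerThreadStateDist Count="1" Total="200">
<CrawlerThreadState State="FETCHING" Count="100"/>
</CrawlerThreadStateDist>
<CrawlRate Value="75"/>
</CrawlerStatus>"""

root = ET.fromstring(status_xml)
print("Run level:", root.find("CrawlerRunLevel").get("Value"))
print("Crawl rate (pages per second):", root.find("CrawlRate").get("Value"))
for state in root.iter("CrawlerThreadState"):
    print("Threads in state", state.get("State") + ":", state.get("Count"))
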
The following table describes each XML element and its possible attributes that are returned by the Web crawler status command:
Table 3. Web crawler status information
Element Attributes Description
CrawlerStatus
  • CrawlerThreadStateDist
  • ActiveBucketList
  • CrawlRate
  • RecentlyCrawledURLList
  • NumURLsThisSession
Crawler status.
CrawlerRunLevel Value: String (English)
  • "Not started": The crawler session exists, but it has not yet received the start message to process documents.
  • "Started": The crawler is starting.
  • "Running": The crawler finished initialization and startup and is actively crawling.
  • "Paused": The crawler was told to suspend active crawling, but not to exit.
  • "Stopping": The crawler received the stop signal and is going to stop.
  • "Error": The crawler is in an unrecoverable state, and it must be stopped and restarted to resume crawling.
Information about what the crawler is doing.
CrawlerThreadState State String (English) Crawler thread activity. This field shows what the thread or threads are doing.
ActiveBucket
  • URL: String (URL spec)

    The protocol, host and port whose URLs are being crawled.

  • NumActURLs: Integer (positive)

    The number of URLs in the bucket when it was made available for crawling (activated).

  • NumProcURLs: Integer (nonnegative)

    The number of URLs from the bucket that have been processed so far, either crawled or rejected.

  • TimeRem: Integer

    The number of seconds remaining before the bucket times out.

  • Duration: Integer (nonnegative)

    The number of seconds since the bucket was activated.

The current activity of a specified Web site.
CrawlRate Value: Integer (nonnegative)

Pages per second being crawled (all buckets combined).

The crawler throughput measurement.
RecentlyCrawledURL URL: String (URL spec)

String specifying a protocol, host, port and file that was crawled.

A page that was crawled recently.
NumURLsThisSession Value: Integer (nonnegative) The number of URLs that were crawled since this instance of the crawler (process) started crawling.

Crawl space status for the Web crawler

When you run the command to obtain crawl space status for a Web crawler, the command returns information in an XML document format. The following information can be returned by a Web crawl space status command. The selections parameter values are masks. For example, to see status information for mask 1 (Number of pages in raw data store) and status information for mask 2 (Number of discovered sites), add the mask values and specify -selections=3.

Table 4. Selection mask values for the Web crawler crawl space status command
Mask bit Selects
1 Number of pages in raw data store.
2 Number of discovered sites.
4 Number of sites with DNS.
8 Number of sites without DNS.
16 Number of discovered URLs.
32 Number of unique saved pages.
64 Number of crawled URLs.
128 Number of URLs that are uncrawled.
256 Number of URLs that are overdue.
512 HTTP status code distribution.
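
The mask values in Table 4 are combined by simple addition. A small sketch of building the -selections value in Python (the dictionary names are illustrative, not product terms):
# Selection mask bits from Table 4.
MASKS = {
    "pages_in_rds": 1,
    "sites_discovered": 2,
    "sites_with_dns": 4,
    "sites_without_dns": 8,
    "urls_discovered": 16,
    "unique_pages_saved": 32,
    "urls_crawled": 64,
    "urls_uncrawled": 128,
    "urls_overdue": 256,
    "http_code_distribution": 512,
}
# Request the raw data store count and the discovered site count: 1 + 2 = 3.
selections = MASKS["pages_in_rds"] + MASKS["sites_discovered"]
print(f"esadmin col1.WEB1.esadmin getCrawlStatus -selections={selections}")
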
All values represent cumulative totals for all sessions that use the current internal database:
<CrawlStatus>
  <NumPagesInRDS Value="5422386"/>
  <NumSitesDiscovered Value="15332"/>
  <NumSitesWithDNS Value="14832"/>
  <NumSitesWithoutDNS Value="500"/>
  <NumURLsDiscovered Value="15222999"/>
  <NumUniquePagesSaved Value="6234789"/>
  <NumURLsCrawled Value="7800422"/>
  <NumURLsUncrawled Value="7422577"/>
  <NumURLsOverdue Value="14000"/>
  <HTTPCodeDist Count="4" Total="1031000">
    <HTTPCode Code="200" Count="1000000"/>
    <HTTPCode Code="301" Count="1000"/>
    <HTTPCode Code="404" Count="10000"/>
    <HTTPCode Code="780" Count="20000"/>
  </HTTPCodeDist>
</CrawlStatus>
The returned data can contain any, all, or none of the following elements:
Table 5. Crawl space status information for the Web crawler
Element Attribute Description
CrawlStatus
  • NumPagesInRDS
  • NumSitesDiscovered
  • NumSitesWithDNS
  • NumSitesWithoutDNS
  • NumURLsDiscovered
  • NumUniquePagesSaved
  • NumURLsCrawled
  • NumURLsUncrawled
  • NumURLsOverdue
  • HTTPCodeDist
Information that can be quickly obtained about the cumulative state of the crawl (all sessions).
NumPagesInRDS Value: Nonnegative integer

How many pages are currently in the raw data store (RDS) staging area (from this crawler only).

How full the raw data store (RDS) is becoming (from this crawler's contributions only).
NumSitesDiscovered Value: Nonnegative integer

How many hosts were discovered by crawling (or from seeds).

A measure of the crawler's coverage of the domain to be crawled (host count).
NumSitesWithDNS Value: Nonnegative integer

How many hosts have associated IP addresses (resolved by the crawler in background).

A measure of how effectively the crawler is able to get IP addresses for hosts that are discovered by DNS names in URLs.
NumSitesWithoutDNS Value: Nonnegative integer

How many hosts do not have associated IP addresses (resolved by the crawler in background).

A measure of how effectively the crawler is able to get IP addresses for hosts that are discovered by DNS names in URLs.
NumURLsDiscovered Value: Nonnegative integer

How many unique URLs were visited by the crawler.

A measure of the crawler's coverage of the domain to be crawled (URL count).
NumUniquePagesSaved Value: Nonnegative integer

How many unique pages were written to the RDS for further processing by other components.

This crawler's contribution to the size of the index.
NumURLsCrawled Value: Nonnegative integer

How many unique URLs were crawled by the crawler.

A measure of the crawler's ability to process data, end to end. This number differs from the number of pages that are written to the RDS because not all crawled pages are written to the RDS.
NumURLsOverdue Value: Nonnegative integer

How many unique URLs are eligible to be recrawled.

A measure of the crawler's ability to traverse the Web space.

Detailed crawl space status for the Web crawler

When you run the command to obtain detailed crawl space status for the Web crawler, the command returns information in an XML document format. The following information can be returned by the detailed crawl space status command. The selections parameter values are masks. For example, to see status information for mask 1 (Number of pages in raw data store) and status information for mask 2 (Number of discovered sites), add the mask values and specify -selections=3.

Table 6. Selection mask values for the Web crawler detailed crawl space status command
Mask bit Selects
1 Number of pages in raw data store.
2 Number of discovered sites.
4 Number of sites with DNS.
8 Number of sites without DNS.
16 Number of discovered URLs.
32 Number of unique saved pages.
64 Number of crawled URLs.
128 Number of URLs that are uncrawled.
256 Number of URLs that are overdue.
512 HTTP status code distribution.
Sample returned information:
<CrawlDetailsPerSite>
  <Site URL="http://w3.ibm.com/">
  <NumURLsDiscovered Value="5422386"/>
  <NumURLsOverdue Value="15332"/>
  <NumURLsCrawled Value="15332"/>
  <NumURLsUncrawled Value="15332"/>
  <NumURLsOverdueBy Threshold="604800" Value="14832"/>
  <NumURLsActivated Value="2200"/>
  <LastActivationTime Value="1076227340"/>
  <LastActivationDuration Value="4300"/>
  <IPAddressList Count="1">
    <IPAddress Value="9.205.41.33"/>
  </IPAddressList>
  <RobotsContent>
   robots content. . . 
  </RobotsContent>
  <HTTPCodeDist Count="4" Total="1031000">
    <HTTPCode Code="200" Count="1000000"/>
    <HTTPCode Code="301" Count="1000"/>
    <HTTPCode Code="404" Count="10000"/>
    <HTTPCode Code="780" Count="20000"/>
  </HTTPCodeDist>
  </Site>
</CrawlDetailsPerSite>
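
A minimal sketch of reading the per-site HTTP status code distribution from this document with Python's standard xml.etree module; it assumes that the child elements are nested inside each Site element as shown above and that the captured XML is well formed (the abbreviated document is hand-built for the example):
import xml.etree.ElementTree as ET

details_xml = """<CrawlDetailsPerSite>
<Site URL="http://w3.ibm.com/">
<NumURLsCrawled Value="15332"/>
<HTTPCodeDist Count="2" Total="1010000">
<HTTPCode Code="200" Count="1000000"/>
<HTTPCode Code="404" Count="10000"/>
</HTTPCodeDist>
</Site>
</CrawlDetailsPerSite>"""

root = ET.fromstring(details_xml)
for site in root.iter("Site"):
    print("Site:", site.get("URL"))
    for code in site.iter("HTTPCode"):
        print("  HTTP", code.get("Code"), "count:", code.get("Count"))
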
The following table describes each field that is returned for the Web crawler detailed crawl space status:
Table 7. Detailed crawl space status information for the Web crawler
Element Attributes Description
CrawlDetailsPerSite
  • LastActivationTime:
  • LastActivationDuration:
  • IPAddressList:
  • RobotsContent:
  • HTTPCodeDist:
Information that can be quickly obtained about the detailed state of one site.
Site URL URL of the site root page.
NumURLsDiscovered Value The number of URLs that were discovered from the site.
NumURLsOverdue Value The number of URLs that are eligible to be recrawled from the site.
NumURLsCrawled Value The number of URLs that were crawled for the site.
NumURLsUncrawled Value The number of URLs that are not yet crawled for the site.
NumURLsOverdueBy Threshold, Value: Integer (positive or negative)

The value represents the number of URLs that are eligible to be recrawled. The threshold specifies the amount of time that the URLs have been waiting to be recrawled. The threshold is measured as the number of seconds offset from the current time. If the threshold is negative, it means that a recrawl of the URLs is overdue. If the threshold is positive, it means that a recrawl of the URLs is due to occur.

The number of URLs that became eligible to be recrawled at least some number of seconds ago or that are becoming eligible to be recrawled in the next so many seconds.
NumURLsActivated Value Number of URLs brought into memory during the last scan of this site and made available to crawler threads.
LastActivationTime Value The number of seconds since epoch at which this site's URLs were last brought into memory.
LastActivationDuration Value The number of seconds that this site's URLs were last in memory and available to crawler threads.
IPAddressList IPAddress All known IP addresses for this site's server host.
IPAddress Value IPv4 dot-notation address for the site's server host.
RobotsContent Text Text from the robots file, if any text exists.
HTTPCodeDist HTTPCode Distribution of HTTP codes from this site's attempted downloads.
HTTPCode Code: Integer

An HTTP status code or another internal code.

How many times a particular HTTP status code occurred during the crawl of this site.

Detailed status for servers crawled by the Web crawler

When you run the command to obtain detailed crawl space status for the Web crawler, the command returns information in an XML document format. The following information about specific Web servers that are crawled can be returned by the detailed crawl space status command.

Sample returned information:
<ServerStatus>
  <Server Name ="www.example.com">
     <Status>1</Status> 
     <StatusMessage>Running</StatusMessage>
<NumberOfURLs></NumberOfURLs>
     <NumberOfCrawledURLs>109771</NumberOfCrawledURLs>
     <NumberOfInsertedURLs>1030</NumberOfInsertedURLs>
     <NumberOfUpdatedURLs>2053</NumberOfUpdatedURLs>
     <NumberOfDeletedURLs>2045</NumberOfDeletedURLs>
     <HTTPStatus value="200">100000</HTTPStatus>
     <HTTPStatus value="404">2045</HTTPStatus>
      …
     <StartTime>1100497759852</StartTime>
     <EndTime></EndTime>
     <TotalTime>1395055</TotalTime>
  </Server>
</ServerStatus>
The following table describes each field that is returned for servers that are crawled by the Web crawler:
Table 8. Detailed status information for servers crawled by the Web crawler
Element Attributes Description
ServerStatus Name The host name of the crawled server.
Status The crawler status for this server. Values are 0: Idle; 1: Running; -1: Error.
StatusMessage The string representation of the Status element. Idle: Idle; Running: Running; Error: Error.
NumberOfURLs The number of known URLs for this server.
NumberOfCrawledURLs The number of documents crawled during this session on this server.
NumberOfInsertedURLs The number of documents inserted during this session on this server.
NumberOfUpdatedURLs The number of documents updated during this session on this server.
NumberOfDeletedURLs The number of documents deleted during this session on this server.
HTTPStatus Value (HTTP return code) The number of documents with this HTTP status code.
StartTime The time when this crawler started crawling this server.
EndTime The time this crawler stopped crawling this server.
TotalTime The total time spent to crawl this server, in milliseconds.

Non-Web crawler status

When you run the command to obtain crawler status for a non-Web crawler, the command returns information in an XML document format. The following information can be returned by the getCrawlerStatus command for non-Web crawlers:
FFQC5314I Result: <?xml version='1.0' encoding='UTF-8'?>
<GeneralStatus>
<Status>0</Status>
<StatusMessage>Idle</StatusMessage>
<NumberOfServers>1</NumberOfServers>
<NumberOfCompletedServers>1</NumberOfCompletedServers>
<NumberOfTargets>3</NumberOfTargets>
<NumberOfCompletedTargets>3</NumberOfCompletedTargets>
<NumberOfCrawledRecords>115</NumberOfCrawledRecords>
<RunningThreads>0</RunningThreads>
</GeneralStatus>
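
A short sketch of reading the overall state and target progress from this document with Python's standard xml.etree module; the abbreviated XML is hand-built for the example and the FFQC5314I prefix is assumed to be stripped:
import xml.etree.ElementTree as ET

general_xml = """<GeneralStatus>
<Status>0</Status>
<StatusMessage>Idle</StatusMessage>
<NumberOfTargets>3</NumberOfTargets>
<NumberOfCompletedTargets>3</NumberOfCompletedTargets>
<NumberOfCrawledRecords>115</NumberOfCrawledRecords>
</GeneralStatus>"""

status = ET.fromstring(general_xml)
targets = int(status.findtext("NumberOfTargets"))
completed = int(status.findtext("NumberOfCompletedTargets"))
print("Crawler state:", status.findtext("StatusMessage"))
print(f"Completed targets: {completed} of {targets}")
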
The following tables describe the XML elements and attributes for each crawler except for the Web crawler. This information is returned with the crawler status command.
Table 9. Crawler status information for the NNTP, DB2, JDBC database, and Notes crawlers
Element and attribute name NNTP crawler DB2 and JDBC database crawlers Notes crawler
Status Status (0, 1, 2, -1) Status (0, 1, 2, -1) Status (0, 1, 2, -1)
StatusMessage Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error
NumberOfServers The number of NNTP servers in the crawl space. The number of databases in the crawl space. The number of databases in the crawl space.
NumberOfCompletedServers The number of crawled NNTP servers. The number of crawled databases. The number of crawled databases.
NumberOfTargets The number of news groups in the crawl space. The number of databases in the crawl space. The number of views and folders in the crawl space.
NumberOfCompletedTargets The number of crawled news groups. The number of crawled tables. The number of crawled views and folders.
NumberOfCompletedRecords The number of crawled articles. The number of crawled records. The number of crawled documents.
RunningThreads The number of crawler threads. The number of crawler threads. The number of crawler threads.
Table 10. Crawler status information for the Exchange Server, Content Manager, and Content Integrator crawlers
Element and attribute name Exchange Server crawler Content Manager crawler Content Integrator crawler
Status Status (0, 1, 2, -1) Status (0, 1, 2, -1) Status (0, 1, 2, -1)
StatusMessage Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error
NumberOfServers The number of Exchange Server servers in the crawl space. The number of Content Manager EE servers in the crawl space. The number of repositories in the crawl space.
NumberOfCompletedServers The number of crawled Exchange Server servers. The number of crawled Content Manager EE servers. The number of crawled repositories.
NumberOfTargets The number of subfolders in the crawl space. The number of item types in the crawl space. The number of classes in the crawl space.
NumberOfCompletedTargets The number of crawled subfolders. The number of crawled item types. The number of crawled item classes.
NumberOfCompletedRecords The number of crawled documents. The number of crawled documents. The number of crawled documents.
RunningThreads The number of crawler threads. The number of crawler threads. The number of crawler threads.
Table 11. Crawler status information for the Quickr for Domino, UNIX file system, and Windows file system crawlers
Element and attribute name Quickr for Domino crawler UNIX file system and Windows file system crawlers
Status Status (0, 1, 2, -1) Status (0, 1, 2, -1)
StatusMessage Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error
NumberOfServers The number of places in the crawl space. Fixed value of 1.
NumberOfCompletedServers The number of crawled places. 0 or 1 if all subdirectories are crawled.
NumberOfTargets The number of place databases and room databases in the crawl space. The number of subdirectories in the crawl space.
NumberOfCompletedTargets The number of crawled place databases and room databases. The number of crawled subdirectories.
NumberOfCompletedRecords The number of crawled documents. The number of crawled files.
RunningThreads The number of crawler threads. The number of crawler threads.
Table 12. Crawler status information for the Seed list, SharePoint, and FileNet P8 crawlers
Element and attribute name Seed list crawler SharePoint crawler FileNet P8 crawler
Status Status (0, 1, 2, -1) Status (0, 1, 2, -1) Status (0, 1, 2, -1)
StatusMessage Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error
NumberOfServers The number of seed lists in the crawl space (always 1). The number of SharePoint elements to be crawled. The number of FileNet® P8 elements (classes or folders) to be crawled.
NumberOfCompletedServers The number of crawled seed lists. The number of crawled elements. The number of crawled elements.
NumberOfTargets The number of seed lists in the crawl space (always 1). The number of SharePoint elements to be crawled. The number of FileNet P8 elements (classes or folders) to be crawled.
NumberOfCompletedTargets The number of crawled seed lists. The number of crawled elements. The number of crawled elements.
NumberOfCompletedRecords The number of crawled documents. The number of crawled documents. The number of crawled files.
RunningThreads The number of crawler threads. The number of crawler threads. The number of crawler threads.
Table 13. Crawler status information for the BoardReader and Agent for Windows file systems crawlers
Element and attribute name BoardReader crawler Agent for Windows file systems crawler
Status Status (0, 1, 2, -1) Status (0, 1, 2, -1)
StatusMessage Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error Status: 0 - Idle, 1 - Running, 2 - Paused, -1 - Error
NumberOfServers The number of target BoardReader sources. The number of specified directories to be crawled.
NumberOfCompletedServers The number of crawled sources. The number of crawled directories that are explicitly defined in the configuration.
NumberOfTargets The number of target BoardReader sources. The number of specified directories to be crawled.
NumberOfCompletedTargets The number of crawled sources. The number of crawled directories that are explicitly defined in the configuration.
NumberOfCompletedRecords The number of crawled documents. The number of crawled documents.
RunningThreads The number of crawler threads. The number of crawler threads.

Crawl space status for non-Web crawlers

When you run the command to obtain crawl space status for a non-Web crawler, the command returns information in an XML document format. The following information can be returned by the getCrawlSpaceStatus command for non-Web crawlers:
FFQC5314I Result: <?xml version='1.0' encoding='UTF-8'?>
<ServerStatus>
   <Server Name ="FOUNTAIN">
     <Status>5</Status>
     <StatusMessage>Scheduled</StatusMessage>
     <NumberOfTargets>1</NumberOfTargets>
     <NumberOfCompletedTargets>1</NumberOfCompletedTargets>
     <NumberOfErrors>0</NumberOfErrors>
     <StartTime>1118354510512</StartTime>
     <EndTime>1118354514386</EndTime>
     <ScheduleConfigured>2</ScheduleConfigured>
     <ScheduleTime>1118393377000</ScheduleTime>
     <TotalTime>3874</TotalTime>
  </Server>
</ServerStatus>
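
The StartTime, EndTime, and ScheduleTime values in this sample appear to be milliseconds since the UNIX epoch; that is an assumption based on the 13-digit sample values and on TotalTime being documented in milliseconds. A small sketch that converts them to readable timestamps:
from datetime import datetime, timezone

def to_utc(millis):
    # Convert an epoch value in milliseconds to a UTC datetime.
    return datetime.fromtimestamp(int(millis) / 1000, tz=timezone.utc)

print("StartTime:   ", to_utc("1118354510512"))
print("EndTime:     ", to_utc("1118354514386"))
print("ScheduleTime:", to_utc("1118393377000"))
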
The following tables describe the XML elements and attributes for each crawler except for the Web crawler. This information is returned with the crawl space status command. For Notes crawlers, when the aggregation level is 0, Server@Name is server name + database name. When the aggregation level is 1, Server@Name is server name + directory name.
Table 14. Crawl space status information for the NNTP, DB2, JDBC database, and Notes crawlers
Element and attribute name NNTP crawler DB2 and JDBC database crawlers Notes crawler
Server@Name News server name Database name Database name or directory name
Server/Status Status: (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/NumberOfTargets The number of news groups in the crawl space. The number of databases in the crawl space. The number of views and folders or directories in the crawl space.
Server/NumberOfCompletedTargets The number of crawled news groups. The number of crawled tables. The number of crawled views and folders or directories.
Server/NumberOfErrors The number of errors. The number of errors. The number of errors.
Server/StartTime The start time if applicable. The start time if applicable. The start time if applicable.
Server/EndTime The end time if applicable. The end time if applicable. The end time if applicable.
Server/ScheduleConfigured 0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
Server/ScheduleTime Schedule time if applicable. Schedule time if applicable. Schedule time if applicable.
Server/TotalTime The total time if applicable. The total time if applicable. The total time if applicable.
Server/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode. 0, 1:
  • 0: The Notes crawler crawls documents with normal mode. (The other crawlers except for the Notes crawler always return 0.)
  • 1: The Notes crawler crawls documents with directory mode.
Table 15. Crawl space status information for the Exchange Server, Content Manager, and Content Integrator crawlers
Element and attribute name Exchange Server crawler Content Manager crawler Content Integrator crawler
Server@Name Exchange Server server name. Content Manager EE servers. Repository name.
Server/Status Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/NumberOfTargets The number of subfolders in the crawl space. The number of item types in the crawl space. The number of item classes in the crawl space.
Server/NumberOfCompletedTargets The number of crawled subfolders. The number of crawled item types. The number of crawled item classes.
Server/NumberOfErrors The number of errors. The number of errors. The number of errors.
Server/StartTime The start time if applicable. The start time if applicable. The start time if applicable.
Server/EndTime The end time if applicable. The end time if applicable. The end time if applicable.
Server/ScheduleConfigured 0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
Server/ScheduleTime Schedule time if applicable. Schedule time if applicable. Schedule time if applicable.
Server/TotalTime The total time if applicable. The total time if applicable. The total time if applicable.
Server/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode.
Table 16. Crawl space status information for the Quickr for Domino, UNIX file system, and Windows file system crawlers
Element and attribute name Quickr for Domino crawler UNIX file system and Windows file system crawlers
Server@Name Place directory A fixed value of localhost.
Server/Status Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/NumberOfTargets The number of place databases and room databases in the crawl space. The number of subdirectories in the crawl space.
Server/NumberOfCompletedTargets The number of crawled place databases and room databases. The number of crawled subdirectories.
Server/NumberOfErrors The number of errors. The number of errors.
Server/StartTime The start time if applicable. The start time if applicable.
Server/EndTime The end time if applicable. The end time if applicable.
Server/ScheduleConfigured 0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
Server/ScheduleTime Schedule time if applicable. Schedule time if applicable.
Server/TotalTime The total time if applicable. The total time if applicable.
Server/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode.
Table 17. Crawl space status information for the Seed list, SharePoint, and FileNet P8 crawlers
Element and attribute name Seed list crawler SharePoint crawler FileNet P8 crawler
Server@Name Seed list server name Library database A fixed value of localhost.
Server@SpaceId N/A Crawl space internal ID. Crawl space internal ID.
Server@SpaceLabel N/A SharePoint element full name. FileNet P8 element full name.
Server/Status Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/NumberOfTargets The number of seed lists in the crawl space (always 1). The number of SharePoint elements to be crawled. The number of FileNet P8 elements to be crawled.
Server/NumberOfCompletedTargets The number of crawled seed lists. The number of crawled elements. The number of crawled elements.
Server/NumberOfErrors The number of errors. The number of errors. The number of errors.
Server/StartTime The start time if applicable. The start time if applicable. The start time if applicable.
Server/EndTime The end time if applicable. The end time if applicable. The end time if applicable.
Server/ScheduleConfigured 0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
Server/ScheduleTime Schedule time if applicable. Schedule time if applicable. Schedule time if applicable.
Server/TotalTime The total time if applicable. The total time if applicable. The total time if applicable.
Server/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode.
Table 18. Crawl space status information for the BoardReader and Agent for Windows file systems crawlers
Element and attribute name BoardReader crawler Agent for Windows file systems crawler
Server@Name Target source name. Directory name.
Server@SpaceId Crawl space internal ID. Crawl space internal ID.
Server@SpaceLabel Target source full name. Directory full name.
Server/Status Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Status (0, 1, 2, 3, 4, 5, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • 5: Scheduled
  • -1: Error
Server/NumberOfTargets The number of BoardReader target sources. The number of specified directories to be crawled.
Server/NumberOfCompletedTargets The number of crawled sources. The number of crawled directories that are explicitly defined in the configuration.
Server/NumberOfErrors The number of errors. The number of errors.
Server/StartTime The start time if applicable. The start time if applicable.
Server/EndTime The end time if applicable. The end time if applicable.
Server/ScheduleConfigured 0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
0, 1, 2
  • 0: The crawler is not configured for scheduling according to the crawler configuration files.
  • 1: The crawler is configured for scheduling, but the scheduling was disabled for the session
  • 2: The crawler is configured for scheduling, and the scheduling is enabled for the session
Server/ScheduleTime Schedule time if applicable. Schedule time if applicable.
Server/TotalTime The total time if applicable. The total time if applicable.
Server/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode.

Detailed crawl space status for non-Web crawlers

When you run the command to obtain detailed crawl space status for non-Web crawlers, the command returns information in an XML document format. The following information can be returned by the getCrawlSpaceStatusDetail command for non-Web crawlers:
FFQC5314I Result: <?xml version='1.0' encoding='UTF-8'?>
<TargetStatus>
  <Target Name ="escmgr.crawlerinstances">
    <Status>2</Status>
    <StatusMessage>Completed</StatusMessage>
    <NumberOfRecords></NumberOfRecords>
    <NumberOfCrawledRecords>117</NumberOfCrawledRecords>
    <NumberOfInsertedRecords>21</NumberOfInsertedRecords>
    <NumberOfUpdatedRecords>45</NumberOfUpdatedRecords>
    <StartTime>1118354510727</StartTime>
    <EndTime>1118354514386</EndTime>
    <AggregationLevel>0</AggregationLevel>
  </Target>
</TargetStatus>
Table 19. Detailed crawl space status information for the NNTP, DB2, JDBC database, and Notes crawlers
Element and attribute name NNTP crawler DB2 and JDBC database crawlers Notes crawler
Target@Name News group name Table name View or folder name
Target@CrawlType Not applicable. 0,1 (DB2); 0 (JDBC database)
  • 0: Active crawl (Normal)
  • 1: Passive crawl (DB2 Event Publishing)
0
Target/Status Status: (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Status: (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Status: (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/NumberOfRecords The last article number on the server. The number of crawled records. The number of crawled documents.
Target/NumberOfCompletedRecords The number of crawled articles. The number of crawled records. The number of crawled documents.
Target/NumberOfInsertedRecords The number of newly posted articles. The number of inserted records. The number of inserted records.
Target/NumberOfUpdatedRecords Not applicable. The number of updated records. The number of updated records.
Target/NumberOfDeletedRecords Not applicable. The number of deleted records. The number of deleted records.
Target/StartTime The date and time that the crawler last started. The date and time that the crawler last started. The date and time that the crawler last started.
Target/EndTime The date and time that crawling was completed. The date and time that crawling was completed. The date and time that crawling was completed.
Target/TotalTime The amount of time that the crawler spent crawling. The amount of time that the crawler spent crawling. The amount of time that the crawler spent crawling.
Target/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode. 0, 1:
  • 0: The crawler crawls documents with normal mode.
  • 1: The crawler crawls documents with directory mode.
Target/LastUpdatedTime Not applicable. The last updated time:
  • 0: Active crawl (Normal)
  • 1: Passive crawl (DB2 Event Publishing)
Not applicable.
Target/LastResetTime Not applicable. The last time that the statistics were reset:
  • 0: Active crawl (Normal)
  • 1: Passive crawl (DB2 Event Publishing)
Not applicable.
Table 20. Detailed crawl space status information for the Exchange Server, Content Manager, and Content Integrator crawlers
Element and attribute name Exchange Server crawler Content Manager crawler Content Integrator crawler
Target@Name Subfolder name Item type name Item class name
Target@CrawlType 0 0 0
Target/Status Status (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Status (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Status (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/NumberOfRecords Not applicable. Not applicable. Not applicable.
Target/NumberOfCompletedRecords The number of crawled documents. The number of crawled documents. The number of crawled documents.
Target/NumberOfInsertedRecords The number of inserted records. The number of inserted records. The number of inserted records.
Target/NumberOfUpdatedRecords Not applicable. The number of updated records. The number of updated records.
Target/NumberOfDeletedRecords Not applicable. The number of deleted records. The number of deleted records.
Target/StartTime The date and time that the crawler last started. The date and time that the crawler last started. The date and time that the crawler last started.
Target/EndTime The date and time that crawling was completed. The date and time that crawling was completed. The date and time that crawling was completed.
Target/TotalTime The amount of time that the crawler spent crawling. The amount of time that the crawler spent crawling. The amount of time that the crawler spent crawling.
Target/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode.
Target/LastUpdatedTime Not applicable. Not applicable. Not applicable.
Target/LastResetTime Not applicable. Not applicable. Not applicable.
Table 21. Detailed crawl space status information for the Quickr for Domino, UNIX file system, and Windows file system crawlers
Element and attribute name Quickr for Domino crawler UNIX file system and Windows file system crawlers
Target@Name Place database name or room database name Subdirectory name
Target@CrawlType 0 0
Target/Status Status (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Status (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/NumberOfRecords Not applicable. Not applicable.
Target/NumberOfCompletedRecords The number of crawled documents. The number of crawled files.
Target/NumberOfInsertedRecords The number of inserted records. The number of inserted records.
Target/NumberOfUpdatedRecords The number of updated records. The number of updated records.
Target/NumberOfDeletedRecords The number of deleted records. The number of deleted records.
Target/StartTime The date and time that the crawler last started. The date and time that the crawler last started.
Target/EndTime The date and time that crawling was completed. The date and time that crawling was completed.
Target/TotalTime The amount of time that the crawler spent crawling. The amount of time that the crawler spent crawling.
Target/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode.
Target/LastUpdatedTime Not applicable. Not applicable.
Target/LastResetTime Not applicable. Not applicable.
Table 22. Detailed crawl space status information for the Seed list, SharePoint, and FileNet P8 crawlers
Element and attribute name Seed list crawler SharePoint crawler FileNet P8 crawler
Target@Name Seed list server address Item type name Item class name
Target@SpaceId N/A Crawl space internal ID Crawl space internal ID
Target@SpaceLabel N/A SharePoint element full name FileNet P8 element full name
Target@CrawlType 0 0 0
Target/Status Status (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Status (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Status (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/NumberOfRecords Not applicable. Not applicable. Not applicable.
Target/NumberOfCompletedRecords The number of crawled records. The number of crawled documents. The number of crawled documents.
Target/NumberOfInsertedRecords The number of inserted records. The number of inserted records. The number of inserted records.
Target/NumberOfUpdatedRecords The number of updated records. The number of updated records. The number of updated records.
Target/NumberOfDeletedRecords The number of deleted records. The number of deleted records. The number of deleted records.
Target/StartTime The date and time that the crawler last started. The date and time that the crawler last started. The date and time that the crawler last started.
Target/EndTime The date and time that crawling was completed. The date and time that crawling was completed. The date and time that crawling was completed.
Target/TotalTime The amount of time that the crawler spent crawling. The amount of time that the crawler spent crawling. The amount of time that the crawler spent crawling.
Target/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode.
Target/LastUpdatedTime Not applicable. Not applicable. Not applicable.
Target/LastResetTime Not applicable. Not applicable. Not applicable.
Table 23. Detailed crawl space status information for the BoardReader and Agent for Windows file systems crawlers
Element and attribute name BoardReader crawler Agent for Windows file systems crawler
Target@Name Target source name Directory name
Target@SpaceId Crawl space internal ID Crawl space internal ID
Target@SpaceLabel Target source full name Directory full name
Target@CrawlType 0 0
Target/Status Status: (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Status: (0, 1, 2, 3, 4, -1)
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/StatusMessage
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
  • 0: Not Crawled
  • 1: Crawling
  • 2: Completed (not scheduled)
  • 3: Waiting
  • 4: Paused
  • -1: Error
Target/NumberOfRecords Not applicable. Not applicable.
Target/NumberOfCompletedRecords The total number of crawled records. The total number of crawled records.
Target/NumberOfInsertedRecords The number of inserted records. The number of inserted records.
Target/NumberOfUpdatedRecords The number of updated records. The number of updated records.
Target/NumberOfDeletedRecords The number of deleted records. The number of deleted records.
Target/StartTime The date and time that the crawler last started. The date and time that the crawler last started.
Target/EndTime The date and time that crawling was completed. The date and time that crawling was completed.
Target/TotalTime The amount of time that the crawler spent crawling. The amount of time that the crawler spent crawling.
Target/AggregationLevel 0: The crawler crawls documents with normal mode. 0: The crawler crawls documents with normal mode.
Target/LastUpdatedTime Not applicable. Not applicable.
Target/LastResetTime Not applicable. Not applicable.

Parser status

When you run the command to obtain parser status, the command returns information in an XML document format. The following information can be returned by the parser status command:
FFQC5314I The following result occurred:
<Monitor Type="Parser">
<ParserStatus>
<Status>1</Status>
</ParserStatus>
</Monitor>
The following table describes the XML elements for information that is returned by the parser status command:
Table 24. Elements for the parser status command
Element Description
Status
  • 0: The parser session for this collection is stopped.
  • 1: The parser session for this collection is running.

Search server status

When you run the command to obtain search server status, the command returns information in an XML document format. The following information can be returned by the search server status command:
FFQC5314I Result: <?xml version="1.0"?>
<Monitor Type="Search" Count="1">
<SearchStatus Name="Search Manager (node1)" SearchID=
"searchmanager.node1" HostName="myComputer.svl.ibm.com">
<Status>1</Status>
</SearchStatus>
</Monitor>
The following table describes the XML elements for information that is returned by the search server status command:
Table 25. Elements for the search server status command
Element Description
SearchStatusName The name and ID of the search manager session that is monitoring and maintaining the search index for this collection.
HostName The host name of the server where the search index is running.
Status
  • 0 if the search index for this collection is not running.
  • 1 if the search index for this collection is running.
Detailed search server status is also available. The command that returns detailed search server status can return the following information:
FFQC5303I Search Manager (node1) (sid: searchmanager.node1) 
is already running. PID: 15711
FFQC5314I Result: PID=18390
CacheHits=3
QueryRate=1
Port=44008
SessionId=col1.runtime.node1
CacheHitRate=0.333
ResponseTime=70
Status=1
SessionName=col1.runtime.node1.1
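
The detailed status is returned as key=value lines after the FFQC5303I and FFQC5314I messages. A minimal sketch that turns captured lines into a dictionary (the sample text is copied from the output above):
sample = """PID=18390
CacheHits=3
QueryRate=1
Port=44008
SessionId=col1.runtime.node1
CacheHitRate=0.333
ResponseTime=70
Status=1
SessionName=col1.runtime.node1.1"""

status = dict(line.split("=", 1) for line in sample.splitlines() if "=" in line)
print("Search index running:", status["Status"] == "1")
print("Average response time (ms):", status["ResponseTime"])
print("Cache hit rate:", status["CacheHitRate"])
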
The following table describes the items in the information that is returned from the detailed search server status command:
Table 26. Items for the detailed search server status command
Item Description
CacheHits The number of results retrieved from the search cache.
QueryRate The number of queries received in the last time interval. By default, the time interval is five minutes.
Port The port number that is used by the search index to listen or receive queries.
SessionId The session ID for this collection's search index.
CacheHitRate The number of results retrieved from the search cache as a percentage of all search results.
ResponseTime The average response time in milliseconds for the specified time interval. (The default is five minutes.)
Status
  • 0 if the search index for this collection is not running.
  • 1 if the search index for this collection is running.
SessionName The session name for this collection's search index.

Return codes for esadmin commands

The following codes can be returned for esadmin commands:
Table 27. Return codes for esadmin commands
Code Name Description
0 CODE_ERROR_NONE The command completed successfully.
102 CODE_ERROR_INSTANTIATION_EXCEPTION An error occurred when instantiating a command handler.
103 CODE_ERROR_ACCESS_EXCEPTION An illegal access error occurred when instantiating a command handler.
104 CODE_ERROR_EXECUTE_EXCEPTION An error occurred when the command handler ran the requested action.
105 CODE_ERROR_THROWABLE An unexpected error occurred while the command ran.
106 CODE_ERROR_NO_SUCH_METHOD The requested action does not exist for the command handler.
107 CODE_ERROR_INVALID_SESSION The specified session ID is not valid.
108 CODE_ERROR_INVALID_PARAMETER A specified parameter is not valid.
109 CODE_ERROR_SESSION_NOT_RUNNING The specified session is not running.
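
For logging, a script can map the numeric codes back to their names. The following sketch assumes that esadmin surfaces the documented code as its process exit status; if the code is only reported in the FFQC5314I Result message, parse the command output instead.
import subprocess

# Code-to-name mapping taken from Table 27.
RETURN_CODES = {
    0: "CODE_ERROR_NONE",
    102: "CODE_ERROR_INSTANTIATION_EXCEPTION",
    103: "CODE_ERROR_ACCESS_EXCEPTION",
    104: "CODE_ERROR_EXECUTE_EXCEPTION",
    105: "CODE_ERROR_THROWABLE",
    106: "CODE_ERROR_NO_SUCH_METHOD",
    107: "CODE_ERROR_INVALID_SESSION",
    108: "CODE_ERROR_INVALID_PARAMETER",
    109: "CODE_ERROR_SESSION_NOT_RUNNING",
}

result = subprocess.run(["esadmin", "system", "checkall"], capture_output=True, text=True)
name = RETURN_CODES.get(result.returncode, "unknown code")
print(f"esadmin exited with {result.returncode} ({name})")
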

Obtaining session IDs

Use the esadmin check command to show a list of components and their corresponding session IDs. The following table shows a list of common sessions, their IDs, the server that they are on, and the state of the session.
Table 28. Examples of session names, origin servers, session IDs, and session states
Session Server where the session is running Session ID Session state
configmanager master server 10433 Started
controller master server 10464 Started
customcommunication master server Not applicable Not applicable
discovery master server 10649 Started
monitor master server 10682 Started
parserservice master server 10718 Started
resource.node1 master server 10759 Started
samplecpp master server 10827 Started
sampletest master server 10857 Started
scheduler master server 10889 Started
searchmanager.node1 master server 10927 Started
utilities.node1 master server 10384 Started
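
A small sketch that runs esadmin check and prints the lines that mention a given session name; it assumes that the command writes a plain-text listing similar to Table 28, so adjust the filtering to the actual output on your system:
import subprocess

def find_session(name):
    # List all components and keep only the lines that mention the session name.
    output = subprocess.run(["esadmin", "check"], capture_output=True, text=True).stdout
    return [line for line in output.splitlines() if name in line]

for line in find_session("searchmanager.node1"):
    print(line)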