Troubleshooting
Problem
When you configure an IBM Content Analytics DB2 crawler to crawl a remote server, some databases might not be discovered if more than one DB2 copy (DB2COPY) is installed on the same DB2 Enterprise Server Edition Version 9.1, 9.5, or 9.7 server.
Resolving The Problem
Use one of the following methods to create a crawler for crawling DB2 databases in an environment with multiple DB2 copies:
- Use the JDBC database crawler instead of the DB2 crawler and specify the database URL (DBURL) for the database that is not discovered (see the example URL after the steps below).
- Catalog a database that is not discovered on the crawler server so that the database can be accessed as if it were a local database.
For example, assume that the SAMPLE1 database on DB2COPY1 at example.ibm.com is not discovered by the DB2 crawler, and that DB2COPY1 uses port 50001 for remote access.
- Open a DB2 command line processor (CLP) session on the crawler server.
- Enter the following command to catalog the DB2COPY1 server:
CATALOG TCPIP NODE example REMOTE example.ibm.com SERVER 50001
- Enter the following command to catalog the SAMPLE1 database:
CATALOG DATABASE SAMPLE1 AT NODE example
- Log in to the administration console and create a DB2 crawler. Select Local or cataloged databases on the DB2 Database Type page, and then discover the database.
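Optionally, before you create the crawler, you can verify the catalog entries and the connection from the same DB2 command line processor session. The user ID and password below are placeholders for credentials that have access to the SAMPLE1 database:
LIST NODE DIRECTORY
LIST DATABASE DIRECTORY
CONNECT TO SAMPLE1 USER db2user USING password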
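If you use the JDBC database crawler instead (the first method), the database URL for the same example would follow the standard DB2 JDBC URL format. Assuming DB2COPY1 listens on port 50001 as above, the URL might look like this:
jdbc:db2://example.ibm.com:50001/SAMPLE1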
Document Information
Product: Content Analytics with Enterprise Search
Software version: 3.0, 2.2, 2.1, 3.5
Operating system(s): AIX, Linux, Windows
Document number: 132063
Modified date: 23 June 2018
UID: swg21411524