IBM Support

Validating your Kerberos configuration for CDC replication

How To


You can validate your Kerberos configuration outside of CDC replication before you attempt to set up replication. Taking this approach can reduce the overall complexity of the task.


Step 1: Validate keytab, principal name, and Kerberos client configuration by using the command line

This step validates the keytab file, principal name, and Kerberos client configuration that you received from your Kerberos administrator.

First, print the contents of a keytab file. This command lists all the keys stored in a keytab file, along with encryption algorithms and corresponding principals. Verify that the principal name that you received is on the list.

klist -k -e -K -t FILE:/path/to/keytab

Next, manually request a Kerberos ticket. The command prints debug information on standard output. 

KRB5_TRACE=/dev/stdout kinit -V -k -t /path/to/keytab principalName

Check whether the ticket was successfully acquired by consulting the contents of the credentials cache.


Once the test is done, destroy the tickets in the credentials cache.
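For example, klist (with no arguments) displays the contents of the default credentials cache, and kdestroy removes the tickets. These commands require the Kerberos client tools and a reachable KDC, so run them on the same host where you ran kinit:

```shell
# Display the tickets in the default credentials cache;
# the ticket for the principal you used with kinit should be listed.
klist

# After testing, destroy the tickets in the credentials cache.
kdestroy
```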


Step 2: Validate IDR CDC for Kafka configuration

Once you have determined that the keytab file, principal name, and Kerberos client configuration are correct, move on to testing your IDR CDC for Kafka configuration.

You will be validating:

  • JAAS file.
  • “” file.
  • “” file.

For instructions on how to populate the files, refer to the article How to install and configure the CDC replication engine for Apache Kafka®.

You can find the relevant information in the “Specify Kafka producer properties” and “Create a JAAS file” sections.
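For reference, a minimal JAAS file for keytab-based Kerberos authentication commonly looks like the following sketch. The section name, keytab path, and principal below are placeholders for illustration, not values from this article; use the values supplied by your Kerberos administrator:

```
// Hypothetical example: the section name, keytab path, and principal are placeholders.
KafkaClient {
    com.sun.security.auth.module.Krb5LoginModule required
    useKeyTab=true
    storeKey=true
    keyTab="/path/to/keytab"
    principal="principalName";
};
```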

To conduct the test, first log in to the Kafka server and issue these commands to produce to a Kafka topic.

export JAVA_HOME=/cdc/install/dir/jre64/jre

echo 16830912 | /kafka/install/dir/bin/kafka-avro-console-producer --broker-list brokerHostname:brokerPort --topic topicName --producer.config /path/to/ --property schema.registry.url=http://schemaHost:schemaPort --property value.schema='{"type":"int"}'

Next, consume the data that you just produced. If the configuration is correct, the consumer prints the value 16830912.

/kafka/install/dir/bin/kafka-avro-console-consumer --bootstrap-server brokerHostname:brokerPort --topic topicName --consumer.config /path/to/ --property schema.registry.url=http://schemaHost:schemaPort --from-beginning

If your Kafka environment does not use a schema registry, issue these commands.

export JAVA_HOME=/cdc/install/dir/jre64/jre
export KAFKA_OPTS=""
echo 16830912 | /kafka/install/dir/bin/kafka-console-producer --broker-list brokerHostname:brokerPort --topic topicName --producer.config /path/to/

/kafka/install/dir/bin/kafka-console-consumer --bootstrap-server brokerHostname:brokerPort --topic topicName --consumer.config /path/to/ --from-beginning

If you need Kerberos debug information, set the SCHEMA_REGISTRY_OPTS or KAFKA_OPTS environment variables.


export KAFKA_OPTS=""
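As an illustration (this property value is an assumption, not taken from this article), the JDK's Kerberos debug tracing is typically enabled with the sun.security.krb5.debug system property, for example:

```shell
# Hypothetical example: enables JDK Kerberos debug tracing for the Kafka console tools.
export KAFKA_OPTS="-Dsun.security.krb5.debug=true"
```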


Product Synonym

IDR;IBM Data Replication;IIDR;IBM InfoSphere Data Replication;CDC;Change Data Capture

Document Information

Modified date:
19 November 2020