Kafka Monthly Digest: September 2019
In this 20th edition of the Kafka Monthly Digest, I will cover what happened in the Kafka Community in September.
For last month’s digest, see “Kafka Monthly Digest: August 2019.”
The third Kafka Summit of the year took place on September 30 and October 1, 2019, at the Hilton Union Square in San Francisco. Both days were packed with sessions split across four tracks: Core Kafka, Event-Driven Development, Stream Processing, and Use Cases.
The keynotes are already available on the Summit website and the videos of the sessions should be there in the next couple of weeks.
The first class of Community Catalysts was also announced. See the nomination page to find the list of nominees or to nominate someone.
There are currently two bugfix releases and one minor release in progress.
- 2.2.2: Randall Hauch started the process for this bugfix release on September 12, 2019. Since 2.2.1, 25 issues have been identified, including three blocker JIRAs. For more details, see the release plan on the wiki.
- 2.3.1: On September 5, 2019, David Arthur volunteered to run this bugfix release. It contains over 30 fixes, including four blocker issues. You can find the release plan on the wiki.
- 2.4.0: The process for 2.4.0 continued, with KIP freeze on September 25, 2019. The release date is still expected for the end of October. As always, the release plan on the wiki contains all the details.
Last month, the community submitted 13 KIPs (KIP-516 to KIP-528) and these are the ones that caught my eye:
KIP-516: Topic Identifiers: At the moment, topics are solely identified by their names. This works well most of the time but often causes issues when a topic needs to be deleted and re-created. Topic deletion is asynchronous and only completes when all replicas have been deleted. For example, when deleting a topic that has a replica on a broker that is offline for maintenance, a new topic with the same name cannot be created until that broker comes back. This KIP proposes adding unique identifiers to topics to differentiate them and allow a topic to be re-created immediately after its deletion.
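To make the idea concrete, here is a minimal sketch of identifying topics by a unique id rather than by name alone. The registry and method names are illustrative stand-ins, not Kafka's actual implementation: the point is that two incarnations of a topic with the same name remain distinguishable.

```java
import java.util.HashMap;
import java.util.Map;
import java.util.UUID;

public class TopicIdSketch {
    // Hypothetical registry keyed by topic id rather than by name.
    static final Map<UUID, String> topicsById = new HashMap<>();

    static UUID createTopic(String name) {
        UUID id = UUID.randomUUID(); // each incarnation gets a fresh id
        topicsById.put(id, name);
        return id;
    }

    public static void main(String[] args) {
        UUID first = createTopic("orders");
        // Delete and re-create the topic under the same name:
        topicsById.remove(first);
        UUID second = createTopic("orders");
        // Same name, but the two incarnations are distinguishable by id.
        System.out.println(first.equals(second)); // false
    }
}
```

With ids like these, a broker holding a stale replica of the old incarnation can tell it apart from the newly created topic, even though the names match.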
KIP-517: Add consumer metrics to observe user poll behavior: Kafka Consumers have to keep calling poll() regularly in order to stay active. Failing to do so triggers a group rebalance, which can cause disruptions. When doing long processing in the Consumer, it can be hard to ensure poll() is called often enough. This KIP aims at simplifying this by providing metrics for poll(), such as the delay (max/average/current) between invocations and the fraction of time spent between calls to poll().
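The bookkeeping behind such metrics can be sketched in a few lines. The class below is illustrative only (not Kafka's API): it records the max and average delay between successive poll() calls, with the timestamp injected so the logic is easy to test.

```java
// Illustrative sketch of tracking intervals between poll() invocations,
// the kind of measurement KIP-517 proposes to expose as consumer metrics.
public class PollIntervalTracker {
    private long lastPollMs = -1;
    private long maxDelayMs = 0;
    private long totalDelayMs = 0;
    private long intervals = 0;

    // Call at the start of each poll(); nowMs is injected for testability.
    public void recordPoll(long nowMs) {
        if (lastPollMs >= 0) {
            long delay = nowMs - lastPollMs;
            maxDelayMs = Math.max(maxDelayMs, delay);
            totalDelayMs += delay;
            intervals++;
        }
        lastPollMs = nowMs;
    }

    public long maxDelayMs() {
        return maxDelayMs;
    }

    public double avgDelayMs() {
        return intervals == 0 ? 0.0 : (double) totalDelayMs / intervals;
    }
}
```

A consumer whose max delay creeps toward the configured max.poll.interval.ms would know it is at risk of being kicked out of the group, which is exactly the visibility the KIP is after.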
KIP-524: Allow users to choose config source when describing configs: The kafka-configs.sh tool allows you to list and edit configurations for many Kafka entities, such as topics, brokers, and users. However, until now, when listing configurations for a broker, it would only print dynamically set configurations. This KIP adds a new argument, --all, to list all broker configurations.
KIP-525 - Return topic metadata and configs in CreateTopics response: Currently, when creating a topic, the response only contains whether the operation succeeded (and, if it failed, the error). This KIP proposes including the topic metadata when creation is successful. This will allow tools to retrieve this information directly upon creation, without having to make another call to fetch the topic configurations.
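The shape of the change can be sketched as follows. The types and values here are hypothetical stand-ins, not the real Kafka protocol classes: the point is that the creation response carries the topic's effective configs, so no follow-up describe call is needed.

```java
import java.util.HashMap;
import java.util.Map;

// Illustrative sketch of a CreateTopics response that, per KIP-525,
// returns the topic's effective configs along with the success flag.
public class CreateTopicsSketch {
    // Hypothetical response type, not Kafka's actual API.
    record CreateTopicResponse(boolean success, Map<String, String> configs) {}

    static CreateTopicResponse createTopic(String name, Map<String, String> overrides) {
        // A broker would merge cluster defaults with the requested overrides
        // and return the effective configs in the response directly.
        Map<String, String> effective = new HashMap<>();
        effective.put("retention.ms", "604800000"); // stand-in default
        effective.putAll(overrides);
        return new CreateTopicResponse(true, effective);
    }
}
```

A tool creating a topic would then read the effective configuration straight out of the response, instead of issuing a second request to fetch it.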
Here are some interesting blog articles published last month:
- Building a Relational Database using Kafka
- How to Count Events in Kafka Streams
- Why Do You Need Apache Kafka for Your Cloud Migration
- Building Audit Logs with Change Data Capture and Stream Processing
IBM Event Streams for Cloud is Apache Kafka-as-a-Service for IBM Cloud.