What happened in the Kafka community in March 2019?

Releases:

After three release candidates, Matthias J. Sax released Apache Kafka 2.2.0 on March 26, 2019. This new minor version contains a number of interesting features:

  • The ability to re-authenticate SASL connections periodically (KIP-368)

  • Hardened inter-broker protocol (KIP-380)

  • The ability to separate controller traffic from the data plane (KIP-291)

  • Producer/AdminClient/Streams API improvements

  • Improved consumer group management (KIP-289)

  • Metrics without a value are now emitted as NaN instead of various values (KIP-386)

  • Updated kafka-topics.sh tool: it now uses the AdminClient API and no longer requires access to ZooKeeper (KIP-377); a short sketch of that API follows this list

  • All command line tools now accept the --help flag (KIP-374)
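The kafka-topics.sh change from KIP-377 builds on the same AdminClient API that is available to applications, which is why only a broker bootstrap address is needed. Below is a minimal sketch of creating and listing topics through the AdminClient; the topic name, partition count, replication factor and broker address are made-up values for illustration.

    import java.util.Collections;
    import java.util.Properties;

    import org.apache.kafka.clients.admin.AdminClient;
    import org.apache.kafka.clients.admin.AdminClientConfig;
    import org.apache.kafka.clients.admin.NewTopic;

    public class TopicAdminExample {
        public static void main(String[] args) throws Exception {
            Properties props = new Properties();
            // A broker address is enough, no ZooKeeper connection string required
            props.put(AdminClientConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092");

            try (AdminClient admin = AdminClient.create(props)) {
                // Roughly what kafka-topics.sh --create does since KIP-377
                NewTopic topic = new NewTopic("my-topic", 3, (short) 1);
                admin.createTopics(Collections.singleton(topic)).all().get();

                // Roughly what kafka-topics.sh --list does
                admin.listTopics().names().get().forEach(System.out::println);
            }
        }
    }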

KIPs:

Last month, the community submitted 10 KIPs (KIP-438 to KIP-447). These are the ones that caught my eye:

KIP-440: Extend Connect Converter to support headers
At the moment, Kafka Connect Converters don’t support message headers. In environments that rely heavily on headers, this makes it impossible to use Connect to import or export such messages. This KIP aims to fix this gap by allowing Converters to read and write headers.
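A rough sketch of what the header-aware overloads could look like, based on my reading of the proposal; the exact signatures may still change before the KIP is adopted. Default methods would keep existing Converter implementations compatible:

    import org.apache.kafka.common.header.Headers;
    import org.apache.kafka.connect.data.Schema;
    import org.apache.kafka.connect.data.SchemaAndValue;

    public interface Converter {
        // Existing methods (configure(...) omitted for brevity)
        byte[] fromConnectData(String topic, Schema schema, Object value);
        SchemaAndValue toConnectData(String topic, byte[] value);

        // Proposed header-aware overloads; converters that ignore headers
        // simply fall back to the existing methods
        default byte[] fromConnectData(String topic, Headers headers, Schema schema, Object value) {
            return fromConnectData(topic, schema, value);
        }

        default SchemaAndValue toConnectData(String topic, Headers headers, byte[] value) {
            return toConnectData(topic, value);
        }
    }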

KIP-443: Return to default segment.ms and segment.index.bytes in Streams repartition topics
In 2.0.0, the default configuration of Kafka Streams repartition topics was changed to better support high-throughput applications. However, it turns out these settings are too aggressive for low-throughput use cases. This KIP proposes removing the segment.ms and segment.index.bytes overrides and falling back to the broker default values.
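Applications that still want small segments on their repartition topics once the defaults are reverted can set the values explicitly: Streams applies any "topic."-prefixed setting to the internal topics it creates. A minimal sketch, with illustrative values and placeholder application id and broker address:

    import java.util.Properties;

    import org.apache.kafka.common.config.TopicConfig;
    import org.apache.kafka.streams.StreamsConfig;

    public class RepartitionTopicConfigExample {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "my-streams-app");    // placeholder
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

            // "topic." prefixed settings are applied to the internal topics the
            // application creates; the values below are only illustrative
            props.put(StreamsConfig.topicPrefix(TopicConfig.SEGMENT_MS_CONFIG), "600000");            // 10 minutes
            props.put(StreamsConfig.topicPrefix(TopicConfig.SEGMENT_INDEX_BYTES_CONFIG), "52428800"); // 50 MB

            System.out.println(props);
        }
    }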

KIP-444: Augment metrics for Kafka Streams
Operating Streams applications is currently relatively hard due to the lack of several key metrics. This KIP addresses the issue by doing a full review of all existing Streams metrics: the idea is to remove metrics that have turned out not to be useful and to provide new ones that allow better monitoring and easier debugging.
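In the meantime, the metrics a Streams instance already exposes can be inspected programmatically, which gives an idea of what the KIP is reviewing. A minimal sketch; the application id, broker address and topic names are placeholders:

    import java.util.Map;
    import java.util.Properties;

    import org.apache.kafka.common.Metric;
    import org.apache.kafka.common.MetricName;
    import org.apache.kafka.streams.KafkaStreams;
    import org.apache.kafka.streams.StreamsBuilder;
    import org.apache.kafka.streams.StreamsConfig;

    public class StreamsMetricsDump {
        public static void main(String[] args) {
            Properties props = new Properties();
            props.put(StreamsConfig.APPLICATION_ID_CONFIG, "metrics-demo");      // placeholder
            props.put(StreamsConfig.BOOTSTRAP_SERVERS_CONFIG, "localhost:9092"); // placeholder

            StreamsBuilder builder = new StreamsBuilder();
            builder.stream("input-topic").to("output-topic");                    // trivial topology

            KafkaStreams streams = new KafkaStreams(builder.build(), props);
            streams.start();

            // Dump every metric currently registered by this Streams instance
            for (Map.Entry<MetricName, ? extends Metric> entry : streams.metrics().entrySet()) {
                MetricName name = entry.getKey();
                System.out.println(name.group() + "/" + name.name() + " = " + entry.getValue().metricValue());
            }

            streams.close();
        }
    }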

Blogs:

IBM Event Streams for Cloud is Apache Kafka-as-a-service for IBM Cloud.

Get started with IBM Event Streams
