A hands-on guide to teach you how to interact with Kafka using the Rust programming language

This is a two-part series to help you get started with Rust and Kafka. We will be using the rust-rdkafka crate which itself is based on librdkafka (C library).

In this post we will cover the Kafka Consumer API.

Here is the first part:

Initial setup

Make sure you have a Kafka broker installed (a local setup should suffice). You will also need Rust installed, version 1.45 or above.

Before you begin, clone the GitHub repo:

git clone https://github.com/abhirockzz/rust-kafka-101
cd part2

Simple Consumer

Creating a low-level consumer (BaseConsumer) is strikingly similar to how you’d create…
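For a quick taste, here is a minimal BaseConsumer sketch. Note that the broker address, topic name, and group ID below are placeholders for illustration, assuming a local broker at localhost:9092:

use rdkafka::config::ClientConfig;
use rdkafka::consumer::{BaseConsumer, Consumer};
use rdkafka::message::Message;

fn main() {
    // placeholder broker address and group ID
    let consumer: BaseConsumer = ClientConfig::new()
        .set("bootstrap.servers", "localhost:9092")
        .set("group.id", "rust-consumer-group")
        .set("auto.offset.reset", "earliest")
        .create()
        .expect("consumer creation failed");

    // placeholder topic name
    consumer
        .subscribe(&["rust-topic"])
        .expect("topic subscription failed");

    // the iterator polls the broker, blocking until messages arrive
    for message in consumer.iter() {
        match message {
            Ok(m) => {
                let payload = String::from_utf8_lossy(m.payload().unwrap_or(&[]));
                println!("received '{}' at offset {}", payload, m.offset());
            }
            Err(e) => eprintln!("kafka error: {}", e),
        }
    }
}

The iter() loop is just a convenience wrapper over poll(); call poll() directly if you need explicit control over poll timeouts.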


A hands-on guide to teach you how to interact with Kafka using the Rust programming language

This is a two-part series to help you get started with Rust and Kafka. We will be using the rust-rdkafka crate which itself is based on librdkafka (C library).

In this post we will cover the Kafka Producer API.

Update: Part 2 is available here

Initial setup

Make sure you have a Kafka broker installed (a local setup should suffice). You will also need Rust installed, version 1.45 or above.

Before you begin, clone the GitHub repo:

git clone https://github.com/abhirockzz/rust-kafka-101
cd part1

Check the Cargo.toml file:

...
[dependencies]
rdkafka = { version = "0.25", features…
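To give a flavor of what's ahead, here is a minimal sketch using the low-level BaseProducer with the rdkafka version pinned above. The broker address, topic, key, and payload are placeholders for illustration:

use rdkafka::config::ClientConfig;
use rdkafka::producer::{BaseProducer, BaseRecord, Producer};
use std::time::Duration;

fn main() {
    // placeholder broker address
    let producer: BaseProducer = ClientConfig::new()
        .set("bootstrap.servers", "localhost:9092")
        .create()
        .expect("producer creation failed");

    // send() only enqueues the record into an internal buffer;
    // delivery to the broker happens asynchronously
    if let Err((e, _)) = producer.send(
        BaseRecord::to("rust-topic")
            .key("user-1")
            .payload("hello from rust"),
    ) {
        eprintln!("failed to enqueue message: {}", e);
    }

    // block until buffered messages are delivered (or the timeout expires)
    producer.flush(Duration::from_secs(5));
}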


A practical example: Ingest tweets in real-time and query them flexibly using Redis

Redis has a versatile set of data structures ranging from simple Strings all the way to powerful abstractions such as Redis Streams. The native data types can take you a long way, but there are certain use cases that may require a workaround. One example is the need for secondary indexes in Redis, in order to go beyond key-based search/lookup for richer query capabilities. Though you can use Sorted Sets, Lists, and so on to get the job done, you’ll need to factor in some trade-offs.

Enter RediSearch! Available as a Redis module, RediSearch provides flexible search capabilities…
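As a rough sketch of the idea, the snippet below uses the redis-rs client to issue the RediSearch commands FT.CREATE and FT.SEARCH as raw commands. It assumes a local Redis instance with the RediSearch module loaded; the index name, key prefix, and field names are made up for illustration:

fn main() -> redis::RedisResult<()> {
    // assumes Redis with the RediSearch module at 127.0.0.1:6379
    let client = redis::Client::open("redis://127.0.0.1:6379/")?;
    let mut con = client.get_connection()?;

    // create an index over hashes stored under the (hypothetical) "tweet:" prefix
    let _: () = redis::cmd("FT.CREATE")
        .arg("tweets-index")
        .arg(&["ON", "HASH", "PREFIX", "1", "tweet:"])
        .arg(&["SCHEMA", "user", "TEXT", "body", "TEXT"])
        .query(&mut con)?;

    // any hash stored under the prefix gets indexed automatically
    let _: () = redis::cmd("HSET")
        .arg("tweet:1")
        .arg(&["user", "jane", "body", "rust and redis"])
        .query(&mut con)?;

    // full-text search across the indexed fields
    let results: redis::Value = redis::cmd("FT.SEARCH")
        .arg("tweets-index")
        .arg("rust")
        .query(&mut con)?;
    println!("{:?}", results);

    Ok(())
}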


Originally published at https://www.confluent.io on March 25, 2021.

Self-managing a distributed system like Apache Kafka®, along with building and operating Kafka connectors, is complex and resource intensive. It requires significant Kafka skills and expertise in the development and operations teams of your organization. Additionally, the higher the volumes of real-time data that you work with, the more challenging it becomes to ensure that all of the infrastructure scales efficiently and runs reliably.

Confluent and Microsoft are working together to make the process of adopting event streaming easier than ever by alleviating the typical infrastructure management needs that often pull…


Use KEDA to scale up/down Celery worker containers

This blog post demonstrates how to auto-scale your Redis-based applications on Kubernetes. Redis is a widely used (and loved!) database that supports a rich set of data structures (Strings, Hashes, Streams, Geospatial), as well as other features such as pub/sub messaging and clustering (HA). One such data structure is the List, which supports inserts (LPUSH, RPUSH, LINSERT, etc.), reads (LRANGE), and deletes (LREM, LPOP, etc.). But that's not all!

Redis Lists are quite versatile and used as the backbone for implementing scalable architectural patterns such as consumer-producer (based on queues), where producer applications push items into…
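Here is a minimal sketch of that consumer-producer pattern over a Redis List using the redis-rs client; the key and item names are placeholders, and a local Redis instance is assumed:

use redis::Commands;

fn main() -> redis::RedisResult<()> {
    let client = redis::Client::open("redis://127.0.0.1:6379/")?;
    let mut con = client.get_connection()?;

    // producer side: RPUSH enqueues items at the tail of the list
    let _: i64 = con.rpush("jobs", "job-1")?;
    let _: i64 = con.rpush("jobs", "job-2")?;

    // consumer side: LPOP dequeues from the head (FIFO) and returns nil when empty
    loop {
        let job: Option<String> = redis::cmd("LPOP").arg("jobs").query(&mut con)?;
        match job {
            Some(j) => println!("processing {}", j),
            None => break,
        }
    }
    Ok(())
}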


How do you pick the right data model in Azure Cosmos DB? SQL, MongoDB, Cassandra, Graph or Table?


Azure Cosmos DB is a fully managed, elastically scalable, and globally distributed database with a multi-model approach: it lets you store and access data as documents, key-value pairs, wide-column tables, or graphs.

In this blog, we will dive into the multi-model capabilities and explore the options that are available to store and access data. Hopefully, it can help you make an informed decision on which API is the right choice.

  • Core (SQL) API: Flexibility of a NoSQL document store combined with the power of SQL for querying.
  • MongoDB API: Supports the MongoDB wire protocol so that existing MongoDB clients continue…


Using Redis Data Source for Grafana

Recently, I discovered a nice way of plugging in monitoring for Redis using Grafana, thanks to this great Data Source plugin that works with any Redis database, including Azure Cache for Redis!

It’s really easy to set up and try:

Set up an Azure Cache for Redis instance

Start Grafana in Docker:

docker run -d -p 3000:3000 --name=grafana -e "GF_INSTALL_PLUGINS=redis-datasource" grafana/grafana

Access the Grafana dashboard: browse to http://localhost:3000/

Enter admin as the username and password

Add the Data Source

Choose Redis


A hands-on tutorial on how to use Redis with the Rust programming language

Are you learning Rust and looking for ways to get hands-on practice with concrete examples? A good approach might be to try and integrate Rust with external systems. Why not try integrating it with Redis, a powerful and versatile database that is dead simple to get started with?


In this blog post, you will learn how to use the Rust programming language to interact with Redis using the redis-rs client. We will walk through commonly used Redis data structures such as String, Hash, List etc. …
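For a taste of what the post walks through, here is a minimal sketch using redis-rs against a local Redis instance; the key names are made up for illustration:

use redis::Commands;

fn main() -> redis::RedisResult<()> {
    let client = redis::Client::open("redis://127.0.0.1:6379/")?;
    let mut con = client.get_connection()?;

    // String: SET then GET
    let _: () = con.set("greeting", "hello world")?;
    let greeting: String = con.get("greeting")?;
    println!("greeting = {}", greeting);

    // Hash: set multiple fields, read one back
    let _: () = con.hset_multiple("user:1", &[("name", "jane"), ("lang", "rust")])?;
    let name: String = con.hget("user:1", "name")?;
    println!("name = {}", name);

    // List: push values, then read the whole range
    let _: i64 = con.rpush("langs", "rust")?;
    let _: i64 = con.rpush("langs", "go")?;
    let langs: Vec<String> = con.lrange("langs", 0, -1)?;
    println!("langs = {:?}", langs);

    Ok(())
}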


A tutorial on how to use Kafka Connect for real-time data synchronisation


Apache Kafka often serves as a central component in the overall data architecture, with other systems pumping data into it. But data in Kafka (topics) is only useful when consumed by other applications or ingested into other systems. Although it is possible to build a solution using the Kafka Producer/Consumer APIs with a language and client SDK of your choice, there are other options in the Kafka ecosystem.

One of them is Kafka Connect, which is a platform to stream data between Apache Kafka and other systems in a scalable and reliable manner. …


This tutorial is based on a practical scenario along with a re-usable Docker Compose setup to help with iterative development and experimentation

Azure Cosmos DB Cassandra API is a fully managed cloud service that is compatible with Cassandra Query Language (CQL) v3.11 API. It has no operational overhead and you can benefit from all the underlying Azure Cosmos DB capabilities such as global distribution, automatic scale-out partitioning, availability and latency guarantees, encryption at rest, backups, etc.

At the time of writing this blog, the Azure Cosmos DB Cassandra API serverless offering is available in preview mode!

Your existing Cassandra applications can work with the Azure Cosmos DB Cassandra API since it works with CQLv4-compliant drivers (see examples for Java, .Net…

Abhishek Gupta

Currently working with Kafka, Databases, Azure, Kubernetes and related open source projects | Confluent Community Catalyst (for Kafka)
