How do you pick the right data model in Azure Cosmos DB? SQL, MongoDB, Cassandra, Graph or Table?

Photo by Guillermo Ferla on Unsplash

Azure Cosmos DB is a fully managed, elastically scalable, and globally distributed database with a multi-model approach, giving you the ability to work with document, key-value, wide-column, or graph-based data.

In this blog, we will dive into the multi-model capabilities and explore the options that are available to store and access data. Hopefully, it can help you make an informed decision on which API is the right choice.

  • Core (SQL) API: Flexibility of a NoSQL document store combined with the power of SQL for querying.

Using the Redis Data Source for Grafana

Recently, I discovered a nice way of plugging in monitoring for Redis using Grafana, thanks to this great Data Source plugin that works with any Redis database, including Azure Cache for Redis!

It’s really easy to set up and try:

  • Set up an Azure Cache for Redis instance

  • Start Grafana in Docker:

docker run -d -p 3000:3000 --name=grafana -e "GF_INSTALL_PLUGINS=redis-datasource" grafana/grafana

  • Access the Grafana dashboard: browse to http://localhost:3000/

  • Enter admin as the username and password

  • Add the Data Source


Choose Redis

A hands-on tutorial on how to use Redis with the Rust programming language

Are you learning Rust and looking for ways to get hands-on practice with concrete examples? A good approach might be to integrate Rust with external systems. Why not try integrating it with Redis, which is a powerful, versatile database that is dead simple to get started with!

Photo by Danial RiCaRoS on Unsplash

In this blog post, you will learn how to use the Rust programming language to interact with Redis using the redis-rs client. We will walk through commonly used Redis data structures such as String, Hash, List etc. …

A tutorial on how to use Kafka Connect for real-time data synchronisation

Photo by Quinten de Graaf on Unsplash

Apache Kafka often serves as a central component in the overall data architecture, with other systems pumping data into it. But data in Kafka (topics) is only useful when consumed by other applications or ingested into other systems. Although it is possible to build a solution with the Kafka Producer/Consumer APIs, using a language and client SDK of your choice, there are other options in the Kafka ecosystem.

One of them is Kafka Connect, which is a platform to stream data between Apache Kafka and other systems in a scalable and reliable manner. …

This tutorial is based on a practical scenario, along with a reusable Docker Compose setup to help with iterative development and experimentation.

Azure Cosmos DB Cassandra API is a fully managed cloud service that is compatible with Cassandra Query Language (CQL) v3.11 API. It has no operational overhead, and you can benefit from all the underlying Azure Cosmos DB capabilities such as global distribution, automatic scale-out partitioning, availability and latency guarantees, encryption at rest, backups etc.

At the time of writing this blog, the Azure Cosmos DB Cassandra API serverless is available in preview mode!

Your existing Cassandra applications can work with the Azure Cosmos DB Cassandra API since it works with CQLv4-compliant drivers (see examples for Java, .Net…

Learn how to implement a Slack app as a Serverless backend

Webhook backends are a popular use case for FaaS (Functions-as-a-Service) platforms. They can power many use cases, from sending customer notifications to responding with funny GIFs! Using a Serverless function, it’s quite convenient to encapsulate the webhook functionality and expose it in the form of an HTTP endpoint. In this tutorial you will learn how to implement a Slack app as a Serverless backend using Azure Functions and Go. …

“In God we trust, all others must bring data”

William Edwards Deming

Well, Microsoft is bringing you Data Week 🙌 A celebration of Data & Data Technologies, running throughout the week, starting December 7, 2020!

Create: Data

It kicks off with Create: Data, a completely FREE online event.

Register now!

Learn how to use Change Data Capture to stream database modifications from PostgreSQL to Azure Data Explorer (Kusto) using the Kafka Connect platform.


Change Data Capture (CDC) can be used to track row-level changes in database tables in response to create, update and delete operations. It is a powerful technique, but useful only when there is a way to leverage these events and make them available to other services.

Using Apache Kafka, it is possible to convert traditional batched ETL processes into real-time, streaming mode. You can do it yourself (DIY) and write a good old Kafka producer/consumer using a client SDK of your choice. But why would you do that when you have Kafka Connect and its suite of ready-to-use connectors?

Once you opt for Kafka…
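To make “ready-to-use” concrete: with a connector such as the Debezium PostgreSQL source, capturing row-level changes is a matter of submitting a small JSON configuration rather than writing producer code. The sketch below uses placeholder hostnames, credentials, and table names, and property names can vary across Debezium versions:

```json
{
  "name": "pg-cdc-source",
  "config": {
    "connector.class": "io.debezium.connector.postgresql.PostgresConnector",
    "database.hostname": "localhost",
    "database.port": "5432",
    "database.user": "postgres",
    "database.password": "password",
    "database.dbname": "mydb",
    "database.server.name": "myserver",
    "table.include.list": "public.orders"
  }
}
```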

In this blog, we will go over how to ingest data into Azure Data Explorer using the open source Kafka Connect sink connector for Azure Data Explorer, running on Kubernetes with Strimzi. Kafka Connect is a tool for scalably and reliably streaming data between Apache Kafka and other systems using source and sink connectors. Strimzi provides a “Kubernetes-native” way of running Kafka clusters as well as Kafka Connect workers.

Azure Data Explorer is a fast and scalable data exploration service that lets you collect, store, and analyze large volumes of data from diverse sources, such as websites, applications…
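In the “Kubernetes-native” approach mentioned above, a Kafka Connect cluster under Strimzi is declared as a Kubernetes custom resource. The sketch below is a minimal example; the apiVersion, names, and topic settings are assumptions that depend on your Strimzi version and setup:

```yaml
apiVersion: kafka.strimzi.io/v1beta2
kind: KafkaConnect
metadata:
  name: my-connect-cluster
  annotations:
    # let Strimzi manage connectors via KafkaConnector resources
    strimzi.io/use-connector-resources: "true"
spec:
  replicas: 1
  bootstrapServers: my-cluster-kafka-bootstrap:9092
  config:
    group.id: connect-cluster
    offset.storage.topic: connect-offsets
    config.storage.topic: connect-configs
    status.storage.topic: connect-status
```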


Azure Cosmos DB is a resource-governed system that allows you to execute a certain number of operations per second, based on the provisioned throughput you have configured. If clients exceed that limit and consume more request units than what was provisioned, subsequent requests are rate limited and exceptions are thrown; these are also referred to as 429 errors.

With the help of a practical example, I’ll demonstrate how to incorporate fault-tolerance in your Go applications by handling and retrying operations affected by these rate limiting errors. …

Abhishek Gupta

Currently working with Kafka, Databases, Azure, Kubernetes and related open source projects | Confluent Community Catalyst (for Kafka)
