A hands-on tutorial on how to use Redis with the Rust programming language

Are you learning Rust and looking for ways to get hands-on practice with concrete examples? A good approach is to integrate Rust with external systems. Why not try Redis, a powerful, versatile database that is dead simple to get started with?

Photo by Danial RiCaRoS on Unsplash

In this blog post, you will learn how to use the Rust programming language to interact with Redis using the redis-rs client. We will walk through commonly used Redis data structures such as String, Hash, List, etc. …
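The post itself uses Rust with the redis-rs client; purely to preview the String, Hash, and List operations it covers, here is a hedged sketch using Go's go-redis client, so that all examples in this digest stay in one language. The server address and key names are placeholders.

```go
package main

import (
	"context"
	"fmt"

	"github.com/redis/go-redis/v9"
)

// A minimal sketch of the String, Hash, and List operations discussed in
// the post. Note: the post uses Rust's redis-rs; go-redis is used here
// only for illustration, and assumes a Redis server at localhost:6379.
func main() {
	ctx := context.Background()
	rdb := redis.NewClient(&redis.Options{Addr: "localhost:6379"})

	// String: SET and GET
	if err := rdb.Set(ctx, "greeting", "hello", 0).Err(); err != nil {
		panic(err)
	}
	val, _ := rdb.Get(ctx, "greeting").Result()
	fmt.Println("String:", val)

	// Hash: HSET and HGETALL
	rdb.HSet(ctx, "user:1", "name", "Jane", "city", "Oslo")
	user, _ := rdb.HGetAll(ctx, "user:1").Result()
	fmt.Println("Hash:", user)

	// List: LPUSH and LRANGE
	rdb.LPush(ctx, "jobs", "job-1", "job-2")
	jobs, _ := rdb.LRange(ctx, "jobs", 0, -1).Result()
	fmt.Println("List:", jobs)
}
```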


A tutorial on how to use Kafka Connect for real-time data synchronisation

Photo by Quinten de Graaf on Unsplash

Apache Kafka often serves as a central component in the overall data architecture, with other systems pumping data into it. But data in Kafka (topics) is only useful when consumed by other applications or ingested into other systems. Although it is possible to build a solution with the Kafka Producer/Consumer APIs in a language and client SDK of your choice, there are other options in the Kafka ecosystem.

One of them is Kafka Connect, which is a platform to stream data between Apache Kafka and other systems in a scalable and reliable manner. …
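To give a feel for what using Kafka Connect looks like in practice, here is a minimal Go sketch (not from the post) that registers a connector through the Connect worker's REST API. The connector class shown is Kafka's bundled FileStreamSourceConnector; the worker URL, connector name, file path, and topic are placeholders.

```go
package main

import (
	"bytes"
	"encoding/json"
	"fmt"
	"net/http"
)

// A sketch of registering a connector with the Kafka Connect REST API,
// which workers expose on port 8083 by default. All names are placeholders.
func main() {
	payload := map[string]interface{}{
		"name": "example-connector", // hypothetical connector name
		"config": map[string]string{
			"connector.class": "org.apache.kafka.connect.file.FileStreamSourceConnector",
			"file":            "/tmp/input.txt",
			"topic":           "example-topic",
		},
	}

	body, err := json.Marshal(payload)
	if err != nil {
		panic(err)
	}

	// POST /connectors creates a new connector from the given config
	resp, err := http.Post("http://localhost:8083/connectors",
		"application/json", bytes.NewReader(body))
	if err != nil {
		panic(err)
	}
	defer resp.Body.Close()
	fmt.Println("connector registration status:", resp.Status)
}
```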


This tutorial is based on a practical scenario, along with a reusable Docker Compose setup to help with iterative development and experimentation

Azure Cosmos DB Cassandra API is a fully managed cloud service that is compatible with Cassandra Query Language (CQL) v3.11 API. It has no operational overhead, and you can benefit from all the underlying Azure Cosmos DB capabilities such as global distribution, automatic scale-out partitioning, availability and latency guarantees, encryption at rest, backups, etc.

At the time of writing this blog, the Azure Cosmos DB Cassandra API serverless is available in preview mode!

Your existing Cassandra applications can work with the Azure Cosmos DB Cassandra API since it works with CQLv4 compliant drivers (see examples for Java, .NET Core, Node.js, Python, etc.). But you also need to think about integrating with other systems that hold existing data and bringing that data into Azure Cosmos DB. One such system is Apache Kafka, a distributed streaming platform. It is used in industries and organizations to solve a wide variety of problems, ranging from traditional asynchronous messaging and website activity tracking to log aggregation, real-time fraud detection, and much more! …
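As an illustration of the driver-compatibility point, here is a hedged Go sketch (not from the post) that connects to the Cosmos DB Cassandra API with the gocql driver. The account name and key are placeholders; port 10350 is the Cassandra API endpoint port, and Cosmos DB requires TLS.

```go
package main

import (
	"fmt"
	"log"

	"github.com/gocql/gocql"
)

// A minimal sketch of pointing an existing CQL application at the Azure
// Cosmos DB Cassandra API. <account> and <primary-key> are placeholders.
func main() {
	cluster := gocql.NewCluster("<account>.cassandra.cosmos.azure.com")
	cluster.Port = 10350 // Cosmos DB Cassandra API endpoint port
	cluster.ProtoVersion = 4
	cluster.Authenticator = gocql.PasswordAuthenticator{
		Username: "<account>",
		Password: "<primary-key>",
	}
	// Non-nil SslOpts enables TLS (required by Cosmos DB); a production
	// setup would also configure certificate verification here.
	cluster.SslOpts = &gocql.SslOptions{}

	session, err := cluster.CreateSession()
	if err != nil {
		log.Fatal(err)
	}
	defer session.Close()

	var releaseVersion string
	if err := session.Query("SELECT release_version FROM system.local").
		Scan(&releaseVersion); err != nil {
		log.Fatal(err)
	}
	fmt.Println("connected, release_version:", releaseVersion)
}
```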


Learn how to implement a Slack app as a Serverless backend

Webhook backends are a popular use case for FaaS (Functions-as-a-Service) platforms, handling anything from sending customer notifications to responding with funny GIFs! Using a Serverless function, it's quite convenient to encapsulate the webhook functionality and expose it in the form of an HTTP endpoint. In this tutorial, you will learn how to implement a Slack app as a Serverless backend using Azure Functions and Go. …
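For a flavour of how this looks with Azure Functions custom handlers in Go, here is a minimal sketch. The function route name ("slack") and the echo behaviour are assumptions for illustration, not the post's actual implementation; the port environment variable is the one the Functions host sets for custom handlers.

```go
package main

import (
	"fmt"
	"log"
	"net/http"
	"os"
)

// A sketch of an HTTP-triggered Azure Functions custom handler: the
// Functions host forwards requests to this server on the port given by
// FUNCTIONS_CUSTOMHANDLER_PORT.
func slackHandler(w http.ResponseWriter, r *http.Request) {
	// A Slack slash command posts form-encoded data, including a "text" field
	text := r.FormValue("text")
	fmt.Fprintf(w, "you said: %s", text)
}

func main() {
	port, ok := os.LookupEnv("FUNCTIONS_CUSTOMHANDLER_PORT")
	if !ok {
		port = "8080" // fallback for local testing outside the Functions host
	}
	http.HandleFunc("/api/slack", slackHandler) // "slack" is a hypothetical function name
	log.Fatal(http.ListenAndServe(":"+port, nil))
}
```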


“In God we trust, all others must bring data”

William Edwards Deming

Well, Microsoft is bringing you Data Week 🙌 A celebration of Data & Data Technologies, running throughout the week starting December 7, 2020!

Create: Data

It kicks off with Create: Data, a completely FREE online event.

Register at https://aka.ms/createdata !


Learn how to use Change Data Capture to stream database modifications from PostgreSQL to Azure Data Explorer (Kusto) using the Kafka Connect platform.

Introduction

Change Data Capture (CDC) can be used to track row-level changes in database tables in response to create, update and delete operations. It is a powerful technique, but useful only when there is a way to leverage these events and make them available to other services.

Using Apache Kafka, it is possible to convert traditional batched ETL processes into real-time, streaming mode. You can do it yourself (DIY) and write a good old Kafka producer/consumer using a client SDK of your choice. But why would you do that when you have Kafka Connect and its suite of ready-to-use connectors?

Once you opt for Kafka Connect, you have a couple of options. One is the JDBC connector, which basically polls the target database table(s) to get the information. There is a better (albeit a little more complex) way, based on Change Data Capture. Enter Debezium, a distributed platform that builds on top of the Change Data Capture features available in different databases. It provides a set of Kafka Connect connectors which tap into row-level changes in database table(s) and convert them into event streams that are sent to Apache Kafka. Once the change log events are in Kafka, they are available to all downstream applications. …
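To make the "available to downstream applications" point concrete, here is a hedged Go sketch of a consumer decoding Debezium change events from a Kafka topic using the segmentio/kafka-go client. The broker address, topic, and consumer group are placeholders; for PostgreSQL, Debezium topics follow the <server>.<schema>.<table> pattern, and the envelope shape shown is simplified.

```go
package main

import (
	"context"
	"encoding/json"
	"fmt"
	"log"

	"github.com/segmentio/kafka-go"
)

// Debezium wraps each row-level change in an envelope carrying the
// operation type and the before/after row images (simplified here).
type changeEvent struct {
	Payload struct {
		Op     string                 `json:"op"` // c = create, u = update, d = delete
		Before map[string]interface{} `json:"before"`
		After  map[string]interface{} `json:"after"`
	} `json:"payload"`
}

// A sketch of a downstream consumer reading Debezium change log events.
func main() {
	r := kafka.NewReader(kafka.ReaderConfig{
		Brokers: []string{"localhost:9092"},
		Topic:   "myserver.public.orders", // hypothetical Debezium topic
		GroupID: "cdc-demo",
	})
	defer r.Close()

	for {
		m, err := r.ReadMessage(context.Background())
		if err != nil {
			log.Fatal(err)
		}
		var ev changeEvent
		if err := json.Unmarshal(m.Value, &ev); err != nil {
			log.Printf("skipping undecodable event: %v", err)
			continue
		}
		fmt.Printf("op=%s after=%v\n", ev.Payload.Op, ev.Payload.After)
	}
}
```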


In this blog, we will go over how to ingest data into Azure Data Explorer using the open source Kafka Connect sink connector for Azure Data Explorer, running on Kubernetes with Strimzi. Kafka Connect is a tool for scalably and reliably streaming data between Apache Kafka and other systems using source and sink connectors, and Strimzi provides a “Kubernetes-native” way of running Kafka clusters as well as Kafka Connect workers.

Azure Data Explorer is a fast and scalable data exploration service that lets you collect, store, and analyze large volumes of data from diverse sources, such as websites, applications, IoT devices, and more. It has a rich connector ecosystem that supports ingestion into Azure Data Explorer, as detailed here. One of the supported sources is Apache Kafka, and the sink connector allows you to move data from Kafka topics into Azure Data Explorer tables, which you can later query and analyse. …


Originally published here: https://devblogs.microsoft.com/cosmosdb/build-fault-tolerant-applications-cassandra/


Azure Cosmos DB is a resource-governed system that allows you to execute a certain number of operations per second, based on the provisioned throughput you have configured. If clients exceed that limit and consume more request units than provisioned, subsequent requests are rate limited and exceptions are thrown (these are also referred to as 429 errors).

With the help of a practical example, I’ll demonstrate how to incorporate fault tolerance in your Go applications by handling and retrying operations affected by these rate limiting errors. …
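The general pattern looks something like the following Go sketch. The sentinel error stands in for whatever rate limiting error your driver actually surfaces, and the backoff numbers are illustrative; this is not the post's exact implementation.

```go
package main

import (
	"errors"
	"fmt"
	"math/rand"
	"time"
)

// errRateLimited is a stand-in for the driver-specific error you would
// inspect when Cosmos DB returns a 429.
var errRateLimited = errors.New("request rate is large (429)")

// withRetry retries op with exponential backoff plus jitter whenever it
// reports a rate limiting error. A sketch of the pattern, nothing more.
func withRetry(maxAttempts int, op func() error) error {
	backoff := 100 * time.Millisecond
	for attempt := 1; attempt <= maxAttempts; attempt++ {
		err := op()
		if err == nil {
			return nil
		}
		if !errors.Is(err, errRateLimited) {
			return err // not a 429, don't retry
		}
		if attempt == maxAttempts {
			return fmt.Errorf("giving up after %d attempts: %w", attempt, err)
		}
		time.Sleep(backoff + time.Duration(rand.Int63n(int64(backoff))))
		backoff *= 2
	}
	return nil
}

func main() {
	calls := 0
	err := withRetry(5, func() error {
		calls++
		if calls < 3 {
			return errRateLimited // simulate two throttled attempts
		}
		return nil
	})
	fmt.Println("calls:", calls, "err:", err)
}
```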


This blog provides a practical example of how to use Azure Stream Analytics to process streaming data from Azure Event Hubs. You should be able to go through this tutorial using the Azure Portal (or Azure CLI), without writing any code. There are also other resources for exploring stream processing with Azure Stream Analytics at the end of this blog post.

What’s covered?

  • A quick overview of the use case, solution, and its constituents
  • How to set up the required Azure services: Event Hubs, Stream Analytics and Blob storage
  • How to configure and test your Azure Stream Analytics Job with sample…

With traditional architectures, it’s quite hard to tackle the challenges posed by real-time streaming data; one such use case is joining streams of data from disparate sources. For example, think about a system that accepts processed orders from customers (a real-time, high-velocity data source), where the requirement is to enrich these “raw” orders with additional customer info such as name, email, location, etc. A possible solution is to build a service that fetches customer data for each customer ID from an external system (for example, a database), performs an in-memory join, and stores the enriched data, perhaps in another database (as a materialized view). …
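Here is a minimal Go sketch of that lookup-and-join idea, with a map standing in for the external customer store and a slice standing in for the incoming order stream; all names and fields are illustrative.

```go
package main

import "fmt"

// An incoming "raw" order carrying just a customer ID.
type order struct {
	OrderID    string
	CustomerID string
	Amount     float64
}

// Customer info fetched from an external system (a database, say).
type customer struct {
	Name  string
	Email string
}

// enrichedOrder is the joined result: the materialized view of the two.
type enrichedOrder struct {
	order
	customer
}

func main() {
	customers := map[string]customer{
		"c-42": {Name: "Jane", Email: "jane@example.com"},
	}
	orders := []order{{OrderID: "o-1", CustomerID: "c-42", Amount: 99.5}}

	for _, o := range orders {
		c, ok := customers[o.CustomerID]
		if !ok {
			continue // no matching customer yet; a real system would buffer or retry
		}
		enriched := enrichedOrder{order: o, customer: c}
		fmt.Printf("%+v\n", enriched) // a real system would persist this view
	}
}
```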

About

Abhishek Gupta

Currently working with Kafka, Databases, Azure, Kubernetes and related open source projects | Confluent Community Catalyst (for Kafka)
