apache-kafka

How to delete and then create topics correctly in a script for Kafka?

How to delete and then create topics correctly in a script for Kafka? Question: I am working on a script to refresh my topics on an AWS managed Kafka cluster. I need to wipe out the existing data whenever I run the script, and I did that by deleting and re-creating the same topics. …

Total answers: 2
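A minimal sketch of the delete-then-recreate flow, assuming the confluent-kafka package; the broker address, partition and replication counts are placeholders, and the pause length is a guess to avoid races on managed clusters:

```python
# Sketch: wipe and recreate topics with the Kafka AdminClient.
# Assumes confluent-kafka is installed and a broker is reachable;
# defaults (3 partitions, replication 2) are illustrative only.
import time


def recreate_topics(bootstrap_servers, topic_names, partitions=3, replication=2):
    # Imported lazily so the pure parts of this module load without the dependency.
    from confluent_kafka.admin import AdminClient, NewTopic

    admin = AdminClient({"bootstrap.servers": bootstrap_servers})

    # delete_topics() returns futures; wait for each one, because
    # recreating a topic whose deletion is still pending fails.
    for topic, fut in admin.delete_topics(list(topic_names), operation_timeout=30).items():
        fut.result()  # raises on error

    # Deletion is asynchronous on the broker side; a short pause helps
    # avoid TOPIC_ALREADY_EXISTS races on managed clusters such as MSK.
    time.sleep(5)

    new_topics = [NewTopic(t, num_partitions=partitions, replication_factor=replication)
                  for t in topic_names]
    for topic, fut in admin.create_topics(new_topics).items():
        fut.result()
```

Waiting on the futures (rather than firing and forgetting) is the part scripts most often get wrong.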

consuming Kafka Avro messages in Python

consuming Kafka Avro messages in Python Question: I am trying to consume messages from Kafka Avro in Python. We have it in Java, and it’s working, but when trying to consume it in a Jupyter notebook, parsing does not work. I followed the example given by the documentation: (I’ve removed conf information for security reasons) …

Total answers: 1
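A sketch of Avro consumption with confluent-kafka's registry-backed deserializer; the broker URL, registry URL, group id and topic are placeholders (the real conf was removed from the question):

```python
# Sketch: consume Avro-encoded messages via the Schema Registry.
# Requires the confluent-kafka[avro] extra; all endpoints are placeholders.
def build_avro_consumer(bootstrap, registry_url, group_id):
    # Lazy imports so the module loads without the optional dependency.
    from confluent_kafka import DeserializingConsumer
    from confluent_kafka.schema_registry import SchemaRegistryClient
    from confluent_kafka.schema_registry.avro import AvroDeserializer

    registry = SchemaRegistryClient({"url": registry_url})
    return DeserializingConsumer({
        "bootstrap.servers": bootstrap,
        "group.id": group_id,
        "auto.offset.reset": "earliest",
        # The deserializer looks up the writer schema in the registry
        # using the schema id embedded in each message.
        "value.deserializer": AvroDeserializer(registry),
    })


def consume_loop(consumer, topic):
    consumer.subscribe([topic])
    while True:
        msg = consumer.poll(1.0)
        if msg is None or msg.error():
            continue
        print(msg.value())  # a plain dict after Avro decoding
```

The common Jupyter failure mode is decoding the raw bytes manually instead of letting a registry-aware deserializer strip the 5-byte schema-id header.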

Error installing confluent kafka – ld: library not found for -lrdkafka

Error installing confluent kafka – ld: library not found for -lrdkafka Question: I am trying to install the confluent-kafka Python module on macOS. My Python version is 3.9, and I keep getting the error below. Does anyone know how to fix this? clang -Wno-unused-result -Wsign-compare -Wunreachable-code -fno-common -dynamic -DNDEBUG -g -fwrapv -O3 -Wall -I/opt/homebrew/opt/python@3.9/Frameworks/Python.framework/Versions/3.9/include/python3.9 …

Total answers: 1
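A typical fix, assuming the error comes from the linker not finding the librdkafka C library that confluent-kafka builds against (Homebrew's `/opt/homebrew` prefix on Apple Silicon is not on clang's default search path; adjust the prefix on Intel Macs):

```shell
# Install the C library the Python binding links against, then point
# the compiler and linker at Homebrew's prefix before reinstalling.
brew install librdkafka
export C_INCLUDE_PATH=/opt/homebrew/include
export LIBRARY_PATH=/opt/homebrew/lib
pip install --no-cache-dir confluent-kafka
```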

Python Pandas – Simulate Streaming to Kafka

Python Pandas – Simulate Streaming to Kafka Question: I am trying to practice some Kafka producing / consuming and am trying to set up a simulated 'stream' of data. I have tried looping through with time.sleep(0.0000001), but it is too slow to catch the entries. Here is what I am trying to do: offsets = …

Total answers: 1
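One way to pace a replay loop, as a sketch: the helper below is library-agnostic (the Kafka send is passed in as a callback, so a kafka-python or confluent-kafka producer can be plugged in), and it sleeps against a deadline instead of per message, because `time.sleep()` with tiny values like `1e-7` is dominated by scheduler overhead:

```python
# Sketch: replay rows (e.g. from a DataFrame's itertuples()) as a paced stream.
import time


def pace_events(rows, send, rate_per_sec=100):
    """Call send(row) for each row at roughly rate_per_sec events/second.

    Sleeping a fixed tiny interval per message drifts badly at high rates;
    instead, sleep toward a deadline computed from the loop's start time.
    """
    interval = 1.0 / rate_per_sec
    start = time.monotonic()
    for i, row in enumerate(rows):
        send(row)
        deadline = start + (i + 1) * interval
        delay = deadline - time.monotonic()
        if delay > 0:
            time.sleep(delay)
```

With kafka-python this could be driven as `pace_events(df.itertuples(), lambda r: producer.send("my_topic", r._asdict()))`, where `my_topic` is a placeholder.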

how to get count of unread messages in a Kafka topic for a certain group

how to get count of unread messages in a Kafka topic for a certain group Question: I know Kafka is meant to be treated like an infinite stream of events, and getting the remaining message count isn’t built-in functionality. But I have to somehow monitor how my consumer processes are doing and if …

Total answers: 2
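The usual approach is to compute per-partition lag as end offset minus committed offset. A sketch, assuming confluent-kafka; the broker address, group id and topic are placeholders, and the arithmetic is split into a pure helper:

```python
# Sketch: consumer-group lag ("unread" message count) per partition.
def compute_lag(end_offsets, committed_offsets):
    """Both arguments are {partition: offset} dicts.

    A partition with no commit yet counts everything from offset 0 as
    unread here; adjust if your group uses auto.offset.reset=latest.
    """
    return {p: end_offsets[p] - committed_offsets.get(p, 0)
            for p in end_offsets}


def fetch_lag(bootstrap, group_id, topic):
    # Lazy import: requires confluent-kafka.
    from confluent_kafka import Consumer, TopicPartition

    c = Consumer({"bootstrap.servers": bootstrap, "group.id": group_id})
    try:
        partitions = c.list_topics(topic).topics[topic].partitions
        tps = [TopicPartition(topic, p) for p in partitions]
        # committed() returns OFFSET_INVALID (< 0) for uncommitted partitions.
        committed = {tp.partition: max(tp.offset, 0) for tp in c.committed(tps)}
        ends = {tp.partition: c.get_watermark_offsets(tp)[1] for tp in tps}
        return compute_lag(ends, committed)
    finally:
        c.close()
```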

How to programmatically register Avro Schema in Kafka Schema Registry using Python

How to programmatically register Avro Schema in Kafka Schema Registry using Python Question: I put the data and schema into Kafka and the Schema Registry with Python. from confluent_kafka import avro from confluent_kafka.avro import AvroProducer value_schema_str = """ { "type":"record", "name":"myrecord", "fields":[ { "name":"ID", "type":["null", "int"], "default":null }, { "name":"PRODUCT", "type":["null", "string"], "default":null }, { "name":"QUANTITY", "type":["null", …

Total answers: 1
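A sketch of registering a schema directly, using the Schema Registry client bundled with confluent-kafka; the registry URL and subject name are placeholders, and the schema mirrors (a truncated form of) the one in the question:

```python
# Sketch: register an Avro schema programmatically via the registry REST client.
import json

VALUE_SCHEMA_STR = json.dumps({
    "type": "record",
    "name": "myrecord",
    "fields": [
        {"name": "ID", "type": ["null", "int"], "default": None},
        {"name": "PRODUCT", "type": ["null", "string"], "default": None},
    ],
})


def register(registry_url, subject, schema_str):
    # Lazy import: requires confluent-kafka.
    from confluent_kafka.schema_registry import SchemaRegistryClient, Schema

    client = SchemaRegistryClient({"url": registry_url})
    # Convention: value schemas are registered under "<topic>-value".
    return client.register_schema(subject, Schema(schema_str, schema_type="AVRO"))
```

`register_schema()` returns the schema id the serializers embed in each message.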

How to programmatically check if Kafka Broker is up and running in Python

How to programmatically check if Kafka Broker is up and running in Python Question: I’m trying to consume messages from a Kafka topic. I’m using a wrapper around the confluent_kafka consumer. I need to check if the connection is established before I start consuming messages. I read that the consumer is lazy, so I need to perform …

Total answers: 1
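Two probe styles, as a sketch: a plain TCP reachability check, and a metadata request that forces the otherwise-lazy client to actually contact the cluster. The host, port and bootstrap string are placeholders:

```python
# Sketch: liveness probes for a Kafka broker.
import socket


def tcp_reachable(host, port, timeout=2.0):
    # Cheap check: can we open a TCP connection to the broker port at all?
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False


def broker_metadata_ok(bootstrap, timeout=5.0):
    # Lazy import: requires confluent-kafka. list_topics() forces the
    # lazy client to fetch cluster metadata, failing fast if no broker answers.
    from confluent_kafka.admin import AdminClient

    try:
        AdminClient({"bootstrap.servers": bootstrap}).list_topics(timeout=timeout)
        return True
    except Exception:
        return False
```

The TCP check only proves a listener exists; the metadata check proves a Kafka broker is answering.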

Push Messages from AWS Lambda to Kafka

Push Messages from AWS Lambda to Kafka Question: I have a Kafka machine running in AWS which hosts several topics. I have the following Lambda function, which produces a message and pushes it to one of the Kafka topics. import json from kafka import KafkaClient from kafka import SimpleProducer from kafka import KafkaProducer def …

Total answers: 1
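A sketch of the handler with kafka-python's current API (`SimpleProducer` and `KafkaClient`, imported in the question, were removed from kafka-python long ago; `KafkaProducer` replaces both). The broker list and topic name are placeholders:

```python
# Sketch: an AWS Lambda handler that publishes its event to a Kafka topic.
import json


def serialize(value):
    # Kafka messages are bytes; JSON-encode the Lambda event payload.
    return json.dumps(value).encode("utf-8")


_producer = None  # created once per Lambda container, reused across invocations


def lambda_handler(event, context):
    global _producer
    from kafka import KafkaProducer  # lazy: requires kafka-python
    if _producer is None:
        _producer = KafkaProducer(bootstrap_servers=["broker:9092"],
                                  value_serializer=serialize)
    # Block on the send and flush, so the record is actually delivered
    # before the Lambda execution environment is frozen.
    _producer.send("my_topic", event).get(timeout=10)
    _producer.flush()
    return {"statusCode": 200}
```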

Does confluent kafka provide an API for streaming, grouping and aggregation in Python?

Does confluent kafka provide an API for streaming, grouping and aggregation in Python? Question: Does confluent kafka provide an API for streaming, grouping and aggregation in Python? Asked By: Mahamutha M || Source Answers: There’s no such thing as “Confluent Kafka”. Confluent Platform is a distribution of Apache Kafka®, along with other components such …

Total answers: 1

Kafka AvroConsumer consume from timestamp using offsets_for_times

Kafka AvroConsumer consume from timestamp using offsets_for_times Question: Trying to use confluent_kafka.AvroConsumer to consume messages from a given timestamp. if flag: # creating a list topic_partitons_to_search = list( map(lambda p: TopicPartition('my_topic2', p, int(time.time())), range(0, 1))) print("Searching for offsets with %s" % topic_partitons_to_search) offsets = c.offsets_for_times(topic_partitons_to_search, timeout=1.0) print("offsets_for_times results: %s" % offsets) for x in …

Total answers: 2
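A sketch of the timestamp-to-offset lookup, assuming confluent-kafka; the topic name and partition count are placeholders following the question's single-partition example. One detail worth flagging: `offsets_for_times()` expects timestamps in milliseconds, while `int(time.time())` in the question yields seconds:

```python
# Sketch: position a consumer at the offsets closest to a wall-clock timestamp.
import time


def ms_since_epoch(seconds=None):
    # offsets_for_times() reads the TopicPartition offset field as a
    # timestamp in *milliseconds*; int(time.time()) alone is off by 1000x.
    return int((time.time() if seconds is None else seconds) * 1000)


def assign_from_timestamp(consumer, topic, num_partitions, ts_ms):
    # Lazy import: requires confluent-kafka.
    from confluent_kafka import TopicPartition

    query = [TopicPartition(topic, p, ts_ms) for p in range(num_partitions)]
    offsets = consumer.offsets_for_times(query, timeout=10.0)
    # Each returned TopicPartition carries the earliest offset whose message
    # timestamp is >= ts_ms (or -1 if none); assign() starts reading there.
    consumer.assign(offsets)
```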