afkak | Kafka client written in Twisted Python | Pub Sub library

 by   ciena Python Version: 20.10.0 License: Apache-2.0

kandi X-RAY | afkak Summary

afkak is a Python library typically used in Messaging, Pub Sub, Kafka applications. afkak has no bugs, it has no vulnerabilities, it has build file available, it has a Permissive License and it has low support. You can install using 'pip install afkak' or download it from GitHub, PyPI.
Afkak is a Twisted-native Apache Kafka client library, supporting both message production and consumption. Learn more in the documentation, download it from PyPI, or review the contribution guidelines. Please report any issues on GitHub.

Support

  • afkak has a low active ecosystem.
  • It has 27 star(s) with 18 fork(s). There are 8 watchers for this library.
  • It had no major release in the last 12 months.
  • There are 18 open issues and 27 have been closed. On average, issues are closed in 173 days. There are 3 open pull requests and 0 closed pull requests.
  • It has a neutral sentiment in the developer community.
  • The latest version of afkak is 20.10.0.

Quality

  • afkak has 0 bugs and 0 code smells.


Security

  • afkak has no vulnerabilities reported, and its dependent libraries have no vulnerabilities reported.
  • afkak code analysis shows 0 unresolved vulnerabilities.
  • There are 0 security hotspots that need review.

License

  • afkak is licensed under the Apache-2.0 License. This license is Permissive.
  • Permissive licenses have the least restrictions, and you can use them in most projects.


Reuse

  • afkak releases are available to install and integrate.
  • Deployable package is available in PyPI.
  • Build file is available. You can build the component from source.
  • Installation instructions, examples and code snippets are available.
  • It has 10863 lines of code, 734 functions and 40 files.
  • It has medium code complexity. Code complexity directly impacts maintainability of the code.
Top functions reviewed by kandi - BETA

kandi has reviewed afkak and identified the functions below as its top functions. This is intended to give you an instant insight into the functionality afkak implements, and to help you decide if it suits your requirements.

  • Handle a send response.
  • Send a request to the broker.
  • Process a fetch response.
  • Rejoin the group.
  • Send a request to a broker.
  • Perform a pure-Python murmur2 hash.
  • Decode a metadata response.
  • Shut down the consumer.
  • Normalize host and port.
  • Encode the given data using snappy.
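The murmur2 entry above refers to the hash Kafka-compatible clients conventionally use to map a message key to a partition. As an illustrative sketch of that idea (a generic 32-bit MurmurHash2 with the seed and sign-masking convention the Java client uses — not afkak's actual code):

```python
def murmur2(data: bytes, seed: int = 0x9747b28c) -> int:
    """32-bit MurmurHash2, the variant Kafka clients use for key partitioning."""
    m = 0x5bd1e995
    mask = 0xffffffff
    h = (seed ^ len(data)) & mask
    # Process the input four bytes at a time.
    i = 0
    while len(data) - i >= 4:
        k = int.from_bytes(data[i:i + 4], "little")
        k = (k * m) & mask
        k ^= k >> 24
        k = (k * m) & mask
        h = (h * m) & mask
        h ^= k
        i += 4
    # Mix in any remaining tail bytes.
    rest = data[i:]
    if rest:
        h ^= int.from_bytes(rest, "little")
        h = (h * m) & mask
    # Final avalanche.
    h ^= h >> 13
    h = (h * m) & mask
    h ^= h >> 15
    return h

def partition_for(key: bytes, num_partitions: int) -> int:
    # Mask off the sign bit, as the Java client does, then take the modulus.
    return (murmur2(key) & 0x7fffffff) % num_partitions
```

Masking off the sign bit before the modulus mirrors the Java client's behaviour, so keys hash to the same partitions across clients that follow the same convention.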

afkak Key Features

Producing messages, with automatic batching and optional compression.

Consuming messages, with group coordination and automatic commit.

High level

from afkak.client import KafkaClient
from afkak.common import OFFSET_EARLIEST, PRODUCER_ACK_ALL_REPLICAS
from afkak.consumer import Consumer
from afkak.producer import Producer

kClient = KafkaClient("localhost:9092")

# To send messages
producer = Producer(kClient)
d1 = producer.send_messages("my-topic", msgs=[b"some message"])
d2 = producer.send_messages("my-topic", msgs=[b"takes a list", b"of messages"])
# To get confirmations/errors on the sends, add callbacks to the returned deferreds
d1.addCallbacks(handleResponses, handleErrors)

# To wait for acknowledgements
# PRODUCER_ACK_LOCAL_WRITE : server will wait till the data is written to
#                         a local log before sending response
# [ the default ]
# PRODUCER_ACK_ALL_REPLICAS : server will block until the message is committed
#                            by all in sync replicas before sending a response
producer = Producer(kClient,
                    req_acks=PRODUCER_ACK_ALL_REPLICAS,
                    ack_timeout=1000)

responseD = producer.send_messages("my-topic", msgs=[b"message"])

# Using twisted's @inlineCallbacks:
responses = yield responseD
if responses:
    print("Our message is at offset", responses[0].offset)

# To send messages in batch: You can use a producer with any of the
# partitioners for doing this. The following producer will collect
# messages in batch and send them to Kafka after 20 messages are
# collected or every 60 seconds (whichever comes first). You can
# also batch by number of bytes.
# Notes:
# * If the producer dies before the messages are sent, the caller will
#   not have had the callbacks called on the deferreds returned from
#   send_messages(), and so can retry.
# * Calling producer.stop() before the messages are sent will
#   errback() the deferred(s) returned from the send_messages call(s).
producer = Producer(kClient, batch_send=True,
                    batch_every_n=20,
                    batch_every_t=60)
responseD1 = producer.send_messages("my-topic", msgs=[b"message"])
responseD2 = producer.send_messages("my-topic", msgs=[b"message 2"])

# To consume messages
# define a function which takes a list of messages to process and
# possibly returns a deferred which fires when the processing is
# complete.
def processor_func(consumer, messages):
    # store_messages_in_database() may return a deferred
    result = store_messages_in_database(messages)
    # record last processed message
    return result

the_partition = 3  # Consume only from partition 3.
consumer = Consumer(kClient, "my-topic", the_partition, processor_func)
d = consumer.start(OFFSET_EARLIEST)  # Start reading at earliest message
# The deferred returned by consumer.start() will fire when an error
# occurs that can't be handled by the consumer, or when consumer.stop()
# is called
yield d


Low level

from afkak.client import KafkaClient
from afkak.common import ProduceRequest
from afkak.protocol import KafkaProtocol

kafka = KafkaClient("localhost:9092")
req = ProduceRequest(topic="my-topic", partition=1,
    messages=[KafkaProtocol.encode_message(b"some message")])
resps = kafka.send_produce_request(payloads=[req], fail_on_error=True)

resps[0].topic      # b"my-topic"
resps[0].partition  # 1
resps[0].error      # 0 (hopefully)
resps[0].offset     # offset of the first message sent in this request
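The fields above are worth checking before trusting a send. A standalone sketch of that pattern (the `ProduceResponse` stand-in and `first_offset_or_raise` helper here are hypothetical, mimicking the field layout shown above; afkak's real response type comes from `afkak.common`):

```python
from collections import namedtuple

# Hypothetical stand-in mirroring the fields shown above; not afkak's class.
ProduceResponse = namedtuple("ProduceResponse", "topic partition error offset")

def first_offset_or_raise(resps):
    """Return the offset of the first response, raising if Kafka reported an error."""
    resp = resps[0]
    if resp.error != 0:
        raise RuntimeError(f"produce to {resp.topic!r}[{resp.partition}] "
                           f"failed with Kafka error code {resp.error}")
    return resp.offset

resps = [ProduceResponse(topic=b"my-topic", partition=1, error=0, offset=42)]
print(first_offset_or_raise(resps))  # → 42
```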


Create a virtualenv

make venv

Run the unit tests

make toxu

Run the integration tests

KAFKA_VER=1.1.1 make toxi

Run all the tests against the default Kafka version

make toxa

Run the integration tests against all the Kafka versions the Makefile knows about

make toxik

Community Discussions

Trending Discussions on Pub Sub
  • Build JSON content in R according Google Cloud Pub Sub message format
  • BigQuery Table a Pub Sub Topic not working in Apache Beam Python SDK? Static source to Streaming Sink
  • Pub Sub Lite topics with Peak Capacity Throughput option
  • How do I add permissions to a NATS User to allow the User to query & create Jestream keyvalue stores?
  • MSK vs SQS + SNS
  • Dataflow resource usage
  • Run code on Python Flask AppEngine startup in GCP
  • Is there a way to listen for updates on multiple Google Classroom Courses using Pub Sub?
  • Flow.take(ITEM_COUNT) returning all the elements rather then specified amount of elements
  • Wrapping Pub-Sub Java API in Akka Streams Custom Graph Stage

Build JSON content in R according Google Cloud Pub Sub message format

Asked 2022-Apr-16 at 09:59

In R, I want to build json content according this Google Cloud Pub Sub message format: https://cloud.google.com/pubsub/docs/reference/rest/v1/PubsubMessage

It has to respect:

{
  "data": string,
  "attributes": {
    string: string,
    ...
  },
  "messageId": string,
  "publishTime": string,
  "orderingKey": string
}
The message built will be read by this Python code:

def pubsub_read(data, context):
    '''This function is executed from a Cloud Pub/Sub'''
    message = base64.b64decode(data['data']).decode('utf-8')
    file_name = data['attributes']['file_name']
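Whatever the R side produces must therefore carry a base64-encoded "data" field plus plain string-to-string attributes. A standalone sanity check of that contract, using only the Python standard library (the payload and attribute name simply mirror the question):

```python
import base64
import json

# Build a Pub/Sub-style message the way the R code intends to:
# base64-encoded "data", plain string-to-string "attributes".
payload = "Hello World!"
message = {
    "data": base64.b64encode(payload.encode("utf-8")).decode("ascii"),
    "attributes": {"file_name": "gfs_data_temp_FULL.csv"},
}

# What pubsub_read() does on the receiving end.
decoded = base64.b64decode(message["data"]).decode("utf-8")
file_name = message["attributes"]["file_name"]

# The JSON form should contain plain objects, with no array indexes.
print(json.dumps(message, indent=2))
```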

This following R code builds a R dataframe and converts it to json content:

data="Hello World!"
df <- data.frame(data)
attributes <- data.frame(file_name=c('gfs_data_temp_FULL.csv'))
df$attributes <- attributes

msg <- df %>%
    toJSON(auto_unbox = TRUE, dataframe = 'columns', pretty = T) %>%
    # Pub/Sub expects a base64 encoded string
    googlePubsubR::msg_encode()

It seems good, but when I visualise it with a JSON editor, indexes are added (screenshots omitted), and the message content shows the same problem.

I'm not sure it respects the Google Cloud Pub Sub message format...


Answered 2022-Apr-16 at 09:59

Not sure why, but replacing the dataframe with a list seems to work:


df = list(data = "Hello World")
attributes <- list(file_name=c('toto.csv'))
df$attributes <- attributes

df %>%
  toJSON(auto_unbox = TRUE, simplifyVector=TRUE, dataframe = 'columns', pretty = T)


{
  "data": "Hello World",
  "attributes": {
    "file_name": "toto.csv"
  }
}
Source https://stackoverflow.com/questions/71892778

Community Discussions, Code Snippets contain sources that include Stack Exchange Network



Install afkak

Afkak releases are available on PyPI; install the latest with pip install afkak.


For any new features, suggestions, or bugs, create an issue on GitHub. If you have any questions, check for and ask them on Stack Overflow.


  • © 2022 Open Weaver Inc.