Replies: 3 comments
-
Tagging a few contributors to the Event Hubs SDK: @conniey @joshfree @anuchandy
-
There is no consumer decompression that happens implicitly in the AMQP layer. We expose the BinaryData, but the user would have to pass this data through a decompression library.
At the moment, there is no way to "transparently" plug a decompression/compression mechanism into the AMQP client. This is a story we are looking at. @jsquire may have additional insights.
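As a minimal sketch of that consumer-side step with the Java SDK, assuming the producer gzip-compressed the payload before sending (the helper name and buffer size are just illustrative):

```java
import com.azure.messaging.eventhubs.EventData;

import java.io.ByteArrayInputStream;
import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.util.zip.GZIPInputStream;

public final class GzipDecompression {
    // Decompress the body of a received event; only valid if the producer gzipped it.
    public static byte[] decompress(EventData event) throws IOException {
        byte[] compressed = event.getBodyAsBinaryData().toBytes();
        try (GZIPInputStream gzip = new GZIPInputStream(new ByteArrayInputStream(compressed));
             ByteArrayOutputStream out = new ByteArrayOutputStream()) {
            byte[] buffer = new byte[4096];
            int read;
            while ((read = gzip.read(buffer)) != -1) {
                out.write(buffer, 0, read);
            }
            return out.toByteArray();
        }
    }
}
```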
-
No additional insights, currently. As Connie said, compression/decompression are not part of the AMQP protocol; for Kafka they are a coordination of client and server logic. There's no way to fully simulate that with Event Hubs currently, as the service is not compression-aware and won't dynamically manage it when committing data to a partition or sending data to consumers. To use compression with Event Hubs, your producers would need to send compressed data, which would be stored as-is in the partition. Consumers would have to know to expect compressed data and understand how to decompress it when reading.
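A rough sketch of that producer-side convention (the connection string, hub name, and "content-encoding" application property are placeholders/assumptions, not anything the service interprets):

```java
import com.azure.messaging.eventhubs.EventData;
import com.azure.messaging.eventhubs.EventHubClientBuilder;
import com.azure.messaging.eventhubs.EventHubProducerClient;

import java.io.ByteArrayOutputStream;
import java.io.IOException;
import java.nio.charset.StandardCharsets;
import java.util.Collections;
import java.util.zip.GZIPOutputStream;

public class CompressedProducerSample {
    public static void main(String[] args) throws IOException {
        // Placeholders; replace with your namespace connection string and event hub name.
        EventHubProducerClient producer = new EventHubClientBuilder()
            .connectionString("<connection-string>", "<event-hub-name>")
            .buildProducerClient();

        byte[] compressed = gzip("{\"example\":\"payload\"}".getBytes(StandardCharsets.UTF_8));
        EventData event = new EventData(compressed);
        // Application-level hint only; the service stores and delivers the bytes unchanged.
        event.getProperties().put("content-encoding", "gzip");

        producer.send(Collections.singletonList(event));
        producer.close();
    }

    private static byte[] gzip(byte[] data) throws IOException {
        ByteArrayOutputStream out = new ByteArrayOutputStream();
        try (GZIPOutputStream gzipOut = new GZIPOutputStream(out)) {
            gzipOut.write(data);
        }
        return out.toByteArray();
    }
}
```

A matching consumer would check that property and run the body through a decompression step like the one sketched above.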
-
According to this link, we have support for gzip for Kafka over Event Hubs.
That link mentions "While the feature is only supported for Apache Kafka traffic producer and consumer traffic, AMQP consumer can consume compressed Kafka traffic as decompressed messages."
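For reference, that Kafka-side compression is a client-only producer setting; a sketch using the Kafka Java client against the Event Hubs Kafka endpoint (namespace, hub name, and connection string are placeholder assumptions):

```java
import java.util.Properties;

import org.apache.kafka.clients.producer.KafkaProducer;
import org.apache.kafka.clients.producer.ProducerConfig;
import org.apache.kafka.clients.producer.ProducerRecord;
import org.apache.kafka.common.serialization.StringSerializer;

public class KafkaGzipProducerSample {
    public static void main(String[] args) {
        Properties props = new Properties();
        props.put(ProducerConfig.BOOTSTRAP_SERVERS_CONFIG, "<namespace>.servicebus.windows.net:9093");
        props.put("security.protocol", "SASL_SSL");
        props.put("sasl.mechanism", "PLAIN");
        props.put("sasl.jaas.config",
            "org.apache.kafka.common.security.plain.PlainLoginModule required "
            + "username=\"$ConnectionString\" password=\"<connection-string>\";");
        // Client-side compression; the broker stores what it receives.
        props.put(ProducerConfig.COMPRESSION_TYPE_CONFIG, "gzip");
        props.put(ProducerConfig.KEY_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());
        props.put(ProducerConfig.VALUE_SERIALIZER_CLASS_CONFIG, StringSerializer.class.getName());

        try (KafkaProducer<String, String> producer = new KafkaProducer<>(props)) {
            producer.send(new ProducerRecord<>("<event-hub-name>", "example payload"));
        }
    }
}
```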
Questions -