I am working on using Kafka to build a highly scalable system. As I understand it, and have seen, the Kafka broker has very impressive and scalable file-handling mechanisms for providing guaranteed delivery. However, in one scenario I am facing a different challenge.
In this scenario the message payload is already buffered, and delivery is guaranteed, by an external system, so there is no compelling need for guaranteed delivery from Kafka. There is, however, a need to process the message streams in parallel. This made me wonder whether there is some way in Kafka to avoid the creation of log files and instead stream the messages in-memory as they arrive, while still taking advantage of Kafka message streams, thereby avoiding the overhead of file management (and some disk-level IOPS).
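For context on what I have tried so far: Kafka does not appear to offer a pure in-memory mode, but since the broker writes through the OS page cache, flushes to disk can be deferred so that most reads and writes never touch the physical disk. A rough sketch of the relevant broker settings (my own assumption of suitable values, not a tested configuration) would be something like:

```properties
# server.properties - sketch only, values are illustrative assumptions

# Rely on OS page cache; let the broker defer explicit fsync as long as possible
log.flush.interval.messages=9223372036854775807
log.flush.interval.ms=9223372036854775807

# Keep retention short since durability is handled by the external system
log.retention.ms=60000
log.segment.bytes=104857600
```

An alternative I have considered is pointing `log.dirs` at a tmpfs/ramdisk mount, which would keep the segment files entirely in memory at the cost of losing them on restart. If anyone has experience with either approach, or knows of a better way, I would be glad to hear it.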
I would greatly appreciate the community's response.
Thanks & Regards,
Pankaj Misra