Re: Using Stunnel to encrypt/authenticate Kafka producers and consumers...
Unfortunately, 'stunneling everything' is not really possible. Stunnel acts like a proxy service, in the sense that the Stunnel client (on your log producer or log consumer) has to be explicitly configured to connect to an exact endpoint (i.e., kafka1.mydomain.com:1234) -- or to multiple endpoints, which Stunnel then selects from at random.
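To make that concrete, here's a rough sketch of what the client-side config would have to look like -- one service stanza pinned to each broker you want to reach (hostnames and ports are made up for illustration):

```ini
; client-side stunnel.conf on the log producer/consumer
client = yes

; each Kafka broker needs its own explicitly configured tunnel
[kafka1]
accept  = 127.0.0.1:9093
connect = kafka1.mydomain.com:1234

[kafka2]
accept  = 127.0.0.1:9094
connect = kafka2.mydomain.com:1234
```

The producer would then point at the local accept ports instead of the brokers directly -- which is exactly the maintenance burden that makes "stunnel everything" impractical as the broker list changes.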
In a few cases you can use Stunnel as an SSL offloader for certain protocols, but that's done on the server side -- i.e., in front of a Postgres server, so that Stunnel handles the encryption rather than Postgres itself.
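For comparison, the server-side offload setup is much simpler -- a single stanza terminating TLS and forwarding plaintext to the local service (paths and ports here are just placeholders):

```ini
; server-side stunnel.conf in front of Postgres
cert = /etc/stunnel/server.pem

[postgres-tls]
accept  = 5433
connect = 127.0.0.1:5432
```

It works there because clients only ever talk to that one endpoint, which is the property Kafka's many-broker model doesn't give you.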
I think it would make a difference if our log producers were the only things that needed to talk to 'all' of the Kafka nodes. We could do something where we ship logs over an encrypted TCP session to a group of Kafka "log funnel" machines, which can reach the Kafka servers directly and dump the log data there. Maybe.
I'm still digging around, but I'm really surprised this hasn't been a larger topic of discussion. If Kafka natively allowed a single connection through a single server to reach all of the other servers in the farm, it would be far easier to secure and encrypt the communication. ElasticSearch and RabbitMQ are good examples of this model.
On Apr 22, 2013, at 12:21 PM, Scott Clasen <[EMAIL PROTECTED]> wrote: