Kafka >> mail # user >> Re: kafka appender layout does not work for kafka 0.7.1


Jun Rao 2013-03-29, 22:50
Re: kafka appender layout does not work for kafka 0.7.1

Hi Jun,
I do not know how to open a JIRA for this bug. Could you tell me how to do that, or create one yourself?

I have also found another problem with the same configuration: a StackOverflowError.

I am using Kafka 0.7.1 right now, with the following log4j properties file, trying to send some log information to the Kafka server.
log4j.rootLogger=INFO,file,stdout

log4j.appender.stdout=org.apache.log4j.ConsoleAppender
log4j.appender.stdout.layout=org.apache.log4j.PatternLayout
log4j.appender.stdout.layout.ConversionPattern=[%d] %p %t %m (%c)%n

log4j.appender.file=org.apache.log4j.RollingFileAppender
#log4j.appender.file.FileNamePattern=c:\\development\\producer-agent_%d{yyyy-MM-dd}.log
log4j.appender.file.File=${AC_DATA_HOME}\\lmservice\\tailer-aggregator.log
log4j.appender.file.MaxFileSize=100MB
log4j.appender.file.MaxBackupIndex=1
log4j.appender.file.layout=org.apache.log4j.PatternLayout
#log4j.appender.file.layout.ConversionPattern= %-4r [%t] %-5p %c %x - %m%n
log4j.appender.file.layout.ConversionPattern=[%d] %p %t %m (%c)%n

log4j.appender.KAFKA=kafka.producer.KafkaLog4jAppender
log4j.appender.KAFKA.layout=org.apache.log4j.PatternLayout
log4j.appender.KAFKA.layout.ConversionPattern=[%d] %p %t %m (%c)%n
log4j.appender.KAFKA.BrokerList=0:localhost:9092
log4j.appender.KAFKA.SerializerClass=kafka.serializer.StringEncoder
log4j.appender.KAFKA.Topic=test.topic
# Turn on all our debugging info
log4j.logger.kafka=INFO, KAFKA
log4j.logger.org=INFO, KAFKA
log4j.logger.com=INFO, KAFKA
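A plausible reading of the recursion reported below: the last three lines attach the KAFKA appender to the `kafka` logger itself, so the producer's own INFO messages (for example, SyncProducer's disconnect message in the stack trace) are routed back into KafkaLog4jAppender, which sends again, which logs again. One hedged sketch of a workaround, assuming the goal is to ship only application logs to Kafka, is to keep Kafka's internal loggers off the KAFKA appender:

```
# Sketch only: route application logs to Kafka, but keep Kafka's own
# internal logging on the file appender so the appender cannot feed itself.
log4j.logger.org=INFO, KAFKA
log4j.logger.com=INFO, KAFKA
# Kafka internals: file only, and no inheritance back into KAFKA
log4j.logger.kafka=INFO, file
log4j.additivity.kafka=false
```

`log4j.additivity` is standard log4j 1.2 configuration; whether this fully avoids the error on 0.7.1 would need to be verified.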
When I run the program without KafkaLog4jAppender, everything is fine. Then I add KafkaLog4jAppender.
The producer can send messages through KafkaLog4jAppender to the Kafka server.
However, after the producer has sent messages for a while, I see a StackOverflowError. It seems that the producer, or something else, is calling some functions recursively.
This error comes out of many threads; I have copied only three threads' call stacks below.
CallStack:
Exception in thread "Process_Manager" java.lang.StackOverflowError
        at java.util.regex.Pattern$1.isSatisfiedBy(Unknown Source)
        at java.util.regex.Pattern$5.isSatisfiedBy(Unknown Source)
        at java.util.regex.Pattern$5.isSatisfiedBy(Unknown Source)
        at java.util.regex.Pattern$CharProperty.match(Unknown Source)
        at java.util.regex.Pattern$GroupHead.match(Unknown Source)
        at java.util.regex.Pattern$Branch.match(Unknown Source)
        at java.util.regex.Pattern$Branch.match(Unknown Source)
        at java.util.regex.Pattern$Branch.match(Unknown Source)
        at java.util.regex.Pattern$BranchConn.match(Unknown Source)
        at java.util.regex.Pattern$GroupTail.match(Unknown Source)
        at java.util.regex.Pattern$Curly.match0(Unknown Source)
        at java.util.regex.Pattern$Curly.match(Unknown Source)
        at java.util.regex.Pattern$GroupHead.match(Unknown Source)
        at java.util.regex.Pattern$Branch.match(Unknown Source)
        at java.util.regex.Pattern$Branch.match(Unknown Source)
        at java.util.regex.Pattern$BmpCharProperty.match(Unknown Source)
        at java.util.regex.Pattern$Start.match(Unknown Source)
        at java.util.regex.Matcher.search(Unknown Source)
        at java.util.regex.Matcher.find(Unknown Source)
        at java.util.Formatter.parse(Unknown Source)
        at java.util.Formatter.format(Unknown Source)
        at java.util.Formatter.format(Unknown Source)
        at java.lang.String.format(Unknown Source)
        at scala.collection.immutable.StringLike$class.format(StringLike.scala:251)
        at scala.collection.immutable.StringOps.format(StringOps.scala:31)
        at kafka.utils.Logging$class.msgWithLogIdent(Logging.scala:28)
        at kafka.utils.Logging$class.info(Logging.scala:58)
        at kafka.producer.SyncProducer.info(SyncProducer.scala:39)
        at kafka.producer.SyncProducer.disconnect(SyncProducer.scala:153)
        at kafka.producer.SyncProducer.send(SyncProducer.scala:108)
        at kafka.producer.SyncProducer.send(SyncProducer.scala:125)
        at kafka.producer.ProducerPool$$anonfun$send$1.apply$mcVI$sp(ProducerPool.scala:114)
        at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
        at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
        at kafka.producer.ProducerPool.send(ProducerPool.scala:100)
        at kafka.producer.Producer.configSend(Producer.scala:159)
        at kafka.producer.Producer.send(Producer.scala:100)
        at kafka.producer.KafkaLog4jAppender.append(KafkaLog4jAppender.scala:83)
        at org.apache.log4j.AppenderSkeleton.doAppend(AppenderSkeleton.java:251)
        at org.apache.log4j.helpers.AppenderAttachableImpl.appendLoopOnAppenders(AppenderAttachableImpl.java:66)
        at org.apache.log4j.Category.callAppenders(Category.java:206)
        at org.apache.log4j.Category.forcedLog(Category.java:391)
        at org.apache.log4j.Category.info(Category.java:666)
        at kafka.utils.Logging$class.info(Logging.scala:58)
        at kafka.producer.SyncProducer.info(SyncProducer.scala:39)
        at kafka.producer.SyncProducer.disconnect(SyncProducer.scala:153)
        at kafka.producer.SyncProducer.send(SyncProducer.scala:108)
        at kafka.producer.SyncProducer.send(SyncProducer.scala:125)
        at kafka.producer.ProducerPool$$anonfun$send$1.apply$mcVI$sp(ProducerPool.scala:114)
        at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
        at kafka.producer.ProducerPool$$anonfun$send$1.apply(ProducerPool.scala:100)
        at scala.collection.mutable.ResizableArray$class.foreach(ResizableArray.scala:57)
        at scala.collection.mutable.ArrayBuffer.foreach(ArrayBuffer.scala:43)
        at kafka.producer.ProducerPool.send(ProducerPool.scala:100)
        at kafka.producer.Pr
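The cycle is visible in the trace itself: KafkaLog4jAppender.append → Producer.send → SyncProducer.disconnect → Logging.info → Category.callAppenders → KafkaLog4jAppender.append, and so on until the stack is exhausted. A minimal sketch of that mechanism in plain Java (no log4j dependency; all class and method names here are hypothetical stand-ins, for illustration only):

```java
// Demonstrates the self-feeding appender pattern from the stack trace:
// the appender's send path emits a log record that is routed back to the
// same appender, so the two methods recurse until StackOverflowError.
public class RecursiveAppenderDemo {

    // stands in for Logger.info(...) whose output is attached to append()
    static void info(String msg) {
        append(msg);
    }

    // stands in for KafkaLog4jAppender.append: sending triggers internal
    // INFO logging (cf. SyncProducer.disconnect), re-entering info()
    static void append(String msg) {
        info("Disconnecting from localhost:9092");
    }

    // returns true if the cycle blows the stack, as in the reported error
    static boolean triggers() {
        try {
            info("application message");
            return false;
        } catch (StackOverflowError e) {
            return true;
        }
    }

    public static void main(String[] args) {
        System.out.println("StackOverflowError triggered: " + triggers());
    }
}
```

The real fix is to break the cycle in configuration (do not attach the Kafka appender to Kafka's own loggers), since the appender cannot safely log through a path that includes itself.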
Jun Rao 2013-04-03, 03:52