Flume user mailing list: weird hbase sink behavior


Xu 2013-01-09, 04:19
Brock Noland 2013-01-09, 04:39
Xu 2013-01-09, 04:43
Re: weird hbase sink behavior
Tried this new config, still doesn't work...

root@flume-agent1:/usr/local/lib/flume# cat conf/test.conf
a1.channels = c1 c2
a1.sinks = k1 k2
a1.sources = r1

a1.sources.r1.type = avro
a1.sources.r1.bind = 0.0.0.0
a1.sources.r1.port = 8080

a1.sinks.k1.type = org.apache.flume.sink.hbase.HBaseSink
a1.sinks.k1.table = foo_table
a1.sinks.k1.columnFamily = bar_cf
a1.sinks.k1.payloadColumn = test
a1.sinks.k1.incrementColumn = count
a1.sinks.k1.serializer = org.apache.flume.sink.hbase.SimpleHbaseEventSerializer

a1.sinks.k2.type = logger

#a1.channels.c1.type = file
#a1.channels.c1.checkpointDir = /mnt/flume/checkpoint
#a1.channels.c1.dataDirs = /mnt/flume/data
a1.channels.c1.type = memory
a1.channels.c2.type = memory

a1.sources.r1.channels = c1 c2
a1.sinks.k1.channel = c1
a1.sinks.k2.channel = c2
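
One detail worth double-checking against the Flume HBaseSink documentation: serializer-specific options are normally passed as sub-properties under a "serializer." prefix, so payloadColumn and incrementColumn set directly on the sink may never reach SimpleHbaseEventSerializer. A hedged sketch of that variant, reusing the table, column family, and column names from the config above (the prefix behavior is an assumption based on the docs, not something confirmed in this thread):

a1.sinks.k1.type = org.apache.flume.sink.hbase.HBaseSink
a1.sinks.k1.table = foo_table
a1.sinks.k1.columnFamily = bar_cf
a1.sinks.k1.serializer = org.apache.flume.sink.hbase.SimpleHbaseEventSerializer
# assumption: serializer options are passed through via the "serializer." prefix
a1.sinks.k1.serializer.payloadColumn = test
a1.sinks.k1.serializer.incrementColumn = count

The rest of the agent config (source, channels, logger sink) would stay the same.
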
On Tue, Jan 8, 2013 at 11:43 PM, Xu (Simon) Chen <[EMAIL PROTECTED]> wrote:
> Yeah, now I remember: I was using the Regex serializer...
>
> The comment in the code is actually confusing:
> https://github.com/apache/flume/blob/trunk/flume-ng-sinks/flume-ng-hbase-sink/src/main/java/org/apache/flume/sink/hbase/SimpleHbaseEventSerializer.java
>
> It gives me the impression that incrementColumn can be skipped...
>
> On Tue, Jan 8, 2013 at 11:39 PM, Brock Noland <[EMAIL PROTECTED]> wrote:
>> Either payloadColumn or incrementColumn must be specified. This should
>> give you a better error message; the open JIRA for that is
>> https://issues.apache.org/jira/browse/FLUME-1757
>>
>> Brock
>>
>> On Tue, Jan 8, 2013 at 10:19 PM, Xu (Simon) Chen <[EMAIL PROTECTED]> wrote:
>>> Hi folks,
>>>
>>> I am using Flume to sink into HBase. I had the setup working
>>> yesterday, but after I restarted my Flume agent, for some reason it
>>> can no longer write to HBase (too bad I don't have version control
>>> for my config yet).
>>>
>>> Here are the configuration and execution output. The agent essentially
>>> listens for objects via Avro and dumps them to the console (for
>>> debugging) and to HBase. The last portion of the console output shows
>>> that an object is indeed received, but for some reason a "scan" of the
>>> HBase table doesn't show anything new. I tried both the Regex and
>>> Simple serializers, with both the HBase and AsyncHbase sinks; none of
>>> them works now...
>>>
>>> From the log I don't see anything obviously wrong. I did a tcpdump on
>>> the agent, and saw it established a couple of connections to the
>>> regionservers, but closed them a few seconds later.
>>>
>>> Any info on how to correctly configure this or troubleshoot would be
>>> appreciated.
>>>
>>> Thanks!
>>> -Simon
>>>
>>>
>>>
>>> root@flume-agent1:/usr/local/lib/flume# cat conf/test.conf
>>> a1.channels = c1 c2
>>> a1.sinks = k1 k2
>>> a1.sources = r1
>>>
>>> a1.sources.r1.type = avro
>>> a1.sources.r1.bind = 0.0.0.0
>>> a1.sources.r1.port = 8080
>>>
>>> a1.sinks.k1.type = org.apache.flume.sink.hbase.HBaseSink
>>> a1.sinks.k1.table = foo_table
>>> a1.sinks.k1.columnFamily = bar_cf
>>> a1.sinks.k1.serializer = org.apache.flume.sink.hbase.RegexHbaseEventSerializer
>>>
>>> a1.sinks.k2.type = logger
>>>
>>> #a1.channels.c1.type = file
>>> #a1.channels.c1.checkpointDir = /mnt/flume/checkpoint
>>> #a1.channels.c1.dataDirs = /mnt/flume/data
>>> a1.channels.c1.type = memory
>>> a1.channels.c2.type = memory
>>>
>>> a1.sources.r1.channels = c1 c2
>>> a1.sinks.k1.channel = c1
>>> a1.sinks.k2.channel = c2
>>>
>>> root@flume-agent1:/usr/local/lib/flume# bin/flume-ng agent --name a1
>>> --conf conf --conf-file conf/test.conf
>>> -Dflume.root.logger=INFO,console
>>> Info: Including Hadoop libraries found via
>>> (/usr/local/hadoop/bin/hadoop) for HDFS access
>>> Warning: $HADOOP_HOME is deprecated.
>>>
>>> Warning: $HADOOP_HOME is deprecated.
>>>
>>> Info: Excluding /usr/local/hadoop/libexec/../lib/slf4j-api-1.4.3.jar
>>> from classpath
>>> Info: Excluding
>>> /usr/local/hadoop/libexec/../lib/slf4j-log4j12-1.4.3.jar from
>>> classpath
>>> Info: Including HBASE libraries found via (/usr/local/hbase/bin/hbase)
>>> for HBASE access
>>> Info: Excluding /usr/local/hbase/lib/slf4j-api-1.4.3.jar from classpath
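
To check whether events are reaching HBase at all, a quick end-to-end test of the setup described above can help; a minimal sketch using the stock flume-ng avro-client and the hbase shell (the host, file path, and event text are placeholders):

# send a single test event to the Avro source listening on port 8080
echo "hello hbase sink" > /tmp/test-event.txt
bin/flume-ng avro-client --conf conf -H localhost -p 8080 -F /tmp/test-event.txt

# then see whether a new row shows up in the target table
echo "scan 'foo_table'" | hbase shell

If the logger sink prints the event but the scan stays empty, the problem is most likely on the sink/serializer side rather than the source.
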
Brock Noland 2013-01-09, 05:10
Xu 2013-01-09, 14:36