Flume user mailing list: flume-ng error while writing to hdfs


Re: flume-ng error while writing to hdfs
Mohammad Tariq 2012-08-16, 20:55
Hello Sandeep,

 Sorry for the late reply. Just make sure that Hadoop and Flume are both
running under the same user, the user you have specified as the value of
"dfs.web.ugi", and that this user has the proper privileges.

Regards,
    Mohammad Tariq

On Fri, Aug 17, 2012 at 1:27 AM, Sandeep Reddy P <
[EMAIL PROTECTED]> wrote:

> Hi,
> Using Flume, I'm unable to write to HDFS. Hadoop itself is working fine.
>
>
> On Thu, Aug 16, 2012 at 3:35 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>
>> Are you able to write through the HDFS shell?
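>>
>> For example, a quick test (the /flume path below is only a placeholder;
>> use whatever directory your HDFS sink is configured to write to):
>>
>>     hadoop fs -ls /flume
>>     hadoop fs -put /etc/hosts /flume/test.txt
>>     hadoop fs -rm /flume/test.txt
>>
>> If these commands fail with a permission error, the problem is on the
>> HDFS side rather than in Flume.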
>>
>> On Friday, August 17, 2012, Sandeep Reddy P <[EMAIL PROTECTED]>
>> wrote:
>> > Hi,
>> > Thanks for the reply. I added the property, but Flume still cannot write a log
>> > file from Linux to HDFS.
>> >
>> >
>> > On Thu, Aug 16, 2012 at 2:25 PM, Mohammad Tariq <[EMAIL PROTECTED]>
>> wrote:
>> >>
>> >> Hello Sandeep,
>> >>    Please edit the "dfs.web.ugi" property in hdfs-site.xml. Its default
>> >> value is "webuser,webgroup".
>> >> Regards,
>> >>     Mohammad Tariq
>> >>
>> >>
>> >> On Thu, Aug 16, 2012 at 11:50 PM, Sandeep Reddy P <
>> [EMAIL PROTECTED]> wrote:
>> >>>
>> >>> Hi,
>> >>> I'm using flume-ng to write data from a log file to HDFS, but it is unable
>> >>> to write. I get the following exception in the NameNode logs:
>> >>>
>> >>> 2012-08-16 13:57:42,560 INFO org.apache.hadoop.hdfs.server.namenode.FSNamesystem: Number of transactions: 1 Total time for transactions(ms): 0Number of transactions batched in Syncs: 0 Number of syncs: 0 SyncTimes(ms): 0 0 0
>> >>> 2012-08-16 13:57:59,293 WARN org.apache.hadoop.security.ShellBasedUnixGroupsMapping: got exception trying to get groups for user webuser
>> >>> org.apache.hadoop.util.Shell$ExitCodeException: id: webuser: No such user
>> >>>
>> >>>         at org.apache.hadoop.util.Shell.runCommand(Shell.java:255)
>> >>>         at org.apache.hadoop.util.Shell.run(Shell.java:182)
>> >>>         at org.apache.hadoop.util.Shell$ShellCommandExecutor.execute(Shell.java:375)
>> >>>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:461)
>> >>>         at org.apache.hadoop.util.Shell.execCommand(Shell.java:444)
>> >>>         at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getUnixGroups(ShellBasedUnixGroupsMapping.java:66)
>> >>>         at org.apache.hadoop.security.ShellBasedUnixGroupsMapping.getGroups(ShellBasedUnixGroupsMapping.java:43)
>> >>>         at org.apache.hadoop.security.Groups.getGroups(Groups.java:79)
>> >>>         at org.apache.hadoop.security.UserGroupInformation.getGroupNames(UserGroupInformation.java:1083)
>> >>>         at org.apache.hadoop.hdfs.server.namenode.FSPermissionChecker.<init>(FSPermissionChecker.java:50)
>> >>>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkPermission(FSNamesystem.java:5203)
>> >>>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.checkTraverse(FSNamesystem.java:5186)
>> >>>         at org.apache.hadoop.hdfs.server.namenode.FSNamesystem.getFileInfo(FSNamesystem.java:1994)
>> >>>         at org.apache.hadoop.hdfs.server.namenode.NameNode.getFileInfo(NameNode.java:819)
>> >>>         at sun.reflect.GeneratedMethodAccessor4.invoke(Unknown Source)
>> >>>         at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:25)
>> >>>         at java.lang.reflect.Method.invoke(Method.java:597)
>> >>>         at org.apache.hadoop.ipc.RPC$Server.call(RPC.java:557)
>> >>>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1434)
>> >>>         at org.apache.hadoop.ipc.Server$Handler$1.run(Server.java:1430)
>> >>>         at java.security.AccessController.doPrivileged(Native Method)
>> >>>         at javax.security.auth.Subject.doAs(Subject.java:396)
>> >>>         at org.apache.hadoop.security.UserGroupInformation.doAs(UserGroupInformation.java:1177)
>> >>>         at org.apache.hadoop.ipc.Server$Handler.run(Server.java:1428)
>> >>>
>> >>>
>> >>> --
>> >>> Thanks,
>> >>> sandeep