Flume, mail # user - Automatically upload files into HDFS


Re: Automatically upload files into HDFS
Mohammad Tariq 2012-11-19, 14:53
It would be good if I could have a look at the files. In the meantime, try
some other directories. Also, check the directory permissions once.

Regards,
    Mohammad Tariq

On Mon, Nov 19, 2012 at 8:13 PM, kashif khan <[EMAIL PROTECTED]> wrote:

>
> I have tried it as the root user and made the following changes:
>
>
> Path inputFile = new Path("/usr/Eclipse/Output.csv");
> Path outputFile = new Path("/user/root/Output1.csv");
>
> Still no result. The following is the log output; it shows the destination
> as null.
>
>
> 2012-11-19 14:36:38,960 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=getfileinfo src=/user dst=null perm=null
> 2012-11-19 14:36:38,977 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/user dst=null perm=null
> 2012-11-19 14:36:39,933 INFO FSNamesystem.audit: allowed=true ugi=hbase (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/hbase/.oldlogs dst=null perm=null
> 2012-11-19 14:36:41,147 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=getfileinfo src=/user/root dst=null perm=null
> 2012-11-19 14:36:41,229 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/user/root dst=null perm=null
>
>
> Thanks
>
>
>
>
>
>
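One thing the audit lines above suggest (an observation, not a confirmed diagnosis): ugi=dr.who is the default identity Hadoop assigns to requests arriving through the web interface, so those entries come from browsing the UI rather than from the Java client, and dst=null is normal for read-only commands such as getfileinfo and listStatus. If the client should act as a specific HDFS user regardless of the OS login, one option is the three-argument form of FileSystem.get. A minimal sketch; the NameNode URI and the user name here are assumptions that should be checked against fs.defaultFS in core-site.xml:

```java
import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyAsUser {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Assumed NameNode address -- verify against fs.defaultFS in core-site.xml
        URI nameNode = URI.create("hdfs://hadoop1.example.com:8020");
        // The third argument makes all requests on behalf of this user,
        // independent of the OS account running the JVM
        FileSystem fs = FileSystem.get(nameNode, conf, "root");
        fs.copyFromLocalFile(new Path("/usr/Eclipse/Output.csv"),
                             new Path("/user/root/Output1.csv"));
        fs.close();
    }
}
```

With this form the audit log should then show ugi=root for the copy, which makes it easier to tell client traffic apart from web-UI browsing.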
> On Mon, Nov 19, 2012 at 2:29 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>
>> Yeah, my cluster is running. When I browse
>> http://hadoop1.example.com:50070/dfshealth.jsp I get the main page. Then I
>> click on Browse the filesystem and get the following:
>>
>> hbase
>> tmp
>> user
>>
>> And when I click on user I get:
>>
>> beeswax
>> huuser (which I have created)
>> root (which I have created)
>>
>> Would you like to see my configuration files? I did not change anything;
>> everything is at the defaults. I have installed CDH4.1, running on VMs.
>>
>> Many thanks
>>
>>
>>
>>
>>
>> On Mon, Nov 19, 2012 at 2:04 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>
>>> Is your cluster running fine? Are you able to browse HDFS through the
>>> HDFS web console at 50070?
>>>
>>> Regards,
>>>     Mohammad Tariq
>>>
>>>
>>>
>>> On Mon, Nov 19, 2012 at 7:31 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>>
>>>> Many thanks.
>>>>
>>>> I have changed the program accordingly. It does not show any error, only
>>>> one warning, but when I browse the HDFS folder the file has not been
>>>> copied.
>>>>
>>>>
>>>> import java.io.IOException;
>>>>
>>>> import org.apache.hadoop.conf.Configuration;
>>>> import org.apache.hadoop.fs.FileSystem;
>>>> import org.apache.hadoop.fs.Path;
>>>>
>>>> public class CopyData {
>>>>     public static void main(String[] args) throws IOException {
>>>>         Configuration conf = new Configuration();
>>>>         // Load the cluster config so FileSystem.get() talks to HDFS,
>>>>         // not the local filesystem
>>>>         conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
>>>>         conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
>>>>         FileSystem fs = FileSystem.get(conf);
>>>>         Path inputFile = new Path("/usr/Eclipse/Output.csv");
>>>>         Path outputFile = new Path("/user/hduser/Output1.csv");
>>>>         fs.copyFromLocalFile(inputFile, outputFile);
>>>>         fs.close();
>>>>     }
>>>> }
>>>>
>>>> 19-Nov-2012 13:50:32 org.apache.hadoop.util.NativeCodeLoader <clinit>
>>>> WARNING: Unable to load native-hadoop library for your platform...
>>>> using builtin-java classes where applicable
>>>>
>>>> Have any idea?
>>>>
>>>> Many thanks
>>>>
>>>>
>>>>
>>>>
>>>>
>>>>
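The NativeCodeLoader warning above is harmless: Hadoop simply falls back to its pure-Java implementations. A copy that reports no error yet leaves nothing visible in HDFS is often a sign that the configuration files were not actually picked up, in which case FileSystem.get() silently returns the local filesystem and writes /user/hduser/Output1.csv onto the local disk instead. A sketch of a version that surfaces both failure modes; the paths are the ones from the mail above, and the printed-URI check is a suggested diagnostic, not something from the thread:

```java
import java.io.File;
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyDataChecked {
    public static void main(String[] args) throws IOException {
        // Fail fast if the local source is not where we expect it
        File local = new File("/usr/Eclipse/Output.csv");
        if (!local.exists()) {
            System.err.println("Local source missing: " + local);
            return;
        }
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);
        // If this prints file:/// instead of hdfs://..., the config
        // resources were not found and the copy targets the local disk
        System.out.println("Writing to: " + fs.getUri());
        Path dst = new Path("/user/hduser/Output1.csv");
        fs.copyFromLocalFile(new Path(local.getPath()), dst);
        if (!fs.exists(dst)) {
            System.err.println("Copy finished but " + dst + " does not exist");
        }
        fs.close();
    }
}
```

Printing fs.getUri() is the quickest way to tell which filesystem the client actually bound to, and checking fs.exists() on the destination confirms the copy really landed.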
>>>> On Mon, Nov 19, 2012 at 1:18 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>>>
>>>>> If it is just copying the files without any processing or change, you
>>>>> can use something like this:
>>>>>
>>>>> public class CopyData {
>>>>>
>>>>>     public static void main(String[] args) throws IOException{
>>>>>
>>>>>         Configuration configuration = new Configuration();
>>>>>         configuration.addResource(new Path("/home/mohammad/hadoop-0.20.205/conf/core-site.xml"));