Re: Automatically upload files into HDFS
It should work. The same code is working fine for me. Try creating some other
directory in your HDFS and using it as your output path. Also, see whether you find
anything in the datanode logs.

Regards,
    Mohammad Tariq
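
The symptom described in the quoted thread below, the file landing under
/user/root/ on the local disk rather than in HDFS, usually means that
FileSystem.get(conf) silently returned a LocalFileSystem because no
core-site.xml was on the program's classpath. A minimal diagnostic sketch
that makes this visible (the class name is hypothetical):

import java.io.IOException;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class WhichFs {
    public static void main(String[] args) throws IOException {
        // With no core-site.xml on the classpath this prints
        // LocalFileSystem, which would explain files turning up under
        // /user/root/ on the local disk instead of in HDFS.
        FileSystem fs = FileSystem.get(new Configuration());
        System.out.println(fs.getClass().getName());
    }
}

If it prints LocalFileSystem, either put the cluster's configuration
directory on the classpath or set fs.defaultFS explicitly, as in the sketch
at the end of the thread.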

On Mon, Nov 19, 2012 at 9:04 PM, kashif khan <[EMAIL PROTECTED]> wrote:

> The input path is fine. The problem is in the output path. I am just wondering
> why it copies the data onto the local disk (/user/root/) and not into HDFS. I don't
> know why. Have we given the correct statement to point to HDFS?
>
> Thanks
>
>
>
> On Mon, Nov 19, 2012 at 3:10 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>
>> Try this as your input file path
>> Path inputFile = new Path("file:///usr/Eclipse/Output.csv");
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>>
>>> On Mon, Nov 19, 2012 at 8:31 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>
>>> When I apply the command
>>>
>>> $ hadoop fs -put /usr/Eclipse/Output.csv /user/root/Output.csv
>>>
>>> it works fine and the file is browsable in HDFS. But I don't know why it does not
>>> work in the program.
>>>
>>> Many thanks for your cooperation.
>>>
>>> Best regards,
>>>
>>>
>>>
>>>
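
The difference between the two cases: the hadoop CLI reads fs.defaultFS from
the cluster's configuration files, while a standalone Java program only sees
what is on its own classpath. One way to take the ambiguity out of the CLI
test is to spell out the full HDFS URI; the port 8020 below is an assumption
(the CDH4 default NameNode RPC port):

$ hadoop fs -put /usr/Eclipse/Output.csv hdfs://hadoop1.example.com:8020/user/root/Output.csv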
>>>> On Mon, Nov 19, 2012 at 2:53 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>>
>>>> It would be good if I could have a look at the files. Meanwhile, try some
>>>> other directories. Also, check the directory permissions once.
>>>>
>>>> Regards,
>>>>     Mohammad Tariq
>>>>
>>>>
>>>>
>>>>> On Mon, Nov 19, 2012 at 8:13 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>>>
>>>>>
>>>>> I have tried it as the root user and made the following changes:
>>>>>
>>>>>
>>>>> Path inputFile = new Path("/usr/Eclipse/Output.csv");
>>>>> Path outputFile = new Path("/user/root/Output1.csv");
>>>>>
>>>>> No result. The following is the log output; it shows the
>>>>> destination as null.
>>>>>
>>>>>
>>>>> 2012-11-19 14:36:38,960 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=getfileinfo src=/user dst=null perm=null
>>>>> 2012-11-19 14:36:38,977 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/user dst=null perm=null
>>>>> 2012-11-19 14:36:39,933 INFO FSNamesystem.audit: allowed=true ugi=hbase (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/hbase/.oldlogs dst=null perm=null
>>>>> 2012-11-19 14:36:41,147 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=getfileinfo src=/user/root dst=null perm=null
>>>>> 2012-11-19 14:36:41,229 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/user/root dst=null perm=null
>>>>>
>>>>>
>>>>> Thanks
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>>
>>>>> On Mon, Nov 19, 2012 at 2:29 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>>> Yeah, my cluster is running. When I browse
>>>>>> http://hadoop1.example.com:50070/dfshealth.jsp, I get the main page. Then
>>>>>> when I click on "Browse the filesystem", I see the following:
>>>>>>
>>>>>> hbase
>>>>>> tmp
>>>>>> user
>>>>>>
>>>>>> And when I click on user, I get:
>>>>>>
>>>>>> beeswax
>>>>>> huuser (I have created)
>>>>>> root (I have created)
>>>>>>
>>>>>> Would you like to see my configuration files? I did not change anything;
>>>>>> everything is at its defaults. I have installed CDH4.1, running on VMs.
>>>>>>
>>>>>> Many thanks
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>>
>>>>>> On Mon, Nov 19, 2012 at 2:04 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>>>>>
>>>>>>> Is your cluster running fine? Are you able to browse HDFS through
>>>>>>> the HDFS web console at port 50070?
>>>>>>>
>>>>>>> Regards,
>>>>>>>     Mohammad Tariq
>>>>>>>
>>>>>>>
>>>>>>>
>>>>>>> On Mon, Nov 19, 2012 at 7:31 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>>>>>>
>>>>>>>> Many thanks.
>>>>>>>>
>>>>>>>> I have changed the program accordingly. It does not show any error,
>>>>>>>> just one warning, but when I browse the HDFS folder, the file is not
>>>>>>>> copied.
>>>>>>>>
>>>>>>>>
>>>>>>>> public class CopyData {
>>>>>>>>     public static void main(String[] args) throws IOException {
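
The archive truncates the snippet at this point. A minimal sketch of how the
rest of the class might look, using the paths discussed earlier in the
thread; the explicit fs.defaultFS setting is an assumption, needed only when
core-site.xml is not on the program's classpath, and the port 8020 is CDH4's
default:

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyData {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Assumption: point explicitly at the NameNode so the copy cannot
        // silently fall back to the local filesystem.
        conf.set("fs.defaultFS", "hdfs://hadoop1.example.com:8020");

        FileSystem fs = FileSystem.get(conf);
        Path inputFile = new Path("file:///usr/Eclipse/Output.csv");
        Path outputFile = new Path("/user/root/Output1.csv");
        fs.copyFromLocalFile(inputFile, outputFile);
        fs.close();
    }
}

With fs.defaultFS left unset and no cluster configuration on the classpath,
the same copyFromLocalFile call resolves /user/root/Output1.csv against the
local filesystem, which matches the behaviour reported above.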