Flume user mailing list: Automatically upload files into HDFS


kashif khan 2012-11-19, 10:44
Mohammad Tariq 2012-11-19, 10:50
kashif khan 2012-11-19, 12:30
Mohammad Tariq 2012-11-19, 12:34
Mohammad Tariq 2012-11-19, 12:35
Alexander Alten-Lorenz 2012-11-19, 12:26
kashif khan 2012-11-19, 12:35
Mohammad Tariq 2012-11-19, 12:41
kashif khan 2012-11-19, 12:53
Mohammad Tariq 2012-11-19, 13:18
kashif khan 2012-11-19, 14:01
Mohammad Tariq 2012-11-19, 14:04
kashif khan 2012-11-19, 14:29
kashif khan 2012-11-19, 14:43
Mohammad Tariq 2012-11-19, 14:53
kashif khan 2012-11-19, 15:01
Re: Automatically upload files into HDFS
Try this as your input file path:
Path inputFile = new Path("file:///usr/Eclipse/Output.csv");
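
A minimal end-to-end sketch of the copy using that explicit file:// scheme
(assuming a CDH4-style install with its configuration under /etc/hadoop/conf,
as in the program quoted later in this thread; the class name is
illustrative):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyToHdfs {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        // Load the cluster settings so FileSystem.get() returns HDFS,
        // not the local filesystem.
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        FileSystem fs = FileSystem.get(conf);
        // file:// pins the source to the local filesystem; the destination
        // is resolved against fs.defaultFS, i.e. HDFS.
        Path src = new Path("file:///usr/Eclipse/Output.csv");
        Path dst = new Path("/user/root/Output.csv");
        fs.copyFromLocalFile(src, dst);
        fs.close();
    }
}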

Regards,
    Mohammad Tariq

On Mon, Nov 19, 2012 at 8:31 PM, kashif khan <[EMAIL PROTECTED]> wrote:

> When I apply the command
>
> $ hadoop fs -put /usr/Eclipse/Output.csv /user/root/Output.csv
>
> it works fine and the file shows up when browsing HDFS. But I don't know
> why it does not work in the program.
>
> Many thanks for your cooperation.
>
> Best regards,
>
>
>
>
> On Mon, Nov 19, 2012 at 2:53 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>
>> It would be good if I could have a look at the files. In the meantime, try
>> some other directories. Also, check the directory permissions once.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>>
>> On Mon, Nov 19, 2012 at 8:13 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>
>>>
>>> I have tried it as the root user and made the following changes:
>>>
>>>
>>> Path inputFile = new Path("/usr/Eclipse/Output.csv");
>>> Path outputFile = new Path("/user/root/Output1.csv");
>>>
>>> No result. The following is the log output. The log shows the
>>> destination is null.
>>>
>>>
>>> 2012-11-19 14:36:38,960 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=getfileinfo src=/user dst=null perm=null
>>> 2012-11-19 14:36:38,977 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/user dst=null perm=null
>>> 2012-11-19 14:36:39,933 INFO FSNamesystem.audit: allowed=true ugi=hbase (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/hbase/.oldlogs dst=null perm=null
>>> 2012-11-19 14:36:41,147 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=getfileinfo src=/user/root dst=null perm=null
>>> 2012-11-19 14:36:41,229 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/user/root dst=null perm=null
>>>
>>>
>>> Thanks
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Mon, Nov 19, 2012 at 2:29 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>>
>>>> Yeah, my cluster is running. When I browse
>>>> http://hadoop1.example.com:50070/dfshealth.jsp I get the main page. Then
>>>> I click on "Browse the filesystem" and get the following:
>>>>
>>>> hbase
>>>> tmp
>>>> user
>>>>
>>>> And when I click on user I get:
>>>>
>>>> beeswax
>>>> huuser (I have created)
>>>> root (I have created)
>>>>
>>>> Would you like to see my configuration files? I did not change anything;
>>>> it is all at the defaults. I have installed CDH4.1, running on VMs.
>>>>
>>>> Many thanks
>>>>
>>>>
>>>>
>>>>
>>>>
>>>> On Mon, Nov 19, 2012 at 2:04 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>>>
>>>>> Is your cluster running fine? Are you able to browse HDFS through the
>>>>> web console at 50070?
>>>>>
>>>>> Regards,
>>>>>     Mohammad Tariq
>>>>>
>>>>>
>>>>>
>>>>> On Mon, Nov 19, 2012 at 7:31 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>>>>
>>>>>> Many thanks.
>>>>>>
>>>>>> I have changed the program accordingly. It does not show any error,
>>>>>> only one warning, but when I browse the HDFS folder the file has not
>>>>>> been copied.
>>>>>>
>>>>>>
>>>>>> import java.io.IOException;
>>>>>>
>>>>>> import org.apache.hadoop.conf.Configuration;
>>>>>> import org.apache.hadoop.fs.FileSystem;
>>>>>> import org.apache.hadoop.fs.Path;
>>>>>>
>>>>>> public class CopyData {
>>>>>>     public static void main(String[] args) throws IOException {
>>>>>>         Configuration conf = new Configuration();
>>>>>>         // Pick up the cluster settings (CDH4 keeps them here).
>>>>>>         conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
>>>>>>         conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
>>>>>>
>>>>>>         FileSystem fs = FileSystem.get(conf);
>>>>>>         Path inputFile = new Path("/usr/Eclipse/Output.csv");
>>>>>>         Path outputFile = new Path("/user/hduser/Output1.csv");
>>>>>>         fs.copyFromLocalFile(inputFile, outputFile);
>>>>>>         fs.close();
>>>>>>     }
>>>>>> }
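
As an aside, the audit entries quoted above show ugi=dr.who, which is the
default identity Hadoop assigns to unauthenticated web-UI requests, so those
lines most likely come from browsing the console rather than from the
program. For the program itself, a quick diagnostic for this kind of silent
failure is to print which filesystem was actually obtained: if it reports
file:/// rather than an hdfs:// URI, the configuration files were not picked
up and the copy went to the local disk instead of HDFS, which would also
explain why hadoop fs -put succeeds while the program appears to do nothing.
A hedged sketch (class name illustrative, config paths as above):

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class WhichFs {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);
        // file:/// here means the program fell back to the local
        // filesystem, and any "copy to HDFS" landed on the local disk.
        System.out.println("Default filesystem: " + fs.getUri());
        fs.close();
    }
}

If it does print file:///, double-check that the /etc/hadoop/conf paths exist
on the machine running the program, or put that directory on its classpath.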
kashif khan 2012-11-19, 15:34
Mohammad Tariq 2012-11-19, 15:41
kashif khan 2012-11-20, 10:40
Mohammad Tariq 2012-11-20, 14:19
kashif khan 2012-11-20, 14:27
Mohammad Tariq 2012-11-20, 14:33
kashif khan 2012-11-20, 14:36
Mohammad Tariq 2012-11-20, 14:53
kashif khan 2012-11-20, 15:04
kashif khan 2012-11-20, 16:22
shekhar sharma 2012-11-20, 19:06
kashif khan 2012-11-21, 12:36
shekhar sharma 2012-11-26, 16:42
kashif khan 2012-11-27, 21:25