Flume, mail # user - Automatically upload files into HDFS


kashif khan 2012-11-19, 10:44
Mohammad Tariq 2012-11-19, 10:50
kashif khan 2012-11-19, 12:30
Mohammad Tariq 2012-11-19, 12:34
Mohammad Tariq 2012-11-19, 12:35
Alexander Alten-Lorenz 2012-11-19, 12:26
kashif khan 2012-11-19, 12:35
Mohammad Tariq 2012-11-19, 12:41
kashif khan 2012-11-19, 12:53
Mohammad Tariq 2012-11-19, 13:18
kashif khan 2012-11-19, 14:01
Mohammad Tariq 2012-11-19, 14:04
kashif khan 2012-11-19, 14:29
Re: Automatically upload files into HDFS
kashif khan 2012-11-19, 14:43
I have tried it as the root user and made the following changes:

Path inputFile = new Path("/usr/Eclipse/Output.csv");
Path outputFile = new Path("/user/root/Output1.csv");

No result. The following is the log output; it shows the destination as null.
2012-11-19 14:36:38,960 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=getfileinfo src=/user dst=null perm=null
2012-11-19 14:36:38,977 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/user dst=null perm=null
2012-11-19 14:36:39,933 INFO FSNamesystem.audit: allowed=true ugi=hbase (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/hbase/.oldlogs dst=null perm=null
2012-11-19 14:36:41,147 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=getfileinfo src=/user/root dst=null perm=null
2012-11-19 14:36:41,229 INFO FSNamesystem.audit: allowed=true ugi=dr.who (auth:SIMPLE) ip=/134.91.36.41 cmd=listStatus src=/user/root dst=null perm=null
Thanks
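For what it's worth, the ugi=dr.who entries above come from browsing the web UI (dr.who is Hadoop's default static web user), and dst=null is normal for read-only operations like getfileinfo and listStatus, so these lines by themselves do not prove the copy failed. A more direct check is to ask the FileSystem object which filesystem it actually resolved to and whether the target exists. A minimal sketch, assuming the Hadoop client jars are on the classpath and the same CDH config paths as in the code below:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CheckCopy {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));

        FileSystem fs = FileSystem.get(conf);
        // If the XML resources were not found, this prints file:/// and the
        // "copy" silently lands on the local disk instead of HDFS.
        System.out.println("Filesystem in use: " + fs.getUri());
        System.out.println("Target exists: "
                + fs.exists(new Path("/user/root/Output1.csv")));
        fs.close();
    }
}
```

If the URI comes back as file:/// rather than hdfs://..., the configuration files were not picked up and the destination path is being created on the local filesystem, not in HDFS.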

On Mon, Nov 19, 2012 at 2:29 PM, kashif khan <[EMAIL PROTECTED]> wrote:

> Yeah, my cluster is running. When I browse
> http://hadoop1.example.com:50070/dfshealth.jsp I get the main page. Then I
> click on "Browse the filesystem" and get the following:
>
> hbase
> tmp
> user
>
> And when I click on user I get:
>
> beeswax
> huuser (I have created)
> root (I have created)
>
> Would you like to see my configuration files? I did not change anything;
> everything is at the defaults. I have installed CDH4.1 and it is running on
> VMs.
>
> Many thanks
>
>
>
>
>
> On Mon, Nov 19, 2012 at 2:04 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>
>> Is your cluster running fine? Are you able to browse HDFS through the
>> HDFS web console at port 50070?
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>>
>> On Mon, Nov 19, 2012 at 7:31 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>
>>> Many thanks.
>>>
>>> I have changed the program accordingly. It does not show any error, just
>>> one warning, but when I browse the HDFS folder the file has not been
>>> copied.
>>>
>>>
>>> public class CopyData {
>>>     public static void main(String[] args) throws IOException {
>>>         Configuration conf = new Configuration();
>>>         // Configuration configuration = new Configuration();
>>>         // configuration.addResource(new Path("/home/mohammad/hadoop-0.20.205/conf/core-site.xml"));
>>>         // configuration.addResource(new Path("/home/mohammad/hadoop-0.20.205/conf/hdfs-site.xml"));
>>>
>>>         conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
>>>         conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
>>>         FileSystem fs = FileSystem.get(conf);
>>>         Path inputFile = new Path("/usr/Eclipse/Output.csv");
>>>         Path outputFile = new Path("/user/hduser/Output1.csv");
>>>         fs.copyFromLocalFile(inputFile, outputFile);
>>>         fs.close();
>>>     }
>>> }
>>>
>>> 19-Nov-2012 13:50:32 org.apache.hadoop.util.NativeCodeLoader <clinit>
>>> WARNING: Unable to load native-hadoop library for your platform... using
>>> builtin-java classes where applicable
>>>
>>> Have any idea?
>>>
>>> Many thanks
>>>
>>>
>>>
>>>
>>>
>>>
>>> On Mon, Nov 19, 2012 at 1:18 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>>
>>>> If it is just copying the files without any processing or change, you
>>>> can use something like this :
>>>>
>>>> public class CopyData {
>>>>
>>>>     public static void main(String[] args) throws IOException {
>>>>
>>>>         Configuration configuration = new Configuration();
>>>>         configuration.addResource(new Path("/home/mohammad/hadoop-0.20.205/conf/core-site.xml"));
>>>>         configuration.addResource(new Path("/home/mohammad/hadoop-0.20.205/conf/hdfs-site.xml"));
>>>>         FileSystem fs = FileSystem.get(configuration);
>>>>         Path inputFile = new Path("/home/mohammad/pc/work/FFT.java");
>>>>         Path outputFile = new Path("/mapout/FFT.java");
>>>>         fs.copyFromLocalFile(inputFile, outputFile);
>>>>         fs.close();
>>>>     }
>>>> }
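As a side note, FileSystem also offers a four-argument overload, copyFromLocalFile(boolean delSrc, boolean overwrite, Path src, Path dst), which helps when rerunning the program: with overwrite set to true an existing Output1.csv is replaced instead of causing a failure. A minimal sketch, assuming the same CDH config paths as in the code earlier in the thread:

```java
import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class CopyWithOverwrite {
    public static void main(String[] args) throws IOException {
        Configuration conf = new Configuration();
        conf.addResource(new Path("/etc/hadoop/conf/core-site.xml"));
        conf.addResource(new Path("/etc/hadoop/conf/hdfs-site.xml"));
        FileSystem fs = FileSystem.get(conf);
        // delSrc = false: keep the local source file.
        // overwrite = true: replace the destination if it already exists.
        fs.copyFromLocalFile(false, true,
                new Path("/usr/Eclipse/Output.csv"),
                new Path("/user/root/Output1.csv"));
        fs.close();
    }
}
```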
Mohammad Tariq 2012-11-19, 14:53
kashif khan 2012-11-19, 15:01
Mohammad Tariq 2012-11-19, 15:10
kashif khan 2012-11-19, 15:34
Mohammad Tariq 2012-11-19, 15:41
kashif khan 2012-11-20, 10:40
Mohammad Tariq 2012-11-20, 14:19
kashif khan 2012-11-20, 14:27
Mohammad Tariq 2012-11-20, 14:33
kashif khan 2012-11-20, 14:36
Mohammad Tariq 2012-11-20, 14:53
kashif khan 2012-11-20, 15:04
kashif khan 2012-11-20, 16:22
shekhar sharma 2012-11-20, 19:06
kashif khan 2012-11-21, 12:36
shekhar sharma 2012-11-26, 16:42
kashif khan 2012-11-27, 21:25