Flume user mailing list: Automatically upload files into HDFS


kashif khan 2012-11-19, 10:44
Mohammad Tariq 2012-11-19, 10:50
kashif khan 2012-11-19, 12:30
Mohammad Tariq 2012-11-19, 12:34
Mohammad Tariq 2012-11-19, 12:35
Alexander Alten-Lorenz 2012-11-19, 12:26
kashif khan 2012-11-19, 12:35
Mohammad Tariq 2012-11-19, 12:41
kashif khan 2012-11-19, 12:53
Mohammad Tariq 2012-11-19, 13:18
kashif khan 2012-11-19, 14:01
Mohammad Tariq 2012-11-19, 14:04
kashif khan 2012-11-19, 14:29
kashif khan 2012-11-19, 14:43
Mohammad Tariq 2012-11-19, 14:53
kashif khan 2012-11-19, 15:01
Mohammad Tariq 2012-11-19, 15:10
kashif khan 2012-11-19, 15:34
Mohammad Tariq 2012-11-19, 15:41
kashif khan 2012-11-20, 10:40
Mohammad Tariq 2012-11-20, 14:19
kashif khan 2012-11-20, 14:27
Mohammad Tariq 2012-11-20, 14:33
kashif khan 2012-11-20, 14:36
Re: Automatically upload files into HDFS
You can download the jar here:
http://search.maven.org/remotecontent?filepath=com/google/guava/guava/13.0.1/guava-13.0.1.jar

Regards,
    Mohammad Tariq

On Tue, Nov 20, 2012 at 8:06 PM, kashif khan <[EMAIL PROTECTED]> wrote:

> Could you please let me know the name of the jar file and its location?
>
> Many thanks
>
> Best regards
>
>
> On Tue, Nov 20, 2012 at 2:33 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>
>> Download the required jar and include it in your project.
>>
>> Regards,
>>     Mohammad Tariq
>>
>>
>>
>> On Tue, Nov 20, 2012 at 7:57 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>
>>> Dear Tariq, thanks.
>>>
>>> I have added the jar files from CDH, downloaded the CDH4 Eclipse plugin,
>>> and copied it into the Eclipse plugins folder. I think the previous error
>>> is sorted out, but now I am getting another strange error:
>>>
>>> Exception in thread "main" java.lang.NoClassDefFoundError:
>>> com/google/common/collect/Maps
>>>     at
>>> org.apache.hadoop.metrics2.lib.MetricsRegistry.<init>(MetricsRegistry.java:42)
>>>     at
>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:87)
>>>     at
>>> org.apache.hadoop.metrics2.impl.MetricsSystemImpl.<init>(MetricsSystemImpl.java:133)
>>>     at
>>> org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<init>(DefaultMetricsSystem.java:38)
>>>     at
>>> org.apache.hadoop.metrics2.lib.DefaultMetricsSystem.<clinit>(DefaultMetricsSystem.java:36)
>>>     at
>>> org.apache.hadoop.security.UserGroupInformation$UgiMetrics.create(UserGroupInformation.java:97)
>>>     at
>>> org.apache.hadoop.security.UserGroupInformation.<clinit>(UserGroupInformation.java:190)
>>>     at
>>> org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2373)
>>>     at
>>> org.apache.hadoop.fs.FileSystem$Cache$Key.<init>(FileSystem.java:2365)
>>>     at org.apache.hadoop.fs.FileSystem$Cache.get(FileSystem.java:2233)
>>>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:300)
>>>     at org.apache.hadoop.fs.FileSystem.get(FileSystem.java:156)
>>>     at CopyFile.main(CopyFile.java:14)
>>> Caused by: java.lang.ClassNotFoundException:
>>> com.google.common.collect.Maps
>>>     at java.net.URLClassLoader$1.run(URLClassLoader.java:217)
>>>     at java.security.AccessController.doPrivileged(Native Method)
>>>     at java.net.URLClassLoader.findClass(URLClassLoader.java:205)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:321)
>>>     at sun.misc.Launcher$AppClassLoader.loadClass(Launcher.java:294)
>>>     at java.lang.ClassLoader.loadClass(ClassLoader.java:266)
>>>     ... 13 more
>>>
>>> Do you have any idea about this error?
>>>
>>> Many thanks
>>>
>>>
>>>
>>>
>>>
>>> On Tue, Nov 20, 2012 at 2:19 PM, Mohammad Tariq <[EMAIL PROTECTED]> wrote:
>>>
>>>> Hello Kashif,
>>>>
>>>>      You are correct. This is because of a version mismatch. I am not
>>>> using CDH personally, but AFAIK CDH4 uses Hadoop 2.x.
>>>>
>>>> Regards,
>>>>     Mohammad Tariq
>>>>
>>>>
>>>>
>>>> On Tue, Nov 20, 2012 at 4:10 PM, kashif khan <[EMAIL PROTECTED]> wrote:
>>>>
>>>>> Hi M Tariq,
>>>>>
>>>>>
>>>>> I am trying the following program to create a directory and copy a
>>>>> file to HDFS, but I am getting the following errors.
>>>>>
>>>>>
>>>>>
>>>>> Program:
>>>>>
>>>>> import org.apache.hadoop.conf.Configuration;
>>>>> import org.apache.hadoop.fs.FileSystem;
>>>>> import org.apache.hadoop.fs.Path;
>>>>> import java.io.IOException;
>>>>>
>>>>> public class CopyFile {
>>>>>
>>>>>
>>>>>         public static void main(String[] args) throws IOException{
>>>>>         Configuration conf = new Configuration();
>>>>>          conf.set("fs.default.name", "hdfs://hadoop1.example.com:8020");
>>>>>         FileSystem dfs = FileSystem.get(conf);
>>>>>         String dirName = "Test1";
>>>>>         Path src = new Path(dfs.getWorkingDirectory() + "/" + dirName);
>>>>>         dfs.mkdirs(src);
>>>>>         Path src1 = new Path("/usr/Eclipse/Output.csv");
>>>>>         Path dst = new Path(dfs.getWorkingDirectory() + "/Test1/");
>>>>>         // copy the local file into the newly created HDFS directory
>>>>>         dfs.copyFromLocalFile(src1, dst);
>>>>>         }
>>>>> }
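As an aside on the NoClassDefFoundError quoted above: the quickest way to confirm that Guava is missing from the runtime classpath is a standalone probe like the sketch below. The class name GuavaCheck and the printed messages are illustrative, not from the thread, and no Hadoop jars are needed to run it.

```java
// Probes whether Guava's com.google.common.collect.Maps is visible to
// the current JVM. A NoClassDefFoundError for this class at runtime
// means the Guava jar was not on the classpath when the program ran.
public class GuavaCheck {

    static boolean guavaOnClasspath() {
        try {
            // Class.forName succeeds only if the class can be loaded
            // from the current classpath.
            Class.forName("com.google.common.collect.Maps");
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }

    public static void main(String[] args) {
        if (guavaOnClasspath()) {
            System.out.println("Guava is on the classpath");
        } else {
            System.out.println("Guava is missing; add guava-13.0.1.jar to the classpath");
        }
    }
}
```

Running it as `java -cp .:guava-13.0.1.jar GuavaCheck` after downloading the jar should report the class as present; running it without the jar reports it missing, which matches the stack trace in the thread.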
kashif khan 2012-11-20, 15:04
kashif khan 2012-11-20, 16:22
shekhar sharma 2012-11-20, 19:06
kashif khan 2012-11-21, 12:36
shekhar sharma 2012-11-26, 16:42
kashif khan 2012-11-27, 21:25