MapReduce >> mail # user >> Re: Integrating hadoop with java UI application deployed on tomcat


Re: Integrating hadoop with java UI application deployed on tomcat
Do I have to change some Tomcat configuration settings?

On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak <[EMAIL PROTECTED]> wrote:

> but the problem is that my code gets executed with the warning, but the file
> is not copied to HDFS. Actually, I am trying to copy a file from local to
> HDFS:
>
>    Configuration hadoopConf = new Configuration();
>    // get the default associated file system
>    FileSystem fileSystem = FileSystem.get(hadoopConf);
>    // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>    // copy from local filesystem to HDFS
>    fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>            new Path("/user/TestDir/"));
>
>
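A likely cause of the silent failure above: without core-site.xml on the webapp classpath, `new Configuration()` resolves the default filesystem to the *local* disk, so `copyFromLocalFile` succeeds but writes to a local `/user/TestDir/` rather than HDFS. A minimal sketch of pinning the filesystem to the namenode explicitly; the URI `hdfs://localhost:9000` and the paths are placeholders, not values from this thread:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import java.net.URI;

public class HdfsUpload {
    public static void copyToHdfs(String localFile, String hdfsDir) throws Exception {
        Configuration conf = new Configuration();
        // With no core-site.xml on the classpath, FileSystem.get(conf)
        // returns the LocalFileSystem, which matches the symptom reported
        // above. Name the namenode explicitly (placeholder host/port):
        FileSystem fs = FileSystem.get(new URI("hdfs://localhost:9000"), conf);
        try {
            fs.copyFromLocalFile(new Path(localFile), new Path(hdfsDir));
        } finally {
            fs.close();
        }
    }
}
```

Alternatively, drop the cluster's core-site.xml into the webapp classpath (e.g. WEB-INF/classes) so the default filesystem resolves to HDFS.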
>  On Thu, Aug 30, 2012 at 9:57 PM, Steve Loughran <[EMAIL PROTECTED]> wrote:
>
>>
>>
>>  On 30 August 2012 13:54, Visioner Sadak <[EMAIL PROTECTED]> wrote:
>>
>>> Thanks a ton, guys, for your help. I used hadoop-core-1.0.3.jar and
>>> commons-lang-2.1.jar to get rid of the class-not-found error. Now I am
>>> getting this error; is this because I am using my app and Hadoop on Windows?
>>>
>>> util.NativeCodeLoader: Unable to load native-hadoop library for your
>>> platform... using builtin-java classes where applicable
>>>
>>
>> no, that's warning you that the native code to help with some operations
>> (especially compression) isn't loading, because your JVM's native library
>> path isn't set up right.
>>
>> Just edit log4j to hide that class's log messages.
>>
>> FWIW, I've downgraded some other messages that are over noisy, especially
>> if you bring up a MiniMR/MiniDFS cluster for test runs:
>>
>>  log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataStorage=WARN
>>  log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataXceiverServer=WARN
>>  log4j.logger.org.apache.hadoop.hdfs.server.datanode.DataBlockScanner=WARN
>>  log4j.logger.org.apache.hadoop.hdfs.server.datanode.FSDataset=FATAL
>>  log4j.logger.org.apache.hadoop.metrics2=FATAL
>>  log4j.logger.org.apache.hadoop.ipc.metrics.RpcInstrumentation=WARN
>>  log4j.logger.org.apache.hadoop.ipc.Server=WARN
>>  log4j.logger.org.apache.hadoop.metrics=FATAL
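To hide the NativeCodeLoader warning specifically, the same log4j.properties approach works. A one-line sketch, assuming the logger name matches the class that emits the warning:

```properties
# Hide the "Unable to load native-hadoop library" warning only;
# ERROR still lets real failures through.
log4j.logger.org.apache.hadoop.util.NativeCodeLoader=ERROR
```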
>>
>>
>>
>>>
>>>
>>>
>>>
>>> On Thu, Aug 30, 2012 at 5:22 PM, Steve Loughran <[EMAIL PROTECTED]> wrote:
>>>
>>>> you will need almost the entire hadoop client-side JAR set and
>>>> dependencies for this, I'm afraid.
>>>>
>>>> The new webhdfs filesystem (HDFS over HTTP) is designed to be lighter
>>>> weight and only needs an HTTP client, but I'm not aware of any ultra-thin
>>>> client yet (Apache HttpComponents should suffice).
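The webhdfs route mentioned above can be sketched with nothing but java.net URL strings. The host, port (50070 was the Hadoop 1.x namenode HTTP default), and user name below are placeholders, not values from this thread:

```java
// Sketch of the WebHDFS two-step file create (PUT ?op=CREATE).
public class WebHdfsClient {

    // Build the namenode-side CREATE URL for a given HDFS path.
    static String createUrl(String host, int port, String path, String user) {
        return "http://" + host + ":" + port + "/webhdfs/v1" + path
                + "?op=CREATE&user.name=" + user;
    }

    // Step 1: PUT to createUrl(...) with no body; the namenode replies
    // 307 Temporary Redirect with a Location header naming a datanode.
    // Step 2: PUT the file bytes to that datanode URL.
    // Any HTTP client (Apache HttpComponents, java.net.HttpURLConnection)
    // can drive both steps, with no Hadoop jars in the webapp.
}
```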
>>>>
>>>> If you are using any of the build tools with dependency management:
>>>> Ant+Ivy, Maven, Gradle, ask for Hadoop JARs and have the dependencies
>>>> pulled in.
>>>>
>>>> If you aren't using any of the build tools w/ dependency management,
>>>> now is the time.
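For the Maven route, a dependency block along these lines pulls in the client classes plus their transitive dependencies; the version shown matches the hadoop-core-1.0.3.jar mentioned earlier in the thread:

```xml
<!-- in pom.xml: Hadoop 1.x client-side classes and transitive deps -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <version>1.0.3</version>
</dependency>
```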
>>>>
>>>>
>>>> On 30 August 2012 09:32, Visioner Sadak <[EMAIL PROTECTED]> wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>>   I have a WAR deployed on a Tomcat server; the WAR contains
>>>>> some Java classes which upload files. Will I be able to upload directly
>>>>> into Hadoop? I am using the below code in one of my Java classes:
>>>>>
>>>>>    Configuration hadoopConf = new Configuration();
>>>>>    // get the default associated file system
>>>>>    FileSystem fileSystem = FileSystem.get(hadoopConf);
>>>>>    // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>>>>    // copy from local filesystem to HDFS
>>>>>    fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>>>>            new Path("/user/TestDir/"));
>>>>>
>>>>> but it's throwing this error:
>>>>>
>>>>> java.lang.NoClassDefFoundError: org/apache/hadoop/conf/Configuration
>>>>>
>>>>> when this code is run independently, using a single jar deployed in the
>>>>> hadoop bin, it works fine
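A NoClassDefFoundError for org/apache/hadoop/conf/Configuration under Tomcat usually means the Hadoop jars are missing from the webapp's WEB-INF/lib. A small hypothetical helper (the class-name argument is whatever you want to probe) can fail fast at webapp startup instead of at first upload:

```java
public class ClasspathCheck {
    // Returns true when the named class can be loaded by the current
    // classloader; call it from a ServletContextListener to detect a
    // missing hadoop-core jar before any request arrives.
    static boolean isClassPresent(String className) {
        try {
            Class.forName(className);
            return true;
        } catch (ClassNotFoundException e) {
            return false;
        }
    }
}
```

For example, `ClasspathCheck.isClassPresent("org.apache.hadoop.conf.Configuration")` returning false at startup points straight at the packaging, not the code.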
>>>>>
>>>>>
>>>>>
>>>>
>>>>
>>>
>>
>