MapReduce >> mail # user >> Re: Integrating hadoop with java UI application deployed on tomcat


Re: Integrating hadoop with java UI application deployed on tomcat
Any solution, guys? Badly stuck on this.

On Tue, Sep 4, 2012 at 4:28 PM, Visioner Sadak <[EMAIL PROTECTED]> wrote:

> Thanks Bejoy, actually my Hadoop is also on Windows (I have installed it in
> pseudo-distributed mode for testing); it's not a remote cluster....
>
>
> On Tue, Sep 4, 2012 at 3:38 PM, Bejoy KS <[EMAIL PROTECTED]> wrote:
>
>> Hi
>>
>> You are running Tomcat on a Windows machine and trying to connect to a
>> remote Hadoop cluster from there. Your core-site.xml has
>>
>> <name>fs.default.name</name>
>> <value>hdfs://localhost:9000</value>
>>
>> But it is localhost here. (I assume you are not running Hadoop on this
>> Windows environment for some testing.)
>>
>> You need to have the exact configuration files and hadoop jars from the
>> cluster machines on this tomcat environment as well. I mean on the
>> classpath of your application.
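For a remote cluster, `fs.default.name` has to name the namenode's actual host rather than localhost. A minimal sketch of what the client-side core-site.xml would contain — the hostname and port below are placeholders, not values from the thread:

```xml
<!-- core-site.xml on the Tomcat machine; host and port are hypothetical,
     copy the real values from the cluster's own core-site.xml -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://namenode.example.com:9000</value>
  </property>
</configuration>
```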
>> Regards
>> Bejoy KS
>>
>> Sent from handheld, please excuse typos.
>> ------------------------------
>> *From: *Visioner Sadak <[EMAIL PROTECTED]>
>> *Date: *Tue, 4 Sep 2012 15:31:25 +0530
>> *To: *<[EMAIL PROTECTED]>
>> *ReplyTo: *[EMAIL PROTECTED]
>>  *Subject: *Re: Integrating hadoop with java UI application deployed on
>> tomcat
>>
>> also getting one more error:
>>
>> org.apache.hadoop.ipc.RemoteException: Server IPC version 5 cannot
>> communicate with client version 4
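This "Server IPC version 5 cannot communicate with client version 4" error indicates the Hadoop client jar packaged in the WAR comes from a different Hadoop release than the cluster is running. One sketch of a fix, assuming the WAR is built with Maven (the version shown is illustrative, not taken from the thread — it must match the cluster's exact release):

```xml
<!-- pom.xml fragment; pin the client jar to the cluster's Hadoop version -->
<dependency>
  <groupId>org.apache.hadoop</groupId>
  <artifactId>hadoop-core</artifactId>
  <!-- hypothetical version; use the one your cluster actually runs -->
  <version>1.0.3</version>
</dependency>
```

The same principle applies to a manual build: copy the hadoop jars straight from the cluster machines into the webapp's lib directory, as Bejoy suggests above.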
>>
>> On Tue, Sep 4, 2012 at 2:44 PM, Visioner Sadak <[EMAIL PROTECTED]> wrote:
>>
>>> Thanks Shobha, tried adding the conf folder to Tomcat's classpath; still
>>> getting the same error:
>>>
>>>
>>> Call to localhost/127.0.0.1:9000 failed on local exception:
>>> java.io.IOException: An established connection was aborted by the software
>>> in your host machine
>>>
>>>  On Tue, Sep 4, 2012 at 11:18 AM, Mahadevappa, Shobha <
>>> [EMAIL PROTECTED]> wrote:
>>>
>>>>  Hi,
>>>>
>>>> Try adding the hadoop/conf directory in the TOMCAT's classpath.
>>>>
>>>> Ex:
>>>> CLASSPATH=/usr/local/Apps/hbase-0.90.4/conf:/usr/local/Apps/hadoop-0.20.203.0/conf:
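On the Tomcat side, the usual place for such classpath additions is a `setenv.sh` next to `catalina.sh`. A minimal sketch, reusing the installation-specific paths from Shobha's example (adjust both to your own layout):

```shell
#!/bin/sh
# Hypothetical $CATALINA_BASE/bin/setenv.sh fragment -- the two conf paths
# are the ones quoted in the thread; point them at your own installs.
HADOOP_CONF=/usr/local/Apps/hadoop-0.20.203.0/conf
HBASE_CONF=/usr/local/Apps/hbase-0.90.4/conf
CLASSPATH="$HADOOP_CONF:$HBASE_CONF:$CLASSPATH"
export CLASSPATH
echo "$CLASSPATH"
```

Putting the conf directories first ensures the cluster's core-site.xml shadows any defaults bundled inside jars.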
>>>>
>>>>
>>>> Regards,
>>>> Shobha M
>>>>
>>>> *From:* Visioner Sadak [mailto:[EMAIL PROTECTED]]
>>>> *Sent:* 03 September 2012 PM 04:01
>>>> *To:* [EMAIL PROTECTED]
>>>> *Subject:* Re: Integrating hadoop with java UI application deployed on
>>>> tomcat
>>>>
>>>> Thanks Steve, there's nothing in the logs and no exceptions as well. I
>>>> found that some file is created in my F:\user with the directory name,
>>>> but it's not visible inside my Hadoop browse-filesystem directories. I
>>>> also added the config by using the method below:
>>>>
>>>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>>>
>>>> When running through the WAR and printing out the filesystem, I am
>>>> getting org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>>
>>>> When running an independent jar within Hadoop, I am getting
>>>> DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>>
>>>> When running an independent jar, I am able to do uploads....
>>>>
>>>> Just wanted to know: will I have to add something to the classpath of
>>>> Tomcat, or is there any other configuration of core-site.xml that I am
>>>> missing out? Thanks for your help.....
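Getting a `LocalFileSystem` back is the telltale sign that the configuration lookup never found `fs.default.name` and fell back to the default of `file:///`. As a standalone illustration of that lookup — this sketch parses the same core-site.xml format with the JDK's DOM parser, it is not Hadoop's own Configuration class:

```java
import java.io.ByteArrayInputStream;
import javax.xml.parsers.DocumentBuilderFactory;
import org.w3c.dom.Document;
import org.w3c.dom.Element;
import org.w3c.dom.NodeList;

public class CoreSiteCheck {
    public static void main(String[] args) throws Exception {
        // Inlined stand-in for core-site.xml, using the value from the thread.
        String coreSite =
            "<configuration><property>"
            + "<name>fs.default.name</name>"
            + "<value>hdfs://localhost:9000</value>"
            + "</property></configuration>";

        Document doc = DocumentBuilderFactory.newInstance()
            .newDocumentBuilder()
            .parse(new ByteArrayInputStream(coreSite.getBytes("UTF-8")));

        // Hadoop's fallback when no core-site.xml is on the classpath:
        // file:/// -> LocalFileSystem, which matches the symptom in the WAR.
        String fsDefault = "file:///";
        NodeList props = doc.getElementsByTagName("property");
        for (int i = 0; i < props.getLength(); i++) {
            Element p = (Element) props.item(i);
            String name = p.getElementsByTagName("name")
                .item(0).getTextContent().trim();
            if (name.equals("fs.default.name")) {
                fsDefault = p.getElementsByTagName("value")
                    .item(0).getTextContent().trim();
            }
        }
        System.out.println(fsDefault); // prints "hdfs://localhost:9000"
    }
}
```

If the real application prints `file:///` here, the conf directory (or the `addResource` path) simply is not visible on Tomcat's classpath.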
>>>>
>>>> On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <[EMAIL PROTECTED]>
>>>> wrote:
>>>>
>>>> well, it's worked for me in the past outside Hadoop itself:
>>>>
>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>>
>>>>    1. Turn logging up to DEBUG
>>>>    2. Make sure that the filesystem you've just loaded is what you
>>>>    expect, by logging its value. It may turn out to be file:///,