MapReduce, mail # user - Re: Integrating hadoop with java UI application deployed on tomcat


Re: Integrating hadoop with java UI application deployed on tomcat
Visioner Sadak 2012-09-03, 17:53
Thanks Senthil, the namenode is up and running, and in core-site.xml I have

<configuration>
<property>
<name>fs.default.name</name>
<value>hdfs://localhost:9000</value>
</property>
</configuration>

Should I change my IP, or is there any other config I need to change?
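Before changing the IP, it may be worth verifying what filesystem the webapp actually gets back. A minimal sketch against the 0.2x-era Hadoop API used in this thread (paths and port are the ones quoted here; adjust for your install) — if core-site.xml is being read, this should print a DistributedFileSystem with URI hdfs://localhost:9000, not a LocalFileSystem:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Load the same core-site.xml the cluster uses. Passing a Path (not a
        // String) makes Hadoop read it as a file rather than look it up on
        // the classpath.
        conf.addResource(new Path("F:/hadoop-0.22.0/conf/core-site.xml"));

        FileSystem fs = FileSystem.get(conf);
        // If this prints LocalFileSystem / file:///, core-site.xml was not
        // picked up and fs.default.name fell back to the built-in default.
        System.out.println(fs.getClass().getName());
        System.out.println(fs.getUri());
    }
}
```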

On Mon, Sep 3, 2012 at 10:11 PM, Senthil Kumar <
[EMAIL PROTECTED]> wrote:

> The error says the call to 127.0.0.1:9000 fails. It is failing when it
> tries to contact the namenode (9000 is the default namenode port)
> configured in core-site.xml. You should check whether the namenode is
> configured correctly and whether it is up.
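One quick way to check from plain Java (no Hadoop jars needed) whether anything is listening on the namenode port — a hedged sketch, with the host and port taken from this thread:

```java
import java.io.IOException;
import java.net.InetSocketAddress;
import java.net.Socket;

public class PortCheck {
    /** Returns true if a TCP connection to host:port succeeds within timeoutMs. */
    static boolean isReachable(String host, int port, int timeoutMs) {
        try (Socket s = new Socket()) {
            s.connect(new InetSocketAddress(host, port), timeoutMs);
            return true;
        } catch (IOException e) {
            return false; // connection refused or timed out: nothing listening
        }
    }

    public static void main(String[] args) {
        // 9000 is the namenode RPC port configured in core-site.xml
        System.out.println("namenode reachable: "
                + isReachable("localhost", 9000, 2000));
    }
}
```

If this prints `false` while the namenode process is running, the namenode may be bound to a different interface or port than the one in core-site.xml.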
>
>
> On Mon, Sep 3, 2012 at 7:43 PM, Visioner Sadak <[EMAIL PROTECTED]>wrote:
>
>> Thanks Senthil, I tried with the new Path and am getting this error. Do
>> I have to do any SSL setup on Tomcat as well?
>>
>> java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
>> exception: java.io.IOException: An established connection was aborted
>> by the software in your host machine
>>
>>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
>>     at org.apache.hadoop.ipc.Client.call(Client.java:1075)
>>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>
>>
>>  On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <
>> [EMAIL PROTECTED]> wrote:
>>
>>> Try using hadoopConf.addResource(new
>>> Path("F:/hadoop-0.22.0/conf/core-site.xml")); instead of
>>> hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml"),
>>>
>>> or add your core-site.xml to a location that is on your
>>> classpath (WEB-INF\classes or WEB-INF\lib in the case of a web
>>> application).
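The classpath alternative mentioned above can be sanity-checked from inside the webapp. A sketch (assumes core-site.xml has been copied into WEB-INF/classes; with it there, `new Configuration()` should find it without any explicit addResource call):

```java
import java.net.URL;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

public class ClasspathConfCheck {
    public static void main(String[] args) throws Exception {
        // With core-site.xml in WEB-INF/classes, it is visible as a
        // classpath resource to the webapp's classloader.
        URL res = Thread.currentThread().getContextClassLoader()
                        .getResource("core-site.xml");
        System.out.println("core-site.xml on classpath: " + res);

        // Configuration loads core-site.xml from the classpath by default.
        Configuration conf = new Configuration();
        System.out.println(FileSystem.get(conf).getUri());
    }
}
```

If the first line prints `null`, the file is not on Tomcat's classpath and the default `file:///` filesystem will be used.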
>>>
>>>
>>> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <[EMAIL PROTECTED]
>>> > wrote:
>>>
>>>> Thanks Hemanth, I tried adding the conf folder and the extension root
>>>> folder (I was unable to add just the XML), but it's still the same
>>>> problem. Thanks for the help.
>>>>
>>>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <[EMAIL PROTECTED]>wrote:
>>>>
>>>>> Hi,
>>>>>
>>>>> If you are getting the LocalFileSystem, you could try putting
>>>>> core-site.xml in a directory that's on the classpath for the
>>>>> Tomcat app (or include such a path in the classpath, if that's
>>>>> possible).
>>>>>
>>>>> Thanks
>>>>> hemanth
>>>>>
>>>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <
>>>>> [EMAIL PROTECTED]> wrote:
>>>>> > Thanks Steve, there's nothing in the logs and no exceptions either. I
>>>>> > found that a file is created in my F:\user with the directory name,
>>>>> > but it's not visible inside my Hadoop browse-filesystem directories.
>>>>> > I also added the config using the method below:
>>>>> > hadoopConf.addResource(
>>>>> > "F:/hadoop-0.22.0/conf/core-site.xml");
>>>>> > When running through the WAR and printing out the filesystem, I get
>>>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>>>> > When running an independent jar within Hadoop, I get
>>>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>>>> > When running an independent jar, I am able to do uploads....
>>>>> >
>>>>> > I just wanted to know: will I have to add something to my Tomcat
>>>>> > classpath, or is there some other core-site.xml configuration that I
>>>>> > am missing? Thanks for your help.
>>>>> >
>>>>> >
>>>>> >
>>>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <
>>>>> [EMAIL PROTECTED]>
>>>>> > wrote:
>>>>> >>
>>>>> >>
>>>>> >> well, it's worked for me in the past outside Hadoop itself:
>>>>> >>
>>>>> >>
>>>>> >>
>>>>> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>>>> >>
>>>>> >> Turn logging up to DEBUG.
>>>>> >> Make sure that the filesystem you've just loaded is what you
>>>>> >> expect, by logging its value. It may turn out to be file:///,
>>>>> >> because the normal Hadoop site-config.xml isn't being picked up.
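The check Steve describes — log the loaded filesystem and fail fast if it is the local one — can be sketched like this (assumes the 0.2x API and the fs.default.name property used in this thread's core-site.xml):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.LocalFileSystem;

public class FsDebug {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Log what was actually loaded before doing any I/O.
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));
        System.out.println("filesystem      = " + fs);

        if (fs instanceof LocalFileSystem) {
            // file:/// means the site configuration was not picked up;
            // writes will silently land on the local disk (e.g. F:\user).
            throw new IllegalStateException(
                "Got the local filesystem; core-site.xml is not on the classpath");
        }
    }
}
```

This would explain the symptom earlier in the thread: the WAR printed `org.apache.hadoop.fs.LocalFileSystem@9cd8db` and files showed up under F:\user instead of in HDFS.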