Home | About | Sematext search-lucene.com search-hadoop.com
MapReduce >> mail # user >> Re: Integrating hadoop with java UI application deployed on tomcat


Re: Integrating hadoop with java UI application deployed on tomcat
Thanks Senthil, I tried with the new Path and am getting the error below. Do I
have to do any SSL setting on Tomcat as well?

java.io.IOException: Call to localhost/127.0.0.1:9000 failed on local
exception: java.io.IOException: An established connection was aborted by
the software in your host machine

	at org.apache.hadoop.ipc.Client.wrapException(Client.java:1107)
	at org.apache.hadoop.ipc.Client.call(Client.java:1075)
	at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
On Mon, Sep 3, 2012 at 7:22 PM, Senthil Kumar <[EMAIL PROTECTED]
> wrote:

> Try using hadoopConf.addResource(new
> Path("F:/hadoop-0.22.0/conf/core-site.xml")); instead
> of hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>
> or you should add your core-site.xml to a location that is on your
> classpath (WEB-INF\classes or WEB-INF\lib in the case of a web application)
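The classpath suggestion above can be sketched as a file copy into the webapp layout. All paths here are illustrative stand-ins (the real ones in the thread are Windows paths on the poster's machine), and the placeholder core-site.xml is created just for the demo:

```shell
# Sketch: put core-site.xml into WEB-INF/classes, which Tomcat places on
# the webapp classpath, so new Configuration() can find it as a resource.
WEBAPP=/tmp/demo-webapp            # hypothetical webapp root
HADOOP_CONF=/tmp/demo-hadoop-conf  # stands in for F:/hadoop-0.22.0/conf
mkdir -p "$WEBAPP/WEB-INF/classes" "$HADOOP_CONF"
printf '<configuration/>\n' > "$HADOOP_CONF/core-site.xml"  # placeholder for the demo
cp "$HADOOP_CONF/core-site.xml" "$WEBAPP/WEB-INF/classes/"
ls "$WEBAPP/WEB-INF/classes/core-site.xml"
```

After redeploying, the file is loaded by Hadoop's Configuration as a classpath resource rather than needing an explicit addResource call.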
>
>
> On Mon, Sep 3, 2012 at 6:02 PM, Visioner Sadak <[EMAIL PROTECTED]>wrote:
>
>> Thanks Hemanth, I tried adding the conf folder and the root folder as
>> external entries; I was unable to add only the XML, but I still have the
>> same problem. Thanks for the help.
>>
>> On Mon, Sep 3, 2012 at 4:11 PM, Hemanth Yamijala <[EMAIL PROTECTED]>wrote:
>>
>>> Hi,
>>>
>>> If you are getting the LocalFileSystem, you could try putting
>>> core-site.xml in a directory that's on the classpath for the
>>> Tomcat app (or include such a path in the classpath, if that's
>>> possible).
>>>
>>> Thanks
>>> hemanth
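For reference, a minimal core-site.xml of the kind being discussed might look like the sketch below. The hdfs://localhost:9000 address is taken from the error message earlier in the thread; adjust it to your cluster. fs.default.name is the pre-2.x property naming the default filesystem:

```xml
<?xml version="1.0"?>
<!-- Minimal sketch of core-site.xml for a 0.22-era Hadoop setup. -->
<configuration>
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>
```

If Hadoop cannot find this file, fs.default.name falls back to its file:/// default, which is why FileSystem.get() returns a LocalFileSystem instead of a DFS client.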
>>>
>>> On Mon, Sep 3, 2012 at 4:01 PM, Visioner Sadak <[EMAIL PROTECTED]>
>>> wrote:
>>> > Thanks Steve, there's nothing in the logs and no exceptions either. I
>>> > found that a file is created in my F:\user with the directory name, but
>>> > it's not visible inside my Hadoop browse-filesystem directories. I also
>>> > added the config by using the method below:
>>> > hadoopConf.addResource("F:/hadoop-0.22.0/conf/core-site.xml");
>>> > When running through the WAR and printing out the filesystem, I get
>>> > org.apache.hadoop.fs.LocalFileSystem@9cd8db
>>> > When running an independent jar within Hadoop, I get
>>> > DFS[DFSClient[clientName=DFSClient_296231340, ugi=dell]]
>>> > and when running the independent jar I am able to do uploads....
>>> >
>>> > I just wanted to know whether I will have to add something to my Tomcat
>>> > classpath, or whether there is some other configuration of core-site.xml
>>> > that I am missing out. Thanks for your help.....
>>> >
>>> >
>>> >
>>> > On Sat, Sep 1, 2012 at 1:38 PM, Steve Loughran <[EMAIL PROTECTED]
>>> >
>>> > wrote:
>>> >>
>>> >>
>>> >> well, it's worked for me in the past outside Hadoop itself:
>>> >>
>>> >> http://smartfrog.svn.sourceforge.net/viewvc/smartfrog/trunk/core/hadoop-components/hadoop-ops/src/org/smartfrog/services/hadoop/operations/utils/DfsUtils.java?revision=8882&view=markup
>>> >>
>>> >> Turn logging up to DEBUG.
>>> >> Make sure that the filesystem you've just loaded is what you expect, by
>>> >> logging its value. It may turn out to be file:///, because the normal
>>> >> Hadoop site-config.xml isn't being picked up.
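The check suggested above can be sketched as a small diagnostic class. This is illustrative only (the class name FsCheck is not from the thread) and assumes a 0.22-era Hadoop jar on the classpath, since it cannot run without one:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;

// Hypothetical diagnostic: report which filesystem Configuration resolved.
public class FsCheck {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);
        // If core-site.xml was not found on the classpath, fs.default.name
        // stays at its file:/// default and fs is a LocalFileSystem.
        System.out.println("fs.default.name = " + conf.get("fs.default.name"));
        System.out.println("resolved filesystem = " + fs.getUri());
    }
}
```

Running this from inside the WAR versus from a standalone jar should make the classpath difference visible immediately, matching the LocalFileSystem-vs-DFSClient split reported earlier in the thread.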
>>> >>
>>> >>
>>> >>>
>>> >>>
>>> >>> On Fri, Aug 31, 2012 at 1:08 AM, Visioner Sadak
>>> >>> <[EMAIL PROTECTED]> wrote:
>>> >>>>
>>> >>>> but the problem is that my code gets executed with the warning, yet
>>> >>>> the file is not copied to HDFS. Actually I am trying to copy a file
>>> >>>> from local to HDFS:
>>> >>>>
>>> >>>>        Configuration hadoopConf = new Configuration();
>>> >>>>        // get the default associated file system
>>> >>>>        FileSystem fileSystem = FileSystem.get(hadoopConf);
>>> >>>>        // HarFileSystem harFileSystem = new HarFileSystem(fileSystem);
>>> >>>>        // copy from local fs to hdfs
>>> >>>>        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
>>> >>>>                new Path("/user/TestDir/"));
>>> >>>>
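Pulling the thread's suggestions together, a minimal sketch of the upload with the site config loaded explicitly. The paths and the Hadoop 0.22 layout are taken from the messages above; the class name HdfsUpload is illustrative, and the code assumes the Hadoop jars are available. Note that addResource(String) looks the name up as a classpath resource, while addResource(Path) reads a file from the filesystem, which is the distinction Senthil pointed out:

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

// Hypothetical end-to-end sketch of the fix discussed in this thread.
public class HdfsUpload {
    public static void main(String[] args) throws Exception {
        Configuration hadoopConf = new Configuration();
        // Load the site config as a Path, so it is read from disk rather
        // than sought on the classpath.
        hadoopConf.addResource(new Path("F:/hadoop-0.22.0/conf/core-site.xml"));
        FileSystem fileSystem = FileSystem.get(hadoopConf);
        // Should print an hdfs:// URI, not file:///, if the config loaded.
        System.out.println("filesystem = " + fileSystem.getUri());
        // Copy from the local filesystem to HDFS.
        fileSystem.copyFromLocalFile(new Path("E:/test/GANI.jpg"),
                new Path("/user/TestDir/"));
    }
}
```

If the printed URI is still file:///, the remaining fix is the classpath placement of core-site.xml under WEB-INF/classes discussed earlier.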
>>> >>
>>> >>
>>> >
>>>
>>
>>
>