Re: java.net.ConnectException when using Httpfs
yeah sure, it's:

# Do not remove the following line, or various programs
# that require network functionality will fail.

132.168.0.10 server1 server1.local

132.168.0.11 server2 server2.local

132.168.0.12 server3 server3.local
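
The follow-up error below shows the client still dialing `localhost:8020`, which suggests the loopback entry should not be dropped entirely: removing every `127.0.0.1` line leaves `localhost` unresolvable. A conventional layout (a sketch, not from the thread; the `132.168.*` addresses are the ones posted above, and the loopback line is the usual default) keeps both:

```
# Do not remove the following line, or various programs
# that require network functionality will fail.
127.0.0.1    localhost localhost.localdomain

132.168.0.10 server1 server1.local
132.168.0.11 server2 server2.local
132.168.0.12 server3 server3.local
```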
On Tue, Sep 3, 2013 at 5:33 PM, Jitendra Yadav
<[EMAIL PROTECTED]>wrote:

> Can you please share your /etc/hosts file?
>
> Thanks
> Jitendra
>
> On Tue, Sep 3, 2013 at 4:53 PM, Visioner Sadak <[EMAIL PROTECTED]>wrote:
>
>> I removed the 127.0.0.1 references from my /etc/hosts; now it's throwing
>>
>> {"RemoteException":{"message":"Call From redsigma1.local\/132.168.0.10 to localhost:8020 failed on connection exception: java.net.ConnectException: Connection refused; For more details see:  http://wiki.apache.org/hadoop/ConnectionRefused","exception":"ConnectException","javaClassName":"java.net.ConnectException"}}
>>
>>
>>
>> On Tue, Sep 3, 2013 at 4:33 PM, Visioner Sadak <[EMAIL PROTECTED]>wrote:
>>
>>> yeah, should I change my /etc/hosts :)
>>>
>>>
>>> On Tue, Sep 3, 2013 at 3:59 PM, Andre Kelpe <[EMAIL PROTECTED]>wrote:
>>>
>>>> Something is wrong with your name resolution. If you look at the error
>>>> message, it says you are trying to connect to 127.0.0.1 instead of the
>>>> remote host.
>>>>
>>>> -André
>>>>
>>>> On Tue, Sep 3, 2013 at 12:05 PM, Visioner Sadak
>>>> <[EMAIL PROTECTED]> wrote:
>>>> > Hello Hadoopers,
>>>> >
>>>> > I am trying to configure HttpFS; below are my
>>>> > configurations in
>>>> >
>>>> >
>>>> > httpfs-site.xml
>>>> >
>>>> > <property>
>>>> > <name>httpfs.fsAccess.conf:fs.default.name
>>>> > </name>
>>>> > <value>hdfs://132.168.0.10:8020</value>
>>>> > </property>
>>>> >
>>>> > and in core-site.xml
>>>> >
>>>> > <property>
>>>> >  <name>hadoop.proxyuser.hadoop.hosts</name>
>>>> >  <value>132.168.0.10</value>
>>>> >  </property>
>>>> >
>>>> > <property>
>>>> > <name>fs.default.name</name>
>>>> > <value>viewfs:///</value>
>>>> > </property>
>>>> >
>>>> > <property>
>>>> > <name>fs.viewfs.mounttable.default.link./NN1Home</name>
>>>> > <value>hdfs://132.168.0.10:8020/NN1Home</value>
>>>> > </property>
>>>> >
>>>> > I am able to start HttpFS, but when I try to access a file
>>>> > through HttpFS it throws the error below:
>>>> >
>>>> > {"RemoteException":{"message":"Call From server1.local\/127.0.0.1 to
>>>> > localhost:8020 failed on connection exception:
>>>> java.net.ConnectException:
>>>> > Connection refused; For more details see:
>>>> > http://wiki.apache.org
>>>> \/hadoop\/ConnectionRefused","exception":"ConnectException","javaClassName":"java.net.ConnectException"}}
>>>> >
>>>> >
>>>> >
>>>> >
>>>> >
>>>> >
>>>> >
>>>>
>>>>
>>>>
>>>> --
>>>> André Kelpe
>>>> [EMAIL PROTECTED]
>>>> http://concurrentinc.com
>>>>
>>>
>>>
>>
>
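For readers hitting the same `ConnectionRefused`, a quick way to confirm the name-resolution diagnosis above is to check what each hostname in the configuration actually resolves to. A minimal sketch (the hostnames are taken from this thread; substitute your own cluster's names):

```python
import socket

# Hostnames referenced in this thread; replace with your cluster's names.
hosts = ["localhost", "server1", "server1.local"]

for host in hosts:
    try:
        # gethostbyname performs an IPv4 lookup via /etc/hosts and DNS
        print(f"{host} -> {socket.gethostbyname(host)}")
    except socket.gaierror as err:
        print(f"{host} -> resolution failed: {err}")
```

If `localhost` fails to resolve, or the host in `fs.default.name` resolves to `127.0.0.1`, the HttpFS gateway ends up trying the loopback address instead of the NameNode, which matches the "Call From ... to localhost:8020 failed" errors quoted above.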