Hadoop user mailing list: Datanode not created on hadoop-0.20.203.0


Re: Datanode not created on hadoop-0.20.203.0
What user are you running as?

-Joey
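(The "Unrecognized option: -jvm" message usually means the datanode was started as root: in 0.20.203 the bin/hadoop script takes the secure-datanode path when run as root and passes a -jvm flag that a plain java launch does not accept. A minimal sketch of the usual workaround, assuming the install lives in /usr/local/hadoop as in the quoted steps and using a hypothetical non-root user named hduser:

$ sudo useradd -m hduser                            # create a dedicated non-root user
$ sudo chown -R hduser:hduser /usr/local/hadoop     # let that user own the Hadoop tree
$ su - hduser
$ export JAVA_HOME=/usr/lib/jvm/java-6-sun          # same JAVA_HOME as in step 3 below
$ /usr/local/hadoop/bin/hadoop namenode -format
$ /usr/local/hadoop/bin/start-dfs.sh
$ jps                                               # DataNode should now appear alongside NameNode

If the DataNode still does not come up, the full datanode log under /usr/local/hadoop/logs is the next place to look.)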

On Thu, Jun 16, 2011 at 1:46 AM, rutesh <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I tried formatting the datanode too, but got the same result again:
>
> # hadoop datanode -format
> Unrecognized option: -jvm
> Could not create the Java virtual machine.
>
> Also, the datanodes have been defined in the slaves file. Is there any other
> workaround or process?
>
> With regards
> Rutesh
>
> On Wed, Jun 15, 2011 at 10:32 PM, <[EMAIL PROTECTED]> wrote:
>
>> You have to format the datanode too (hadoop datanode -format). Also make sure
>> it is in the slaves file.
>>
>> Cheers -
>>
>> -----Original Message-----
>> From: Joey Echeverria [mailto:[EMAIL PROTECTED]]
>> Sent: Wednesday, June 15, 2011 12:01 PM
>> To: [EMAIL PROTECTED]
>> Subject: Re: Datanode not created on hadoop-0.20.203.0
>>
>> By any chance, are you running as root? If so, try running as a different
>> user.
>>
>> -Joey
>>
>> On Wed, Jun 15, 2011 at 12:53 PM, rutesh <[EMAIL PROTECTED]> wrote:
>> > Hi,
>> >
>> >   I am new to hadoop (just 1 month old). These are the steps I followed to
>> > install and run hadoop-0.20.203.0:
>> >
>> > 1) Downloaded the tar file from
>> > http://mirrors.axint.net/apache/hadoop/common/hadoop-0.20.203.0/hadoop-0.20.203.0rc1.tar.gz .
>> > 2) Untarred it in /usr/local/ .
>> > 3) Set JAVA_HOME=/usr/lib/jvm/java-6-sun (which has already been installed).
>> > 4) Modified the config files, viz. core-site.xml, hdfs-site.xml and
>> > mapred-site.xml, as provided on the single node installation page
>> > [http://hadoop.apache.org/common/docs/r0.20.203.0/single_node_setup.html#PseudoDistributed].
>> > 5) Formatted the new distributed-filesystem using bin/hadoop namenode
>> > -format
>> > 6) Started the hdfs daemon using bin/start-dfs.sh
>> >
>> > Now, here is the error...
>> >
>> > # start-dfs.sh
>> > starting namenode, logging to
>> > /usr/local/hadoop/bin/../logs/hadoop-root-namenode-ip-10-98-94-62.out
>> > localhost: starting datanode, logging to
>> > /usr/local/hadoop/bin/../logs/hadoop-root-datanode-ip-10-98-94-62.out
>> > localhost: starting secondarynamenode, logging to
>> > /usr/local/hadoop/bin/../logs/hadoop-root-secondarynamenode-ip-10-98-94-62.out
>> >
>> > The terminal says that the datanode has been started, but when I run the jps
>> > command, it shows otherwise.
>> >
>> > # jps
>> > 395 Jps
>> > 32612 SecondaryNameNode
>> > 32442 NameNode
>> >
>> > And in the /usr/local/hadoop/logs/hadoop-root-datanode-ip-10-98-94-62.out
>> > this is the log:
>> >
>> > Unrecognized option: -jvm
>> > Could not create the Java virtual machine.
>> >
>> > My question is: can anybody tell me what the error is or what I am doing
>> > wrong, or, in general, how I can make my datanode run?
>> >
>> > Thanks.
>> >
>> > With regards,
>> > Rutesh Chavda
>> >
>>
>>
>>
>> --
>> Joseph Echeverria
>> Cloudera, Inc.
>> 443.305.9434
>>
>>
>>
>

--
Joseph Echeverria
Cloudera, Inc.
443.305.9434
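For reference, the pseudo-distributed configuration that step 4 of the quoted message points at comes down to three small files under conf/. The values below are the ones the r0.20.203.0 single-node setup page uses (hdfs://localhost:9000 for the namenode, localhost:9001 for the jobtracker); treat them as a sketch and adjust hosts and ports to your own setup.

conf/core-site.xml:
<configuration>
  <!-- where HDFS clients find the namenode -->
  <property>
    <name>fs.default.name</name>
    <value>hdfs://localhost:9000</value>
  </property>
</configuration>

conf/hdfs-site.xml:
<configuration>
  <!-- single node, so keep only one copy of each block -->
  <property>
    <name>dfs.replication</name>
    <value>1</value>
  </property>
</configuration>

conf/mapred-site.xml:
<configuration>
  <!-- where the jobtracker listens -->
  <property>
    <name>mapred.job.tracker</name>
    <value>localhost:9001</value>
  </property>
</configuration>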