Re: Run multiple HDFS instances
Yes, you can, but if you want the scripts to work, you should have
them use a different PID directory (I think it's called
HADOOP_PID_DIR) each time you invoke them.
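
For example (just a sketch; the paths below are made-up, and I am
assuming a stock Hadoop 2.x layout where the second instance also has
its own config directory):

    # first instance, default settings
    sbin/start-dfs.sh

    # second instance: point the scripts at its own config dir and its
    # own PID dir, so they don't see the first instance's .pid files
    export HADOOP_CONF_DIR=/opt/hadoop/conf-instance2
    export HADOOP_PID_DIR=/var/run/hadoop-instance2
    sbin/start-dfs.sh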

I instead prefer to start the daemons via their direct commands, such
as "hdfs namenode" and so on, and move them to the background, with a
redirect for logging.
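
Something like this, for instance (only a sketch, with arbitrary log
paths):

    # start each daemon directly, in the background, with its output
    # redirected to a log file
    nohup bin/hdfs namenode  > /var/log/hdfs2/namenode.log  2>&1 &
    nohup bin/hdfs datanode  > /var/log/hdfs2/datanode.log  2>&1 &
    nohup bin/hdfs secondarynamenode > /var/log/hdfs2/2nn.log 2>&1 &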

On Thu, Apr 18, 2013 at 2:34 PM, Lixiang Ao <[EMAIL PROTECTED]> wrote:
> Hi all,
>
> Can I run multiple HDFS instances, that is, n separate namenodes and n
> datanodes, on a single machine?
>
> I've modified core-site.xml and hdfs-site.xml to avoid port and file
> conflicts between the HDFS instances (the kind of overrides involved
> is sketched after this message), but when I started the second HDFS
> instance, I got these errors:
>
> Starting namenodes on [localhost]
> localhost: namenode running as process 20544. Stop it first.
> localhost: datanode running as process 20786. Stop it first.
> Starting secondary namenodes [0.0.0.0]
> 0.0.0.0: secondarynamenode running as process 21074. Stop it first.
>
> Is there a way to solve this?
> Thank you in advance,
>
> Lixiang Ao
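
For reference, the per-instance overrides mentioned above might look
roughly like this for a second instance (a sketch only; the property
names assume Hadoop 2.x, and the ports and paths are arbitrary
examples):

    <!-- core-site.xml: give the second NameNode its own RPC port -->
    <property>
      <name>fs.defaultFS</name>
      <value>hdfs://localhost:9001</value>
    </property>

    <!-- hdfs-site.xml: separate storage dirs and ports -->
    <property>
      <name>dfs.namenode.name.dir</name>
      <value>/data/hdfs2/name</value>
    </property>
    <property>
      <name>dfs.datanode.data.dir</name>
      <value>/data/hdfs2/data</value>
    </property>
    <property>
      <name>dfs.namenode.http-address</name>
      <value>0.0.0.0:50170</value>
    </property>
    <property>
      <name>dfs.datanode.address</name>
      <value>0.0.0.0:50110</value>
    </property>
    <!-- the other datanode and secondary namenode addresses
         (dfs.datanode.http.address, dfs.datanode.ipc.address,
         dfs.namenode.secondary.http-address) need the same kind of
         per-instance override -->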

--
Harsh J