Hive >> mail # user >> Modify the number of map tasks


imen Megdiche 2012-12-12, 10:41
Mohammad Tariq 2012-12-12, 11:04
imen Megdiche 2012-12-12, 11:41
Mohammad Tariq 2012-12-12, 12:06
imen Megdiche 2012-12-12, 12:11
imen Megdiche 2012-12-12, 12:12
Mohammad Tariq 2012-12-12, 12:19
imen Megdiche 2012-12-12, 12:23
Mohammad Tariq 2012-12-12, 12:25
imen Megdiche 2012-12-12, 12:30
Mohammad Tariq 2012-12-12, 12:36
imen Megdiche 2012-12-12, 12:44
Mohammad Tariq 2012-12-12, 12:53
imen Megdiche 2012-12-12, 13:01
Mohammad Tariq 2012-12-12, 13:07
imen Megdiche 2012-12-12, 13:16
Mohammad Tariq 2012-12-12, 13:22
imen Megdiche 2012-12-12, 13:38
Mohammad Tariq 2012-12-12, 13:48
imen Megdiche 2012-12-12, 14:01
Mohammad Tariq 2012-12-12, 14:07
Re: Modify the number of map tasks
Could you please comment on the configuration of Hadoop on a cluster?

Thanks
2012/12/12 Mohammad Tariq <[EMAIL PROTECTED]>

> You are always welcome. If you still need any help, you can go here:
> http://cloudfront.blogspot.in/2012/07/how-to-configure-hadoop.html
> I have outlined the entire process there along with a few small (but
> necessary) explanations.
>
> Regards,
>     Mohammad Tariq
>
>
>
> On Wed, Dec 12, 2012 at 7:31 PM, imen Megdiche <[EMAIL PROTECTED]>wrote:
>
>> Thank you very much, you're awesome.
>>
>> Fixed
>>
>>
>> 2012/12/12 Mohammad Tariq <[EMAIL PROTECTED]>
>>
>>> Uncomment the property in core-site.xml. That is a must. After doing
>>> this, you have to restart the daemons.
>>>
>>> Regards,
>>>     Mohammad Tariq
>>>
>>>
>>>
>>> On Wed, Dec 12, 2012 at 7:08 PM, imen Megdiche <[EMAIL PROTECTED]>wrote:
>>>
>>>> I changed the files.
>>>> Now when I run, I get this response:
>>>>
>>>> 12/12/12 14:37:33 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 0 time(s).
>>>> 12/12/12 14:37:34 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 1 time(s).
>>>> 12/12/12 14:37:35 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 2 time(s).
>>>> 12/12/12 14:37:36 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 3 time(s).
>>>> 12/12/12 14:37:37 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 4 time(s).
>>>> 12/12/12 14:37:38 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 5 time(s).
>>>> 12/12/12 14:37:39 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 6 time(s).
>>>> 12/12/12 14:37:40 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 7 time(s).
>>>> 12/12/12 14:37:41 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 8 time(s).
>>>> 12/12/12 14:37:42 INFO ipc.Client: Retrying connect to server:
>>>> localhost/127.0.0.1:9001. Already tried 9 time(s).
>>>> Exception in thread "main" java.net.ConnectException: Call to localhost/
>>>> 127.0.0.1:9001 failed on connection exception:
>>>> java.net.ConnectException: Connexion refusée
>>>>     at org.apache.hadoop.ipc.Client.wrapException(Client.java:1099)
>>>>     at org.apache.hadoop.ipc.Client.call(Client.java:1075)
>>>>     at org.apache.hadoop.ipc.RPC$Invoker.invoke(RPC.java:225)
>>>>     at org.apache.hadoop.mapred.$Proxy1.getProtocolVersion(Unknown
>>>> Source)
>>>>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:396)
>>>>     at org.apache.hadoop.ipc.RPC.getProxy(RPC.java:379)
>>>>     at
>>>> org.apache.hadoop.mapred.JobClient.createRPCProxy(JobClient.java:480)
>>>>     at org.apache.hadoop.mapred.JobClient.init(JobClient.java:474)
>>>>     at org.apache.hadoop.mapred.JobClient.<init>(JobClient.java:457)
>>>>     at org.apache.hadoop.mapred.JobClient.runJob(JobClient.java:1260)
>>>>     at org.myorg.WordCount.run(WordCount.java:115)
>>>>     at org.apache.hadoop.util.ToolRunner.run(ToolRunner.java:65)
>>>>     at org.myorg.WordCount.main(WordCount.java:120)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
>>>>     at sun.reflect.NativeMethodAccessorImpl.invoke(Unknown Source)
>>>>     at sun.reflect.DelegatingMethodAccessorImpl.invoke(Unknown Source)
>>>>     at java.lang.reflect.Method.invoke(Unknown Source)
>>>>     at org.apache.hadoop.util.RunJar.main(RunJar.java:156)
>>>> Caused by: java.net.ConnectException: Connexion refusée
>>>>     at sun.nio.ch.SocketChannelImpl.checkConnect(Native Method)
>>>>     at sun.nio.ch.SocketChannelImpl.finishConnect(Unknown Source)
>>>>     at
>>>> org.apache.hadoop.net.SocketIOWithTimeout.connect(SocketIOWithTimeout.java:206)
>>>>     at org.apache.hadoop.net.NetUtils.connect(NetUtils.java:489)
>>>>     at
>>>> org.apache.hadoop.ipc.Client$Connection.setupConnection(Client.java:434)
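
The repeated "Retrying connect to server: localhost/127.0.0.1:9001" lines above mean the job client cannot reach a JobTracker at that address, which typically happens when the configuration was changed but the daemons were not (re)started. A minimal sketch of the pieces involved, assuming a Hadoop 1.x pseudo-distributed setup with the conventional localhost ports; the property values are illustrative, not necessarily what the original configuration contained:

  <!-- conf/core-site.xml: the property the thread asks to uncomment is usually the filesystem URI -->
  <configuration>
    <property>
      <name>fs.default.name</name>
      <value>hdfs://localhost:9000</value>
    </property>
  </configuration>

  <!-- conf/mapred-site.xml: the JobTracker address the stack trace is trying to reach -->
  <configuration>
    <property>
      <name>mapred.job.tracker</name>
      <value>localhost:9001</value>
    </property>
  </configuration>

  # restart the daemons so the new configuration is picked up, then check what is running
  bin/stop-all.sh
  bin/start-all.sh
  jps   # should list NameNode, DataNode, SecondaryNameNode, JobTracker and TaskTracker

If jps does not show a JobTracker, its log under the Hadoop logs directory usually explains why it failed to start.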
Mohammad Tariq 2012-12-12, 14:48
imen Megdiche 2012-12-12, 15:25
Mohammad Tariq 2012-12-12, 15:37
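
On the question in the subject line: with the old org.apache.hadoop.mapred API used in this thread, the number of map tasks is only a hint; the InputFormat decides the actual number of splits, and mapred.map.tasks (or JobConf.setNumMapTasks(int) in the driver) only nudges it. A rough sketch of the usual knobs, assuming the WordCount job from the stack trace is run through ToolRunner so -D generic options are accepted; the jar name, paths and values are placeholders:

  # hint at more map tasks on the command line
  bin/hadoop jar wordcount.jar org.myorg.WordCount -D mapred.map.tasks=10 input output

  <!-- or set a cluster-wide hint in conf/mapred-site.xml -->
  <property>
    <name>mapred.map.tasks</name>
    <value>10</value>
  </property>

  # to force fewer maps, raise the minimum split size (in bytes) instead
  bin/hadoop jar wordcount.jar org.myorg.WordCount -D mapred.min.split.size=134217728 input output

For FileInputFormat the effective split size is roughly max(minSplitSize, min(totalSize / numMapTasks, blockSize)), so raising mapred.map.tasks can only add maps down to about one split per block, while raising mapred.min.split.size reduces the map count.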