

Re: Which is hdfs?
1. If we run an MR job from Eclipse by setting

       Configuration conf = new Configuration();
       conf.set("fs.defaultFS", "remotesystem-ip/");
       conf.set("hadoop.job.ugi", "hdfs");

   the results show up in my remote cluster.

2. Exporting a jar file from my home machine, "scp"-ing it to the remote
   cluster, and running the jar there.

Do 1 and 2 yield the same result? (I am not getting the same results :( )
Does Eclipse act as the original remote cluster just because we set

       Configuration conf = new Configuration();
       conf.set("fs.defaultFS", "remotesystem-ip/");
       conf.set("hadoop.job.ugi", "hdfs");

On Tue, Dec 10, 2013 at 12:25 PM, unmesha sreeveni <[EMAIL PROTECTED]> wrote:

> Thank you vinayakumar and Jagat :)
>
>
> On Mon, Dec 9, 2013 at 3:06 PM, Jagat Singh <[EMAIL PROTECTED]> wrote:
>
>> Both are inside HDFS.
>>
>> One file is under your name, the other is under the system user hdfs's name.
>>
>> Using the hadoop fs command means you are accessing HDFS.
>>
>> Read about HDFS append in the 2.x versions.
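
For reference, the 2.x append API mentioned above looks roughly like this;
the hostname and path are placeholders, and append only adds bytes at the end
of an existing file, it does not rewrite it:

    // Sketch only: open an existing HDFS file for append (Hadoop 2.x).
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataOutputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class AppendSketch {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode-host:8020/");  // placeholder

        FileSystem fs = FileSystem.get(conf);
        // Adds bytes at the end of the file; in-place edits are still not possible.
        try (FSDataOutputStream out = fs.append(new Path("/user/sree/chck"))) {
          out.writeBytes("one more line\n");
        }
      }
    }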
>> On 09/12/2013 6:22 PM, "unmesha sreeveni" <[EMAIL PROTECTED]> wrote:
>>
>>> Can anyone tell me the difference between the details below?
>>>
>>> My cluster is a remote system, "sree".
>>>
>>> 1. I have a "chck" file in my /home/sree
>>>    I did
>>>        > hadoop fs -copyFromLocal /home/sree/chck
>>>        > hadoop fs -ls
>>>
>>>        -rw-r--r--   1 *sree*  supergroup  32  2013-12-03 14:27  chck
>>>
>>>    *Does the chck file now reside in HDFS?*
>>> 2. After executing wordcount on my remote system, my output folder looks
>>> like this:
>>>
>>>        drwxr-xr-x   - *hdfs*  supergroup  0  2013-11-19 09:41  wcout
>>>
>>>
>>> I am confused: which one is *hdfs*?
>>> The area where *chck* resides, or *wcout*?
>>>
>>>
>>> 3. Am I able to update/append the "chck" file through an MR job?
>>>
>>> 4.     -rw-r--r--   1 *hdfs*  supergroup  32  2013-12-03 14:27  myfile
>>>    Am I able to update/append the "myfile" file through an MR job?
>>>
>>> **I read that in-place updates are not allowed in HDFS.**
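
To make the ownership point concrete, a small client along these lines
(hostname and paths are placeholders) would show that both files live in HDFS
and differ only in their recorded owner:

    // Sketch only: print the owner of each file as recorded in HDFS.
    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FileStatus;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class WhoOwnsWhat {
      public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        conf.set("fs.defaultFS", "hdfs://namenode-host:8020/");  // placeholder

        FileSystem fs = FileSystem.get(conf);
        for (String p : new String[] {"/user/sree/chck", "/user/sree/wcout"}) {
          FileStatus status = fs.getFileStatus(new Path(p));
          // Given the listings above, this would print roughly:
          //   /user/sree/chck is owned by sree
          //   /user/sree/wcout is owned by hdfs
          System.out.println(p + " is owned by " + status.getOwner());
        }
      }
    }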
>>>
>>> --
>>> *Thanks & Regards*
>>>
>>> Unmesha Sreeveni U.B
>>>
>>> *Junior Developer*
>>>
>>>
>>>
>
>
> --
> *Thanks & Regards*
>
> Unmesha Sreeveni U.B
>
> *Junior Developer*
>
>
>
--
*Thanks & Regards*

Unmesha Sreeveni U.B

*Junior Developer*