Hadoop >> mail # general >> jni files


Amit,

On Fri, Jul 9, 2010 at 2:39 PM, amit kumar verma <[EMAIL PROTECTED]> wrote:
>  Hi Hemanth,
>
> The versions are the same; I copied it to all the client machines.
>
> I think I have a solution. As I read more about Hadoop and JNI, I learned
> that I need to copy the JNI files to
> HADOOP_INSTALLATION_DIR/lib/native/Linux-xxx-xxx. I thought my Linux machine
> was Linux-i386-32, but then I found that the "org.apache.hadoop.util.PlatformName"
> class reports your machine type, and mine is Linux-amd64-64. As soon as I
> copied the JNI files into that directory, the errors went away.
>
> The full code is still not running, though, because I developed the
> application using the java.io.File class, and I am still working out how to
> change it so that it can access HDFS. Do I need to replace all of my file
> API calls and rewrite them against the Hadoop FileSystem (hadoop fs) API,
> or is there some other way?
>
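
Glad the native library part is sorted out. For the record, Linux-amd64-64
is just the string that org.apache.hadoop.util.PlatformName prints (it joins
os.name, os.arch and the JVM data model with dashes), so you can check the
directory name before copying the .so files. A rough stand-alone equivalent
of what the class computes (not Hadoop's exact code, just the same system
properties):

    public class ShowPlatform {
      public static void main(String[] args) {
        // Roughly what org.apache.hadoop.util.PlatformName reports,
        // e.g. "Linux-amd64-64" on a 64-bit Linux JVM.
        String name = System.getProperty("os.name") + "-"
            + System.getProperty("os.arch") + "-"
            + System.getProperty("sun.arch.data.model");
        System.out.println(name);
      }
    }

Running the PlatformName class itself against the Hadoop jars prints the
same value; that is essentially what the bin/hadoop script does to pick the
lib/native subdirectory.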

To access files from HDFS, you should use the Hadoop FileSystem API.
Please take a look at the Javadoc and also a tutorial such as this:
http://developer.yahoo.com/hadoop/tutorial/module2.html#programmatically
for more information.
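
As a very rough sketch (untested, and the path below is only a placeholder;
it assumes core-site.xml with your fs.default.name is on the classpath so
FileSystem.get() finds the right namenode), reading a file from HDFS looks
something like this:

    import java.io.BufferedReader;
    import java.io.InputStreamReader;

    import org.apache.hadoop.conf.Configuration;
    import org.apache.hadoop.fs.FSDataInputStream;
    import org.apache.hadoop.fs.FileSystem;
    import org.apache.hadoop.fs.Path;

    public class HdfsCat {
      public static void main(String[] args) throws Exception {
        // Picks up core-site.xml / hdfs-site.xml from the classpath.
        Configuration conf = new Configuration();
        FileSystem fs = FileSystem.get(conf);

        // Placeholder path -- point this at a file that exists in your HDFS.
        Path path = new Path("/user/amit/input.txt");
        FSDataInputStream in = fs.open(path);
        BufferedReader reader = new BufferedReader(new InputStreamReader(in));
        String line;
        while ((line = reader.readLine()) != null) {
          System.out.println(line);
        }
        reader.close();
        fs.close();
      }
    }

So the places where you currently use java.io.File to read or write data
would move to FileSystem/Path (open, create, exists, listStatus, delete,
and so on); purely local scratch files can stay on java.io.File.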

> It would be great if someone could advise on this.
>
>
>
> Thanks,
> Amit Kumar Verma
> Verchaska Infotech Pvt. Ltd.
>
>
>
> On 07/09/2010 02:04 PM, Hemanth Yamijala wrote:
>>
>> Hi,
>>
>> Possibly another silly question, but can you cross check if the
>> versions of Hadoop on the client and the server are the same?
>>
>> Thanks
>> hemanth
>>
>> On Thu, Jul 8, 2010 at 10:57 PM, Allen Wittenauer
>> <[EMAIL PROTECTED]>  wrote:
>>>
>>> On Jul 8, 2010, at 1:08 AM, amit kumar verma wrote:
>>>
>>>>     DistributedCache.addCacheFile("hdfs://*
>>>>     /192.168.0.153:50075*/libraries/mylib.so.1#mylib.so", conf);
>>>
>>> Do you actually have asterisks in this?  If so, that's the problem.
>>>
>>>
>
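
And just to spell Allen's point out: with the asterisks removed, the call
would look roughly like the sketch below. Two caveats: 50075 is normally the
datanode web port, so the URI should really use whatever fs.default.name
says for your namenode (the 9000 below is only a guess at a typical setup),
and the "#mylib.so" symlink only appears in the task working directory if
you also call createSymlink().

    import java.net.URI;

    import org.apache.hadoop.filecache.DistributedCache;
    import org.apache.hadoop.mapred.JobConf;

    public class CacheSetup {
      public static void main(String[] args) throws Exception {
        JobConf conf = new JobConf(CacheSetup.class);

        // Ship the native library through the distributed cache; the
        // "#mylib.so" fragment is the name tasks see in their working dir.
        DistributedCache.addCacheFile(
            new URI("hdfs://192.168.0.153:9000/libraries/mylib.so.1#mylib.so"),
            conf);

        // Without this, the #fragment symlink is not created.
        DistributedCache.createSymlink(conf);

        // ... set mapper/reducer and input/output paths, then submit the job.
      }
    }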