Hadoop >> mail # user >> Error: Hdfs Client for hadoop using native java api


Re: Error: Hdfs Client for hadoop using native java api
By the way, you have these two options:

   1. Put the configuration files on the classpath, so that the code picks
   them up.

   2. Use Configuration.set()
   <http://hadoop.apache.org/common/docs/r0.21.0/api/org/apache/hadoop/conf/Configuration.html#set%28java.lang.String,%20java.lang.String%29>
   to set the required parameters in the code.

Otherwise the configuration may point to the local file system, which is
why the code is not able to find the file.
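A minimal sketch of option 2 (assuming a Hadoop 1.x-era client; the NameNode URI is taken from the stack trace below, and the local Windows path is only a placeholder):

```java
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;

public class HdfsCopy {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();
        // Without this, the client defaults to file:/// and fails with
        // "Wrong FS: hdfs://..., expected: file:///".
        conf.set("fs.default.name", "hdfs://hadoop1.devqa.local:8020");

        FileSystem fs = FileSystem.get(conf);
        // Copy a local file into HDFS (both paths are placeholders).
        fs.copyFromLocalFile(new Path("C:/data/sample.txt"),
                             new Path("/user/hdfs/java/sample.txt"));
        fs.close();
    }
}
```

Equivalently, drop the cluster's core-site.xml on the classpath instead of calling set() by hand (option 1), so the same code works against any cluster without recompiling.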
On Thu, Jul 19, 2012 at 9:32 PM, shashwat shriparv <
[EMAIL PROTECTED]> wrote:

> Can you provide the code (some part of it, if you don't want to post the
> full code here) and let us know which part of your code is throwing this
> error.
>
> Regards
>
> ∞
> Shashwat Shriparv
>
>
>
>
> On Thu, Jul 19, 2012 at 6:46 PM, Sandeep Reddy P <
> [EMAIL PROTECTED]> wrote:
>
>> Hi John,
>> We have applications in Windows, so our devs need to connect to HDFS from
>> Eclipse installed in Windows. I'm trying to put data from <local-file> to
>> <hdfs-file> using Java code from Windows.
>>
>> On Thu, Jul 19, 2012 at 5:41 AM, John Hancock <[EMAIL PROTECTED]>
>> wrote:
>>
>> > Sandeep,
>> >
>> > I don't understand your situation completely, but why not just use
>> > bin/hadoop dfs -copyFromLocal <local-file-name> <hdfs-file-name> ?
>> >
>> > -John
>> >
>> > On Wed, Jul 18, 2012 at 11:33 AM, Sandeep Reddy P <
>> > [EMAIL PROTECTED]> wrote:
>> >
>> > > Hi,
>> > > I'm trying to load data into HDFS from the local Linux file system
>> > > using Java code from a Windows machine. But I'm getting the error:
>> > >
>> > > java.lang.IllegalArgumentException: Wrong FS:
>> > > hdfs://hadoop1.devqa.local:8020/user/hdfs/java, expected: file:///
>> > >     at org.apache.hadoop.fs.FileSystem.checkPath(FileSystem.java:410)
>> > >     at org.apache.hadoop.fs.RawLocalFileSystem.pathToFile(RawLocalFileSystem.java:56)
>> > >     at org.apache.hadoop.fs.RawLocalFileSystem.getFileStatus(RawLocalFileSystem.java:404)
>> > >     at org.apache.hadoop.fs.FilterFileSystem.getFileStatus(FilterFileSystem.java:251)
>> > >     at org.apache.hadoop.fs.FileSystem.exists(FileSystem.java:797)
>> > >     at org.apache.hadoop.fs.FileUtil.checkDest(FileUtil.java:349)
>> > >     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:205)
>> > >     at org.apache.hadoop.fs.FileUtil.copy(FileUtil.java:157)
>> > >     at org.apache.hadoop.fs.LocalFileSystem.copyFromLocalFile(LocalFileSystem.java:55)
>> > >     at org.apache.hadoop.fs.FileSystem.copyFromLocalFile(FileSystem.java:1292)
>> > >     at Hdfs.main(Hdfs.java:18)
>> > > File not found
>> > >
>> > > Can anyone please help me with this issue?
>> > >
>> > > --
>> > > Thanks,
>> > > sandeep
>> > >
>> >
>>
>>
>>
>> --
>> Thanks,
>> sandeep
>>
>
>
>
> --
>
>
> ∞
> Shashwat Shriparv
>
>
>
--

Shashwat Shriparv