Re: Problem using distributed cache
Harsh J 2012-12-06, 17:02
What is your conf object there? Is it job.getConfiguration() or an
independent instance?
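A minimal driver sketch of the pattern being hinted at here, assuming Hadoop 1.x and the org.apache.hadoop.mapreduce API (the class name, job name, and input/output arguments are illustrative, not from the original mail). new Job(conf) takes a copy of the configuration, so the cache file has to be registered before the Job is constructed, or added through job.getConfiguration():

import java.net.URI;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.lib.input.FileInputFormat;
import org.apache.hadoop.mapreduce.lib.output.FileOutputFormat;

public class CacheDriver {
    public static void main(String[] args) throws Exception {
        Configuration conf = new Configuration();

        // Register the cache file BEFORE constructing the Job: new Job(conf)
        // copies the configuration, so anything added to 'conf' afterwards
        // is not visible to the job.
        DistributedCache.addCacheFile(
            new URI("/user/peter/cacheFile/testCache1"), conf);

        Job job = new Job(conf, "distributed-cache-example");
        job.setJarByClass(CacheDriver.class);
        // ... set mapper, reducer, and key/value classes as usual ...

        // Equivalent alternative: add the file through the job's own copy
        // of the configuration, which is the one the tasks actually see.
        // DistributedCache.addCacheFile(
        //     new URI("/user/peter/cacheFile/testCache1"), job.getConfiguration());

        FileInputFormat.addInputPath(job, new Path(args[0]));
        FileOutputFormat.setOutputPath(job, new Path(args[1]));
        System.exit(job.waitForCompletion(true) ? 0 : 1);
    }
}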

On Thu, Dec 6, 2012 at 10:29 PM, Peter Cogan <[EMAIL PROTECTED]> wrote:
> Hi,
>
> I want to use the distributed cache to allow my mappers to access data. In
> main, I'm using the command
>
> DistributedCache.addCacheFile(new URI("/user/peter/cacheFile/testCache1"),
> conf);
>
> where /user/peter/cacheFile/testCache1 is a file that exists in HDFS.
>
> Then, my setup function looks like this:
>
> public void setup(Context context) throws IOException, InterruptedException{
>     Configuration conf = context.getConfiguration();
>     Path[] localFiles = DistributedCache.getLocalCacheFiles(conf);
>     //etc
> }
>
> However, this localFiles array is always null.
>
> I was initially running on a single-host cluster for testing, but I read
> that this prevents the distributed cache from working. I tried a
> pseudo-distributed setup as well, but that didn't work either.
>
> I'm using Hadoop 1.0.3.
>
> Thanks, Peter
>
>

--
Harsh J
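For the mapper side, a hedged sketch of reading the localized file in setup(), again assuming Hadoop 1.x; the class name, key/value types, and the line-per-entry lookup set are illustrative, not taken from the original mail:

import java.io.BufferedReader;
import java.io.FileReader;
import java.io.IOException;
import java.util.HashSet;
import java.util.Set;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.filecache.DistributedCache;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class CacheMapper extends Mapper<LongWritable, Text, Text, Text> {

    private final Set<String> cached = new HashSet<String>();

    @Override
    protected void setup(Context context) throws IOException, InterruptedException {
        Configuration conf = context.getConfiguration();
        Path[] localFiles = DistributedCache.getLocalCacheFiles(conf);
        if (localFiles == null || localFiles.length == 0) {
            // Fail fast instead of hitting a NullPointerException later.
            throw new IOException("Distributed cache file was not localized");
        }
        // getLocalCacheFiles() returns local-filesystem paths on the task
        // node, so plain java.io can read them.
        BufferedReader reader =
            new BufferedReader(new FileReader(localFiles[0].toString()));
        try {
            String line;
            while ((line = reader.readLine()) != null) {
                cached.add(line.trim());
            }
        } finally {
            reader.close();
        }
    }

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // ... look up values in 'cached' here ...
    }
}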
Other messages in this thread:
surfer 2012-12-07, 14:49
Peter Cogan 2012-12-07, 14:06
bejoy.hadoop@... 2012-12-07, 15:13
Harsh J 2012-12-07, 14:25
Peter Cogan 2012-12-07, 15:22
Dhaval Shah 2012-12-07, 14:23