MapReduce, mail # dev - Fwd: Map task failed: No protocol specified

Fwd: Map task failed: No protocol specified
rohit sarewar 2013-09-21, 13:31
Hi Dev team

Can you please help me in this regard?
I am using CDH4 and I am trying to access the GPU from the cleanup() method
of my mapper class using JOCL <http://www.jocl.org/>.
(Note: my normal code, without MapReduce, works fine on the GPU.)

When I execute my MapReduce code, it throws the error specified below:
attempt_201309171647_0021_m_000000_1: No protocol specified
attempt_201309171647_0021_m_000000_1: No protocol specified
13/09/20 18:03:01 INFO mapred.JobClient: Task Id :
attempt_201309171647_0021_m_000000_2, Status : FAILED
org.jocl.CLException: CL_DEVICE_NOT_FOUND
    at org.jocl.CL.checkResult(CL.java:569)
    at org.jocl.CL.clGetDeviceIDs(CL.java:2239)
    at com.testMR.jocl.WordCountMapper.cleanup(WordCountMapper.java:106)
    at org.apache.hadoop.mapreduce.Mapper.run(Mapper.java:142)
    at org.apache.hadoop.mapred.MapTask.runNewMapper(MapTask.java:672)
    at org.apache.hadoop.mapred.MapTask.run(MapTask.java:330)
    at org.apache.hadoop.mapred.Child$4.run(Child.java:268)
    at java.security.AccessController.doPrivileged(Native Method)
    at javax.security.auth.Subject.doAs(Subject.java:396)
    at org.apache.hadoop.mapred.Child.main(Child.java:262)


Each map task throws the error "No protocol specified". What does this
error mean? What protocols are used in the mapper class?
Rohit Sarewar
---------- Forwarded message ----------
From: rohit sarewar <[EMAIL PROTECTED]>
Date: Sat, Sep 21, 2013 at 12:09 AM
Subject: Re: Map task failed: No protocol specified
Hi Harsh

I am currently using a single-node cluster.
This is a sample JOCL code which I tried to execute on the GPU.
You can find "final long deviceType = CL_DEVICE_TYPE_GPU;" in
the code snippet below.

I have an AMD GPU on my machine.
CL_DEVICE_NAME:    Tahiti
CL_DEVICE_VENDOR:  Advanced Micro Devices, Inc.
CL_DRIVER_VERSION: 1214.3 (VM)


If I change this to CPU instead of GPU (i.e. final long deviceType =
CL_DEVICE_TYPE_CPU;) then the mapper runs to completion and the job succeeds.
Please find the code snippet (Mapper class) below:

package com.testMR.jocl;
import static org.jocl.CL.CL_CONTEXT_PLATFORM;
import static org.jocl.CL.CL_DEVICE_TYPE_ALL;
import static org.jocl.CL.CL_DEVICE_TYPE_GPU;
import static org.jocl.CL.CL_DEVICE_TYPE_CPU;
import static org.jocl.CL.CL_MEM_COPY_HOST_PTR;
import static org.jocl.CL.CL_MEM_READ_ONLY;
import static org.jocl.CL.CL_MEM_READ_WRITE;
import static org.jocl.CL.CL_TRUE;
import static org.jocl.CL.clBuildProgram;
import static org.jocl.CL.clCreateBuffer;
import static org.jocl.CL.clCreateCommandQueue;
import static org.jocl.CL.clCreateContext;
import static org.jocl.CL.clCreateKernel;
import static org.jocl.CL.clCreateProgramWithSource;
import static org.jocl.CL.clEnqueueNDRangeKernel;
import static org.jocl.CL.clEnqueueReadBuffer;
import static org.jocl.CL.clGetDeviceIDs;
import static org.jocl.CL.clGetPlatformIDs;
import static org.jocl.CL.clReleaseCommandQueue;
import static org.jocl.CL.clReleaseContext;
import static org.jocl.CL.clReleaseKernel;
import static org.jocl.CL.clReleaseMemObject;
import static org.jocl.CL.clReleaseProgram;
import static org.jocl.CL.clSetKernelArg;

import java.io.IOException;
import java.util.StringTokenizer;
import org.apache.hadoop.io.*;
import org.apache.hadoop.mapreduce.Mapper;
import org.jocl.CL;
import org.jocl.Pointer;
import org.jocl.Sizeof;
import org.jocl.cl_command_queue;
import org.jocl.cl_context;
import org.jocl.cl_context_properties;
import org.jocl.cl_device_id;
import org.jocl.cl_kernel;
import org.jocl.cl_mem;
import org.jocl.cl_platform_id;
import org.jocl.cl_program;

public class WordCountMapper extends Mapper<LongWritable, Text, Text, IntWritable>
{
     private static String programSource =
             "__kernel void "+
             "sampleKernel(__global const float *a,"+
             "             __global const float *b,"+
             "             __global float *c)"+
             "{"+
             "    int gid = get_global_id(0);"+
             "    c[gid] = a[gid] * b[gid];"+
             "}";
      //hadoop supported data types
      private final static IntWritable one = new IntWritable(1);
      private Text word = new Text();
           public void map(LongWritable key, Text value, Context context)
                   throws IOException, InterruptedException
           {
             //context.write(arg0, arg1);
           }

           protected void cleanup(Context context)
                   throws IOException, InterruptedException
           {
                    // Create input- and output data
                    int n = 10;
                    float srcArrayA[] = new float[n];
                    float srcArrayB[] = new float[n];
                    float dstArray[] = new float[n];
                    for (int i=0; i<n; i++)
                    {
                        srcArrayA[i] = i;
                        srcArrayB[i] = i;
                    }
                    Pointer srcA = Pointer.to(srcArrayA);
                    Pointer srcB = Pointer.to(srcArrayB);
                    Pointer dst = Pointer.to(dstArray);

                    // The platform, device type and device number
                    // that will be used
                    final int platformIndex = 0;
                    final long deviceType = CL_DEVICE_TYPE_GPU;
                    final int deviceIndex = 0;

                    // Enable exceptions and subsequently omit error checks
                    // in this sample
                    CL.setExceptionsEnabled(true);

                    // Obtain the number of platforms
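The CPU-vs-GPU difference described above can be illustrated without a GPU at hand: OpenCL device types are bitfield values, and clGetDeviceIDs reports CL_DEVICE_NOT_FOUND when no visible device matches the requested type mask. A minimal plain-Java sketch of that matching (the DeviceTypeMask class and matchingDevices helper are hypothetical; only the constant values come from the OpenCL specification, which org.jocl.CL mirrors):

```java
// Hypothetical illustration (plain Java, no JOCL needed) of how a
// requested device-type mask is matched against visible devices.
public class DeviceTypeMask {
    // Bitfield values from the OpenCL specification.
    static final long CL_DEVICE_TYPE_CPU = 1L << 1; // 2
    static final long CL_DEVICE_TYPE_GPU = 1L << 2; // 4
    static final long CL_DEVICE_TYPE_ALL = 0xFFFFFFFFL;

    /** Count visible devices whose type matches the requested mask. */
    static int matchingDevices(long[] deviceTypes, long requestedType) {
        int count = 0;
        for (long t : deviceTypes) {
            if ((t & requestedType) != 0) count++;
        }
        return count;
    }

    public static void main(String[] args) {
        // Suppose only a CPU device is visible to the task's process,
        // e.g. because the task user cannot reach the GPU.
        long[] visible = { CL_DEVICE_TYPE_CPU };
        System.out.println(matchingDevices(visible, CL_DEVICE_TYPE_CPU)); // prints 1
        System.out.println(matchingDevices(visible, CL_DEVICE_TYPE_GPU)); // prints 0
    }
}
```

A zero match count is the situation in which clGetDeviceIDs reports CL_DEVICE_NOT_FOUND, which is consistent with the stack trace above: the GPU request fails inside the task JVM while the same request for CL_DEVICE_TYPE_CPU succeeds.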