rohit sarewar 2013-05-13, 13:34
Hadoop just runs as a standard Java process, so you need something that
bridges between OpenCL and Java. A quick Google search yields several Java
OpenCL bindings; I expect you'll find everything you need there to accomplish
the handoff from your MapReduce code to OpenCL.
As for HDFS, Hadoop will generally handle marshaling data into and out of
HDFS for you. Remember that you're thinking of your problem in terms of
(KEY, VALUE) pairs: you're going to implement a map() and a reduce() method,
and in those methods you'll pass the data to and from OpenCL via the Java
bindings. It's also quite common to need multiple map/reduce steps to
accomplish an end goal.
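To make that concrete, here is a rough sketch of what the mapper side could
look like. GpuMetrics.compute() is a hypothetical wrapper around the JOCL
calls above (it is not part of Hadoop or JOCL); the idea is to batch records
in map() and launch the kernel once per task in cleanup() rather than once
per record, since kernel launches and host/device copies are expensive.

import java.io.IOException;
import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.io.FloatWritable;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Mapper;

public class GpuMetricMapper
        extends Mapper<LongWritable, Text, Text, FloatWritable> {

    private final List<Float> batch = new ArrayList<Float>();

    @Override
    protected void map(LongWritable key, Text value, Context context)
            throws IOException, InterruptedException {
        // Hadoop has already read this record out of HDFS for us;
        // just accumulate it for the GPU (assumes one float per line).
        batch.add(Float.parseFloat(value.toString().trim()));
    }

    @Override
    protected void cleanup(Context context)
            throws IOException, InterruptedException {
        float[] in = new float[batch.size()];
        for (int i = 0; i < in.length; i++) in[i] = batch.get(i);

        // Hypothetical helper wrapping the JOCL boilerplate shown earlier:
        // create a context/queue, copy 'in' to the device, run the kernel,
        // and read the results back into a float[].
        float[] out = GpuMetrics.compute(in);

        for (float v : out) {
            context.write(new Text("gpu-metric"), new FloatWritable(v));
        }
    }
}

Note that buffering a whole split in memory only works if it fits in the
task's heap; for larger splits you'd flush the batch to the GPU in fixed-size
chunks from within map(). The 3 CPU-bound metrics can run as a separate job
over the same HDFS input, chained from your driver.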
From: rohit sarewar [mailto:[EMAIL PROTECTED]]
Sent: Monday, May 13, 2013 8:35 PM
To: [EMAIL PROTECTED]
Subject: Access HDFS from OpenCL
My data set resides in HDFS. I need to compute 5 metrics, 2 of which are
compute intensive. So I want to compute those 2 metrics on the GPU using
OpenCL and the remaining 3 metrics using Java MapReduce code on Hadoop.
How can I pass data from HDFS to the GPU? Or, how can my OpenCL code access
data from HDFS?
How can I trigger OpenCL code from my Java MapReduce code?
It would be great if someone could share some sample code.