HDFS user mailing list: How-to use DFSClient's BlockReader from Java


David Pavlis 2012-01-06, 11:01
Re: How-to use DFSClient's BlockReader from Java
Hi David,
     Please refer to the method "DFSInputStream#blockSeekTo"; it serves the
same purpose as what you are trying to do.

***************************************************************************
        LocatedBlock targetBlock = getBlockAt(target, true);
        assert (target==this.pos) : "Wrong position " + pos + " expect " + target;
        long offsetIntoBlock = target - targetBlock.getStartOffset();

        DNAddrPair retval = chooseDataNode(targetBlock);
        chosenNode = retval.info;
        InetSocketAddress targetAddr = retval.addr;

        try {
          s = socketFactory.createSocket();
          NetUtils.connect(s, targetAddr, socketTimeout);
          s.setSoTimeout(socketTimeout);
          Block blk = targetBlock.getBlock();
          Token<BlockTokenIdentifier> accessToken = targetBlock.getBlockToken();

          blockReader = BlockReader.newBlockReader(s, src, blk.getBlockId(),
              accessToken,
              blk.getGenerationStamp(),
              offsetIntoBlock, blk.getNumBytes() - offsetIntoBlock,
              buffersize, verifyChecksum, clientName);

***************************************************************************
-Regards
Denny Ye

2012/1/6 David Pavlis <[EMAIL PROTECTED]>

> Hi,
>
> I am relatively new to Hadoop and I am trying to use HDFS for my own
> application, where I want to take advantage of the data partitioning HDFS
> performs.
>
> The idea is that I get the list of individual blocks (BlockLocations) of a
> particular file and then read those directly (going to the individual
> DataNodes). So far I found org.apache.hadoop.hdfs.DFSClient.BlockReader to
> be the way to go.
>
> However, I am struggling with instantiating the BlockReader class, namely
> with creating the "Token<BlockTokenIdentifier>".
>
> Is there example Java code showing how to access the individual blocks of a
> particular file stored on HDFS?
>
> Thanks in advance,
>
> David.
>
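For reference, the two messages above can be tied together in a small standalone sketch. The Token<BlockTokenIdentifier> does not need to be built by hand: every LocatedBlock the NameNode returns already carries one, exposed through LocatedBlock.getBlockToken(), which is exactly what blockSeekTo uses. The code below is only a hypothetical sketch against the 0.20/1.x-era API shown in Denny's excerpt; it assumes you already have a ClientProtocol proxy to the NameNode (how you obtain one differs between versions, so it is passed in as a parameter), and the class and method names DirectBlockRead/readFirstBlock are made up for illustration.

***************************************************************************
import java.net.InetSocketAddress;
import java.net.Socket;

import org.apache.hadoop.hdfs.DFSClient.BlockReader;
import org.apache.hadoop.hdfs.protocol.Block;
import org.apache.hadoop.hdfs.protocol.ClientProtocol;
import org.apache.hadoop.hdfs.protocol.DatanodeInfo;
import org.apache.hadoop.hdfs.protocol.LocatedBlock;
import org.apache.hadoop.hdfs.security.token.block.BlockTokenIdentifier;
import org.apache.hadoop.io.IOUtils;
import org.apache.hadoop.net.NetUtils;
import org.apache.hadoop.security.token.Token;

// Hypothetical helper, not taken from the thread.
public class DirectBlockRead {

  // Reads the first block of 'src' directly from the first DataNode that
  // holds a replica and returns its contents.
  public static byte[] readFirstBlock(ClientProtocol namenode, String src,
                                      String clientName) throws Exception {
    // Ask the NameNode for the block list; each LocatedBlock already
    // carries the block access token, so nothing has to be constructed.
    LocatedBlock lb =
        namenode.getBlockLocations(src, 0, 1).getLocatedBlocks().get(0);
    Block blk = lb.getBlock();
    Token<BlockTokenIdentifier> accessToken = lb.getBlockToken();

    // Connect to the first replica, much like chooseDataNode/blockSeekTo.
    DatanodeInfo dn = lb.getLocations()[0];
    InetSocketAddress dnAddr = NetUtils.createSocketAddr(dn.getName());
    Socket s = new Socket();
    NetUtils.connect(s, dnAddr, 60 * 1000);
    s.setSoTimeout(60 * 1000);

    // Same call as in the excerpt: read the whole block from offset 0 with
    // checksum verification enabled.
    BlockReader reader = BlockReader.newBlockReader(
        s, src, blk.getBlockId(), accessToken, blk.getGenerationStamp(),
        0, blk.getNumBytes(), 4096, true, clientName);

    byte[] buf = new byte[(int) blk.getNumBytes()];
    IOUtils.readFully(reader, buf, 0, buf.length);
    reader.close();
    s.close();
    return buf;
  }
}
***************************************************************************

Reading a block this way bypasses DFSInputStream's retry and failover logic, so a real client would loop over lb.getLocations() and fall back to another replica on failure, the way chooseDataNode does.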
Further replies in this thread:
David Pavlis 2012-01-09, 17:56
Todd Lipcon 2012-01-09, 17:59
David Pavlis 2012-01-09, 20:01
Todd Lipcon 2012-01-09, 20:30
David Pavlis 2012-01-10, 12:32
Todd Lipcon 2012-01-10, 18:19
David Pavlis 2012-01-11, 16:31
Joey Echeverria 2012-01-11, 16:34