public class AvroMultipleInputsKeyInputFormat<T>
extends org.apache.hadoop.mapreduce.lib.input.FileInputFormat<org.apache.avro.mapred.AvroKey<T>,org.apache.hadoop.io.NullWritable>
Keys are AvroKey wrapper objects that contain the Avro data. Since Avro container files store only records (not key/value pairs), the value from this InputFormat is a NullWritable.
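For orientation, here is a minimal, hypothetical sketch of wiring this InputFormat into a MapReduce job via Hadoop's MultipleInputs. The input paths and RecordCountMapper are placeholders, AvroMultipleInputsKeyInputFormat is assumed to be imported from its package, and any per-input schema registration required by the surrounding library is omitted.

```java
import java.io.IOException;

import org.apache.avro.generic.GenericRecord;
import org.apache.avro.mapred.AvroKey;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapreduce.Job;
import org.apache.hadoop.mapreduce.Mapper;
import org.apache.hadoop.mapreduce.lib.input.MultipleInputs;

public class AvroMultipleInputsJobSketch {

  // Hypothetical mapper: keys arrive as AvroKey<GenericRecord>, values as NullWritable.
  public static class RecordCountMapper
      extends Mapper<AvroKey<GenericRecord>, NullWritable, Text, LongWritable> {
    private static final LongWritable ONE = new LongWritable(1L);

    @Override
    protected void map(AvroKey<GenericRecord> key, NullWritable value, Context context)
        throws IOException, InterruptedException {
      // datum() unwraps the Avro record carried by the AvroKey.
      String schemaName = key.datum().getSchema().getFullName();
      context.write(new Text(schemaName), ONE);
    }
  }

  public static Job configure(Configuration conf) throws Exception {
    Job job = Job.getInstance(conf, "avro-multiple-inputs-sketch");
    job.setJarByClass(AvroMultipleInputsJobSketch.class);
    // Each input path may hold Avro container files with its own schema;
    // both are read through AvroMultipleInputsKeyInputFormat. The paths are placeholders.
    MultipleInputs.addInputPath(job, new Path("/data/events-a"),
        AvroMultipleInputsKeyInputFormat.class, RecordCountMapper.class);
    MultipleInputs.addInputPath(job, new Path("/data/events-b"),
        AvroMultipleInputsKeyInputFormat.class, RecordCountMapper.class);
    job.setOutputKeyClass(Text.class);
    job.setOutputValueClass(LongWritable.class);
    return job;
  }
}
```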
| Constructor and Description |
|---|
| AvroMultipleInputsKeyInputFormat() |
| Modifier and Type | Method and Description |
|---|---|
| org.apache.hadoop.mapreduce.RecordReader<org.apache.avro.mapred.AvroKey<T>,org.apache.hadoop.io.NullWritable> | createRecordReader(org.apache.hadoop.mapreduce.InputSplit split, org.apache.hadoop.mapreduce.TaskAttemptContext context) |
Methods inherited from class org.apache.hadoop.mapreduce.lib.input.FileInputFormat:
addInputPath, addInputPathRecursively, addInputPaths, computeSplitSize, getBlockIndex, getFormatMinSplitSize, getInputDirRecursive, getInputPathFilter, getInputPaths, getMaxSplitSize, getMinSplitSize, getSplits, isSplitable, listStatus, makeSplit, makeSplit, setInputDirRecursive, setInputPathFilter, setInputPaths, setInputPaths, setMaxInputSplitSize, setMinInputSplitSize

public org.apache.hadoop.mapreduce.RecordReader<org.apache.avro.mapred.AvroKey<T>,org.apache.hadoop.io.NullWritable> createRecordReader(org.apache.hadoop.mapreduce.InputSplit split, org.apache.hadoop.mapreduce.TaskAttemptContext context) throws java.io.IOException, java.lang.InterruptedException

Specified by:
createRecordReader in class org.apache.hadoop.mapreduce.InputFormat<org.apache.avro.mapred.AvroKey<T>,org.apache.hadoop.io.NullWritable>
Throws:
java.io.IOException
java.lang.InterruptedException
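The returned reader is normally created and driven by the MapReduce framework rather than by user code. The sketch below only illustrates that contract using the standard RecordReader methods; the ReaderLoopSketch class and the idea of driving the reader by hand are assumptions for illustration, and AvroMultipleInputsKeyInputFormat is assumed to be imported from its package.

```java
import org.apache.avro.mapred.AvroKey;
import org.apache.hadoop.io.NullWritable;
import org.apache.hadoop.mapreduce.InputSplit;
import org.apache.hadoop.mapreduce.RecordReader;
import org.apache.hadoop.mapreduce.TaskAttemptContext;

public class ReaderLoopSketch<T> {
  public void drive(AvroMultipleInputsKeyInputFormat<T> format,
                    InputSplit split, TaskAttemptContext context) throws Exception {
    // The framework obtains a reader for each split, then iterates it.
    RecordReader<AvroKey<T>, NullWritable> reader =
        format.createRecordReader(split, context);
    reader.initialize(split, context);
    try {
      while (reader.nextKeyValue()) {
        // The key wraps one Avro record; the value is always NullWritable.
        AvroKey<T> key = reader.getCurrentKey();
        NullWritable value = reader.getCurrentValue();
        // ... hand (key, value) to the mapper ...
      }
    } finally {
      reader.close();
    }
  }
}
```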