org.apache.hadoop.mapred.LineRecordReader Java Examples
The following examples show how to use
org.apache.hadoop.mapred.LineRecordReader.
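Before the project-specific examples, here is a minimal, self-contained sketch (not taken from any of the projects below) of the basic pattern they all build on: construct a LineRecordReader from a Configuration and a FileSplit, then pull records with createKey(), createValue() and next(). The class name LineRecordReaderDemo and the path /tmp/input.txt are placeholders, not part of any example.

import java.io.IOException;

import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileSplit;
import org.apache.hadoop.mapred.JobConf;
import org.apache.hadoop.mapred.LineRecordReader;

public class LineRecordReaderDemo {
    public static void main(String[] args) throws IOException {
        JobConf conf = new JobConf();
        Path file = new Path("/tmp/input.txt");    // hypothetical input file
        long length = FileSystem.get(conf).getFileStatus(file).getLen();

        // One split covering the whole file; no preferred hosts.
        FileSplit split = new FileSplit(file, 0, length, (String[]) null);
        LineRecordReader reader = new LineRecordReader(conf, split);
        try {
            LongWritable key = reader.createKey();   // byte offset of the current line
            Text value = reader.createValue();       // contents of the current line
            while (reader.next(key, value)) {
                System.out.println(key + "\t" + value);
            }
        } finally {
            reader.close();
        }
    }
}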
Example #1
Source File: JsonAccessor.java From pxf with Apache License 2.0
@Override
protected Object getReader(JobConf conf, InputSplit split) throws IOException {
    if (!isEmpty(identifier)) {
        // A record member identifier is configured: read whole JSON records.
        conf.set(JsonRecordReader.RECORD_MEMBER_IDENTIFIER, identifier);
        conf.setInt(JsonRecordReader.RECORD_MAX_LENGTH, maxRecordLength);
        return new JsonRecordReader(conf, (FileSplit) split);
    } else {
        // No identifier: fall back to plain line-by-line reading.
        return new LineRecordReader(conf, (FileSplit) split);
    }
}
Example #2
Source File: LineBreakAccessor.java From pxf with Apache License 2.0
@Override
protected Object getReader(JobConf jobConf, InputSplit split) throws IOException {
    // for HDFS, try to use ChunkRecordReader, if possible (not reading from encrypted zone)
    if (hcfsType == HcfsType.HDFS) {
        try {
            return new ChunkRecordReader(jobConf, (FileSplit) split);
        } catch (IncompatibleInputStreamException e) {
            // ignore and fall back to using LineRecordReader
            LOG.debug("Failed to use ChunkRecordReader, falling back to LineRecordReader : " + e.getMessage());
        }
    }
    return new LineRecordReader(jobConf, (FileSplit) split);
}
Example #3
Source File: NLineInputFormat.java From hadoop with Apache License 2.0
public RecordReader<LongWritable, Text> getRecordReader(InputSplit genericSplit, JobConf job,
                                                        Reporter reporter) throws IOException {
    reporter.setStatus(genericSplit.toString());
    return new LineRecordReader(job, (FileSplit) genericSplit);
}
Example #4
Source File: NLineInputFormat.java From big-c with Apache License 2.0
public RecordReader<LongWritable, Text> getRecordReader(InputSplit genericSplit, JobConf job,
                                                        Reporter reporter) throws IOException {
    reporter.setStatus(genericSplit.toString());
    return new LineRecordReader(job, (FileSplit) genericSplit);
}
Example #5
Source File: Base64TextInputFormat.java From bigdata-tutorial with Apache License 2.0
public RecordReader<LongWritable, BytesWritable> getRecordReader(InputSplit genericSplit, JobConf job,
                                                                 Reporter reporter) throws IOException {
    reporter.setStatus(genericSplit.toString());
    Base64LineRecordReader reader = new Base64LineRecordReader(
            new LineRecordReader(job, (FileSplit) genericSplit));
    reader.configure(job);
    return reader;
}
Example #6
Source File: MyDemoInputFormat.java From bigdata-tutorial with Apache License 2.0
@Override
public RecordReader<LongWritable, Text> getRecordReader(InputSplit genericSplit, JobConf job,
                                                        Reporter reporter) throws IOException {
    reporter.setStatus(genericSplit.toString());
    MyDemoRecordReader reader = new MyDemoRecordReader(
            new LineRecordReader(job, (FileSplit) genericSplit));
    return reader;
}
Example #7
Source File: NLineInputFormat.java From RDFS with Apache License 2.0
public RecordReader<LongWritable, Text> getRecordReader(InputSplit genericSplit, JobConf job,
                                                        Reporter reporter) throws IOException {
    reporter.setStatus(genericSplit.toString());
    return new LineRecordReader(job, (FileSplit) genericSplit);
}
Example #8
Source File: NLineInputFormat.java From hadoop-gpu with Apache License 2.0
public RecordReader<LongWritable, Text> getRecordReader(InputSplit genericSplit, JobConf job,
                                                        Reporter reporter) throws IOException {
    reporter.setStatus(genericSplit.toString());
    return new LineRecordReader(job, (FileSplit) genericSplit);
}
Example #9
Source File: HiveLineBreakAccessor.java From pxf with Apache License 2.0
@Override
protected Object getReader(JobConf jobConf, InputSplit split) throws IOException {
    return new LineRecordReader(jobConf, (FileSplit) split);
}
Example #10
Source File: Base64TextInputFormat.java From bigdata-tutorial with Apache License 2.0
public Base64LineRecordReader(LineRecordReader reader) {
    this.reader = reader;
    text = reader.createValue();
}
Example #11
Source File: MyDemoInputFormat.java From bigdata-tutorial with Apache License 2.0
public MyDemoRecordReader(LineRecordReader reader) {
    this.reader = reader;
    text = reader.createValue();
}
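Examples #5, #6, #10 and #11 show only how the wrapping readers are constructed around a LineRecordReader; the methods that do the actual work are not shown above. The following is a rough, hypothetical sketch of how such a wrapper typically delegates the remaining RecordReader methods to the inner reader and rewrites each value before handing it back. It uses the (LongWritable, Text) shape of MyDemoRecordReader; the class name WrappingRecordReader and the transform() hook are assumptions, not code from either project.

import java.io.IOException;

import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.LineRecordReader;
import org.apache.hadoop.mapred.RecordReader;

public class WrappingRecordReader implements RecordReader<LongWritable, Text> {
    private final LineRecordReader reader;
    private final Text innerValue;

    public WrappingRecordReader(LineRecordReader reader) {
        this.reader = reader;
        this.innerValue = reader.createValue();
    }

    @Override
    public boolean next(LongWritable key, Text value) throws IOException {
        if (!reader.next(key, innerValue)) {
            return false;                      // inner reader is exhausted
        }
        value.set(transform(innerValue));      // e.g. Base64-decode the raw line
        return true;
    }

    @Override public LongWritable createKey() { return reader.createKey(); }
    @Override public Text createValue() { return reader.createValue(); }
    @Override public long getPos() throws IOException { return reader.getPos(); }
    @Override public float getProgress() throws IOException { return reader.getProgress(); }
    @Override public void close() throws IOException { reader.close(); }

    // Placeholder for the per-line logic a real wrapper would apply.
    private Text transform(Text line) {
        return line;
    }
}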
Example #12
Source File: TeraInputFormat.java From RDFS with Apache License 2.0
public TeraRecordReader(Configuration job, FileSplit split) throws IOException {
    in = new LineRecordReader(job, split);
}
Example #13
Source File: TeraInputFormat.java From hadoop-book with Apache License 2.0
public TeraRecordReader(Configuration job, FileSplit split) throws IOException {
    in = new LineRecordReader(job, split);
}
Example #14
Source File: TeraInputFormat.java From hadoop-gpu with Apache License 2.0
public TeraRecordReader(Configuration job, FileSplit split) throws IOException {
    in = new LineRecordReader(job, split);
}
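Examples #12 to #14 show only the TeraRecordReader constructor. For context, in the TeraSort-style input formats the wrapping reader generally reads one line from the LineRecordReader, discards the byte-offset key, and splits the line into a fixed-length key with the remaining bytes as the value. The sketch below illustrates that pattern under those assumptions; the class name TeraStyleRecordReader, the 10-byte key length, and the junk/line field names are illustrative, not copied from the projects above.

import java.io.IOException;

import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.io.LongWritable;
import org.apache.hadoop.io.Text;
import org.apache.hadoop.mapred.FileSplit;
import org.apache.hadoop.mapred.LineRecordReader;

public class TeraStyleRecordReader {
    private static final int KEY_LENGTH = 10;              // assumed fixed-width key

    private final LineRecordReader in;
    private final LongWritable junk = new LongWritable();  // byte offset, ignored
    private final Text line = new Text();

    public TeraStyleRecordReader(Configuration job, FileSplit split) throws IOException {
        in = new LineRecordReader(job, split);
    }

    // Split each line into a KEY_LENGTH-byte key and the rest as the value.
    public boolean next(Text key, Text value) throws IOException {
        if (!in.next(junk, line)) {
            return false;
        }
        byte[] bytes = line.getBytes();
        if (line.getLength() < KEY_LENGTH) {
            key.set(line);
            value.clear();
        } else {
            key.set(bytes, 0, KEY_LENGTH);
            value.set(bytes, KEY_LENGTH, line.getLength() - KEY_LENGTH);
        }
        return true;
    }

    public void close() throws IOException {
        in.close();
    }
}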