Java Code Examples for org.apache.flink.streaming.api.environment.StreamExecutionEnvironment#readFile()
The following examples show how to use
org.apache.flink.streaming.api.environment.StreamExecutionEnvironment#readFile().
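Before the project examples below, here is a minimal, self-contained sketch of the readFile() API in one-shot mode. This is an assumption-laden illustration, not code from any of the projects: the input path is a placeholder, and it assumes Flink's streaming Java API is on the classpath. With PROCESS_ONCE the path is scanned a single time and the job then finishes; the interval argument only matters for PROCESS_CONTINUOUSLY.

```java
import org.apache.flink.api.java.io.TextInputFormat;
import org.apache.flink.core.fs.Path;
import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.streaming.api.functions.source.FileProcessingMode;

public class ReadFileSketch {

	public static void main(String[] args) throws Exception {
		StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

		// TextInputFormat reads the file(s) under the path line by line.
		// "/tmp/input" is a placeholder path for this sketch.
		TextInputFormat format = new TextInputFormat(new Path("/tmp/input"));

		// PROCESS_ONCE scans the path a single time and then stops;
		// the interval (100 ms here) is only used by PROCESS_CONTINUOUSLY.
		DataStream<String> lines = env.readFile(
			format, "/tmp/input", FileProcessingMode.PROCESS_ONCE, 100);

		lines.print();
		env.execute("readFile sketch");
	}
}
```

The PROCESS_CONTINUOUSLY variant used in the examples below keeps re-scanning the path at the given interval, which is what makes it suitable for the checkpointing tests shown.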
Example 1
Source File: ContinuousFileProcessingCheckpointITCase.java From flink with Apache License 2.0

@Override
public void testProgram(StreamExecutionEnvironment env) {
	env.enableCheckpointing(10);

	// create and start the file creating thread.
	fc = new FileCreator();
	fc.start();

	// create the monitoring source along with the necessary readers.
	TextInputFormat format = new TextInputFormat(new org.apache.flink.core.fs.Path(localFsURI));
	format.setFilesFilter(FilePathFilter.createDefaultFilter());

	DataStream<String> inputStream = env.readFile(format, localFsURI,
		FileProcessingMode.PROCESS_CONTINUOUSLY, INTERVAL);

	TestingSinkFunction sink = new TestingSinkFunction();

	inputStream.flatMap(new FlatMapFunction<String, String>() {
		@Override
		public void flatMap(String value, Collector<String> out) throws Exception {
			out.collect(value);
		}
	}).addSink(sink).setParallelism(1);
}
Example 2
Source File: ContinuousFileProcessingCheckpointITCase.java From Flink-CEPplus with Apache License 2.0

@Override
public void testProgram(StreamExecutionEnvironment env) {
	// set the restart strategy.
	env.getConfig().setRestartStrategy(RestartStrategies.fixedDelayRestart(NO_OF_RETRIES, 0));
	env.enableCheckpointing(10);

	// create and start the file creating thread.
	fc = new FileCreator();
	fc.start();

	// create the monitoring source along with the necessary readers.
	TextInputFormat format = new TextInputFormat(new org.apache.flink.core.fs.Path(localFsURI));
	format.setFilesFilter(FilePathFilter.createDefaultFilter());

	DataStream<String> inputStream = env.readFile(format, localFsURI,
		FileProcessingMode.PROCESS_CONTINUOUSLY, INTERVAL);

	TestingSinkFunction sink = new TestingSinkFunction();

	inputStream.flatMap(new FlatMapFunction<String, String>() {
		@Override
		public void flatMap(String value, Collector<String> out) throws Exception {
			out.collect(value);
		}
	}).addSink(sink).setParallelism(1);
}
Example 3
Source File: InsideDataSource.java From flink-simple-tutorial with Apache License 2.0

public static void main(String[] args) throws Exception {
	final StreamExecutionEnvironment env = StreamExecutionEnvironment.getExecutionEnvironment();

	// Add an array as a data source
	String[] elementInput = new String[]{"hello Flink", "Second Line"};
	DataStream<String> text = env.fromElements(elementInput);

	// Add a List collection as a data source
	List<String> collectionInput = new ArrayList<>();
	collectionInput.add("hello Flink");
	DataStream<String> text2 = env.fromCollection(collectionInput);

	// Add a socket as a data source
	// four parameters -> (hostname, port, delimiter, maxRetry: maximum number of retries)
	DataStream<String> text3 = env.socketTextStream("localhost", 9999, "\n", 4);

	// Add a file source
	// read a text file directly
	DataStream<String> text4 = env.readTextFile("/opt/history.log");

	// Specify a CsvInputFormat and monitor the CSV file (two processing modes
	// are available); the scan interval is 10 ms
	DataStream<String> text5 = env.readFile(new CsvInputFormat<String>(new Path("/opt/history.csv")) {
		@Override
		protected String fillRecord(String s, Object[] objects) {
			return null;
		}
	}, "/opt/history.csv", FileProcessingMode.PROCESS_CONTINUOUSLY, 10);

	text.print();
	env.execute("Inside DataSource Demo");
}
Example 4
Source File: ContinuousFileProcessingCheckpointITCase.java From flink with Apache License 2.0

@Override
public void testProgram(StreamExecutionEnvironment env) {
	// set the restart strategy.
	env.getConfig().setRestartStrategy(RestartStrategies.fixedDelayRestart(NO_OF_RETRIES, 0));
	env.enableCheckpointing(10);

	// create and start the file creating thread.
	fc = new FileCreator();
	fc.start();

	// create the monitoring source along with the necessary readers.
	TextInputFormat format = new TextInputFormat(new org.apache.flink.core.fs.Path(localFsURI));
	format.setFilesFilter(FilePathFilter.createDefaultFilter());

	DataStream<String> inputStream = env.readFile(format, localFsURI,
		FileProcessingMode.PROCESS_CONTINUOUSLY, INTERVAL);

	TestingSinkFunction sink = new TestingSinkFunction();

	inputStream.flatMap(new FlatMapFunction<String, String>() {
		@Override
		public void flatMap(String value, Collector<String> out) throws Exception {
			out.collect(value);
		}
	}).addSink(sink).setParallelism(1);
}