org.apache.spark.sql.sources.v2.writer.DataWriterFactory Java Examples
The following examples show how to use
org.apache.spark.sql.sources.v2.writer.DataWriterFactory.
Example #1
Source File: ParallelRowReadWriteDataSource.java, from spark-data-sources (MIT License)
@Override
public DataWriterFactory<Row> createWriterFactory() {
    DataWriterFactory<Row> factory =
        new RowDataWriterFactory(_host, _port, _table, _schema);
    return factory;
}
Example #2
Source File: Writer.java, from iceberg (Apache License 2.0)
@Override
public DataWriterFactory<InternalRow> createWriterFactory() {
    return new WriterFactory(
        table.spec(), format, table.locationProvider(), table.properties(),
        io, encryptionManager, targetFileSize, writeSchema, dsSchema);
}
Example #3
Source File: Writer.java, from iceberg (Apache License 2.0)
@Override
public DataWriterFactory<InternalRow> createInternalRowWriterFactory() {
    return new WriterFactory(
        table.spec(), format, dataLocation(), table.properties(), conf);
}
Example #4
Source File: HiveWarehouseDataSourceWriter.java, from spark-llap (Apache License 2.0)
@Override
public DataWriterFactory<InternalRow> createInternalRowWriterFactory() {
    return new HiveWarehouseDataWriterFactory(
        jobId, schema, path, new SerializableHadoopConfiguration(conf));
}
Example #5
Source File: HiveStreamingDataSourceWriter.java, from spark-llap (Apache License 2.0)
@Override
public DataWriterFactory<InternalRow> createInternalRowWriterFactory() {
    return new HiveStreamingDataWriterFactory(
        jobId, schema, commitIntervalRows, db, table, partition,
        metastoreUri, metastoreKrbPrincipal);
}
Example #6
Source File: HiveStreamingDataSourceWriter.java, from spark-llap (Apache License 2.0)
@Override
public DataWriterFactory<InternalRow> createInternalRowWriterFactory() {
    // For the streaming case, the transaction is committed on task commit()
    // (at-least-once semantics), so the commit interval is set to -1.
    return new HiveStreamingDataWriterFactory(
        jobId, schema, -1, db, table, partition,
        metastoreUri, metastoreKerberosPrincipal);
}
Example #7
Source File: MockWriteSupport.java, from spark-llap (Apache License 2.0)
@Override
public DataWriterFactory<InternalRow> createInternalRowWriterFactory() {
    return new MockHiveWarehouseDataWriterFactory(
        jobId, schema, path, new SerializableHadoopConfiguration(conf));
}
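All of the factories above implement the same small contract: the driver creates a serializable DataWriterFactory, ships it to each executor, and each task calls createDataWriter to get its own DataWriter. The sketch below illustrates that shape end to end. The stub interfaces mirror the Spark 2.4 org.apache.spark.sql.sources.v2.writer API (the method names are real, but the stubs stand in for the Spark jars so the example is self-contained; note that Spark 2.3 used the signature createDataWriter(int partitionId, int attemptNumber) instead). InMemoryWriterFactory and Demo are hypothetical names introduced here for illustration, and the writer simply buffers records in memory rather than writing to a real sink.

```java
import java.io.Serializable;
import java.util.ArrayList;
import java.util.List;

// Stubs mirroring the Spark 2.4 DataSourceV2 writer interfaces
// (org.apache.spark.sql.sources.v2.writer); real code imports those instead.
interface WriterCommitMessage extends Serializable {}

interface DataWriter<T> {
    void write(T record) throws Exception;
    WriterCommitMessage commit() throws Exception;
    void abort() throws Exception;
}

interface DataWriterFactory<T> extends Serializable {
    // Spark 2.4 signature; Spark 2.3 used (int partitionId, int attemptNumber).
    DataWriter<T> createDataWriter(int partitionId, long taskId, long epochId);
}

// Hypothetical factory: each task gets a writer that buffers rows in memory.
class InMemoryWriterFactory implements DataWriterFactory<String> {
    @Override
    public DataWriter<String> createDataWriter(int partitionId, long taskId, long epochId) {
        return new DataWriter<String>() {
            private final List<String> buffer = new ArrayList<>();

            @Override
            public void write(String record) {
                buffer.add(record);
            }

            @Override
            public WriterCommitMessage commit() {
                final int count = buffer.size();
                // The commit message travels back to the driver, which only
                // finalizes the write once every task has committed.
                return new WriterCommitMessage() {
                    @Override
                    public String toString() { return "wrote " + count + " rows"; }
                };
            }

            @Override
            public void abort() {
                buffer.clear();  // discard anything written by a failed task
            }
        };
    }
}

public class Demo {
    public static void main(String[] args) throws Exception {
        DataWriter<String> writer = new InMemoryWriterFactory().createDataWriter(0, 0L, 0L);
        writer.write("a");
        writer.write("b");
        System.out.println(writer.commit());  // prints "wrote 2 rows"
    }
}
```

The factory (not the writer) must be Serializable because Spark serializes it on the driver and deserializes it inside each task, which is why the real examples above pass plain serializable state (job id, schema, paths, a SerializableHadoopConfiguration) into their factory constructors.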