Java Code Examples for org.apache.flink.table.types.utils.TypeConversions#fromLegacyInfoToDataType()
The following examples show how to use org.apache.flink.table.types.utils.TypeConversions#fromLegacyInfoToDataType().
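Before the examples from the Flink codebase, here is a minimal, hedged sketch of the call itself. The class and method names come from the Flink Table API; the wrapping class is invented for illustration, and the snippet assumes flink-core and flink-table-common are on the classpath.

```java
import org.apache.flink.api.common.typeinfo.Types;
import org.apache.flink.table.types.DataType;
import org.apache.flink.table.types.utils.TypeConversions;

public class FromLegacyInfoSketch {
    public static void main(String[] args) {
        // Bridge a legacy TypeInformation into the newer DataType stack.
        DataType dataType = TypeConversions.fromLegacyInfoToDataType(Types.STRING);

        // The resulting DataType wraps a LogicalType that the planner understands.
        System.out.println(dataType.getLogicalType().asSummaryString());
    }
}
```

The overload taking a `TypeInformation<?>[]` array (used in Example 7 below) converts each element the same way and returns a `DataType[]`.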
Example 1
Source File: TypeMappingUtilsTest.java From flink with Apache License 2.0
@Test
public void testCheckPhysicalLogicalTypeCompatible() {
    TableSchema tableSchema = TableSchema.builder()
            .field("a", DataTypes.VARCHAR(2))
            .field("b", DataTypes.DECIMAL(20, 2))
            .build();
    TableSink tableSink = new TestTableSink(tableSchema);
    LegacyTypeInformationType legacyDataType =
            (LegacyTypeInformationType) tableSink.getConsumedDataType().getLogicalType();
    TypeInformation legacyTypeInfo = ((TupleTypeInfo) legacyDataType.getTypeInformation()).getTypeAt(1);
    DataType physicalType = TypeConversions.fromLegacyInfoToDataType(legacyTypeInfo);
    TableSchema physicSchema = DataTypeUtils.expandCompositeTypeToSchema(physicalType);
    DataType[] logicalDataTypes = tableSchema.getFieldDataTypes();
    DataType[] physicalDataTypes = physicSchema.getFieldDataTypes();
    for (int i = 0; i < logicalDataTypes.length; i++) {
        TypeMappingUtils.checkPhysicalLogicalTypeCompatible(
                physicalDataTypes[i].getLogicalType(),
                logicalDataTypes[i].getLogicalType(),
                "physicalField",
                "logicalField",
                false);
    }
}
Example 2
Source File: StreamTableEnvironmentImpl.java From flink with Apache License 2.0
@Override
public <T> DataStream<T> toAppendStream(Table table, TypeInformation<T> typeInfo) {
    OutputConversionModifyOperation modifyOperation = new OutputConversionModifyOperation(
        table.getQueryOperation(),
        TypeConversions.fromLegacyInfoToDataType(typeInfo),
        OutputConversionModifyOperation.UpdateMode.APPEND);
    return toDataStream(table, modifyOperation);
}
Example 4
Source File: FunctionLookupMock.java From flink with Apache License 2.0
@Override
public PlannerTypeInferenceUtil getPlannerTypeInferenceUtil() {
    return (unresolvedCall, resolvedArgs) -> {
        FunctionDefinition functionDefinition = unresolvedCall.getFunctionDefinition();

        List<DataType> argumentTypes = resolvedArgs.stream()
            .map(ResolvedExpression::getOutputDataType)
            .collect(Collectors.toList());

        if (functionDefinition.equals(BuiltInFunctionDefinitions.EQUALS)) {
            return new TypeInferenceUtil.Result(
                argumentTypes,
                null,
                DataTypes.BOOLEAN());
        } else if (functionDefinition.equals(BuiltInFunctionDefinitions.IS_NULL)) {
            return new TypeInferenceUtil.Result(
                argumentTypes,
                null,
                DataTypes.BOOLEAN());
        } else if (functionDefinition instanceof ScalarFunctionDefinition) {
            return new TypeInferenceUtil.Result(
                argumentTypes,
                null,
                // We do not support a full legacy type inference here.
                // We support only a static result type.
                TypeConversions.fromLegacyInfoToDataType(
                    ((ScalarFunctionDefinition) functionDefinition)
                        .getScalarFunction()
                        .getResultType(null)));
        }
        throw new IllegalArgumentException(
            "Unsupported builtin function in the test: " + unresolvedCall);
    };
}
Example 5
Source File: LogicalTypeChecksTest.java From flink with Apache License 2.0
@Test
public void testIsCompositeTypeLegacySimpleType() {
    DataType dataType = TypeConversions.fromLegacyInfoToDataType(Types.STRING);
    boolean isCompositeType = LogicalTypeChecks.isCompositeType(dataType.getLogicalType());

    assertThat(isCompositeType, is(false));
}
Example 6
Source File: LogicalTypeChecksTest.java From flink with Apache License 2.0
@Test
public void testIsCompositeTypeLegacyCompositeType() {
    DataType dataType = TypeConversions.fromLegacyInfoToDataType(new RowTypeInfo(Types.STRING, Types.INT));
    boolean isCompositeType = LogicalTypeChecks.isCompositeType(dataType.getLogicalType());

    assertThat(isCompositeType, is(true));
}
Example 7
Source File: CsvTableSource.java From flink with Apache License 2.0
/**
 * A {@link InputFormatTableSource} and {@link LookupableTableSource} for simple CSV files with
 * a (logically) unlimited number of fields.
 *
 * @param path            The path to the CSV file.
 * @param fieldNames      The names of the table fields.
 * @param fieldTypes      The types of the table fields.
 * @param selectedFields  The fields which will be read and returned by the table source. If
 *                        None, all fields are returned.
 * @param fieldDelim      The field delimiter, "," by default.
 * @param lineDelim       The row delimiter, "\n" by default.
 * @param quoteCharacter  An optional quote character for String values, null by default.
 * @param ignoreFirstLine Flag to ignore the first line, false by default.
 * @param ignoreComments  An optional prefix to indicate comments, null by default.
 * @param lenient         Flag to skip records with parse errors instead of failing, false by
 *                        default.
 */
public CsvTableSource(
        String path,
        String[] fieldNames,
        TypeInformation<?>[] fieldTypes,
        int[] selectedFields,
        String fieldDelim,
        String lineDelim,
        Character quoteCharacter,
        boolean ignoreFirstLine,
        String ignoreComments,
        boolean lenient) {
    this(new CsvInputFormatConfig(
        path,
        fieldNames,
        TypeConversions.fromLegacyInfoToDataType(fieldTypes),
        selectedFields,
        fieldDelim,
        lineDelim,
        quoteCharacter,
        ignoreFirstLine,
        ignoreComments,
        lenient,
        false));
}
Example 8
Source File: HiveTableSinkTest.java From flink with Apache License 2.0
@Override
public DataType getProducedDataType() {
    return TypeConversions.fromLegacyInfoToDataType(rowTypeInfo);
}
Example 9
Source File: StreamTableEnvironmentImpl.java From flink with Apache License 2.0
private <T> DataType wrapWithChangeFlag(TypeInformation<T> outputType) {
    TupleTypeInfo tupleTypeInfo = new TupleTypeInfo<Tuple2<Boolean, T>>(Types.BOOLEAN(), outputType);
    return TypeConversions.fromLegacyInfoToDataType(tupleTypeInfo);
}
Example 10
Source File: TypeTransformationsTest.java From flink with Apache License 2.0
private static DataType createLegacyRaw() {
    return TypeConversions.fromLegacyInfoToDataType(Types.GENERIC(TypeTransformationsTest.class));
}
Example 11
Source File: TypeTransformationsTest.java From flink with Apache License 2.0
private static DataType createLegacyDecimal() {
    return TypeConversions.fromLegacyInfoToDataType(Types.BIG_DEC);
}
Example 12
Source File: TypeInfoLogicalTypeConverter.java From flink with Apache License 2.0
/**
 * This conversion is lossy (e.g. a {@link PojoTypeInfo} will be converted to a {@link RowType}).
 * Together with {@link TypeInfoLogicalTypeConverter#fromLogicalTypeToTypeInfo}, it does not
 * allow back-and-forth conversion.
 */
public static LogicalType fromTypeInfoToLogicalType(TypeInformation typeInfo) {
    DataType dataType = TypeConversions.fromLegacyInfoToDataType(typeInfo);
    return LogicalTypeDataTypeConverter.fromDataTypeToLogicalType(dataType);
}
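The Javadoc above warns that the conversion is lossy. A minimal sketch of what that means in practice, assuming the converter class shown above is on the classpath (`MyPojo` is a hypothetical POJO introduced here only for illustration):

```java
import org.apache.flink.api.common.typeinfo.TypeInformation;
import org.apache.flink.table.runtime.types.TypeInfoLogicalTypeConverter;
import org.apache.flink.table.types.logical.LogicalType;

public class LossyConversionSketch {
    // A hypothetical POJO; Flink's type extraction analyzes it as a PojoTypeInfo.
    public static class MyPojo {
        public int id;
        public String name;
    }

    public static void main(String[] args) {
        TypeInformation<MyPojo> pojoInfo = TypeInformation.of(MyPojo.class);

        // The PojoTypeInfo flattens into a RowType; the original POJO class is lost.
        LogicalType logicalType = TypeInfoLogicalTypeConverter.fromTypeInfoToLogicalType(pojoInfo);
        System.out.println(logicalType);

        // Converting back yields a row-based TypeInformation, not the PojoTypeInfo,
        // so round-tripping does not recover the original type.
        TypeInformation<?> back = TypeInfoLogicalTypeConverter.fromLogicalTypeToTypeInfo(logicalType);
        System.out.println(back);
    }
}
```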
Example 13
Source File: HiveTableSinkITCase.java From flink with Apache License 2.0
@Override
public DataType getProducedDataType() {
    return TypeConversions.fromLegacyInfoToDataType(rowTypeInfo);
}
Example 14
Source File: HBaseTableSchema.java From flink with Apache License 2.0
/**
 * Sets row key information in the table schema.
 *
 * @param rowKeyName the row key field name
 * @param clazz      the data type of the row key
 */
public void setRowKey(String rowKeyName, Class<?> clazz) {
    Preconditions.checkNotNull(clazz, "row key class type");
    DataType type = TypeConversions.fromLegacyInfoToDataType(TypeExtractor.getForClass(clazz));
    setRowKey(rowKeyName, type);
}
Example 17
Source File: HBaseTableSchema.java From flink with Apache License 2.0
/**
 * Adds a column defined by family, qualifier, and type to the table schema.
 *
 * @param family    the family name
 * @param qualifier the qualifier name
 * @param clazz     the data type of the qualifier
 */
public void addColumn(String family, String qualifier, Class<?> clazz) {
    Preconditions.checkNotNull(clazz, "class type");
    DataType type = TypeConversions.fromLegacyInfoToDataType(TypeExtractor.getForClass(clazz));
    addColumn(family, qualifier, type);
}