org.apache.spark.sql.sources.BaseRelation Java Examples
The following examples show how to use
org.apache.spark.sql.sources.BaseRelation.
Each example notes the project it comes from, its license, and the original source file.
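Before the project examples, here is a minimal, self-contained sketch of the shape a BaseRelation usually takes: a class that reports its SQLContext and schema, typically paired with a scan interface such as TableScan to actually produce rows. The SingleValueRelation name, its field names, and the single-column schema below are illustrative assumptions, not code from any of the projects listed.

import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.rdd.RDD;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.RowFactory;
import org.apache.spark.sql.SQLContext;
import org.apache.spark.sql.sources.BaseRelation;
import org.apache.spark.sql.sources.TableScan;
import org.apache.spark.sql.types.DataTypes;
import org.apache.spark.sql.types.StructField;
import org.apache.spark.sql.types.StructType;

import java.io.Serializable;
import java.util.Collections;
import java.util.List;

// Illustrative relation: BaseRelation supplies the SQLContext and schema,
// TableScan supplies the rows.
public class SingleValueRelation extends BaseRelation implements TableScan, Serializable {
  private SQLContext sqlContext;

  public void setSqlContext(SQLContext sqlContext) {
    this.sqlContext = sqlContext;
  }

  @Override
  public SQLContext sqlContext() {
    return sqlContext;
  }

  @Override
  public StructType schema() {
    // A one-column schema; a real relation would derive this from its source.
    StructField[] fields = new StructField[] {
        DataTypes.createStructField("value", DataTypes.IntegerType, false) };
    return DataTypes.createStructType(fields);
  }

  @Override
  public RDD<Row> buildScan() {
    // Build the rows as a JavaRDD, then unwrap to the RDD Spark expects.
    List<Row> rows = Collections.singletonList(RowFactory.create(42));
    return new JavaSparkContext(sqlContext.sparkContext()).parallelize(rows).rdd();
  }
}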
Example #1
Source File: SubStringCounterDataSource.java from net.jgp.labs.spark with Apache License 2.0
@Override
public BaseRelation createRelation(SQLContext arg0, Map<String, String> arg1) {
  log.debug("-> createRelation()");

  // Convert the Scala options map into a Java map for easy iteration.
  java.util.Map<String, String> javaMap =
      scala.collection.JavaConverters.mapAsJavaMapConverter(arg1).asJava();

  SubStringCounterRelation br = new SubStringCounterRelation();
  br.setSqlContext(arg0);

  // The K.PATH option sets the file to read; every option whose key starts
  // with K.COUNT adds a substring-count criterion.
  for (java.util.Map.Entry<String, String> entry : javaMap.entrySet()) {
    String key = entry.getKey();
    String value = entry.getValue();
    log.debug("[{}] --> [{}]", key, value);
    if (key.compareTo(K.PATH) == 0) {
      br.setFilename(value);
    } else if (key.startsWith(K.COUNT)) {
      br.addCriteria(value);
    }
  }
  return br;
}
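Example #1 implements RelationProvider.createRelation(): it converts the Scala options map to a Java map, then configures a SubStringCounterRelation with a filename (the K.PATH option) and one counting criterion per option whose key starts with K.COUNT. A hedged sketch of how such a data source might be invoked follows; the fully qualified format name and the literal option keys ("path", "count0") are assumptions, since only the K.PATH and K.COUNT constants appear in the source.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class SubStringCounterApp {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("SubStringCounter")
        .master("local[*]")
        .getOrCreate();

    // Spark resolves the format class to the provider above and calls
    // createRelation() with these options as its Scala map argument.
    Dataset<Row> df = spark.read()
        .format("net.jgp.labs.spark.x.ds.SubStringCounterDataSource") // class name assumed
        .option("path", "data/input.txt") // assumed value of K.PATH
        .option("count0", "the")          // assumed K.COUNT prefix
        .load();

    df.show();
  }
}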
Example #2
Source File: SparkSession.java from incubator-nemo with Apache License 2.0
@Override
public Dataset<Row> baseRelationToDataFrame(final BaseRelation baseRelation) {
  // Record the call, delegate to Spark, wrap the result in Nemo's Dataset,
  // then restore the user-triggered flag.
  final boolean userTriggered = initializeFunction(baseRelation);
  final Dataset<Row> result = Dataset.from(super.baseRelationToDataFrame(baseRelation));
  this.setIsUserTriggered(userTriggered);
  return result;
}
Example #3
Source File: SparkSession.java from nemo with Apache License 2.0
@Override
public Dataset<Row> baseRelationToDataFrame(final BaseRelation baseRelation) {
  // Record the call, delegate to Spark, wrap the result in Nemo's Dataset,
  // then restore the user-triggered flag.
  final boolean userTriggered = initializeFunction(baseRelation);
  final Dataset<Row> result = Dataset.from(super.baseRelationToDataFrame(baseRelation));
  this.setIsUserTriggered(userTriggered);
  return result;
}
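Examples #2 and #3 override SparkSession.baseRelationToDataFrame() in Apache Nemo's SparkSession wrapper: the call is tracked via initializeFunction(), the returned DataFrame is wrapped in Nemo's own Dataset type, and the user-triggered flag is restored afterward. For reference, a plain Spark caller reaches this method when converting a hand-built relation directly into a DataFrame; the sketch below reuses the hypothetical SingleValueRelation from the first sketch.

import org.apache.spark.sql.Dataset;
import org.apache.spark.sql.Row;
import org.apache.spark.sql.SparkSession;

public class BaseRelationToDataFrameApp {
  public static void main(String[] args) {
    SparkSession spark = SparkSession.builder()
        .appName("BaseRelationToDataFrame")
        .master("local[*]")
        .getOrCreate();

    // Hand-built relation from the earlier illustrative sketch.
    SingleValueRelation relation = new SingleValueRelation();
    relation.setSqlContext(spark.sqlContext());

    // baseRelationToDataFrame() is the call the Nemo overrides above intercept.
    Dataset<Row> df = spark.baseRelationToDataFrame(relation);
    df.show();

    spark.stop();
  }
}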