org.apache.spark.serializer.Serializer Java Examples
The following examples show how to use
org.apache.spark.serializer.Serializer.
You can go to the original project or source file by following the links above each example.
Example #1
Source File: SparkFrontendUtils.java From incubator-nemo with Apache License 2.0
/**
 * Collect data by running the DAG.
 *
 * @param dag             the DAG to execute.
 * @param loopVertexStack loop vertex stack.
 * @param lastVertex      last vertex added to the dag.
 * @param serializer      serializer for the edges.
 * @param <T>             type of the return data.
 * @return the data collected.
 */
public static <T> List<T> collect(final DAG<IRVertex, IREdge> dag,
                                  final Stack<LoopVertex> loopVertexStack,
                                  final IRVertex lastVertex,
                                  final Serializer serializer) {
  final DAGBuilder<IRVertex, IREdge> builder = new DAGBuilder<>(dag);

  final IRVertex collectVertex = new OperatorVertex(new CollectTransform<>());
  builder.addVertex(collectVertex, loopVertexStack);

  final IREdge newEdge = new IREdge(getEdgeCommunicationPattern(lastVertex, collectVertex),
    lastVertex, collectVertex);
  newEdge.setProperty(EncoderProperty.of(new SparkEncoderFactory(serializer)));
  newEdge.setProperty(DecoderProperty.of(new SparkDecoderFactory(serializer)));
  newEdge.setProperty(SPARK_KEY_EXTRACTOR_PROP);
  builder.connectVertices(newEdge);

  // launch DAG
  JobLauncher.launchDAG(new IRDAG(builder.build()), SparkBroadcastVariables.getAll(), "");

  return JobLauncher.getCollectedData();
}
Example #2
Source File: SparkFrontendUtils.java From incubator-nemo with Apache License 2.0
/**
 * Derive Spark serializer from a spark context.
 *
 * @param sparkContext spark context to derive the serializer from.
 * @return the serializer.
 */
public static Serializer deriveSerializerFrom(final org.apache.spark.SparkContext sparkContext) {
  if (sparkContext.conf().get("spark.serializer", "")
    .equals("org.apache.spark.serializer.KryoSerializer")) {
    return new KryoSerializer(sparkContext.conf());
  } else {
    return new JavaSerializer(sparkContext.conf());
  }
}
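The selection logic above follows a common pattern: read a configuration key and fall back to a default when it does not match. As a minimal, self-contained sketch of the same dispatch, the following uses a plain `Map` as a stand-in for `SparkConf` and returns a label instead of a real `Serializer` instance (both are simplifications for illustration, not the actual Nemo code):

```java
import java.util.Map;

public class SerializerChoice {
  /**
   * Stand-in for deriveSerializerFrom: the real method returns
   * KryoSerializer or JavaSerializer; here we return a label.
   */
  static String chooseSerializer(final Map<String, String> conf) {
    // An exact match on the fully qualified KryoSerializer class name
    // selects Kryo; anything else, including an unset key, falls back
    // to Java serialization.
    final String configured = conf.getOrDefault("spark.serializer", "");
    if (configured.equals("org.apache.spark.serializer.KryoSerializer")) {
      return "kryo";
    } else {
      return "java";
    }
  }

  public static void main(final String[] args) {
    System.out.println(chooseSerializer(
      Map.of("spark.serializer", "org.apache.spark.serializer.KryoSerializer"))); // kryo
    System.out.println(chooseSerializer(Map.of())); // java
  }
}
```

Note that the fallback branch means any unrecognized value of `spark.serializer`, not just an empty one, silently yields the Java serializer.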
Example #3
Source File: SparkJavaPairRDD.java From incubator-nemo with Apache License 2.0
@Override
public <C> SparkJavaPairRDD<K, C> combineByKey(final Function<V, C> createCombiner,
                                               final Function2<C, V, C> mergeValue,
                                               final Function2<C, C, C> mergeCombiners,
                                               final Partitioner partitioner,
                                               final boolean mapSideCombine,
                                               final Serializer serializer) {
  throw new UnsupportedOperationException(NOT_YET_SUPPORTED);
}
Example #4
Source File: SparkFrontendUtils.java From nemo with Apache License 2.0
/**
 * Derive Spark serializer from a spark context.
 *
 * @param sparkContext spark context to derive the serializer from.
 * @return the serializer.
 */
public static Serializer deriveSerializerFrom(final SparkContext sparkContext) {
  if (sparkContext.conf().get("spark.serializer", "")
    .equals("org.apache.spark.serializer.KryoSerializer")) {
    return new KryoSerializer(sparkContext.conf());
  } else {
    return new JavaSerializer(sparkContext.conf());
  }
}
Example #5
Source File: SparkDecoderFactory.java From incubator-nemo with Apache License 2.0
/**
 * Default constructor.
 *
 * @param serializer Spark serializer.
 */
public SparkDecoderFactory(final Serializer serializer) {
  this.serializer = serializer;
}
Example #6
Source File: SparkEncoderFactory.java From incubator-nemo with Apache License 2.0
/**
 * Default constructor.
 *
 * @param serializer Spark serializer.
 */
public SparkEncoderFactory(final Serializer serializer) {
  this.serializer = serializer;
}
Example #7
Source File: SparkCoder.java From nemo with Apache License 2.0
/**
 * Default constructor.
 *
 * @param serializer Kryo serializer.
 */
public SparkCoder(final Serializer serializer) {
  this.serializer = serializer;
}