Java Code Examples for org.apache.spark.api.java.function.Function2#call()
The following examples show how to use org.apache.spark.api.java.function.Function2#call().
Each example links to its original project and source file.
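Before the examples, here is a minimal sketch of the API itself: Function2<T1, T2, R> is Spark's serializable two-argument function interface, and call(T1, T2) is its single abstract method (declared to throw Exception), so it can be invoked directly or handed to aggregating APIs such as JavaRDD#reduce. The class name, app name, and values below are illustrative assumptions, not taken from any of the projects:

import org.apache.spark.SparkConf;
import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function2;

import java.util.Arrays;

public class Function2CallSketch {
  public static void main(String[] args) throws Exception {
    // Function2's single abstract method is R call(T1 v1, T2 v2) throws Exception,
    // so a lambda can implement it in Java 8+.
    Function2<Integer, Integer, Integer> sum = (a, b) -> a + b;

    // call() can be invoked directly, outside of any Spark job.
    System.out.println(sum.call(3, 4)); // prints 7

    // It is also the type expected by aggregating APIs such as JavaRDD#reduce.
    SparkConf conf = new SparkConf().setAppName("function2-sketch").setMaster("local[*]");
    try (JavaSparkContext jsc = new JavaSparkContext(conf)) {
      JavaRDD<Integer> numbers = jsc.parallelize(Arrays.asList(1, 2, 3, 4));
      System.out.println(numbers.reduce(sum)); // prints 10
    }
  }
}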
Example 1
Source File: ReduceTransform.java (from incubator-nemo, Apache License 2.0)
/**
 * Reduce the iterator elements into a single object.
 *
 * @param elements the iterator of elements.
 * @param func     function to apply for reduction.
 * @param <T>      type of the elements.
 * @return the reduced element.
 */
@Nullable
public static <T> T reduceIterator(final Iterator<T> elements, final Function2<T, T, T> func) {
  if (!elements.hasNext()) { // nothing to be done
    return null;
  }
  T res = elements.next();
  while (elements.hasNext()) {
    try {
      res = func.call(res, elements.next());
    } catch (Exception e) {
      throw new RuntimeException(e);
    }
  }
  return res;
}
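Because Function2 is a single-method interface, the reducer can be supplied as a lambda. A minimal, hypothetical usage sketch of the method above (the iterator contents and the max reducer are made up for illustration):

import org.apache.spark.api.java.function.Function2;

import java.util.Arrays;
import java.util.Iterator;

// Hypothetical call site for ReduceTransform#reduceIterator.
Iterator<Integer> elements = Arrays.asList(5, 1, 9).iterator();
Function2<Integer, Integer, Integer> max = (a, b) -> Math.max(a, b);
Integer largest = ReduceTransform.reduceIterator(elements, max); // 9; null for an empty iterator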
Example 2
Source File: CoverageModelEMWorkspace.java (from gatk-protected, BSD 3-Clause "New" or "Revised" License)
/**
 * A generic function for broadcasting an object to all compute blocks.
 *
 * If Spark is enabled: a {@link Broadcast} will be created from {@code obj} and will be "received"
 * by the compute nodes by calling {@code pusher}. A reference to the updated RDD will replace the
 * old RDD.
 *
 * If Spark is disabled: the {@code pusher} function will be called together with {@code obj} and
 * {@link #localComputeBlock}.
 *
 * @param obj the object to broadcast
 * @param pusher a map from (V, {@link CoverageModelEMComputeBlock}) -> {@link CoverageModelEMComputeBlock}
 *               that updates the compute block with the broadcasted value
 * @param <V> the type of the broadcasted object
 */
@UpdatesRDD
private <V> void pushToWorkers(@Nonnull final V obj,
                               @Nonnull final Function2<V, CoverageModelEMComputeBlock, CoverageModelEMComputeBlock> pusher) {
  if (sparkContextIsAvailable) {
    final Broadcast<V> broadcastedObj = ctx.broadcast(obj);
    final Function<CoverageModelEMComputeBlock, CoverageModelEMComputeBlock> mapper =
        cb -> pusher.call(broadcastedObj.value(), cb);
    mapWorkers(mapper);
  } else {
    try {
      localComputeBlock = pusher.call(obj, localComputeBlock);
    } catch (final Exception ex) {
      throw new RuntimeException("Cannot apply the map function to the local compute block", ex);
    }
  }
}
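Note the asymmetry in this example: on the Spark path, pusher.call runs inside a Function lambda, and Function#call itself declares throws Exception, so no try/catch is needed there; on the local path the checked exception must be caught and wrapped. The following standalone sketch of the same broadcast-then-call pattern uses hypothetical names (pushToAll, workers, config) purely for illustration:

import org.apache.spark.api.java.JavaRDD;
import org.apache.spark.api.java.JavaSparkContext;
import org.apache.spark.api.java.function.Function2;
import org.apache.spark.broadcast.Broadcast;

// Hypothetical standalone illustration: ship a value to the executors once via a
// Broadcast, then use a Function2 to combine it with each element of an RDD.
static JavaRDD<String> pushToAll(JavaSparkContext jsc, JavaRDD<String> workers, String config,
                                 Function2<String, String, String> pusher) {
  final Broadcast<String> broadcastConfig = jsc.broadcast(config);
  // Function#call declares throws Exception, so the lambda may forward
  // the checked exception from pusher.call without a try/catch.
  return workers.map(w -> pusher.call(broadcastConfig.value(), w));
}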