Java Code Examples for org.nd4j.linalg.ops.transforms.Transforms#softmax()
The following examples show how to use org.nd4j.linalg.ops.transforms.Transforms#softmax().
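Transforms.softmax turns a vector of raw scores into a probability distribution: each entry is exponentiated and then normalized by the sum, conventionally after subtracting the maximum score for numerical stability. As a minimal plain-Java sketch of that math (illustration only, not nd4j code; the class and method names here are hypothetical):

```java
public class SoftmaxSketch {
    // Numerically stable softmax: subtract the max before exponentiating,
    // then normalize so the outputs sum to 1.
    static double[] softmax(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);
        double sum = 0.0;
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = Math.exp(x[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }

    public static void main(String[] args) {
        double[] p = softmax(new double[]{1.0, 2.0, 3.0});
        // Outputs sum to 1, with larger inputs receiving larger probabilities.
        System.out.printf("%.4f %.4f %.4f%n", p[0], p[1], p[2]);
    }
}
```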
Example 1
Source File: LogSoftMaxDerivative.java From nd4j with Apache License 2.0

@Override
public void exec() {
    //TODO add dimension arg. For now: hardcoded along dimension 1
    INDArray softmax = Transforms.softmax(x, true);
    INDArray mul = softmax.mul(y);
    INDArray summed = mul.sum(1);
    Nd4j.getExecutioner().exec(new BroadcastSubOp(y, summed, z, 0));
}
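The derivative op above is built from Transforms.softmax because the Jacobian of log-softmax is expressible through the softmax itself: d logsoftmax_i / d x_k = (i == k ? 1 : 0) - softmax_k. A small plain-Java finite-difference check of that identity (illustration only, not nd4j code; the class name is hypothetical and a 1-D input is assumed, where the op above hardcodes dimension 1):

```java
public class LogSoftmaxGradCheck {
    // Numerically stable softmax: subtract the max, exponentiate, normalize.
    static double[] softmax(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);
        double sum = 0.0;
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = Math.exp(x[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }

    // logSoftmax(x)_i = x_i - logSumExp(x), computed stably via the max trick.
    static double[] logSoftmax(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);
        double sum = 0.0;
        for (double v : x) sum += Math.exp(v - max);
        double lse = max + Math.log(sum);
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) out[i] = x[i] - lse;
        return out;
    }

    public static void main(String[] args) {
        double[] x = {0.5, -1.0, 2.0};
        double[] s = softmax(x);
        double eps = 1e-6;
        // Finite-difference check of d logSoftmax_0 / d x_k = (k==0 ? 1 : 0) - s_k
        for (int k = 0; k < x.length; k++) {
            double[] xp = x.clone(); xp[k] += eps;
            double[] xm = x.clone(); xm[k] -= eps;
            double numeric = (logSoftmax(xp)[0] - logSoftmax(xm)[0]) / (2 * eps);
            double analytic = (k == 0 ? 1.0 : 0.0) - s[k];
            System.out.printf("k=%d numeric=%.6f analytic=%.6f%n", k, numeric, analytic);
        }
    }
}
```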
Example 2
Source File: YoloUtils.java From deeplearning4j with Apache License 2.0

public static INDArray activate(@NonNull INDArray boundingBoxPriors, @NonNull INDArray input, boolean nchw, LayerWorkspaceMgr layerWorkspaceMgr){
    if(!nchw)
        input = input.permute(0,3,1,2); //NHWC to NCHW

    long mb = input.size(0);
    long h = input.size(2);
    long w = input.size(3);
    long b = boundingBoxPriors.size(0);
    long c = input.size(1)/b-5; //input.size(1) == b * (5 + C) -> C = (input.size(1)/b) - 5

    INDArray output = layerWorkspaceMgr.create(ArrayType.ACTIVATIONS, input.dataType(), input.shape(), 'c');
    INDArray output5 = output.reshape('c', mb, b, 5+c, h, w);
    INDArray output4 = output; //output.get(all(), interval(0,5*b), all(), all());
    INDArray input4 = input.dup('c'); //input.get(all(), interval(0,5*b), all(), all()).dup('c');
    INDArray input5 = input4.reshape('c', mb, b, 5+c, h, w);

    //X/Y center in grid: sigmoid
    INDArray predictedXYCenterGrid = input5.get(all(), all(), interval(0,2), all(), all());
    Transforms.sigmoid(predictedXYCenterGrid, false);

    //Width/height: prior * exp(input)
    INDArray predictedWHPreExp = input5.get(all(), all(), interval(2,4), all(), all());
    INDArray predictedWH = Transforms.exp(predictedWHPreExp, false);
    Broadcast.mul(predictedWH, boundingBoxPriors.castTo(input.dataType()), predictedWH, 1, 2); //Box priors: [b, 2]; predictedWH: [mb, b, 2, h, w]

    //Confidence: sigmoid
    INDArray predictedConf = input5.get(all(), all(), point(4), all(), all()); //Shape: [mb, b, h, w]
    Transforms.sigmoid(predictedConf, false);

    output4.assign(input4);

    //Softmax over the class predictions //TODO OPTIMIZE?
    INDArray inputClassesPreSoftmax = input5.get(all(), all(), interval(5, 5+c), all(), all()); //Shape: [mb, b, c, h, w]
    INDArray classPredictionsPreSoftmax2d = inputClassesPreSoftmax.permute(0,1,3,4,2) //[mb, b, c, h, w] to [mb, b, h, w, c]
            .dup('c').reshape('c', new long[]{mb*b*h*w, c});
    Transforms.softmax(classPredictionsPreSoftmax2d, false);
    INDArray postSoftmax5d = classPredictionsPreSoftmax2d.reshape('c', mb, b, h, w, c).permute(0, 1, 4, 2, 3);

    INDArray outputClasses = output5.get(all(), all(), interval(5, 5+c), all(), all()); //Shape: [mb, b, c, h, w]
    outputClasses.assign(postSoftmax5d);

    if(!nchw)
        output = output.permute(0,2,3,1); //NCHW to NHWC
    return output;
}
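The YOLO activation above applies, per grid cell and box prior: sigmoid to the x/y centre offsets, prior × exp(·) to the width/height, sigmoid to the objectness confidence, and softmax across the class scores. A scalar-level plain-Java sketch of those per-cell transforms (illustration only, not nd4j code; the class name, raw values, and priors below are made up):

```java
public class YoloCellActivation {
    static double sigmoid(double v) { return 1.0 / (1.0 + Math.exp(-v)); }

    // Numerically stable softmax over the class scores.
    static double[] softmax(double[] x) {
        double max = Double.NEGATIVE_INFINITY;
        for (double v : x) max = Math.max(max, v);
        double sum = 0.0;
        double[] out = new double[x.length];
        for (int i = 0; i < x.length; i++) {
            out[i] = Math.exp(x[i] - max);
            sum += out[i];
        }
        for (int i = 0; i < out.length; i++) out[i] /= sum;
        return out;
    }

    public static void main(String[] args) {
        // One grid cell, one box prior, 3 classes: [tx, ty, tw, th, conf, c0, c1, c2]
        double[] raw = {0.2, -0.1, 0.4, -0.3, 1.5, 2.0, 0.5, -1.0}; // made-up raw network outputs
        double priorW = 1.3, priorH = 0.8;                          // made-up bounding-box prior

        double cx = sigmoid(raw[0]);           // x/y centre: sigmoid, into (0,1) within the cell
        double cy = sigmoid(raw[1]);
        double w  = priorW * Math.exp(raw[2]); // width/height: prior * exp(raw)
        double h  = priorH * Math.exp(raw[3]);
        double conf = sigmoid(raw[4]);         // objectness confidence: sigmoid
        double[] classProbs = softmax(new double[]{raw[5], raw[6], raw[7]}); // classes: softmax

        System.out.printf("centre=(%.3f, %.3f) wh=(%.3f, %.3f) conf=%.3f%n", cx, cy, w, h, conf);
    }
}
```

The nd4j code does the same thing in bulk: it views the [mb, b*(5+c), h, w] activations as [mb, b, 5+c, h, w], mutates the sigmoid/exp slices in place, and reshapes the class slice to 2-D so softmax can run row-wise.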
Example 3
Source File: BackPropMLPTest.java From deeplearning4j with Apache License 2.0

public static INDArray doSoftmax(INDArray input) {
    return Transforms.softmax(input, true);
}