org.deeplearning4j.nn.conf.layers.FeedForwardLayer Java Examples

The following examples show how to use org.deeplearning4j.nn.conf.layers.FeedForwardLayer. Each example is taken from an open-source project; the source file and license are noted above each listing.
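FeedForwardLayer is the abstract base class for DL4J layers that have a fixed input size (nIn) and output size (nOut), such as DenseLayer and OutputLayer. As a minimal orientation sketch (the sizes 784/100/10 are purely illustrative, not taken from the examples below), a configuration built from such layers can be queried through the FeedForwardLayer base class:

import org.deeplearning4j.nn.conf.MultiLayerConfiguration;
import org.deeplearning4j.nn.conf.NeuralNetConfiguration;
import org.deeplearning4j.nn.conf.layers.DenseLayer;
import org.deeplearning4j.nn.conf.layers.FeedForwardLayer;
import org.deeplearning4j.nn.conf.layers.OutputLayer;
import org.nd4j.linalg.activations.Activation;
import org.nd4j.linalg.lossfunctions.LossFunctions;

public class FeedForwardLayerSketch {
    public static void main(String[] args) {
        MultiLayerConfiguration conf = new NeuralNetConfiguration.Builder()
                .list()
                .layer(0, new DenseLayer.Builder().nIn(784).nOut(100)
                        .activation(Activation.RELU).build())
                .layer(1, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT)
                        .nIn(100).nOut(10).activation(Activation.SOFTMAX).build())
                .build();

        // DenseLayer and OutputLayer both extend FeedForwardLayer, so nIn/nOut
        // can be read back through the base class:
        FeedForwardLayer ffl = (FeedForwardLayer) conf.getConf(0).getLayer();
        System.out.println(ffl.getNIn() + " -> " + ffl.getNOut()); // prints: 784 -> 100
    }
}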
Example #1
Source File: MultiLayerNetwork.java    From deeplearning4j with Apache License 2.0
/**
 * Return the input size (number of inputs) for the specified layer.<br>
 * Note that the meaning of the "input size" can depend on the type of layer. For example:<br>
 * - DenseLayer, OutputLayer, etc: the feature vector size (nIn configuration option)<br>
 * - Recurrent layers: the feature vector size <i>per time step</i> (nIn configuration option)<br>
 * - ConvolutionLayer: the number of input channels (nIn configuration option)<br>
 * - Subsampling layers, global pooling layers, etc: a size of 0 is always returned<br>
 *
 * @param layer Index of the layer to get the size of. Must be in range 0 to nLayers-1 inclusive
 * @return Size of the layer
 */
public int layerInputSize(int layer) {
    if (layer < 0 || layer >= layers.length) {
        throw new IllegalArgumentException("Invalid layer index: " + layer + ". Layer index must be between 0 and "
                + (layers.length - 1) + " inclusive");
    }
    org.deeplearning4j.nn.conf.layers.Layer conf = layers[layer].conf().getLayer();
    if (conf == null || !(conf instanceof FeedForwardLayer)) {
        return 0;
    }
    FeedForwardLayer ffl = (FeedForwardLayer) conf;

    if (ffl.getNIn() > Integer.MAX_VALUE)
        throw new ND4JArraySizeException();
    return (int) ffl.getNIn();
}
 
Example #2
Source File: MultiLayerNetwork.java    From deeplearning4j with Apache License 2.0
/**
 * Return the layer size (number of units) for the specified layer.<br>
 * Note that the meaning of the "layer size" can depend on the type of layer. For example:<br>
 * - DenseLayer, OutputLayer, recurrent layers: number of units (nOut configuration option)<br>
 * - ConvolutionLayer: the number of output channels (nOut configuration option)<br>
 * - Subsampling layers, global pooling layers, etc: a size of 0 is always returned<br>
 *
 * @param layer Index of the layer to get the size of. Must be in range 0 to nLayers-1 inclusive
 * @return Size of the layer
 */
public int layerSize(int layer) {
    if (layer < 0 || layer >= layers.length) {
        throw new IllegalArgumentException("Invalid layer index: " + layer + ". Layer index must be between 0 and "
                + (layers.length - 1) + " inclusive");
    }
    org.deeplearning4j.nn.conf.layers.Layer conf = layers[layer].conf().getLayer();
    if (conf == null || !(conf instanceof FeedForwardLayer)) {
        return 0;
    }
    FeedForwardLayer ffl = (FeedForwardLayer) conf;

    if (ffl.getNOut() > Integer.MAX_VALUE)
        throw new ND4JArraySizeException();
    return (int) ffl.getNOut();
}
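A short usage sketch for these two accessors, continuing the illustrative two-layer configuration from the introduction (org.deeplearning4j.nn.multilayer.MultiLayerNetwork is assumed imported):

MultiLayerNetwork net = new MultiLayerNetwork(conf); // conf from the intro sketch
net.init();
int inputSize = net.layerInputSize(0); // 784: nIn of the DenseLayer
int unitCount = net.layerSize(0);      // 100: nOut of the DenseLayer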
 
Example #3
Source File: ElementWiseParamInitializer.java    From deeplearning4j with Apache License 2.0
/**
 * Return a map of gradients (in their standard non-flattened representation), taken from the flattened (row vector) gradientView array.
 * The idea is that this operates in exactly the same way as the paramsView does in init(conf, paramsView, initializeParams);
 * thus, the positions in the view (and the array orders) must match those of the parameters.
 *
 * @param conf         Configuration
 * @param gradientView The flattened gradients array, as a view of the larger array
 * @return A map containing an array by parameter type, that is a view of the full network gradients array
 */
@Override
public Map<String, INDArray> getGradientsFromFlattened(NeuralNetConfiguration conf, INDArray gradientView) {
    org.deeplearning4j.nn.conf.layers.FeedForwardLayer layerConf =
            (org.deeplearning4j.nn.conf.layers.FeedForwardLayer) conf.getLayer();
    val nIn = layerConf.getNIn();
    val nOut = layerConf.getNOut();
    val nWeightParams = nIn; // element-wise layer: one weight per input

    INDArray weightGradientView = gradientView.get(NDArrayIndex.interval(0,0,true), NDArrayIndex.interval(0, nWeightParams));
    INDArray biasView = gradientView.get(NDArrayIndex.interval(0,0,true),
            NDArrayIndex.interval(nWeightParams, nWeightParams + nOut)); //Already a row vector

    Map<String, INDArray> out = new LinkedHashMap<>();
    out.put(WEIGHT_KEY, weightGradientView);
    out.put(BIAS_KEY, biasView);

    return out;
}
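The initializer above slices a flattened row vector whose layout is one weight per input (nIn weights, since the layer is element-wise and nIn == nOut) followed by nOut biases. A small illustration of that slicing, under the assumption nIn == nOut == 4:

import org.nd4j.linalg.api.ndarray.INDArray;
import org.nd4j.linalg.factory.Nd4j;
import org.nd4j.linalg.indexing.NDArrayIndex;

// Layout assumed above for nIn == nOut == 4: [ w0 w1 w2 w3 | b0 b1 b2 b3 ]
INDArray flat = Nd4j.linspace(1, 8, 8).reshape(1, 8);
INDArray weights = flat.get(NDArrayIndex.interval(0, 0, true), NDArrayIndex.interval(0, 4));
INDArray bias = flat.get(NDArrayIndex.interval(0, 0, true), NDArrayIndex.interval(4, 8));
System.out.println(weights); // the first four values: 1, 2, 3, 4
System.out.println(bias);    // the last four values: 5, 6, 7, 8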
 
Example #4
Source File: KerasLoss.java    From deeplearning4j with Apache License 2.0
/**
 * Get DL4J LossLayer.
 *
 * @return LossLayer
 */
public FeedForwardLayer getLossLayer(InputType type) throws UnsupportedKerasConfigurationException {
    if (type instanceof InputType.InputTypeFeedForward) {
        this.layer = new LossLayer.Builder(loss).name(this.layerName).activation(Activation.IDENTITY).build();
    }
    else if (type instanceof InputType.InputTypeRecurrent) {
        this.layer = new RnnLossLayer.Builder(loss).name(this.layerName).activation(Activation.IDENTITY).build();
    }
    else if (type instanceof InputType.InputTypeConvolutional) {
        this.layer = new CnnLossLayer.Builder(loss).name(this.layerName).activation(Activation.IDENTITY).build();
    } else {
        throw new UnsupportedKerasConfigurationException("Unsupported output layer type, got: " + type.toString());
    }
    return (FeedForwardLayer) this.layer;
}
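A hedged usage sketch: which branch runs depends on the InputType of the model's output. For a feed-forward output of size 10 (the kerasLoss instance and the size are hypothetical), the plain LossLayer branch is taken:

FeedForwardLayer lossLayer = kerasLoss.getLossLayer(InputType.feedForward(10));
// InputType.recurrent(...) would yield an RnnLossLayer, InputType.convolutional(...) a CnnLossLayer.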
 
Example #5
Source File: KerasLayer.java    From deeplearning4j with Apache License 2.0
/**
 * Some DL4J layers need explicit specification of the number of inputs, which Keras infers
 * automatically. This method searches through previous layers until a FeedForwardLayer is
 * found; that layer's nOut value corresponds to the nIn value of this layer.
 *
 * @param previousLayers Map of layer names to the Keras layers preceding this one
 * @return the inferred nIn value
 * @throws UnsupportedKerasConfigurationException if the number of inputs cannot be determined
 */
protected long getNInFromConfig(Map<String, ? extends KerasLayer> previousLayers) throws UnsupportedKerasConfigurationException {
    int size = previousLayers.size();
    int count = 0;
    long nIn;
    String inboundLayerName = inboundLayerNames.get(0);
    while (count <= size) {
        if (previousLayers.containsKey(inboundLayerName)) {
            KerasLayer inbound = previousLayers.get(inboundLayerName);
            try {
                FeedForwardLayer ffLayer = (FeedForwardLayer) inbound.getLayer();
                nIn = ffLayer.getNOut();
                if (nIn > 0)
                    return nIn;
                count++;
                inboundLayerName = inbound.getInboundLayerNames().get(0);
            } catch (Exception e) {
                // Not a FeedForwardLayer: keep walking up the inbound chain
                count++;
                inboundLayerName = inbound.getInboundLayerNames().get(0);
            }
        } else {
            break; // inbound layer name not found; stop rather than loop forever
        }
    }
    throw new UnsupportedKerasConfigurationException("Could not determine number of input channels for " +
            "depthwise convolution.");
}
 
Example #6
Source File: CDAEParameter.java    From jstarcraft-rns with Apache License 2.0
private INDArray createUserWeightMatrix(NeuralNetConfiguration conf, INDArray weightParamView, boolean initializeParameters) {
    FeedForwardLayer layerConf = (FeedForwardLayer) conf.getLayer();
    if (initializeParameters) {
        Distribution dist = Distributions.createDistribution(layerConf.getDist());
        return createWeightMatrix(numberOfUsers, layerConf.getNOut(), layerConf.getWeightInit(), dist, weightParamView, true);
    } else {
        return createWeightMatrix(numberOfUsers, layerConf.getNOut(), null, null, weightParamView, false);
    }
}
 
Example #7
Source File: TestPreProcessors.java    From deeplearning4j with Apache License 2.0
@Test
public void testCnnToDense() {
    MultiLayerConfiguration conf =
            new NeuralNetConfiguration.Builder()
                    .list().layer(0,
                    new org.deeplearning4j.nn.conf.layers.ConvolutionLayer.Builder(
                            4, 4) // 28*28*1 => 15*15*10
                            .nIn(1).nOut(10).padding(2, 2)
                            .stride(2, 2)
                            .weightInit(WeightInit.RELU)
                            .activation(Activation.RELU)
                            .build())
                    .layer(1, new org.deeplearning4j.nn.conf.layers.DenseLayer.Builder()
                            .activation(Activation.RELU).nOut(200).build())
                    .layer(2, new OutputLayer.Builder(LossFunctions.LossFunction.MCXENT).nIn(200)
                            .nOut(5).weightInit(WeightInit.RELU)
                            .activation(Activation.SOFTMAX).build())
                    .setInputType(InputType.convolutionalFlat(28, 28, 1))
                    .build();

    assertNotNull(conf.getInputPreProcess(0));
    assertNotNull(conf.getInputPreProcess(1));

    assertTrue(conf.getInputPreProcess(0) instanceof FeedForwardToCnnPreProcessor);
    assertTrue(conf.getInputPreProcess(1) instanceof CnnToFeedForwardPreProcessor);

    FeedForwardToCnnPreProcessor ffcnn = (FeedForwardToCnnPreProcessor) conf.getInputPreProcess(0);
    CnnToFeedForwardPreProcessor cnnff = (CnnToFeedForwardPreProcessor) conf.getInputPreProcess(1);

    assertEquals(28, ffcnn.getInputHeight());
    assertEquals(28, ffcnn.getInputWidth());
    assertEquals(1, ffcnn.getNumChannels());

    assertEquals(15, cnnff.getInputHeight());
    assertEquals(15, cnnff.getInputWidth());
    assertEquals(10, cnnff.getNumChannels());

    assertEquals(15 * 15 * 10, ((FeedForwardLayer) conf.getConf(1).getLayer()).getNIn());
}
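The 28*28*1 => 15*15*10 comment in the test follows from the standard convolution output-size formula: outSize = (inSize + 2*padding - kernelSize) / stride + 1 = (28 + 2*2 - 4) / 2 + 1 = 15 along each spatial dimension, while nOut(10) sets the number of output channels. That is why the CnnToFeedForwardPreProcessor reports 15*15*10, and why the convolutionalFlat input type sets the DenseLayer's nIn to 15 * 15 * 10 = 2250 in the final assertion.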
 
Example #8
Source File: Bidirectional.java    From deeplearning4j with Apache License 2.0
public long getNIn() {
    if (this.fwd instanceof LastTimeStep) {
        return ((FeedForwardLayer) ((LastTimeStep) this.fwd).getUnderlying()).getNIn();
    } else {
        return ((FeedForwardLayer) this.fwd).getNIn();
    }
}
 
Example #9
Source File: Bidirectional.java    From deeplearning4j with Apache License 2.0
public long getNOut() {
    if (this.fwd instanceof LastTimeStep) {
        return ((FeedForwardLayer) ((LastTimeStep) this.fwd).getUnderlying()).getNOut();
    } else {
        return ((FeedForwardLayer) this.fwd).getNOut();
    }
}
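A hedged usage sketch for these two accessors (layer sizes arbitrary): both the direct and the LastTimeStep-wrapped cases report the sizes of the underlying recurrent layer.

import org.deeplearning4j.nn.conf.layers.LSTM;
import org.deeplearning4j.nn.conf.layers.recurrent.Bidirectional;
import org.deeplearning4j.nn.conf.layers.recurrent.LastTimeStep;

Bidirectional direct = new Bidirectional(new LSTM.Builder().nIn(10).nOut(20).build());
Bidirectional wrapped = new Bidirectional(new LastTimeStep(new LSTM.Builder().nIn(10).nOut(20).build()));
System.out.println(direct.getNIn() + " -> " + direct.getNOut());   // 10 -> 20
System.out.println(wrapped.getNIn() + " -> " + wrapped.getNOut()); // 10 -> 20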
 
Example #10
Source File: ElementWiseParamInitializer.java    From deeplearning4j with Apache License 2.0
/**
 * Initialize the parameters
 *
 * @param conf             the configuration
 * @param paramsView       a view of the full network (backprop) parameters
 * @param initializeParams if true: initialize the parameters according to the configuration. If false: don't modify the
 *                         values in the paramsView array (but do select out the appropriate subset, reshape etc as required)
 * @return Map of parameters keyed by type (view of the 'paramsView' array)
 */
@Override
public Map<String, INDArray> init(NeuralNetConfiguration conf, INDArray paramsView, boolean initializeParams) {
    if (!(conf.getLayer() instanceof org.deeplearning4j.nn.conf.layers.FeedForwardLayer))
        throw new IllegalArgumentException("unsupported layer type: " + conf.getLayer().getClass().getName());

    Map<String, INDArray> params = Collections.synchronizedMap(new LinkedHashMap<String, INDArray>());

    val length = numParams(conf);
    if (paramsView.length() != length)
        throw new IllegalStateException(
                "Expected params view of length " + length + ", got length " + paramsView.length());

    org.deeplearning4j.nn.conf.layers.FeedForwardLayer layerConf =
            (org.deeplearning4j.nn.conf.layers.FeedForwardLayer) conf.getLayer();
    val nIn = layerConf.getNIn();

    val nWeightParams = nIn; // element-wise layer: one weight per input
    INDArray weightView = paramsView.get(NDArrayIndex.interval(0,0,true), NDArrayIndex.interval(0, nWeightParams));
    INDArray biasView = paramsView.get(NDArrayIndex.interval(0,0,true),
            NDArrayIndex.interval(nWeightParams, nWeightParams + nIn));


    params.put(WEIGHT_KEY, createWeightMatrix(conf, weightView, initializeParams));
    params.put(BIAS_KEY, createBias(conf, biasView, initializeParams));
    conf.addVariable(WEIGHT_KEY);
    conf.addVariable(BIAS_KEY);

    return params;
}
 
Example #11
Source File: TransferLearning.java    From deeplearning4j with Apache License 2.0
/**
 * Modify the architecture of a vertex layer by changing nIn of the specified layer.<br>
 * Note that only the specified layer will be modified - all other layers will not be changed by this call.
 *
 * @param layerName The name of the layer to change nIn of
 * @param nIn       Value of nIn to change to
 * @param scheme    Weight init scheme to use for params in layerName and the layers following it
 * @return GraphBuilder
 */
public GraphBuilder nInReplace(String layerName, int nIn, IWeightInit scheme) {
    Preconditions.checkState(origGraph.getVertex(layerName) != null, "Layer with name %s not found",
            layerName);
    Preconditions.checkState(origGraph.getVertex(layerName).hasLayer(), "nInReplace can only be applied" +
            " on vertices with layers. Vertex %s does not have a layer", layerName);
    initBuilderIfReq();

    NeuralNetConfiguration layerConf = origGraph.getLayer(layerName).conf();
    Layer layerImpl = layerConf.getLayer().clone();

    Preconditions.checkState(layerImpl instanceof FeedForwardLayer, "Can only use nInReplace on FeedForward layers; " +
            "got layer of type %s for layer name %s", layerImpl.getClass().getSimpleName(), layerName);

    layerImpl.resetLayerDefaultConfig();
    FeedForwardLayer layerImplF = (FeedForwardLayer) layerImpl;
    layerImplF.setWeightInitFn(scheme);
    layerImplF.setNIn(nIn);

    if(editedVertices.contains(layerName) && editedConfigBuilder.getVertices().get(layerName) instanceof LayerVertex
            && nInFromNewConfig.containsKey(layerName)){
        Layer l = ((LayerVertex)editedConfigBuilder.getVertices().get(layerName)).getLayerConf().getLayer();
        if(l instanceof FeedForwardLayer){
            layerImplF.setNIn(nInFromNewConfig.get(layerName));
        }
    }

    editedConfigBuilder.removeVertex(layerName, false);
    LayerVertex lv = (LayerVertex) origConfig.getVertices().get(layerName);
    String[] lvInputs = origConfig.getVertexInputs().get(layerName).toArray(new String[0]);
    editedConfigBuilder.addLayer(layerName, layerImpl, lv.getPreProcessor(), lvInputs);
    editedVertices.add(layerName);

    return this;
}
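A hedged usage sketch of this builder method; the pretrained graph, the layer name "fc1", and the new nIn of 256 are hypothetical:

ComputationGraph edited = new TransferLearning.GraphBuilder(pretrainedGraph) // hypothetical pretrained graph
        .nInReplace("fc1", 256, WeightInit.XAVIER.getWeightInitFunction())   // IWeightInit obtained from the WeightInit enum
        .build();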
 
Example #12
Source File: TransferLearning.java    From deeplearning4j with Apache License 2.0
private void nInReplaceBuild(int layerNum, int nIn, IWeightInit init) {
    Preconditions.checkArgument(layerNum >= 0 && layerNum < editedConfs.size(), "Invalid layer index: must be 0 to " +
            "numLayers-1 = %s inclusive, got %s", editedConfs.size() - 1, layerNum);
    NeuralNetConfiguration layerConf = editedConfs.get(layerNum);
    Layer layerImpl = layerConf.getLayer(); //not a clone - need to modify nIn in place
    Preconditions.checkArgument(layerImpl instanceof FeedForwardLayer, "nInReplace can only be applied on FeedForward layers; " +
            "got layer of type %s", layerImpl.getClass().getSimpleName());
    FeedForwardLayer layerImplF = (FeedForwardLayer) layerImpl;
    layerImplF.setWeightInitFn(init);
    layerImplF.setNIn(nIn);
    long numParams = layerImpl.initializer().numParams(layerConf);
    INDArray params = Nd4j.create(origModel.getLayerWiseConfigurations().getDataType(), 1, numParams);
    org.deeplearning4j.nn.api.Layer someLayer = layerImpl.instantiate(layerConf, null, 0, params, true, dataType);
    editedParams.set(layerNum, someLayer.params());
}
 
Example #13
Source File: DeepFMParameter.java    From jstarcraft-rns with Apache License 2.0
@Override
public Map<String, INDArray> getGradientsFromFlattened(NeuralNetConfiguration configuration, INDArray view) {
    Map<String, INDArray> gradients = new LinkedHashMap<>();
    FeedForwardLayer layerConfiguration = (FeedForwardLayer) configuration.getLayer();
    long numberOfOut = layerConfiguration.getNOut();
    long numberOfWeights = numberOfFeatures * numberOfOut;
    INDArray weight = view.get(NDArrayIndex.point(0), NDArrayIndex.interval(0, numberOfWeights)).reshape('f', numberOfFeatures, numberOfOut); // reshape to (features x out); the view holds numberOfFeatures * numberOfOut values
    INDArray bias = view.get(NDArrayIndex.point(0), NDArrayIndex.interval(numberOfWeights, numberOfWeights + numberOfOut));
    gradients.put(WEIGHT_KEY, weight);
    gradients.put(BIAS_KEY, bias);
    return gradients;
}
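Examples #13, #14, and #20 all assume the same flattened DeepFM layout: a weight block of numberOfFeatures * nOut entries (reshaped in 'f', i.e. column-major, order to numberOfFeatures x nOut) followed by an nOut-long bias block. For instance (sizes hypothetical), with numberOfFeatures = 1000 and nOut = 8 the view holds 1000*8 + 8 = 8008 values, which is exactly what numParams in Example #20 returns.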
 
Example #14
Source File: DeepFMParameter.java    From jstarcraft-rns with Apache License 2.0
@Override
public Map<String, INDArray> init(NeuralNetConfiguration configuration, INDArray view, boolean initialize) {
    Map<String, INDArray> parameters = Collections.synchronizedMap(new LinkedHashMap<String, INDArray>());
    FeedForwardLayer layerConfiguration = (FeedForwardLayer) configuration.getLayer();
    long numberOfOut = layerConfiguration.getNOut();
    long numberOfWeights = numberOfFeatures * numberOfOut;
    INDArray weight = view.get(new INDArrayIndex[] { NDArrayIndex.point(0), NDArrayIndex.interval(0, numberOfWeights) });
    INDArray bias = view.get(NDArrayIndex.point(0), NDArrayIndex.interval(numberOfWeights, numberOfWeights + numberOfOut));

    parameters.put(WEIGHT_KEY, this.createWeightMatrix(configuration, weight, initialize));
    parameters.put(BIAS_KEY, createBias(configuration, bias, initialize));
    configuration.addVariable(WEIGHT_KEY);
    configuration.addVariable(BIAS_KEY);
    return parameters;
}
 
Example #15
Source File: DeepFMParameter.java    From jstarcraft-rns with Apache License 2.0
protected INDArray createWeightMatrix(NeuralNetConfiguration configuration, INDArray view, boolean initialize) {
    FeedForwardLayer layerConfiguration = (FeedForwardLayer) configuration.getLayer();
    if (initialize) {
        Distribution distribution = Distributions.createDistribution(layerConfiguration.getDist());
        return super.createWeightMatrix(numberOfFeatures, layerConfiguration.getNOut(), layerConfiguration.getWeightInit(), distribution, view, true);
    } else {
        return super.createWeightMatrix(numberOfFeatures, layerConfiguration.getNOut(), null, null, view, false);
    }
}
 
Example #16
Source File: CDAEParameter.java    From jstarcraft-rns with Apache License 2.0
@Override
public Map<String, INDArray> getGradientsFromFlattened(NeuralNetConfiguration conf, INDArray gradientView) {
    Map<String, INDArray> out = super.getGradientsFromFlattened(conf, gradientView);
    FeedForwardLayer layerConf = (FeedForwardLayer) conf.getLayer();
    long nIn = layerConf.getNIn();
    long nOut = layerConf.getNOut();
    long nWeightParams = nIn * nOut;
    long nUserWeightParams = numberOfUsers * nOut;
    INDArray userWeightGradientView = gradientView.get(NDArrayIndex.point(0), NDArrayIndex.interval(nWeightParams + nOut, nWeightParams + nOut + nUserWeightParams)).reshape('f', numberOfUsers, nOut);
    out.put(USER_KEY, userWeightGradientView);
    return out;
}
 
Example #17
Source File: CDAEParameter.java    From jstarcraft-rns with Apache License 2.0
@Override
public Map<String, INDArray> init(NeuralNetConfiguration conf, INDArray paramsView, boolean initializeParams) {
    Map<String, INDArray> params = super.init(conf, paramsView, initializeParams);
    FeedForwardLayer layerConf = (FeedForwardLayer) conf.getLayer();
    long nIn = layerConf.getNIn();
    long nOut = layerConf.getNOut();
    long nWeightParams = nIn * nOut;
    long nUserWeightParams = numberOfUsers * nOut;
    INDArray userWeightView = paramsView.get(new INDArrayIndex[] { NDArrayIndex.point(0), NDArrayIndex.interval(nWeightParams + nOut, nWeightParams + nOut + nUserWeightParams) });
    params.put(USER_KEY, this.createUserWeightMatrix(conf, userWeightView, initializeParams));
    conf.addVariable(USER_KEY);
    return params;
}
 
Example #18
Source File: CDAEParameter.java    From jstarcraft-rns with Apache License 2.0
@Override
public long numParams(NeuralNetConfiguration conf) {
    FeedForwardLayer layerConf = (FeedForwardLayer) conf.getLayer();
    return super.numParams(conf) + numberOfUsers * layerConf.getNOut(); // add another user weight matrix
}
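Taken together, Examples #16, #17, and #18 assume a flattened CDAE parameter layout that appends a user-weight block to the standard layout:

[ weights: nIn*nOut | bias: nOut | user weights: numberOfUsers*nOut ]

For instance (sizes hypothetical), with nIn = nOut = 100 and numberOfUsers = 1000, numParams returns 100*100 + 100 + 1000*100 = 110,100, and the user-weight view starts at offset nWeightParams + nOut = 10,100, matching the interval used in Examples #16 and #17.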
 
Example #19
Source File: ElementWiseParamInitializer.java    From deeplearning4j with Apache License 2.0
@Override
public long numParams(Layer layer) {
    FeedForwardLayer layerConf = (FeedForwardLayer) layer;
    val nIn = layerConf.getNIn();
    return nIn*2; //weights + bias
}
 
Example #20
Source File: DeepFMParameter.java    From jstarcraft-rns with Apache License 2.0
@Override
public long numParams(NeuralNetConfiguration configuration) {
    FeedForwardLayer layerConfiguration = (FeedForwardLayer) configuration.getLayer();
    return numberOfFeatures * layerConfiguration.getNOut() + layerConfiguration.getNOut();
}
 
Example #21
Source File: SparkDl4jMultiLayer.java    From deeplearning4j with Apache License 2.0
/**
 * Fit a MultiLayerNetwork using Spark MLLib LabeledPoint instances.
 * This will convert the labeled points to the internal DL4J data format and train the model on that
 *
 * @param rdd the rdd to fitDataSet
 * @return the multi layer network that was fitDataSet
 */
public MultiLayerNetwork fitLabeledPoint(JavaRDD<LabeledPoint> rdd) {
    int nLayers = network.getLayerWiseConfigurations().getConfs().size();
    FeedForwardLayer ffl = (FeedForwardLayer) network.getLayerWiseConfigurations().getConf(nLayers - 1).getLayer();
    JavaRDD<DataSet> ds = MLLibUtil.fromLabeledPoint(sc, rdd, ffl.getNOut());
    return fit(ds);
}
 
Example #22
Source File: SparkDl4jMultiLayer.java    From deeplearning4j with Apache License 2.0
/**
 * Evaluate the network (regression performance) in a distributed manner on the provided data
 *
 * @param data Data to evaluate
 * @param minibatchSize Minibatch size to use when performing evaluation
 * @return     {@link RegressionEvaluation} instance with regression performance
 */
public <T extends RegressionEvaluation> T evaluateRegression(JavaRDD<DataSet> data, int minibatchSize) {
    long nOut = ((FeedForwardLayer) network.getOutputLayer().conf().getLayer()).getNOut();
    return (T)doEvaluation(data, new org.deeplearning4j.eval.RegressionEvaluation(nOut), minibatchSize);
}
 
Example #23
Source File: SparkComputationGraph.java    From deeplearning4j with Apache License 2.0
/**
 * Evaluate the network (regression performance) in a distributed manner on the provided data
 *
 * @param data Data to evaluate
 * @param minibatchSize Minibatch size to use when performing evaluation
 * @return     {@link RegressionEvaluation} instance with regression performance
 */
public <T extends RegressionEvaluation> T evaluateRegression(JavaRDD<DataSet> data, int minibatchSize) {
    val nOut = ((FeedForwardLayer) network.getOutputLayer(0).conf().getLayer()).getNOut();
    return (T)doEvaluation(data, new org.deeplearning4j.eval.RegressionEvaluation(nOut), minibatchSize);
}