Python mxnet.nd.Convolution() Examples
The following are 3 code examples of mxnet.nd.Convolution().
You may also want to check out all available functions and classes of the module mxnet.nd.
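Before the project examples, here is a minimal, self-contained sketch of the basic call. mxnet.nd.Convolution() is the imperative NDArray convolution operator, so the weight and bias arrays are passed in explicitly alongside the kernel size and number of filters. The shapes and names below are illustrative assumptions, not taken from the examples that follow.

from mxnet import nd

# Input in NCHW layout: batch x in_channels x height x width.
x = nd.random.uniform(shape=(1, 3, 32, 32))
# Weight layout: num_filter x in_channels x kernel_h x kernel_w.
w = nd.random.normal(shape=(8, 3, 3, 3))
# One bias value per output filter.
b = nd.zeros(shape=(8,))

y = nd.Convolution(data=x, weight=w, bias=b,
                   kernel=(3, 3), stride=(1, 1), pad=(1, 1),
                   num_filter=8)
print(y.shape)  # (1, 8, 32, 32)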
Example #1
Source File: model.py From dynamic-training-with-apache-mxnet-on-aws with Apache License 2.0
def forward(self, x):
    # x shape is batch_size x in_channels x height x width
    return nd.Convolution(
        data=x,
        weight=self._spectral_norm(),
        kernel=(self.kernel_size, self.kernel_size),
        pad=(self.padding, self.padding),
        stride=(self.strides, self.strides),
        num_filter=self.num_filter,
        no_bias=True
    )
Example #2
Source File: model.py From training_results_v0.6 with Apache License 2.0
def forward(self, x):
    # x shape is batch_size x in_channels x height x width
    return nd.Convolution(
        data=x,
        weight=self._spectral_norm(),
        kernel=(self.kernel_size, self.kernel_size),
        pad=(self.padding, self.padding),
        stride=(self.strides, self.strides),
        num_filter=self.num_filter,
        no_bias=True
    )
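Examples #1 and #2 call a self._spectral_norm() helper that is not shown here. As a rough illustration only, a helper of this kind is commonly implemented by estimating the weight matrix's largest singular value with power iteration and dividing the weight by it. The sketch below is an assumption about that pattern (including the names self.weight and self.u and the single-iteration loop), not the project's actual code.

def _spectral_norm(self):
    # Assumed sketch: flatten the 4-D conv weight to 2-D,
    # shape (num_filter, in_channels * kH * kW).
    w = self.weight.data()
    w_mat = w.reshape((w.shape[0], -1))

    # One power-iteration step, reusing a persistent estimate of the
    # leading left-singular vector stored in self.u (shape (1, num_filter)).
    u = self.u.data()
    v = nd.L2Normalization(nd.dot(u, w_mat))
    u = nd.L2Normalization(nd.dot(v, w_mat.T))

    # Estimated spectral norm (largest singular value) of w_mat.
    sigma = nd.sum(nd.dot(u, w_mat) * v)

    self.u.set_data(u)
    return w / sigma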
Example #3
Source File: convert_conv2d.py From Quantization.MXNet with MIT License
def _add_fake_bn_ema_hook(m):
    def _ema_hook(m, x):
        x = x[0]
        weight = m.weight.data()
        bias = nd.zeros(shape=weight.shape[0], ctx=weight.context) if m.bias is None else m.bias.data()
        y = nd.Convolution(x, weight, bias, **m._kwargs)
        num_samples = y.shape[0] * y.shape[2] * y.shape[3]
        m.current_mean = y.sum(axis=(0, 2, 3)) / num_samples
        diff_square = (y - m.current_mean.reshape(1, -1, 1, 1)) ** 2
        m.current_var = diff_square.sum(axis=(0, 2, 3)) / num_samples

    m.register_forward_pre_hook(_ema_hook)
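A hedged usage sketch for Example #3, assuming the hook is attached to an initialized Gluon Conv2D block (the layer sizes and input shape below are arbitrary): after one forward pass, the per-channel statistics recorded by the pre-hook can be read back from the block.

from mxnet import nd
from mxnet.gluon import nn

# in_channels is given explicitly so the parameters are initialized up front,
# since the hook reads m.weight.data() before the block's own forward pass.
conv = nn.Conv2D(channels=16, kernel_size=3, padding=1, in_channels=3)
conv.initialize()

_add_fake_bn_ema_hook(conv)

x = nd.random.uniform(shape=(4, 3, 32, 32))
_ = conv(x)  # the registered pre-hook fires first and records the statistics

print(conv.current_mean.shape)  # (16,)
print(conv.current_var.shape)   # (16,)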