package com.intel.analytics.bigdl.nn

Type Members

  1. class Abs[T] extends TensorModule[T]

    an element-wise abs operation

  2. class AbsCriterion[T] extends TensorCriterion[T]

    measures the mean absolute value of the element-wise difference between input and target

  3. class ActivityRegularization[T] extends TensorModule[T]

  4. class Add[T] extends TensorModule[T] with Initializable

Adds a bias term to the input data.

  5. class AddConstant[T] extends TensorModule[T]

Adds a constant to the input.

  6. class Anchor extends Serializable

    Generates a regular grid of multi-scale, multi-aspect anchor boxes.

  7. class Attention[T] extends AbstractModule[Activity, Activity, T]

    Implementation of multiheaded attention and self-attention layers.

  8. class BCECriterion[T] extends TensorCriterion[T]

This loss function measures the binary cross entropy between the target and the output: loss(o, t) = -1/n * sum_i (t[i] * log(o[i]) + (1 - t[i]) * log(1 - o[i])), or, when the weights argument is specified: loss(o, t) = -1/n * sum_i weights[i] * (t[i] * log(o[i]) + (1 - t[i]) * log(1 - o[i]))
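The formula can be sketched in plain Python (an illustrative sketch of the math only, not the BigDL API; the helper name bce_loss is hypothetical):

```python
import math

def bce_loss(output, target, weights=None):
    # loss(o, t) = -1/n * sum_i w[i] * (t[i]*log(o[i]) + (1-t[i])*log(1-o[i]))
    n = len(output)
    total = 0.0
    for i in range(n):
        o, t = output[i], target[i]
        term = t * math.log(o) + (1.0 - t) * math.log(1.0 - o)
        if weights is not None:
            term *= weights[i]
        total += term
    return -total / n
```

With output 0.5 and target 1, the loss reduces to -log(0.5) = log 2.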

  9. case class BatchNormParams[T](eps: Double = 1.0E-5, momentum: Double = 0.1, initWeight: Tensor[T] = null, initBias: Tensor[T] = null, initGradWeight: Tensor[T] = null, initGradBias: Tensor[T] = null, affine: Boolean = true)(implicit evidence$8: ClassTag[T], ev: TensorNumeric[T]) extends Product with Serializable

  10. class BatchNormalization[T] extends TensorModule[T] with Initializable with MklInt8Convertible

This layer implements Batch Normalization as described in the paper "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" by Sergey Ioffe and Christian Szegedy.

  11. class BiRecurrent[T] extends DynamicContainer[Tensor[T], Tensor[T], T]

This layer implements a bidirectional recurrent neural network.

  12. class BifurcateSplitTable[T] extends AbstractModule[Tensor[T], Table, T]

Creates a module that takes a Tensor as input and outputs two tables, splitting the Tensor along the specified dimension.

  13. class Bilinear[T] extends AbstractModule[Table, Tensor[T], T] with Initializable

A bilinear transformation with sparse inputs. The input tensor given in forward(input) is a table containing both inputs x_1 and x_2, which are tensors of size N x inputDimension1 and N x inputDimension2, respectively.

  14. class BinaryThreshold[T] extends TensorModule[T]

    Threshold input Tensor.

  15. class BinaryTreeLSTM[T] extends TreeLSTM[T]

    This class is an implementation of Binary TreeLSTM (Constituency Tree LSTM).

  16. class Bottle[T] extends DynamicContainer[Tensor[T], Tensor[T], T]

    Bottle allows varying dimensionality input to be forwarded through any module that accepts input of nInputDim dimensions, and generates output of nOutputDim dimensions.

  17. class BoxHead extends BaseModule[Float]

  18. class CAdd[T] extends TensorModule[T] with Initializable

    This layer has a bias tensor with given size.

  19. class CAddTable[T, D] extends AbstractModule[Table, Tensor[D], T] with MklInt8Convertible

    Merge the input tensors in the input table by element wise adding them together.

  20. class CAveTable[T] extends AbstractModule[Table, Tensor[T], T]

    Merge the input tensors in the input table by element wise taking the average.

  21. class CDivTable[T] extends AbstractModule[Table, Tensor[_], T]

Takes a table with two Tensors and returns the component-wise division between them.

  22. class CMaxTable[T] extends AbstractModule[Table, Tensor[T], T]

    Takes a table of Tensors and outputs the max of all of them.

  23. class CMinTable[T] extends AbstractModule[Table, Tensor[T], T]

    Takes a table of Tensors and outputs the min of all of them.

  24. class CMul[T] extends TensorModule[T] with Initializable

    This layer has a weight tensor with given size.

  25. class CMulTable[T] extends AbstractModule[Table, Tensor[T], T]

    Takes a table of Tensors and outputs the multiplication of all of them.

  26. class CSubTable[T] extends AbstractModule[Table, Tensor[_], T]

Takes a table with two Tensors and returns the component-wise subtraction between them.

  27. class CategoricalCrossEntropy[T] extends AbstractCriterion[Tensor[T], Tensor[T], T]

This is the same as the cross entropy criterion, except the target tensor is a one-hot tensor.

  28. abstract class Cell[T] extends AbstractModule[Table, Table, T]

    The Cell class is a super class of any recurrent kernels, such as RnnCell, LSTM and GRU.

  29. class Clamp[T] extends HardTanh[T]

A kind of hard tanh activation function with integer min and max.

  30. class ClassNLLCriterion[T] extends TensorCriterion[T]

    The negative log likelihood criterion.

  31. class ClassSimplexCriterion[T] extends MSECriterion[T]

    ClassSimplexCriterion implements a criterion for classification.

  32. class Concat[T] extends DynamicContainer[Tensor[T], Tensor[T], T]

    Concat concatenates the output of one layer of "parallel" modules along the provided dimension: they take the same inputs, and their output is concatenated.

  33. class ConcatTable[T] extends DynamicContainer[Activity, Table, T] with MklInt8Convertible

ConcatTable is a container module like Concat.

  34. case class ConstInitMethod(value: Double) extends InitializationMethod with Product with Serializable

Initializer that generates tensors filled with a given constant double value.

  35. abstract class Container[A <: Activity, B <: Activity, T] extends AbstractModule[A, B, T]

    Container is an abstract AbstractModule class which declares methods defined in all containers.

  36. class Contiguous[T] extends TensorModule[T]

Used to make both input and gradOutput contiguous.

  37. class ConvLSTMPeephole[T] extends Cell[T]

    Convolution Long Short Term Memory architecture with peephole.

  38. class ConvLSTMPeephole3D[T] extends Cell[T]

    Convolution Long Short Term Memory architecture with peephole.

  39. class Cosine[T] extends TensorModule[T] with Initializable

    Cosine calculates the cosine similarity of the input to k mean centers.

  40. class CosineDistance[T] extends AbstractModule[Table, Tensor[T], T]

    outputs the cosine distance between inputs

  41. class CosineDistanceCriterion[T] extends TensorCriterion[T]

    Creates a criterion that measures the loss given an input tensor and target tensor.

  42. class CosineEmbeddingCriterion[T] extends AbstractCriterion[Table, Table, T]

    Creates a criterion that measures the loss given an input x = {x1, x2}, a table of two Tensors, and a Tensor label y with values 1 or -1.

  43. class CosineProximityCriterion[T] extends TensorCriterion[T]

    The negative of the mean cosine proximity between predictions and targets.

  44. class Cropping2D[T] extends TensorModule[T]

Cropping layer for 2D input (e.g. images).

  45. class Cropping3D[T] extends TensorModule[T]

Cropping layer for 3D data (e.g. spatial or spatio-temporal data).

  46. class CrossEntropyCriterion[T] extends TensorCriterion[T]

    This criterion combines LogSoftMax and ClassNLLCriterion in one single class.

  47. class CrossProduct[T] extends AbstractModule[Table, Tensor[T], T]

A layer which takes a table of multiple tensors (n >= 2) as input and calculates the dot product for all combinations of pairs among the input tensors.

  48. class DenseToSparse[T] extends TensorModule[T]

    Convert DenseTensor to SparseTensor.

  49. class DetectionOutputFrcnn extends AbstractModule[Table, Activity, Float]

    Post process Faster-RCNN models

  50. case class DetectionOutputParam(nClasses: Int = 21, shareLocation: Boolean = true, bgLabel: Int = 0, nmsThresh: Float = 0.45, nmsTopk: Int = 400, keepTopK: Int = 200, confThresh: Float = 0.01, varianceEncodedInTarget: Boolean = false) extends Product with Serializable

  51. class DetectionOutputSSD[T] extends AbstractModule[Table, Activity, T]

    Layer to Post-process SSD output

  52. class DiceCoefficientCriterion[T] extends TensorCriterion[T]

The Dice-Coefficient criterion; input: Tensor, target: Tensor.

  53. class DistKLDivCriterion[T] extends TensorCriterion[T]

    The Kullback–Leibler divergence criterion

  54. class DotProduct[T] extends AbstractModule[Table, Tensor[T], T]

A simple table layer which takes a table of two tensors as input and calculates the dot product between them as output.

  55. class DotProductCriterion[T] extends TensorCriterion[T]

    Compute the dot product of input and target tensor.

  56. class Dropout[T] extends TensorModule[T]

Dropout masks (sets to zero) parts of the input using samples from a Bernoulli distribution.

  57. abstract class DynamicContainer[A <: Activity, B <: Activity, T] extends Container[A, B, T]

DynamicContainer allows users to change its submodules after it is created.

  58. class ELU[T] extends TensorModule[T]

Applies the exponential linear unit (ELU) function, described in "Fast and Accurate Deep Network Learning by Exponential Linear Units (ELUs)" by Djork-Arné Clevert, Thomas Unterthiner, and Sepp Hochreiter.

  59. class Echo[T] extends TensorModule[T]

This module is for debugging purposes; it can print the activation and gradient at its position in your model topology.

  60. class Euclidean[T] extends TensorModule[T] with Initializable

    Outputs the Euclidean distance of the input to outputSize centers

  61. class Exp[T] extends TensorModule[T]

    Applies element-wise exp to input tensor.

  62. class ExpandSize[T] extends AbstractModule[Tensor[T], Tensor[T], T]

    Expand tensor to configured size

  63. class FPN[T] extends BaseModule[T]

    Feature Pyramid Network.

  64. class FeedForwardNetwork[T] extends BaseModule[T]

Implementation of a feed-forward network constructed from fully connected layers.

  65. class FlattenTable[T] extends AbstractModule[Table, Table, T]

A table layer which takes an arbitrarily deep table of Tensors (potentially nested) as input and produces a table of Tensors without any nested tables.

  66. class FrameManager[T] extends Serializable

Manages frames in the scheduler.

  67. class GRU[T] extends Cell[T]

    Gated Recurrent Units architecture.

  68. class GaussianCriterion[T] extends AbstractCriterion[Table, Tensor[T], T]

    Computes the log-likelihood of a sample x given a Gaussian distribution p.

  69. class GaussianDropout[T] extends TensorModule[T]

    Apply multiplicative 1-centered Gaussian noise.

  70. class GaussianNoise[T] extends TensorModule[T]

    Apply additive zero-centered Gaussian noise.

  71. class GaussianSampler[T] extends AbstractModule[Table, Tensor[T], T]

    Takes {mean, log_variance} as input and samples from the Gaussian distribution

  72. class GradientReversal[T] extends TensorModule[T]

A simple module that preserves the input, but takes the gradient from the subsequent layer, multiplies it by -lambda, and passes it to the preceding layer.

  73. abstract class Graph[T] extends Container[Activity, Activity, T] with MklInt8Convertible

    A graph container.

  74. trait GraphSerializable extends ContainerSerializable

  75. class HardShrink[T] extends TensorModule[T]

    This is a transfer layer which applies the hard shrinkage function element-wise to the input Tensor.

  76. class HardSigmoid[T] extends TensorModule[T]

    Apply Segment-wise linear approximation of sigmoid.

  77. class HardTanh[T] extends TensorModule[T]

Applies HardTanh to each element of the input. HardTanh is defined as: f(x) = maxValue if x > maxValue; minValue if x < minValue; x otherwise.
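The piecewise definition can be sketched in plain Python (illustrative only, not the BigDL API; the name hard_tanh is hypothetical):

```python
def hard_tanh(x, min_value=-1.0, max_value=1.0):
    # f(x) = max_value if x > max_value; min_value if x < min_value; x otherwise
    if x > max_value:
        return max_value
    if x < min_value:
        return min_value
    return x
```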

  78. class HingeEmbeddingCriterion[T] extends TensorCriterion[T]

    Creates a criterion that measures the loss given an input x which is a 1-dimensional vector and a label y (1 or -1).

  79. class Identity[T] extends AbstractModule[Activity, Activity, T]

Identity simply returns the input as output.

  80. class Index[T] extends AbstractModule[Table, Tensor[T], T]

    Applies the Tensor index operation along the given dimension.

  81. class InferReshape[T] extends TensorModule[T]

    Reshape the input tensor with automatic size inference support.

  82. trait InitializationMethod extends AnyRef

    Initialization method to initialize bias and weight.

  83. class Input[T] extends AbstractModule[Activity, Activity, T]

The Input layer does nothing to the input tensors; it just passes them through.

  84. class JoinTable[T] extends AbstractModule[Table, Tensor[_], T]

A table module which takes a table of Tensors as input and outputs a Tensor by joining them together along the specified dimension.

  85. class KLDCriterion[T] extends AbstractCriterion[Table, Tensor[T], T]

    Computes the KL-divergence of the input normal distribution to a standard normal distribution.
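Assuming the standard closed form for the KL divergence of a diagonal Gaussian N(mu, sigma^2) from a standard normal N(0, 1) (as used in VAEs, with {mean, log_variance} as input), the computation can be sketched in plain Python (illustrative, not the BigDL API):

```python
import math

def kld_to_standard_normal(mean, log_var):
    # KL(N(mu, sigma^2) || N(0, 1)) = -1/2 * sum(1 + log(sigma^2) - mu^2 - sigma^2)
    return sum(-0.5 * (1.0 + lv - m * m - math.exp(lv))
               for m, lv in zip(mean, log_var))
```

The divergence is zero exactly when mean = 0 and log_variance = 0.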

  86. class KullbackLeiblerDivergenceCriterion[T] extends TensorCriterion[T]

This criterion is the same as the kullback_leibler_divergence loss in Keras.

  87. class L1Cost[T] extends TensorCriterion[T]

Computes the L1 norm of the input; its gradient is the sign of the input.

  88. class L1HingeEmbeddingCriterion[T] extends AbstractCriterion[Table, Tensor[T], T]

    Creates a criterion that measures the loss given an input x = {x1, x2}, a table of two Tensors, and a label y (1 or -1):

  89. class L1Penalty[T] extends TensorModule[T]

    adds an L1 penalty to an input (for sparsity).

  90. class LSTM[T] extends Cell[T]

    Long Short Term Memory architecture.

  91. class LSTMPeephole[T] extends Cell[T]

    Long Short Term Memory architecture with peephole.

  92. class LayerNormalization[T] extends BaseModule[T]

    Applies layer normalization.

  93. class LeakyReLU[T] extends TensorModule[T]

A transfer module that applies LeakyReLU, in which the parameter negval sets the slope of the negative part. LeakyReLU is defined as: f(x) = max(0, x) + negval * min(0, x)
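The definition translates directly to plain Python (illustrative only; the name leaky_relu is hypothetical):

```python
def leaky_relu(x, negval=0.01):
    # f(x) = max(0, x) + negval * min(0, x)
    return max(0.0, x) + negval * min(0.0, x)
```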

  94. class Linear[T] extends TensorModule[T] with Initializable with MklInt8Convertible

The Linear module applies a linear transformation to the input data, i.e. y = Wx + b.

  95. class LocallyConnected1D[T] extends TensorModule[T] with Initializable

  96. class LocallyConnected2D[T] extends TensorModule[T] with Initializable

    The LocallyConnected2D layer works similarly to the SpatialConvolution layer, except that weights are unshared, that is, a different set of filters is applied at each different patch of the input.

  97. class Log[T] extends TensorModule[T]

    The Log module applies a log transformation to the input data

  98. class LogSigmoid[T] extends TensorModule[T]

    This class is a transform layer corresponding to the sigmoid function: f(x) = Log(1 / (1 + e ^^ (-x)))

  99. class LogSoftMax[T] extends TensorModule[T]

The LogSoftMax module applies a LogSoftMax transformation to the input data, defined as: f_i(x) = log(exp(x_i) / a), where a = sum_j exp(x_j)
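Equivalently, f_i(x) = x_i - log(sum_j exp(x_j)). A minimal plain-Python sketch (illustrative, not the BigDL API), using the usual max-subtraction trick for numerical stability:

```python
import math

def log_softmax(xs):
    # f_i(x) = x_i - log(sum_j exp(x_j)); subtract the max for stability
    m = max(xs)
    log_sum = m + math.log(sum(math.exp(x - m) for x in xs))
    return [x - log_sum for x in xs]
```

Exponentiating the outputs recovers a probability distribution that sums to 1.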

  100. class LookupTable[T] extends TensorModule[T] with Initializable

    This layer is a particular case of a convolution, where the width of the convolution would be 1.

  101. class LookupTableSparse[T] extends AbstractModule[Activity, Tensor[T], T] with Initializable

    LookupTable for multi-values.

  102. class MM[T] extends AbstractModule[Table, Tensor[T], T]

    Module to perform matrix multiplication on two mini-batch inputs, producing a mini-batch.

  103. class MSECriterion[T] extends TensorCriterion[T]

The mean squared error criterion.

  104. class MV[T] extends AbstractModule[Table, Tensor[T], T]

    It is a module to perform matrix vector multiplication on two mini-batch inputs, producing a mini-batch.

  105. class MapTable[T] extends DynamicContainer[Table, Table, T]

    This class is a container for a single module which will be applied to all input elements.

  106. class MarginCriterion[T] extends TensorCriterion[T]

    Creates a criterion that optimizes a two-class classification (squared) hinge loss (margin-based loss) between input x (a Tensor of dimension 1) and output y.
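Assuming the usual margin-based hinge form with a default margin of 1 (an assumption about the defaults, not taken from this page), the loss can be sketched in plain Python (illustrative; the name hinge_loss is hypothetical):

```python
def hinge_loss(x, y, margin=1.0, squared=False):
    # mean over i of max(0, margin - y[i] * x[i]); optionally squared
    total = 0.0
    for xi, yi in zip(x, y):
        v = max(0.0, margin - yi * xi)
        total += v * v if squared else v
    return total / len(x)
```

A correctly classified sample with y*x >= margin contributes zero loss.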

  107. class MarginRankingCriterion[T] extends AbstractCriterion[Table, Table, T]

    Creates a criterion that measures the loss given an input x = {x1, x2}, a table of two Tensors of size 1 (they contain only scalars), and a label y (1 or -1).

  108. class MaskHead extends BaseModule[Float]

  109. class MaskedSelect[T] extends AbstractModule[Table, Tensor[T], T]

Performs a masked select on a Tensor (in the style of Torch's maskedSelect).

  110. class Masking[T] extends TensorModule[T]

Uses a mask value to skip timesteps for a sequence.

  111. class Max[T] extends TensorModule[T]

    Applies a max operation over dimension dim

  112. class Maxout[T] extends TensorModule[T]

A linear maxout layer, which selects the element-wise maximum of maxoutNumber Linear(inputSize, outputSize) layers.

  113. class Mean[T] extends Sum[T]

    It is a simple layer which applies a mean operation over the given dimension.

  114. class MeanAbsolutePercentageCriterion[T] extends TensorCriterion[T]

This criterion is the same as the mean_absolute_percentage_error loss in Keras.

  115. class MeanSquaredLogarithmicCriterion[T] extends TensorCriterion[T]

This criterion is the same as the mean_squared_logarithmic_error loss in Keras.

  116. class Min[T] extends TensorModule[T]

    Applies a min operation over dimension dim.

  117. class MixtureTable[T] extends AbstractModule[Table, Tensor[T], T]

    Creates a module that takes a table {gater, experts} as input and outputs the mixture of experts (a Tensor or table of Tensors) using a gater Tensor.

  118. trait MklInt8Convertible extends AnyRef

    Trait which provides MKL-DNN functionality to convert from FP32 to INT8

  119. case class MsraFiller(varianceNormAverage: Boolean = true) extends InitializationMethod with Product with Serializable

    A Filler based on the paper [He, Zhang, Ren and Sun 2015]: Specifically accounts for ReLU nonlinearities.

  120. class Mul[T] extends TensorModule[T] with Initializable

Multiplies the incoming data by a single scalar factor.

  121. class MulConstant[T] extends TensorModule[T]

    Multiplies input Tensor by a (non-learnable) scalar constant.

  122. class MultiCriterion[T] extends AbstractCriterion[Activity, Activity, T]

A weighted sum of other criterions, each applied to the same input and target.

  123. class MultiLabelMarginCriterion[T] extends TensorCriterion[T]

    Creates a criterion that optimizes a multi-class multi-classification hinge loss (margin-based loss) between input x and output y (which is a Tensor of target class indices)

  124. class MultiLabelSoftMarginCriterion[T] extends TensorCriterion[T]

    A MultiLabel multiclass criterion based on sigmoid:

  125. class MultiMarginCriterion[T] extends TensorCriterion[T]

    Creates a criterion that optimizes a multi-class classification hinge loss (margin-based loss) between input x and output y (which is a target class index).

  126. class MultiRNNCell[T] extends Cell[T]

Enables users to stack multiple simple cells.

  127. class Narrow[T] extends TensorModule[T]

Narrow applies the narrow operation as a module.

  128. class NarrowTable[T] extends AbstractModule[Table, Table, T]

    Creates a module that takes a table as input and outputs the subtable starting at index offset having length elements (defaults to 1 element).

  129. class Negative[T] extends AbstractModule[Tensor[_], Tensor[_], T]

Computes the negative value of each element of the input tensor.

  130. class NegativeEntropyPenalty[T] extends TensorModule[T]

    Penalize the input multinomial distribution if it has low entropy.

  131. class Nms extends Serializable

Non-Maximum Suppression (NMS) for object detection. NMS addresses the problem of multiple detections clustering near the true location of an object, ideally keeping only one detection per object.

  132. class Normalize[T] extends TensorModule[T]

    Normalizes the input Tensor to have unit L_p norm.
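For a single vector, unit L_p normalization divides each element by the vector's L_p norm; a plain-Python sketch (illustrative, not the BigDL API):

```python
def normalize_lp(v, p=2.0):
    # divide each element by the L_p norm of the whole vector
    norm = sum(abs(x) ** p for x in v) ** (1.0 / p)
    return [x / norm for x in v]
```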

  133. class NormalizeScale[T] extends TensorModule[T]

NormalizeScale is composed of normalize and scale; it is equivalent to the Caffe Normalize layer.

  134. class PGCriterion[T] extends TensorCriterion[T]

    The Criterion to compute the negative policy gradient given a multinomial distribution and the sampled action and reward.

  135. class PReLU[T] extends TensorModule[T] with Initializable

    Applies parametric ReLU, which parameter varies the slope of the negative part.

  136. class Pack[T] extends AbstractModule[Activity, Tensor[_], T]

    Stacks a list of n-dimensional tensors into one (n+1)-dimensional tensor.

  137. class Padding[T] extends TensorModule[T]

    This module adds pad units of padding to dimension dim of the input.

  138. class PairwiseDistance[T] extends AbstractModule[Table, Tensor[T], T]

    It is a module that takes a table of two vectors as input and outputs the distance between them using the p-norm.
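The p-norm distance between two vectors can be sketched in plain Python (illustrative only; the name pairwise_distance is hypothetical):

```python
def pairwise_distance(a, b, p=2.0):
    # p-norm of the element-wise difference between the two vectors
    return sum(abs(x - y) ** p for x, y in zip(a, b)) ** (1.0 / p)
```

With p = 2 this is the Euclidean distance; with p = 1, the Manhattan distance.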

  139. class ParallelCriterion[T] extends AbstractCriterion[Table, Table, T]

    ParallelCriterion is a weighted sum of other criterions each applied to a different input and target.

  140. class ParallelTable[T] extends DynamicContainer[Table, Table, T]

A container module that applies the i-th member module to the i-th input, and outputs the results in the form of a Table.

  141. class PoissonCriterion[T] extends TensorCriterion[T]

This criterion is the same as the Poisson loss in Keras.

  142. class Pooler[T] extends AbstractModule[Table, Tensor[T], T]

    Pooler selects the feature map which matches the size of RoI for RoIAlign

  143. class Power[T] extends TensorModule[T]

    Apply an element-wise power operation with scale and shift.

  144. class PriorBox[T] extends AbstractModule[Activity, Tensor[T], T]

Generates the prior boxes of designated sizes and aspect ratios across all dimensions (H * W). Intended for use with the MultiBox detection method to generate prior boxes.

  145. class Proposal extends AbstractModule[Table, Tensor[Float], Float]

    Outputs object detection proposals by applying estimated bounding-box transformations to a set of regular boxes (called "anchors").

  146. class RReLU[T] extends TensorModule[T]

    Applies the randomized leaky rectified linear unit (RReLU) element-wise to the input Tensor, thus outputting a Tensor of the same dimension.

  147. case class RandomNormal(mean: Double, stdv: Double) extends InitializationMethod with Product with Serializable

    Initializer that generates tensors with a normal distribution.

  148. case class RandomUniform(lower: Double, upper: Double) extends InitializationMethod with Product with Serializable

    Initializer that generates tensors with a uniform distribution.

  149. class ReLU[T] extends Threshold[T] with MklInt8Convertible

Applies the rectified linear unit (ReLU) function element-wise to the input Tensor; the output is thus a Tensor of the same dimension. ReLU is defined as: f(x) = max(0, x)

  150. class ReLU6[T] extends HardTanh[T]

Same as ReLU, except that the rectifying function saturates at x = 6. ReLU6 is defined as: f(x) = min(max(0, x), 6)
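Both definitions are one-liners in plain Python (illustrative only, not the BigDL API):

```python
def relu(x):
    # f(x) = max(0, x)
    return max(0.0, x)

def relu6(x):
    # f(x) = min(max(0, x), 6): ReLU that saturates at 6
    return min(relu(x), 6.0)
```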

  151. class Recurrent[T] extends DynamicContainer[Tensor[T], Tensor[T], T]

The Recurrent module is a container of RNN cells. Different types of RNN cells can be added using the add() function.

  152. class RecurrentDecoder[T] extends Recurrent[T]

The RecurrentDecoder module is a container of RNN cells used to predict the next timestep based on the prediction made at the previous timestep.

  153. class RegionProposal extends AbstractModule[Table, Table, Float]

    Layer for RPN computation.

  154. class Replicate[T] extends TensorModule[T]

Replicate repeats the input nFeatures times along its dim dimension.

  155. class Reshape[T] extends TensorModule[T]

    The forward(input) reshape the input tensor into a size(0) * size(1) * ... tensor, taking the elements row-wise.

  156. class ResizeBilinear[T] extends AbstractModule[Tensor[Float], Tensor[Float], T]

    Resize the input image with bilinear interpolation.

  157. class Reverse[T] extends TensorModule[T]

Reverses the input w.r.t. the given dimension.

  158. class RnnCell[T] extends Cell[T]

Implementation of a vanilla recurrent neural network cell. i2h is the weight matrix from input to hidden units; h2h is the weight matrix from hidden units to themselves through time. The update is defined as: h_t = f(i2h * x_t + h2h * h_{t-1})
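The update rule can be sketched in plain Python with nested lists standing in for the weight matrices (a toy illustration of the math, not the BigDL API; rnn_step is a hypothetical name):

```python
import math

def rnn_step(x, h_prev, i2h, h2h, f=math.tanh):
    # h_t = f(i2h * x_t + h2h * h_{t-1}); matrices given as lists of rows
    def matvec(m, v):
        return [sum(w * vi for w, vi in zip(row, v)) for row in m]
    pre = [a + b for a, b in zip(matvec(i2h, x), matvec(h2h, h_prev))]
    return [f(p) for p in pre]
```

With zero input and a zero h2h matrix, the new hidden state is all zeros regardless of the previous state.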

  159. class RoiAlign[T] extends AbstractModule[Activity, Tensor[T], T]

    Region of interest aligning (RoIAlign) for Mask-RCNN

  160. class RoiPooling[T] extends AbstractModule[Table, Tensor[T], T]

Region of interest pooling. The RoIPooling uses max pooling to convert the features inside any valid region of interest into a small feature map with a fixed spatial extent of pooledH x pooledW (e.g. 7 x 7).

  161. class SReLU[T] extends TensorModule[T] with Initializable

    S-shaped Rectified Linear Unit.

  162. class Scale[T] extends AbstractModule[Tensor[T], Tensor[T], T]

Scale is the combination of CMul and CAdd. It computes the element-wise product of the input and weight, with the shape of the weight "expanded" to match the shape of the input.

  163. class Select[T] extends TensorModule[T]

A simple layer selecting an index of the input tensor in the given dimension.

  164. class SelectTable[T] extends AbstractModule[Table, Activity, T]

    Creates a module that takes a table as input and outputs the element at index index (positive or negative).

  165. class SequenceBeamSearch[T] extends AbstractModule[Table, Activity, T]

    Beam search to find the translated sequence with the highest probability.

  166. class Sequential[T] extends DynamicContainer[Activity, Activity, T] with MklInt8Convertible

    Sequential provides a means to plug layers together in a feed-forward fully connected manner.

  167. class Sigmoid[T] extends TensorModule[T]

    Applies the Sigmoid function element-wise to the input Tensor, thus outputting a Tensor of the same dimension.

  168. class SmoothL1Criterion[T] extends TensorCriterion[T]

    Creates a criterion that can be thought of as a smooth version of the AbsCriterion.

  169. class SmoothL1CriterionWithWeights[T] extends AbstractCriterion[Tensor[T], Table, T]

A smooth version of the AbsCriterion. It uses a squared term if the absolute element-wise error falls below 1.

  170. class SoftMarginCriterion[T] extends TensorCriterion[T]

    Creates a criterion that optimizes a two-class classification logistic loss between input x (a Tensor of dimension 1) and output y (which is a tensor containing either 1s or -1s).

  171. class SoftMax[T] extends TensorModule[T]

    Applies the SoftMax function to an n-dimensional input Tensor, rescaling them so that the elements of the n-dimensional output Tensor lie in the range (0, 1) and sum to 1.

  172. class SoftMin[T] extends TensorModule[T]

    Applies the SoftMin function to an n-dimensional input Tensor, rescaling them so that the elements of the n-dimensional output Tensor lie in the range (0,1) and sum to 1.

  173. class SoftPlus[T] extends TensorModule[T]

    Apply the SoftPlus function to an n-dimensional input tensor.

  174. class SoftShrink[T] extends TensorModule[T]

    Apply the soft shrinkage function element-wise to the input Tensor

  175. class SoftSign[T] extends TensorModule[T]

    Apply SoftSign function to an n-dimensional input Tensor.

  176. class SoftmaxWithCriterion[T] extends TensorCriterion[T]

    Computes the multinomial logistic loss for a one-of-many classification task, passing real-valued predictions through a softmax to get a probability distribution over classes.

  177. class SparseJoinTable[T] extends AbstractModule[Table, Tensor[T], T]

    :: Experimental ::

  178. class SparseLinear[T] extends Linear[T]

    SparseLinear is the sparse version of module Linear.

  179. class SpatialAveragePooling[T] extends TensorModule[T]

    Applies 2D average-pooling operation in kWxkH regions by step size dWxdH steps.

  180. class SpatialBatchNormalization[T] extends BatchNormalization[T]

This layer implements Batch Normalization as described in the paper "Batch Normalization: Accelerating Deep Network Training by Reducing Internal Covariate Shift" by Sergey Ioffe and Christian Szegedy. This implementation is useful for inputs coming from convolution layers.

  181. class SpatialContrastiveNormalization[T] extends TensorModule[T]

    Subtractive + divisive contrast normalization.

  182. class SpatialConvolution[T] extends TensorModule[T] with Initializable with MklInt8Convertible

    Applies a 2D convolution over an input image composed of several input planes.

  183. class SpatialConvolutionMap[T] extends TensorModule[T]

    This class is a generalization of SpatialConvolution.

  184. class SpatialCrossMapLRN[T] extends TensorModule[T]

    Applies Spatial Local Response Normalization between different feature maps.

  185. class SpatialDilatedConvolution[T] extends TensorModule[T] with Initializable

    Apply a 2D dilated convolution over an input image.

  186. class SpatialDivisiveNormalization[T] extends TensorModule[T]

    Applies a spatial division operation on a series of 2D inputs using kernel for computing the weighted average in a neighborhood.

  187. class SpatialDropout1D[T] extends TensorModule[T]

    This version performs the same function as Dropout, however it drops entire 1D feature maps instead of individual elements.

  188. class SpatialDropout2D[T] extends TensorModule[T]

    This version performs the same function as Dropout, however it drops entire 2D feature maps instead of individual elements.

  189. class SpatialDropout3D[T] extends TensorModule[T]

    This version performs the same function as Dropout, however it drops entire 3D feature maps instead of individual elements.

  190. class SpatialFullConvolution[T] extends AbstractModule[Activity, Tensor[T], T] with Initializable

    Apply a 2D full convolution over an input image.

  191. class SpatialMaxPooling[T] extends TensorModule[T]

    Applies 2D max-pooling operation in kWxkH regions by step size dWxdH steps.

  192. class SpatialSeparableConvolution[T] extends AbstractModule[Tensor[T], Tensor[T], T]

Separable convolutions consist of first performing a depthwise spatial convolution (which acts on each input channel separately), followed by a pointwise convolution which mixes together the resulting output channels.

  193. class SpatialShareConvolution[T] extends SpatialConvolution[T]

    Annotations
    @SerialVersionUID( 4479683852714800631L )
  194. class SpatialSubtractiveNormalization[T] extends TensorModule[T]

    Applies a spatial subtraction operation on a series of 2D inputs using kernel for computing the weighted average in a neighborhood.

  195. class SpatialWithinChannelLRN[T] extends TensorModule[T]

    The local response normalization layer performs a kind of “lateral inhibition” by normalizing over local input regions.

  196. class SpatialZeroPadding[T] extends TensorModule[T]

    Each feature map of a given input is padded with specified number of zeros.

  197. class SplitTable[T] extends AbstractModule[Tensor[T], Table, T]

Creates a module that takes a Tensor as input and outputs several tables, splitting the Tensor along the specified dimension.

  198. class Sqrt[T] extends Power[T]

    Apply an element-wise sqrt operation.

  199. class Square[T] extends Power[T]

    Apply an element-wise square operation.

  200. class Squeeze[T] extends AbstractModule[Tensor[_], Tensor[_], T]

    Delete all singleton dimensions or a specific singleton dimension.

  201. class StaticGraph[T] extends Graph[T]

    A graph container.

  202. class Sum[T] extends TensorModule[T]

    It is a simple layer which applies a sum operation over the given dimension.

  203. class TableOperation[T] extends AbstractModule[Table, Tensor[T], T]

When two tensors have different sizes, the smaller tensor is first expanded to the size of the larger one, and then the table operation is applied.

  204. class Tanh[T] extends TensorModule[T]

    Applies the Tanh function element-wise to the input Tensor, thus outputting a Tensor of the same dimension.

  205. class TanhShrink[T] extends TensorModule[T]

    A simple layer that, for each element of the input tensor, applies the following operation during the forward pass: f(x) = x - tanh(x)
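
TanhShrink, as commonly defined (e.g., in Torch), computes f(x) = x - tanh(x) element-wise; a minimal Python sketch of the math (the name `tanh_shrink` is illustrative, not the BigDL API):

```python
import math

def tanh_shrink(x: float) -> float:
    """TanhShrink as commonly defined: f(x) = x - tanh(x)."""
    return x - math.tanh(x)

print(tanh_shrink(0.0))  # -> 0.0
```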

  206. class TemporalConvolution[T] extends TensorModule[T] with Initializable

    Applies a 1D convolution over an input sequence composed of nInputFrame frames.

  207. class TemporalMaxPooling[T] extends TensorModule[T]

    Applies a 1D max-pooling operation in kW regions, moving by dW steps.

  208. class TensorTree[T] extends Serializable

    TensorTree class is used to decode a tensor to a tree structure.

  209. class Threshold[T] extends TensorModule[T]

    Threshold input Tensor.

  210. class Tile[T] extends TensorModule[T]

    Tile repeats the input nFeatures times along the given dim dimension.

  211. class TimeDistributed[T] extends TensorModule[T]

    This layer applies the contained layer to each temporal slice of the input tensor.

  212. class TimeDistributedCriterion[T] extends TensorCriterion[T]

    This class is intended to support inputs with 3 or more dimensions.

  213. class TimeDistributedMaskCriterion[T] extends TensorCriterion[T]

    This class is intended to support inputs with 3 or more dimensions.

  214. class Transformer[T] extends AbstractModule[Activity, Activity, T]

    Transformer model from "Attention Is All You Need".

  215. class TransformerCriterion[T] extends AbstractCriterion[Activity, Activity, T]

    A criterion that takes two modules to transform the input and target, and one criterion to compute the loss on the transformed input and target.

  216. sealed trait TransformerType extends AnyRef

  217. class Transpose[T] extends AbstractModule[Tensor[_], Tensor[_], T]

    Transpose input along specified dimensions

  218. abstract class TreeLSTM[T] extends AbstractModule[Table, Tensor[T], T]

  219. class Unsqueeze[T] extends AbstractModule[Tensor[_], Tensor[_], T]

    Insert a singleton dimension (i.e., a dimension of size 1) into the input tensor.

  220. class UpSampling1D[T] extends TensorModule[T]

    Upsampling layer for 1D inputs.

  221. class UpSampling2D[T] extends TensorModule[T]

    Upsampling layer for 2D inputs.

  222. class UpSampling3D[T] extends TensorModule[T]

    Upsampling layer for 3D inputs.

  223. trait VariableFormat extends AnyRef

    VariableFormat describe the meaning of each dimension of the variable (the trainable parameters of a model like weight and bias) and can be used to return the fan in and fan out size of the variable when provided the variable shape.

  224. class View[T] extends TensorModule[T]

    This module creates a new view of the input tensor using the sizes passed to the constructor.

  225. class VolumetricAveragePooling[T] extends TensorModule[T]

    Applies a 3D average-pooling operation in kTxkWxkH regions, moving by dTxdWxdH steps.

  226. class VolumetricConvolution[T] extends TensorModule[T] with Initializable

    Applies a 3D convolution over an input image composed of several input planes.

  227. class VolumetricFullConvolution[T] extends AbstractModule[Activity, Tensor[T], T] with Initializable

    Applies a 3D full convolution over a 3D input image, a sequence of images, or a video.

  228. class VolumetricMaxPooling[T] extends TensorModule[T]

    Applies a 3D max-pooling operation in kTxkWxkH regions, moving by dTxdWxdH steps.

Value Members

  1. object Abs extends Serializable

  2. object AbsCriterion extends Serializable

  3. object ActivityRegularization extends Serializable

  4. object Add extends Serializable

  5. object AddConstant extends Serializable

  6. object Anchor extends Serializable

  7. object Attention extends Serializable

  8. object BCECriterion extends Serializable

  9. object BatchNormalization extends ModuleSerializable with Serializable

  10. object BiRecurrent extends ContainerSerializable with Serializable

  11. object BifurcateSplitTable extends Serializable

  12. object Bilinear extends Serializable

  13. object BilinearFiller extends InitializationMethod with Product with Serializable

    Initialize the weight with coefficients for bilinear interpolation.

  14. object BinaryThreshold extends Serializable

  15. object BinaryTreeLSTM extends ModuleSerializable with Serializable

  16. object Bottle extends Serializable

  17. object BoxHead extends Serializable

  18. object CAdd extends Serializable

  19. object CAddTable extends ModuleSerializable with Serializable

  20. object CAveTable extends Serializable

  21. object CDivTable extends Serializable

  22. object CMaxTable extends Serializable

  23. object CMinTable extends Serializable

  24. object CMul extends Serializable

  25. object CMulTable extends Serializable

  26. object CMulTableExpand

  27. object CSubTable extends Serializable

  28. object CSubTableExpand

  29. object CategoricalCrossEntropy extends Serializable

  30. object CellSerializer extends ModuleSerializable

  31. object Clamp extends Serializable

  32. object ClassNLLCriterion extends Serializable

  33. object ClassSimplexCriterion extends Serializable

  34. object Concat extends Serializable

  35. object ConcatTable extends Serializable

  36. object Contiguous extends Serializable

  37. object ConvLSTMPeephole extends Serializable

  38. object ConvLSTMPeephole3D extends Serializable

  39. object Cosine extends Serializable

  40. object CosineDistance extends Serializable

  41. object CosineDistanceCriterion extends Serializable

  42. object CosineEmbeddingCriterion extends Serializable

  43. object CosineProximityCriterion extends Serializable

  44. object Cropping2D extends Serializable

  45. object Cropping3D extends Serializable

  46. object CrossEntropyCriterion extends Serializable

  47. object CrossProduct extends Serializable

  48. object DenseToSparse extends Serializable

  49. object DetectionOutputFrcnn extends Serializable

  50. object DetectionOutputSSD extends Serializable

  51. object DiceCoefficientCriterion extends Serializable

  52. object DistKLDivCriterion extends Serializable

  53. object DotProduct extends Serializable

  54. object DotProductCriterion extends Serializable

  55. object Dropout extends Serializable

  56. object ELU extends Serializable

  57. object Echo extends ModuleSerializable with Serializable

  58. object ErrorInfo

  59. object Euclidean extends Serializable

  60. object Exp extends Serializable

  61. object ExpandSize extends Serializable

  62. object FPN extends Serializable

  63. object FeedForwardNetwork extends Serializable

  64. object FlattenTable extends Serializable

  65. object FrameManager extends Serializable

  66. object GRU extends Serializable

  67. object GaussianCriterion extends Serializable

  68. object GaussianDropout extends Serializable

  69. object GaussianNoise extends Serializable

  70. object GaussianSampler extends Serializable

  71. object GradientReversal extends Serializable

  72. object Graph extends GraphSerializable with Serializable

  73. object HardShrink extends Serializable

  74. object HardSigmoid extends Serializable

  75. object HardTanh extends Serializable

  76. object Highway

  77. object HingeEmbeddingCriterion extends Serializable

  78. object Identity extends Serializable

  79. object Index extends Serializable

  80. object InferReshape extends Serializable

  81. object Input extends Serializable

  82. object JoinTable extends Serializable

  83. object KLDCriterion extends Serializable

  84. object KullbackLeiblerDivergenceCriterion extends Serializable

  85. object L1Cost extends Serializable

  86. object L1HingeEmbeddingCriterion extends Serializable

  87. object L1Penalty extends Serializable

  88. object LSTM extends Serializable

  89. object LSTMPeephole extends Serializable

  90. object LanguageModel extends TransformerType with Product with Serializable

  91. object LeakyReLU extends Serializable

  92. object Linear extends Quantizable with Serializable

  93. object LocallyConnected1D extends Serializable

  94. object LocallyConnected2D extends Serializable

  95. object Log extends Serializable

  96. object LogSigmoid extends Serializable

  97. object LogSoftMax extends Serializable

  98. object LookupTable extends Serializable

  99. object LookupTableSparse extends Serializable

  100. object MM extends Serializable

  101. object MSECriterion extends Serializable

  102. object MV extends Serializable

  103. object MapTable extends ContainerSerializable with Serializable

  104. object MarginCriterion extends Serializable

  105. object MarginRankingCriterion extends Serializable

  106. object MaskHead extends Serializable

  107. object MaskedSelect extends ModuleSerializable with Serializable

  108. object Masking extends Serializable

  109. object Max extends Serializable

  110. object Maxout extends ModuleSerializable with Serializable

  111. object Mean extends Serializable

  112. object MeanAbsolutePercentageCriterion extends Serializable

  113. object MeanSquaredLogarithmicCriterion extends Serializable

  114. object Min extends Serializable

  115. object MixtureTable extends Serializable

  116. object Module

  117. object Mul extends Serializable

  118. object MulConstant extends Serializable

  119. object MultiCriterion extends Serializable

  120. object MultiLabelMarginCriterion extends Serializable

  121. object MultiLabelSoftMarginCriterion extends Serializable

  122. object MultiMarginCriterion extends Serializable

  123. object MultiRNNCell extends ModuleSerializable with Serializable

  124. object Narrow extends Serializable

  125. object NarrowTable extends Serializable

  126. object Negative extends Serializable

  127. object NegativeEntropyPenalty extends Serializable

  128. object NormMode extends Enumeration

  129. object Normalize extends Serializable

  130. object NormalizeScale extends Serializable

  131. object Ones extends InitializationMethod with Product with Serializable

    Initializer that generates tensors filled with ones.

  132. object PGCriterion extends Serializable

  133. object PReLU extends Serializable

  134. object Pack extends Serializable

  135. object Padding extends Serializable

  136. object PairwiseDistance extends Serializable

  137. object ParallelCriterion extends Serializable

  138. object ParallelTable extends Serializable

  139. object PoissonCriterion extends Serializable

  140. object Pooler extends Serializable

  141. object Power extends Serializable

  142. object PriorBox extends Serializable

  143. object Proposal extends Serializable

  144. object RReLU extends Serializable

  145. object RandomUniform extends InitializationMethod with Product with Serializable

    Initializer that generates tensors with a uniform distribution.

  146. object ReLU extends Serializable

  147. object ReLU6 extends Serializable

  148. object Recurrent extends ContainerSerializable with Serializable

  149. object RecurrentDecoder extends ContainerSerializable with Serializable

  150. object RegionProposal extends Serializable

  151. object Replicate extends Serializable

  152. object Reshape extends ModuleSerializable with Serializable

  153. object ResizeBilinear extends Serializable

  154. object Reverse extends Serializable

  155. object RnnCell extends Serializable

  156. object RoiAlign extends Serializable

  157. object RoiPooling extends Serializable

  158. object SReLU extends ModuleSerializable with Serializable

  159. object Scale extends ModuleSerializable with Serializable

  160. object Select extends Serializable

  161. object SelectTable extends Serializable

  162. object SequenceBeamSearch extends Serializable

  163. object Sequential extends Serializable

  164. object Sigmoid extends Serializable

  165. object SmoothL1Criterion extends Serializable

  166. object SmoothL1CriterionWithWeights extends Serializable

  167. object SoftMarginCriterion extends Serializable

  168. object SoftMax extends Serializable

  169. object SoftMin extends Serializable

  170. object SoftPlus extends Serializable

  171. object SoftShrink extends Serializable

  172. object SoftSign extends Serializable

  173. object SoftmaxWithCriterion extends Serializable

  174. object SparseJoinTable extends Serializable

  175. object SparseLinear extends Serializable

  176. object SpatialAveragePooling extends Serializable

  177. object SpatialBatchNormalization extends Serializable

  178. object SpatialContrastiveNormalization extends ModuleSerializable with Serializable

  179. object SpatialConvolution extends Quantizable with Serializable

  180. object SpatialConvolutionMap extends Serializable

  181. object SpatialCrossMapLRN extends Serializable

  182. object SpatialDilatedConvolution extends Quantizable with Serializable

  183. object SpatialDivisiveNormalization extends ModuleSerializable with Serializable

  184. object SpatialDropout1D extends Serializable

  185. object SpatialDropout2D extends Serializable

  186. object SpatialDropout3D extends Serializable

  187. object SpatialFullConvolution extends ModuleSerializable with Serializable

  188. object SpatialMaxPooling extends ModuleSerializable with Serializable

  189. object SpatialSeparableConvolution extends ModuleSerializable with Serializable

  190. object SpatialShareConvolution extends Serializable

  191. object SpatialSubtractiveNormalization extends ModuleSerializable with Serializable

  192. object SpatialWithinChannelLRN extends Serializable

  193. object SpatialZeroPadding extends Serializable

  194. object SplitTable extends Serializable

  195. object Sqrt extends Serializable

  196. object Square extends Serializable

  197. object Squeeze extends Serializable

  198. object Sum extends Serializable

  199. object Tanh extends Serializable

  200. object TanhShrink extends Serializable

  201. object TemporalConvolution extends Serializable

  202. object TemporalMaxPooling extends Serializable

  203. object Threshold extends Serializable

  204. object Tile extends Serializable

  205. object TimeDistributed extends ModuleSerializable with Serializable

  206. object TimeDistributedCriterion extends Serializable

  207. object TimeDistributedMaskCriterion extends Serializable

  208. object Transformer extends ModuleSerializable with Serializable

  209. object TransformerCriterion extends Serializable

  210. object Translation extends TransformerType with Product with Serializable

  211. object Transpose extends ModuleSerializable with Serializable

  212. object Unsqueeze extends Serializable

  213. object UpSampling1D extends Serializable

  214. object UpSampling2D extends Serializable

  215. object UpSampling3D extends Serializable

  216. object Utils

  217. object VariableFormat

  218. object View extends Serializable

  219. object VolumetricAveragePooling extends ModuleSerializable with Serializable

  220. object VolumetricConvolution extends Serializable

  221. object VolumetricFullConvolution extends Serializable

  222. object VolumetricMaxPooling extends ModuleSerializable with Serializable

  223. object Xavier extends InitializationMethod with Product with Serializable

    Xavier initialization scales the initial weights by the layer's fan-in and fan-out so that the variance of activations and gradients is roughly preserved from layer to layer; in short, it helps signals reach deep into the network.
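
A common formulation of Xavier (Glorot) uniform initialization samples weights from U(-limit, limit) with limit = sqrt(6 / (fanIn + fanOut)); this Python sketch illustrates the scaling (the name `xavier_uniform` is illustrative, and BigDL's exact variant may differ):

```python
import math
import random

def xavier_uniform(fan_in: int, fan_out: int, n: int, seed: int = 0) -> list:
    """Sample n weights from U(-limit, limit), limit = sqrt(6 / (fan_in + fan_out))."""
    limit = math.sqrt(6.0 / (fan_in + fan_out))
    rng = random.Random(seed)
    return [rng.uniform(-limit, limit) for _ in range(n)]

# With fan_in = fan_out = 3 the limit is exactly 1.0, so all samples lie in [-1, 1]:
weights = xavier_uniform(3, 3, 1000)
assert all(-1.0 <= w <= 1.0 for w in weights)
```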

  224. object Zeros extends InitializationMethod with Product with Serializable

    Initializer that generates tensors with zeros.

  225. package abstractnn

  226. package keras

  227. package mkldnn

  228. package onnx

  229. package ops

  230. package quantized

  231. package tf

Ungrouped