BigDL module to be optimized
BigDL criterion method
The size (Tensor dimensions) of the feature data, e.g. an image with width * height = 28 * 28 has featureSize = Array(28, 28).
The size (Tensor dimensions) of the label data.
BigDL criterion method
The size (Tensor dimensions) of the feature data.
The size (Tensor dimensions) of the feature data, e.g. an image with width * height = 28 * 28 has featureSize = Array(28, 28).
The size (Tensor dimensions) of the label data.
The maximum number of epochs for training.
BigDL module to be optimized
Subclasses can override this method to return the model required for different transform tasks.
Subclasses can override this method to return the model required for different transform tasks.
DLEstimator helps train a BigDL Model with the Spark ML Estimator/Transformer pattern, so Spark users can conveniently fit BigDL into a Spark ML pipeline.
DLEstimator supports feature and label data in the format of Array[Double], Array[Float], org.apache.spark.mllib.linalg.{Vector, VectorUDT} for Spark 1.5 and 1.6, and org.apache.spark.ml.linalg.{Vector, VectorUDT} for Spark 2.0+. Label data can also be of DoubleType. Users should specify the feature and label data dimensions via the constructor parameters featureSize and labelSize respectively. Internally, the feature and label data are converted to BigDL tensors so that the BigDL model can be trained efficiently.
For detailed usage, please refer to the examples in package com.intel.analytics.bigdl.example.MLPipeline.
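The constructor and fit/transform flow described above can be sketched as follows. This is a minimal, hedged example: it assumes the pre-2.5 BigDL API where DLEstimator lives under org.apache.spark.ml, and a pre-existing DataFrame `df` with "features" and "label" columns; the layer sizes, batch size, and column names are illustrative only.

```scala
// Sketch of training a BigDL model via DLEstimator (assumed API; package
// path and class location vary across BigDL versions).
import com.intel.analytics.bigdl.nn.{Linear, MSECriterion, Sequential}
import org.apache.spark.ml.DLEstimator

// A trivial regression model: 2 input features -> 1 output.
val model = Sequential[Double]().add(Linear[Double](2, 1))
val criterion = MSECriterion[Double]()

// featureSize = Array(2): each feature row is a length-2 vector;
// labelSize   = Array(1): each label is a single scalar.
val estimator = new DLEstimator[Double](model, criterion, Array(2), Array(1))
  .setBatchSize(4)
  .setMaxEpoch(10)

// `df` is assumed to have "features" (e.g. Array[Double]) and "label" columns.
val dlModel = estimator.fit(df)          // returns a DLModel, a Spark ML Transformer
val predictions = dlModel.transform(df)  // appends a prediction column
```

Because DLModel is a standard Spark ML Transformer, the fitted model can be placed directly into a Pipeline alongside other Spark ML stages.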