Performs a back-propagation step through the criterion, with respect to the given input.
input: input data
target: target data / labels
returns: gradient corresponding to the input data
Deep copy this criterion.
Takes an input object and computes the corresponding loss of the criterion, compared with the target.
input: input data
target: target data / labels
returns: the loss of the criterion
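A minimal usage sketch of the forward/backward pair described above, assuming BigDL's Scala API (the package paths, the Tensor factory, and the KullbackLeiblerDivergenceCriterion constructor are taken from BigDL's public API; the tensor shape and random values are illustrative only):

```scala
import com.intel.analytics.bigdl.nn.KullbackLeiblerDivergenceCriterion
import com.intel.analytics.bigdl.tensor.Tensor
import com.intel.analytics.bigdl.numeric.NumericFloat  // implicit numeric for Float

val criterion = KullbackLeiblerDivergenceCriterion[Float]()

// input holds the predictions (y_pred), target holds the labels (y_true)
val input  = Tensor[Float](2, 3).rand()
val target = Tensor[Float](2, 3).rand()

// forward: computes the loss of the criterion for (input, target)
val loss = criterion.forward(input, target)

// backward: computes the gradient of the loss with respect to input;
// the result has the same shape as input
val gradInput = criterion.backward(input, target)
```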
Performs the back-propagation step; the gradient with respect to the input is computed element-wise as -target / input.
input: input data
target: target data / labels
returns: gradient with respect to the input
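To make the -target / input rule concrete, here is a self-contained element-wise sketch in plain Scala (written for this note, not taken from the library); the clipping mirrors the K.clip(..., K.epsilon(), 1) step of the loss, with Keras' default epsilon of 1e-7 assumed:

```scala
// Gradient of y_true * log(y_true / y_pred) with respect to y_pred:
// d/dy_pred [y_true * (log(y_true) - log(y_pred))] = -y_true / y_pred,
// i.e. back-propagation with -target / input.
val epsilon = 1e-7
def clip(x: Double): Double = math.min(math.max(x, epsilon), 1.0)

def klGradInput(input: Array[Double], target: Array[Double]): Array[Double] =
  input.zip(target).map { case (yPred, yTrue) => -clip(yTrue) / clip(yPred) }
```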
Computes the loss. It calculates:
y_true = K.clip(y_true, K.epsilon(), 1)
y_pred = K.clip(y_pred, K.epsilon(), 1)
and outputs K.sum(y_true * K.log(y_true / y_pred), axis=-1).
input: input of the criterion
target: target or labels
returns: the loss of the criterion
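The same formula as a self-contained sketch in plain Scala (illustrative, not library code), treating one sample as an array so that the sum over axis=-1 becomes a sum over its entries; Keras' default epsilon of 1e-7 is assumed:

```scala
val epsilon = 1e-7
def clip(x: Double): Double = math.min(math.max(x, epsilon), 1.0)

// K.sum(y_true * K.log(y_true / y_pred), axis=-1) for a single sample
def klLoss(yTrue: Array[Double], yPred: Array[Double]): Double =
  yTrue.zip(yPred).map { case (t, p) =>
    val (tc, pc) = (clip(t), clip(p))
    tc * math.log(tc / pc)
  }.sum

// Worked check: y_true = [0.5, 0.5], y_pred = [0.9, 0.1]
// gives 0.5 * ln(0.5 / 0.9) + 0.5 * ln(0.5 / 0.1) ≈ 0.5108
println(f"${klLoss(Array(0.5, 0.5), Array(0.9, 0.1))}%.4f")
```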
This criterion is the same as the kullback_leibler_divergence loss in Keras. The loss is calculated as:
y_true = K.clip(y_true, K.epsilon(), 1)
y_pred = K.clip(y_pred, K.epsilon(), 1)
output = K.sum(y_true * K.log(y_true / y_pred), axis=-1)
The numeric type in the criterion is usually Float or Double.
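Finally, a quick finite-difference check (again an illustrative sketch, not library code) tying the loss and gradient formulas together: at points where the clipping is inactive, the analytic gradient -y_true / y_pred should agree with a central difference of the loss:

```scala
object KLGradCheck extends App {
  val epsilon = 1e-7
  def clip(x: Double): Double = math.min(math.max(x, epsilon), 1.0)
  def loss(t: Array[Double], p: Array[Double]): Double =
    t.zip(p).map { case (ti, pi) => clip(ti) * math.log(clip(ti) / clip(pi)) }.sum

  val yTrue = Array(0.5, 0.5)
  val yPred = Array(0.9, 0.1)
  val h = 1e-6
  for (i <- yPred.indices) {
    val numeric = (loss(yTrue, yPred.updated(i, yPred(i) + h)) -
                   loss(yTrue, yPred.updated(i, yPred(i) - h))) / (2 * h)
    val analytic = -clip(yTrue(i)) / clip(yPred(i))
    // the two values should agree to roughly five decimal places
    println(f"i=$i numeric=$numeric%.5f analytic=$analytic%.5f")
  }
}
```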