# Objective Functions

Objective functions are used to measure the performance of a model.

## Function Descriptions

class ztlearn.objectives.BinaryCrossEntropy[source]

Binary Cross Entropy

Binary Cross Entropy measures the performance of a classification model whose output is a probability value between 0 and 1. ‘Binary’ here refers to discrete classification tasks in which the classes are independent and not mutually exclusive. Targets are scalar values of either 0 or 1.

References

[1] Cross Entropy
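The loss can be sketched in plain NumPy (an illustrative sketch, not the ztlearn implementation, which may differ in its clipping and averaging details; clipping guards the log against saturated predictions):

```python
import numpy as np

def binary_cross_entropy(predictions, targets, epsilon=1e-15):
    """Mean binary cross-entropy; predictions are probabilities in (0, 1)."""
    # Clip to avoid log(0) when a prediction saturates at exactly 0 or 1.
    p = np.clip(predictions, epsilon, 1.0 - epsilon)
    return np.mean(-targets * np.log(p) - (1.0 - targets) * np.log(1.0 - p))

preds   = np.array([0.9, 0.1, 0.8])
targets = np.array([1.0, 0.0, 1.0])
loss = binary_cross_entropy(preds, targets)
```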
accuracy(predictions, targets, threshold=0.5)[source]

Calculates the BinaryCrossEntropy Accuracy Score given predictions and targets

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array
- threshold (numpy.float32) – the threshold value

Returns: the BinaryCrossEntropy Accuracy Score (numpy.float32)
derivative(predictions, targets, np_type)[source]

Applies the BinaryCrossEntropy Derivative to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the BinaryCrossEntropy Derivative output (numpy.array)
loss(predictions, targets, np_type)[source]

Applies the BinaryCrossEntropy Loss to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the BinaryCrossEntropy Loss output (numpy.array)
objective_name
class ztlearn.objectives.CategoricalCrossEntropy[source]

Categorical Cross Entropy

Categorical Cross Entropy measures the performance of a classification model whose output is a probability value between 0 and 1. ‘Categorical’ is meant for discrete classification tasks in which the classes are mutually exclusive.

References

[1] Cross Entropy
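With mutually exclusive classes, targets are one-hot rows and the loss picks out the log-probability of the true class. A minimal NumPy sketch (illustrative, not the ztlearn implementation):

```python
import numpy as np

def categorical_cross_entropy(predictions, targets, epsilon=1e-15):
    """Mean categorical cross-entropy; targets are one-hot rows."""
    p = np.clip(predictions, epsilon, 1.0 - epsilon)
    # One-hot targets zero out every term but the true class per row.
    return np.mean(-np.sum(targets * np.log(p), axis=-1))

preds   = np.array([[0.7, 0.2, 0.1],
                    [0.1, 0.8, 0.1]])
one_hot = np.array([[1.0, 0.0, 0.0],
                    [0.0, 1.0, 0.0]])
loss = categorical_cross_entropy(preds, one_hot)
```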
accuracy(predictions, targets)[source]

Calculates the CategoricalCrossEntropy Accuracy Score given predictions and targets

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the CategoricalCrossEntropy Accuracy Score (numpy.float32)
derivative(predictions, targets, np_type)[source]

Applies the CategoricalCrossEntropy Derivative to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the CategoricalCrossEntropy Derivative output (numpy.array)
loss(predictions, targets, np_type)[source]

Applies the CategoricalCrossEntropy Loss to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the CategoricalCrossEntropy Loss output (numpy.array)
objective_name
class ztlearn.objectives.HellingerDistance[source]

Bases: object

Hellinger Distance

Hellinger Distance is used to quantify the similarity between two probability distributions.

References

[1] Hellinger Distance
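For discrete distributions p and q, the distance is (1/√2)·‖√p − √q‖₂, which explains the SQRT_2 class constant below. A NumPy sketch (illustrative, not the ztlearn implementation):

```python
import numpy as np

SQRT_2 = np.sqrt(2.0)

def hellinger_distance(p, q):
    """Hellinger distance between two discrete probability distributions.

    Bounded in [0, 1]: 0 for identical distributions, 1 for disjoint support.
    """
    return np.linalg.norm(np.sqrt(p) - np.sqrt(q)) / SQRT_2

p = np.array([0.25, 0.25, 0.5])
q = np.array([0.5, 0.25, 0.25])
d = hellinger_distance(p, q)
```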
SQRT_2 = 1.4142135623730951
accuracy(predictions, targets, threshold=0.5)[source]
derivative(predictions, targets, np_type)[source]

Applies the HellingerDistance Derivative to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the HellingerDistance Derivative output (numpy.array)
loss(predictions, targets, np_type)[source]

Applies the HellingerDistance Loss to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the HellingerDistance Loss output (numpy.array)
objective_name
sqrt_difference(predictions, targets)[source]
class ztlearn.objectives.HingeLoss[source]

Bases: object

Hinge Loss

Hinge Loss, also known as SVM Loss, is used for “maximum-margin” classification, most notably for support vector machines (SVMs).

References

[1] Hinge loss
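With labels in {−1, +1} and raw scores as predictions, the hinge loss penalizes any example whose margin falls below 1. A NumPy sketch (illustrative, not the ztlearn implementation):

```python
import numpy as np

def hinge_loss(predictions, targets):
    """Mean hinge loss; targets are ±1 labels, predictions are raw scores."""
    # Correctly classified points with margin >= 1 contribute zero loss.
    return np.mean(np.maximum(0.0, 1.0 - targets * predictions))

scores = np.array([0.8, -0.5, 2.0])
labels = np.array([1.0, -1.0, 1.0])
loss = hinge_loss(scores, labels)
```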
accuracy(predictions, targets, threshold=0.5)[source]

Calculates the Hinge-Loss Accuracy Score given predictions and targets

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the Hinge-Loss Accuracy Score (numpy.float32)
derivative(predictions, targets, np_type)[source]

Applies the Hinge-Loss Derivative to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the Hinge-Loss Derivative output (numpy.array)
loss(predictions, targets, np_type)[source]

Applies the Hinge-Loss Loss to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the Hinge-Loss Loss output (numpy.array)
objective_name
class ztlearn.objectives.HuberLoss[source]

Huber Loss

Huber Loss is a loss function used in robust regression; it is less sensitive to outliers in the data than the squared error loss.

References

[1] Huber Loss
[2] Huber loss
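The loss is quadratic for residuals within delta and linear beyond it, which is what makes it robust to outliers. A NumPy sketch (illustrative, not the ztlearn implementation):

```python
import numpy as np

def huber_loss(predictions, targets, delta=1.0):
    """Mean Huber loss: quadratic for small residuals, linear past delta."""
    residual  = predictions - targets
    quadratic = 0.5 * residual ** 2
    # The linear branch is offset so the two pieces meet at |residual| = delta.
    linear    = delta * (np.abs(residual) - 0.5 * delta)
    return np.mean(np.where(np.abs(residual) <= delta, quadratic, linear))

preds   = np.array([1.0, 2.5, 7.0])
targets = np.array([1.5, 2.0, 4.0])
loss = huber_loss(preds, targets)
```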
accuracy(predictions, targets)[source]

Calculates the HuberLoss Accuracy Score given predictions and targets

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the HuberLoss Accuracy Score (numpy.float32)
derivative(predictions, targets, np_type, delta=1.0)[source]

Applies the HuberLoss Derivative to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the HuberLoss Derivative output (numpy.array)
loss(predictions, targets, np_type, delta=1.0)[source]

Applies the HuberLoss Loss to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the HuberLoss Loss output (numpy.array)
objective_name
class ztlearn.objectives.KLDivergence[source]

KL Divergence

Kullback–Leibler divergence (also called relative entropy) is a measure of divergence between two probability distributions.
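For discrete distributions, D(p ‖ q) = Σ p·log(p/q); it is non-negative and zero only when the distributions match. A NumPy sketch (illustrative, not the ztlearn implementation; clipping guards the log and the division):

```python
import numpy as np

def kl_divergence(p, q, epsilon=1e-15):
    """KL divergence D(p || q) between two discrete distributions."""
    # Clip both distributions so log and division stay finite.
    p = np.clip(p, epsilon, 1.0)
    q = np.clip(q, epsilon, 1.0)
    return np.sum(p * np.log(p / q))

p = np.array([0.5, 0.5])
q = np.array([0.9, 0.1])
d = kl_divergence(p, q)
```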
accuracy(predictions, targets)[source]

Calculates the KLDivergence Accuracy Score given predictions and targets

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the KLDivergence Accuracy Score (numpy.float32)
derivative(predictions, targets, np_type)[source]

Applies the KLDivergence Derivative to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the KLDivergence Derivative output (numpy.array)
loss(predictions, targets, np_type)[source]

Applies the KLDivergence Loss to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the KLDivergence Loss output (numpy.array)
objective_name
class ztlearn.objectives.MeanSquaredError[source]

Bases: object

Mean Squared Error (MSE)

MSE measures the average squared difference between the predictions and the targets. The closer the predictions are to the targets, the more efficient the estimator.

References

[1] Mean Squared error
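The average squared difference described above can be sketched in one line of NumPy (illustrative, not the ztlearn implementation):

```python
import numpy as np

def mean_squared_error(predictions, targets):
    """Average of the squared residuals between predictions and targets."""
    return np.mean((predictions - targets) ** 2)

preds   = np.array([2.5, 0.0, 2.0])
targets = np.array([3.0, -0.5, 2.0])
mse = mean_squared_error(preds, targets)
```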
accuracy(predictions, targets, threshold=0.5)[source]
derivative(predictions, targets, np_type)[source]

Applies the MeanSquaredError Derivative to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the MeanSquaredError Derivative output (numpy.array)
loss(predictions, targets, np_type)[source]

Applies the MeanSquaredError Loss to the predictions and targets provided

Parameters:
- predictions (numpy.array) – the predictions numpy array
- targets (numpy.array) – the targets numpy array

Returns: the MeanSquaredError Loss output (numpy.array)
objective_name
class ztlearn.objectives.Objective[source]

Bases: object

add_fuzz_factor(np_array, epsilon=1e-05)[source]
clip(predictions, epsilon=1e-15)[source]
error(predictions, targets)[source]
objective_name
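The base class exposes numerical-stability helpers shared by the losses. A plausible NumPy sketch of what clip and add_fuzz_factor do, based only on the names and default epsilons in the signatures above (the actual ztlearn implementation may differ):

```python
import numpy as np

def clip(predictions, epsilon=1e-15):
    """Bound probabilities away from exact 0 and 1 so log() stays finite."""
    return np.clip(predictions, epsilon, 1.0 - epsilon)

def add_fuzz_factor(np_array, epsilon=1e-05):
    """Add a small constant so later divisions by the array are safe."""
    return np_array + epsilon

clipped = clip(np.array([0.0, 0.5, 1.0]))
```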
class ztlearn.objectives.ObjectiveFunction(name)[source]

Bases: object

accuracy(predictions, targets)[source]
backward(predictions, targets, np_type=numpy.float32)[source]
forward(predictions, targets, np_type=numpy.float32)[source]
name