Losses#
- class ivy.data_classes.array.losses._ArrayWithLosses[source]#
Bases:
ABC
- binary_cross_entropy(pred, /, *, from_logits=False, epsilon=0.0, reduction='mean', pos_weight=None, axis=None, out=None)[source]#
ivy.Array instance method variant of ivy.binary_cross_entropy. This method simply wraps the function, and so the docstring for ivy.binary_cross_entropy also applies to this method with minimal changes.
- Parameters:
  - self (Array) – input array containing true labels.
  - pred (Union[Array, NativeArray]) – input array containing predicted labels.
  - from_logits (bool, default: False) – whether pred is expected to be a logits tensor. By default, we assume that pred encodes a probability distribution.
  - epsilon (float, default: 0.0) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is 0, no smoothing will be applied. Default: 0.0.
  - reduction (str, default: 'mean') – 'none': no reduction will be applied to the output. 'mean': the output will be averaged. 'sum': the output will be summed. Default: 'mean'.
  - pos_weight (Optional[Union[Array, NativeArray]], default: None) – a weight for positive examples. Must be an array with length equal to the number of classes.
  - axis (Optional[int], default: None) – axis along which to compute the cross-entropy.
  - out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
Array
- Returns:
ret – The binary cross entropy between the given distributions.
Examples
>>> x = ivy.array([1, 1, 0])
>>> y = ivy.array([0.7, 0.8, 0.2])
>>> z = x.binary_cross_entropy(y)
>>> print(z)
ivy.array(0.26765382)
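When pred holds raw logits rather than probabilities, pass from_logits=True so the sigmoid is applied internally. A minimal sketch reusing the labels above, with hand-rounded logits whose sigmoid is approximately [0.7, 0.8, 0.2] (so the losses only approximate the value shown above, and the exact digits may vary by backend):

>>> x = ivy.array([1, 1, 0])
>>> logits = ivy.array([0.8473, 1.3863, -1.3863])  # sigmoid ≈ [0.7, 0.8, 0.2]
>>> z = x.binary_cross_entropy(logits, from_logits=True)  # scalar loss, ≈ 0.2677
>>> z_none = x.binary_cross_entropy(logits, from_logits=True, reduction='none')
>>> # z_none holds the per-element losses, roughly [0.3567, 0.2231, 0.2231]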
- cross_entropy(pred, /, *, axis=-1, epsilon=1e-07, reduction='mean', out=None)[source]#
ivy.Array instance method variant of ivy.cross_entropy. This method simply wraps the function, and so the docstring for ivy.cross_entropy also applies to this method with minimal changes.
- Parameters:
  - self (Array) – input array containing true labels.
  - pred (Union[Array, NativeArray]) – input array containing the predicted labels.
  - axis (int, default: -1) – the axis along which to compute the cross-entropy. If axis is -1, the cross-entropy will be computed along the last dimension. Default: -1.
  - epsilon (float, default: 1e-07) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is 0, no smoothing will be applied. Default: 1e-07.
  - reduction (str, default: 'mean') – 'none': no reduction will be applied to the output. 'mean': the output will be averaged. 'sum': the output will be summed. Default: 'mean'.
  - out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
Array
- Returns:
ret – The cross-entropy loss between the given distributions.
Examples
>>> x = ivy.array([0, 0, 1, 0])
>>> y = ivy.array([0.25, 0.25, 0.25, 0.25])
>>> z = x.cross_entropy(y)
>>> print(z)
ivy.array(0.34657359)
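With the default reduction='mean', the 0.34657359 above is -log(0.25) averaged over the four elements. Switching to reduction='sum' recovers the full -log(0.25) ≈ 1.3862944. A minimal sketch (the printed representation may differ slightly by backend):

>>> x = ivy.array([0, 0, 1, 0])
>>> y = ivy.array([0.25, 0.25, 0.25, 0.25])
>>> z = x.cross_entropy(y, reduction='sum')  # -sum(x * log(y))
>>> print(z)
ivy.array(1.3862944)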
- sparse_cross_entropy(pred, /, *, axis=-1, epsilon=1e-07, reduction='mean', out=None)[source]#
ivy.Array instance method variant of ivy.sparse_cross_entropy. This method simply wraps the function, and so the docstring for ivy.sparse_cross_entropy also applies to this method with minimal changes.
- Parameters:
  - self (Array) – input array containing the true labels as logits.
  - pred (Union[Array, NativeArray]) – input array containing the predicted labels as logits.
  - axis (int, default: -1) – the axis along which to compute the cross-entropy. If axis is -1, the cross-entropy will be computed along the last dimension. Default: -1.
  - epsilon (float, default: 1e-07) – a float in [0.0, 1.0] specifying the amount of smoothing when calculating the loss. If epsilon is 0, no smoothing will be applied. Default: 1e-07.
  - reduction (str, default: 'mean') – 'none': no reduction will be applied to the output. 'mean': the output will be averaged. 'sum': the output will be summed. Default: 'mean'.
  - out (Optional[Array], default: None) – optional output array, for writing the result to. It must have a shape that the inputs broadcast to.
- Return type:
Array
- Returns:
ret – The sparse cross-entropy loss between the given distributions.
Examples
>>> x = ivy.array([1, 1, 0])
>>> y = ivy.array([0.7, 0.8, 0.2])
>>> z = x.sparse_cross_entropy(y)
>>> print(z)
ivy.array([0.07438118, 0.07438118, 0.11889165])
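The out argument writes the result into a preallocated array, avoiding an extra allocation. A minimal sketch reusing the example above (assuming, as there, a result of shape (3,)):

>>> x = ivy.array([1, 1, 0])
>>> y = ivy.array([0.7, 0.8, 0.2])
>>> z = ivy.zeros((3,))
>>> x.sparse_cross_entropy(y, out=z)  # result is written into z
>>> print(z)
ivy.array([0.07438118, 0.07438118, 0.11889165])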
This should have given you an overview of the losses submodule. If you have any questions, please feel free to reach out on our Discord!