mindsponge.metrics.BalancedMSE

class mindsponge.metrics.BalancedMSE(first_break, last_break, num_bins, beta=0.99, reducer_flag=False)[source]

Balanced MSE error. Compute the Balanced MSE error between the prediction and the ground truth to deal with imbalanced labels in regression tasks.

Reference:

Ren, Jiawei, et al. 'Balanced MSE for Imbalanced Visual Regression'.

\[L = -\log \mathcal{N}\left(\boldsymbol{y}; \boldsymbol{y}_{\text{pred}}, \sigma_{\text{noise}}^{2} \mathrm{I}\right) + \log \sum_{i=1}^{N} p_{\text{train}}\left(\boldsymbol{y}_{(i)}\right) \cdot \mathcal{N}\left(\boldsymbol{y}_{(i)}; \boldsymbol{y}_{\text{pred}}, \sigma_{\text{noise}}^{2} \mathrm{I}\right)\]
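
The following is a plain NumPy reading of the formula above, not the implementation used by BalancedMSE: it assumes the bin centers \(\boldsymbol{y}_{(i)}\) and the discrete training-label distribution \(p_{\text{train}}\) are supplied by the caller, and it drops the Gaussian normalization constants, which cancel between the two terms.

>>> import numpy as np
>>> def balanced_mse_formula(prediction, target, bin_centers, p_train, sigma=1.0):
...     # prediction, target: (batch_size, ndim); bin_centers, p_train: (num_bins,)
...     # sigma plays the role of sigma_noise in the formula
...     # first term: -log N(y; y_pred, sigma^2 I), up to a constant
...     nll = 0.5 * ((target - prediction) / sigma) ** 2
...     # second term: log sum_i p_train(y_i) * N(y_i; y_pred, sigma^2 I),
...     # with the same constant dropped
...     diff = bin_centers[None, None, :] - prediction[..., None]
...     weights = p_train * np.exp(-0.5 * (diff / sigma) ** 2)
...     return nll + np.log(weights.sum(axis=-1))
>>> prediction = np.random.randn(32, 10).astype(np.float32)
>>> target = np.random.randn(32, 10).astype(np.float32)
>>> bin_centers = np.linspace(0, 1, 20)
>>> p_train = np.full(20, 1.0 / 20)
>>> out = balanced_mse_formula(prediction, target, bin_centers, p_train)
>>> print(out.shape)
(32, 10)
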
Parameters
  • first_break (float) – The start value of the first bin.

  • last_break (float) – The end value of the last bin.

  • num_bins (int) – The number of bins.

  • beta (float) – The moving average coefficient, default: 0.99 (see the sketch after this list).

  • reducer_flag (bool) – Whether to aggregate label values across multiple devices, default: False.
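
The parameters above suggest that the label range is discretized into num_bins bins between first_break and last_break, and that beta weights a running estimate of the training label distribution; the helper below is only a hypothetical sketch of such an update, not the internals of BalancedMSE.

>>> import numpy as np
>>> def update_p_train(p_train, target, first_break, last_break, num_bins, beta=0.99):
...     # hypothetical helper: maintain a running estimate of p_train
...     breaks = np.linspace(first_break, last_break, num_bins)
...     # bucket the batch labels into the bins and build a normalized histogram
...     idx = np.clip(np.digitize(target.reshape(-1), breaks), 0, num_bins - 1)
...     hist = np.bincount(idx, minlength=num_bins).astype(np.float64)
...     hist = hist / hist.sum()
...     # exponential moving average with coefficient beta
...     return beta * p_train + (1.0 - beta) * hist
>>> p_train = np.full(20, 1.0 / 20)
>>> p_train = update_p_train(p_train, np.random.rand(32, 10), 0.0, 1.0, 20)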

Inputs:
  • prediction (Tensor) - Predicted values, shape is \((batch\_size, ndim)\).

  • target (Tensor) - Label values, shape is \((batch\_size, ndim)\).

Outputs:

Tensor, shape is \((batch\_size, ndim)\).

Supported Platforms:

Ascend GPU

Examples

>>> import numpy as np
>>> from mindsponge.metrics import BalancedMSE
>>> from mindspore import Tensor
>>> net = BalancedMSE(0, 1, 20)
>>> prediction = Tensor(np.random.randn(32, 10).astype(np.float32))
>>> target = Tensor(np.random.randn(32, 10).astype(np.float32))
>>> out = net(prediction, target)
>>> print(out.shape)
(32, 10)