probly.train.evidential.torch.evidential_kl_divergence
- probly.train.evidential.torch.evidential_kl_divergence(alphas: Tensor, targets: Tensor) → Tensor
Evidential KL divergence loss for classification uncertainty estimation.
Implements the KL divergence regularization term proposed by Sensoy et al. (2018) for Evidential Deep Learning. The term penalizes evidence assigned to incorrect classes: the true-class evidence is removed from the predicted Dirichlet, and the remaining (misleading) Dirichlet is pulled toward the uniform Dirichlet via their KL divergence.
- Reference:
Sensoy et al., “Evidential Deep Learning to Quantify Classification Uncertainty”, NeurIPS 2018. https://arxiv.org/abs/1806.01768
- Parameters:
alphas – Dirichlet concentration parameters, shape (B, C).
targets – Ground-truth class labels, shape (B,).
- Returns:
Scalar evidential KL divergence loss averaged over the batch.
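The regularizer can be sketched as follows. This is a minimal reference implementation of the closed-form KL divergence KL(Dir(α̃) ‖ Dir(1, …, 1)) from Sensoy et al. (2018), where α̃ replaces the true-class concentration with 1; the function name `evidential_kl_divergence_sketch` is a placeholder and is not the library's actual implementation.

```python
import torch
from torch import Tensor


def evidential_kl_divergence_sketch(alphas: Tensor, targets: Tensor) -> Tensor:
    """Sketch of KL(Dir(alpha_tilde) || Dir(1, ..., 1)), Sensoy et al. (2018).

    alphas:  Dirichlet concentration parameters, shape (B, C).
    targets: ground-truth class labels (integers), shape (B,).
    """
    num_classes = alphas.shape[-1]
    y = torch.nn.functional.one_hot(targets, num_classes).to(alphas.dtype)

    # Remove the evidence of the true class: alpha_tilde = y + (1 - y) * alpha,
    # so only misleading evidence is compared against the uniform Dirichlet.
    alpha_tilde = y + (1.0 - y) * alphas
    strength = alpha_tilde.sum(dim=-1, keepdim=True)  # Dirichlet strength per sample

    # Closed-form KL(Dir(alpha_tilde) || Dir(1, ..., 1)):
    #   log Gamma(sum alpha_tilde) - log Gamma(C) - sum log Gamma(alpha_tilde)
    #   + sum (alpha_tilde - 1) * (digamma(alpha_tilde) - digamma(sum alpha_tilde))
    kl = (
        torch.lgamma(strength.squeeze(-1))
        - torch.lgamma(torch.tensor(float(num_classes)))
        - torch.lgamma(alpha_tilde).sum(dim=-1)
        + (
            (alpha_tilde - 1.0)
            * (torch.digamma(alpha_tilde) - torch.digamma(strength))
        ).sum(dim=-1)
    )
    return kl.mean()
```

When `alphas` is all ones, α̃ is the uniform Dirichlet and the loss is zero, which is a convenient sanity check; in training, this term is typically weighted by an annealing coefficient that ramps up over epochs, as in the original paper.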