probly.layers.torch

PyTorch layer implementations.

Classes
- BatchEnsemble convolutional layer based on [WTB20].
- BatchEnsemble linear layer based on [WTB20].
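The core idea of BatchEnsemble is to share one weight matrix across all ensemble members and give each member only cheap rank-1 "fast weights". The sketch below is illustrative only; the class name, constructor signature, and batch layout (examples grouped by member along dim 0) are assumptions, not probly's actual API.

```python
import torch
import torch.nn as nn

class BatchEnsembleLinear(nn.Module):
    """Rank-1 BatchEnsemble linear layer: a single shared weight matrix plus
    per-member fast weights r, s and biases (hypothetical sketch)."""

    def __init__(self, in_features, out_features, ensemble_size):
        super().__init__()
        self.ensemble_size = ensemble_size
        self.weight = nn.Parameter(torch.randn(out_features, in_features) * 0.02)
        # Per-member rank-1 factors and biases (the "fast" weights).
        self.r = nn.Parameter(torch.ones(ensemble_size, out_features))
        self.s = nn.Parameter(torch.ones(ensemble_size, in_features))
        self.bias = nn.Parameter(torch.zeros(ensemble_size, out_features))

    def forward(self, x):
        # x: (ensemble_size * batch, in_features), examples grouped by member.
        b = x.shape[0] // self.ensemble_size
        s = self.s.repeat_interleave(b, dim=0)
        r = self.r.repeat_interleave(b, dim=0)
        bias = self.bias.repeat_interleave(b, dim=0)
        # y = ((x * s) W^T) * r + bias — each member sees its own rank-1 scaling.
        return (x * s) @ self.weight.t() * r + bias

layer = BatchEnsembleLinear(8, 4, ensemble_size=3)
out = layer(torch.randn(3 * 5, 8))  # 3 members, batch of 5 each
```

Because only `r`, `s`, and `bias` are per-member, the memory cost grows linearly in `in_features + out_features` per member rather than in their product.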
- Implementation of a Bayesian convolutional layer based on [BCKW15].
- Implementation of a Bayesian linear layer based on [BCKW15].
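[BCKW15] (Bayes by Backprop) places a mean-field Gaussian posterior over each weight and samples via the reparameterization trick on every forward pass. A minimal sketch of the linear case, assuming a softplus parameterization of the standard deviation; the class name and initialization constants are hypothetical, not probly's implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class BayesLinear(nn.Module):
    """Mean-field Gaussian Bayesian linear layer (illustrative sketch)."""

    def __init__(self, in_features, out_features):
        super().__init__()
        # Posterior mean and pre-softplus scale for weights and bias.
        self.w_mu = nn.Parameter(torch.zeros(out_features, in_features))
        self.w_rho = nn.Parameter(torch.full((out_features, in_features), -5.0))
        self.b_mu = nn.Parameter(torch.zeros(out_features))
        self.b_rho = nn.Parameter(torch.full((out_features,), -5.0))

    def forward(self, x):
        # Reparameterization: w = mu + softplus(rho) * eps, eps ~ N(0, 1).
        w = self.w_mu + F.softplus(self.w_rho) * torch.randn_like(self.w_mu)
        b = self.b_mu + F.softplus(self.b_rho) * torch.randn_like(self.b_mu)
        return F.linear(x, w, b)

layer = BayesLinear(8, 4)
y = layer(torch.randn(5, 8))
```

Each forward pass draws fresh weights, so repeated passes over the same input yield a predictive distribution rather than a point estimate.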
- Custom Linear layer with DropConnect applied to weights during training, based on [MYM+21].
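DropConnect differs from dropout in that it masks individual weights rather than activations. A sketch of the idea, assuming inverted scaling by `1 - p` so that evaluation needs no rescaling; the class name and signature are illustrative, not probly's:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DropConnectLinear(nn.Module):
    """Linear layer that randomly drops individual weights while training
    (hypothetical sketch)."""

    def __init__(self, in_features, out_features, p=0.5):
        super().__init__()
        self.p = p
        self.linear = nn.Linear(in_features, out_features)

    def forward(self, x):
        if self.training:
            # Bernoulli mask over the weight matrix, with inverted scaling.
            keep = torch.bernoulli(torch.full_like(self.linear.weight, 1 - self.p))
            w = self.linear.weight * keep / (1 - self.p)
        else:
            w = self.linear.weight
        return F.linear(x, w, self.linear.bias)

layer = DropConnectLinear(8, 4, p=0.5)
y_train = layer(torch.randn(5, 8))  # stochastic in train mode
layer.eval()
x = torch.randn(5, 8)
```

Keeping the mask active at test time (Monte Carlo DropConnect) is one way to obtain predictive uncertainty from repeated forward passes.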
- A unified PyTorch implementation of the Heteroscedastic layer based on [CMK+21].
- Head that converts encoded features into Dirichlet concentration parameters (alpha).
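A Dirichlet head must map unconstrained features to strictly positive concentrations. One common parameterization is `softplus(logits) + 1`, which also keeps every alpha above 1; this sketch assumes that choice, and the class name is hypothetical rather than probly's:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class DirichletHead(nn.Module):
    """Maps encoded features to Dirichlet concentrations alpha > 1
    via softplus(logits) + 1 (one common choice; illustrative sketch)."""

    def __init__(self, in_features, num_classes):
        super().__init__()
        self.fc = nn.Linear(in_features, num_classes)

    def forward(self, x):
        return F.softplus(self.fc(x)) + 1.0

head = DirichletHead(16, 3)
alpha = head(torch.randn(5, 16))
```

The total concentration `alpha.sum(-1)` then acts as an evidence measure: small sums signal high epistemic uncertainty.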
- Interval-valued batch normalization for 1D features based on [WCM+24].
- Interval-valued batch normalization for 2D feature maps based on [WCM+24].
- Interval-arithmetic 2D convolution based on [WCM+24].
- Interval-arithmetic linear layer based on [WCM+24].
- Interval SoftMax head based on [WCM+24].
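The interval-arithmetic layers propagate a lower/upper bound pair instead of a point. For an affine map this is exact in center/radius form: the output center is the usual linear map, and the output radius is the radius pushed through the elementwise absolute value of the weights. A free-function sketch (names and the packed lower/upper convention are assumptions, not probly's API):

```python
import torch

def interval_linear(x_lo, x_hi, weight, bias):
    """Propagate the box [x_lo, x_hi] through y = x W^T + b with exact
    interval arithmetic in center/radius form (illustrative sketch)."""
    center = (x_lo + x_hi) / 2
    radius = (x_hi - x_lo) / 2
    out_center = center @ weight.t() + bias
    # |W| maps the input radius to the tight output radius.
    out_radius = radius @ weight.abs().t()
    return out_center - out_radius, out_center + out_radius

w = torch.randn(4, 8)
b = torch.randn(4)
x = torch.randn(5, 8)
lo, hi = interval_linear(x - 0.1, x + 0.1, w, b)
y = x @ w.t() + b  # any point inside the input box maps inside [lo, hi]
```

Soundness follows directly: for any `x` in the input box, `x @ w.t() + b` deviates from the output center by at most `radius @ |w|.t()` componentwise.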
- Dirichlet posterior head for evidential classification.
- Gaussian posterior head for evidential regression.
- Custom Linear layer for a normal-inverse-gamma distribution based on [ASSR20].
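In [ASSR20]-style evidential regression, the network outputs the four parameters of a Normal-Inverse-Gamma distribution, (gamma, nu, alpha, beta), with the constraints nu > 0, alpha > 1, beta > 0 typically enforced by softplus. A sketch under those assumptions; the class name and the single 4-way linear projection are illustrative choices, not probly's implementation:

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class NIGHead(nn.Module):
    """Head producing Normal-Inverse-Gamma parameters (gamma, nu, alpha, beta)
    with softplus constraints (hypothetical sketch)."""

    def __init__(self, in_features):
        super().__init__()
        self.fc = nn.Linear(in_features, 4)

    def forward(self, x):
        gamma, log_nu, log_alpha, log_beta = self.fc(x).chunk(4, dim=-1)
        nu = F.softplus(log_nu)            # evidence for the mean
        alpha = F.softplus(log_alpha) + 1  # keeps the variance finite
        beta = F.softplus(log_beta)        # scale of the inverse-gamma
        return gamma, nu, alpha, beta

head = NIGHead(16)
gamma, nu, alpha, beta = head(torch.randn(5, 16))
```

From these parameters, aleatoric uncertainty is `beta / (alpha - 1)` and epistemic uncertainty `beta / (nu * (alpha - 1))`, both available from a single forward pass.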
- Radial normalizing flow based on [RM15].
- Stack of radial normalizing flows based on [RM15].
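The radial flow of [RM15] contracts or expands density radially around a learned reference point z0: f(z) = z + beta * (z - z0) / (alpha + ||z - z0||). A minimal sketch of the transform only; the parameterization of alpha, the invertibility constraint on beta, and the log-det-Jacobian term are omitted for brevity, and the class name is hypothetical:

```python
import torch
import torch.nn as nn

class RadialFlow(nn.Module):
    """Radial flow f(z) = z + beta * (z - z0) / (alpha + ||z - z0||)
    (illustrative sketch; log-det Jacobian omitted)."""

    def __init__(self, dim):
        super().__init__()
        self.z0 = nn.Parameter(torch.zeros(dim))
        self.log_alpha = nn.Parameter(torch.zeros(()))  # alpha = exp(log_alpha) > 0
        self.beta = nn.Parameter(torch.zeros(()))

    def forward(self, z):
        alpha = self.log_alpha.exp()
        diff = z - self.z0
        r = diff.norm(dim=-1, keepdim=True)
        return z + self.beta * diff / (alpha + r)

flow = RadialFlow(3)
out = flow(torch.randn(5, 3))
```

A stack applies several such flows in sequence, summing their log-det-Jacobians, to build a richer posterior from a simple base distribution.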
- Head that converts encoded features into evidential Normal-Gamma parameters.
- Spectral-normalized Neural Gaussian Process (SNGP) layer based on [LLP+20].
- Applies spectral normalization with a bounded multiplier to a module's weight, as suggested by [LLP+20].
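The "bounded multiplier" variant of spectral normalization in [LLP+20] rescales a weight only when its estimated spectral norm exceeds a bound c, instead of always dividing by it. A functional sketch using power iteration; probly applies this as a wrapper around a module's weight, and the function name and defaults here are assumptions:

```python
import torch
import torch.nn.functional as F

def bounded_spectral_norm(weight, c=0.95, n_iters=100):
    """Rescale weight so its spectral norm is at most c, estimating the
    norm with power iteration (illustrative sketch)."""
    u = F.normalize(torch.randn(weight.shape[0]), dim=0)
    for _ in range(n_iters):
        v = F.normalize(weight.t() @ u, dim=0)
        u = F.normalize(weight @ v, dim=0)
    sigma = torch.dot(u, weight @ v)  # estimated largest singular value
    # Only shrink when the norm exceeds c; leave small weights untouched.
    return weight * torch.clamp(c / sigma, max=1.0)

torch.manual_seed(0)
w = torch.randn(6, 6) * 2.0
w_sn = bounded_spectral_norm(w, c=0.95)
```

Bounding rather than fixing the Lipschitz constant preserves distance awareness in the hidden representation, which the downstream Gaussian-process output layer of SNGP relies on.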
Functions

- Promote
- Split a packed interval tensor on