A prior probability distribution is defined before observing the data; once the data is observed, learning happens and the prior transforms into a posterior distribution.
Aleatoric uncertainty measures the noise inherent in the observations
Epistemic uncertainty represents the uncertainty caused by the model itself
Homoscedastic uncertainty stays constant for different inputs, while heteroscedastic uncertainty depends on the inputs to the model.
Epistemic uncertainty is modelled by placing a prior distribution over a model’s weights and then trying to capture how much these weights vary given some data
Aleatoric uncertainty, on the other hand, is modelled by placing a distribution over the output of the model.
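Placing a distribution over the model's output can be sketched as follows: the network predicts both a mean and a (log-)variance per input, and is trained with the Gaussian negative log-likelihood instead of plain MSE. A minimal numpy sketch with illustrative names and toy numbers:

```python
import numpy as np

def gaussian_nll(y, mu, log_var):
    """Heteroscedastic loss: a per-input variance exp(log_var) is learned.
    Large residuals can be 'explained away' by predicting higher variance,
    at the cost of the log-variance penalty term."""
    return 0.5 * np.mean(np.exp(-log_var) * (y - mu) ** 2 + log_var)

# Toy targets and predicted means (hypothetical values).
y  = np.array([1.0, 2.0, 3.0])
mu = np.array([1.1, 1.9, 3.2])

# With log_var = 0 (unit variance) this reduces to half the MSE.
print(gaussian_nll(y, mu, np.zeros(3)))
```

Minimising this loss lets the network assign high variance to noisy inputs and low variance to clean ones, which is exactly the heteroscedastic aleatoric uncertainty described above.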
In standard neural networks, only point estimates of the weights are learned. As a result, these networks make overconfident predictions and do not account for uncertainty in the parameters.
A variational approximation, rather than a Monte Carlo scheme, is used to find the approximate Bayesian posterior distribution.
By Bayes' theorem, the posterior is P(θ|x) = P(x|θ) P(θ) / P(x), where P(θ) is our prior, P(x|θ) is the likelihood, and P(x) is the evidence.
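These three quantities can be combined in a minimal worked example (hypothetical numbers): θ is a coin's bias restricted to three candidate values, and the data x is 7 heads out of 10 flips.

```python
import numpy as np

thetas = np.array([0.2, 0.5, 0.8])
prior = np.array([1/3, 1/3, 1/3])        # P(theta): uniform prior

heads, flips = 7, 10
likelihood = thetas**heads * (1 - thetas)**(flips - heads)  # P(x|theta)

evidence = np.sum(likelihood * prior)     # P(x): normalising constant
posterior = likelihood * prior / evidence # P(theta|x) by Bayes' rule

print(posterior)  # most of the mass shifts toward theta = 0.8
```

After seeing mostly heads, the posterior concentrates on the biased-towards-heads hypothesis, which is the "learning" described above: the prior transformed into a posterior once data was observed.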
Variational inference approximates the true posterior P(w|D) with another distribution q(w|θ), governed by some variational parameters θ.
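Once q has been fitted, epistemic uncertainty can be estimated by sampling several weight vectors from it and measuring how much the resulting predictions disagree. A numpy sketch with a toy linear model and made-up fitted parameters:

```python
import numpy as np

rng = np.random.default_rng(1)

# Hypothetical fitted variational parameters for a 2-weight linear model.
mu, sigma = np.array([0.5, -1.0]), np.array([0.3, 0.1])

x = np.array([2.0, 1.0])                 # a single input
preds = []
for _ in range(1000):
    w = mu + sigma * rng.standard_normal(2)  # w ~ q(w|theta)
    preds.append(w @ x)                      # prediction under this sample

preds = np.array(preds)
print(preds.mean(), preds.var())         # spread = epistemic uncertainty
```

The variance across samples captures how much the model's weights, and hence its predictions, vary given the data; a wider q yields a wider spread of predictions for the same input.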