Summary | 15 Annotations
A prior probability distribution is defined before observing the data; once the data is observed, learning happens and the prior transforms into a posterior distribution
2020/07/28 07:10
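The prior-to-posterior update above can be sketched with a hypothetical coin-flip example (the Beta–Binomial conjugate pair; the function name and numbers are illustrative, not from the article):

```python
# Beta(a, b) prior over a coin's heads probability; observing coin flips
# updates it, in closed form, to a Beta posterior.
def beta_posterior(a_prior, b_prior, heads, tails):
    """Conjugate update: Beta prior + Binomial likelihood -> Beta posterior."""
    return a_prior + heads, b_prior + tails

# Uniform prior Beta(1, 1); observe 7 heads and 3 tails.
a_post, b_post = beta_posterior(1, 1, 7, 3)
posterior_mean = a_post / (a_post + b_post)  # 8 / 12
```

The same idea, a distribution over parameters that sharpens as data arrives, is what the Bayesian neural networks below apply to weights, where no closed form exists.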
Aleatoric uncertainty measures the noise inherent in the observations
2020/07/28 07:12
Epistemic uncertainty represents the uncertainty caused by the model itself
2020/07/28 07:12
homoscedastic uncertainty, the uncertainty which stays constant for different inputs
2020/07/28 07:12
heteroscedastic uncertainty which depends on the inputs to the model
2020/07/28 07:12
Epistemic uncertainty is modelled by placing a prior distribution over a model’s weights and then trying to capture how much these weights vary given some data
2020/07/28 07:13
Aleatoric uncertainty, on the other hand, is modelled by placing a distribution over the output of the model.
2020/07/28 07:13
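A minimal sketch of "placing a distribution over the output": the model predicts a mean and a log-variance per input (heteroscedastic Gaussian noise) and is trained with the Gaussian negative log-likelihood. The function below is an assumed illustration, not the article's code:

```python
import math

def gaussian_nll(y, mu, log_var):
    """Per-sample negative log-likelihood of y under N(mu, exp(log_var)),
    dropping the constant 0.5 * log(2 * pi) term."""
    return 0.5 * (math.exp(-log_var) * (y - mu) ** 2 + log_var)
```

For a large residual, the loss is reduced by predicting a larger variance, but the `log_var` term penalises blanket uncertainty inflation; this trade-off is how the network learns input-dependent (aleatoric) noise.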
in a standard network, only point estimates of the weights are learned. As a result, these networks make overconfident predictions and do not account for uncertainty in the parameters
2020/07/28 07:15
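A toy illustration of the contrast (assumed numbers, not the article's model): instead of one point-estimate weight, keep many sampled weights and report the spread of the resulting predictions as an epistemic-uncertainty proxy.

```python
import random
import statistics

random.seed(0)
# Pretend the posterior over a single weight is N(2.0, 0.5); draw samples.
weight_samples = [random.gauss(2.0, 0.5) for _ in range(1000)]

def predict(x):
    """Mean and spread of predictions across sampled weights."""
    preds = [w * x for w in weight_samples]
    return statistics.mean(preds), statistics.stdev(preds)

mean_near, std_near = predict(1.0)
mean_far, std_far = predict(10.0)
# The spread grows with |x|: weight uncertainty is amplified by the input,
# which a single point estimate would silently ignore.
```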
use a variational approximation rather than a Monte Carlo scheme to find the approximate Bayesian posterior distribution
2020/07/28 07:17
P(θ) which is our prior
2020/07/28 07:17
P(x|θ) which is the likelihood
2020/07/28 07:18
P(x) is the evidence
2020/07/28 07:18
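The three quantities above combine via Bayes' rule, P(θ|x) = P(x|θ) P(θ) / P(x). A minimal discrete sketch with two hypothetical parameter values:

```python
prior = {"theta1": 0.5, "theta2": 0.5}        # P(theta)
likelihood = {"theta1": 0.8, "theta2": 0.2}   # P(x | theta)

# Evidence P(x): marginalise the likelihood over the prior.
evidence = sum(prior[t] * likelihood[t] for t in prior)

# Posterior P(theta | x): normalised product of prior and likelihood.
posterior = {t: prior[t] * likelihood[t] / evidence for t in prior}
```

For neural-network weights the evidence integral is intractable, which is what motivates the variational approximation discussed next.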
P(w|D) (our posterior from above)
2020/07/28 07:22
another distribution q(w|D) with some variational parameters θ.
2020/07/28 07:23
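Variational inference then tunes the parameters θ of q(w|D) to minimise the KL divergence to the true posterior. For two univariate Gaussians this divergence has a closed form, sketched below (an illustration of the objective's building block, not the article's implementation):

```python
import math

def kl_gauss(mu_q, sigma_q, mu_p, sigma_p):
    """KL( N(mu_q, sigma_q^2) || N(mu_p, sigma_p^2) ), closed form."""
    return (math.log(sigma_p / sigma_q)
            + (sigma_q ** 2 + (mu_q - mu_p) ** 2) / (2 * sigma_p ** 2)
            - 0.5)
```

The divergence is zero exactly when the two distributions coincide and grows as q drifts away, which is what makes it a usable training objective for the variational parameters.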