Medium

Summary | 15 Annotations
A prior probability distribution is defined before observing the data; once the data is observed, learning happens and the prior transforms into a posterior distribution
2020/07/28 07:10
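The prior-to-posterior update in the highlight above can be sketched with a toy conjugate example, assuming a Beta prior on a coin's bias (the numbers and names are illustrative, not from the article, which applies the same idea to network weights):

```python
# Toy prior-to-posterior update: Beta(a, b) prior on a coin's
# bias theta, updated after observing Bernoulli data.

a, b = 2.0, 2.0            # prior Beta(2, 2): belief centred on 0.5
data = [1, 1, 0, 1, 1, 1]  # observed coin flips (1 = heads)

heads = sum(data)
tails = len(data) - heads

# Conjugacy: Beta prior + Bernoulli likelihood -> Beta posterior
a_post, b_post = a + heads, b + tails

prior_mean = a / (a + b)
post_mean = a_post / (a_post + b_post)
print(prior_mean, post_mean)  # 0.5 -> 0.7: the mean shifts toward the data
```

The same mechanics apply to Bayesian neural networks, except the distribution sits over millions of weights and the update is no longer available in closed form.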
Aleatoric uncertainty measures the noise inherent in the observations
2020/07/28 07:12
Epistemic uncertainty represents the uncertainty caused by the model itself
2020/07/28 07:12
homoscedastic uncertainty, the uncertainty which stays constant for different inputs
2020/07/28 07:12
heteroscedastic uncertainty which depends on the inputs to the model
2020/07/28 07:12
Epistemic uncertainty is modelled by placing a prior distribution over a model’s weights and then trying to capture how much these weights vary given some data
2020/07/28 07:13
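The highlight above can be sketched with a toy one-weight model: sample weights from an assumed Gaussian approximate posterior and measure how much the predictions vary (all numbers are invented for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)

# Hypothetical approximate posterior over a single weight w: N(1.0, 0.3^2).
# Epistemic uncertainty = spread of predictions caused by the spread
# in the sampled weights.
w_samples = rng.normal(loc=1.0, scale=0.3, size=1000)

def predict(x):
    # One prediction per sampled weight
    preds = w_samples * x
    return preds.mean(), preds.std()

for x in [0.1, 1.0, 10.0]:
    mean, std = predict(x)
    print(x, round(mean, 2), round(std, 2))
# The predictive std grows with |x|: the more an input stretches the
# uncertain weight, the larger the epistemic spread in the output.
```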
Aleatoric uncertainty, on the other hand, is modelled by placing a distribution over the output of the model.
2020/07/28 07:13
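A common way to place a distribution over the model's output, as the highlight describes, is to predict a Gaussian mean and variance per input and train with the negative log-likelihood. A minimal sketch (the names `mu`, `log_var`, `gaussian_nll` are illustrative, not from the article):

```python
import numpy as np

# Heteroscedastic aleatoric uncertainty: the model outputs both a mean
# mu(x) and a log-variance log sigma^2(x), and is trained with the
# Gaussian negative log-likelihood instead of plain MSE.

def gaussian_nll(y, mu, log_var):
    # 0.5 * [ (y - mu)^2 / sigma^2 + log sigma^2 ]  (constant dropped)
    return 0.5 * (np.exp(-log_var) * (y - mu) ** 2 + log_var)

y, mu = 2.0, 1.0
# A confident prediction (small predicted variance) is penalised heavily
# for the same error; a larger predicted variance absorbs it.
print(gaussian_nll(y, mu, log_var=np.log(0.1)))
print(gaussian_nll(y, mu, log_var=np.log(10.0)))
```

The `log_var` term stops the model from claiming infinite variance everywhere: inflating the variance lowers the squared-error term but pays a direct penalty.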
only point estimates of the weights are obtained in the network. As a result, these networks make overconfident predictions and do not account for uncertainty in the parameters
2020/07/28 07:15
take a variational approximation rather than a Monte Carlo scheme to find the approximate Bayesian posterior distribution
2020/07/28 07:17
P(θ) which is our prior
2020/07/28 07:17
P(x|θ) which is the likelihood
2020/07/28 07:18
P(x) is the evidence
2020/07/28 07:18
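The three pieces named in the highlights above combine via Bayes' theorem, P(θ|x) = P(x|θ)·P(θ) / P(x). A small numerical check with a toy setup (the two hypotheses and their probabilities are invented for illustration):

```python
from fractions import Fraction as F

# theta is one of two hypotheses for a coin's bias; x is observing heads.
prior = {"fair": F(1, 2), "biased": F(1, 2)}       # P(theta)
likelihood = {"fair": F(1, 2), "biased": F(3, 4)}  # P(x=heads | theta)

# Evidence P(x) = sum over theta of P(x|theta) * P(theta)
evidence = sum(likelihood[t] * prior[t] for t in prior)

# Posterior P(theta|x) = P(x|theta) * P(theta) / P(x)
posterior = {t: likelihood[t] * prior[t] / evidence for t in prior}
print(evidence)   # 5/8
print(posterior)  # fair: 2/5, biased: 3/5
```

For neural networks the evidence term is the intractable part: it requires integrating the likelihood over every possible weight configuration, which is what motivates the approximations below.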
P(w|D) (our posterior from above)
2020/07/28 07:22
another distribution q(w|D) with some variational parameters θ.
2020/07/28 07:23