Maximum Likelihood Estimation (MLE) of Single Gaussian (Normal) Distribution
In the case of a single Gaussian (normal) distribution, the goal of maximum likelihood estimation is to find the mean (μ) and variance (σ²) that maximize the likelihood function for the observed data.

For a single Gaussian distribution, the probability density function (PDF) is given by:

f(x; \mu, \sigma^2) = \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x-\mu)^2}{2\sigma^2}\right) ------------------------- [3690a]

The likelihood function, L(μ, σ²), for a set of n independent and identically distributed (i.i.d.) observations x_1, x_2, ..., x_n is the product of the individual probability density functions:

L(\mu, \sigma^2) = \prod_{i=1}^{n} \frac{1}{\sqrt{2\pi\sigma^2}} \exp\left(-\frac{(x_i-\mu)^2}{2\sigma^2}\right) ------------------------- [3690b]

To find the MLE estimates for μ and σ², we differentiate the log-likelihood function with respect to each parameter, set the derivatives equal to zero, and solve for the parameters. The resulting values of μ and σ² that maximize the log-likelihood function are the MLE estimates. The solutions are:

\hat{\mu} = \frac{1}{n} \sum_{i=1}^{n} x_i ------------------------- [3690c]

\hat{\sigma}^2 = \frac{1}{n} \sum_{i=1}^{n} (x_i - \hat{\mu})^2 ------------------------- [3690d]

These formulas correspond to the sample mean and the sample variance (with the 1/n normalization), respectively, as the MLE estimates for the parameters of a single Gaussian distribution.
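As a short numerical sketch (not part of the original derivation, and assuming NumPy and SciPy are available with synthetic data generated for illustration), the closed-form estimates in equations [3690c] and [3690d] can be computed directly and checked against scipy.stats.norm.fit, which also returns the maximum likelihood fit. Note that the MLE variance uses the 1/n normalization (ddof=0), not the unbiased 1/(n-1) version.

import numpy as np
from scipy.stats import norm

# Draw synthetic i.i.d. data from a Gaussian with known parameters
rng = np.random.default_rng(0)
true_mu, true_sigma = 2.0, 1.5
x = rng.normal(true_mu, true_sigma, size=1000)

# Closed-form MLE estimates from equations [3690c] and [3690d]
mu_hat = x.mean()                              # sample mean
sigma2_hat = np.mean((x - mu_hat) ** 2)        # 1/n variance (ddof=0), the MLE

# scipy's norm.fit also returns the MLE (mean and standard deviation)
mu_fit, sigma_fit = norm.fit(x)

print(f"MLE mean:     {mu_hat:.4f}  (scipy: {mu_fit:.4f})")
print(f"MLE variance: {sigma2_hat:.4f}  (scipy: {sigma_fit**2:.4f})")

# Log-likelihood of the data evaluated at the MLE, for reference
log_lik = np.sum(norm.logpdf(x, loc=mu_hat, scale=np.sqrt(sigma2_hat)))
print(f"Log-likelihood at the MLE: {log_lik:.2f}")

Both routes give the same values, since for a single Gaussian the MLE has this closed form and no iterative optimization is needed.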