Tuesday 6 October 2009

Probability for engineers and scientists

Just brushing up on probability and statistics. Grabbed a nice book from the library. Studying some basic concepts such as random variables and joint probability distributions.

A few things that I need to understand are maximum likelihood estimation (MLE) and conditional probability leading to Bayes' theorem, which commonly occurs in modelling problems in imaging.

Today I have been reading on MLE, and it is one of the most popular methods of point estimation. Point estimation, as I understood it, is to approximate a population parameter from a sample statistic, for example the population mean from the sample mean. So samples are drawn from the population. MLE was explained nicely here using a coin experiment. Given a sample (let's say 52 heads and the rest all tails in 100 tosses), what is the probability of heads P(H) that makes this sample most likely? If we set out with P(H) = 0.5 and computed the probability of 52H and 48T, i.e. (100!/(52! * 48!)) * 0.5^52 * 0.5^48, we would see that this probability is lower than if we had set out with, let's say, P(H) = 0.52. So maximizing the likelihood function over P(H) determines the optimal value of P(H). Likelihood functions are usually expressed as a product of probabilities, so it is more convenient to maximize the log-likelihood, which turns the product into a summation. Since the logarithm is monotonic the maximizer stays the same, and we avoid multiplying many small numbers into values too small to represent.
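
A minimal Python sketch I put together just to convince myself of the numbers (the printed values are approximate); it evaluates the binomial likelihood for a grid of candidate P(H) values:

import math

def log_likelihood(p, heads=52, tails=48):
    # Binomial log-likelihood: log C(n, heads) + heads*log(p) + tails*log(1-p)
    n = heads + tails
    return math.log(math.comb(n, heads)) + heads * math.log(p) + tails * math.log(1 - p)

# Evaluate the (log-)likelihood on a grid of candidate P(H) values.
candidates = [i / 100.0 for i in range(1, 100)]
best_p = max(candidates, key=log_likelihood)

print("P(H)=0.50 ->", math.exp(log_likelihood(0.50)))   # roughly 0.073
print("P(H)=0.52 ->", math.exp(log_likelihood(0.52)))   # roughly 0.080, slightly higher
print("grid maximiser:", best_p)                        # 0.52, i.e. heads/n

Working on the log scale inside log_likelihood is exactly the log-likelihood trick from above, and the grid maximiser comes out at heads/n = 0.52 as expected.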

The EM algorithm is one common way of computing MLEs when the model involves hidden variables or missing data. I have yet to read on EM and perhaps see how it is applied in Lorenzo-Valdes' thesis. Some time ago I had borrowed a book on Pattern Classification which has a chapter on EM.
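
As a preview before I read the chapter properly: a standard textbook application is fitting a two-component Gaussian mixture, alternating an E-step (soft assignment of points to components) and an M-step (re-estimating parameters from those assignments). A minimal Python sketch of my own, with arbitrary initialisation and toy data:

import math
import random

def em_gmm_1d(data, iters=50):
    # EM for a two-component 1-D Gaussian mixture (a rough sketch, not robust code).
    xs = sorted(data)
    mu = [xs[len(xs) // 4], xs[3 * len(xs) // 4]]   # crude initial means
    var = [1.0, 1.0]
    w = [0.5, 0.5]

    def gauss(x, m, v):
        return math.exp(-(x - m) ** 2 / (2 * v)) / math.sqrt(2 * math.pi * v)

    for _ in range(iters):
        # E-step: responsibility of each component for each point.
        resp = []
        for x in data:
            p = [w[k] * gauss(x, mu[k], var[k]) for k in (0, 1)]
            s = sum(p)
            resp.append([pk / s for pk in p])
        # M-step: re-estimate weights, means and variances from the responsibilities.
        for k in (0, 1):
            nk = sum(r[k] for r in resp)
            w[k] = nk / len(data)
            mu[k] = sum(r[k] * x for r, x in zip(resp, data)) / nk
            var[k] = sum(r[k] * (x - mu[k]) ** 2 for r, x in zip(resp, data)) / nk
    return w, mu, var

random.seed(0)
data = [random.gauss(0, 1) for _ in range(200)] + [random.gauss(5, 1) for _ in range(200)]
print(em_gmm_1d(data))   # should recover means near 0 and 5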

My main focus today, however, has not been stats. It has been the graph-cuts method for segmentation, which apparently is commonly used in medical image segmentation. I have yet to come to grips with how the method works.