"Aniket" wrote in message <k9ja1j$1gg$1@newscl01ah.mathworks.com>...
> Hi
>
> This is a little old thread, but I hope it's still active!
> I am also using classify.m to classify my data into two groups. To get better insight, I tried to understand the math involved in the 'linear' case of classify.m, but I could not really make sense of it.
>
> I referred to the multivariate analysis book by W.J. Krzanowski and found that one of the criteria for classification is the closeness of an observation to the mean of a class. Is this what is used in classify.m, or is it different?
It is more general. The measure of closeness is the log of the Gaussian probability density, which contains the (squared) Mahalanobis distance and the log of the a priori (prior) probability.
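If it helps, here is a rough sketch of what the 'linear' rule boils down to (my own variable names and made-up example data, not the actual classify.m source): each class gets a score of -0.5 times its squared Mahalanobis distance (using the pooled covariance) plus the log of its prior, and the observation goes to the class with the largest score.

% Sketch only (assumed example data), not the classify.m implementation
X     = [randn(50,2); randn(50,2)+2];        % two 2-D training groups
group = [ones(50,1); 2*ones(50,1)];

mu1 = mean(X(group==1,:));                   % class means
mu2 = mean(X(group==2,:));
n1  = sum(group==1);  n2 = sum(group==2);
S   = ((n1-1)*cov(X(group==1,:)) + (n2-1)*cov(X(group==2,:))) / (n1+n2-2);  % pooled covariance
p1  = n1/(n1+n2);  p2 = n2/(n1+n2);          % empirical priors

xnew = [1 1];                                % observation to classify
d1 = -0.5*(xnew-mu1)/S*(xnew-mu1)' + log(p1);  % -0.5*Mahalanobis^2 + log prior
d2 = -0.5*(xnew-mu2)/S*(xnew-mu2)' + log(p2);
if d1 > d2, cls = 1; else cls = 2; end       % pick the larger score

cls_check = classify(xnew, X, group, 'linear');  % should agree with cls

With equal priors the log(p) terms cancel and you are left with plain closeness to the class means in the Mahalanobis metric, which is the criterion Krzanowski describes.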
> I understand my question is more statistics-oriented than MATLAB-based, but I hope to find some fundamental answers.
Hope this helps.
Greg