libagf is a Peteysoft project. Contact: Peter Mills.
Consider the following generalization of a k-nearest-neighbours scheme:

P(c|\vec{x}) \approx \frac{\sum_{i;\,c_i=c} w_i}{\sum_{i=1}^{n} w_i}

where P(c|\vec{x}) is the conditional probability of class c at the test point \vec{x}, n is the number of training samples, c_i is the class of the i-th training sample, \vec{x}_i, and w_i is the weight assigned to that sample. In a k-nearest-neighbours scheme, w_i = 1 for the k nearest samples and 0 for all others.
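A minimal sketch of this weighted estimate, assuming Gaussian weights of fixed width; the function and parameter names are illustrative, not part of the libagf API:

```python
import numpy as np

def class_probabilities(x, samples, classes, sigma):
    """Estimate P(c|x) from Gaussian-weighted training samples.

    x        -- test point, shape (d,)
    samples  -- training points, shape (n, d)
    classes  -- integer class labels, shape (n,)
    sigma    -- Gaussian filter width (assumed fixed here)
    """
    # Gaussian weight for each training sample
    d2 = np.sum((samples - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    # sum the weights belonging to each class, normalize by the total
    labels = np.unique(classes)
    p = np.array([w[classes == c].sum() for c in labels])
    return labels, p / w.sum()
```

A classification is then simply the label with the largest estimated probability, while the probability itself gauges the confidence of that estimate.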
The parameter \sigma is the filter width. An obvious choice for the weights is a Gaussian kernel (cf. Terrell and Scott, 1992):

w_i = \exp\left(-\frac{|\vec{x} - \vec{x}_i|^2}{2\sigma^2}\right)

where the upright brackets denote a metric, typically Cartesian. The width of the Gaussian can be effectively varied by simply squaring the weights, since squaring halves \sigma^2; thus the weights may be pre-calculated for an initial, trial value of \sigma and then adjusted without returning to the distances.
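One way to exploit this trick is to tune the filter width until the total weight W = \sum_i w_i matches a target value playing the role of k; raising the pre-calculated weights to a power p rescales \sigma^2 by 1/p, so the search never recomputes the exponentials from the distances. A sketch under those assumptions (the bisection search and names are illustrative):

```python
import numpy as np

def tune_weights(d2, sigma0, W_target):
    """Rescale Gaussian weights until their total matches W_target.

    d2     -- squared distances |x - x_i|^2 to each training sample
    sigma0 -- initial, trial filter width
    Returns the adjusted weights and the effective filter width.
    """
    # weights for the trial width; w0**p corresponds to width sigma0/sqrt(p)
    w0 = np.exp(-d2 / (2.0 * sigma0 ** 2))
    # total weight is monotone decreasing in p, so bisect (in log space)
    lo, hi = 1e-6, 1e6          # bracket for p, chosen for illustration
    for _ in range(200):
        p = np.sqrt(lo * hi)
        W = np.sum(w0 ** p)
        if abs(W - W_target) < 1e-9:
            break
        if W > W_target:
            lo = p
        else:
            hi = p
    return w0 ** p, sigma0 / np.sqrt(p)
```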
The primary advantage of the above over a k-nearest-neighbours scheme is that
it generates estimates that are both continuous and differentiable.
Both features may be exploited: first to find the class borders, then to perform classifications and estimate the conditional probability. Let

R(\vec{x}) = P(2|\vec{x}) - P(1|\vec{x})

where P(1|\vec{x}) and P(2|\vec{x}) are the conditional probabilities of the two classes. The class border is the set of points for which R(\vec{x}) = 0; since R is continuous, border samples, \vec{b}_j, may be found by root-finding along lines joining pairs of opposite-class training points. The class of a test point is then estimated from the sign of

p = (\vec{x} - \vec{b}) \cdot \nabla R \,|_{\vec{b}}

where \vec{b} is the nearest border sample and the gradient of R is evaluated there: the sign of p tells which side of the border the test point falls on, while the distance to the border supplies an estimate of the conditional probability. This algorithm is robust, general and efficient, yet still supplies knowledge of the conditional probabilities, which is useful for gauging the accuracy of an estimate without prior knowledge of its true class.

References

Terrell, G. R. and Scott, D. W. (1992). "Variable kernel density estimation." Annals of Statistics, 20:1236-1265.
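The border-finding and classification steps can be sketched as follows; the fixed filter width, the bisection root-finder and the finite-difference gradient are simplifications chosen for illustration, not the library's own methods:

```python
import numpy as np

def R(x, samples, classes, sigma):
    """Difference of conditional probabilities, P(2|x) - P(1|x)."""
    d2 = np.sum((samples - x) ** 2, axis=1)
    w = np.exp(-d2 / (2.0 * sigma ** 2))
    return (w[classes == 2].sum() - w[classes == 1].sum()) / w.sum()

def border_sample(x1, x2, samples, classes, sigma, n_bisect=60):
    """Root of R along the line from x1 (class 1) to x2 (class 2)."""
    # R < 0 near x1 and R > 0 near x2: bisect for the zero crossing
    lo, hi = 0.0, 1.0
    for _ in range(n_bisect):
        t = 0.5 * (lo + hi)
        b = x1 + t * (x2 - x1)
        if R(b, samples, classes, sigma) < 0.0:
            lo = t
        else:
            hi = t
    return b

def grad_R(b, samples, classes, sigma, eps=1e-5):
    """Gradient of R by centred finite differences (illustrative only)."""
    g = np.empty_like(b)
    for j in range(b.size):
        e = np.zeros_like(b)
        e[j] = eps
        g[j] = (R(b + e, samples, classes, sigma)
                - R(b - e, samples, classes, sigma)) / (2.0 * eps)
    return g

def classify(x, b, grad):
    """Class from the side of the border the test point falls on."""
    return 2 if np.dot(x - b, grad) > 0.0 else 1
```

Once the border samples and their gradients are stored, each classification costs only a nearest-neighbour search among the border samples and a dot product, independent of the size of the training set.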
Peter Mills, 2007-11-03