Calculates the softmin of a vector.

Let d be a vector. Then the softmin of d is defined as:
 s = exp(-d/sigma^2) / sum( exp(-d/sigma^2) )
The softmin is a way of taking a dissimilarity (distance) vector d and
converting it to a similarity vector s, such that sum(s)==1. Note that as
sigma->0, softmin's behavior tends toward that of the standard min
function. That is, the softmin of a vector d is all zeros with a single 1
in the location of the smallest value of d. For example,
"softmin([.2 .4 .1 .3],eps)" returns "[0 0 1 0]". As sigma->inf,
softmin(d,sigma) tends toward "ones(1,n)/n", where n==length(d).

If D is an NxK array, it is treated as N K-dimensional vectors, and the
return is likewise an NxK array. This is useful if D is a distance
matrix, generated by the likes of dist_euclidean or dist_chisquared.

If d contains the squared Euclidean distances between a point y and k
points xi, then there is a probabilistic interpretation for softmin. If
we think of the k points as representing equal-variance Gaussians, each
with mean xi and std sigma, then the softmin returns the relative
probability of y being generated by each Gaussian.

INPUTS
 D      - NxK dissimilarity matrix
 sigma  - controls 'softness' of softmin

OUTPUTS
 M      - the softmin

EXAMPLE
 % example 1
 C = [0 0; 1 0; 0 1; 1 1];  x = [.7 .3; .1 .2];
 D = dist_euclidean( x, C );
 M = softmin( D, 1 )
 % example 2
 fplot( 'softmin( [0.5 0.2 .4], x )', [eps 10] );

DATESTAMP
 29-Sep-2005  2:00pm

See also DIST_EUCLIDEAN, DIST_CHISQUARED
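
The definition above can be sketched outside MATLAB as well. Below is a minimal NumPy translation (the function name `softmin` and the row-wise convention mirror the description; the min-subtraction step is an assumed numerical-stability detail, not stated in the original — it cancels in the ratio but prevents the 0/0 that a literal exp(-d/sigma^2) would produce for very small sigma, matching the "[0 0 1 0]" limit behavior):

```python
import numpy as np

def softmin(D, sigma):
    """Row-wise softmin: s = exp(-d/sigma^2) / sum(exp(-d/sigma^2)).

    D is treated as N K-dimensional row vectors; each output row sums to 1.
    """
    D = np.atleast_2d(np.asarray(D, dtype=float))
    # Subtracting the row minimum cancels in the ratio (common factor in
    # numerator and denominator) but avoids underflow to 0/0 when sigma
    # is tiny; the smallest entry then maps to exp(0) == 1.
    E = np.exp(-(D - D.min(axis=1, keepdims=True)) / sigma**2)
    return E / E.sum(axis=1, keepdims=True)

# Small sigma approaches the hard min indicator; large sigma approaches
# the uniform vector ones(1,n)/n.
print(softmin([.2, .4, .1, .3], 1e-3))   # ~[0 0 1 0]
print(softmin([.2, .4, .1, .3], 1e6))    # ~[.25 .25 .25 .25]
```

Note the sketch always returns a 2-D array (a 1-D input becomes a single row), whereas the MATLAB routine preserves the input shape.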