    Villmann, Thomas / Haase, Sven: Divergence-Based Vector Quantization
    In: Neural Computation, MIT Press, Vol. 23, No. 5 (2011-05), p. 1343-1392
    Abstract: Supervised and unsupervised vector quantization methods for classification and clustering traditionally use dissimilarities, frequently taken as Euclidean distances. In this article, we investigate the applicability of divergences instead, focusing on online learning. We derive the mathematical fundamentals for their use in gradient-based online vector quantization algorithms. This rests on the generalized derivatives of the divergences, known as Fréchet derivatives in functional analysis, which reduce naturally to partial derivatives in finite-dimensional problems. We demonstrate the application of this methodology for widely used supervised and unsupervised online vector quantization schemes, including self-organizing maps, neural gas, and learning vector quantization. Additionally, we give principles for hyperparameter optimization and relevance learning with parameterized divergences in supervised vector quantization to achieve improved classification accuracy.
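A minimal sketch of the idea described in the abstract (not from the record; the choice of the generalized Kullback-Leibler divergence, the winner-takes-all update, and all function names are illustrative assumptions): in an online vector quantization step, the partial derivative of the chosen divergence with respect to the winning prototype replaces the usual Euclidean gradient term (w - x).

```python
import numpy as np

def gkl_divergence(x, w, eps=1e-12):
    """Generalized Kullback-Leibler divergence D(x || w) for non-negative vectors."""
    x, w = x + eps, w + eps
    return np.sum(x * np.log(x / w) - x + w)

def gkl_gradient(x, w, eps=1e-12):
    """Partial derivative of D(x || w) with respect to the prototype w:
    dD/dw_i = 1 - x_i / w_i (the finite-dimensional case of the Fréchet derivative)."""
    return 1.0 - (x + eps) / (w + eps)

def online_vq(data, n_prototypes=2, lr=0.05, epochs=20, seed=0):
    """Winner-takes-all online vector quantization driven by a divergence
    instead of the squared Euclidean distance (a sketch, not the paper's scheme)."""
    rng = np.random.default_rng(seed)
    # Initialize prototypes from random training samples.
    W = data[rng.choice(len(data), n_prototypes, replace=False)].copy()
    for _ in range(epochs):
        for x in rng.permutation(data):
            # Winner: the prototype with the smallest divergence from the sample.
            k = np.argmin([gkl_divergence(x, w) for w in W])
            # Gradient step on the divergence; replaces the Euclidean (w - x) term.
            W[k] -= lr * gkl_gradient(x, W[k])
            np.clip(W[k], 1e-12, None, out=W[k])  # keep prototypes non-negative
    return W

if __name__ == "__main__":
    rng = np.random.default_rng(1)
    # Toy non-negative data: two clusters of 5-dimensional vectors.
    data = np.vstack([rng.gamma(2.0, 1.0, (100, 5)),
                      rng.gamma(5.0, 2.0, (100, 5))])
    print(online_vq(data))
```

Swapping in a different divergence only requires replacing gkl_divergence and gkl_gradient, which is the modularity the Fréchet-derivative formulation provides.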
    Type of Medium: Online Resource
    ISSN: 0899-7667, 1530-888X
    Language: English
    Publisher: MIT Press
    Publication Date: 2011
    ZDB-ID: 1025692-1; 1498403-9