Kooperativer Bibliotheksverbund Berlin-Brandenburg (KOBV)

Export
  • 1
    Language: English
    In: IEEE Transactions on Neural Networks and Learning Systems, December 2016, Vol. 27(12), pp. 2458-2471
    Description: In training radial basis function neural networks (RBFNNs), the locations of the Gaussian neurons are commonly determined by clustering. Training inputs can be clustered in a fully unsupervised manner (input clustering), or some supervision can be introduced, for example, by concatenating the input vectors with weighted output vectors (input-output clustering). In this paper, we propose applying clustering separately for each class (class-specific clustering). The idea has been used in some previous works, but without evaluating the benefits of the approach. We compare the class-specific, input, and input-output clustering approaches in terms of classification performance and computational efficiency when training RBFNNs. To accomplish this objective, we apply three different clustering algorithms and conduct experiments on 25 benchmark data sets. We show that the class-specific approach significantly reduces the overall complexity of the clustering, and our experimental results demonstrate that it can also yield a significant gain in classification performance, especially for networks with relatively few Gaussian neurons. Among the applied clustering algorithms, we combine, for the first time, a dynamic evolutionary optimization method, multidimensional particle swarm optimization, with class-specific clustering to optimize the number of cluster centroids and their locations. (A minimal code sketch of class-specific center placement follows this record.)
    Keywords: Training ; Neurons ; Clustering Algorithms ; Optimization ; Neural Networks ; Heuristic Algorithms ; Particle Swarm Optimization ; Clustering Methods ; Particle Swarm Optimization (PSO) ; Radial Basis Function Networks (RBFNNs) ; Supervised Learning ; Computer Science
    ISSN: 2162-237X
    E-ISSN: 2162-2388
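    A minimal Python sketch of the class-specific clustering idea summarized above, assuming k-means as the per-class clusterer (the MD-PSO variant the authors also apply is not reproduced here); all function names are illustrative, not taken from the paper:

        import numpy as np
        from sklearn.cluster import KMeans

        def class_specific_centers(X, y, k_per_class):
            # Cluster each class's training inputs separately, then pool the
            # resulting centroids as the Gaussian (RBF) neuron locations.
            centers = []
            for c in np.unique(y):
                km = KMeans(n_clusters=k_per_class, n_init=10).fit(X[y == c])
                centers.append(km.cluster_centers_)
            return np.vstack(centers)

        def rbf_design_matrix(X, centers, sigma):
            # Gaussian activations: phi[i, j] = exp(-||x_i - c_j||^2 / (2 sigma^2)).
            d2 = ((X[:, None, :] - centers[None, :, :]) ** 2).sum(axis=2)
            return np.exp(-d2 / (2.0 * sigma ** 2))

        def train_output_weights(Phi, y, lam=1e-3):
            # Regularized least squares on one-hot targets (labels assumed 0..C-1).
            T = np.eye(int(y.max()) + 1)[y]
            A = Phi.T @ Phi + lam * np.eye(Phi.shape[1])
            return np.linalg.solve(A, Phi.T @ T)

        # Usage:
        # centers = class_specific_centers(X_train, y_train, k_per_class=5)
        # Phi = rbf_design_matrix(X_train, centers, sigma=1.0)
        # W = train_output_weights(Phi, y_train)
        # y_pred = (rbf_design_matrix(X_test, centers, sigma=1.0) @ W).argmax(axis=1)

    Because each class is clustered independently on its own samples only, the individual clustering problems stay small, which is the complexity reduction the abstract refers to.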
  • 2
    Language: English
    In: IEEE Transactions on Neural Networks and Learning Systems, 31 May 2019, pp. 1-15
    Description: The traditional multilayer perceptron (MLP), which uses the McCulloch-Pitts neuron model, is inherently limited to a single set of neuronal activities, i.e., a linear weighted sum followed by a nonlinear thresholding step. Previously, the generalized operational perceptron (GOP) was proposed to extend the conventional perceptron model by defining a diverse set of neuronal activities to imitate a generalized model of biological neurons. Together with the GOP, a progressive operational perceptron (POP) algorithm was proposed to optimize a predefined template of multiple homogeneous layers in a layerwise manner. In this paper, we propose an efficient algorithm to learn a compact, fully heterogeneous multilayer network that allows each individual neuron, regardless of its layer, to have distinct characteristics. Based on the complexity of the problem, the proposed algorithm operates progressively at the neuron level, searching for a compact topology not only in depth but also in width, i.e., the number of neurons in each layer. The proposed algorithm is shown to outperform other related learning methods in extensive experiments on several classification problems. (A minimal code sketch of a GOP-style heterogeneous layer follows this record.)
    Keywords: Neurons ; Biological Neural Networks ; Network Topology ; Learning Systems ; Topology ; Computational Modeling ; Nonhomogeneous Media ; Architecture Learning ; Feedforward Network ; Generalized Operational Perceptron (GOP) ; Progressive Learning ; Computer Science
    ISSN: 2162-237X
    E-ISSN: 2162-2388
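    A minimal Python sketch of a GOP-style heterogeneous layer as summarized above, assuming small illustrative operator sets; the progressive, neuron-level architecture search proposed in the paper is not reproduced, and the names (GOPNeuron, GOPLayer, NODAL, POOL, ACT) are invented for illustration:

        import numpy as np

        # Candidate operators: nodal (per connection), pool (aggregation),
        # activation. Real GOP operator libraries are larger; these are examples.
        NODAL = {"mult": lambda w, x: w * x,
                 "exp":  lambda w, x: np.exp(w * x) - 1.0}
        POOL  = {"sum":    lambda z: z.sum(axis=-1),
                 "median": lambda z: np.median(z, axis=-1)}
        ACT   = {"tanh": np.tanh,
                 "relu": lambda s: np.maximum(s, 0.0)}

        class GOPNeuron:
            # One heterogeneous neuron: its own operator triple and weights.
            def __init__(self, n_in, nodal="mult", pool="sum", act="tanh", rng=None):
                rng = np.random.default_rng() if rng is None else rng
                self.w = rng.normal(scale=1.0 / np.sqrt(n_in), size=n_in)
                self.b = 0.0
                self.nodal, self.pool, self.act = NODAL[nodal], POOL[pool], ACT[act]

            def forward(self, x):                 # x: (batch, n_in)
                z = self.nodal(self.w, x)         # per-connection nodal op
                s = self.pool(z) + self.b         # aggregate across inputs
                return self.act(s)                # nonlinear activation

        class GOPLayer:
            # A layer whose neurons may each use a different operator triple.
            def __init__(self, neurons):
                self.neurons = neurons

            def forward(self, x):                 # -> (batch, n_neurons)
                return np.stack([n.forward(x) for n in self.neurons], axis=-1)

        # Usage: two neurons in one layer with distinct characteristics.
        # layer = GOPLayer([GOPNeuron(4, "mult", "sum", "tanh"),
        #                   GOPNeuron(4, "exp", "median", "relu")])
        # out = layer.forward(np.zeros((8, 4)))   # shape (8, 2)

    With the ("mult", "sum", "tanh") triple a GOPNeuron reduces to a standard MLP neuron, illustrating the abstract's point that the McCulloch-Pitts model is one fixed choice within a larger space of neuronal activities.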