Kooperativer Bibliotheksverbund Berlin-Brandenburg (KOBV)

  • 1
    In: Frontiers in Computational Neuroscience, 2010, Vol.4
    E-ISSN: 1662-5188
    Source: CrossRef
  • 2
    Language: English
    In: PLoS ONE, March 15, 2017, Vol.12(3), p.e0174289
    Description: This corrects the article DOI: 10.1371/journal.pone.0171015.
    Keywords: Sciences (General)
    ISSN: 1932-6203
    Library Location Call Number Volume/Issue/Year Availability
    BibTip Others were also interested in ...
  • 3
    In: PLoS ONE, 2017, Vol.12(2)
    Description: We present a theoretical analysis of Gaussian-binary restricted Boltzmann machines (GRBMs) from the perspective of density models. The key aspect of this analysis is to show that GRBMs can be formulated as a constrained mixture of Gaussians, which gives a much better insight into the model’s capabilities and limitations. We further show that GRBMs are capable of learning meaningful features without using a regularization term and that the results are comparable to those of independent component analysis. This is illustrated for both a two-dimensional blind source separation task and for modeling natural image patches. Our findings exemplify that reported difficulties in training GRBMs are due to the failure of the training algorithm rather than the model itself. Based on our analysis we derive a better training setup and show empirically that it leads to faster and more robust training of GRBMs. Finally, we compare different sampling algorithms for training GRBMs and show that Contrastive Divergence performs better than training methods that use a persistent Markov chain.
    (A CD-1 training sketch for a GRBM follows the results list.)
    Keywords: Research Article ; Physical Sciences ; Biology And Life Sciences ; Research And Analysis Methods ; Computer And Information Sciences
    E-ISSN: 1932-6203
  • 4
    Article
    In: Frontiers in Computational Neuroscience, 2012, Vol.6
    E-ISSN: 1662-5188
    Source: CrossRef
  • 5
    In: PLoS ONE, 2018, Vol.13(10)
    Description: Episodic memories have been suggested to be represented by neuronal sequences, which are stored and retrieved from the hippocampal circuit. A special difficulty is that realistic neuronal sequences are strongly correlated with each other, since computational memory models generally perform poorly when correlated patterns are stored. Here, we use a computational model to study under which conditions the hippocampal circuit can perform this function robustly. During memory encoding, CA3 sequences in our model are driven by intrinsic dynamics, entorhinal inputs, or a combination of both. These CA3 sequences are hetero-associated with the input sequences, so that the network can retrieve entire sequences based on a single cue pattern. We find that overall memory performance depends on two factors: the robustness of sequence retrieval from CA3 and the circuit’s ability to perform pattern completion through the feedforward connectivity, including CA3, CA1 and EC. The two factors, in turn, depend on the relative contribution of the external inputs and recurrent drive on CA3 activity. In conclusion, memory performance in our network model critically depends on the network architecture and dynamics in CA3.
    (A hetero-associative replay sketch follows the results list.)
    Keywords: Research Article ; Biology And Life Sciences ; Medicine And Health Sciences ; Social Sciences ; Computer And Information Sciences ; Research And Analysis Methods
    E-ISSN: 1932-6203
  • 6
    In: Frontiers in Computational Neuroscience, 2010, Vol.4
    E-ISSN: 1662-5188
    Source: CrossRef
  • 7
    In: Neural Computation, 2011, Vol.23(2), pp.303-335
    Description: We develop a group-theoretical analysis of slow feature analysis for the case where the input data are generated by applying a set of continuous transformations to static templates. As an application of the theory, we analytically derive nonlinear visual receptive fields and show that their optimal stimuli, as well as the orientation and frequency tuning, are in good agreement with previous simulations of complex cells in primary visual cortex (Berkes and Wiskott, 2005). The theory suggests that side and end stopping can be interpreted as a weak breaking of translation invariance. Direction selectivity is also discussed.
    (A minimal linear SFA sketch follows the results list.)
    Keywords: Articles
    ISSN: 0899-7667
    E-ISSN: 1530-888X
  • 8
    Language: English
    In: PLoS Computational Biology, 2011, Vol.7(1), p.e1001063
    Description: Recently, we presented a study of adult neurogenesis in a simplified hippocampal memory model. The network was required to encode and decode memory patterns despite changing input statistics. We showed that additive neurogenesis was a more effective adaptation strategy compared to neuronal turnover and conventional synaptic plasticity as it allowed the network to respond to changes in the input statistics while preserving representations of earlier environments. Here we extend our model to include realistic, spatially driven input firing patterns in the form of grid cells in the entorhinal cortex. We compare network performance across a sequence of spatial environments using three distinct adaptation strategies: conventional synaptic plasticity, where the network is of fixed size but the connectivity is plastic; neuronal turnover, where the network is of fixed size but units in the network may die and be replaced; and additive neurogenesis, where the network starts out with fewer initial units but grows over time. We confirm that additive neurogenesis is a superior adaptation strategy when using realistic, spatially structured input patterns. We then show that a more biologically plausible neurogenesis rule that incorporates cell death and enhanced plasticity of new granule cells has an overall performance significantly better than any one of the three individual strategies operating alone. This adaptation rule can be tailored to maximise performance of the network when operating as either a short- or long-term memory store. We also examine the time course of adult neurogenesis over the lifetime of an animal raised under different hypothetical rearing conditions. These growth profiles have several distinct features that form a theoretical prediction that could be tested experimentally. Finally, we show that place cells can emerge and refine in a realistic manner in our model as a direct result of the sparsification performed by the dentate gyrus layer.
    Author summary: Contrary to the long-standing belief that no new neurons are added to the adult brain, it is now known that new neurons are born in a number of different brain regions and animals. One such region is the hippocampus, an area that plays an important role in learning and memory. In this paper we explore the effect of adding new neurons in a computational model of rat hippocampal function. Our hypothesis is that adding new neurons helps in forming new memories without disrupting memories that have already been stored. We find that adding new units is indeed superior to either changing connectivity or allowing neuronal turnover (where old units die and are replaced). We then show that a more biologically plausible mechanism that combines all three of these processes produces the best performance. Our work provides a strong theoretical argument as to why new neurons are born in the adult hippocampus: the new units allow the network to adapt in a way that is not possible by rearranging existing connectivity using conventional plasticity or neuronal turnover.
    (A neurogenesis-vs-turnover bookkeeping sketch follows the results list.)
    Keywords: Research Article ; Computational Biology ; Computational Biology -- Computational Neuroscience
    ISSN: 1553-734X
    E-ISSN: 1553-7358
  • 9
    Language: English
    In: PLoS Computational Biology, 2010, Vol.6(8), p.e1000894
    Description: Humans and animals are able to learn complex behaviors based on a massive stream of sensory information from different modalities. Early animal studies have identified learning mechanisms that are based on reward and punishment such that animals tend to avoid actions that lead to punishment whereas rewarded actions are reinforced. However, most algorithms for reward-based learning are only applicable if the dimensionality of the state-space is sufficiently small or its structure is sufficiently simple. Therefore, the question arises how the problem of learning on high-dimensional data is solved in the brain. In this article, we propose a biologically plausible generic two-stage learning system that can directly be applied to raw high-dimensional input streams. The system is composed of a hierarchical slow feature analysis (SFA) network for preprocessing and a simple neural network on top that is trained based on rewards. We demonstrate by computer simulations that this generic architecture is able to learn quite demanding reinforcement learning tasks on high-dimensional visual input streams in a time that is comparable to the time needed when an explicit highly informative low-dimensional state-space representation is given instead of the high-dimensional visual input. The learning speed of the proposed architecture in a task similar to the Morris water maze task is comparable to that found in experimental studies with rats. This study thus supports the hypothesis that slowness learning is one important unsupervised learning principle utilized in the brain to form efficient state representations for behavioral learning.
    Author summary: Humans and animals are able to learn complex behaviors based on a massive stream of sensory information from different modalities. Early animal studies have identified learning mechanisms that are based on reward and punishment such that animals tend to avoid actions that lead to punishment whereas rewarded actions are reinforced. It is an open question how sensory information is processed by the brain in order to learn and perform rewarding behaviors. In this article, we propose a learning system that combines the autonomous extraction of important information from the sensory input with reward-based learning. The extraction of salient information is learned by exploiting the temporal continuity of real-world stimuli. A subsequent neural circuit then learns rewarding behaviors based on this representation of the sensory input. We demonstrate in two control tasks that this system is capable of learning complex behaviors on raw visual input.
    (A Q-learning-on-features sketch follows the results list.)
    Keywords: Research Article ; Computational Biology -- Computational Neuroscience ; Neuroscience -- Behavioral Neuroscience ; Neuroscience -- Theoretical Neuroscience ; Neuroscience -- Natural And Synthetic Vision
    ISSN: 1553-734X
    E-ISSN: 1553-7358
  • 10
    Language: English
    In: Neural Computation, May 2018, Vol.30(5), pp.1151-1179
    Description: The computational principles of slowness and predictability have been proposed to describe aspects of information processing in the visual system. Viewing slowness as a limited special case of predictability, we investigate the relationship between these two principles empirically. On a collection of real-world data sets we compare the features extracted by slow feature analysis (SFA) to the features of three recently proposed methods for predictable feature extraction: forecastable component analysis, predictable feature analysis, and graph-based predictable feature analysis. Our experiments show that the predictability of the features learned by these methods is highly correlated; thus, SFA appears to effectively implement a method for extracting predictable features according to different measures of predictability.
    (A sketch of the Delta slowness measure follows the results list.)
    Keywords: Information Processing ; Visual Perception ; Real Time ; Feature Extraction ; Predictive Control ; Data Processing ; Empirical Analysis
    ISSN: 0899-7667
    E-ISSN: 1530-888X
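Illustrative sketches for selected results follow. Result 3 concerns training Gaussian-binary restricted Boltzmann machines (GRBMs) with Contrastive Divergence. Below is a minimal CD-1 sketch, not the paper's exact setup: it assumes one common energy parameterization, a fixed visible noise level sigma, and made-up toy data.

    import numpy as np

    rng = np.random.default_rng(0)

    def sigmoid(x):
        return 1.0 / (1.0 + np.exp(-x))

    def cd1_step(v, W, b, c, sigma=1.0, lr=1e-3):
        """One Contrastive Divergence (CD-1) update for a Gaussian-binary RBM.
        v: (batch, n_vis) real-valued data; W: (n_vis, n_hid) weights;
        b, c: visible/hidden biases; sigma: visible noise standard deviation."""
        s2 = sigma ** 2
        ph = sigmoid(v @ W / s2 + c)                  # positive-phase hidden probabilities
        h = (rng.random(ph.shape) < ph).astype(float)
        v_neg = h @ W.T + b                           # one Gibbs step back (Gaussian mean)
        ph_neg = sigmoid(v_neg @ W / s2 + c)
        n = v.shape[0]
        # Updates from the difference between data and reconstruction statistics.
        W += lr * ((v.T @ ph) - (v_neg.T @ ph_neg)) / (n * s2)
        b += lr * (v - v_neg).mean(axis=0) / s2
        c += lr * (ph - ph_neg).mean(axis=0)
        return W, b, c

    # Toy usage: standardize 2-D data and fit four hidden units.
    X = rng.standard_normal((500, 2)) @ np.array([[1.0, 0.7], [0.0, 0.5]])
    X = (X - X.mean(axis=0)) / X.std(axis=0)
    W = 0.01 * rng.standard_normal((2, 4))
    b, c = np.zeros(2), np.zeros(4)
    for _ in range(200):
        W, b, c = cd1_step(X, W, b, c)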
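Result 5 studies how sequences are stored in and retrieved from a hippocampal circuit. The sketch below reduces this to the bare hetero-associative mechanism in a deliberately simplified one-layer form with random, uncorrelated patterns; the paper's CA3/CA1/EC architecture and correlated sequences are not modeled.

    import numpy as np

    rng = np.random.default_rng(1)
    N, T = 200, 10                                  # units, sequence length
    seq = (rng.random((T, N)) < 0.5) * 2.0 - 1.0    # random +/-1 patterns

    # Hebbian hetero-association: pattern t is wired to recall pattern t+1.
    W = sum(np.outer(seq[t + 1], seq[t]) for t in range(T - 1)) / N

    # Cue with a corrupted first pattern and replay the whole sequence.
    x = seq[0] * np.where(rng.random(N) < 0.1, -1.0, 1.0)   # 10% flipped bits
    recalled = [x]
    for _ in range(T - 1):
        x = np.sign(W @ x)                          # one-step pattern completion
        recalled.append(x)

    overlaps = [float(r @ s) / N for r, s in zip(recalled, seq)]
    print(overlaps)                                 # values near 1.0 mean faithful replay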
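Result 7 gives a group-theoretical analysis of slow feature analysis (SFA). For readers unfamiliar with the algorithm the analysis builds on, here is the standard linear SFA solved as an eigenvalue problem; this is only the baseline method, while the paper's nonlinear receptive fields come from SFA in an expanded function space, not shown here.

    import numpy as np

    def linear_sfa(X, n_out=1):
        """Minimal linear slow feature analysis. X: (T, n) signal.
        Returns a projection extracting the n_out slowest unit-variance,
        decorrelated output signals."""
        X = X - X.mean(axis=0)
        # Whitening makes the unit-variance/decorrelation constraints trivial.
        d, U = np.linalg.eigh(np.cov(X, rowvar=False))
        S = U / np.sqrt(d)                          # whitening matrix
        Z = X @ S
        # Slowness objective: minimize the variance of the time derivative.
        d2, V = np.linalg.eigh(np.cov(np.diff(Z, axis=0), rowvar=False))
        return S @ V[:, :n_out]                     # eigh sorts eigenvalues ascending

    # Toy usage: recover a slow sine mixed with a fast one.
    t = np.linspace(0, 2 * np.pi, 2000)
    sources = np.column_stack([np.sin(t), np.sin(11 * t)])
    X = sources @ np.array([[1.0, 0.3], [0.6, 1.0]])
    y = (X - X.mean(axis=0)) @ linear_sfa(X)        # approximately the slow sine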
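Result 8 compares three adaptation strategies: conventional synaptic plasticity, neuronal turnover, and additive neurogenesis. The sketch below captures only the weight-matrix bookkeeping that distinguishes the latter two; there is no learning or performance evaluation, and the layer sizes and growth schedule are made up for the example.

    import numpy as np

    rng = np.random.default_rng(2)
    n_in = 50

    def new_units(k):
        # Random feedforward weights for k newborn units (hypothetical scale).
        return rng.standard_normal((k, n_in)) / np.sqrt(n_in)

    # Additive neurogenesis: start small and grow; old units' weights are
    # kept intact, so earlier representations are preserved.
    W_add = new_units(20)
    for _ in range(5):                              # five successive environments
        W_add = np.vstack([W_add, new_units(5)])

    # Neuronal turnover: fixed size; at each stage some old units die and
    # are replaced, overwriting earlier representations.
    W_turn = new_units(45)
    for _ in range(5):
        dying = rng.choice(len(W_turn), size=5, replace=False)
        W_turn[dying] = new_units(5)

    print(W_add.shape, W_turn.shape)                # both end at (45, 50)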
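Result 9 couples an unsupervised feature hierarchy with reward-based learning. Below is a minimal sketch of the second stage only: linear Q-learning on top of a fixed feature map. The one-hot feature map phi and the 5x5 gridworld are stand-ins for the paper's pretrained hierarchical SFA network and visual water-maze-like task.

    import numpy as np

    rng = np.random.default_rng(3)

    SIZE, GOAL = 5, (4, 4)
    ACTIONS = [(-1, 0), (1, 0), (0, -1), (0, 1)]

    def phi(state):
        # Placeholder feature map (one-hot grid cell). In the paper's
        # architecture this role is played by an SFA hierarchy on raw
        # visual input.
        f = np.zeros(SIZE * SIZE)
        f[state[0] * SIZE + state[1]] = 1.0
        return f

    def step(s, a):
        dr, dc = ACTIONS[a]
        s2 = (min(max(s[0] + dr, 0), SIZE - 1),
              min(max(s[1] + dc, 0), SIZE - 1))
        done = s2 == GOAL
        return s2, (1.0 if done else 0.0), done

    # Linear Q-learning (TD(0), epsilon-greedy) on the fixed features.
    Wq = np.zeros((len(ACTIONS), SIZE * SIZE))
    gamma, lr, eps = 0.95, 0.1, 0.1
    for _ in range(300):
        s = (int(rng.integers(SIZE)), int(rng.integers(SIZE)))
        for _ in range(200):                        # cap episode length
            q = Wq @ phi(s)
            if rng.random() < eps:
                a = int(rng.integers(len(ACTIONS)))
            else:
                a = int(rng.choice(np.flatnonzero(q == q.max())))  # random tie-break
            s2, r, done = step(s, a)
            target = r + (0.0 if done else gamma * np.max(Wq @ phi(s2)))
            Wq[a] += lr * (target - Wq[a] @ phi(s)) * phi(s)
            s = s2
            if done:
                break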
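Result 10 compares the slowness of SFA features with the outputs of several predictable-feature methods. The slowness measure itself is simple to state; below is the usual Delta value from the SFA literature. Only the measure is shown, not the compared methods (forecastable component analysis, predictable feature analysis, graph-based predictable feature analysis).

    import numpy as np

    def delta_value(y):
        """Slowness measure from the SFA literature: the average squared
        temporal derivative of a zero-mean, unit-variance signal.
        Lower values mean slower signals."""
        y = (y - y.mean()) / y.std()
        return float(np.mean(np.diff(y) ** 2))

    # Toy comparison: a slow signal scores lower than a fast one.
    t = np.linspace(0, 2 * np.pi, 1000)
    print(delta_value(np.sin(t)), delta_value(np.sin(25 * t)))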