    Smith, James E.: Space-time computing with temporal neural networks
    Online Resource
    San Rafael, California: Morgan & Claypool | Cham: Springer International Publishing
    UID: gbv_1657252272
    Format: 1 online resource (1 PDF (xxv, 215 pages)), illustrations.
    ISBN: 9783031017544, 9781627058902
    Series Statement: Synthesis Lectures on Computer Architecture #39
    Content: Part I. Introduction to space-time computing and temporal neural networks -- 1. Introduction -- 1.1 Basics of neuron operation -- 1.2 Space-time communication and computation -- 1.2.1 Communication -- 1.2.2 Computation -- 1.2.3 Discussion -- 1.3 Background: neural network models -- 1.3.1 Rate coding -- 1.3.2 Temporal coding -- 1.3.3 Rate processing -- 1.3.4 Spike processing -- 1.3.5 Summary and taxonomy -- 1.4 Background: machine learning -- 1.5 Approach: interaction of computer engineering and neuroscience -- 1.6 Bottom-up analysis: a guiding analogy -- 1.7 Overview --
    Content: 2. Space-time computing -- 2.1 Definition of terms -- 2.2 Feedforward computing networks -- 2.3 General TNN model -- 2.4 Space-time computing systems -- 2.5 Implications of invariance -- 2.6 TNN system architecture -- 2.6.1 Training -- 2.6.2 Computation (evaluation) -- 2.6.3 Encoding -- 2.6.4 Decoding -- 2.7 Summary: meta-architecture -- 2.7.1 Simulation -- 2.7.2 Implied functions -- 2.8 Special case: feedforward McCulloch-Pitts networks -- 2.9 Race logic --
    Content: 3. Biological overview -- 3.1 Overall brain structure (very brief) -- 3.2 Neurons -- 3.2.1 Synapses -- 3.2.2 Synaptic plasticity -- 3.2.3 Frequency-current relationship -- 3.2.4 Inhibition -- 3.3 Hierarchy and columnar organization -- 3.3.1 Neurons -- 3.3.2 Columns (micro-columns) -- 3.3.3 Macro-columns -- 3.3.4 Regions -- 3.3.5 Lobes -- 3.3.6 Uniformity -- 3.4 Inter-neuron connections -- 3.4.1 Path distances -- 3.4.2 Propagation velocities -- 3.4.3 Transmission delays -- 3.4.4 Numbers of connections -- 3.4.5 Attenuation of excitatory responses -- 3.4.6 Connections summary -- 3.5 Sensory processing -- 3.5.1 Receptive fields -- 3.5.2 Saccades and whisks -- 3.5.3 Vision pathway -- 3.5.4 Waves of spikes -- 3.5.5 Feedforward processing path -- 3.5.6 Precision -- 3.5.7 Information content -- 3.5.8 Neural processing -- 3.6 Oscillations -- 3.6.1 Theta oscillations -- 3.6.2 Gamma oscillations --
    Content: Part II. Modeling temporal neural networks -- 4. Connecting TNNs with biology -- 4.1 Communication via voltage spikes -- 4.2 Columns and spike bundles -- 4.3 Spike synchronization -- 4.3.1 Aperiodic synchronization: saccades, whisks, and sniffs -- 4.3.2 Periodic synchronization -- 4.4 First spikes carry information -- 4.5 Feedforward processing -- 4.6 Simplifications summary -- 4.7 Plasticity and training -- 4.8 Fault tolerance and temporal stability -- 4.8.1 Interwoven fault tolerance -- 4.8.2 Temporal stability -- 4.8.3 Noise (or lack thereof) -- 4.9 Discussion: reconciling biological complexity with model simplicity -- 4.10 Prototype architecture overview --
    Content: 5. Neuron modeling -- 5.1 Basic models -- 5.1.1 Hodgkin-Huxley neuron model -- 5.1.2 Derivation of the leaky integrate and fire (LIF) model -- 5.1.3 Spike response model (SRM0) -- 5.2 Modeling synaptic connections -- 5.3 Excitatory neuron implementation -- 5.4 The menagerie of LIF neurons -- 5.4.1 Synaptic conductance model -- 5.4.2 Biexponential SRM0 model -- 5.4.3 Single stage SRM0 -- 5.4.4 Linear leak integrate and fire (LLIF) -- 5.5 Other neuron models -- 5.5.1 Alpha function -- 5.5.2 Quadratic integrate-and-fire -- 5.6 Synaptic plasticity and training --
    Content: 6. Computing with excitatory neurons -- 6.1 Single neuron clustering -- 6.1.1 Definitions -- 6.1.2 Excitatory neuron function, approximate description -- 6.1.3 Looking ahead -- 6.2 Spike coding -- 6.2.1 Volleys -- 6.2.2 Nonlinear mappings -- 6.2.3 Distance functions -- 6.3 Prior work: radial basis function (RBF) neurons -- 6.4 Excitatory neuron I: training mode -- 6.4.1 Modeling excitatory response functions -- 6.4.2 Training set -- 6.4.3 STDP update rule -- 6.4.4 Weight stabilization -- 6.5 Excitatory neuron I: compound response functions -- 6.6 Excitatory neuron model II -- 6.6.1 Neuron model derivation -- 6.6.2 Training mode -- 6.6.3 Evaluation mode -- 6.7 Attenuation of excitatory responses -- 6.8 Threshold detection -- 6.9 Excitatory neuron model II summary --
    Content: 7. System architecture -- 7.1 Overview -- 7.2 Interconnection structure -- 7.3 Input encoding -- 7.4 Excitatory column operation -- 7.4.1 Evaluation -- 7.4.2 Training -- 7.4.3 Unsupervised synaptic weight training -- 7.4.4 Supervised weight training -- 7.5 Inhibition -- 7.5.1 Feedback inhibition -- 7.5.2 Lateral inhibition -- 7.5.3 Feedforward inhibition -- 7.6 Volley decoding and analysis -- 7.6.1 Temporal flattening -- 7.6.2 Decoding to estimate clustering quality -- 7.6.3 Decoding for classification -- 7.7 Training inhibition -- 7.7.1 FFI: establishing tF and kF -- 7.7.2 LI: establishing tL and kL -- 7.7.3 Excitatory neuron training in the presence of inhibition -- Part III: extended design study: clustering the MNIST dataset --
    Content: 8. Simulator implementation -- 8.1 Simulator overview -- 8.2 Inter-unit communication -- 8.3 Simulating time -- 8.4 Synaptic weight training -- 8.5 Evaluation -- 8.5.1 EC block -- 8.5.2 IC block -- 8.5.3 VA block -- 8.6 Design methodology --
    Content: 9. Clustering the MNIST dataset -- 9.1 MNIST workload -- 9.2 Prototype clustering architecture -- 9.3 OnOff encoding -- 9.4 Intra-CC network -- 9.5 Excitatory column (EC) -- 9.6 Lateral inhibition -- 9.7 144 RFs -- 9.8 Feedforward inhibition -- 9.9 Layer 1 result summary -- 9.10 Related work -- 9.11 Considering layer 2 --
    Content: 10. Summary and conclusions -- References -- Author biography
    Note: Part of: Synthesis Digital Library of Engineering and Computer Science. - Includes bibliographical references (pages 205-214). - Compendex. INSPEC. Google Scholar. Google Book Search. - Title from PDF title page (viewed on June 26, 2017)
    Additional Edition: ISBN 9783031006265
    Additional Edition: ISBN 9783031028823
    Additional Edition: ISBN 9781627059480
    Additional Edition: Also available as a print edition: Smith, James E., 1950- : Space-time computing with temporal neural networks. [San Rafael, California]: Morgan & Claypool, 2017. ISBN 9781627059480
    Language: English