UID:
almafu_9959242816402883
Format:
1 online resource (329 p.)
ISBN:
981-279-792-0
Series Statement:
Series in Machine Perception and Artificial Intelligence ; Volume 7
Content:
Contents: A Connectionist Approach to Speech Recognition (Y Bengio); Signature Verification Using a "Siamese" Time Delay Neural Network (J Bromley et al.); Boosting Performance in Neural Networks (H Drucker et al.); An Integrated Architecture for Recognition of Totally Unconstrained Handwritten Numerals (A Gupta et al.); Time-Warping Network: A Neural Approach to Hidden Markov Model Based Speech Recognition (E Levin et al.); Computing Optical Flow with a Recurrent Neural Network (H Li & J Wang); Integrated Segmentation and Recognition through Exhaustive Scans or Learned Saccadic Jumps (G L Mar
Note:
Description based upon print version of record.
Contents; Preface; A Connectionist Approach to Speech Recognition; 1. Introduction; 2. Integrating Prior Knowledge and Learning; 2.1. Input representation and coding; 2.2. Output coding; 2.3. Architecture; 3. Connectionist Architectures for Speech Recognition; 3.1. Static isolated word recognizer; 3.2. Static fixed window; 3.3. Time-delay neural network (TDNN); 3.4. Segment-based classification; 3.5. Recurrent nets; 3.5.1. Problems with training of recurrent networks; 4. Hybrids of Connectionist Models and Statistical Models; 4.1. A New approach to density estimation with neural networks
4.2. A Hybrid of neural network and discriminant Hidden Markov model; 5. Conclusion; Acknowledgements; References; Signature Verification using a "Siamese" Time Delay Neural Network; 1. Introduction; 1.1. NCR requirements; 2. Data; 2.1. Data sets; 3. Neural Network Architectures; 4. Signature Preprocessing; 5. Training the Neural Networks; 5.1. First round of training; 5.2. Second round of training; 6. Testing; 6.1. First round; 6.2. Second round; 6.3. Zero-Effort forgeries; 6.4. The 80 byte constraint; 6.5. Comments on the testing; 7. Conclusions; Acknowledgements; References
Off Line Recognition of Handwritten Postal Words using Neural Networks; 1. Introduction; 2. Recognition of Handwritten Zip Codes; 2.1. Shortest path segmentation; 2.2. Image preprocessing; 2.3. Cut generation; 2.4. Graph representation of the segmentation process; 2.5. Neural network; 2.6. Neural network training; 2.7. Use of a ZIP code lexicon; 2.8. Results; 3. Recognition of Handwritten Words; 3.1. Image preprocessing; 3.2. The Neural network; 3.3. Segmentation graph; 3.4. Results; 4. Conclusions; Acknowledgements; References; Boosting Performance in Neural Networks; 1. Introduction
2. Theoretical Background; 3. A Deformation Model; 4. Network Architectures; 5. Training Algorithm; 6. Results on UPS Database; 7. Results on NIST Databases; 8. Using Sieving to Reduce Evaluation Time; 9. Discussion and Conclusion; Acknowledgements; References; Multi-Modular Neural Network Architectures: Applications in Optical Character and Human Face Recognition; 1. Introduction; 2. Classifiers; 2.1. Neural network models; 2.1.1. TDNN; 2.1.2. LVQ; 2.1.3. RBF; 2.2. k-Nearest neighbor; 3. Multi-Modular Architectures; 3.1. Modularity
3.2. Multi-Modular feature extraction-classification architectures; 3.3. Segmentation; 4. Decision and Rejection Criteria; 4.1. Introduction; 4.2. MLP; 4.3. LVQ; 4.4. RBF; 5. Optical Character Recognition; 5.1. The Databases; 5.2. The Architectures; 5.2.1. Introduction; 5.2.2. MLP-encoder; 5.2.3. TDNN-LeNotre; 5.2.4. TDNN-LeNet; 5.2.5. k-nn; 5.2.6. LVQ; 5.2.7. RBF; 5.2.8. RBF-coop; 5.3. Classification Performances; 5.3.1. Effect of modularity; 5.3.2. Effect of cooperation; 5.3.3. Conclusion; 5.4. Rejection; 5.4.1. Method; 5.4.2. Comparison of criteria; 5.4.3. Comparison of architectures
5.4.4. Evaluation of global performances
Additional Edition:
ISBN 981-02-1444-8
Additional Edition:
ISBN 1-306-56620-7
Language:
English