  • 1
    UID: almahu_9948030301902882
    Format: XIV, 122 p., 36 illus., 31 illus. in color, online resource.
    ISBN: 9789811300622
    Series Statement: Studies in Computational Intelligence, 783
    Content: This book offers an introduction to modern natural language processing using machine learning, focusing on how neural networks create a machine-interpretable representation of the meaning of natural language. Language is crucially linked to ideas – as Webster’s 1923 “English Composition and Literature” puts it: “A sentence is a group of words expressing a complete thought”. Thus the representation of sentences and the words that make them up is vital in advancing artificial intelligence and other “smart” systems currently being developed. Providing an overview of the research in the area, from Bengio et al.’s seminal work on a “Neural Probabilistic Language Model” in 2003, to the latest techniques, this book enables readers to gain an understanding of how the techniques are related and what is best for their purposes. As well as an introduction to neural networks in general and recurrent neural networks in particular, this book details the methods used for representing words, senses of words, and larger structures such as sentences or documents. The book highlights practical implementations and discusses many aspects that are often overlooked or misunderstood. The book includes thorough instruction on challenging areas such as hierarchical softmax and negative sampling, to ensure the reader fully and easily understands the details of how the algorithms function. Combining practical aspects with a more traditional review of the literature, it is directly applicable to a broad readership. It is an invaluable introduction for early graduate students working in natural language processing; a trustworthy guide for industry developers wishing to make use of recent innovations; and a sturdy bridge for researchers already familiar with linguistics or machine learning wishing to understand the other. (A brief illustrative sketch of the negative-sampling objective mentioned here follows this record.)
    Note: Introduction -- Machine Learning for Representations -- Current Challenges in Natural Language Processing -- Word Representations -- Word Sense Representations -- Phrase Representations -- Sentence representations and beyond -- Character-Based Representations -- Conclusion.
    In: Springer eBooks
    Additional Edition: Printed edition: ISBN 9789811300615
    Additional Edition: Printed edition: ISBN 9789811300639
    Additional Edition: Printed edition: ISBN 9789811343209
    Language: English
    Subjects: Computer Science
    URL: Full text (URL of the original publisher)
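    The description above mentions negative sampling; as a purely illustrative aid (not code from the book), the following minimal Python/NumPy sketch computes the skip-gram negative-sampling loss for a single (center, context) pair. The vocabulary size, embedding dimension, and sampled negatives are made-up toy values.

        # Illustrative sketch only: skip-gram negative-sampling loss for one training pair.
        # All sizes and indices below are toy assumptions, not values from the book.
        import numpy as np

        rng = np.random.default_rng(0)
        V, d, k = 1000, 50, 5                        # toy vocabulary size, embedding dim, number of negatives
        W_in = rng.normal(scale=0.1, size=(V, d))    # "input" (center-word) embeddings
        W_out = rng.normal(scale=0.1, size=(V, d))   # "output" (context-word) embeddings

        def sigmoid(x):
            return 1.0 / (1.0 + np.exp(-x))

        def sgns_loss(center, context, negatives):
            """Negative log-likelihood of the negative-sampling objective for one pair."""
            v_c = W_in[center]
            pos = np.log(sigmoid(W_out[context] @ v_c))           # pull the true context word closer
            neg = np.log(sigmoid(-W_out[negatives] @ v_c)).sum()  # push k sampled words away
            return -(pos + neg)

        negatives = rng.integers(0, V, size=k)
        print(sgns_loss(center=3, context=17, negatives=negatives))

    Training would repeat this over many pairs and follow the gradient; the sketch only shows how the objective itself is assembled.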
  • 2
    UID: b3kat_BV023787622
    Format: XIII, 350 pages, illustrations, diagrams, 24 cm
    ISBN: 1852333987
    Series Statement: Advances in pattern recognition
    Language: English
    Keywords: Object recognition
    Author information: Bennamoun, Mohammed 1961-
    Author information: Mamic, George 1977-
  • 3
    UID: almahu_BV045225753
    Format: xiv, 122 pages, illustrations, diagrams (partly in color).
    ISBN: 978-981-13-0061-5
    Series Statement: Studies in computational intelligence Volume 783
    Additional Edition: Also available as an online edition, ISBN 978-981-13-0062-2
    Language: English
    Subjects: Computer Science
    URL: Cover
    Author information: Bennamoun, Mohammed, 1961-
  • 4
    UID: gbv_1015347886
    Format: 1 online resource (209 pages)
    ISBN: 9781681730226
    Series Statement: Synthesis Lectures on Computer Vision 15
    Content: Intro -- Preface -- Acknowledgments -- Introduction -- What is Computer Vision? -- Applications -- Image Processing vs. Computer Vision -- What is Machine Learning? -- Why Deep Learning? -- Book Overview -- Features and Classifiers -- Importance of Features and Classifiers -- Features -- Classifiers -- Traditional Feature Descriptors -- Histogram of Oriented Gradients (HOG) -- Scale-invariant Feature Transform (SIFT) -- Speeded-up Robust Features (SURF) -- Limitations of Traditional Hand-engineered Features -- Machine Learning Classifiers -- Support Vector Machine (SVM) -- Random Decision Forest -- Conclusion -- Neural Networks Basics -- Introduction -- Multi-layer Perceptron -- Architecture Basics -- Parameter Learning -- Recurrent Neural Networks -- Architecture Basics -- Parameter Learning -- Link with Biological Vision -- Biological Neuron -- Computational Model of a Neuron -- Artificial vs. Biological Neuron -- Convolutional Neural Network -- Introduction -- Network Layers -- Pre-processing -- Convolutional Layers -- Pooling Layers -- Nonlinearity -- Fully Connected Layers -- Transposed Convolution Layer -- Region of Interest Pooling -- Spatial Pyramid Pooling Layer -- Vector of Locally Aggregated Descriptors Layer -- Spatial Transformer Layer -- CNN Loss Functions -- Cross-entropy Loss -- SVM Hinge Loss -- Squared Hinge Loss -- Euclidean Loss -- The ℓ1 Error -- Contrastive Loss -- Expectation Loss -- Structural Similarity Measure -- CNN Learning -- Weight Initialization -- Gaussian Random Initialization -- Uniform Random Initialization -- Orthogonal Random Initialization -- Unsupervised Pre-training -- Xavier Initialization -- ReLU Aware Scaled Initialization -- Layer-sequential Unit Variance -- Supervised Pre-training -- Regularization of CNN -- Data Augmentation -- Dropout -- Drop-connect -- Batch Normalization -- Ensemble Model Averaging
    Content: The ℓ2 Regularization -- The ℓ1 Regularization -- Elastic Net Regularization -- Max-norm Constraints -- Early Stopping -- Gradient-based CNN Learning -- Batch Gradient Descent -- Stochastic Gradient Descent -- Mini-batch Gradient Descent -- Neural Network Optimizers -- Momentum -- Nesterov Momentum -- Adaptive Gradient -- Adaptive Delta -- RMSprop -- Adaptive Moment Estimation -- Gradient Computation in CNNs -- Analytical Differentiation -- Numerical Differentiation -- Symbolic Differentiation -- Automatic Differentiation -- Understanding CNN through Visualization -- Visualizing Learned Weights -- Visualizing Activations -- Visualizations based on Gradients -- Examples of CNN Architectures -- LeNet -- AlexNet -- Network in Network -- VGGnet -- GoogleNet -- ResNet -- ResNeXt -- FractalNet -- DenseNet -- Applications of CNNs in Computer Vision -- Image Classification -- PointNet -- Object Detection and Localization -- Region-based CNN -- Fast R-CNN -- Regional Proposal Network (RPN) -- Semantic Segmentation -- Fully Convolutional Network (FCN) -- Deep Deconvolution Network (DDN) -- DeepLab -- Scene Understanding -- DeepContext -- Learning Rich Features from RGB-D Images -- PointNet for Scene Understanding -- Image Generation -- Generative Adversarial Networks (GANs) -- Deep Convolutional Generative Adversarial Networks (DCGANs) -- Super Resolution Generative Adversarial Network (SRGAN) -- Video-based Action Recognition -- Action Recognition From Still Video Frames -- Two-stream CNNs -- Long-term Recurrent Convolutional Network (LRCN) -- Deep Learning Tools and Libraries -- Caffe -- TensorFlow -- MatConvNet -- Torch7 -- Theano -- Keras -- Lasagne -- Marvin -- Chainer -- PyTorch -- Conclusion -- Bibliography -- Authors' Biographies -- Blank Page
    Additional Edition: ISBN 9781681730219
    Additional Edition: ISBN 9781681732787
    Additional Edition: Also available as a print edition: Khan, Salman: A guide to convolutional neural networks for computer vision. [San Rafael, Calif.]: Morgan & Claypool Publishers, 2018, ISBN 9781681730219
    Language: English
    Keywords: Electronic books
    Author information: Bennamoun, Mohammed 1961-
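    The table of contents above lists cross-entropy loss and momentum-based optimization; as a purely illustrative aid (not code from the book), the following Python/NumPy sketch shows the softmax cross-entropy of one example and a single SGD-with-momentum update. Shapes, learning rate, and momentum coefficient are toy assumptions.

        # Illustrative sketch only: softmax cross-entropy and one SGD-with-momentum step.
        import numpy as np

        def softmax_cross_entropy(logits, label):
            """Cross-entropy of a single example; logits is a 1-D vector of class scores."""
            shifted = logits - logits.max()                     # subtract max for numerical stability
            log_probs = shifted - np.log(np.exp(shifted).sum())
            return -log_probs[label]

        def sgd_momentum_step(w, grad, velocity, lr=0.01, mu=0.9):
            """One update: v <- mu*v - lr*grad, then w <- w + v."""
            velocity = mu * velocity - lr * grad
            return w + velocity, velocity

        logits = np.array([2.0, -1.0, 0.5])
        print(softmax_cross_entropy(logits, label=0))           # loss if class 0 is correct

        w, v = np.zeros(3), np.zeros(3)
        w, v = sgd_momentum_step(w, grad=np.array([0.1, -0.2, 0.05]), velocity=v)
        print(w)                                                # parameters after one step

    In practice the gradient comes from backpropagation through the network; the sketch isolates just the loss and the update rule.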
  • 5
    UID: b3kat_BV044311036
    Format: 1 online resource (xviii, 338 pages)
    ISBN: 9781609606268
    Series Statement: Premier reference source
    Content: "This book provides relevant theoretical foundations, and disseminates new research findings and expert views on the remaining challenges in ontology learning, discussing artificial intelligence, knowledge acquisition, knowledge representation and reasoning, text mining, information extraction, and ontology learning"--
    Note: Includes bibliographical references , 1. Evidence sources, methods and use cases for learning lightweight domain ontologies / Albert Weichselbraun, Gerhard Wohlgenannt and Arno Scharl -- 2. An overview of shallow and deep natural language processing for ontology learning / Amal Zouaq -- 3. Topic extraction for ontology learning / Marian-Andrei Rizoiu and Julien Velcin -- 4. A cognitive-based approach to identify topics in text using the web as a knowledge source / Louis Massey and Wilson Wong -- 5. Named entity recognition for ontology population using background knowledge from Wikipedia / Ziqi Zhang and Fabio Ciravegna -- 6. User-centered maintenance of concept hierarchies / Kai Eckert, Robert Meusel and Heiner Stuckenschmidt -- 7. Learning SKOS relations for terminological ontologies from text / Wei Wang, Payam M. Barnaghi and Andrzej Bargiela -- 8. Incorporating correlations among gene ontology terms into predicting protein functions / Pingzhao Hu, Hui Jiang and Andrew Emili -- 9. GO-based term semantic similarity / Marco A. Alvarez, Xiaojun Qi and Changhui Yan -- 10. Ontology learning and the humanities / Toby Burrows -- 11. Ontology-based knowledge capture and sharing in enterprise organizations / Aba-Sah Dadzie, Victoria Uren and Fabio Ciravegna -- 12. Automated learning of social ontologies / Konstantinos Kotis and Andreas Papasalouros -- 13. Mining parallel knowledge from comparable patents / Bin Lu ... [et al.] -- 14. Cross-language ontology learning / Hans Hjelm and Martin Volk
    Additional Edition: Also available as a print edition, ISBN 978-1-60960-625-1
    Additional Edition: Also available as a print edition, ISBN 1-60960-625-6
    Language: English
    Keywords: Ontology ; Data mining ; Artificial intelligence ; Information retrieval ; World Wide Web 2.0 ; Collection of articles
    URL: Full text (URL of the original publisher)
    Author information: Bennamoun, Mohammed 1961-
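    The chapter list above covers learning ontological relations from text; as a purely illustrative aid (not code from the book), the following Python sketch mines candidate broader/narrower term pairs from raw text with a single "X such as Y" lexical pattern. The pattern and the sample sentence are assumptions chosen only to make the idea concrete.

        # Illustrative sketch only: naive pattern-based extraction of candidate is-a pairs.
        import re

        PATTERN = re.compile(r"(\w+) such as ((?:\w+, )*\w+(?: and \w+)?)")

        def candidate_isa_pairs(text):
            """Yield (narrower_term, broader_term) candidates from one naive pattern."""
            for match in PATTERN.finditer(text):
                broader = match.group(1).lower()
                for term in re.split(r",\s*|\s+and\s+", match.group(2)):
                    yield (term.strip().lower(), broader)

        sample = "Ontologies describe concepts such as genes, proteins and pathways."
        print(list(candidate_isa_pairs(sample)))
        # [('genes', 'concepts'), ('proteins', 'concepts'), ('pathways', 'concepts')]

    The chapters above describe far richer evidence sources and methods; this sketch only shows the simplest pattern-based starting point.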
  • 6
    UID: gbv_1823896340
    Format: 1 online resource (XIX, 187 p.)
    Edition: 1st ed. 2018.
    ISBN: 9783031018213
    Series Statement: Synthesis Lectures on Computer Vision
    Content: Preface -- Acknowledgments -- Introduction -- Features and Classifiers -- Neural Networks Basics -- Convolutional Neural Network -- CNN Learning -- Examples of CNN Architectures -- Applications of CNNs in Computer Vision -- Deep Learning Tools and Libraries -- Conclusion -- Bibliography -- Authors' Biographies.
    Content: Computer vision has become increasingly important and effective in recent years due to its wide-ranging applications in areas as diverse as smart surveillance and monitoring, health and medicine, sports and recreation, robotics, drones, and self-driving cars. Visual recognition tasks, such as image classification, localization, and detection, are the core building blocks of many of these applications, and recent developments in Convolutional Neural Networks (CNNs) have led to outstanding performance in these state-of-the-art visual recognition tasks and systems. As a result, CNNs now form the crux of deep learning algorithms in computer vision. This self-contained guide will benefit those who seek both to understand the theory behind CNNs and to gain hands-on experience in applying CNNs to computer vision. It provides a comprehensive introduction to CNNs, starting with the essential concepts behind neural networks: training, regularization, and optimization of CNNs. The book also discusses a wide range of loss functions, network layers, and popular CNN architectures, reviews the different techniques for the evaluation of CNNs, and presents some popular CNN tools and libraries that are commonly used in computer vision. Further, this text describes and discusses case studies related to the application of CNNs in computer vision, including image classification, object detection, semantic segmentation, scene understanding, and image generation. This book is ideal for undergraduate and graduate students, since no prior background knowledge in the field is required to follow the material, as well as for new researchers, developers, engineers, and practitioners who are interested in gaining a quick understanding of CNN models. (A brief illustrative convolution sketch follows this record.)
    Additional Edition: ISBN 9783031000782
    Additional Edition: ISBN 9783031006937
    Additional Edition: ISBN 9783031029493
    Additional Edition: Also available as a print edition, ISBN 9783031000782
    Additional Edition: Also available as a print edition, ISBN 9783031006937
    Additional Edition: Also available as a print edition, ISBN 9783031029493
    Language: English
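    The description above centers on convolutional layers and nonlinearities; as a purely illustrative aid (not code from the book), the following Python/NumPy sketch implements a naive single-channel "valid" convolution (cross-correlation, as CNN layers actually compute it) followed by a ReLU. The toy image and kernel are assumptions.

        # Illustrative sketch only: a sliding-window convolution plus a ReLU nonlinearity.
        import numpy as np

        def conv2d_valid(image, kernel):
            """Slide the kernel over the image and take dot products (no padding, stride 1)."""
            kh, kw = kernel.shape
            oh, ow = image.shape[0] - kh + 1, image.shape[1] - kw + 1
            out = np.empty((oh, ow))
            for i in range(oh):
                for j in range(ow):
                    out[i, j] = np.sum(image[i:i + kh, j:j + kw] * kernel)
            return out

        def relu(x):
            return np.maximum(x, 0.0)

        image = np.arange(25, dtype=float).reshape(5, 5)    # toy 5x5 "image"
        kernel = np.array([[-1.0, 0.0, 1.0]] * 3)           # simple vertical-edge filter
        print(relu(conv2d_valid(image, kernel)))            # 3x3 feature map

    A real CNN layer stacks many such kernels over multiple input channels and learns their weights by backpropagation; the sliding dot product above is the core operation.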
  • 7
    UID: almafu_9959767643902883
    Format: 1 online resource (XIV, 122 p., 36 illus., 31 illus. in color)
    Edition: 1st ed. 2019.
    ISBN: 981-13-0062-3
    Series Statement: Studies in Computational Intelligence, 783
    Content: This book offers an introduction to modern natural language processing using machine learning, focusing on how neural networks create a machine-interpretable representation of the meaning of natural language. Language is crucially linked to ideas – as Webster’s 1923 “English Composition and Literature” puts it: “A sentence is a group of words expressing a complete thought”. Thus the representation of sentences and the words that make them up is vital in advancing artificial intelligence and other “smart” systems currently being developed. Providing an overview of the research in the area, from Bengio et al.’s seminal work on a “Neural Probabilistic Language Model” in 2003, to the latest techniques, this book enables readers to gain an understanding of how the techniques are related and what is best for their purposes. As well as an introduction to neural networks in general and recurrent neural networks in particular, this book details the methods used for representing words, senses of words, and larger structures such as sentences or documents. The book highlights practical implementations and discusses many aspects that are often overlooked or misunderstood. The book includes thorough instruction on challenging areas such as hierarchical softmax and negative sampling, to ensure the reader fully and easily understands the details of how the algorithms function. Combining practical aspects with a more traditional review of the literature, it is directly applicable to a broad readership. It is an invaluable introduction for early graduate students working in natural language processing; a trustworthy guide for industry developers wishing to make use of recent innovations; and a sturdy bridge for researchers already familiar with linguistics or machine learning wishing to understand the other. (A brief illustrative sketch of a simple sentence representation follows this record.)
    Note: Introduction -- Machine Learning for Representations -- Current Challenges in Natural Language Processing -- Word Representations -- Word Sense Representations -- Phrase Representations -- Sentence representations and beyond -- Character-Based Representations -- Conclusion.
    Additional Edition: ISBN 981-13-0061-5
    Language: English
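    The description above notes that representing whole sentences is central to the book; as a purely illustrative aid (not code from the book), the following Python/NumPy sketch builds the simplest such representation, an average of word vectors. The tiny vocabulary and 4-dimensional embeddings are made-up toy values.

        # Illustrative sketch only: a bag-of-vectors sentence representation.
        import numpy as np

        rng = np.random.default_rng(42)
        vocab = ["a", "sentence", "is", "group", "of", "words"]
        embeddings = {word: rng.normal(size=4) for word in vocab}   # toy 4-d word vectors

        def sentence_vector(tokens, embeddings, dim=4):
            """Average the vectors of known tokens; unknown tokens are skipped."""
            vectors = [embeddings[t] for t in tokens if t in embeddings]
            return np.mean(vectors, axis=0) if vectors else np.zeros(dim)

        print(sentence_vector("a sentence is a group of words".split(), embeddings))

    Averaging discards word order; order-aware models such as the recurrent networks covered in the book address that limitation.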
  • 8
    UID: gbv_894513559
    Format: 1 online resource (2 volumes (xxiv, 850 pages)), illustrations
    Edition: Electronic reproduction. [S.l.]: HathiTrust Digital Library
    ISBN: 0780343670 , 0780343689 , 0780343654 , 0780343662 , 9780780343672 , 9780780343689 , 9780780343658 , 9780780343665
    Note: "IEEE catalog number 97CH36162"--Title page verso , Includes bibliographical references and index , Use copy Restrictions unspecified star MiAaHDL , Electronic reproduction , Master and use copy. Digital master created according to Benchmark for Faithful Digital Reproductions of Monographs and Serials, Version 1. Digital Library Federation, December 2002.
    Additional Edition: Print version: TENCON '97 (1997: Brisbane, Qld.), TENCON '97, Brisbane, Australia. [New York]: IEEE; Piscataway, NJ: IEEE Service Center, ©1997
    Language: English
    Keywords: Conference proceedings