Export
  • 1
    UID: almahu_9948030301902882
    Format: XIV, 122 p., 36 illus. (31 illus. in color), online resource.
    ISBN: 9789811300622
    Series Statement: Studies in Computational Intelligence, 783
    Content: This book offers an introduction to modern natural language processing using machine learning, focusing on how neural networks create a machine-interpretable representation of the meaning of natural language. Language is crucially linked to ideas – as Webster’s 1923 “English Composition and Literature” puts it: “A sentence is a group of words expressing a complete thought”. Thus the representation of sentences and the words that make them up is vital in advancing artificial intelligence and other “smart” systems currently being developed. Providing an overview of the research in the area, from Bengio et al.’s seminal work on a “Neural Probabilistic Language Model” in 2003 to the latest techniques, this book enables readers to gain an understanding of how the techniques are related and what is best for their purposes. As well as an introduction to neural networks in general and recurrent neural networks in particular, this book details the methods used for representing words, senses of words, and larger structures such as sentences or documents. The book highlights practical implementations and discusses many aspects that are often overlooked or misunderstood. The book includes thorough instruction on challenging areas such as hierarchical softmax and negative sampling, to ensure the reader fully and easily understands the details of how the algorithms function. Combining practical aspects with a more traditional review of the literature, it is directly applicable to a broad readership. It is an invaluable introduction for early graduate students working in natural language processing; a trustworthy guide for industry developers wishing to make use of recent innovations; and a sturdy bridge for researchers already familiar with linguistics or machine learning wishing to understand the other.
    Note: Introduction -- Machine Learning for Representations -- Current Challenges in Natural Language Processing -- Word Representations -- Word Sense Representations -- Phrase Representations -- Sentence Representations and Beyond -- Character-Based Representations -- Conclusion.
    In: Springer eBooks
    Additional Edition: Printed edition: ISBN 9789811300615
    Additional Edition: Printed edition: ISBN 9789811300639
    Additional Edition: Printed edition: ISBN 9789811343209
    Language: English
    Subjects: Computer Science
    URL: Full text (URL of the original publisher)
  • 2
    UID: b3kat_BV019423435
    Format: XII, 385 p., diagrams.
    ISBN: 1584883103
    Series Statement: Discrete mathematics and its applications
    Content: "Without abandoning the theoretical foundations, Fundamentals of Information Theory and Coding Design presents working algorithms and implementation that can be used to design and create real systems. The emphasis is on the underlying concepts governing information theory and the mathematical basis for modern coding systems, but the authors also provide the practical details of important codes like Reed-Solomon, BCH, and Turbo codes. Also setting this book apart are discussions on the cascading of information channels and the additivity of information, the details of arithmetic coding, and the connection between coding of extensions and Markov modeling." "Complete, balanced coverage, an outstanding format, and a wealth of examples and exercises make this an outstanding text for upper-level students in computer science, mathematics, and engineering, and a valuable reference for telecommunications engineers and coding theory researchers."--BOOK JACKET.
    Note: Includes bibliographical references and index
    Language: English
    Subjects: Computer Science , Mathematics
    Keywords: Informationstheorie (information theory) ; Codierungstheorie (coding theory)
  • 3
    UID: almahu_BV045225753
    Format: xiv, 122 pages, illustrations, diagrams (partly in color).
    ISBN: 978-981-13-0061-5
    Series Statement: Studies in computational intelligence Volume 783
    Additional Edition: Also available as an online edition, ISBN 978-981-13-0062-2
    Language: English
    Subjects: Computer Science
    URL: Cover
    Author information: Bennamoun, Mohammed, 1961-
  • 4
    UID: b3kat_BV023642422
    Format: XII, 385 p., ill.
    ISBN: 1584883103
    Series Statement: Discrete mathematics and its applications
    Language: Undetermined
    Subjects: Engineering
    Keywords: Informationstheorie (information theory) ; Codierungstheorie (coding theory) ; Lehrbuch (textbook)
  • 5
    UID: almafu_9959767643902883
    Format: 1 online resource (XIV, 122 p. 36 illus., 31 illus. in color.)
    Edition: 1st ed. 2019.
    ISBN: 981-13-0062-3
    Series Statement: Studies in Computational Intelligence, 783
    Content: This book offers an introduction to modern natural language processing using machine learning, focusing on how neural networks create a machine-interpretable representation of the meaning of natural language. Language is crucially linked to ideas – as Webster’s 1923 “English Composition and Literature” puts it: “A sentence is a group of words expressing a complete thought”. Thus the representation of sentences and the words that make them up is vital in advancing artificial intelligence and other “smart” systems currently being developed. Providing an overview of the research in the area, from Bengio et al.’s seminal work on a “Neural Probabilistic Language Model” in 2003 to the latest techniques, this book enables readers to gain an understanding of how the techniques are related and what is best for their purposes. As well as an introduction to neural networks in general and recurrent neural networks in particular, this book details the methods used for representing words, senses of words, and larger structures such as sentences or documents. The book highlights practical implementations and discusses many aspects that are often overlooked or misunderstood. The book includes thorough instruction on challenging areas such as hierarchical softmax and negative sampling, to ensure the reader fully and easily understands the details of how the algorithms function. Combining practical aspects with a more traditional review of the literature, it is directly applicable to a broad readership. It is an invaluable introduction for early graduate students working in natural language processing; a trustworthy guide for industry developers wishing to make use of recent innovations; and a sturdy bridge for researchers already familiar with linguistics or machine learning wishing to understand the other.
    Note: Introduction -- Machine Learning for Representations -- Current Challenges in Natural Language Processing -- Word Representations -- Word Sense Representations -- Phrase Representations -- Sentence Representations and Beyond -- Character-Based Representations -- Conclusion.
    Additional Edition: ISBN 981-13-0061-5
    Language: English