    UID: gbv_855501642
    Format: 1 Online-Ressource (xxi, 139 Seiten)
    Edition: First edition
    Edition: Also available in print
    ISBN: 1608459780, 9781608459780
    Series Statement: Synthesis lectures on human language technologies #28
    Content: This book provides a thorough introduction to the subfield of theoretical computer science known as grammatical inference from a computational linguistic perspective. Grammatical inference provides principled methods for developing computationally sound algorithms that learn structure from strings of symbols. The relationship to computational linguistics is natural because many research problems in computational linguistics are learning problems on words, phrases, and sentences: What algorithm can take as input some finite amount of data (for instance, a corpus, annotated or otherwise) and output a system that behaves "correctly" on specific tasks? Throughout the text, the key concepts of grammatical inference are interleaved with illustrative examples drawn from problems in computational linguistics. Special attention is paid to the notion of "learning bias." In the context of computational linguistics, such bias can be thought to reflect common (ideally universal) properties of natural languages. This bias can be incorporated either by identifying a learnable class of languages which contains the language to be learned or by using particular strategies for optimizing parameter values. Examples are drawn largely from two linguistic domains (phonology and syntax) which span major regions of the Chomsky Hierarchy (from regular to context-sensitive classes). The conclusion summarizes the major lessons and open questions that grammatical inference brings to computational linguistics.
    Content: 1. Studying learning -- 1.1 An overview of grammatical inference -- 1.2 Formal and empirical grammatical inference -- 1.3 Formal grammatical inference -- 1.3.1 Language and grammar -- 1.3.2 Language families -- 1.3.3 Learning languages efficiently -- 1.4 Empirical grammatical inference -- 1.4.1 Languages, grammars, and language families -- 1.4.2 Evaluation -- 1.5 Summary -- 1.6 Formal preliminaries --
    Content: 2. Formal learning -- 2.1 Introduction -- 2.1.1 The issues of learning -- 2.1.2 Learning scenarios -- 2.1.3 Learning grammars of languages -- 2.2 Learnability: definitions and paradigms -- 2.2.1 Blame the data, not the algorithm -- 2.2.2 A non-probabilistic setting: identification in the limit -- 2.2.3 An active learning setting -- 2.2.4 Introducing complexity -- 2.2.5 A probabilistic version of identification in the limit -- 2.2.6 Probably approximately correct (PAC) learning -- 2.3 Grammar formalisms -- 2.3.1 Finite-state machines recognizing strings -- 2.3.2 Probabilistic finite-state machines -- 2.3.3 Transducers -- 2.3.4 More complex formalisms -- 2.3.5 Dealing with trees and graphs -- 2.4 Is grammatical inference an instance of machine learning? -- 2.5 Summary --
    Content: 3. Learning regular languages -- 3.1 Introduction -- 3.2 Bias selection reduces the problem space -- 3.3 Regular grammars -- 3.4 State-merging algorithms -- 3.4.1 The problem of learning stress patterns -- 3.4.2 Merging states -- 3.4.3 Finite-state representations of finite samples -- 3.4.4 The state-merging theorem -- 3.5 State-merging as a learning bias -- 3.6 State-merging as inference rules -- 3.7 RPNI -- 3.7.1 How it works -- 3.7.2 Theoretical results -- 3.8 Regular relations -- 3.9 Learning stochastic regular languages -- 3.9.1 Stochastic languages -- 3.9.2 Structure of the class is deterministic and known a priori -- 3.9.3 Structure of the class is deterministic but not known a priori -- 3.9.4 Structure of the class is non-deterministic and not known a priori -- 3.10 Summary --
    Content: 4. Learning non-regular languages -- 4.1 Substitutability -- 4.1.1 Identifying structure -- 4.1.2 Learning using substitutability -- 4.2 Empirical approaches -- 4.2.1 Expanding and reducing approaches -- 4.2.2 Supervised and unsupervised approaches -- 4.2.3 Word-based and POS-based approaches -- 4.2.4 Description of empirical systems -- 4.2.5 Comparison of empirical systems -- 4.3 Issues for evaluation -- 4.3.1 Looks-good-to-me approach -- 4.3.2 Rebuilding known grammars -- 4.3.3 Compare against a treebank -- 4.3.4 Language membership -- 4.4 Formal approaches -- 4.5 Summary --
    Content: 5. Lessons learned and open problems -- 5.1 Summary -- 5.2 Lessons -- 5.3 Problems -- 5.3.1 Learning targets -- 5.3.2 Learning criteria -- 5.4 Resources -- 5.5 Final words -- Bibliography -- Author biographies
    Note: Includes bibliographical references (pages 121-136). Also available in print. System requirements: Adobe Acrobat Reader. Mode of access: World Wide Web.
    Additional Edition: ISBN 1608459772
    Additional Edition: ISBN 9781608459773
    Additional Edition: Print version: Heinz, Jeffrey. Grammatical inference for computational linguistics. San Rafael: Morgan & Claypool, 2014
    Language: English
    Subjects: Computer Science
    Keywords: Grammar ; Machine learning
    Author information: Heinz, Jeffrey 1974-