Algebraic Structures in Natural Language | Agenda Bookshop


Algebraic Structures in Natural Language addresses a central problem in cognitive science: the learning procedures through which humans acquire and represent natural language. Until recently, algebraic systems dominated the study of natural language in formal and computational linguistics, AI, and the psychology of language, with linguistic knowledge seen as encoded in formal grammars, model theories, proof theories, and other rule-driven devices. Recent work on deep learning has produced an increasingly powerful set of general learning mechanisms that do not rely on rule-based algebraic models of representation. The success of deep learning in NLP has led some researchers to question the role of algebraic models in the study of human language acquisition and linguistic representation. Psychologists and cognitive scientists have also been exploring explanations of language evolution and language acquisition that rely on probabilistic methods, social interaction, and information theory, rather than on formal models of grammar induction.

This book addresses the learning procedures through which humans acquire natural language, and the way in which they represent its properties. It brings together leading researchers from computational linguistics, psychology, behavioural science, and mathematical linguistics to consider the significance of non-algebraic methods for the study of natural language. The text represents a wide spectrum of views, from the claim that algebraic systems are largely irrelevant, to the contrary position that non-algebraic learning methods are engineering devices for efficiently identifying the patterns that underlying grammars and semantic models generate for natural language input. Interesting and important perspectives fall at intermediate points between these opposing approaches, some combining elements of both. The book will appeal to researchers and advanced students in each of these fields, as well as to anyone who wants to learn more about the relationship between computational models and natural language.

€82.99

Will deliver when available.

Product Details
  • Weight: 740g
  • Dimensions: 178 x 254mm
  • Publication Date: 23 Dec 2022
  • Publisher: Taylor & Francis Ltd
  • Publication City/Country: GB
  • Language: English
  • ISBN13: 9781032066547

About

Shalom Lappin is a Professor of Computational Linguistics at the University of Gothenburg, Professor of Natural Language Processing at Queen Mary University of London and Emeritus Professor of Computational Linguistics at King’s College London. His research focuses on the application of machine learning and probabilistic models to the representation and the acquisition of linguistic knowledge.

Jean-Philippe Bernardy is a researcher at the University of Gothenburg. His main research interest is in interpretable linguistic models, in particular, those built from first principles of algebra, probability and geometry.
