Minimum Gamma-Divergence for Regression and Classification Problems

By (author): Shinto Eguchi

This book introduces the gamma-divergence, a measure of distance between probability distributions proposed by Fujisawa and Eguchi in 2008. The gamma-divergence has been extensively explored as a tool for robust estimation when the power index γ is positive. It can also be defined when the power index is negative, as long as an integrability condition is satisfied; with this in mind, the authors consider the gamma-divergence defined on a set of discrete distributions. The arithmetic, geometric, and harmonic means of the distribution ratios are closely connected with the gamma-divergence with a negative power index. In particular, the authors call the gamma-divergence with γ equal to −1 the geometric-mean (GM) divergence.
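For reference, the commonly cited form of the gamma-divergence, following Fujisawa and Eguchi (2008) and written here for discrete distributions p and q, is sketched below. This is our transcription from the general literature, not a formula quoted from the book.

```latex
% Gamma-divergence between discrete distributions p and q, as commonly
% defined following Fujisawa and Eguchi (2008). It is well defined for
% gamma outside {0, -1} whenever the three sums converge; the limit
% gamma -> 0 recovers the Kullback-Leibler divergence, and the case
% gamma = -1 (taken as a limit) is the GM divergence described above.
\[
  D_\gamma(p, q)
  = \frac{1}{\gamma(1+\gamma)} \log \sum_x p(x)^{1+\gamma}
  \;-\; \frac{1}{\gamma} \log \sum_x p(x)\, q(x)^{\gamma}
  \;+\; \frac{1}{1+\gamma} \log \sum_x q(x)^{1+\gamma}
\]
```

For negative γ the middle term involves q(x) raised to a negative power, which is why the integrability (here, summability) condition mentioned above is needed.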

The book begins with an overview of the gamma-divergence and its properties, and then turns to its applications in areas including machine learning, statistics, and ecology. The Bernoulli, categorical, Poisson, negative binomial, and Boltzmann distributions are discussed as typical examples. Furthermore, regression models that explicitly or implicitly assume these distributions for the dependent variable, as in generalized linear models, are treated with the minimum gamma-divergence method.
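As a concrete illustration of minimum gamma-divergence estimation in a GLM setting, here is a minimal sketch for logistic regression with a Bernoulli outcome. The objective is one standard form of the gamma-loss from the robust-divergence literature; the function names, the choice γ = 0.5, and the simulated data are our own illustration, not code or notation from the book.

```python
# Minimal sketch: robust logistic regression by minimum gamma-divergence.
# One standard form of the empirical gamma-loss; an illustration under
# stated assumptions, not the book's implementation.
import numpy as np
from scipy.optimize import minimize

def gamma_loss(beta, X, y, gamma=0.5):
    """Empirical gamma-loss for a Bernoulli model p(y=1|x) = sigmoid(x'beta).

    For gamma > 0 this downweights observations the model finds
    implausible, which is the source of robustness to outliers; as
    gamma -> 0 it matches maximum likelihood up to a monotone transform.
    """
    p = 1.0 / (1.0 + np.exp(-(X @ beta)))     # P(y = 1 | x)
    f_obs = np.where(y == 1, p, 1.0 - p)      # model prob. of observed y
    # Normalizer: sum of f^(1+gamma) over the two possible outcomes.
    norm = p ** (1 + gamma) + (1 - p) ** (1 + gamma)
    # Negative empirical gamma-cross-entropy (per-observation normalization).
    return -np.mean(f_obs ** gamma / norm ** (gamma / (1 + gamma)))

rng = np.random.default_rng(0)
X = np.column_stack([np.ones(200), rng.normal(size=200)])
true_beta = np.array([-0.5, 2.0])
y = rng.binomial(1, 1.0 / (1.0 + np.exp(-(X @ true_beta))))
y[:10] = 1 - y[:10]                           # contaminate a few labels
fit = minimize(gamma_loss, x0=np.zeros(2), args=(X, y, 0.5))
print(fit.x)                                  # robust estimate of beta
```

For γ close to 0 the fit approaches ordinary maximum likelihood; larger γ trades some efficiency for robustness to the contaminated labels.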

In ensemble learning, AdaBoost is derived from the exponential loss function in the weighted-majority-vote framework, and it is pointed out that the exponential loss is deeply connected to the GM divergence. In the Boltzmann machine, maximum likelihood estimation has to rely on approximation methods such as the mean-field approximation because computing the partition function is intractable. By working with the GM divergence and the exponential loss instead, it is shown that the partition function need not be calculated at all, so learning can be carried out without variational inference.
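The mechanism behind the last claim can be glossed in one line (our reading; the precise GM-divergence objective is developed in the book): the GM divergence is built from geometric means of probability ratios, and for a Boltzmann machine every such ratio is free of the partition function.

```latex
% For a Boltzmann machine p_theta(x) = exp(-E_theta(x)) / Z(theta),
% the intractable normalizer Z(theta) cancels in every probability
% ratio, so ratio-based objectives never require computing Z.
\[
  \frac{p_\theta(x)}{p_\theta(x')}
  = \frac{\exp(-E_\theta(x))/Z(\theta)}{\exp(-E_\theta(x'))/Z(\theta)}
  = \exp\bigl(E_\theta(x') - E_\theta(x)\bigr)
\]
```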

Current price €47.49
Original price €49.99
Save 5%

Will deliver when available. Publication date 17 Dec 2024

Product Details
  • Dimensions: 155 x 235mm
  • Publication Date: 17 Dec 2024
  • Publisher: Springer Verlag Singapore
  • Publication City/Country: Singapore
  • Language: English
  • ISBN13: 9789819788798

About Shinto Eguchi

Shinto Eguchi received his master's degree from Osaka University in 1979 and his Ph.D. from Hiroshima University, Japan, in 1984. He began his career as Assistant Professor at Hiroshima University in 1984, became Associate Professor at Shimane University in 1986, and served as Professor at The Institute of Statistical Mathematics from 1995 to 2020. He is currently Emeritus Professor at the Institute of Statistical Mathematics and the Graduate University for Advanced Studies. His research interests are primarily in statistics, including statistical machine learning, bioinformatics, information geometry, statistical ecology, parametric/semiparametric inference, and robust statistics.

His recent publications include:
  • A generalized quasi-linear mixed-effects model. Y. Saigusa, S. Eguchi, O. Komori. Statistical Methods in Medical Research 31(7), 1280-1291, 2022.
  • Robust self-tuning semiparametric PCA for contaminated elliptical distribution. H. Hung, S.Y. Huang, S. Eguchi. IEEE Transactions on Signal Processing 70, 5885-5897, 2022.
  • Minimum information divergence of Q-functions for dynamic treatment regimes. S. Eguchi. Information Geometry, 1-21, 2022.
