
Minimum divergence estimators based on grouped data

Full text at PDC

Publication date

2001

Publisher

Springer

Abstract

The paper considers statistical models with real-valued observations i.i.d. by $F(x, \theta_0)$ from a family of distribution functions $(F(x, \theta);\ \theta \in \Theta)$, $\Theta \subset \mathbb{R}^s$, $s \ge 1$. For random quantizations defined by sample quantiles $(F_n^{-1}(\lambda_1), \ldots, F_n^{-1}(\lambda_{m-1}))$ of arbitrary fixed orders $0 < \lambda_1 < \cdots < \lambda_{m-1} < 1$, estimators $\hat{\theta}_{\phi,n}$ of $\theta_0$ are studied which minimize $\phi$-divergences of the theoretical and empirical probabilities. Under appropriate regularity conditions, all these estimators are shown to be as efficient (first order, in the sense of Rao) as the MLE in the model quantized nonrandomly by $(F^{-1}(\lambda_1, \theta_0), \ldots, F^{-1}(\lambda_{m-1}, \theta_0))$. Moreover, the Fisher information matrix $I_m(\theta_0, \lambda)$ of the latter model with the equidistant orders $\lambda = (\lambda_j = j/m : 1 \le j \le m-1)$ arbitrarily closely approximates the Fisher information $F(\theta_0)$ of the original model when $m$ is appropriately large. Thus the random binning by a large number of quantiles of equidistant orders leads to appropriate estimates of the above type.
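
The following is a minimal sketch of the kind of estimator the abstract describes: data are grouped into cells cut at sample quantiles of equidistant orders $\lambda_j = j/m$, and the parameter is chosen to minimize a $\phi$-divergence between the model and empirical cell probabilities. It assumes the Kullback-Leibler member of the $\phi$-divergence family and a normal location-scale model; the function name grouped_kl_estimate, the Nelder-Mead optimizer, and the log-sigma reparametrization are illustrative choices, not taken from the paper.

import numpy as np
from scipy import stats, optimize

def grouped_kl_estimate(x, m=20, start=(0.0, 0.0)):
    """Minimize the Kullback-Leibler (phi(t) = t log t) divergence between the
    empirical and model cell probabilities, with cells cut at the sample
    quantiles of the equidistant orders lambda_j = j/m, j = 1, ..., m-1."""
    x = np.asarray(x, dtype=float)
    n = x.size
    lambdas = np.arange(1, m) / m                 # equidistant orders j/m
    edges = np.quantile(x, lambdas)               # sample quantiles F_n^{-1}(lambda_j)

    # Empirical cell probabilities: count observations in each of the m cells.
    cell = np.searchsorted(edges, x, side="right")
    p_hat = np.bincount(cell, minlength=m) / n

    all_edges = np.concatenate(([-np.inf], edges, [np.inf]))

    def kl(theta):
        mu, log_sigma = theta
        sigma = np.exp(log_sigma)                 # reparametrize so sigma > 0
        p_model = np.diff(stats.norm.cdf(all_edges, loc=mu, scale=sigma))
        p_model = np.clip(p_model, 1e-300, None)  # guard against empty model cells
        # 0 * log(0) is treated as 0 for empty empirical cells.
        return np.sum(p_hat * np.log(np.where(p_hat > 0, p_hat, 1.0) / p_model))

    res = optimize.minimize(kl, x0=start, method="Nelder-Mead")
    mu_hat, sigma_hat = res.x[0], np.exp(res.x[1])
    return mu_hat, sigma_hat

# Quick check on simulated data: consistent with the abstract's claim, a large m
# should give estimates close to the full-sample mean and standard deviation.
rng = np.random.default_rng(0)
sample = rng.normal(loc=2.0, scale=3.0, size=5000)
print(grouped_kl_estimate(sample, m=50))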
