Abstract
We consider the general linear model y = Xβ + ε, denoted as M = {y, Xβ, V}, supplemented with a new, unobservable random vector y∗ coming from y∗ = X∗β + ε∗, where the covariance matrix of y∗ is known, as is the cross-covariance matrix between y∗ and y. A linear statistic Fy is called linearly sufficient for X∗β if there exists a matrix A such that AFy is the best linear unbiased estimator, BLUE, of X∗β. The concept of linear sufficiency with respect to a predictable random vector is defined in the corresponding way, but considering the best linear unbiased predictor, BLUP, instead of the BLUE. In this paper, we consider the linear sufficiency of Fy with respect to y∗, X∗β, and ε∗. We also apply our results to the linear mixed model. The concept of linear sufficiency was essentially introduced in the early 1980s by Baksalary, Kala, and Drygas. Recently, several papers providing further properties of linear sufficiency have been published by the present authors. Our aim is to provide an easy-to-read review of recent results, and while doing so, we go through some basic concepts related to linear sufficiency. As this is a review paper, we do not provide many proofs; instead, our goal is to explain and clarify the central results.
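To make the definitions above concrete, the following is a minimal LaTeX sketch of the partitioned setup and of linear sufficiency for estimation and prediction. The block labels V₁₂, V₂₁, V₂₂ for the known cross- and new-observation covariance matrices are our own assumed notation, not taken from the chapter itself.

```latex
% Minimal sketch of the setup described in the abstract.
% The partition labels V_{12}, V_{21}, V_{22} are assumed notation.
\documentclass{article}
\usepackage{amsmath,amssymb}
\begin{document}

The observed vector $y$ and the new, unobservable vector $y_{*}$ can be
stacked into one partitioned model,
\[
\mathcal{M}_{*} \;=\;
\left\{
\begin{pmatrix} y \\ y_{*} \end{pmatrix},\;
\begin{pmatrix} X\beta \\ X_{*}\beta \end{pmatrix},\;
\begin{pmatrix} V & V_{12} \\ V_{21} & V_{22} \end{pmatrix}
\right\},
\]
where $\operatorname{cov}(y_{*}) = V_{22}$ and
$\operatorname{cov}(y_{*}, y) = V_{21}$ are assumed known,
as stated in the abstract.

A linear statistic $Fy$ is linearly sufficient for $X_{*}\beta$ if
\[
\exists\, A:\quad A F y = \mathrm{BLUE}(X_{*}\beta),
\]
and, correspondingly, linearly sufficient for the predictable random
vector $y_{*}$ if
\[
\exists\, A:\quad A F y = \mathrm{BLUP}(y_{*}).
\]

\end{document}
```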
| Original language | English |
|---|---|
| Title of host publication | Multivariate, Multilinear and Mixed Linear Models |
| Editors | Katarzyna Filipiak, Augustyn Markiewicz, Dietrich von Rosen |
| Place of Publication | Switzerland |
| Publisher | Springer Cham |
| Pages | 265-318 |
| Volume | 1 |
| Edition | 1 |
| ISBN (Print) | 978-3-030-75493-8 |
| DOIs | |
| Publication status | Published - 2021 |