This work treats the problem of estimating the predictive density of a random vector when both the mean vector and the variance are unknown. We prove that the reference predictive density in this setting is inadmissible under Kullback–Leibler loss in a nonasymptotic framework. Our result holds even when the dimension of the vector is strictly less than three, which is surprising in light of the known-variance setting, where inadmissibility arises only in dimension three or higher. Finally, we discuss the relationship between the prediction and estimation problems.
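For context, the Kullback–Leibler loss referred to above is the standard one for predictive density estimation: writing $p(\cdot \mid \mu, \sigma^2)$ for the true density of the future observation and $\hat{p}$ for a predictive density, the loss is the KL divergence from the truth to the estimate (the notation here is illustrative, as the article's own symbols are not shown):

```latex
L\bigl((\mu,\sigma^2),\,\hat{p}\bigr)
  \;=\;
  \int p(y \mid \mu, \sigma^2)\,
       \log \frac{p(y \mid \mu, \sigma^2)}{\hat{p}(y)}\, dy .
```

A predictive density is inadmissible under this loss if another predictive density attains risk (expected loss) no larger for every $(\mu,\sigma^2)$ and strictly smaller for some.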
Keywords: Bayes rule; Inadmissibility; Multivariate normal distribution; Prior distribution; Unknown variance