Entropy-Based Goodness-of-Fit Testing for Multivariate Models
Published in Mathematics and Statistics
In this work, we develop goodness-of-fit testing procedures grounded in Tsallis entropy, targeting multivariate exponential-power (generalized Gaussian) and q-Gaussian distributions. The proposed framework contrasts the closed-form Tsallis entropy implied by the null hypothesis with a non-parametric k-nearest-neighbor estimate computed from the sample, yielding a flexible, distribution-sensitive measure of model adequacy.
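For readers who want to experiment with the idea, below is a minimal Python sketch of a kNN Tsallis-entropy estimator in the spirit of the well-known Leonenko–Pronzato–Savani construction. The estimator actually used in the paper may differ in its exact form and tuning; this is an illustrative assumption, not the paper's implementation.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gamma

def tsallis_entropy_knn(x, q, k=5):
    """kNN estimate of the Tsallis entropy S_q = (1 - I_q) / (q - 1),
    where I_q = integral of f^q, following a Leonenko-Pronzato-Savani-type
    construction (illustrative sketch, not the paper's exact estimator).
    x : (n, d) array of observations; q != 1; k chosen so that k + 1 - q > 0.
    """
    n, d = x.shape
    # Distance from each point to its k-th nearest neighbor
    # (query returns the point itself in column 0, hence k + 1 neighbors).
    rho = cKDTree(x).query(x, k=k + 1)[0][:, k]
    # Volume of the unit ball in R^d.
    v_d = np.pi ** (d / 2) / gamma(d / 2 + 1)
    # Bias-correcting constant from the order statistics of kNN distances.
    c_k = (gamma(k) / gamma(k + 1 - q)) ** (1.0 / (1.0 - q))
    # Plug-in estimate of I_q averaged over the sample.
    i_q = np.mean(((n - 1) * c_k * v_d * rho ** d) ** (1.0 - q))
    return (1.0 - i_q) / (q - 1.0)
```

A test statistic can then be built from the discrepancy between this estimate and the closed-form null entropy evaluated at fitted parameters.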
We establish key theoretical properties, including consistency and mean-square convergence of the estimator, and discuss an asymptotic-normality regime as the entropy parameter approaches the Shannon limit (q → 1). Since analytical null distributions are often intractable, critical values are calibrated using parametric bootstrap techniques, ensuring reliable inference across dimensions and parameter settings.
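As an illustration of the calibration step, here is a generic parametric-bootstrap sketch. The helper names (statistic, fit_null, sample_null) are hypothetical placeholders standing in for the paper's specific statistic and null model, not an actual API.

```python
import numpy as np

def bootstrap_critical_value(x, statistic, fit_null, sample_null,
                             n_boot=999, alpha=0.05, rng=None):
    """Parametric-bootstrap critical value for a goodness-of-fit statistic.
    statistic   : callable, data -> test statistic (large values reject)
    fit_null    : callable, data -> fitted null-model parameters
    sample_null : callable, (params, n, rng) -> simulated sample under H0
    """
    rng = np.random.default_rng(rng)
    params = fit_null(x)                      # estimate H0 parameters from data
    n = len(x)
    t_boot = np.empty(n_boot)
    for b in range(n_boot):
        x_b = sample_null(params, n, rng)     # resample under the fitted null
        t_boot[b] = statistic(x_b)            # recompute the statistic
    return np.quantile(t_boot, 1.0 - alpha)   # upper-alpha critical value

# Reject H0 at level alpha if statistic(x) exceeds the returned value.
```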
Extensive Monte Carlo experiments assess the empirical size, power, and computational efficiency of the proposed tests. The results highlight the advantages of entropy-based evaluation in capturing subtle distributional discrepancies that may be overlooked by traditional goodness-of-fit approaches. An applied example further illustrates practical calibration and sensitivity considerations.
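Empirical size and power figures of this kind are typically estimated as Monte Carlo rejection frequencies; the short generic sketch below shows the pattern (again an assumption for illustration, not the paper's experimental code).

```python
import numpy as np

def rejection_rate(sample_h, test, n=200, n_rep=1000, rng=None):
    """Monte Carlo rejection frequency of a test.
    sample_h : callable, (n, rng) -> dataset drawn from a chosen hypothesis
    test     : callable, data -> True if H0 is rejected
    Under H0 this estimates empirical size; under an alternative, power.
    """
    rng = np.random.default_rng(rng)
    return np.mean([test(sample_h(n, rng)) for _ in range(n_rep)])
```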
🔗 Article link: https://www.nature.com/articles/s41598-025-08110-2