Entropy-Based Goodness-of-Fit Testing for Multivariate Models
Published in Mathematics and Statistics
In this work, we develop goodness-of-fit testing procedures grounded in Tsallis entropy, targeting multivariate exponential-power (generalized Gaussian) and q-Gaussian distributions. The proposed framework compares closed-form entropy expressions under the null hypothesis with non-parametric k-nearest-neighbor estimators, enabling flexible and distribution-sensitive model assessment.
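The paper's exact estimator is not reproduced here, but the non-parametric k-nearest-neighbor approach it describes can be sketched with the classical Leonenko–Pronzato–Savani construction for Tsallis entropy. The sketch below assumes q ≠ 1, k + 1 − q > 0, and no duplicate sample points; function and variable names are illustrative.

```python
import numpy as np
from scipy.spatial import cKDTree
from scipy.special import gammaln

def knn_tsallis_entropy(X, q, k=5):
    """k-NN estimate of Tsallis entropy of order q (q != 1),
    in the Leonenko-Pronzato-Savani style (an illustrative sketch,
    not the paper's exact estimator)."""
    X = np.asarray(X, dtype=float)
    n, d = X.shape
    # distance to the k-th nearest neighbour, excluding the point itself
    rho = cKDTree(X).query(X, k=k + 1)[0][:, k]
    # log-volume of the d-dimensional unit ball
    log_vd = (d / 2) * np.log(np.pi) - gammaln(d / 2 + 1)
    # bias-correction constant C_k = [Gamma(k) / Gamma(k + 1 - q)]^{1/(1-q)}
    log_ck = (gammaln(k) - gammaln(k + 1 - q)) / (1 - q)
    # zeta_i = (n - 1) * C_k * V_d * rho_i^d; I_q = mean(zeta^{1-q})
    log_zeta = np.log(n - 1) + log_ck + log_vd + d * np.log(rho)
    i_q = np.mean(np.exp((1 - q) * log_zeta))
    # Tsallis entropy H_q = (1 - I_q) / (q - 1)
    return (1.0 - i_q) / (q - 1.0)
```

A goodness-of-fit statistic then compares this estimate against the closed-form Tsallis entropy of the hypothesized model, e.g. for a standard 1-D Gaussian at q = 2 the closed form is 1 − 1/(2√π).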
We establish key theoretical properties, including consistency and mean-square convergence of the estimator, and discuss an asymptotic normality regime as the entropy parameter approaches the Shannon limit. Since analytical null distributions are often intractable, critical values are calibrated using parametric bootstrap techniques, ensuring reliable inference across dimensions and parameter settings.
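The parametric bootstrap calibration described above follows a standard recipe: fit the null model, resample synthetic data from the fit, recompute the statistic, and take an upper quantile as the critical value. A minimal generic sketch, with a placeholder kurtosis-based statistic standing in for the paper's entropy-based one:

```python
import numpy as np

def bootstrap_critical_value(x, statistic, fit_null, sample_null,
                             level=0.05, n_boot=500, seed=0):
    """Parametric-bootstrap critical value for a goodness-of-fit statistic.

    statistic:   callable, data -> scalar test statistic
    fit_null:    callable, data -> parameter estimate under the null model
    sample_null: callable, (params, n, rng) -> synthetic sample of size n
    """
    rng = np.random.default_rng(seed)
    theta = fit_null(x)  # estimate null parameters once from the data
    # recompute the statistic on n_boot synthetic samples drawn under the null
    stats = [statistic(sample_null(theta, len(x), rng)) for _ in range(n_boot)]
    # reject H0 when the observed statistic exceeds this quantile
    return np.quantile(stats, 1.0 - level)
```

With a Gaussian null one would pass, e.g., `fit_null = lambda x: (np.mean(x), np.std(x))` and `sample_null = lambda th, n, r: r.normal(th[0], th[1], size=n)`; the observed statistic is then compared against the returned critical value.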
Extensive Monte Carlo experiments demonstrate the empirical size, power, and computational efficiency of the proposed tests. The results highlight the advantages of entropy-based evaluation in capturing subtle distributional discrepancies that may be overlooked by traditional goodness-of-fit approaches. An applied example further illustrates practical calibration and sensitivity considerations.
🔗 Article link: https://www.nature.com/articles/s41598-025-08110-2
Scientific Reports
An open access journal publishing original research from across all areas of the natural sciences, psychology, medicine and engineering.