Information Theoretic Models for Dependence Analysis and Missing Data Estimation
In this chapter, the maximum entropy principle is used to define an information theoretic measure of dependence, which quantifies the amount of dependence among the attributes in a contingency table. The relationship between this information theoretic dependence measure and the Chi-square statistic is discussed, and a generalization of the measure is also studied. Finally, Yates's method and maximum entropy estimation of missing data in the design of experiments are described and illustrated through practical problems with empirical data. An algorithm for estimating missing values in a fuzzy matrix is defined and applied to missing data estimation in a contingency table.
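The connection between an entropy-based dependence measure and the Chi-square statistic can be illustrated with the standard asymptotic relation χ² ≈ 2N·I(X;Y), where I is the mutual information (in nats) of the row and column attributes and N is the total count. The sketch below is not the authors' code; it is a minimal illustration of this well-known relation on a hypothetical 2×2 contingency table.

```python
import math

def mutual_information(table):
    """Mutual information I(X;Y) in nats, computed from a table of counts."""
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    mi = 0.0
    for i, row in enumerate(table):
        for j, n_ij in enumerate(row):
            if n_ij > 0:
                # p(i,j) * log( p(i,j) / (p(i)p(j)) )
                mi += (n_ij / n) * math.log(n_ij * n / (row_totals[i] * col_totals[j]))
    return mi

def chi_square(table):
    """Pearson Chi-square statistic for the same table of counts."""
    n = sum(sum(row) for row in table)
    row_totals = [sum(row) for row in table]
    col_totals = [sum(col) for col in zip(*table)]
    stat = 0.0
    for i, row in enumerate(table):
        for j, n_ij in enumerate(row):
            expected = row_totals[i] * col_totals[j] / n
            stat += (n_ij - expected) ** 2 / expected
    return stat

# Hypothetical contingency table of counts (N = 80)
table = [[30, 10], [10, 30]]
n = sum(sum(row) for row in table)
print(2 * n * mutual_information(table))  # 2N*I, approximately 20.93
print(chi_square(table))                  # Chi-square, exactly 20.0
```

For tables close to independence the two statistics agree even more closely, which is what makes the information theoretic measure a natural counterpart of the Chi-square statistic.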
Author(s) Details
Prof. D. S. Hooda
(Former PVC, Kurukshetra University), Honorary Professor of Mathematics at GJU of Science & Technology, Hisar-125001, India.
Dr. Parmil Kumar
Department of Statistics, University of Jammu, Jammu, India.
View Book :- https://stm.bookpi.org/CASTR-V5/article/view/1619