Entropy-metric estimation of the small data models with stochastic parameters

HELIYON (2024)

Abstract
The formalization of dependencies between datasets, taking into account specific hypotheses about data properties, is a perennially relevant task, and one that is especially acute in the case of small data. The aim of the study is to formalize a procedure for calculating optimal estimates of the probability density functions of the parameters of linear and nonlinear, dynamic and static small data models, constructed under specific hypotheses about the properties of the studied object. The research methodology draws on probability theory and mathematical statistics, information theory, estimation theory, and stochastic mathematical programming methods. The mathematical apparatus presented in the article is based on the principle of maximizing information entropy over sets determined by a small number of censored measurements of "input" and "output" entities in the presence of noise. These data structures form the basis for formalizing linear and nonlinear, dynamic and static small data models with stochastic parameters, covering both controlled and noise-affected input and output measurement entities. For each of these small data model variants, the problem of determining optimal estimates of the parameter probability density functions is solved. The formulated optimization problems are reduced to the canonical form of a stochastic linear programming problem with probabilistic constraints.
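The maximum-entropy principle the abstract invokes can be illustrated in miniature. The sketch below is not the paper's formulation (which handles censored, noisy measurements and probabilistic constraints); it is a hypothetical minimal example that finds the maximum-entropy probability mass function on a small discrete support subject to a single mean constraint, exploiting the known Gibbs form of the solution, p_i ∝ exp(-λ x_i), and solving for the Lagrange multiplier λ by root finding.

```python
import numpy as np
from scipy.optimize import brentq

def maxent_pmf(support, mean):
    """Maximum-entropy pmf on `support` with a prescribed mean.

    The entropy-maximizing distribution under a mean constraint has the
    exponential (Gibbs) form p_i ∝ exp(-lam * x_i); we find the Lagrange
    multiplier `lam` that matches the target mean via Brent's method.
    """
    x = np.asarray(support, dtype=float)

    def mean_gap(lam):
        # Shift by x.min() for numerical stability; the shift cancels
        # in the normalization and does not change the pmf.
        w = np.exp(-lam * (x - x.min()))
        p = w / w.sum()
        return p @ x - mean

    lam = brentq(mean_gap, -10.0, 10.0)
    w = np.exp(-lam * (x - x.min()))
    return w / w.sum()

# Maximum-entropy pmf on {0,...,5} with mean 1.5: a decreasing
# geometric-like profile, the "least committed" distribution
# consistent with that single moment.
p = maxent_pmf(range(6), mean=1.5)
```

Loosely speaking, the paper applies the same variational idea at a higher level: the unknown densities of the model parameters are chosen to maximize entropy over the feasible set carved out by the small, censored measurement sample.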
Keywords
Information entropy, Machine learning, Small data model, Probability density functions estimation, Static stochastic model, Dynamic stochastic model, Parametric optimization