
Probability and Statistics Seminar Series: The normalized expectation-maximization (N-EM) algorithm

Posted: December 24, 2021, 15:19

Title: The normalized expectation-maximization (N-EM) algorithm

Speaker: Prof. Guoliang TIAN (Southern University of Science and Technology)

Time: 10:00, December 29, 2021

Venue: Lecture Hall 307, School of Mathematics and Statistics

Abstract:

Although the expectation-maximization (EM) algorithm is a powerful optimization tool in statistics, it can only be applied to missing/incomplete data problems or to problems with a latent-variable structure. It is well known that the introduction of latent variables (or data augmentation) is an art; i.e., it can only be done case by case. In this paper, we propose a new algorithm, the so-called normalized EM (N-EM) algorithm, for a class of log-likelihood functions with integrals. As an extension of the original EM algorithm, the N-EM algorithm inherits all advantages of EM-type algorithms and consists of three steps: a normalization step (N-step), an expectation step (E-step) and a maximization step (M-step), where the N-step constructs a normalized density function (ndf), the E-step computes a well-established surrogate Q-function, and the M-step maximizes the Q-function as in the original EM algorithm. The ascent property, the best choice of the ndf, and N-EM algorithms with a difficult M-step are also explored. Through multiple real applications, we show that the N-EM algorithm can solve some problems which cannot be addressed by the EM algorithm. Moreover, for problems to which the EM algorithm can be applied (often case by case), the N-EM algorithm can be employed in a unified framework. Numerical experiments are performed and convergence properties are also established. [This is joint work with Xuanyu LIU, Kam Chuen YUEN and Chi ZHANG.]
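To make the three-step structure concrete, the following is a minimal numerical sketch in Python based only on the description in the abstract: the N-step is read as normalizing the integrand of the marginal likelihood at the current parameter value to obtain the ndf, the E-step averages the log-integrand under that ndf, and the M-step maximizes the resulting surrogate. The toy random-effects model, the integration grid, and all function names below are illustrative assumptions, not the paper's actual examples or implementation.

import numpy as np

# Toy marginal likelihood with an integral:
#   L(theta) = prod_i  \int N(y_i | u, 1) N(u | theta, 1) du
# Marginally y_i ~ N(theta, 2), so the MLE is mean(y) -- a convenient check.
# Model, grid, and step implementations are illustrative assumptions only.

rng = np.random.default_rng(0)
y = rng.normal(loc=2.0, scale=np.sqrt(2.0), size=200)   # toy data
grid = np.linspace(-10.0, 10.0, 2001)                    # integration grid for u
du = grid[1] - grid[0]

def log_norm_pdf(x, mean, var):
    return -0.5 * np.log(2 * np.pi * var) - 0.5 * (x - mean) ** 2 / var

theta = 0.0                                               # initial value
for it in range(100):
    # N-step: for each y_i, normalize the integrand at the current theta
    # to obtain a normalized density function (ndf) g_i(u) on the grid.
    log_integrand = (log_norm_pdf(y[:, None], grid[None, :], 1.0)
                     + log_norm_pdf(grid[None, :], theta, 1.0))
    w = np.exp(log_integrand - log_integrand.max(axis=1, keepdims=True))
    g = w / (w.sum(axis=1, keepdims=True) * du)           # each row integrates to 1

    # E-step: the surrogate Q(theta | theta_t) depends on theta only through
    # sum_i \int g_i(u) log N(u | theta, 1) du, so E_{g_i}[U] suffices here.
    post_mean_u = (g * grid[None, :]).sum(axis=1) * du

    # M-step: for this toy model the maximizer of Q is available in closed form.
    theta_new = post_mean_u.mean()
    if abs(theta_new - theta) < 1e-8:
        break
    theta = theta_new

print(f"N-EM estimate: {theta:.4f}, direct MLE (mean of y): {y.mean():.4f}")

In this sketch the iteration converges to the sample mean, matching the direct MLE; in general the M-step would require a numerical optimizer rather than a closed-form update.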

About the speaker:

Dr. Guoliang Tian spent six years conducting medical statistics research at the University of Maryland in the United States and served for eight years as an associate professor in the Department of Statistics and Actuarial Science at the University of Hong Kong. Since June 2016 he has been a professor, doctoral supervisor, and associate head of the Department of Statistics and Data Science at Southern University of Science and Technology. His current research interests include the statistical analysis of continuous data on the (0, 1) interval and compositional data, as well as the analysis of multivariate zero-inflated count data. He has published 140 SCI papers in international journals, authored three English monographs, and published one English textbook with Science Press. He serves as an associate editor of four international statistics journals. He has led two NSFC General Program projects, participated in one NSFC Key Program project, and led one Shenzhen Stable Support General Program project.
