Academic Salon: Softplus Regressions and Convex Polytopes

By: Center for Faculty Development / Source: Faculty Affairs Department of the Party Committee, Human Resources Department / 2016-12-08

  To strengthen academic exchange across disciplines, build a platform for faculty academic exchange, and promote faculty scholarship and interdisciplinary collaboration, the Center for Faculty Development is holding a series of interdisciplinary academic salons.

  For this session, the Center for Faculty Development has invited Dr. Mingyuan Zhou from The University of Texas at Austin to share his research on softplus regressions with our faculty and students. The arrangements are as follows; all interested faculty and students are welcome to attend:

  1. Time: 10:00 a.m., Tuesday, December 13, 2016

  2. Venue: Binnuo Café, Economics and Management Building, Qingshuihe Campus

  3. Topic: Softplus Regressions and Convex Polytopes

  4. Speaker: Dr. Mingyuan Zhou (Assistant Professor, The University of Texas at Austin)

  5. Host: Prof. Zenglin Xu, recipient of the national Young Thousand Talents Program

  6. Organizers: School of Computer Science and Engineering; Statistical Machine Intelligence and Learning Lab (SMILE Lab)

  7. Abstract:

  To construct flexible nonlinear predictive distributions, I introduce a family of softplus-function-based regression models that convolve, stack, or combine both operations by convolving countably infinite stacked gamma distributions, whose scales depend on the covariates. Generalizing logistic regression, which uses a single hyperplane to partition the covariate space into two halves, softplus regressions employ multiple hyperplanes to construct a confined space, related to a single convex polytope defined by the intersection of multiple half-spaces or to a union of multiple convex polytopes, to separate one class from the other. The gamma process is introduced to support the convolution of countably infinite (stacked) covariate-dependent gamma distributions. For Bayesian inference, Gibbs sampling derived via novel data augmentation and marginalization techniques is used to deconvolve and/or demix the highly complex nonlinear predictive distribution. Example results demonstrate that softplus regressions provide flexible nonlinear decision boundaries, achieving classification accuracies comparable to those of kernel support vector machines while requiring significantly less computation for out-of-sample prediction.
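  For readers who want a concrete feel for the model family, the Python sketch below illustrates the sum-softplus form implied by the abstract: the class-1 probability is built from K hyperplanes as P(y=1|x) = 1 - exp(-Σ_k softplus(xᵀw_k + b_k)), which reduces to logistic regression when K = 1 and, for K > 1, keeps the probability low only inside the convex polytope where every hyperplane score is strongly negative. The weights, the finite truncation to K hyperplanes, and the toy data are illustrative assumptions only; this is not Dr. Zhou's gamma-process construction or Gibbs sampler.

```python
import numpy as np

def softplus(z):
    """Numerically stable softplus, log(1 + exp(z))."""
    return np.logaddexp(0.0, z)

def sum_softplus_prob(X, W, b):
    """P(y=1 | x) = 1 - exp(-sum_k softplus(x . w_k + b_k)).

    K = 1 recovers logistic regression, since
    1 - exp(-log(1 + e^s)) = e^s / (1 + e^s) = sigmoid(s).
    For K > 1, the probability is near zero only when every
    hyperplane score is strongly negative, i.e. inside the convex
    polytope formed by intersecting the K half-spaces
    {x : x . w_k + b_k < 0}.
    """
    scores = X @ W.T + b                      # shape (n, K)
    return 1.0 - np.exp(-softplus(scores).sum(axis=1))

# Toy usage (assumed parameters): two hyperplanes whose intersection
# of negative half-spaces (x1 < 0 and x2 < 0) is the region where
# P(y=1 | x) stays small.
W = np.array([[1.0, 0.0],                     # hyperplane 1: score = x1
              [0.0, 1.0]])                    # hyperplane 2: score = x2
b = np.zeros(2)
X = np.array([[-3.0, -3.0],                   # deep inside the polytope
              [ 3.0, -3.0],                   # outside (x1 > 0)
              [ 3.0,  3.0]])                  # far outside
print(sum_softplus_prob(X, W, b))             # roughly [0.09, 0.95, 1.00]
```

  The sketch only evaluates a fixed set of hyperplanes; in the talk's setting the number of hyperplanes is handled nonparametrically through the gamma process and the parameters are inferred by Gibbs sampling.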

 

 

                    Center for Faculty Development, Human Resources Department

                      December 8, 2016


Editor:  / Reviewed by: Luo Sha  / Published by: Luo Sha