
william威廉亚洲官方 Online Academic Seminar: Learning rates for kernel-regularized regressions with general convex losses

Date posted: 2020/12/02

Time: 10:45–11:45 a.m., Saturday, December 5, 2020

Venue: Tencent Meeting, ID: 259 943 704

Join via the link below, or add it to your meeting list: https://meeting.tencent.com/s/8GbF4R00h1hf

Speaker: Prof. Sheng Baohuai (盛宝怀)


Abstract: In this talk, the speaker systematically presents a convex-analysis approach to bounding the learning rates of kernel-regularized regressions. The emphasis is on the speaker's own research results from recent years. The talk covers the following ten topics: (1) notions and results from convex analysis; (2) some probability inequalities; (3) kernel-regularized regression models; (4) the sample error and the approximation error; (5) the reasons for using refined convex losses; (6) learning rates with strongly convex losses; (7) two-sided reproducing kernel Banach spaces (RKBSs); (8) comparison inequalities for some convex losses; (9) examples of two-sided RKBSs; (10) learning rates associated with some refined losses.
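
For orientation, here is a minimal sketch of the generic setting; the notation below (loss $V$, regularization parameter $\lambda$, hypothesis space $\mathcal{H}_K$) is standard in the learning-theory literature and is not necessarily the speaker's exact formulation. Given samples $z=\{(x_i,y_i)\}_{i=1}^m$, the kernel-regularized estimator with a general convex loss $V$ is

\[
f_{z,\lambda} \;=\; \arg\min_{f \in \mathcal{H}_K}
\left\{ \frac{1}{m}\sum_{i=1}^{m} V\bigl(y_i, f(x_i)\bigr) \;+\; \lambda \|f\|_K^{2} \right\},
\]

where $\mathcal{H}_K$ is the reproducing kernel Hilbert space induced by the kernel $K$ (replaced by a two-sided RKBS in the Banach-space setting). Learning-rate analysis typically proceeds through the error decomposition

\[
\mathcal{E}(f_{z,\lambda}) - \mathcal{E}(f_\rho)
\;\le\;
\bigl[\mathcal{E}(f_{z,\lambda}) - \mathcal{E}_z(f_{z,\lambda})\bigr]
+ \bigl[\mathcal{E}_z(f_\lambda) - \mathcal{E}(f_\lambda)\bigr]
+ D(\lambda),
\]

where $\mathcal{E}$ and $\mathcal{E}_z$ denote the expected and empirical risks, $f_\lambda$ is the data-free regularized minimizer, the two brackets form the sample error, and $D(\lambda)$ is the approximation (regularization) error. Probability inequalities control the sample-error brackets, while convexity properties of $V$ (e.g. strong convexity and comparison inequalities) sharpen the resulting rates.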


About the speaker:
Sheng Baohuai (盛宝怀), male, born in April 1962, is a professor at Shaoxing University, holds a Ph.D., and supervises master's students. He is a second-tier member of Zhejiang Province's "151 Talents Project" and a member of the sixth cohort of Shaoxing City's top-notch talents. His research covers nonlinear optimization, approximation theory, and statistical learning theory. He has published more than 60 papers in these areas, 25 of them indexed in SCI, and has directed and completed two General Programs of the National Natural Science Foundation of China.

