
Academic Lecture by Dr. Can Yang, Hong Kong University of Science and Technology

Source: School of Economics and Management, University of Electronic Science and Technology of China | 2019-04-18


Lecture Title: Deep Generative Learning via Variational Gradient Flow

Speaker: Dr. Can Yang (杨灿)

Time: 2:30 p.m., April 19, 2019

Venue: Room A105, Economics and Management Building, Qingshuihe Campus

About the Speaker:

Dr. Can Yang is an Assistant Professor at the Hong Kong University of Science and Technology (HKUST). He received his bachelor's and master's degrees in electronic engineering from Zhejiang University and his Ph.D. from the Department of Electronic and Computer Engineering at HKUST. He conducted postdoctoral research at the Yale School of Public Health, served as a research associate at the Yale School of Medicine, and was previously an Assistant Professor at Hong Kong Baptist University.

Dr. Yang's research interests include machine learning and statistical genetics. His honors include the 2012 Hong Kong Young Scientist Award. His work has been published in the American Journal of Human Genetics, Annals of Statistics, Bioinformatics, IEEE Transactions on Pattern Analysis and Machine Intelligence, PLoS Genetics, and Proceedings of the National Academy of Sciences. As of December 2018, his publications had received more than 2,300 citations on Google Scholar, with an h-index of 22 and an i10-index of 34.

Abstract:

Learning the generative model, i.e., the underlying data-generating distribution, from large amounts of data is one of the fundamental tasks in machine learning and statistics. Recent progress in deep generative models has provided novel techniques for unsupervised and semi-supervised learning, with broad applications ranging from image synthesis, semantic image editing, and image-to-image translation to low-level image processing. However, statistical understanding of deep generative models is still lacking, e.g., why the logD trick works well in training generative adversarial networks (GANs). In this talk, we introduce a general framework, variational gradient flow (VGrow), for learning a deep generative model to sample from the target distribution by combining the strengths of variational gradient flow on probability space, particle optimization, and deep neural networks. The proposed framework is applied to minimize the f-divergence between the evolving distribution and the target distribution. We prove that the particles driven by VGrow are guaranteed to converge to the target distribution asymptotically. Connections between the proposed VGrow method and other popular methods, such as VAEs, GANs and flow-based methods, are established within this framework, yielding new insights into deep generative learning. We also evaluate several commonly used f-divergences, including the Kullback-Leibler, Jensen-Shannon and Jeffrey divergences, as well as our newly discovered "logD" divergence, which serves as the objective function of the logD-trick GAN. Experimental results on benchmark datasets demonstrate that VGrow can generate high-fidelity images in a stable and efficient manner, achieving performance competitive with state-of-the-art GANs. This is joint work with Yuan Gao, Yuling Jiao, Yao Wang, Yang Wang and Shunkang Zhang.
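For context on the notation used in the abstract (standard background, not specific to the VGrow framework): for a target distribution P with density p, a model distribution Q with density q, and a convex function f satisfying f(1) = 0, the f-divergence is defined as

    D_f(P \,\|\, Q) \;=\; \int q(x)\, f\!\left(\frac{p(x)}{q(x)}\right) dx,

where f(t) = t\log t recovers the Kullback-Leibler divergence and f(t) = t\log t - (t+1)\log\frac{t+1}{2} recovers the Jensen-Shannon divergence (up to a factor of 2). The "logD trick" mentioned above refers to the non-saturating GAN generator loss, which maximizes \mathbb{E}[\log D(G(z))] instead of minimizing \mathbb{E}[\log(1 - D(G(z)))].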

All faculty members and students are welcome to attend!

Personnel Office, School of Economics and Management

April 17, 2019


Original notice: http://www.mgmt.uestc.edu.cn/Document/ArticlePage?Id=41787
