For this workshop we are pleased to welcome Dr. Jian Tang from the Montreal Institute for Learning Algorithms (Mila), who will present frontier algorithmic research on graph representation learning and reasoning, and Dr. Zheng Zhang from the Amazon AWS AI team in Shanghai, who will present work spanning the DGL system, frontier applications, and putting research into practice. The workshop is hosted by Assistant Professor Weinan Zhang of Shanghai Jiao Tong University.
Time: July 3 (Wednesday), 13:30-17:00
1、Title: Graph Representation Learning and Reasoning
Abstract: Graphs, a general type of data structure for capturing interconnected objects, are ubiquitous in a variety of disciplines and domains. This talk is divided into two parts. In the first part, I will introduce our work on learning node representations (LINE, WWW'15), extremely low-dimensional node representation learning for graph and high-dimensional data visualization (LargeVis, WWW'16), knowledge graph embedding (RotatE, ICLR'19), and a general, high-performance graph embedding system (GraphVite, WWW'19). In the second part, I will introduce our recent work on combining statistical relational learning and graph neural networks for prediction and reasoning on graphs (GMNN, ICML'19).
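To make the knowledge graph embedding part concrete: RotatE models each relation as a rotation in complex vector space, scoring a triple (h, r, t) by how closely rotating the head embedding by the relation's phase lands on the tail embedding. The sketch below is an illustrative NumPy rendering of that scoring function, not the authors' implementation; the function name and the L1 distance choice are our assumptions.

```python
import numpy as np

def rotate_score(h, r_phase, t):
    """RotatE-style score for a triple (h, r, t).

    h, t    : complex-valued entity embeddings, shape (dim,)
    r_phase : rotation angles for the relation, shape (dim,)

    The relation is the unit-modulus complex vector exp(i * r_phase);
    a perfect match (t == h rotated by r) scores 0, the maximum.
    """
    r = np.exp(1j * r_phase)                   # element-wise rotation
    return -np.linalg.norm(h * r - t, ord=1)   # negative distance as score
```

Usage: with `h = [1, i]` and phases `[pi/2, 0]`, the rotated head is `[i, i]`, so a tail of `[i, i]` scores exactly 0, while any other tail scores strictly lower.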
Bio: Dr. Jian Tang has been an assistant professor at Mila (Quebec AI Institute) and HEC Montreal since December 2017. He was named to the first cohort of Canada CIFAR Artificial Intelligence Chairs (CIFAR AI Research Chair). His research interests focus on deep graph representation learning, with a variety of applications such as knowledge graphs, drug discovery, and recommender systems. He was a research fellow at the University of Michigan and Carnegie Mellon University. He received his Ph.D. degree from Peking University and was a visiting student at the University of Michigan for two years. He was a researcher at Microsoft Research Asia for two years. His work on graph representation learning (e.g., LINE, LargeVis, and RotatE) is widely recognized. He received the best paper award of ICML'14 and was nominated for the best paper of WWW'16.
2、Title: Deep Graph Made Easy (and faster); and a Number of Studies
Abstract: All real-world data has structure that is best described as a graph. If there is one data structure for deep learning algorithms, the graph would be the foremost candidate. The graph structure can be either explicit, as in social networks, knowledge graphs, and protein-interaction networks, or latent and implicit, as in the case of languages and images. Leveraging and discovering graph structures has many immediate applications and also serves as fertile ground for the next generation of algorithms.
This talk begins with an introduction to DGL, an open-source platform designed to accelerate research in this emerging field. Its philosophy is to support the graph as the core abstraction while maintaining both forward compatibility (supporting new research ideas) and backward compatibility (integrating with existing components). DGL has been tested on a variety of models, including but not limited to the popular Graph Neural Networks (GNNs) and their variants, with promising speed, memory footprint, and scalability.
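The core abstraction DGL builds on is message passing on a graph: each node gathers messages from its neighbors and aggregates them into a new feature. A minimal sketch of one round of sum-aggregation in plain NumPy (deliberately not the DGL API, which wraps this pattern in optimized primitives):

```python
import numpy as np

def message_passing(edges, feats):
    """One round of message passing with sum aggregation.

    edges : list of (src, dst) pairs, i.e. directed edges of the graph
    feats : (num_nodes, dim) node feature matrix

    Each node's new feature is the sum of its in-neighbors' features.
    """
    out = np.zeros_like(feats)
    for src, dst in edges:
        out[dst] += feats[src]   # message from src, aggregated at dst
    return out
```

In DGL this loop is expressed declaratively with built-in message and reduce functions, so the framework can batch and parallelize it; the sketch only illustrates the semantics.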
We then describe several more recent pieces of work. The first, SegTran, takes a graph perspective on the popular Transformer model and applies sparsification to arrive at a lighter architecture. SegTran's core idea is to leverage a latent tree over different spans of the input so as to extract hierarchical features. By imposing structural inductive bias in this way, we are able to strike a balance between the power of the model and training/computational efficiency, arriving at an O(n log n) architecture.
The second is an empirical study of learned attention in Graph Attention Networks (GATs). We found that, independent of the learning setting, task, and attention variant, the dataset has a much stronger influence on the learned attention. We further explore the possibility of transferring attention for graph sparsification, and show that, when applicable, attention-based sparsification retains enough information to obtain good performance while reducing computational and storage cost.
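One simple way to realize attention-based sparsification of the kind described above is to keep, for each node, only its top-k incoming edges by learned attention weight. The sketch below is an illustrative assumption about the procedure, not the study's actual code; the function name and the per-destination top-k rule are ours.

```python
import numpy as np

def sparsify_by_attention(edges, att, k):
    """Keep, for each destination node, the k incoming edges with the
    highest attention weight.

    edges : list of (src, dst) pairs
    att   : attention weight per edge, same length as edges
    k     : number of incoming edges to retain per node
    """
    by_dst = {}
    for i, (src, dst) in enumerate(edges):
        by_dst.setdefault(dst, []).append(i)

    keep = []
    for dst, idxs in by_dst.items():
        idxs.sort(key=lambda i: att[i], reverse=True)  # strongest first
        keep.extend(idxs[:k])                          # retain top-k
    return [edges[i] for i in sorted(keep)]
```

With three edges into node 2 weighted 0.5, 0.9, and 0.1, keeping k=2 drops only the 0.1 edge, which matches the intuition that low-attention edges carry little usable signal.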
Bio: Zheng Zhang is Professor of Computer Science at NYU Shanghai and Global Network Professor at NYU. As of fall 2018, Professor Zhang is on a leave of absence and has joined Amazon AWS as the founding Director of the AWS Shanghai AI Lab. He also holds affiliated appointments with the Department of Computer Science at the Courant Institute of Mathematical Sciences and with the Center for Data Science at NYU's campus in New York City. Prior to joining NYU Shanghai, he was the founder of the System Research Group at Microsoft Research Asia, where he served as Principal Researcher and research area manager. Before he moved to Beijing, he was a project lead and member of technical staff at HP Labs. He holds a PhD from the University of Illinois, Urbana-Champaign, an MS from the University of Texas at Dallas, and a BS from Fudan University.