- Ran Duan (IIIS, Tsinghua)
- Yi Li (Division of Mathematical Sciences, School of Physical and Mathematical Sciences, NTU)
- Sinno Jialin Pan (School of Computer Science and Engineering, NTU)
- Wenfei Wu (IIIS, Tsinghua)
- Xavier Bresson (School of Computer Science and Engineering, NTU)
Title: Bottleneck Path: Breaking the Sorting Barrier
Abstract:
In this talk, I will present our new O(m\sqrt{log n})-time algorithm for the (directed) single-source bottleneck path (SSBP) problem, the first algorithm faster than the Dijkstra bound of O(m + n log n) on sparse graphs. I will also give a brief introduction to algorithms for the minimum spanning tree problem, including the randomized linear-time algorithm [KKT95], the deterministic O(m log* n)-time algorithm [FT87], and the provably optimal algorithm [PR02], which may suggest ideas for further improving the time bound for SSBP.
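For context, the Dijkstra-style baseline that the talk's algorithm improves upon can be sketched in a few lines. The following is a minimal Python sketch (not the O(m\sqrt{log n}) algorithm from the talk): it computes, for every node, the best bottleneck value, i.e. the maximum over all source-to-node paths of the minimum edge weight on the path. Function and variable names are mine for illustration.

```python
import heapq

def ssbp_dijkstra(graph, source):
    """Single-source bottleneck paths via a Dijkstra-style sweep.

    graph: dict mapping node -> list of (neighbor, weight) edges.
    Returns a dict mapping node -> best bottleneck value: the maximum,
    over all source->node paths, of the minimum edge weight on the path.
    """
    best = {source: float('inf')}  # the empty path has no constraining edge
    # Python's heapq is a min-heap, so negate bottlenecks to pop the largest.
    heap = [(-float('inf'), source)]
    while heap:
        neg_b, u = heapq.heappop(heap)
        b = -neg_b
        if b < best.get(u, -float('inf')):
            continue  # stale heap entry
        for v, w in graph.get(u, []):
            cand = min(b, w)  # bottleneck after extending the path by (u, v)
            if cand > best.get(v, -float('inf')):
                best[v] = cand
                heapq.heappush(heap, (-cand, v))
    return best
```

With the standard heap this runs in O(m log n) time; the point of the talk is that the priority-queue sorting bottleneck can be avoided for SSBP.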
Title: Two Problems in Data Streaming Algorithms
Abstract:
The talk will consist of two parts, and I shall discuss one problem in data streaming algorithms in each part. The first problem is heavy hitters and sparse recovery, including the classical, off-the-grid, and Fourier variants. The second problem is classical F_p moment estimation and its extension to Schatten norm estimation, which lies at the interface of analysis, probability, and communication complexity.
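As a concrete instance of the heavy hitters problem, the classical deterministic Misra-Gries summary can be sketched as follows. This is a standard textbook algorithm offered for orientation, not necessarily one of the variants covered in the talk; the names are mine.

```python
def misra_gries(stream, k):
    """Misra-Gries heavy hitters summary.

    Processes the stream with at most k-1 counters and returns a dict
    guaranteed to contain every element occurring more than n/k times
    (possibly with undercounted frequencies, and possibly extra elements).
    """
    counters = {}
    for x in stream:
        if x in counters:
            counters[x] += 1
        elif len(counters) < k - 1:
            counters[x] = 1
        else:
            # decrement every counter; drop those that reach zero
            for y in list(counters):
                counters[y] -= 1
                if counters[y] == 0:
                    del counters[y]
    return counters
```

Each decrement step charges one occurrence of a heavy element against k-1 occurrences of other elements, which is why any element with frequency above n/k must survive in the summary.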
Title: Transfer Learning: Fundamentals and Applications
Abstract:
Transfer learning is a learning paradigm motivated by the human ability to transfer experience across different tasks or problems. Unlike traditional machine learning paradigms, which assume that training data and test data follow the same distribution, in transfer learning the training and test data are usually drawn from different distributions, or even represented in different feature spaces, because they come from different domains or tasks. In this course, I will first give an overview of transfer learning, and then introduce representative transfer learning methods and applications.
Title: Software Development and Optimization for Network Functions
Abstract:
Network Functions (NFs, or middleboxes) are widely used in modern computer networks. As NFs are increasingly implemented in software, rapidly delivering high-performance NFs is crucial for NF vendors and network operators. Targeting NF vendors, we design two frameworks to develop and optimize NFs. First, to develop new NFs, we redesign the development process: we decouple the packet-processing logic from the environmental-adaptation logic in NFs, and we design a vendor-independent language for developing NFs together with a compiler that integrates environment-related features into NFs; thus, both pieces of logic can evolve independently. Second, for existing legacy NFs, we propose applying program analysis techniques to optimize their internal logic. The optimizer takes NF code and runtime configurations as input, eliminates logic unused at runtime, and outputs a concise, high-performance NF.
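The idea of eliminating configuration-disabled logic can be illustrated with a toy sketch. This is my own simplified illustration, not the authors' framework: an NF is modeled as a pipeline of packet-processing stages, and stages whose feature is disabled in the runtime configuration are dropped before any packet is processed.

```python
def build_pipeline(stages, config):
    """Specialize an NF pipeline against a runtime configuration.

    stages: list of (feature_name, fn) pairs, applied in order.
    config: dict mapping feature_name -> bool (enabled or not).
    Returns a packet handler containing only the enabled stages,
    so disabled logic incurs zero per-packet cost.
    """
    enabled = [fn for feat, fn in stages if config.get(feat, False)]

    def handler(packet):
        for fn in enabled:
            packet = fn(packet)
            if packet is None:
                return None  # stage dropped the packet
        return packet

    return handler
```

The real optimizer described in the abstract works on NF code via program analysis; this sketch only conveys the specialization idea of removing logic that a given configuration can never exercise.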
Title: Convolutional Neural Networks on Graphs
Abstract:
In recent years, deep learning methods have achieved unprecedented performance on a broad range of problems in fields ranging from computer vision to speech recognition. So far, research has mainly focused on developing deep learning methods for grid-structured data, while many important applications have to deal with graph-structured data. Such geometric data are becoming increasingly important in computer graphics and 3D vision, sensor networks, drug design, biomedicine, recommendation systems, and web applications. The purpose of this talk is to introduce the emerging field of deep learning on graphs, and to survey existing solutions as well as applications for this class of problems.
NIPS’17: Geometric Deep Learning on Graphs, https://nips.cc/Conferences/2017/Schedule?showEvent=8735
UCLA’18: New Deep Learning Techniques, https://www.ipam.ucla.edu/programs/workshops/new-deep-learning-techniques
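One of the best-known formulations in this area is the graph convolutional layer of Kipf and Welling (2017), H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W). The following minimal NumPy sketch shows one such layer; variable names are mine, and this is one representative construction rather than the full scope of the talk.

```python
import numpy as np

def gcn_layer(A, H, W):
    """One graph convolutional layer in the style of Kipf & Welling (2017):
    H' = ReLU(D^{-1/2} (A + I) D^{-1/2} H W).

    A: (n, n) adjacency matrix, H: (n, d_in) node features,
    W: (d_in, d_out) weight matrix.
    """
    A_hat = A + np.eye(A.shape[0])            # add self-loops
    deg = A_hat.sum(axis=1)                   # degrees including self-loops
    D_inv_sqrt = np.diag(1.0 / np.sqrt(deg))  # symmetric normalization
    A_norm = D_inv_sqrt @ A_hat @ D_inv_sqrt
    return np.maximum(A_norm @ H @ W, 0.0)    # aggregate, transform, ReLU
```

Each layer averages feature information over a node's one-hop neighborhood (with degree normalization), so stacking k layers lets information propagate k hops across the graph.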