
Hanghang Tong - Deep Network Mining: Exploring the Power and Beauty of Network of Networks


4/18/2019, 3:30PM - 4:45PM

126 DeBartolo



Meng Jiang

Email: mjiang2@nd.edu
Phone: 574-631-7454
Website: http://www.meng-jiang.com
Office: 326C Cushing Hall


Assistant Professor, College of Engineering
Data science, user behavior modeling, recommender systems, fraud detection, information extraction

Networks not only appear in many high-impact application domains, but have also become an indispensable ingredient in a variety of data mining and machine learning problems. Often these networks are collected from different sources, at different times, and at different granularities. The state of the art focuses on mining networks at three complementary levels: the network level at the coarsest granularity, the subgraph level in the middle, and the node/link level at the finest granularity. In other words, an individual node or link is often the finest-grained object (i.e., the atom) in a network mining model or algorithm.

In this talk, I will present our recent work on mining multiple inter-correlated networks, which allows us to go deeper inside a node or a link of a network, i.e., to model and mine the networks hidden inside an atom (a node or a link). First, I will introduce a new data model for a network of networks, whose key idea is to leverage the network itself as a context to connect different networks. Second, I will present algorithmic examples of how to perform mining with a network of networks, where the key idea is to leverage the contextual network as an effective regularizer during the mining process, including ranking and clustering. We believe that the power and beauty of networks of networks lie in treating networks as a context, and this vision (networks as a context) goes well beyond network data. To demonstrate this, I will further introduce our other work on using networks as a powerful and unifying context to connect different types of data from different sources and different data mining algorithms, including a network of co-evolving time series, a network of regression models, a network of inference problems, and a network of control problems. Finally, I will share my thoughts on future plans, including how networks-as-a-context can help in the era of big data and AI.
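The "network of networks" idea from the abstract can be sketched in a few lines: a context graph whose nodes each carry their own sub-network, with scores computed inside one sub-network smoothed toward the scores of context-adjacent sub-networks (the "contextual network as regularizer" idea). This is a hypothetical toy illustration, not the speaker's actual model or algorithm; the domain names, edges, degree-based scoring, and the `beta` smoothing weight are all invented for the example.

```python
from collections import defaultdict

# Hypothetical context graph: which domain networks are related.
context_edges = [("social", "citation"), ("citation", "web")]

# Each context node carries its own sub-network over a shared entity set.
subnetworks = {
    "social":   [("a", "b"), ("b", "c")],
    "citation": [("a", "c"), ("c", "b")],
    "web":      [("b", "a")],
}

def degree_scores(edges):
    """Toy stand-in for a ranking algorithm: normalized node degrees."""
    deg = defaultdict(int)
    for u, v in edges:
        deg[u] += 1
        deg[v] += 1
    total = sum(deg.values())
    return {n: d / total for n, d in deg.items()}

def context_neighbors(domain):
    """Undirected neighbors of `domain` in the context graph."""
    return ({v for u, v in context_edges if u == domain}
            | {u for u, v in context_edges if v == domain})

# Step 1: rank each sub-network independently.
local = {d: degree_scores(e) for d, e in subnetworks.items()}

# Step 2: regularize each domain's scores toward the mean score the
# same node receives in context-adjacent domains; beta controls how
# strongly the context pulls on the local result.
beta = 0.3
smoothed = {}
for d, scores in local.items():
    nbrs = context_neighbors(d)
    smoothed[d] = {}
    for node, s in scores.items():
        nbr_vals = [local[n].get(node, s) for n in nbrs]
        mean = sum(nbr_vals) / len(nbr_vals) if nbr_vals else s
        smoothed[d][node] = (1 - beta) * s + beta * mean

print(smoothed["social"])
```

With `beta = 0`, each sub-network is mined in isolation; raising `beta` makes related networks agree more on which nodes matter, which is the spirit of using the network itself as a context.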

Seminar Speaker:

Hanghang Tong

Hanghang Tong is currently an associate professor in the School of Computing, Informatics, and Decision Systems Engineering (CIDSE) at Arizona State University. Before that, he was an assistant professor in the Computer Science Department at City College, City University of New York, a research staff member at IBM T.J. Watson Research Center, and a postdoctoral fellow at Carnegie Mellon University. He received his M.Sc. and Ph.D. degrees from Carnegie Mellon University in 2008 and 2009, both in Machine Learning. His research interest is in large-scale data mining for graphs and multimedia. He has received several awards, including the SDM/IBM Early Career Data Mining Research Award (2018), the NSF CAREER Award (2017), the ICDM 10-Year Highest Impact Paper Award (2015), four best paper awards (TUP'14, CIKM'12, SDM'08, ICDM'06), seven "bests of conference" (ICDM'18, CSoNet'18, KDD'16, SDM'15, ICDM'15, SDM'11, and ICDM'10), one best demo honorable mention (SIGMOD'17), and one best demo candidate, second place (CIKM'17). He has published over 100 refereed articles. He is the Editor-in-Chief of SIGKDD Explorations (ACM), an action editor of Data Mining and Knowledge Discovery (Springer), and an associate editor of Knowledge and Information Systems (Springer) and the Neurocomputing Journal (Elsevier); he has also served as a program committee member in multiple data mining, database, and artificial intelligence venues (e.g., SIGKDD, SIGMOD, AAAI, WWW, and CIKM).