From Mahantesh.Halappanavar at pnnl.gov Fri Apr 7 14:17:20 2017
From: Mahantesh.Halappanavar at pnnl.gov (Halappanavar, Mahantesh M)
Date: Fri, 7 Apr 2017 18:17:20 +0000
Subject: [Csc] GraML - First Workshop on the Intersection of Graph
Algorithms and Machine Learning
Message-ID: <1DEF47BC-F09F-4BA2-A949-3C0FA081CDE6@pnnl.gov>
[Apologies if you receive multiple copies of this CFP]
GraML '17
http://hpc.pnl.gov/graml/
First Workshop on the Intersection of Graph Algorithms and Machine Learning
June 2, 2017
Buena Vista Hotel
Co-Located with IPDPS
Theme
We are experiencing an exponential growth in the number of proposed graph and machine learning solutions for a variety of problems. With this explosive growth come claims and counterclaims as to which approach, graph algorithms or machine learning, is best. In some cases, old problems are recast in the alternate approach in the hope of finding a better solution, while in other cases an approach is chosen to solve a new problem without a sound theoretical basis for success.
The conundrum is that both graph algorithms and machine learning can solve many real world problems, and that their domains intersect, but are not equivalent. For example, both community detection algorithms and SVMs partition data into subsets of similar members, and Bayesian Networks are a probabilistic graphical model of random variables and conditional dependencies used to learn causal relationships.
In reality, many analytic workloads require both approaches: graphs to understand relationships and organizational structures, and machine-learning methods to identify signature features. Given the difference in the parallel execution models of graph algorithms and machine learning methods, current tools, runtime systems, and architectures do not deliver similar performance in all cases.
The objectives of this workshop are:
* Clarify the domain of problems best solved by graphs and those best solved by machine learning approaches,
* Formulate a sound theoretical basis for choosing among approaches,
* Identify analytic workloads requiring multiple approaches, and
* Evaluate the performance and scalability of integrated platforms for graph methods and machine learning.
While there is a significant amount of interesting and critical research on developing platforms for graphs and machine learning, and on scaling those platforms on novel, high-performance systems, this workshop aspires to be more theoretical in nature, investigating the respective problem domains of the two approaches. The theoretical question of applicability then naturally bears on the practical question of tractability, in terms of complexity, performance, and quality of solution, since these depend on the actual implementation of the system frameworks (software, hardware, and combinations thereof).
Preliminary Program
8:20 - 8:30 Welcome and Introduction
Antonino Tumeo, John Feo, Mahantesh Halappanavar
8:30 - 9:30 Keynote - Chair: Antonino Tumeo (PNNL)
Neural Graph Learning
Sujith Ravi (Google)
9:30 - 10:00 Coffee Break
10:00 - 11:30 Paper Session - Chair: TBD
10:00 - 10:30 Learning on Graphs for Predictions of Fracture Propagation, Flow and Transport
Hristo Djidjev, Daniel O'Malley, Hari Viswanathan, Jeffrey Hyman, Satish Karra and Gowri Srinivasan.
10:30 - 11:00 Analyzing Community Structure in Networks.
Hongyuan Zhan and Kamesh Madduri.
11:00 - 11:30 Compound Analytics: Templates for Integrating Graph Algorithms and Machine Learning
Ronald Hagan, Michael Langston and Charles Phillips.
11:30 - 12:30 Debate - Moderator: John Feo (PNNL)
Panelists: Ananth Kalyanaraman (Washington State University),
Chris Long (DoD), Fredrik Manne (University of Bergen),
Maxim Naumov (NVIDIA)
Keynote Talk abstract: Neural Graph Learning
Abstract: Machine learning has become ubiquitous in solving complex problems, and recent advances have made it possible to build intelligent systems that can read text, see images, or hear sounds from the real world and understand their semantics. While this has led to the development of many tools and accelerated progress in several fields, designing machine learning approaches "from scratch" remains a daunting challenge for many applications. Graphs, on the other hand, offer a simple, elegant way to express different types of relationships observed in data and to concisely encode the structure underlying a problem. How, then, can we combine the flexibility of graphs with the power of machine learning? This question motivates new approaches that employ graph-based machine learning as a computing mechanism for solving real-world prediction tasks. The challenge that remains is to design efficient learning algorithms for such scenarios while dealing with hard-to-optimize machine learning objectives, massive graphs, and complex prediction tasks involving large (exponential) output spaces.
In his talk, Dr. Sujith Ravi of Google will describe how to address these challenges and design efficient distributed algorithms using Expander, Google's large-scale graph-based machine learning framework. This work was motivated by the need to design robust methods that learn to generalize from data (and underlying relationships) with minimal supervision, the way humans do. The graph learning framework can easily handle massive graphs (containing billions of vertices and trillions of edges) and make predictions over billions of output labels while achieving O(1) space complexity per vertex. The framework powers a number of machine intelligence applications, including Smart Reply, image recognition, and video summarization, and can be combined with deep neural networks to tackle complex language understanding and computer vision problems. Dr. Ravi's talk will also highlight some of Google's latest research and results on "neural graph learning," a new joint optimization framework for combining graph learning with deep neural network models.
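To make the idea of graph-based semi-supervised learning concrete, here is a minimal label-propagation sketch in plain Python. It is a toy illustration of the general technique the abstract alludes to, not Google's Expander framework: a handful of seed vertices carry known labels, and every other vertex repeatedly adopts the average label distribution of its neighbors until labels settle.

```python
# Toy label propagation: unlabeled vertices iteratively average the
# label distributions of their neighbors; seed vertices stay clamped.
# Hypothetical function and graph for illustration only.

def propagate_labels(edges, seeds, num_classes, iters=50):
    """edges: list of (u, v) pairs; seeds: {vertex: class index}."""
    # Build undirected adjacency lists.
    adj = {}
    for u, v in edges:
        adj.setdefault(u, []).append(v)
        adj.setdefault(v, []).append(u)
    # One-hot distributions for seeds, uniform for everything else.
    dist = {}
    for v in adj:
        if v in seeds:
            d = [0.0] * num_classes
            d[seeds[v]] = 1.0
        else:
            d = [1.0 / num_classes] * num_classes
        dist[v] = d
    for _ in range(iters):
        new = {}
        for v in adj:
            if v in seeds:
                new[v] = dist[v]  # clamp known labels
                continue
            # Average (then renormalize) the neighbors' distributions.
            avg = [0.0] * num_classes
            for n in adj[v]:
                for c in range(num_classes):
                    avg[c] += dist[n][c]
            total = sum(avg) or 1.0
            new[v] = [x / total for x in avg]
        dist = new
    # Return the most likely class per vertex.
    return {v: max(range(num_classes), key=lambda c: dist[v][c])
            for v in dist}

# Two triangle clusters bridged by one edge; one seed in each cluster.
edges = [(0, 1), (1, 2), (0, 2), (3, 4), (4, 5), (3, 5), (2, 3)]
labels = propagate_labels(edges, seeds={0: 0, 5: 1}, num_classes=2)
```

Unlabeled vertices 1 and 2 end up in the seed-0 class and vertices 3 and 4 in the seed-5 class. Per-vertex state here is just one small distribution, which hints at how a production system can keep space per vertex constant even on very large graphs.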