Abstract : In recent decades, topological data analysis (TDA) and geometric data analysis (GDA) have had a great impact on data science by characterizing valuable information about the “shape of data”. In this series of mini-symposia, we present recent progress in the theory and applications of TDA and GDA, including persistent homology, optimal transport, the filling radius, Reeb graphs, graph embeddings, flow data analysis, dimensionality reduction, geometric deep learning, the Hodge Laplacian, and discrete exterior calculus, together with their various applications in materials science, chemistry, biology, and data science.
[01514] Vietoris-Rips persistent homology, injective metric spaces, and the filling radius
Format : Talk at Waseda University
Author(s) :
Sunhyuk Lim (Max Planck Institute for Mathematics in the Sciences)
Facundo Mémoli (The Ohio State University)
Osman Berat Okutan (Florida State University)
Abstract : In the applied algebraic topology community, the persistent homology induced by the Vietoris-Rips simplicial filtration is a standard method for capturing topological information from metric spaces. In this paper, we consider a different, more geometric way of generating persistent homology of metric spaces which arises by first embedding a given metric space into a larger space and then considering thickenings of the original space inside this ambient metric space. In the course of doing this, we construct an appropriate category for studying this notion of persistent homology and show that, in a category theoretic sense, the standard persistent homology of the Vietoris-Rips filtration is isomorphic to our geometric persistent homology provided that the ambient metric space satisfies a property called injectivity.
As an application of this isomorphism result, we are able to precisely characterize the types of intervals that appear in the persistence barcodes of the Vietoris-Rips filtration of any compact metric space, and to give succinct proofs of the characterization of the persistent homology of products and of metric gluings of metric spaces. Our results also allow us to bound the length of intervals in the Vietoris-Rips barcode by other metric invariants, for example the notion of spread introduced by M. Katz.
As another application, we connect this geometric persistent homology to the notion of filling radius of manifolds introduced by Gromov and show some consequences related to (1) the homotopy type of the Vietoris-Rips complexes of spheres which follow from work of M. Katz and (2) characterization (rigidity) results for spheres in terms of their Vietoris-Rips persistence barcodes which follow from work of F. Wilhelm.
Finally, we establish a sharp version of Hausmann’s theorem for spheres which may be of independent interest.
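As a notational sketch (our own shorthand, not taken from the authors' text), the two filtrations being compared can be written as
\[
  \mathrm{VR}_r(X) \;=\; \{\, \sigma \subseteq X \ \text{finite} \;:\; \operatorname{diam}(\sigma) \le r \,\}, \qquad
  B_r(X;E) \;=\; \{\, e \in E \;:\; d_E(e, X) < r \,\},
\]
where $X$ is isometrically embedded into an injective (hyperconvex) metric space $E$, for instance via the Kuratowski embedding $x \mapsto d_X(x, \cdot)$ into $L^{\infty}(X)$. The isomorphism result identifies the persistent homology of the thickenings $B_r(X;E)$ with that of the Vietoris-Rips filtration, up to a factor-of-two rescaling of the filtration parameter (conventions on open versus closed thickenings vary).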
[01746] Reeb Order Method and its Application to Topological Flow Data Analysis
Format : Talk at Waseda University
Author(s) :
Tomoki UDA (Tohoku University)
Abstract : Sakajo and Yokoyama have classified the topologies of streamlines and characterised them by unique tree representations, called Cyclically Ordered rooted Tree (COT) representations. The author has realised practical applications of their theory in data science, called Topological Flow Data Analysis (TFDA), utilising Reeb graphs and their discretised counterpart, the Reeb order. In this talk, we briefly introduce TFDA theory and its applications to meteorology and oceanography.
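A minimal illustrative sketch (not the speaker's Reeb-order implementation): a Mapper-style, discretised Reeb-graph summary of a sampled stream function psi on a regular grid. Level values are binned, connected components of each band become nodes, and components in adjacent bands that touch become edges. The name `reeb_like_graph` and all parameter choices are our own.

import numpy as np
import networkx as nx
from scipy import ndimage

def reeb_like_graph(psi: np.ndarray, n_levels: int = 10) -> nx.Graph:
    bins = np.linspace(psi.min(), psi.max(), n_levels + 1)
    band = np.clip(np.digitize(psi, bins) - 1, 0, n_levels - 1)

    # Label connected components inside each level band; every component is a node.
    comp_id = np.zeros(psi.shape, dtype=int)
    G, offset = nx.Graph(), 0
    for b in range(n_levels):
        mask = band == b
        lab, n_comp = ndimage.label(mask)
        comp_id[mask] = lab[mask] + offset
        for c in range(1, n_comp + 1):
            G.add_node(offset + c, level=0.5 * (bins[b] + bins[b + 1]))
        offset += n_comp

    # Join components whose bands differ by one and whose pixels are 4-adjacent.
    slices = (((slice(None, -1), slice(None)), (slice(1, None), slice(None))),
              ((slice(None), slice(None, -1)), (slice(None), slice(1, None))))
    for sl_a, sl_b in slices:
        adjacent = np.abs(band[sl_a] - band[sl_b]) == 1
        for u, v in set(zip(comp_id[sl_a][adjacent].tolist(),
                            comp_id[sl_b][adjacent].tolist())):
            G.add_edge(u, v)
    return G

# Toy usage: stream function of two counter-rotating vortices.
y, x = np.mgrid[-1:1:80j, -1:1:80j]
psi = np.exp(-8 * ((x - 0.4) ** 2 + y ** 2)) - np.exp(-8 * ((x + 0.4) ** 2 + y ** 2))
G = reeb_like_graph(psi)
print(G.number_of_nodes(), G.number_of_edges())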
[04575] Topological Node2vec: Graph Embeddings via Persistent Homology
Format : Talk at Waseda University
Author(s) :
Killian Meehan (Kyoto University)
Abstract : Node2vec is a machine learning framework which specializes in embedding graph data into Euclidean space. However, we demonstrate with very simple examples that a great deal of topological information is destroyed during the embedding process. Our project builds on top of the original Node2vec framework and introduces a topological loss function derived from optimal transport, which forces the network to preserve as much of the graph's topological information as possible by checking the topological loss at every step.
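An illustrative sketch only (not the authors' Topological Node2vec code): a topological penalty comparing the degree-1 persistence diagram of a graph's shortest-path metric with that of its Euclidean embedding, here measured with GUDHI's bottleneck distance. In the talk's method the comparison is a differentiable optimal-transport loss; the name `topological_penalty` and the choice of bottleneck distance are our own simplifications.

import numpy as np
import networkx as nx
import gudhi

def diagram_h1(distance_matrix: np.ndarray):
    # Vietoris-Rips persistence of a finite metric space, degree-1 intervals.
    rips = gudhi.RipsComplex(distance_matrix=distance_matrix)
    st = rips.create_simplex_tree(max_dimension=2)
    st.persistence()
    return st.persistence_intervals_in_dimension(1)

def topological_penalty(G: nx.Graph, embedding: np.ndarray) -> float:
    d_graph = np.asarray(nx.floyd_warshall_numpy(G), dtype=float)
    d_embed = np.linalg.norm(embedding[:, None, :] - embedding[None, :, :], axis=-1)
    return gudhi.bottleneck_distance(diagram_h1(d_graph), diagram_h1(d_embed))

# Toy usage: a 6-cycle embedded on a line loses its loop, so the penalty is nonzero.
G = nx.cycle_graph(6)
flat = np.column_stack([np.arange(6.0), np.zeros(6)])
print(topological_penalty(G, flat))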
[04694] Data, Geometry, and Homology
Format : Talk at Waseda University
Author(s) :
Wojciech Chacholski (KTH, Royal Institute of Technology)
Jens Agerberg (KTH, Royal Institute of Technology and Ericsson)
Ryan Ramanujam (Karolinska Institutet (Dept. of Clinical Neuroscience) and Datanon Corporation)
Francesca Tombari (KTH, Royal Institute of Technology)
Abstract : For a successful analysis, a suitable representation of data by objects amenable to statistical methods is fundamental.
There has been an explosion of applications in which homological representations of data have played a significant role. I will present one such representation, called the stable rank, and introduce various novel ways of using it to encode the geometry of data and then analyse it. I will provide several illustrative examples of how to use stable ranks to find meaningful results.
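A heavily hedged illustration (our own reading, not the speaker's definition or code): for a one-parameter, interval-decomposable persistence module, one common description of the stable rank with respect to a standard contour is the function sending t to the number of bars whose length exceeds t. The precise convention depends on the chosen noise system, so the snippet below is a toy sketch only, and `stable_rank` is our own name.

import numpy as np

def stable_rank(barcode, t_grid):
    """barcode: iterable of (birth, death) pairs; returns the function sampled on t_grid."""
    lengths = np.array([death - birth for birth, death in barcode])
    return np.array([(lengths > t).sum() for t in t_grid])

# Toy usage: one long bar and two short ones.
barcode = [(0.0, 3.0), (0.5, 1.0), (1.0, 1.2)]
print(stable_rank(barcode, np.linspace(0.0, 3.0, 7)))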
[04865] Topological Representation Learning for Biomedical Image Analysis
Format : Talk at Waseda University
Author(s) :
Chao Chen (Stony Brook University)
Abstract : Modern analytics faces highly complex and heterogeneous data. While deep learning models have pushed our predictive power to a new level, they remain unsatisfactory with respect to crucial properties such as transparency, robustness, and data efficiency. In this talk, I will focus on our recent work on combining topological reasoning with learning to solve problems in biomedical image analysis. In biomedicine, we encounter various complex structures such as neurons, vessels, tissues, and cells. These structures encode important information about underlying biological mechanisms. To fully exploit these structures, we propose to enhance learning pipelines through the application of persistent homology theory. This inspires a series of novel methods for the segmentation, generation, and analysis of these topology-rich biomedical structures. Complex structures also arise in many other contexts beyond biomedicine. We will also briefly introduce how topological reasoning can be used to strengthen graph neural networks and to improve the robustness of deep neural networks against noise and backdoor attacks.
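An illustrative sketch (not the speaker's published losses): a simple topological discrepancy between a predicted likelihood map and a binary ground-truth mask, computed from GUDHI cubical persistence of the negated images and a bottleneck distance per homology degree. Real topology-aware losses are made differentiable by matching critical cells; the function names below are our own.

import numpy as np
import gudhi

def finite_part(diag):
    # Drop essential (infinite) intervals so the bottleneck distance is finite.
    diag = np.asarray(diag).reshape(-1, 2)
    return diag[np.isfinite(diag).all(axis=1)]

def cubical_diagram(img: np.ndarray, dim: int):
    cc = gudhi.CubicalComplex(top_dimensional_cells=img)
    cc.persistence()
    return finite_part(cc.persistence_intervals_in_dimension(dim))

def topo_discrepancy(pred: np.ndarray, gt: np.ndarray) -> float:
    # Negate so that bright structures (vessels, membranes) enter the
    # sublevel-set filtration first.
    return sum(gudhi.bottleneck_distance(cubical_diagram(-pred, dim),
                                         cubical_diagram(-gt.astype(float), dim))
               for dim in (0, 1))

# Toy usage: the ground truth is a ring; the prediction breaks it, changing H1.
yy, xx = np.mgrid[:32, :32]
gt = (np.abs(np.hypot(yy - 16.0, xx - 16.0) - 10.0) < 1.5).astype(float)
pred = gt.copy()
pred[14:18, 25:] = 0.0
print(topo_discrepancy(pred, gt))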
[05110] New Algorithms for Random Graph Embeddings
Format : Talk at Waseda University
Author(s) :
Jason H Cantarella (University of Georgia)
Clayton Shonkwiler (Colorado State University)
Henrik Schumacher (Technische Universität Chemnitz)
Tetsuo Deguchi (Ochanomizu University)
Erica Uehara (Kyoto University)
Abstract : We discuss the problem of randomly embedding graphs in $\mathbb{R}^d$, which often arises in machine learning. Given an arbitrary probability distribution on each edge, we condition the joint distribution on the graph type (loops of edges must close). The key idea is to use an unusual version of cohomology to encode embedding data. Our method is particularly well suited to embeddings with fixed edge lengths, which arise in polymer science and robotics.
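A minimal illustrative sketch (not the authors' algorithm): for a single cycle with independent standard Gaussian edge vectors in $\mathbb{R}^d$, conditioning on the loop closing is the same as orthogonally projecting the edge vectors onto the sum-zero subspace, because an isotropic Gaussian conditioned on a linear constraint is its orthogonal projection onto that constraint. Embeddings with fixed edge lengths, as in the talk, require the more elaborate machinery described there; `random_closed_loop` is our own name.

import numpy as np

def random_closed_loop(n_edges: int, d: int, seed=None) -> np.ndarray:
    rng = np.random.default_rng(seed)
    edges = rng.standard_normal((n_edges, d))
    edges -= edges.mean(axis=0)              # project onto the closure constraint
    vertices = np.vstack([np.zeros(d), np.cumsum(edges, axis=0)])
    return vertices                          # first and last vertices coincide

loop = random_closed_loop(10, 3, seed=0)
print(np.allclose(loop[0], loop[-1]))        # True: the loop closes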
[05318] Topological Deep Learning: Going Beyond Graph Data
Format : Talk at Waseda University
Author(s) :
Mustafa Hajij (University of San Francisco)
Ghada Zamzmi (University of South Florida)
Theodore Papamarkou (University of Manchester)
Nina Miolane (University of California, Santa Barbara)
Aldo Saenz (IBM)
Karthikeyan Natesan Ramamurthy (IBM)
Tolga Birdal (Imperial College London)
Tamal Krishna Dey (Purdue University)
Soham Mukherjee (Purdue University)
Shreyas Samaga (Purdue University)
Neal Livesay (Northeastern University)
Robin Walters (Northeastern University)
Paul Rosen (University of Utah)
Michael Schaub (RWTH Aachen University)
Abstract : Topological deep learning is a rapidly growing field that pertains to the development of deep learning models for data supported on topological domains such as simplicial complexes, cell complexes, and hypergraphs, which generalize many domains encountered in scientific computations. In this paper, we present a unifying deep learning framework built upon a richer data structure that includes widely adopted topological domains.
Specifically, we first introduce combinatorial complexes, a novel type of topological domain. Combinatorial complexes can be seen as generalizations of graphs that maintain certain desirable properties. Similar to hypergraphs, combinatorial complexes impose no constraints on the set of relations. In addition, combinatorial complexes permit the construction of hierarchical higher-order relations, analogous to those found in simplicial and cell complexes. Thus, combinatorial complexes generalize and combine useful traits of both hypergraphs and cell complexes, which have emerged as two promising abstractions that facilitate the generalization of graph neural networks to topological spaces.
Second, building upon combinatorial complexes and their rich combinatorial and algebraic structure, we develop a general class of message-passing combinatorial complex neural networks (CCNNs), focusing primarily on attention-based CCNNs. We characterize permutation and orientation equivariances of CCNNs, and discuss pooling and unpooling operations within CCNNs in detail.
Third, we evaluate the performance of CCNNs on tasks related to mesh shape analysis and graph learning. Our experiments show that CCNNs perform competitively compared with state-of-the-art deep learning models specifically tailored to the same tasks. Our findings demonstrate the advantages of incorporating higher-order relations into deep learning models across different applications.
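A schematic sketch (our own simplification, not the paper's CCNN implementation): one attention-weighted message-passing step that pushes features from source cells to target cells through a fixed neighborhood/incidence matrix, in the spirit of higher-order message passing on a combinatorial complex. All names, shapes, and the tanh update below are illustrative assumptions.

import numpy as np

def attention_message_pass(N, X_src, X_tgt, W, a_src, a_tgt):
    """One attention-weighted message-passing step between two cell ranks.
    N: (n_tgt, n_src) binary neighborhood matrix; X_src, X_tgt: cell features;
    W, a_src, a_tgt: learnable parameters.  Assumes every target has a neighbor."""
    messages = X_src @ W                                       # (n_src, d_out)
    scores = (X_tgt @ a_tgt)[:, None] + (X_src @ a_src)[None, :]
    scores = np.where(N > 0, scores, -np.inf)                  # attend only to neighbors
    scores -= scores.max(axis=1, keepdims=True)
    alpha = np.exp(scores) / np.exp(scores).sum(axis=1, keepdims=True)
    return np.tanh(alpha @ messages)                           # updated target features

# Toy usage: 3 edges (targets) bounded by 4 nodes (sources) of a path graph.
rng = np.random.default_rng(0)
N = np.array([[1, 1, 0, 0], [0, 1, 1, 0], [0, 0, 1, 1]])       # edge-to-node incidence
X_nodes, X_edges = rng.standard_normal((4, 5)), rng.standard_normal((3, 5))
W, a_src, a_tgt = rng.standard_normal((5, 8)), rng.standard_normal(5), rng.standard_normal(5)
print(attention_message_pass(N, X_nodes, X_edges, W, a_src, a_tgt).shape)   # (3, 8)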