Conference Agenda

Overview and details of the sessions of this conference. Select a date or location to show only the sessions on that day or at that location. Select a single session for a detailed view (with abstracts and downloads, if available).

 
Session Overview
Session
MS197, part 2: Numerical differential geometry
Time:
Tuesday, 09/Jul/2019:
3:00pm - 5:00pm

Location: Unitobler, F-112
30 seats, 54 m²

Presentations
3:00pm - 5:00pm

Numerical Differential Geometry

Chair(s): Tingran Gao (The University of Chicago, United States of America), Ke Ye (Chinese Academy of Sciences)

The profound theory of differential geometry has interacted with the computational and statistical communities in the past decades, yielding fruitful outcomes in a wide range of fields including manifold learning, Riemannian optimization, and geometry processing. This minisymposium encourages researchers from applied differential geometry, optimization, manifold learning, and geometry processing to share their perspectives and technical tools on problems at the intersection of geometry and computation.

 

(25 minutes for each presentation, including questions, followed by a 5-minute break; if there are x < 4 talks, the first x slots are used unless indicated otherwise)

 

Anisotropic Diffusion Kernels to Compare Distributions

Xiuyuan Cheng
Duke University

We introduce a kernel-based Maximum Mean Discrepancy (MMD) statistic for measuring the distance between two distributions from finitely many multivariate samples, where the kernel is anisotropic. The kernel computes the affinity between n data points and a set of nR reference points, where nR can be drastically smaller than n. When the unknown distributions are locally low-dimensional, the proposed MMD test can be more powerful at distinguishing certain alternatives, which is theoretically characterized by the spectral decomposition of the kernel. The consistency of the test is proved as long as the magnitude of the distribution departure is of a higher order than n^{-1/2}, and a finite-sample lower bound on the testing power is provided. The test is applied to flow cytometry and diffusion MRI datasets, which motivate the proposed approach to comparing distributions.
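
The general construction described above can be sketched in a few lines: an affinity matrix between the n samples and nR reference points, each reference carrying its own local covariance (the anisotropy), and an MMD-type statistic comparing mean affinity vectors. This is only an illustrative sketch; the kernel form, the per-reference covariances, and the test calibration in the talk may differ.

```python
import numpy as np

def anisotropic_affinity(X, refs, covs):
    """Affinity between data points X (n, d) and reference points refs (nR, d).

    covs[j] is a (d, d) local covariance at reference j, so each reference
    measures distance with its own anisotropic Mahalanobis metric."""
    n, _ = X.shape
    nR = refs.shape[0]
    A = np.empty((n, nR))
    for j in range(nR):
        diff = X - refs[j]                               # (n, d)
        inv = np.linalg.inv(covs[j])                     # (d, d)
        m = np.einsum('nd,de,ne->n', diff, inv, diff)    # Mahalanobis distances
        A[:, j] = np.exp(-0.5 * m)
    return A

def mmd_stat(X, Y, refs, covs):
    """MMD-type statistic: squared distance between the two samples'
    mean affinity vectors over the reference set."""
    mx = anisotropic_affinity(X, refs, covs).mean(axis=0)
    my = anisotropic_affinity(Y, refs, covs).mean(axis=0)
    return np.sum((mx - my) ** 2)
```

Note that only n × nR affinities are computed, which is the practical payoff when nR is much smaller than n.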

 

Coupled Geometric and Topological Basis for Data-Driven Shape Reconstruction

Qixing Huang
The University of Texas at Austin

We introduce a data-driven geometry reconstruction method with provable guarantees. A key enabler of the robustness of our shape recovery, with geometric and topological fidelity, is a new coupled basis representation that combines a voxelized implicit form of the shape geometry and a vectorized persistence diagram of the shape topology. Our method optimizes an objective function that enforces agreement between the reconstructed shape and the input point cloud, regularizes the geometry and topology of the reconstruction with data, and enforces consistency between the geometric prior and the topological prior. We show how to solve this optimization problem effectively by combining spectral initialization under the geometric representation alone with gradient-descent refinement under the coupled representation. In particular, we show that the spectral initialization does not need to be accurate, as the refinement procedure is able to improve the topology of the reconstruction. Experimental results on synthetic and real datasets justify the usefulness of our approach.
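
The abstract's "vectorized persistence diagram" refers to turning a set of (birth, death) pairs into a fixed-length vector so topology can enter a numerical objective. One standard construction for this is a persistence image, sketched below; the speakers' exact vectorization, grid, and weighting are not specified in the abstract and may well differ.

```python
import numpy as np

def persistence_image(diagram, res=8, sigma=0.1):
    """Vectorize a persistence diagram into a fixed-size grid.

    Each (birth, death) pair is mapped to (birth, persistence) and spread
    as a Gaussian bump weighted by its persistence, so long-lived
    (topologically significant) features dominate the vector."""
    pts = np.asarray(diagram, dtype=float)
    birth = pts[:, 0]
    pers = pts[:, 1] - pts[:, 0]             # persistence = death - birth
    xs = np.linspace(0.0, 1.0, res)
    gx, gy = np.meshgrid(xs, xs)
    img = np.zeros((res, res))
    for b, p in zip(birth, pers):
        img += p * np.exp(-((gx - b) ** 2 + (gy - p) ** 2) / (2 * sigma ** 2))
    return img.ravel()                        # fixed-length topological feature
```

A vector of this kind can then sit alongside a voxelized implicit function inside a single objective, which is the coupling idea the abstract describes.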

 

Intrinsic Gaussian processes on complex constrained domains

Mu Niu
Plymouth University

We propose a class of intrinsic Gaussian processes (in-GPs) for interpolation, regression and classification on manifolds, with a primary focus on complex constrained domains or irregularly shaped spaces arising as subsets or submanifolds of R, R^2, R^3 and beyond. For example, in-GPs can accommodate spatial domains arising as complex subsets of Euclidean space. in-GPs respect the potentially complex boundary or interior conditions as well as the intrinsic geometry of the spaces. The key novelty of the proposed approach is to utilise the relationship between heat kernels and the transition density of Brownian motion on manifolds for constructing and approximating valid and computationally feasible covariance kernels. This enables in-GPs to be practically applied in great generality, while existing approaches for smoothing on constrained domains are limited to simple special cases. The broad utility of the in-GP approach is illustrated through simulation studies and data examples.
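
The heat-kernel/Brownian-motion connection used above can be illustrated with a crude Monte Carlo estimate: simulate Brownian paths that are kept inside the constrained domain and read the transition density off the endpoint cloud. This is a toy sketch under invented parameters (the `inside` predicate, step scheme, and density estimator are all assumptions), not the approximation scheme from the talk.

```python
import numpy as np

def heat_kernel_mc(x, y, t, inside, n_paths=2000, n_steps=50, bw=0.1, rng=None):
    """Monte Carlo estimate of a heat-kernel covariance k_t(x, y) on a
    constrained 2-D domain: the transition density of Brownian motion
    started at x, with steps that would exit the domain rejected."""
    rng = rng or np.random.default_rng(0)
    dt = t / n_steps
    pos = np.tile(np.asarray(x, dtype=float), (n_paths, 1))
    for _ in range(n_steps):
        proposal = pos + rng.normal(0.0, np.sqrt(dt), pos.shape)
        ok = inside(proposal)            # boolean mask: which steps stay inside
        pos[ok] = proposal[ok]           # paths that would exit simply stay put
    # kernel density estimate of the endpoint cloud, evaluated at y
    d2 = np.sum((pos - np.asarray(y)) ** 2, axis=1)
    return np.mean(np.exp(-d2 / (2 * bw ** 2))) / (2 * np.pi * bw ** 2)
```

Evaluating this for pairs of points yields a covariance that automatically respects the domain boundary: two points separated by a constraint receive low covariance even if they are close in the ambient Euclidean metric.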

 

Locally Linear Embedding on Manifold

Nan Wu
Duke University

Locally Linear Embedding (LLE) is a well-known manifold learning algorithm published in Science by S. T. Roweis and L. K. Saul in 2000. In this talk, we provide an asymptotic analysis of the LLE algorithm under the manifold setup. We establish the kernel function associated with LLE and show that the asymptotic behavior of LLE depends on the regularization parameter in the algorithm. We show that on a closed manifold, asymptotically we may not obtain the Laplace--Beltrami operator, and the result may depend on the non-uniform sampling, unless a correct regularization is chosen. The talk is based on joint work with Hau-tieng Wu.
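
For reference, the standard LLE algorithm being analyzed is short enough to state in full: local reconstruction weights from a regularized Gram system, then bottom eigenvectors of (I - W)ᵀ(I - W). The `reg` parameter below is the regularization whose choice, per the talk, governs the asymptotic limit; the brute-force neighbor search is just for brevity.

```python
import numpy as np

def lle(X, k=10, d_out=2, reg=1e-3):
    """Basic Locally Linear Embedding (Roweis & Saul, 2000).

    reg is the regularization parameter added to the local Gram matrix;
    the talk's analysis shows the asymptotic operator depends on it."""
    n = X.shape[0]
    # brute-force k nearest neighbors (excluding the point itself)
    D = np.sum((X[:, None, :] - X[None, :, :]) ** 2, axis=2)
    nbrs = np.argsort(D, axis=1)[:, 1:k + 1]
    W = np.zeros((n, n))
    for i in range(n):
        Z = X[nbrs[i]] - X[i]                  # neighbors centered at x_i
        G = Z @ Z.T                            # local (k, k) Gram matrix
        G = G + reg * np.trace(G) * np.eye(k)  # regularize the linear system
        w = np.linalg.solve(G, np.ones(k))
        W[i, nbrs[i]] = w / w.sum()            # affine weights: sum to one
    # embedding: bottom nontrivial eigenvectors of (I - W)^T (I - W)
    M = (np.eye(n) - W).T @ (np.eye(n) - W)
    _, vecs = np.linalg.eigh(M)
    return vecs[:, 1:d_out + 1]                # skip the constant eigenvector
```

The analysis in the talk concerns what this procedure converges to as n grows: only with a correctly scaled `reg` does the associated kernel recover the Laplace--Beltrami operator on a closed manifold.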