Conference Agenda

Overview and details of the sessions of this conference.

 
Session Overview
Session
MS171, part 2: Grassmann and flag manifolds in data analysis
Time:
Saturday, 13/Jul/2019:
3:00pm - 5:00pm

Location: Unitobler, F007
30 seats, 59 m²

Presentations
3:00pm - 5:00pm

Grassmann and flag manifolds in data analysis

Chair(s): Chris Peterson (Colorado State University, United States of America), Michael Kirby (Colorado State University), Javier Alvarez-Vizoso (Max-Planck Institute for Solar System Research in Göttingen)

A number of applications in large-scale geometric data analysis can be expressed as an optimization problem on a Grassmann or flag manifold. The solution of the optimization problem helps one understand the structure underlying a data set for purposes such as classification, feature selection, and anomaly detection.

For example, given a collection of points on a Grassmann manifold, one could imagine finding a Schubert variety of best fit, which corresponds to minimizing some function on the flag variety parameterizing the given class of Schubert varieties.

A number of algorithms that exist for points in a linear space, such as clustering, endmember detection, and self-organizing maps, have analogues for points on a Grassmann or flag manifold.

The purpose of this minisymposium is to bring together researchers who share a common interest in algorithms and techniques involving Grassmann and flag varieties applied to problems in data analysis.
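Many of the optimization problems alluded to above reduce to computations with principal angles between subspaces. As a minimal illustrative sketch (not part of the session materials; the function name and inputs are my own), the geodesic distance on Gr(k, n) can be computed directly from orthonormal bases:

```python
import numpy as np

def grassmann_distance(A, B):
    """Geodesic distance between the column spans of A and B on Gr(k, n).

    A, B: n-by-k matrices with orthonormal columns.  The principal angles
    between the two subspaces are the arccosines of the singular values of
    A^T B; the geodesic distance is the 2-norm of the vector of angles.
    """
    sigma = np.linalg.svd(A.T @ B, compute_uv=False)
    theta = np.arccos(np.clip(sigma, -1.0, 1.0))  # principal angles
    return np.linalg.norm(theta)

# Two orthogonal lines in R^2 are at distance pi/2 on Gr(1, 2).
e1 = np.array([[1.0], [0.0]])
e2 = np.array([[0.0], [1.0]])
d = grassmann_distance(e1, e2)
```

The `clip` guards against singular values drifting slightly outside [0, 1] due to floating-point round-off, which would otherwise make `arccos` return NaN.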

 

(25 minutes for each presentation, including questions, followed by a 5-minute break; in case of x<4 talks, the first x slots are used unless indicated otherwise)

 

A dual subgradient approach to computing an optimal rank Grassmannian circumcenter

Tim Marrinan
Université de Mons

This talk concerns the circumcenter of a collection of linear subspaces. When the subspaces are k-dimensional subspaces of n-dimensional Euclidean space, this can be cast as an infinity-norm minimization problem on a Grassmann manifold, Gr(k,n). For subspaces of different dimension, the setting becomes a disjoint union of Grassmannians rather than a single manifold, and the problem is no longer well-defined. However, natural geometric maps exist between these manifolds with a well-defined notion of distance for the images of the subspaces under the mappings. Solving the initial problem in this context leads to a candidate circumcenter on each of the constituent manifolds, but does not inherently provide intuition about which candidate is the best representation of the data. Additionally, the solutions of different rank are generally not nested so a deflationary approach will not suffice, and the problem must be solved independently on each manifold. In this talk we propose and solve an optimization problem parametrized by the rank of the circumcenter. The solution can be computed approximately using a dual subgradient algorithm. By scaling the objective and penalizing the information lost by the rank-k circumcenter, we jointly recover an optimal dimension, k*, and a central subspace on Gr(k*,n) that best represents the correlated subspace of the data.
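The infinity-norm objective described in the abstract can be evaluated directly once a candidate center is chosen. The sketch below is my own illustration, not the authors' algorithm: it uses the chordal metric (one of several reasonable choices) and hypothetical helper names to compute the circumradius of a candidate subspace with respect to a collection of subspaces:

```python
import numpy as np

def chordal_distance(U, V):
    """Chordal distance between the subspaces spanned by the orthonormal
    columns of U (n x k) and V (n x l); well-defined even when k != l."""
    sigma = np.clip(np.linalg.svd(U.T @ V, compute_uv=False), 0.0, 1.0)
    k = min(U.shape[1], V.shape[1])
    return np.sqrt(k - np.sum(sigma ** 2))

def circumradius(C, subspaces):
    """Infinity-norm objective: the largest distance from the candidate
    center C to any subspace in the collection."""
    return max(chordal_distance(C, U) for U in subspaces)

# Candidate center: the bisecting line between two orthogonal lines in R^2.
e1 = np.array([[1.0], [0.0]])
e2 = np.array([[0.0], [1.0]])
c = np.array([[1.0], [1.0]]) / np.sqrt(2.0)
r = circumradius(c, [e1, e2])  # both distances equal sqrt(1/2)
```

Minimizing `circumradius` over candidates on each Gr(k, n), and comparing across k, is the expensive part that the dual subgradient algorithm of the talk addresses.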

 

Low Rank Representations of Matrices using Nuclear Norm Heuristics

Silvia Dinica
Romanian Senate

The connection between the entries of a Euclidean distance matrix and the nuclear norm of the corresponding matrix in the positive semidefinite cone is given by the one-to-one correspondence between the two cones. When the Euclidean distance matrix is the distance matrix of a complete k-partite graph, the nuclear norm of the associated positive semidefinite matrix can be evaluated in terms of the second elementary symmetric polynomial evaluated at the partition.

For complete k-partite graphs, the maximum value of the nuclear norm of the associated positive semidefinite matrix is attained when each set of the partition contains the same number of vertices. This result can be used to determine a lower bound on the chromatic number of the graph.
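A small numerical sketch of the balanced-partition claim. The conventions here are my assumptions, not necessarily those of the talk: squared graph distances (1 between parts, 2 within a part) and the classical double-centering map G = -1/2 J D J from the EDM cone to the symmetric matrices:

```python
import numpy as np

def multipartite_edm(parts):
    """Squared graph-distance matrix of a complete multipartite graph with
    the given part sizes: distance 1 between parts, 2 (squared: 4) within."""
    labels = np.repeat(np.arange(len(parts)), parts)
    same = labels[:, None] == labels[None, :]
    D = np.where(same, 4.0, 1.0)
    np.fill_diagonal(D, 0.0)
    return D

def gram_from_edm(D):
    """Double-center a squared distance matrix: G = -1/2 * J D J."""
    n = D.shape[0]
    J = np.eye(n) - np.ones((n, n)) / n
    return -0.5 * J @ D @ J

def nuclear_norm(G):
    # G is symmetric, so the nuclear norm is the sum of |eigenvalues|.
    return np.sum(np.abs(np.linalg.eigvalsh(G)))

# Balanced (2, 2) vs. unbalanced (3, 1) bipartition of 4 vertices:
nn_balanced = nuclear_norm(gram_from_edm(multipartite_edm([2, 2])))
nn_unbalanced = nuclear_norm(gram_from_edm(multipartite_edm([3, 1])))
```

In this small example the balanced bipartition yields the larger nuclear norm (5 versus 4.25), consistent with the ordering of the second elementary symmetric polynomial: e2(2, 2) = 4 > e2(3, 1) = 3.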

 

Grassmann Tangent-Bundle Means

Justin Marks
Gonzaga University

Applications of geometric data analysis often involve producing collections of subspaces, such as illumination spaces for digital imagery. For a given collection of subspaces, a natural task is to find the mean of the collection. A robust suite of algorithms has been developed to generate mean representatives for a collection of subspaces of fixed dimension, or equivalently, a collection of points on a particular Grassmann manifold. These representatives include the flag mean, the normal mean, and the Karcher mean. In this talk, we catalogue the types of means and present comparative heuristics for the suite of mean representatives. We respond to, and at times challenge, the conclusions of a recent paper outlining various means built via tangent-bundle maps on the Grassmann manifold.
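Of the means catalogued in this talk, the flag mean has a particularly simple closed form: it is built from the leading left singular vectors of the horizontally concatenated orthonormal bases. A minimal sketch (illustrative only; the function name is mine):

```python
import numpy as np

def flag_mean(subspaces, k):
    """Flag mean of a collection of subspaces, each given as an n x k_i
    matrix with orthonormal columns: the k leading left singular vectors
    of the horizontal concatenation of the bases."""
    X = np.hstack(subspaces)
    U, _, _ = np.linalg.svd(X, full_matrices=False)
    return U[:, :k]

# The mean of two lines in R^3 lying in the xy-plane stays in the xy-plane.
u1 = np.array([[1.0], [0.0], [0.0]])
u2 = np.array([[1.0], [1.0], [0.0]]) / np.sqrt(2.0)
m = flag_mean([u1, u2], k=1)
```

Unlike the Karcher mean, which requires an iterative fixed-point computation on the manifold, this construction needs only a single SVD, which is one reason the comparative heuristics in the talk are of practical interest.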