# Math Department Colloquium

The colloquium features recent research in the mathematical and statistical sciences. The colloquia are scheduled for Tuesdays from 4pm-5pm in ILC 404, unless noted otherwise. Refreshments will be served in MB226 from 2pm-3pm before each talk.

If you wish to be added to the department colloquium mailing list, or if you wish to give a colloquium talk, please contact the organizer Grady Wright.

Archive of past math department colloquium abstracts

## Schedule for 2019–2020

### Spring 2020

#### February 25 – Matthew Ferguson, Boise State University, Physics

Title: The Living Genome: Quantifying Gene Expression and Regulation in Living Cells by Fluorescence Fluctuation Imaging

Abstract: Mechanisms of transcription and translation take the information encoded in the genome and make it “work” in cells, through the production of proteins defined by nucleic acid coding regions. This involves the coordination of many multi-subunit complexes about which most knowledge is inferred from ensemble and/or in vitro assays, giving a detailed but static picture. How these macromolecular machines coordinate in living cells remains unknown, but recent advances in the application of fluctuation analysis to time-resolved multi-color fluorescence imaging can now give an unprecedented level of dynamic in vivo information. My talk will describe recent results from a study of RNA transcription and splicing in human cancer cell lines. By two-color, in vivo RNA fluorescent labeling, we visualize the rise and fall of the intron and exon during transcription of a single gene in human cells. By cross-correlation analysis, we determine the speed of transcription, co-transcriptional splicing, and termination of the RNA transcript, discerning correlations between elongation, splicing, and cleavage and their relation to the chromatin environment.
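As a rough illustration of the cross-correlation analysis mentioned above, the sketch below estimates the delay between two synthetic fluorescence traces. The signal shapes, noise level, and 5-second lag are invented for illustration; this is not data or code from the talk.

```python
import numpy as np

def estimate_lag(x, y, dt):
    """Estimate the delay of signal y relative to x via cross-correlation."""
    x = x - x.mean()
    y = y - y.mean()
    corr = np.correlate(y, x, mode="full")        # covers lags -(n-1) .. n-1
    lags = np.arange(-len(x) + 1, len(x))
    return lags[np.argmax(corr)] * dt

# Synthetic two-color traces: the "exon" burst trails the "intron" by 5 s.
rng = np.random.default_rng(0)
t = np.arange(0, 200.0, 0.5)                      # dt = 0.5 s
intron = np.exp(-0.5 * ((t - 60) / 10) ** 2) + 0.02 * rng.standard_normal(t.size)
exon = np.exp(-0.5 * ((t - 65) / 10) ** 2) + 0.02 * rng.standard_normal(t.size)

print(estimate_lag(intron, exon, dt=0.5))         # approximately +5 s
```

The peak of the cross-correlation function locates the shift at which the two traces best align, which is how transit times (and hence speeds) are read off from fluctuation data.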

#### March 3 – Elina Robeva, University of British Columbia

Title: Estimation of Totally Positive Densities

Abstract: Nonparametric density estimation is a challenging problem in theoretical statistics — in general a maximum likelihood estimate (MLE) does not even exist! Introducing shape constraints allows a path forward. In this talk I will discuss nonparametric density estimation under total positivity (i.e. log-supermodularity) and log-concavity. I will first show that though they possess very special structure, totally positive random variables are quite common in real-world data and possess appealing mathematical properties. Given i.i.d. samples from a totally positive distribution, we prove that the maximum likelihood estimator exists with probability one assuming there are at least 3 samples. We characterize the domain of the MLE and show that it is in general larger than the convex hull of the observations. If the observations are 2-dimensional or binary, we show that the logarithm of the MLE is a tent function (i.e. a piecewise linear function) with “poles” at the observations, and we show that a certain convex program can find it. Instead of using a maximum likelihood estimator, we discuss the possibility of using kernel density estimation. This new estimator raises an abundance of theoretical questions.
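For contrast with the shape-constrained MLE, a minimal one-dimensional Gaussian kernel density estimator (the alternative mentioned at the end of the abstract) fits in a few lines. The data and bandwidth below are invented; this is not the multivariate totally positive estimator studied in the talk.

```python
import numpy as np

def gaussian_kde(samples, xs, bandwidth):
    """Evaluate a Gaussian kernel density estimate at the points xs."""
    z = (xs[:, None] - samples[None, :]) / bandwidth
    return np.exp(-0.5 * z**2).mean(axis=1) / (bandwidth * np.sqrt(2 * np.pi))

rng = np.random.default_rng(1)
samples = rng.standard_normal(500)
xs = np.linspace(-4, 4, 201)
density = gaussian_kde(samples, xs, bandwidth=0.4)

dx = xs[1] - xs[0]
print(density.sum() * dx)       # close to 1: the estimate is (numerically) a density
print(xs[np.argmax(density)])   # near the true mode at 0
```

Unlike the tent-function MLE, the kernel estimate requires choosing a bandwidth, which is one source of the theoretical questions the abstract alludes to.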

#### March 10 – Natasha Dobrinen, University of Denver

Title: Ramsey Theory for Infinite Structures and Set Theoretic Methods

Abstract: Ramsey’s Theorem for the natural numbers says that given any positive integer k and any coloring of the k-sized sets of natural numbers into finitely many colors, there is an infinite subset on which all the k-sized subsets have the same color. Extensions of this theorem to infinite structures have posed extra challenges. In this talk, we will provide background on previous analogues of Ramsey’s theorem to infinite structures. Then we will introduce the method of “strong coding trees” which the speaker invented to develop Ramsey theory for Henson graphs, analogues of the Rado graph which omit k-cliques. The method is turning out to have broader implications for other types of homogeneous structures. Central to these results is the use of forcing methods to do unbounded searches for finite objects, building on previous work of Harrington.
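The finite analogue of Ramsey's theorem can be checked exhaustively in small cases. The sketch below verifies the classical fact R(3,3) ≤ 6 — every 2-coloring of the edges of the complete graph K6 contains a monochromatic triangle. This is standard material, not part of the new results in the talk.

```python
from itertools import combinations, product

def has_mono_triangle(coloring, n):
    """True if some triangle {a,b,c} has all three edges the same color."""
    return any(coloring[(a, b)] == coloring[(a, c)] == coloring[(b, c)]
               for a, b, c in combinations(range(n), 3))

n = 6
edges = list(combinations(range(n), 2))   # 15 edges, so 2^15 colorings
assert all(has_mono_triangle(dict(zip(edges, colors)), n)
           for colors in product([0, 1], repeat=len(edges)))
print("every 2-coloring of K_6 contains a monochromatic triangle")
```

On K5 the statement fails (color the 5-cycle one color and the pentagram the other), which is why 6 is the exact Ramsey number R(3,3). The infinite and structural versions in the talk are far beyond such brute force.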

#### April 7 – Sasha Wang, Boise State University, Mathematics

Title: TBP

Abstract: TBP

#### April 21 – Roman Kossak, CUNY

Title: TBP

Abstract: TBP

#### April 28 – Bryan Quaife, Florida State University

Title: TBP

Abstract: TBP

### Fall 2019

#### September 10 – Aykut Satici, Boise State University, Mechanical Engineering

Title: Exploiting sum-of-squares programming for the analysis and design of robotic manipulators

Abstract: The main theme of this work is the use of a certain type of convex optimization to analyze and design robotic manipulators. On the analysis front, we provide a general framework to determine inner and outer approximations to the singularity-free workspace of robotic manipulators. A similar framework is utilized for optimal dimensional synthesis of robotic manipulators with respect to their kinematic and dynamic properties. This framework utilizes the sum-of-squares optimization technique, which is numerically implemented by semidefinite programming. In order to apply the sum-of-squares optimization technique, we convert the trigonometric functions in the kinematics of the manipulator to polynomial functions with an additional constraint. Sum-of-squares programming promises advantages in that it can provide globally optimal results up to machine precision and scales better with the number of design variables than other methods that obtain globally optimal solutions.
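The trigonometric-to-polynomial conversion described above amounts to substituting c_i = cos θ_i and s_i = sin θ_i and imposing the algebraic constraints c_i² + s_i² = 1. A numerical sanity check of this substitution for a planar two-link arm (link lengths invented for illustration, not from the talk):

```python
import numpy as np

# End-effector position of a planar 2-link arm is trigonometric in the joint
# angles; substituting c_i = cos(theta_i), s_i = sin(theta_i) makes it a
# polynomial in (c1, s1, c2, s2) subject to c_i^2 + s_i^2 = 1.
L1, L2 = 1.0, 0.7   # illustrative link lengths

def forward_trig(t1, t2):
    return (L1 * np.cos(t1) + L2 * np.cos(t1 + t2),
            L1 * np.sin(t1) + L2 * np.sin(t1 + t2))

def forward_poly(c1, s1, c2, s2):
    # angle-sum identities: cos(t1+t2) = c1*c2 - s1*s2, sin(t1+t2) = s1*c2 + c1*s2
    return (L1 * c1 + L2 * (c1 * c2 - s1 * s2),
            L1 * s1 + L2 * (s1 * c2 + c1 * s2))

t1, t2 = 0.3, 1.1
c1, s1, c2, s2 = np.cos(t1), np.sin(t1), np.cos(t2), np.sin(t2)
print(np.allclose(forward_trig(t1, t2), forward_poly(c1, s1, c2, s2)))  # True
```

Once the kinematics are polynomial, workspace and synthesis questions become statements about polynomial nonnegativity on a semialgebraic set, which is exactly what sum-of-squares relaxations (solved as semidefinite programs) certify.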

#### September 17 – Nicholas J. Horton, Amherst College

Title: Multivariate thinking and the introductory statistics and data science course: preparing students to make sense of a world of observational data

Abstract: We live in a world of ever-expanding “found” (or observational) data. To make decisions and disentangle complex relationships, students need a solid background in design and confounding. The revised Guidelines for Assessment and Instruction in Statistical Education (GAISE) College Report enunciated the importance of multivariate thinking as a way to move beyond bivariate thinking. But how do such learning outcomes compete with other aspects of statistics knowledge (e.g., inference and p-values) in introductory courses that are already overfull? In this talk I will offer some reflections and guidance about how we might move forward, with specific implications for introductory statistics and data science courses.

#### September 24 – Donna Calhoun, Boise State University, Mathematics

Title: The Serre-Green-Naghdi equations for modeling shallow, dispersive geophysical flows

Abstract: The depth-averaged shallow water wave equations (SWE) are commonly used to model flows arising from natural hazards. The GeoClaw code, developed by D. George, R. J. LeVeque, M. J. Berger, K. Mandli and others, is one example of a depth-averaged flow solver now widely used for modeling tsunamis, overland flooding, debris flows, storm surges and so on. Generally, depth-averaged flow models show excellent large-scale agreement with observations and can thus be reliably used to predict whether tsunamis will reach distant coastlines and, if so, can give vital information about arrival times. However, for other types of flows, dispersive effects missing from the SWE model can play an important role in determining localized effects such as whether waves will overtop seawalls, or whether a landslide entering a lake will trigger tsunami-like behavior on the opposite shore. Because of the importance of these dispersive effects, several depth-averaged codes include dispersive corrections to the SWE. One set of equations commonly used to model these dispersive effects are the Serre-Green-Naghdi (SGN) equations.

We will present our work to incorporate dispersive correction terms into the GeoClaw extension of ForestClaw, a parallel adaptive library for Cartesian grid methods. One formulation of the SGN equations stabilizes higher-order derivatives by treating them implicitly. As a result, a key component of an SGN solver is a variable-coefficient Poisson solver. We will discuss our current work in developing both an iterative solver, based on a multigrid-preconditioned BiCGSTAB method (Scott Aiton, Boise State), and a direct solver based on the Hierarchical-Poincaré-Steklov (HPS) method developed by Gillman and Martinsson (2014). We will describe the SGN equations and provide an overview of their derivation, and then show preliminary results on uniform Cartesian meshes. Comparisons with the SGN solvers in Basilisk (S. Popinet) and BoussClaw (J. Kim et al.) will also be shown to verify our model.
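To give a concrete sense of the variable-coefficient Poisson solve named above as the computational core of this SGN formulation, here is a minimal 1D finite-difference version with a direct dense solve. It stands in for, and is far simpler than, the multigrid-preconditioned BiCGSTAB and HPS solvers discussed in the talk; the coefficient and right-hand side form a manufactured test, not a GeoClaw problem.

```python
import numpy as np

def solve_poisson_1d(k, f, n):
    """Solve -(k(x) u')' = f on (0,1) with u(0) = u(1) = 0, n interior points."""
    h = 1.0 / (n + 1)
    x = np.linspace(0.0, 1.0, n + 2)
    kp = k(x[1:-1] + h / 2)      # coefficient at half-points i+1/2
    km = k(x[1:-1] - h / 2)      # coefficient at half-points i-1/2
    A = (np.diag(kp + km)
         - np.diag(kp[:-1], 1)
         - np.diag(km[1:], -1)) / h**2
    return x[1:-1], np.linalg.solve(A, f(x[1:-1]))

# Manufactured solution u(x) = sin(pi x) with k(x) = 1 + x, so that
# f = -(k u')' = (1 + x) pi^2 sin(pi x) - pi cos(pi x).
k = lambda x: 1.0 + x
f = lambda x: (1 + x) * np.pi**2 * np.sin(np.pi * x) - np.pi * np.cos(np.pi * x)
x, u = solve_poisson_1d(k, f, n=99)
print(np.max(np.abs(u - np.sin(np.pi * x))))   # small O(h^2) discretization error
```

In the production setting the same linear system arises on adaptive 2D meshes at every time step, which is why fast iterative or direct solvers dominate the solver design.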

#### October 1 – Katherine E. Stange, University of Colorado, Boulder

Title: Cryptography in the face of quantum computers

Abstract: When quantum computers are engineered to scale, quantum algorithms will be able to break our current cryptographic standards. So what do we replace them with? I’ll discuss two of the front-runners: ring-learning-with-errors (based on lattices in number fields) and isogeny-based cryptography (based on elliptic curves). I’ll describe the fundamental “hard problems” (without assuming much background in number theory) which we believe quantum (or classical) computers cannot solve efficiently. I’ll explain a little about why we might believe that, and what we can do if we do.
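To give a flavor of the lattice-based “hard problem,” here is a toy plain (not ring) learning-with-errors instance. The parameters are far too small to be secure and are purely illustrative; real ring-LWE schemes work over polynomial rings in number fields.

```python
import numpy as np

# Learning with errors: given many pairs (a, b = <a, s> + e mod q) with small
# random noise e, recovering the secret s is believed hard at cryptographic
# sizes, even for quantum computers. Toy parameters below.
rng = np.random.default_rng(7)
n, m, q = 8, 16, 97
s = rng.integers(0, q, n)            # secret vector
A = rng.integers(0, q, (m, n))       # public random matrix
e = rng.integers(-2, 3, m)           # small error, in {-2,...,2}
b = (A @ s + e) % q                  # public noisy samples

# Without e, the secret falls to Gaussian elimination; the noise is what
# makes the problem (conjecturally) hard.
print(b)
```

The cryptographic constructions build encryption and key exchange on top of exactly this kind of noisy linear algebra.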

#### October 8 – James P. Keener, University of Utah

Title: The mathematics of life: making diffusion your friend

Abstract: Diffusion is the enemy of life. This is because diffusion is a ubiquitous feature of molecular motion that is constantly spreading things out, destroying molecular aggregates. However, all living organisms, whether single-celled or multicellular, have ways to use the reality of molecular diffusion to their advantage. That is, they expend energy to concentrate molecules and then use the fact that molecules move down their concentration gradient to do useful things.

In this talk, I will show some of the ways that cells use diffusion to their advantage, to signal, to form structures and aggregates, and to make measurements of length and size of populations. Among the examples I will describe are signalling by nerves, cell polarization, bacterial quorum sensing, and regulation of flagellar molecular motors. In this way, I hope to convince you that living organisms have made diffusion their friend, not their enemy.
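The “molecules move down their concentration gradient” picture above is the heat equation. A minimal explicit finite-difference simulation (all parameters invented for illustration) shows a concentrated pulse spreading out while its total mass is conserved:

```python
import numpy as np

def diffuse(u, D, dx, dt, steps):
    """Explicit steps of u_t = D u_xx on a periodic 1D grid."""
    for _ in range(steps):
        u = u + D * dt / dx**2 * (np.roll(u, 1) - 2 * u + np.roll(u, -1))
    return u

n, dx, D = 200, 0.01, 1e-3
u0 = np.zeros(n)
u0[n // 2] = 1.0 / dx                       # unit mass concentrated at the center
u = diffuse(u0, D, dx, dt=0.02, steps=500)  # dt chosen so D*dt/dx^2 = 0.2 <= 1/2

print(u0.max(), u.max())   # the peak collapses while total mass is unchanged
```

The collapsing peak is diffusion as “enemy”; the talk's theme is how cells spend energy to rebuild such gradients and then harvest the downhill flux.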

#### October 15 – William A. Bogley, Oregon State University

Title: Combinatorial Group Theory Treasures from the Non-aspherical Realm

Abstract: In the 1950s it was shown that there are significant limits to what can be discovered about groups using algorithmic means alone. For families of groups described in terms of generators and relations, it can be difficult to work out answers to elementary questions such as which members of the family are finite, or which are trivial. I will discuss the role of planar and spherical diagrams along with the concept of “asphericity,” which provides a practical filter that can be used to isolate interesting cases. I will also report on recent work by Matthias Merzenich, who has employed computer-based recursive methods to construct previously unseen objects.

#### October 22 – Frank Giraldo, Naval Postgraduate School

Title: Efficient Time-Integration Strategies for Non-hydrostatic Atmospheric Models

Abstract: The Non-hydrostatic Unified Model of the Atmosphere (NUMA) is a compressible Navier-Stokes solver that sits inside of the U.S. Navy’s NEPTUNE weather model based on high-order continuous and discontinuous Galerkin (CG/DG) methods. Therefore, it is imperative that NUMA runs as efficiently as possible without sacrificing accuracy and conservation. One of the last places to squeeze out more performance is in the time-integration strategy used in the model. In this talk, I will review the various time-integration strategies currently available in NUMA, such as fully explicit methods with large stability regions, fully implicit methods, implicit-explicit (IMEX) methods, and multirate methods. With multirate methods, the idea is to partition the processes with different speeds (or stiffness) in some hierarchical way in order to use time-steps commensurate with the wave speed of each process. However, gaining the full benefit of this approach requires fully embracing it at the code level, which means that the time-integrators and spatial discretization methods have to be fully aware of each other (complicating the code). I will also discuss our recent results on time-integrators and possible preconditioning strategies for the implicit solvers; our experience shows that IMEX time-integrators are competitive with explicit time-integrators only if the Schur complement is used.
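A first-order sketch of the implicit-explicit idea on a scalar model problem may help fix the concept: the stiff term is advanced implicitly so the step size is not constrained by the fast rate, while the slow forcing stays explicit. This toy splitting is purely illustrative; NUMA's IMEX schemes and Schur-complement solvers are far more elaborate.

```python
import numpy as np

# IMEX Euler for y' = -lam*y + cos(t): the stiff linear decay is implicit,
# the slow forcing is explicit, so dt can greatly exceed 1/lam.
lam, dt, T = 1000.0, 0.01, 2.0      # dt*lam = 10: fully explicit Euler would blow up
y, t = 1.0, 0.0
while t < T - 1e-12:
    y = (y + dt * np.cos(t)) / (1.0 + dt * lam)   # implicit in -lam*y, explicit in cos(t)
    t += dt

# The exact solution relaxes to the slow manifold (lam*cos t + sin t)/(lam^2 + 1),
# roughly cos(t)/lam for large lam.
print(y, np.cos(T) / lam)
```

The fully explicit scheme would require dt below 2/lam for stability; the IMEX step remains stable and accurate at a hundred times that step size, which is the entire appeal, provided the implicit solve itself is cheap.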

#### October 29 – Peter Schmid, Imperial College London

Title: Koopman analysis and dynamic modes

Abstract: Koopman analysis is a mathematical technique that embeds nonlinear dynamical systems into a linear framework based on a sequence of observables of the state vector. Computing the proper embeddings that result in a closed linear system requires the extraction of the eigenfunctions of the Koopman operator from data. Dynamic modes approximate these eigenfunctions via a tailored data-matrix decomposition. The associated spectrum of this decomposition is given by a convex optimization problem that balances data-conformity with sparsity of the spectrum. The Koopman-dynamic-mode process will be discussed and illustrated on physical examples.
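A minimal numpy version of the dynamic mode decomposition underlying this data-driven approximation is below; the sparsity-promoting spectrum selection mentioned in the abstract is omitted, and the synthetic two-mode data set is invented for illustration.

```python
import numpy as np

def dmd(X, Y, r):
    """Rank-r dynamic mode decomposition of snapshot pairs Y ~ A X."""
    U, S, Vh = np.linalg.svd(X, full_matrices=False)
    U, S, Vh = U[:, :r], S[:r], Vh[:r]
    A_tilde = U.conj().T @ Y @ Vh.conj().T / S   # r x r projection of A
    eigvals, W = np.linalg.eig(A_tilde)
    modes = Y @ Vh.conj().T / S @ W              # dynamic modes (exact DMD)
    return eigvals, modes

# Synthetic data: two spatial modes oscillating at frequencies 3 and 7.
x = np.linspace(0, 1, 50)
t = np.arange(100) * 0.1                         # sampling interval 0.1
data = (np.outer(np.sin(2 * np.pi * x), np.exp(3j * t))
        + np.outer(np.cos(np.pi * x), np.exp(7j * t)))
X, Y = data[:, :-1], data[:, 1:]
eigvals, modes = dmd(X, Y, r=2)
print(np.sort(np.angle(eigvals)) / 0.1)          # recovers the frequencies 3 and 7
```

The DMD eigenvalues approximate the Koopman spectrum of the underlying dynamics, and the modes approximate the corresponding eigenfunction projections, which is the link to the embedding described in the abstract.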

#### November 5 – Lynn Schreyer, Washington State University

Title: Developing a Deterministic Model for Refugee Migration (or Mammal Migration) based on Physics of Fluid Flow through Porous Media

Abstract: Here we discuss how we developed a model for the movement of refugees by adapting a model for fluid flow through porous media. The result is a model that can also be applied to mammal (more specifically, ungulate – hooved mammal) migration. This model accounts for characteristics that have not yet been captured by current deterministic (or stochastic) differential equation models, such as terrain and the fact that people (and herds) prefer to stay at an “equilibrium density” – neither too far from nor too close to each other. To develop such a model, we started with a second-order parabolic differential equation, then modified the governing equation to a forward-backward parabolic differential equation, and currently we are using a fourth-order differential equation known as the Cahn-Hilliard equation. In this talk we go through the evolution of our model and discuss some of the trials and tribulations of obtaining our current model.
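For reference, the Cahn-Hilliard equation mentioned above, u_t = Δ(u³ − u − ε²Δu), can be stepped explicitly on a 1D periodic grid. The sketch below is a bare-bones illustration with invented parameters (and the tiny time step forced by the fourth-order term), not the authors' migration model; note that it conserves the total density exactly, a natural requirement for a population model.

```python
import numpy as np

def lap(u, dx):
    """Periodic second-difference Laplacian."""
    return (np.roll(u, 1) - 2 * u + np.roll(u, -1)) / dx**2

def cahn_hilliard_step(u, dx, dt, eps2):
    """One explicit step of u_t = lap(u**3 - u - eps2 * lap(u))."""
    return u + dt * lap(u**3 - u - eps2 * lap(u, dx), dx)

rng = np.random.default_rng(3)
n, dx, dt, eps2 = 64, 1.0 / 64, 5e-7, 1e-3   # dt limited by the 4th-order term
u0 = 0.1 * rng.standard_normal(n)            # small random density fluctuation
u = u0.copy()
for _ in range(500):
    u = cahn_hilliard_step(u, dx, dt, eps2)

print(abs(u.sum() - u0.sum()))   # total density conserved up to round-off
```

The backward-diffusion part (−u inside the Laplacian) drives clustering toward the preferred density, while the ε² term regularizes it, which mirrors the forward-backward difficulty the abstract describes.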

#### November 12 – Chris Herald, University of Nevada, Reno

Title: Reduced Khovanov homology and the Fukaya category of the pillowcase

Abstract: In this talk, I’ll discuss the pillowcase, which is a 2-sphere with 4 “corners,” and an algebraic construction known as its Fukaya category. This is related to curves in the surface, and polygons formed by the curves.

I will describe some recent work with Matthew Hogancamp, Matthew Hedden and Paul Kirk relating this Fukaya category to a knot invariant called reduced Khovanov homology. Given a knot or link K in R^3 or S^3, separated into two 2-tangles by a 2-sphere, I will describe an algebraic construction in the Fukaya category of the pillowcase, from which one can obtain the reduced Khovanov homology of K.

#### November 19 – Daniel Appelö, University of Colorado, Boulder

Title: WaveHoltz: Parallel and Scalable Solution of the Helmholtz Equation via Wave Equation Iteration

Abstract: We introduce a novel idea, the WaveHoltz iteration, for solving the Helmholtz equation, inspired by recent work on exact controllability (EC) methods. As in EC methods, our method makes use of time-domain methods for wave equations to design frequency-domain Helmholtz solvers, but unlike EC methods we do not require adjoint solves. We show that the WaveHoltz iteration we propose is symmetric and positive definite in the continuous setting. We also present numerical examples, using various discretization techniques, that show that our method can be used to solve problems with rather high wave numbers.

This is joint work with Fortino Garcia, University of Colorado, Boulder, USA, and Olof Runborg, Royal Institute of Technology, Stockholm, Sweden.