Dr. Jodi Mead - Statistical Tests for Regularization

September 16 @ 10:30 am MDT

Dr. Jodi Mead, Professor of Mathematics and co-Director of the Computing PhD program

Jodi Mead received her PhD in Computational Mathematics in 1998 from Arizona State University under the direction of Rosemary Renaut. Upon graduation she held a postdoc in the College of Oceanic and Atmospheric Sciences at Oregon State University under the direction of Andrew Bennett. She is currently a professor in the mathematics department at Boise State University, where she has led efforts to create a B.S. in Applied Mathematics, an M.S. in Mathematics, and a PhD in Computing. She was the graduate coordinator for the M.S. in Mathematics from 2007 to 2016, and then became a co-Director of the PhD in Computing program.

Professor Mead has led multiple NSF-funded projects, in addition to participating in a number of other funded interdisciplinary projects. These projects involve data assimilation and inverse methods applied to geoscience fields including oceanography, atmospheric sciences, hydrology, and geophysics. She was recently funded by NSF, with Donna Calhoun, on a project titled “Data-Enabled Modeling of Wildfire Smoke Transport”.


Regularization is necessary to solve ill-posed problems, such as those that occur in machine learning and inverse problems. Ill-posedness arises when there are insufficient data or incomplete models. The amount of regularization introduced to a problem is controlled by a regularization parameter. Typically the regularization parameter is adjusted manually until a solution that looks good is obtained, but there are also systematic selection methods such as the discrepancy principle, the L-curve, and generalized cross validation. For least squares regularization we have developed a variant of the discrepancy principle that uses a chi-squared test to identify a regularization parameter; this reduces the overfitting that typically occurs with the discrepancy principle. More recently we have discovered a new chi-squared test for regularization parameter selection in Total Variation (TV) regularization, which results when TV is viewed as a sparsity prior with a Laplacian distribution. I will show results from a few imaging problems using these chi-squared tests for regularization parameter selection.
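As background for the selection methods named in the abstract, here is a minimal sketch (not from the talk) of Tikhonov-regularized least squares with the classical discrepancy principle: the parameter is chosen by bisection so that the residual norm matches the expected noise level. The operator, noise level, and problem size are illustrative assumptions made for this example.

```python
# Illustrative sketch: Tikhonov regularization with the classical
# discrepancy principle for choosing the regularization parameter.
# The operator A, noise level, and problem size are assumptions made
# for the example, not taken from the talk.
import numpy as np

rng = np.random.default_rng(0)

# Build a square ill-conditioned operator with known singular values
# spanning six orders of magnitude.
n = 50
U, _ = np.linalg.qr(rng.standard_normal((n, n)))
V, _ = np.linalg.qr(rng.standard_normal((n, n)))
A = U @ np.diag(np.logspace(0, -6, n)) @ V.T

x_true = np.sin(np.linspace(0, np.pi, n))   # smooth "true" signal
sigma = 1e-3                                # assumed noise std. dev.
b = A @ x_true + sigma * rng.standard_normal(n)

def tikhonov(lam):
    """Minimize ||Ax - b||^2 + lam ||x||^2 via the normal equations."""
    return np.linalg.solve(A.T @ A + lam * np.eye(n), A.T @ b)

def residual_norm(lam):
    return np.linalg.norm(A @ tikhonov(lam) - b)

# Discrepancy principle: pick lam so the residual matches the expected
# noise level, ||Ax - b|| ~ sigma * sqrt(n).  The residual norm is
# increasing in lam, so bisection in log(lam) finds the crossing.
target = sigma * np.sqrt(n)
lo, hi = 1e-14, 1e2
for _ in range(200):
    mid = np.sqrt(lo * hi)                  # geometric midpoint
    lo, hi = (mid, hi) if residual_norm(mid) < target else (lo, mid)
lam = np.sqrt(lo * hi)
x_hat = tikhonov(lam)
```

The chi-squared approach described in the abstract replaces this residual-matching condition with a statistical test: for least squares regularization, a regularized residual functional has a known chi-squared distribution, and the parameter is chosen so the observed value is statistically consistent with that distribution.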