Snigdha Panigrahi

Assistant Professor of Statistics

University of Michigan

About me


I am an Assistant Professor of Statistics and also hold a courtesy appointment in the Department of Biostatistics. My work involves developing tools for replicable learning from complex data.

My current research, supported by both the National Science Foundation and the National Institutes of Health, focuses on quantifying uncertainties in the outputs of machine learning methods. My work was recognized with the National Science Foundation's CAREER Award for early-career faculty in the 2023 funding cycle and with the Bernoulli Society's New Researcher Award in Mathematical Statistics for 2025.

Since 2021, I have served on the editorial board of the Journal of Computational and Graphical Statistics and as an elected member of the International Statistical Institute.

Research highlights


Selective inference / Post-selection inference

Selective inference offers a rigorous statistical framework for making honest inferences post selection. My recent and ongoing work focuses on developing flexible, tractable, and scalable methods of selective inference.

Most notably, I have developed methods that: (i) handle selection events that lack a polyhedral shape or even a straightforward description; (ii) provide distribution-free inferences for different types of data, rather than relying on specific parametric models such as Gaussian models; and (iii) yield easy-to-solve selective inferences in both Bayesian and frequentist frameworks.
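To make the conditional viewpoint concrete, here is a minimal sketch in Python (an illustrative textbook-style example with simulated data, not code from any of the papers above). We observe a Gaussian vector, report its largest coordinate, and then test the mean of the selected coordinate while conditioning on the polyhedral event that this coordinate was the winner.

```python
# Illustrative sketch only: conditional (selective) inference after a simple,
# polyhedral selection event. Not taken from any specific paper or package.
import numpy as np
from scipy.stats import norm

rng = np.random.default_rng(0)
p = 10
Z = rng.normal(size=p)        # simulated data: Z ~ N(mu, I_p), here mu = 0
j = int(np.argmax(Z))         # selection rule: report the largest coordinate

# Naive p-value for H0: mu_j = 0 ignores that j was chosen because Z[j] was large.
naive_p = norm.sf(Z[j])

# Conditioning on the selection event {Z_j >= Z_k for all k} and on the other
# coordinates leaves a normal law for Z_j truncated below at the runner-up value,
# which yields an exact selective p-value under H0: mu_j = 0.
threshold = np.max(np.delete(Z, j))
selective_p = norm.sf(Z[j]) / norm.sf(threshold)

print(f"selected coordinate {j}: naive p = {naive_p:.3f}, selective p = {selective_p:.3f}")
```

The naive p-value is anti-conservative because the same data chose which hypothesis to test; the conditional p-value corrects for this by using the truncated reference distribution.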


Beyond polyhedral-shaped selection events: New methods of selective inference have been developed to address selection events that lack a polyhedral shape or an easy description. These advances make it possible to perform selective inference for problems involving the selection of [groups of variables], as well as for [multi-task learning], [graph learning], and unsupervised learning tasks such as [PCA].


Distribution-free inferences: My work under this theme broadens the scope of selective inference beyond normal data. This includes asymptotic properties of conditional methods for a [nonparametric] family of models, selective inferences for likelihood-based estimation using [GLMs], and inferences for more general estimation problems such as quasi-likelihood estimation and [quantile regression].


Tractable and scalable inferences: My work develops: (i) sampling methods that leverage the benefits of the [Bayesian] machinery; (ii) probabilistic [approximation] techniques that bypass sampling or resampling from data and instead rely on convex optimization; and (iii) [exact] methods for polyhedral-shaped selection events.
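As a caricature of the sampling-based route in (i), a deliberately naive rejection sampler can approximate the conditional reference distribution: simulate data under the null, keep only the draws on which the same selection would have been made, and compare the observed statistic against the surviving draws. This is only an illustration of the idea; the samplers and approximations developed in the papers above are far more efficient and scale to realistic selection rules.

```python
# Naive Monte Carlo sketch of sampling-based selective inference (illustration only):
# draws that fail to reproduce the observed selection are rejected, and the rest
# serve as the conditional null distribution of the selected statistic.
import numpy as np

rng = np.random.default_rng(1)
p, n_sim = 10, 200_000

Z = rng.normal(size=p)        # observed data (simulated here)
j = int(np.argmax(Z))         # observed selection: the largest coordinate

Z_null = rng.normal(size=(n_sim, p))        # draws from the global null
keep = Z_null.argmax(axis=1) == j           # keep draws with the same selection
reference = Z_null[keep, j]                 # conditional null distribution of Z_j

mc_selective_p = np.mean(reference >= Z[j])
print(f"Monte Carlo selective p-value for coordinate {j}: {mc_selective_p:.3f}")
```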


Data aggregation and integration

My work utilizes new randomization-based tools for aggregating and integrating information from large datasets. This has led to the development of high-dimensional methods for drawing inferences from distributed data, conducting efficient meta-analyses, and performing inference in models that integrate different modalities of information.


Large-scale estimation from data: My work enables the integration of low-dimensional summary statistics from many large datasets for [unbiased estimation] and [large-scale inferences]. Besides data aggregation, I also develop methods that allow for inferences in data-adaptive integrative models. These models combine various types of data, such as imaging and genomics, to model clinical outcomes in [cancer] patients.
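As a generic point of reference for what combining low-dimensional summary statistics can look like, here is a standard fixed-effect meta-analysis sketch (not the randomization-based methodology itself): each site shares only an estimate and a standard error, and the aggregator pools them with inverse-variance weights.

```python
# Generic fixed-effect meta-analysis sketch: combine per-dataset summary
# statistics (estimate, standard error) without pooling the raw data.
# Illustration only; the numbers below are hypothetical.
import numpy as np

def aggregate(estimates, std_errors):
    """Inverse-variance weighted estimate, its standard error, and a 95% CI."""
    estimates = np.asarray(estimates, dtype=float)
    weights = 1.0 / np.asarray(std_errors, dtype=float) ** 2
    pooled = np.sum(weights * estimates) / np.sum(weights)
    pooled_se = np.sqrt(1.0 / np.sum(weights))
    return pooled, pooled_se, (pooled - 1.96 * pooled_se, pooled + 1.96 * pooled_se)

# Hypothetical summary statistics from three distributed datasets.
est, se, ci = aggregate(estimates=[0.42, 0.35, 0.50], std_errors=[0.10, 0.08, 0.15])
print(f"pooled estimate {est:.3f} (SE {se:.3f}), 95% CI ({ci[0]:.3f}, {ci[1]:.3f})")
```

If the per-dataset summaries were themselves produced after data-driven model selection, this naive pooling would inherit the selection bias, which is precisely the issue the randomization-based methods above are designed to address.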


Scientific applications

My research finds applications in various scientific fields, particularly in the development of replicable inferential pipelines. This gives me the opportunity to collaborate with data scientists from diverse domains.


Applications: The applications of my work have included analyzing data from [genomics], [imaging], [neuroimaging], and [health trials].

Some of my current efforts are directed toward developing honest inferences and enabling reliable decision-making in mobile health trials. Additionally, I have an ongoing collaboration with the National Institutes of Health to understand malaria serology.


Miscellaneous

Some of my previous work in applied probability has studied the [path characteristics] and [integral geometric properties] of random fields. In addition, I’ve worked on learning and structure estimation using [mixture models].

Papers and software

New Papers (2024)


Yumeng Wang, Snigdha Panigrahi, and Xuming He. Asymptotically-exact selective inference for quantile regression. 2024. [arxiv]


Ronan Perry, Snigdha Panigrahi, Jacob Bien, and Daniela Witten. Inference on the proportion of variance explained in principal component analysis. 2024. [arxiv]


Snigdha Panigrahi, Kevin Fry, and Jonathan Taylor. Exact Selective Inference with Randomization. 2024. Biometrika. [arxiv] [publication]


Snigdha Panigrahi, Jingshen Wang, and Xuming He. Treatment Effect Estimation via Efficient Data Aggregation. 2024. Bernoulli (Accepted). [arxiv]


Snigdha Panigrahi, Natasha Stewart, Chandra Sripada, and Elizaveta Levina. Selective Inference for Sparse Multitask Regression with Applications in Neuroimaging. 2024. Annals of Applied Statistics. [arxiv] [publication]

Papers (2023 and older)


Yiling Huang, Snigdha Panigrahi, and Walter Dempsey. Selective Inference for Sparse Graphs via Neighborhood Selection. 2023. [arxiv]


Yiling Huang, Sarah Pirenne, Snigdha Panigrahi, and Gerda Claeskens. Selective Inference using Randomized Group Lasso Estimators for General Models. 2023. [arxiv]


Snigdha Panigrahi. Carving model-free inference. 2023. Annals of Statistics. [arxiv] [publication]


Snigdha Panigrahi, Peter W. Macdonald, and Daniel Kessler. Approximate Post-Selective Inference for Regression with the Group LASSO. 2023. Journal of Machine Learning Research. [arxiv] [publication]


Sifan Liu and Snigdha Panigrahi. Selective Inference with Distributed Data. 2023. [arxiv]


Snigdha Panigrahi, Shariq Mohammad, Arvind Rao, and Veerabhadran Baladandayuthapani. Integrative Bayesian models using Post-selective Inference: a case study in Radiogenomics. 2022. Biometrics. [arxiv] [publication]


Snigdha Panigrahi and Jonathan Taylor. Approximate selective inference via maximum likelihood. 2022. Journal of the American Statistical Association. [arxiv] [publication]


Snigdha Panigrahi, Jonathan Taylor, and Asaf Weinstein. Integrative methods for post-selection inference under convex constraints. 2021. Annals of Statistics. [arxiv] [publication]


Snigdha Panigrahi, Parthanil Roy, and Yimin Xiao. Maximal Moments and Uniform Modulus of Continuity for Stable Random Fields. 2021. Stochastic Processes and their Applications. [arxiv] [publication]


Basil Saeed, Snigdha Panigrahi, and Caroline Uhler. Causal Structure Discovery from Distributions Arising from Mixtures of DAGs. 2020. International Conference on Machine Learning. [arxiv] [publication]


Snigdha Panigrahi, Junjie Zhu, and Chiara Sabatti. Selection-adjusted inference: an application to confidence intervals for cis-eQTL effect sizes. 2019. Biostatistics. [arxiv] [publication]


Qingyuan Zhao and Snigdha Panigrahi. Selective Inference for Effect Modification: An Empirical Investigation. 2019. Observational Studies: Special issue devoted to ACIC. [arxiv] [publication]


Snigdha Panigrahi, Nadia Fawaz, and Ajith Pudhiyaveetil. Temporal Evolution of Behavioral User Personas via Latent Variable Mixture Models. 2019. IUI Workshops on Exploratory Search and Interactive Data Analytics. [arxiv] [publication]


Snigdha Panigrahi, Jonathan Taylor, and Sreekar Vadlamani. Kinematic Formula for Heterogeneous Gaussian Related Fields. 2018. Stochastic Processes and their Applications. [arxiv] [publication]


Snigdha Panigrahi and Jonathan Taylor. Scalable methods for Bayesian selective inference. 2018. Electronic Journal of Statistics. [arxiv] [publication]


Snigdha Panigrahi, Jelena Markovic, and Jonathan Taylor. An MCMC-free approach to post-selective inference. 2017. [arxiv]


Xiaoying Tian Harris, Snigdha Panigrahi, Jelena Markovic, Nan Bi, and Jonathan Taylor. Selective sampling after solving a convex problem. 2016. [arxiv]

Software

My contributions to software development in the field of Selective Inference can be tracked here: [Github]

Teaching

STATS 415 - Data Mining and Machine Learning

Upper-level Undergraduate Course

Jan 2024 – Present
I have taught this course in the Winter semester of 2024.

STATS 600 - Linear Models

Graduate Level Course - Core course on regression for the Statistics Ph.D. program

Aug 2020 – Present
I taught this course in the Fall semesters of 2020, 2022 and 2023.

STATS 605 - Statistical methods for Adaptive Data Analysis

Special Topics Course for PhD students

Jan 2019 – Apr 2019
I designed and taught this course in 2019 to cover contemporary topics in adaptive data analysis and selective inference.

STATS 280 - Honors Introduction to Statistics & Data Analysis

Lower-level Undergraduate Course for Honors Students

Aug 2018 – Dec 2022
I taught this course in the Fall semesters of 2018, 2019, 2020, and 2022.

Meet the Team

PhD Students

Yumeng Wang
Soham Bakshi
Yiling Huang
Judy Wu