R. Teal Witter
I am a PhD candidate at NYU Tandon, where I am fortunate to be advised by Chris Musco and Lisa Hellerstein. My work is generously supported by an NSF Graduate Research Fellowship.
I design and analyze algorithms, leveraging ideas from theoretical computer science and machine learning. My recent research has focused on randomized algorithms for problems with social impact.
I received my undergraduate degrees in Mathematics and Computer Science from Middlebury College. At Middlebury, I designed and analyzed quantum algorithms with Shelby Kimmel and applied math to board games with Alex Lyford.
Education
New York University
PhD in Computer Science • September 2020 - Present
Middlebury College
BA in Mathematics, BA in Computer Science • Summa Cum Laude • February 2017 - May 2020
Teaching
Middlebury CSCI 1052: Randomized Algorithms for Data Science
Course Instructor (Winter 2024)
Middlebury CSCI 1051: Deep Learning
Course Instructor (Winter 2023)
NYU CS-GY 6953: Deep Learning
Course Assistant (Fall 2022, Spring 2023, Fall 2023)
NYU CS-GY 6763: Algorithmic Machine Learning and Data Science
Course Assistant (Fall 2021, Spring 2022, Fall 2023)
NYU CS-GY 6923: Machine Learning
Course Assistant (Spring 2021, Spring 2023)
Papers
In the tradition of theoretical computer science, an asterisk (*) indicates that authors are listed in alphabetical order.
Kernel Banzhaf: A Fast and Robust Estimator for Banzhaf Values
Yurong Liu*, R. Teal Witter*, Flip Korn, Tarfah Alrashed, Dimitris Paparas, Juliana Freire
Preprint
Provably Accurate Shapley Value Estimation via Leverage Score Sampling
Christopher Musco*, R. Teal Witter*
Preprint
FairlyUncertain: A Comprehensive Benchmark of Uncertainty in Algorithmic Fairness
Lucas Rosenblatt*, R. Teal Witter*
Preprint
Benchmarking Estimators for Natural Experiments: A Novel Dataset and a Doubly Robust Algorithm
R. Teal Witter, Christopher Musco
Conference on Neural Information Processing Systems (NeurIPS 2024)
Minimizing Cost Rather Than Maximizing Reward in Restless Multi-Armed Bandits
R. Teal Witter, Lisa Hellerstein
Preprint
I Open at the Close: A Deep Reinforcement Learning Evaluation of Open Streets Initiatives
R. Teal Witter, Lucas Rosenblatt
AAAI Conference on Artificial Intelligence (AAAI 2024)
Robust and Space-Efficient Dual Adversary Quantum Query Algorithms
Michael Czekanski*, Shelby Kimmel*, R. Teal Witter*
European Symposium on Algorithms (ESA 2023)
Counterfactual Fairness Is Basically Demographic Parity
Lucas Rosenblatt, R. Teal Witter
AAAI Conference on Artificial Intelligence (AAAI 2023)
A Local Search Algorithm for the Min-Sum Submodular Cover Problem
Lisa Hellerstein*, Thomas Lidbetter*, R. Teal Witter*
International Symposium on Algorithms and Computation (ISAAC 2022)
Adaptivity Gaps for the Stochastic Boolean Function Evaluation Problem
Lisa Hellerstein*, Devorah Kletenik*, Naifeng Liu*, R. Teal Witter*
Workshop on Approximation and Online Algorithms (WAOA 2022)
How to Quantify Polarization in Models of Opinion Dynamics
Christopher Musco*, Indu Ramesh*, Johan Ugander*, R. Teal Witter*
International Workshop on Mining and Learning with Graphs (MLG 2022)
Backgammon Is Hard
R. Teal Witter
International Conference on Combinatorial Optimization and Applications (COCOA 2021)
A Query-Efficient Quantum Algorithm for Maximum Matching on General Graphs
Shelby Kimmel*, R. Teal Witter*
Algorithms and Data Structures Symposium (WADS 2021)
Applications of Graph Theory and Probability in the Board Game Ticket to Ride
R. Teal Witter, Alex Lyford
International Conference on the Foundations of Digital Games (FDG 2020)
Applications of the Quantum Algorithm for st-Connectivity
Kai DeLorenzo*, Shelby Kimmel*, R. Teal Witter*
Conference on the Theory of Quantum Computation, Communication and Cryptography (TQC 2019)
More Writing
I wrote lecture notes to accompany Chris Musco’s graduate algorithmic machine learning and data science class. I used a subset of these notes for my own randomized algorithms for data science class.
I developed code-based tutorials on adversarial image attacks, neural style transfer, variational autoencoders, and diffusion models for Chris Musco’s graduate machine learning class.
I wrote notes on contrastive learning, stable diffusion, and implicit regularization for my deep learning class.
I curated code-based demos that accompany Chinmay Hegde’s graduate deep learning class and my own undergraduate deep learning class. Recordings of the demos are available here.
After struggling for years, I compiled a how-to guide for NYU’s high performance computing cluster.