Research

The less time scientists have to wait for code to run, the more time they can spend thinking about the problems they are tackling. My research aims to develop methods that support scientists taking on the problems of today.

I’m interested in incorporating probabilistic techniques into classical algorithms to develop methods which are fast and reliable, both in theory and in practice. Right now, I work in the field of Numerical Linear Algebra on Krylov subspace methods, specifically the conjugate gradient algorithm. I hope that my work will help to bridge the gap between theoretical computer science and applied computational science.
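
For readers less familiar with the method, here is a minimal textbook sketch of the conjugate gradient algorithm in Python/NumPy (an illustration only, not code from any of the papers listed below), solving a symmetric positive definite system Ax = b:

```python
import numpy as np

def conjugate_gradient(A, b, tol=1e-8, max_iter=None):
    """Solve A x = b for symmetric positive definite A (textbook CG)."""
    n = len(b)
    if max_iter is None:
        max_iter = n
    x = np.zeros(n)
    r = b - A @ x          # residual
    p = r.copy()           # search direction
    rs_old = r @ r
    for _ in range(max_iter):
        Ap = A @ p
        alpha = rs_old / (p @ Ap)      # step length along p
        x += alpha * p
        r -= alpha * Ap
        rs_new = r @ r
        if np.sqrt(rs_new) < tol:
            break
        p = r + (rs_new / rs_old) * p  # new A-conjugate search direction
        rs_old = rs_new
    return x

# small example on a random symmetric positive definite system
rng = np.random.default_rng(0)
M = rng.standard_normal((50, 50))
A = M @ M.T + 50 * np.eye(50)
b = rng.standard_normal(50)
x = conjugate_gradient(A, b)
print(np.linalg.norm(A @ x - b))   # residual norm should be near the tolerance
```

In exact arithmetic this iteration converges in at most n steps; much of the interest (and much of my work) is in how it behaves in finite precision and how fast it converges in practice.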

I am committed to making my research accessible and to facilitating the reproducibility of my work. Please feel free to contact me with any questions or concerns about my research.

I’m advised by Anne Greenbaum.

This page contains a collection of the things I’ve been working on recently. In general, I try to include descriptions of my papers that are readable by a broader audience interested in learning about or keeping up with recent advances in the field. I also have some short introductory pieces on topics I think are interesting.

Publications

Here are links to my Google Scholar profile and ORCID: 0000-0002-1187-1026.

Tyler Chen.
arXiv:1905.01549.
Anne Greenbaum, Hexuan Liu, Tyler Chen.
arXiv:1905.05874.

Talks and Posters

Symmetric Preconditioner Refinement Using Low Rank Approximations.
Tyler Chen.
Presentation at Baidu Research.

Introductions to some topics I think are interesting

Collaboration

I’m always interested in finding things to collaborate on (and people to collaborate with).

If you’re an undergrad student interested in research or grad school, please feel free to reach out; I’d be happy to try to help you find something to work on! You may also be interested in the Women in Applied Mathematics Mentorship Program.