My dissertation advisor is Kevin T. Kelly.

This is my CV. Contact me at first.last at gmail.

### Research

Inductive inference, methodology, Ockham's razor, reliabilism, formal epistemology, learning theory, belief revision, topology, statistics and machine learning.

I study how to make reliable inferences from statistical data. I believe in *feasibility contextualism*, the thesis that epistemic justification consists in adopting the most reliable means of arriving at the true answer to the question under investigation, given the kind of data one can hope to receive. On that view, scientific preferences for theoretical virtues like simplicity, unity, or testability can be epistemically justified only by proving that they are necessary for good truth-finding performance. For example, Ockham's razor mandates a preference for simpler theories, but standard justifications do not explain why it is better at finding the truth than competing methods. Kelly, Glymour, and Schulte have shown, however, that Ockham's razor is *necessary* for keeping inquiry on the straightest path to the truth. With Kelly, I have extended that result by refining the underlying notion of simplicity, elaborating the exact sense of "straightest path," and developing a simplicity-driven theory of belief revision better suited to inductive inference than standard AGM theory.

My thesis work constitutes a major advance for that approach. Previous results were grounded in a non-statistical account of information, on which information states are basic neighborhoods in a topological space. However, scientific data is statistical, and some critics, including Elliott Sober, doubt that the gap between propositional and statistical information can be bridged. I answer the skeptics by identifying the unique topology on probability measures whose open sets are exactly the statistically verifiable propositions. By means of that bridge result, I obtain a new foundation for Ockham's razor in statistical inference that is securely grounded in the aim of finding the truth, and that neither begs the question with a Bayesian prior bias toward simplicity, nor avoids it by changing the subject from inferring theories to selecting models. Those results establish a revealing new topological framework in which to study real empirical inquiry as practiced in machine learning, statistics and the sciences.
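The notion of statistical verifiability invoked above can be glossed roughly as follows. This is an illustrative sketch, not the precise definition from the thesis; the symbols $W$, $\lambda_n$, and $\alpha_n$ are my own notation.

```latex
% Illustrative sketch only; details and terminology in the thesis may differ.
% Let $W$ be a set of probability measures on a sample space, thought of as
% the possible statistical worlds, and let $A \subseteq W$ be a hypothesis.
A proposition $A \subseteq W$ is \emph{statistically verifiable} if there is a
sequence of tests $\lambda_n$ (each a function of an $n$-fold sample) and
significance levels $\alpha_n \to 0$ such that, for every $\mu \in W$:
\[
  \mu \notin A \;\Rightarrow\; \mu^n(\lambda_n \text{ accepts } A) \le \alpha_n,
  \qquad
  \mu \in A \;\Rightarrow\; \mu^n(\lambda_n \text{ accepts } A) \to 1.
\]
```

On a gloss of this kind, the bridge result says that the propositions so verifiable are exactly the open sets of a topology on $W$, which is what licenses transferring the earlier topological results to the statistical setting.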