For more information on my research, teaching, and background, see my website.
In short, I work on:
■ Methods for 1) prediction problems with limited and/or noisy data, 2) intelligent data gathering, such as experimental design and active learning, and 3) decision making under uncertainty.
■ Making neural networks 1) more robust, by improving estimates of their uncertainty when making predictions, and 2) more automatic, by creating methods that tune hyperparameters without manual intervention. In the long term, I hope to make the structure of a neural network as easy to learn as its weights. Statistical analysis, Bayesian inference, and information theory underlie the approaches that I take.
If any of this interests you, do reach out. I welcome new connections with people in industry who are interested in applying my methods and making them work well in practice. If you are interested in doing a PhD, please get in touch after reading my website.
et al., 2022, Numerically Stable Sparse Gaussian Processes via Minimum Separation using Cover Trees
van der Ouderaa TFA, Romero DW, van der Wilk M, 2022, Relaxing Equivariance Constraints with Non-stationary Continuous Filters
van der Ouderaa TFA, van der Wilk M, 2022, Learning Invariant Weights in Neural Networks
et al., 2022, Invariance Learning in Deep Neural Networks with Differentiable Laplace Approximations