Imperial College London

Mr Xing Liu

Faculty of Natural Sciences, Department of Mathematics

Research Postgraduate

Contact

 

xing.liu16

Location

 

Room 617, Huxley Building, South Kensington Campus


Publications


4 results found

Liu X, Duncan A, Gandy A, 2023, Using Perturbation to Improve Goodness-of-Fit Tests based on Kernelized Stein Discrepancy, Fortieth International Conference on Machine Learning

Conference paper

Huang KH, Liu X, Duncan AB, Gandy A et al., 2023, A High-dimensional Convergence Theorem for U-statistics with Applications to Kernel-based Testing, Pages: 3827-3918

We prove a convergence theorem for U-statistics of degree two, where the data dimension d is allowed to scale with sample size n. We find that the limiting distribution of a U-statistic undergoes a phase transition from the non-degenerate Gaussian limit to the degenerate limit, regardless of its degeneracy and depending only on a moment ratio. A surprising consequence is that a non-degenerate U-statistic in high dimensions can have a non-Gaussian limit with a larger variance and asymmetric distribution. Our bounds are valid for any finite n and d, independent of individual eigenvalues of the underlying function, and dimension-independent under a mild assumption. As an application, we apply our theory to two popular kernel-based distribution tests, MMD and KSD, whose high-dimensional performance has been challenging to study. In a simple empirical setting, our results correctly predict how the test power at a fixed threshold scales with d and the bandwidth.
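To make the object of study concrete, below is a minimal sketch (not the authors' code) of a degree-two U-statistic: the standard unbiased MMD^2 estimator with a Gaussian kernel. The kernel choice, the bandwidth sigma, and the toy data are illustrative assumptions.

# Minimal sketch of a degree-two U-statistic: the unbiased MMD^2
# estimator with a Gaussian (RBF) kernel. Illustrative only; not the
# authors' implementation. Bandwidth and toy data are assumptions.
import numpy as np

def gaussian_kernel(x, y, sigma=1.0):
    """Gaussian kernel k(x, y) = exp(-||x - y||^2 / (2 sigma^2))."""
    return np.exp(-np.sum((x - y) ** 2) / (2 * sigma ** 2))

def mmd2_u_statistic(X, Y, sigma=1.0):
    """Unbiased MMD^2 estimate: a U-statistic of degree two, where each
    pair (i, j) with i != j contributes one evaluation of the core h."""
    n = len(X)
    total = 0.0
    for i in range(n):
        for j in range(n):
            if i == j:
                continue  # U-statistics exclude the diagonal terms
            total += (gaussian_kernel(X[i], X[j], sigma)
                      + gaussian_kernel(Y[i], Y[j], sigma)
                      - gaussian_kernel(X[i], Y[j], sigma)
                      - gaussian_kernel(Y[i], X[j], sigma))
    return total / (n * (n - 1))

# Toy usage: samples from the same d-dimensional Gaussian, with the
# dimension d comparable to the sample size n, as in the paper's regime.
rng = np.random.default_rng(0)
n, d = 100, 50
X, Y = rng.standard_normal((n, d)), rng.standard_normal((n, d))
print(mmd2_u_statistic(X, Y))  # close to 0 when the two distributions match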

Conference paper

Liu X, Zhu H, Ton J-F, Wynne G, Duncan A et al., 2022, Grassmann Stein Variational Gradient Descent, International Conference on Artificial Intelligence and Statistics, Publisher: JMLR (Journal of Machine Learning Research), Pages: 1-20, ISSN: 2640-3498

Stein variational gradient descent (SVGD) is a deterministic particle inference algorithm that provides an efficient alternative to Markov chain Monte Carlo. However, SVGD has been found to suffer from variance underestimation when the dimensionality of the target distribution is high. Recent developments have advocated projecting both the score function and the data onto real lines to sidestep this issue, although this can severely overestimate the epistemic (model) uncertainty. In this work, we propose Grassmann Stein variational gradient descent (GSVGD) as an alternative approach, which permits projections onto arbitrary dimensional subspaces. Compared with other variants of SVGD that rely on dimensionality reduction, GSVGD updates the projectors simultaneously for the score function and the data, and the optimal projectors are determined through a coupled Grassmann-valued diffusion process which explores favourable subspaces. Both our theoretical and experimental results suggest that GSVGD enjoys efficient state-space exploration in high-dimensional problems that have an intrinsic low-dimensional structure.
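For readers unfamiliar with the base algorithm, below is a minimal sketch (not the authors' implementation) of the standard SVGD particle update that GSVGD generalises. The RBF kernel, fixed bandwidth, step size, and Gaussian target are illustrative assumptions.

# Minimal sketch of one standard SVGD update step, the algorithm GSVGD
# builds on. Illustrative only; kernel, bandwidth, step size, and the
# Gaussian target below are assumptions, not the paper's settings.
import numpy as np

def svgd_step(particles, score_fn, bandwidth=1.0, step_size=0.1):
    """One SVGD update x_i <- x_i + eps * phi(x_i), where
    phi(x_i) = (1/n) sum_j [k(x_j, x_i) score(x_j) + grad_{x_j} k(x_j, x_i)]."""
    n, _ = particles.shape
    diffs = particles[:, None, :] - particles[None, :, :]  # diffs[i, j] = x_i - x_j
    K = np.exp(-np.sum(diffs ** 2, axis=-1) / (2 * bandwidth ** 2))  # kernel matrix
    scores = score_fn(particles)  # (n, d): score of the target at each particle
    # Attraction (kernel-weighted scores) plus repulsion (kernel gradients),
    # which keeps the particles from collapsing onto a single mode.
    phi = (K @ scores + np.sum(K[:, :, None] * diffs, axis=1) / bandwidth ** 2) / n
    return particles + step_size * phi

# Toy usage: target N(0, I), whose score function is score(x) = -x.
rng = np.random.default_rng(0)
x = rng.standard_normal((50, 2)) + 3.0  # particles initialised off-target
for _ in range(200):
    x = svgd_step(x, lambda p: -p)
print(x.mean(axis=0))  # particle mean drifts towards the target mean 0

GSVGD replaces this fixed full-dimensional kernel with projections onto subspaces learned via a coupled Grassmann-valued diffusion, which is what mitigates the variance underestimation described in the abstract.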

Conference paper

Zhu H, Liu X, Kang R, Shen Z, Flaxman S, Briol F-X et al., 2020, Bayesian Probabilistic Numerical Integration with Tree-Based Models, Conference on Neural Information Processing Systems

Conference paper
