Advisor: Christian Robert
This thesis focuses on accelerating Markov chain Monte Carlo (MCMC) algorithms for Bayesian computation; the contributions consist of devising novel MCMC samplers and scaling classic MCMC algorithms to massive data sets.
By updating only one coordinate of the parameter between two event times, we derive a novel non-reversible, continuous-time, Gibbs-like MCMC sampler, the Coordinate Sampler, and empirically demonstrate its acceleration over the Zigzag sampler. Furthermore, we prove that the Markov chain induced by the Coordinate Sampler is geometrically ergodic for target distributions whose tails decay at least as fast as an exponential distribution and at most as fast as a Gaussian distribution.
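The dynamics described above can be illustrated with a minimal sketch for a standard Gaussian target. The event rate, its closed-form inversion, and the velocity kernel below (new velocity drawn from {±e_j} with weight (−v′·∇U(x))₊ + λ_ref) follow one common piecewise-deterministic Markov process recipe and are assumptions for illustration; the precise rate and kernel of the Coordinate Sampler are given in the thesis.

```python
import numpy as np

rng = np.random.default_rng(1)

def grad_U(x):
    """Gradient of the potential for a standard Gaussian target, U(x) = ||x||^2 / 2."""
    return x

def event_time(a, lam_ref, e):
    """Invert the integrated rate Lambda(t) = lam_ref*t + int_0^t max(0, a+s) ds = e.

    Valid for the Gaussian target, where the rate along v = s*e_i is
    max(0, a + t) + lam_ref with a = s * x_i (exact inversion, no thinning)."""
    if a >= 0.0:
        b = lam_ref + a
        return -b + np.sqrt(b * b + 2.0 * e)
    if e <= -lam_ref * a:              # event fires while the bounce rate is still zero
        return e / lam_ref
    tau = -lam_ref + np.sqrt(lam_ref ** 2 + 2.0 * (e + lam_ref * a))
    return -a + tau

def coordinate_sampler(d=2, n_events=20000, lam_ref=1.0):
    x = np.zeros(d)
    i, s = 0, 1.0                      # active coordinate and direction (+/- 1)
    t_total, mean_acc = 0.0, np.zeros(d)
    for _ in range(n_events):
        t = event_time(s * x[i], lam_ref, rng.exponential())
        seg = t * x                    # time integral of the linear segment ...
        seg[i] += s * t * t / 2.0      # ... only coordinate i moves between events
        mean_acc += seg
        t_total += t
        x[i] += s * t                  # deterministic drift along one coordinate
        # velocity kernel: candidate +e_j has weight max(0, -g_j) + lam_ref,
        # candidate -e_j has weight max(0, g_j) + lam_ref  (assumed form)
        g = grad_U(x)
        w = np.concatenate([np.maximum(0.0, -g), np.maximum(0.0, g)]) + lam_ref
        k = rng.choice(2 * d, p=w / w.sum())
        i, s = k % d, (1.0 if k < d else -1.0)
    return mean_acc / t_total          # time-averaged estimate of E[x]

print(coordinate_sampler())            # close to the zero vector
```

The estimator is a time average along the piecewise-linear trajectory, not an average over event points, which is how expectations are computed for continuous-time samplers of this kind.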
By keeping the momentum unchanged across iterations, we derive an efficient variant of Hamiltonian Monte Carlo that recycles the intermediate proposals along the leapfrog path. By replacing the deterministic transition dynamics of the bouncy particle sampler with random ones, we overcome the reducibility problem suffered by the canonical bouncy particle sampler. Finally, by embedding random forests in a divide-and-conquer framework, we scale MCMC to big data and show that, in low-dimensional settings, this novel algorithm performs well from the Gaussian case to strongly non-Gaussian cases, and under model misspecification, with the help of random forests and scaled sub-posteriors.
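The divide-and-conquer pipeline can be sketched on a conjugate toy model. The combiner below is a simple precision-weighted average of sub-posterior draws (consensus Monte Carlo), not the thesis's random-forest combiner; the model, scale factors, and exact Gaussian sub-posterior draws are assumptions chosen so that no actual MCMC run is needed.

```python
import numpy as np

rng = np.random.default_rng(2)

# Toy model: y_i ~ N(theta, 1) with prior theta ~ N(0, tau2). By conjugacy each
# sub-posterior is Gaussian, so we can sample it exactly instead of running MCMC.
def subposterior_draws(y_shard, n_shards, tau2=10.0, n_draws=5000):
    # each shard uses a fractional prior N(0, tau2 * n_shards), so that the
    # product of the S sub-posteriors recovers the full posterior
    prec = len(y_shard) + 1.0 / (tau2 * n_shards)
    mean = y_shard.sum() / prec
    return mean + rng.standard_normal(n_draws) / np.sqrt(prec), prec

def consensus_combine(draws, precs):
    # precision-weighted average across shards; exact for Gaussian sub-posteriors
    return np.average(np.stack(draws), axis=0, weights=np.asarray(precs))

theta_true = 1.5
y = theta_true + rng.standard_normal(10000)
shards = np.array_split(y, 10)                   # divide the data into 10 shards
out = [subposterior_draws(sh, n_shards=10) for sh in shards]
combined = consensus_combine([d for d, _ in out], [p for _, p in out])
print(combined.mean())                           # close to the full-posterior mean
```

The role the random forest plays in the thesis is to replace this parametric averaging step with a nonparametric combination of the sub-posteriors, which is what makes the method robust in strongly non-Gaussian and misspecified settings.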