Mathematical Data Science, Faculty of Computer Science and Mathematics, Universität Passau

Daniel Rudolf

Professor of Mathematical Data Science at the Faculty of Computer Science and Mathematics of the University of Passau.
Institution: University of Passau
Faculty of Computer Science and Mathematics
Email: daniel.rudolf (at) uni-passau.de
Phone: +49 851 509 5140
Office: Room 222, IM
Address: Universität Passau
Innstraße 33
94032 Passau


Felix-Bernstein-Institute for Mathematical Statistics
DFG Collaborative Research Center 1456
MATH Database (Zentralblatt)
Coauthors: Christoph Aistleitner, Samuel Boardman, Simon Breneis, Benjamin Eltzner, Josef Dick, Manuel Diehn, Michael Habeck, Aicke Hinrichs, Laura Jula Vanegas, David Krieg, Robert J. Kunsch, Krzysztof Latuszynski, Felipe Medina Aguayo, Axel Munk, Viacheslav Natarovskii, Maik Neukirch, Erich Novak, Joscha Prochno, Art Owen, Laurent Saloff-Coste, Nikolaus Schweizer, Björn Sprungk, Mario Ullrich, Andi Q. Wang, Houying Zhu.
Research group:
Viacheslav Natarovskii (PhD student).
Former member of the research group:
Björn Sprungk (Junior Professor at TU Freiberg; see also the SIAG/UQ Early Career Prize 2019).

Past teaching

Statistical Data Science (2021)
Discrete Stochastics (2020/21)
Hidden Markov models (2020)
Markov chains II (2019/20)
Markov chains I (2019)
Monte Carlo methods and stochastic simulations (2018/19)
Stochastics (2018)
Measure and Probability Theory (2017/18)
Markov Chain Monte Carlo Methods (2015)
Actuarial Mathematics (2014/15)
Markov chain Monte Carlo on general state spaces (2012)

Research interests

Editorial work


(see also Articles on arXiv)

Preprints:


4. Perturbation theory for killed Markov processes and quasi-stationary distributions,
with Andi Q. Wang,
submitted. [arxiv]
3. The minimal spherical dispersion,
with Simon Breneis and Joscha Prochno,
submitted. [arxiv]
2. Analyzing cross-talk between superimposed signals: Vector norm dependent hidden Markov models and applications,
with Laura Jula Vanegas, Benjamin Eltzner, Miroslav Dura, Stephan E. Lehnart and Axel Munk,
submitted. [arxiv]
1. Convergence of hybrid slice sampling via spectral gap,
with Krzysztof Latuszynski,
submitted. [arxiv]

Peer-reviewed publications:

31. The hit-and-run version of top-to-random,
with Samuel Boardman and Laurent Saloff-Coste,
accepted in J. Appl. Probab. [arxiv]
30. Geometric convergence of elliptical slice sampling,
with Viacheslav Natarovskii and Björn Sprungk,
PMLR 139 (2021), 7969-7978. [arxiv]
29. A strong law of large numbers for scrambled net integration,
with Art Owen,
SIAM Rev. 63 (2021), 360-372. [arxiv]
28. Stability of doubly-intractable distributions,
with Michael Habeck and Björn Sprungk,
Electron. Commun. Probab. 25 (2020), 1-13. [arxiv]
27. Quantitative spectral gap estimate and Wasserstein contraction of simple slice sampling,
with Viacheslav Natarovskii and Björn Sprungk,
Ann. Appl. Probab. 31 (2021), 806-825. [arxiv]
26. Expected dispersion of uniformly distributed points,
with Aicke Hinrichs, David Krieg and Robert J. Kunsch,
J. Complexity 61 (2020). [arxiv]
25. On a Metropolis-Hastings importance sampling estimator,
with Björn Sprungk,
Electron. J. Stat. 14 (2020), 857-889. [arxiv]
24. Optimal confidence for Monte Carlo integration of smooth functions,
with Robert J. Kunsch,
Adv. Comput. Math. 45 (2019), 3095-3122. [arxiv]
23. Perturbation bounds for Monte Carlo within Metropolis via restricted approximations,
with Felipe Medina-Aguayo and Nikolaus Schweizer,
Stoch. Proc. Appl. 130 (2020), 2200-2227. [arxiv]
22. The amplitude-phase decomposition of the magnetotelluric impedance tensor,
with Maik Neukirch, Xavier Garcia and Savitri Galiana,
Geophysics 84 (2019), A43-Z28. [arxiv]
21. A weighted discrepancy bound of quasi-Monte Carlo importance sampling,
with Josef Dick and Houying Zhu,
Stat. Prob. Letters 149 (2019), 100-106. [arxiv]
20. Solvable integration problems and optimal sample size selection,
with Robert J. Kunsch and Erich Novak,
J. Complexity 53 (2019), 40-67. [arxiv]
19. Maximum likelihood estimation in hidden Markov models with inhomogeneous noise,
with Manuel Diehn and Axel Munk,
ESAIM Probab. Stat. 23 (2019), 492-523. [arxiv]
18. Recovery algorithms for high-dimensional rank one tensors,
with David Krieg,
J. Approx. Theory 237 (2019), 17-29. [arxiv]
17. Comparison of hit-and-run, slice sampling and random walk Metropolis,
with Mario Ullrich,
J. Appl. Probab. 55 (2018), 1186-1202. [arxiv]
16. An upper bound of the minimal dispersion via delta covers,
Contemporary Computational Mathematics - A Celebration of the 80th Birthday of Ian Sloan, Springer-Verlag (2018), 1099-1108. [arxiv]
15. Perturbation theory for Markov chains via Wasserstein distance,
with Nikolaus Schweizer,
Bernoulli 24 (2018), 2610-2639. [arxiv]
14. On a generalization of the preconditioned Crank-Nicolson Metropolis algorithm,
with Björn Sprungk,
Found. Comput. Math. 18 (2018), 309-343. [arxiv]
13. Metropolis-Hastings importance sampling estimator,
with Björn Sprungk,
PAMM Proc. Appl. Math. Mech. 17 (2017), 731-734. [pdf]
12. On the size of the largest empty box amidst a point set,
with Christoph Aistleitner and Aicke Hinrichs,
Discrete Appl. Math. 230 (2017), 146-150. [arxiv]
11. Discrepancy bounds for uniformly ergodic Markov chain quasi-Monte Carlo,
with Josef Dick and Houying Zhu,
Ann. Appl. Probab. 26 (2016), 3178-3205. [arxiv]
10. Tractability of the approximation of high-dimensional rank one tensors,
with Erich Novak,
Constr. Approx. 43 (2016), 1-13. [arxiv]
9. Discussion of "Sequential Quasi-Monte-Carlo Sampling" by Gerber and Chopin,
J. R. Stat. Soc. Ser. B 77 (2015), 570-571. [pdf]
8. Error bounds of MCMC for functions with unbounded stationary variance,
with Nikolaus Schweizer,
Stat. Prob. Letters 99 (2015), 6-12. [arxiv]
7. Discrepancy estimates for variance bounding Markov chain quasi-Monte Carlo,
with Josef Dick,
Electron. J. Probab. 19 (2014), 1-24. [arxiv]
6. Computation of expectations by Markov chain Monte Carlo methods,
with Erich Novak,
Extraction of Quantifiable Information from Complex Systems, Lecture Notes in Computational Science and Engineering Volume 102 (2014), 397-411. [arxiv]
5. Positivity of hit-and-run and related algorithms,
with Mario Ullrich,
Electron. Commun. Probab. 18 (2013), 1-8. [arxiv]
4. Hit-and-run for numerical integration,
Monte Carlo and Quasi-Monte Carlo Methods 2012, Springer Proceedings in Mathematics & Statistics Volume 65 (2013), 597-612. [arxiv]
3. Explicit error bounds for Markov chain Monte Carlo,
Dissertationes Math. 485 (2012), 93 pp. [arxiv]
2. Error bounds for computing the expectation by Markov chain Monte Carlo,
Monte Carlo Meth. Appl. 16 (2010), 323-342. [arxiv]
1. Explicit error bounds for lazy reversible Markov chain Monte Carlo,
J. Complexity 25 (2009), 11-24. [arxiv]