Preface to the Second Edition  ix
Preface to the First Edition  xiii

1 Introduction  1
1.4 Deterministic Numerical Methods  19
1.6.1 Prior Distributions  30

2 Random Variable Generation  35
2.1.2 The Inverse Transform  38
2.2 General Transformation Methods  42
2.3 Accept-Reject Methods  47
2.3.1 The Fundamental Theorem of Simulation  47
2.3.2 The Accept-Reject Algorithm  51
2.4 Envelope Accept-Reject Methods  53
2.4.1 The Squeeze Principle  53
2.4.2 Log-Concave Densities  56
2.6.2 Quasi-Monte Carlo Methods  75
2.6.3 Mixture Representations  77

3 Monte Carlo Integration  79
3.2 Classical Monte Carlo Integration  83
3.3.2 Finite Variance Estimators  94
3.3.3 Comparing Importance Sampling with Accept-Reject  103
3.4 Laplace Approximations  107
3.6.1 Large Deviations Techniques  119
3.6.2 The Saddlepoint Approximation  120

4 Controlling Monte Carlo Variance  123
4.1 Monitoring Variation with the CLT  123
4.1.1 Univariate Monitoring  124
4.1.2 Multivariate Monitoring  128
4.3 Riemann Approximations  134
4.4.1 Antithetic Variables  140
4.6.1 Monitoring Importance Sampling Convergence  153
4.6.2 Accept-Reject with Loose Bounds  154

5 Monte Carlo Optimization  157
5.2 Stochastic Exploration  159
5.2.3 Simulated Annealing  163
5.3 Stochastic Approximation  174
5.3.1 Missing Data Models and Demarginalization  174
5.5.3 The Robbins-Monro Procedure  201
5.5.4 Monte Carlo Approximation  203

6 Markov Chains  205
6.3 Irreducibility, Atoms, and Small Sets  213
6.3.2 Atoms and Small Sets  214
6.3.3 Cycles and Aperiodicity  217
6.4 Transience and Recurrence  218
6.4.1 Classification of Irreducible Chains  218
6.4.2 Criteria for Recurrence  221
6.5.3 Reversibility and the Detailed Balance Condition  229
6.6 Ergodicity and Convergence  231
6.6.2 Geometric Convergence  236
6.7.2 Central Limit Theorems  242
6.9.2 Eaton's Admissibility Condition  262
6.9.3 Alternative Convergence Conditions  263
6.9.4 Mixing Conditions and Central Limit Theorems  263
6.9.5 Covariance in Markov Chains  265

7 The Metropolis-Hastings Algorithm  267
7.2 Monte Carlo Methods Based on Markov Chains  269
7.3 The Metropolis-Hastings Algorithm  270
7.3.2 Convergence Properties  272
7.4 The Independent Metropolis-Hastings Algorithm  276
7.4.2 A Metropolis-Hastings Version of ARS  285
7.6 Optimization and Control  292
7.6.1 Optimizing the Acceptance Rate  292
7.6.2 Conditioning and Accelerations  295
7.8.1 Background of the Metropolis Algorithm  313
7.8.2 Geometric Convergence of Metropolis-Hastings Algorithms  315
7.8.3 A Reinterpretation of Simulated Annealing  315
7.8.4 Reference Acceptance Rates  316
7.8.5 Langevin Algorithms  318

8 The Slice Sampler  321
8.1 Another Look at the Fundamental Theorem  321
8.2 The General Slice Sampler  326
8.3 Convergence Properties of the Slice Sampler  329
8.5.1 Dealing with Difficult Slices  335

9 The Two-Stage Gibbs Sampler  337
9.1 A General Class of Two-Stage Algorithms  337
9.1.1 From Slice Sampling to Gibbs Sampling  337
9.1.3 Back to the Slice Sampler  343
9.1.4 The Hammersley-Clifford Theorem  343
9.2 Fundamental Properties  344
9.2.1 Probabilistic Structures  344
9.2.2 Reversible and Interleaving Chains  349
9.2.3 The Duality Principle  351
9.3 Monotone Covariance and Rao-Blackwellization  354
9.4 The EM-Gibbs Connection  357
9.7.1 Inference for Mixtures  366

10 The Multi-Stage Gibbs Sampler  371
10.1.3 The General Hammersley-Clifford Theorem  376
10.2 Theoretical Justifications  378
10.2.1 Markov Properties of the Gibbs Sampler  378
10.2.2 Gibbs Sampling as Metropolis-Hastings  381
10.2.3 Hierarchical Structures  383
10.3 Hybrid Gibbs Samplers  387
10.3.1 Comparison with Metropolis-Hastings Algorithms  387
10.3.2 Mixtures and Cycles  388
10.3.3 Metropolizing the Gibbs Sampler  392
10.4 Statistical Considerations  396
10.4.1 Reparameterization  396
10.4.2 Rao-Blackwellization  402
10.6.1 A Bit of Background  419
10.6.3 Nonparametric Mixtures  420

11 Variable Dimension Models and Reversible Jump Algorithms  425
11.1 Variable Dimension Models  425
11.1.1 Bayesian Model Choice  426
11.1.2 Difficulties in Model Choice  427
11.2 Reversible Jump Algorithms  429
11.2.2 A Fixed Dimension Reassessment  432
11.2.3 The Practice of Reversible Jump MCMC  433
11.3 Alternatives to Reversible Jump MCMC  444
11.3.2 Continuous-Time Jump Processes  446

12 Diagnosing Convergence  459
12.1.1 Convergence Criteria  461
12.1.3 Monitoring Reconsidered  465
12.2 Monitoring Convergence to the Stationary Distribution  465
12.2.1 A First Illustration  465
12.2.2 Nonparametric Tests of Stationarity  466
12.2.5 Distance Evaluations  478
12.3 Monitoring Convergence of Averages  480
12.3.1 A First Illustration  480
12.3.2 Multiple Estimates  483
12.3.4 Within and Between Variances  497
12.3.5 Effective Sample Size  499
12.4 Simultaneous Monitoring  500
12.4.2 Valid Discretization  503

13 Perfect Sampling  511
13.2 Coupling from the Past  513
13.2.1 Random Mappings and Coupling  513
13.2.2 Propp and Wilson's Algorithm  516
13.2.3 Monotonicity and Envelopes  518
13.2.4 Continuous State Spaces  523
13.2.5 Perfect Slice Sampling  526
13.2.6 Perfect Sampling via Automatic Coupling  530
13.4 Perfect Sampling in Practice  535
13.6.2 Perfect Sampling and Tempering  540

14 Iterated and Sequential Importance Sampling  545
14.2 Generalized Importance Sampling  546
14.3.1 Sequential Monte Carlo  547
14.3.2 Hidden Markov Models  549
14.3.5 Sampling Strategies  554
14.3.6 Fighting the Degeneracy  556
14.3.7 Convergence of Particle Systems  558
14.4 Population Monte Carlo  559
14.4.2 General Iterative Importance Sampling  560
14.4.3 Population Monte Carlo  562
14.4.4 An Illustration for the Mixture Model  563
14.4.5 Adaptivity in Sequential Algorithms  565
14.6.1 A Brief History of Particle Systems  577
14.6.2 Dynamic Importance Sampling  577
14.6.3 Hidden Markov Models  579

A Probability Distributions  581
B Notation  585
References  591
Index of Names  623
Index of Subjects  631