8 Day 8 (February 13)
8.1 Announcements
The material I talk about today will come from Chapters 4, 6, and 23.
Final project is posted
- Short lecture with donuts/meet people today
 
Activity 3 is posted
8.2 Introduction to the Metropolis-Hastings algorithm
Note this material is in Ch. 4
What is a Metropolis-Hastings algorithm?
Why use a Metropolis-Hastings algorithm?
- Original work (see link and link)
 - Wikipedia page
 - Only need to know a function that is proportional to the PDF/PMF
 - Why is this such a big deal for Bayesian statistics?
 - What else do we need to unlock the full power of Bayesian inference?
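The key point above — that we only need a function *proportional* to the target PDF/PMF — can be sketched in code. The course scripts are in R, but the following Python sketch of a random-walk Metropolis sampler shows the same idea; the target here (an unnormalized Beta(3, 2) density) and all tuning values are illustrative assumptions, not from the book.

```python
import math
import random

def log_unnorm(theta):
    # Log of a function merely PROPORTIONAL to a Beta(3, 2) density:
    # theta^2 * (1 - theta). The normalizing constant is never needed.
    if theta <= 0.0 or theta >= 1.0:
        return float("-inf")
    return 2.0 * math.log(theta) + math.log(1.0 - theta)

def metropolis(log_target, start, n_iter, step=0.1, seed=1):
    rng = random.Random(seed)
    chain = [start]
    current_lp = log_target(start)
    for _ in range(n_iter):
        # symmetric (random-walk) proposal centered at the current value
        proposal = chain[-1] + rng.gauss(0.0, step)
        prop_lp = log_target(proposal)
        # accept with probability min(1, target(proposal) / target(current))
        if math.log(rng.random()) < prop_lp - current_lp:
            chain.append(proposal)
            current_lp = prop_lp
        else:
            chain.append(chain[-1])
    return chain

chain = metropolis(log_unnorm, start=0.5, n_iter=5000)
post_mean = sum(chain[1000:]) / len(chain[1000:])  # discard early draws
print(round(post_mean, 2))  # a Beta(3, 2) distribution has mean 0.6
```

Because the acceptance ratio is a ratio of target values, any unknown normalizing constant cancels — which is exactly why MH matters for Bayesian posteriors, whose normalizing constant is usually intractable.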
 
What we lose by using a Metropolis-Hastings algorithm
- Requires a bit more programming and supervision/checking
 - Correlated samples vs. independent samples
 - Burn-in interval
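The "correlated vs. independent samples" point can be made concrete. As a sketch (Python standing in for the course's R), the snippet below compares the lag-1 autocorrelation of independent draws with that of a strongly correlated AR(1) sequence, which mimics typical MCMC output; the AR coefficient of 0.9 is an illustrative assumption.

```python
import random

def lag1_autocorr(x):
    # sample lag-1 autocorrelation of a sequence
    n = len(x)
    mean = sum(x) / n
    num = sum((x[i] - mean) * (x[i + 1] - mean) for i in range(n - 1))
    den = sum((xi - mean) ** 2 for xi in x)
    return num / den

rng = random.Random(0)
# independent draws
iid = [rng.gauss(0.0, 1.0) for _ in range(5000)]
# a correlated chain (AR(1), coefficient 0.9), mimicking MCMC output
chain = [0.0]
for _ in range(4999):
    chain.append(0.9 * chain[-1] + rng.gauss(0.0, 1.0))

print(round(lag1_autocorr(iid), 2))    # near 0
print(round(lag1_autocorr(chain), 2))  # near 0.9
```

High autocorrelation means each new draw carries little new information, so a correlated chain must be run much longer than an independent sampler to achieve the same effective sample size — and the early, pre-convergence portion (the burn-in interval) must be discarded as well.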
 
Live example using bat and coin data/model
- See pg. 25 in book for algorithm
 - Download R script
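The book's algorithm (pg. 25) and R script are the reference here; as a parallel sketch, the Python snippet below runs MH on the coin part of the example under assumed data (7 heads in 10 flips — a hypothetical stand-in, not the book's data), with a uniform prior on the heads probability p.

```python
import math
import random

# hypothetical coin data: y heads in n flips (stand-in for the book's data)
y, n = 7, 10

def log_post_unnorm(p):
    # uniform prior times binomial likelihood, up to a constant
    if p <= 0.0 or p >= 1.0:
        return float("-inf")
    return y * math.log(p) + (n - y) * math.log(1.0 - p)

rng = random.Random(42)
p_cur, lp_cur = 0.5, log_post_unnorm(0.5)
samples = []
for it in range(20000):
    p_prop = p_cur + rng.gauss(0.0, 0.2)  # random-walk proposal
    lp_prop = log_post_unnorm(p_prop)
    if math.log(rng.random()) < lp_prop - lp_cur:
        p_cur, lp_cur = p_prop, lp_prop
    if it >= 2000:                        # discard burn-in
        samples.append(p_cur)

# the exact posterior here is Beta(8, 4), whose mean is 8/12 ≈ 0.67
print(round(sum(samples) / len(samples), 2))
```

With a conjugate uniform (Beta(1, 1)) prior the exact posterior is known, which makes this a useful sanity check before applying the same machinery to models without closed-form posteriors.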
 
Automated software
- All of this can be automated from R (WinBUGS, OpenBUGS, JAGS, NIMBLE, Stan, etc.)
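With these tools you write down only the model and let the software build the sampler. As an illustration (a sketch, not from the book), the coin model above might look like this in Stan:

```stan
data {
  int<lower=0> N;              // number of flips
  int<lower=0, upper=N> y;     // number of heads
}
parameters {
  real<lower=0, upper=1> theta;  // probability of heads
}
model {
  theta ~ beta(1, 1);          // uniform prior
  y ~ binomial(N, theta);      // likelihood
}
```

The trade-off: less programming and fewer tuning decisions, but less transparency about what the sampler is doing — so the supervision/checking step above still applies.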
 
8.3 Our second statistical model
- Dig into the rabies test a bit more…
- Rabies test results
 - Building a statistical model using a hierarchical Bayesian approach
 - Specify (write out) the data model
 - Specify the process model
 - Specify the parameter model (or prior) including hyper-parameters
 - Select an approach to obtain the posterior distribution
- Gibbs sampler
 - Derive full conditionals
 - Discussion of trade-offs: Gibbs sampling with analytical full conditionals vs. Metropolis-Hastings