Category Archives: TidyX Screen Cast

TidyX Episode 104: TidyX goes on a lubridate

This week, Ellis Hughes and I discuss date objects in R, which can be rather confusing. We talk about what is under the hood of a date object, how to specify dates, how to parse them, and how to calculate differences between them.

If you work with any timestamped data (e.g., GPS, accelerometer, or force plate data), this is an episode you will want to check out, as dealing with those pesky date/time columns can be tricky!
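In case you want a quick taste before watching, here is a minimal sketch (with made-up dates, not values from the episode) of the sort of thing we cover: what a Date object stores under the hood, how {lubridate} parses character strings into dates, and how to compute the difference between two dates.

```r
library(lubridate)

# A Date object is just a number under the hood:
# the count of days since the origin, 1970-01-01
d <- as.Date("2023-03-15")
unclass(d)   # days since 1970-01-01

# {lubridate} parses character strings based on the order
# of the date components (year/month/day, month/day/year, etc.)
session_1 <- ymd("2023/03/15")
session_2 <- mdy("4-02-2023")

# Subtracting dates returns a difftime object;
# difftime() lets you pick the units explicitly
session_2 - session_1
difftime(session_2, session_1, units = "days")
```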

To watch the screen cast, CLICK HERE.

To access our code, CLICK HERE.

TidyX Episode 103: Gibbs Sampling

This week, Ellis Hughes and I wrap up our Intro to Bayesian Analysis series. Up to this point, we’ve been talking about conjugate priors for the binomial, Poisson, and normal distributions.

Unfortunately, when using the normal-normal conjugate you need to assume that one of the two distribution parameters (the mean or the standard deviation) is known and then estimate the other. For example, last episode we assumed the standard deviation was known, which allowed us to estimate the mean. This is a problem in situations where you don’t know the standard deviation and therefore need to estimate both parameters. For this, we turn to Gibbs sampling. A Gibbs sampler is a Markov chain Monte Carlo (MCMC) approach to Bayesian inference.

In this episode, we walk through building your own Gibbs sampler, calculating posterior summary statistics, and plotting the posterior samples and trace plots. We wrap up by showing how to use the normpostsim() function from Jim Albert’s {LearnBayes} package for instances where you don’t want to code up your own Gibbs sampler.
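For a flavor of the approach, here is a minimal, hand-rolled sketch of a Gibbs sampler for normal data with both the mean and the variance unknown, using a normal prior on the mean and an inverse-gamma prior on the variance. The simulated data and prior settings below are placeholders, not the values from the episode.

```r
set.seed(42)

## simulated "observed" data -- placeholder values for illustration
y     <- rnorm(30, mean = 100, sd = 10)
n     <- length(y)
y_bar <- mean(y)

## priors (assumed values, not the episode's):
##   mu      ~ Normal(mu0, tau0^2)
##   sigma^2 ~ Inverse-Gamma(a0, b0)
mu0 <- 90;  tau0_sq <- 25^2
a0  <- 1;   b0      <- 1

n_iter   <- 5000
mu_draws <- numeric(n_iter)
s2_draws <- numeric(n_iter)

## starting values
mu <- y_bar
s2 <- var(y)

for (i in 1:n_iter) {
  ## 1) draw mu from its full conditional (normal)
  post_prec <- 1 / tau0_sq + n / s2
  post_mean <- (mu0 / tau0_sq + n * y_bar / s2) / post_prec
  mu <- rnorm(1, mean = post_mean, sd = sqrt(1 / post_prec))

  ## 2) draw sigma^2 from its full conditional (inverse-gamma)
  a_n <- a0 + n / 2
  b_n <- b0 + sum((y - mu)^2) / 2
  s2  <- 1 / rgamma(1, shape = a_n, rate = b_n)

  mu_draws[i] <- mu
  s2_draws[i] <- s2
}

## posterior summaries and a simple trace plot for mu
quantile(mu_draws, c(0.025, 0.5, 0.975))
plot(mu_draws, type = "l", main = "Trace plot: mu")
```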

To watch our screen cast, CLICK HERE.

To access the code, CLICK HERE.

TidyX Episode 102: Normal-Normal Conjugate

So far we’ve covered the beta-binomial conjugate and the gamma-Poisson conjugate. This week, Ellis Hughes and I discuss Bayesian analysis using the normal-normal conjugate. We build an example around a basketball player’s efficiency rating and show how to estimate the player’s performance given prior knowledge of player efficiency ratings and some observed values from the player’s games.
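As a rough sketch of the underlying update (with placeholder numbers rather than the episode’s actual values), the normal-normal conjugate with a known observation standard deviation combines the prior mean and the observed mean, weighted by their precisions:

```r
## normal-normal conjugate update with a known observation SD
## (all numbers below are placeholders, not the episode's data)
prior_mu <- 15    # prior mean for player efficiency rating
prior_sd <- 5     # prior SD (uncertainty about the true rating)
known_sd <- 8     # assumed known game-to-game SD

obs      <- c(12, 18, 22, 9, 16)   # observed game ratings
n        <- length(obs)
obs_mean <- mean(obs)

## precisions (1 / variance) do the weighting
prior_prec <- 1 / prior_sd^2
data_prec  <- n / known_sd^2

post_prec <- prior_prec + data_prec
post_mu   <- (prior_mu * prior_prec + obs_mean * data_prec) / post_prec
post_sd   <- sqrt(1 / post_prec)

c(posterior_mean = post_mu, posterior_sd = post_sd)
```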

To watch the screen cast, CLICK HERE.

To access our code, CLICK HERE.

TidyX Episode 101: Gamma-Poisson Conjugate

This week, Ellis Hughes and I continue our discussion of Bayesian analysis, this time transitioning to count data. We build an example using the gamma-Poisson conjugate to take observations of a basketball player’s points per game and update our prior knowledge about points per game for our given population.
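The update itself is compact. A minimal sketch, with placeholder numbers rather than the episode’s data, looks something like this:

```r
## gamma-Poisson conjugate update (placeholder numbers)
## prior on the scoring rate: lambda ~ Gamma(alpha0, beta0)
alpha0 <- 50   # roughly: prior "points" observed
beta0  <- 2    # roughly: prior "games" observed (prior mean = 50 / 2 = 25 ppg)

pts <- c(28, 19, 31, 24, 22)   # observed points in recent games
n   <- length(pts)

## posterior: lambda ~ Gamma(alpha0 + sum(x), beta0 + n)
alpha_n <- alpha0 + sum(pts)
beta_n  <- beta0 + n

alpha_n / beta_n                                        # posterior mean points per game
qgamma(c(0.05, 0.95), shape = alpha_n, rate = beta_n)   # 90% credible interval
```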

To watch the screen cast, CLICK HERE.

To access the code, CLICK HERE.

TidyX Episode 100: Beta Conjugate

This week, Ellis Hughes and I shift our initial intro to Bayes toward solving an actual problem, using the beta distribution as a conjugate prior for a binomial likelihood. We discuss what a conjugate prior is, cover updating your prior knowledge sequentially as new data becomes available, and show how the posterior distribution produced by our Bayesian approach changes our beliefs about a specific outcome.
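To make the sequential updating concrete, here is a minimal sketch with made-up numbers: each new batch of successes and failures is simply added to the beta parameters, and the updated posterior becomes the prior for the next batch.

```r
## beta-binomial conjugate updating (made-up numbers)
## prior belief about a success probability: p ~ Beta(alpha, beta)
alpha <- 4
beta  <- 6

## new data arrives in batches of (successes, failures);
## the posterior after each batch is Beta(alpha + successes, beta + failures)
batches <- list(c(7, 3), c(5, 5), c(9, 1))

for (b in batches) {
  alpha <- alpha + b[1]
  beta  <- beta  + b[2]
  cat(sprintf("posterior: Beta(%g, %g), mean = %.3f\n",
              alpha, beta, alpha / (alpha + beta)))
}

## plot the final posterior
curve(dbeta(x, alpha, beta), from = 0, to = 1,
      xlab = "p", ylab = "density", main = "Posterior after all batches")
```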

Aside from this episode, we also discussed this approach in TidyX Episode 11, where we used a beta prior to update our knowledge about the winning percentage of professional beach volleyball teams across a season. That episode also covered how to use the method of moments (sometimes referred to as moment matching) to select the alpha and beta parameters for the beta distribution if you don’t have a good sense of what they should be when establishing your prior.
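For reference, the moment-matching step can be written in a few lines: given a prior guess at the mean and variance of the winning percentage, solve for the alpha and beta that reproduce those moments (the numbers below are placeholders).

```r
## method of moments (moment matching) for a Beta(alpha, beta) prior
## placeholder guesses at the mean and variance of a win percentage
mu   <- 0.55
sig2 <- 0.02

## for Beta(alpha, beta): mean = alpha / (alpha + beta)
##                        var  = alpha * beta / ((alpha + beta)^2 * (alpha + beta + 1))
## solving for the parameters:
nu    <- mu * (1 - mu) / sig2 - 1   # equals alpha + beta
alpha <- mu * nu
beta  <- (1 - mu) * nu

c(alpha = alpha, beta = beta)

## sanity check: recover the target moments
c(mean = alpha / (alpha + beta),
  var  = alpha * beta / ((alpha + beta)^2 * (alpha + beta + 1)))
```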

To watch our screen cast, CLICK HERE.

To access our code, CLICK HERE.

To access the screen cast and code for Episode 11, CLICK HERE.