Thursday, October 11, 2018

2018-10-11 Thursday - Proposed Talk for O'Reilly AI Conference 2019

A proposed talk I've submitted for the O'Reilly Artificial Intelligence Conference in New York, April 15-18, 2019.

https://conferences.oreilly.com/artificial-intelligence/ai-ny

JAGS vs Edward for Markov Chain Monte Carlo (MCMC) Simulation

An AI neophyte's experience exploring the capabilities and features of JAGS vs Edward for Markov chain Monte Carlo (MCMC) simulation. Lessons learned. A discussion of my approach to working through the various learning materials I found useful in gaining a working knowledge of each tool.
This presentation will compare and contrast the utility of JAGS vs Edward for implementing Markov chain Monte Carlo (MCMC) simulation.

In statistics, Markov chain Monte Carlo (MCMC) methods comprise a class of algorithms for sampling from a probability distribution. By constructing a Markov chain that has the desired distribution as its equilibrium distribution, one can obtain a sample of the desired distribution by observing the chain after a number of steps. The more steps there are, the more closely the distribution of the sample matches the actual desired distribution.
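To make that idea concrete, here is a minimal sketch (not part of the talk proposal itself) of a random-walk Metropolis-Hastings sampler in Python, targeting a standard normal distribution. The proposal standard deviation, step count, and burn-in length are arbitrary choices for illustration.

import numpy as np

def log_target(x):
    # Log-density of the target distribution (standard normal, up to a constant).
    return -0.5 * x ** 2

def metropolis_hastings(n_steps=10_000, proposal_sd=1.0, seed=0):
    # Random-walk Metropolis-Hastings: propose a Gaussian step, then accept or
    # reject it so that the chain's equilibrium distribution is the target.
    rng = np.random.default_rng(seed)
    samples = np.empty(n_steps)
    x = 0.0  # arbitrary starting point
    for i in range(n_steps):
        proposal = x + rng.normal(scale=proposal_sd)
        # Accept with probability min(1, target(proposal) / target(x)),
        # computed on the log scale for numerical stability.
        if np.log(rng.uniform()) < log_target(proposal) - log_target(x):
            x = proposal
        samples[i] = x
    return samples

draws = metropolis_hastings()
print(draws[1000:].mean(), draws[1000:].std())  # after burn-in, roughly 0 and 1

The longer the chain runs, the closer the empirical distribution of the retained draws gets to the target, which is exactly the property the talk will lean on when comparing the two tools.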

Edward is a Python library for probabilistic modeling, inference, and criticism. It is a testbed for fast experimentation and research with probabilistic models, ranging from classical hierarchical models on small data sets to complex deep probabilistic models on large data sets. Edward fuses three fields: Bayesian statistics and machine learning, deep learning, and probabilistic programming.
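To give a rough sense of what the Edward side of the comparison looks like, the following sketch is based on my reading of the Edward 1.x API (edward.models plus ed.HMC). The synthetic data, priors, and sample counts are illustrative assumptions of mine, not material from the talk.

import edward as ed
import numpy as np
import tensorflow as tf
from edward.models import Empirical, Normal

# Synthetic data: 50 noisy observations of an unknown mean (illustrative only).
x_data = np.random.normal(loc=3.0, scale=1.0, size=50).astype(np.float32)

# Model: theta ~ Normal(0, 10); x_i ~ Normal(theta, 1).
theta = Normal(loc=0.0, scale=10.0)
x = Normal(loc=tf.ones(50) * theta, scale=1.0)

# MCMC inference: represent the posterior by an Empirical distribution of
# 5000 draws and fill it in with Hamiltonian Monte Carlo.
qtheta = Empirical(params=tf.Variable(tf.zeros(5000)))
inference = ed.HMC({theta: qtheta}, data={x: x_data})
inference.run()

sess = ed.get_session()
print(sess.run(qtheta.mean()))  # posterior mean of theta

Because Edward sits on top of TensorFlow, the model, the approximating distribution, and the sampler are all ordinary Python objects, which is one of the contrasts with JAGS the talk will explore.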

JAGS is Just Another Gibbs Sampler. It is a program for analysis of Bayesian hierarchical models using Markov Chain Monte Carlo (MCMC) simulation.
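For comparison, the same toy model in JAGS is written in the BUGS-style modeling language and driven from a host language. The sketch below assumes the pyjags package as the Python front end; the priors, chain counts, and iteration counts are again illustrative choices of mine.

import numpy as np
import pyjags  # assumes the pyjags front end to the JAGS library is installed

# Same synthetic data as above (illustrative only).
x = np.random.normal(loc=3.0, scale=1.0, size=50)

# JAGS model in the BUGS-style language. Note that dnorm is parameterized
# by mean and precision (1 / variance), not standard deviation.
model_code = """
model {
  for (i in 1:N) {
    x[i] ~ dnorm(theta, 1)
  }
  theta ~ dnorm(0, 0.01)
}
"""

model = pyjags.Model(code=model_code, data=dict(x=x, N=len(x)), chains=4, adapt=1000)
samples = model.sample(5000, vars=['theta'])
print(samples['theta'].mean())  # average over iterations and chains

Here the model is a string handed to the JAGS engine, which chooses its own samplers (Gibbs steps where it can), whereas in Edward the sampler is something you pick and configure explicitly. That difference in workflow is the heart of the proposed comparison.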

