Show simple item record

dc.contributor.author: Resnik, Philip
dc.contributor.author: Hardisty, Eric
dc.date.accessioned: 2010-04-18T23:04:28Z
dc.date.available: 2010-04-18T23:04:28Z
dc.date.issued: 2010-04-16
dc.identifier.uri: http://hdl.handle.net/1903/10058
dc.description.abstract: This document is intended for computer scientists who would like to try out a Markov Chain Monte Carlo (MCMC) technique, particularly in order to do inference with Bayesian models on problems related to text processing. We try to keep theory to the absolute minimum needed, though we work through the details much more explicitly than you usually see even in "introductory" explanations. That means we've attempted to be ridiculously explicit in our exposition and notation. After providing the reasons and reasoning behind Gibbs sampling (and at least nodding our heads in the direction of theory), we work through an example application in detail: the derivation of a Gibbs sampler for a Naive Bayes model. Along with the example, we discuss some practical implementation issues, including the integrating out of continuous parameters when possible. We conclude with some pointers to literature that we've found to be somewhat more friendly to uninitiated readers. (en_US)
dc.language.iso: en_US (en_US)
dc.relation.ispartofseries: UM Computer Science Department;CS-TR-4956
dc.relation.ispartofseries: UMIACS;UMIACS-TR-2010-04
dc.relation.ispartofseries: ;LAMP-153
dc.title: Gibbs Sampling for the Uninitiated (en_US)
dc.type: Technical Report (en_US)


