Approximate marginal inference in binary Markov random fields using a mean squared error energy approximation and the junction tree algorithm

Wiig, Trygve Bertelsen
Master thesis
View/Open
14236_FULLTEXT.pdf (1.180Mb)
14236_COVER.pdf (1.556Mb)
URI
http://hdl.handle.net/11250/2400157
Date
2016
Collections
  • Institutt for matematiske fag [1769]
Abstract
In this thesis, we develop an approximate marginal inference algorithm for binary Markov random fields: a mean squared error energy approximation is used to delete edges so that inference with the junction tree algorithm becomes tractable. We apply the algorithm to a wide range of example models, including pairwise Ising models and Boltzmann machines, as well as two types of higher-order grid models. Three different approaches to selecting edges for deletion are developed and compared, and the quality of our algorithm's approximations is compared with that of loopy belief propagation and of Gibbs sampling based inference, two traditional approaches to approximate marginal inference.
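The thesis's implementation is not reproduced on this page, so the following is a minimal sketch of the core idea as we read the abstract: for binary variables in {0,1}, the pairwise energy term of a deleted edge is replaced by the unary terms that minimise the mean squared error over the four joint configurations. The energy convention, the {0,1} encoding, and the function name are illustrative assumptions, not the thesis's actual code.

```python
import numpy as np

def mse_unary_approximation(theta_ij):
    """Fit the pairwise energy term theta_ij * x_i * x_j of a deleted edge
    by unary terms a*x_i + b*x_j + c, minimising the mean squared error
    over the four configurations of (x_i, x_j) in {0,1}^2.

    Illustrative assumption: {0,1} states and an energy that is a sum of
    unary and pairwise terms; the thesis may use a different convention.
    """
    # Design matrix: one row per configuration, columns [x_i, x_j, 1].
    A = np.array([[0., 0., 1.],
                  [0., 1., 1.],
                  [1., 0., 1.],
                  [1., 1., 1.]])
    target = A[:, 0] * A[:, 1] * theta_ij   # values of theta_ij * x_i * x_j
    coef, *_ = np.linalg.lstsq(A, target, rcond=None)
    rmse = np.sqrt(np.mean((A @ coef - target) ** 2))
    return coef, rmse

(a, b, c), rmse = mse_unary_approximation(1.0)
# Closed form for this encoding: a = b = theta/2, c = -theta/4, RMSE = theta/4.
print(f"a={a:.2f}, b={b:.2f}, c={c:.2f}, rmse={rmse:.2f}")
```

Under this encoding each deletion perturbs the energy by at most theta/4 per configuration, and removing enough edges leaves a graph of low enough treewidth for the junction tree algorithm to run exactly, which is the stated purpose of the approximation.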

The results of our numerical experiments indicate that, on pairwise models, traditional approaches to approximate inference usually give results superior to those of our algorithm. However, the algorithm developed in this thesis shows promise when applied to higher-order models, and in cases where loopy belief propagation does not converge and stochastic methods are undesirable. We also find that selecting edges for deletion using the Kullback-Leibler divergence as the criterion is particularly effective, although other edge selection methods can be equally effective for certain types of models.
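To illustrate the Kullback-Leibler selection criterion, here is a toy sketch that brute-forces the joint distribution of a four-node binary model (feasible only at this tiny scale) and scores each candidate edge deletion by KL(p || q), where q is the model with that edge replaced by the MSE-optimal unary correction from the sketch above. All names and the scoring loop are illustrative; the thesis's actual selection procedure may differ.

```python
import itertools
import numpy as np

def joint_distribution(n, unary, pairwise):
    """Brute-force p(x) proportional to
    exp(sum_i unary[i] x_i + sum_{i<j} pairwise[i,j] x_i x_j)
    over x in {0,1}^n. Used here purely to illustrate the KL criterion."""
    states = np.array(list(itertools.product([0, 1], repeat=n)), dtype=float)
    score = states @ unary + np.einsum('si,ij,sj->s', states, pairwise, states)
    p = np.exp(score)
    return p / p.sum()

def delete_edge(unary, pairwise, i, j):
    """Remove edge (i, j), folding in the MSE-optimal unary correction
    (theta/2 on each endpoint; the constant -theta/4 cancels on
    normalisation)."""
    theta = pairwise[i, j]
    u, w = unary.copy(), pairwise.copy()
    w[i, j] = 0.0
    u[i] += theta / 2.0
    u[j] += theta / 2.0
    return u, w

rng = np.random.default_rng(0)
n = 4
unary = rng.normal(size=n)
pairwise = np.triu(rng.normal(size=(n, n)), k=1)  # couplings on i < j only

p = joint_distribution(n, unary, pairwise)
for i in range(n):
    for j in range(i + 1, n):
        q = joint_distribution(n, *delete_edge(unary, pairwise, i, j))
        kl = np.sum(p * np.log(p / q))  # KL(p_exact || p_after_deletion)
        print(f"edge ({i},{j}): KL = {kl:.4f}")
```

Deleting the edge with the smallest KL score perturbs the distribution least; repeating this until the remaining graph is tractable for the junction tree algorithm is one natural reading of the procedure the abstract describes.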
Publisher
NTNU
