<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE article PUBLIC "-//NLM//DTD Journal Publishing DTD v2.3 20070202//EN" "journalpublishing.dtd">
<article article-type="review-article" dtd-version="2.3" xml:lang="EN" xmlns:mml="http://www.w3.org/1998/Math/MathML" xmlns:xlink="http://www.w3.org/1999/xlink">
<front>
<journal-meta>
<journal-id journal-id-type="publisher-id">Front. Phys.</journal-id>
<journal-title>Frontiers in Physics</journal-title>
<abbrev-journal-title abbrev-type="pubmed">Front. Phys.</abbrev-journal-title>
<issn pub-type="epub">2296-424X</issn>
<publisher>
<publisher-name>Frontiers Media S.A.</publisher-name>
</publisher>
</journal-meta>
<article-meta>
<article-id pub-id-type="publisher-id">641859</article-id>
<article-id pub-id-type="doi">10.3389/fphy.2021.641859</article-id>
<article-categories>
<subj-group subj-group-type="heading">
<subject>Physics</subject>
<subj-group>
<subject>Review</subject>
</subj-group>
</subj-group>
</article-categories>
<title-group>
<article-title>Random Fields in Physics, Biology and Data Science</article-title>
<alt-title alt-title-type="left-running-head">Hern&#xe1;ndez-Lemus</alt-title>
<alt-title alt-title-type="right-running-head">Random Fields</alt-title>
</title-group>
<contrib-group>
<contrib contrib-type="author">
<name>
<surname>Hern&#xe1;ndez-Lemus</surname>
<given-names>Enrique</given-names>
</name>
<xref ref-type="aff" rid="aff1">
<sup>1</sup>
</xref>
<xref ref-type="aff" rid="aff2">
<sup>2</sup>
</xref>
<xref ref-type="corresp" rid="c001">&#x2a;</xref>
<uri xlink:href="https://loop.frontiersin.org/people/49930/overview"/>
</contrib>
</contrib-group>
<aff id="aff1">
<label>
<sup>1</sup>
</label>Computational Genomics Division, National Institute of Genomic Medicine, <addr-line>Arenal Tepepan</addr-line>, <country>Mexico</country>
</aff>
<aff id="aff2">
<label>
<sup>2</sup>
</label>Centro de Ciencias de La Complejidad, Universidad Nacional Aut&#xf3;noma de M&#xe9;xico, <addr-line>Coyoac&#xe1;n</addr-line>, <country>Mexico</country>
</aff>
<author-notes>
<fn fn-type="edited-by">
<p>
<bold>Edited by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/951875/overview">Umberto Lucia</ext-link>, Politecnico di Torino, Italy</p>
</fn>
<fn fn-type="edited-by">
<p>
<bold>Reviewed by:</bold> <ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/1177206/overview">Farrukh Mukhamedov</ext-link>, United Arab Emirates University, United Arab Emirates</p>
<p>
<ext-link ext-link-type="uri" xlink:href="https://loop.frontiersin.org/people/1179692/overview">Luca Martino</ext-link>, Rey Juan Carlos University, Spain</p>
</fn>
<corresp id="c001">&#x2a;Correspondence: Enrique Hern&#xe1;ndez-Lemus, <email>ehernandez@inmegen.gob.mx</email>
</corresp>
<fn fn-type="other">
<p>This article was submitted to Interdisciplinary Physics, a section of the journal Frontiers in Physics</p>
</fn>
</author-notes>
<pub-date pub-type="epub">
<day>15</day>
<month>04</month>
<year>2021</year>
</pub-date>
<pub-date pub-type="collection">
<year>2021</year>
</pub-date>
<volume>9</volume>
<elocation-id>641859</elocation-id>
<history>
<date date-type="received">
<day>15</day>
<month>12</month>
<year>2020</year>
</date>
<date date-type="accepted">
<day>01</day>
<month>02</month>
<year>2021</year>
</date>
</history>
<permissions>
<copyright-statement>Copyright &#xa9; 2021 Hern&#xe1;ndez-Lemus.</copyright-statement>
<copyright-year>2021</copyright-year>
<copyright-holder>Hern&#xe1;ndez-Lemus</copyright-holder>
<license xlink:href="http://creativecommons.org/licenses/by/4.0/">
<p>This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) and the copyright owner(s) are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these&#x20;terms.</p>
</license>
</permissions>
<abstract>
<p>A random field is the representation of the joint probability distribution for a set of random variables. Markov fields, in particular, have a long standing tradition as the theoretical foundation of many applications in statistical physics and probability. For strictly positive probability densities, a Markov random field is also a Gibbs field, i.e.,&#x20;a random field supplemented with a measure that implies the existence of a regular conditional distribution. Markov random fields have been used in statistical physics, dating back as far as the Ehrenfests. However, their measure theoretical foundations were developed much later by Dobruschin, Lanford and Ruelle, as well as by Hammersley and Clifford. Aside from its enormous theoretical relevance, due to its generality and simplicity, Markov random fields have been used in a broad range of applications in equilibrium and non-equilibrium statistical physics, in non-linear dynamics and ergodic theory. Also in computational molecular biology, ecology, structural biology, computer vision, control theory, complex networks and data science, to name but a few. Often these applications have been inspired by the original statistical physics approaches. Here, we will briefly present a modern introduction to the theory of random fields, later we will explore and discuss some of the recent applications of random fields in physics, biology and data science. Our aim is to highlight the relevance of this powerful theoretical aspect of statistical physics and its relation to the broad success of its many interdisciplinary applications.</p>
</abstract>
<kwd-group>
<kwd>random fields</kwd>
<kwd>probabilistic graphical models</kwd>
<kwd>Gibbs fields</kwd>
<kwd>Markov fields</kwd>
<kwd>Gaussian random fields</kwd>
</kwd-group>
</article-meta>
</front>
<body>
<sec id="s1">
<title>1 Introduction</title>
<p>The theory and applications of random fields were born out of the fortunate marriage of two simple but deep lines of reasoning. On the one hand, physical intuition&#x2013;strongly founded in the works of Boltzmann and the Ehrenfests, as well as those of the other originators of the kinetic theory of matter&#x2013;held that large-scale, long-range phenomena may originate from a multitude of local interactions. On the other hand, probabilistic reasoning suggested that such a multitude of local interactions would be stochastic in nature. These two ideas, paramount to statistical mechanics, have been extensively explored and developed into a full theoretical subdiscipline, the theory of random fields. Perhaps the archetypal instance of a random field was laid out in the doctoral thesis of Ernst Ising: the Ising model of ferromagnetism [<xref ref-type="bibr" rid="B1">1</xref>]. However, although the physical ideas were laid out mainly by physicists, much of the further mathematical development was carried out by the Russian school of probability. In particular, the works of Averintsev [<xref ref-type="bibr" rid="B2">2</xref>, <xref ref-type="bibr" rid="B3">3</xref>]&#x2013;along with the measure-theoretic formalization of statistical mechanics by J.W. Gibbs&#x2013;specified a general class of fields described only by pair potentials [<xref ref-type="bibr" rid="B4">4</xref>]. Further theoretical advances were made by Stavskaya, who studied random fields through measure theory, considering them as invariant states for local processes [<xref ref-type="bibr" rid="B5">5</xref>, <xref ref-type="bibr" rid="B6">6</xref>], by Vasilyev, who considered stationary measures as derived from local interactions in discrete mappings [<xref ref-type="bibr" rid="B7">7</xref>], and by others.</p>
<p>The formal establishment of the theory of Markov-Gibbs random fields, however, is often attributed to the works of Dobruschin, Lanford and Ruelle [<xref ref-type="bibr" rid="B8">8</xref>, <xref ref-type="bibr" rid="B9">9</xref>], in particular to their DLR equations for the probability measures. Also remarkable is the contribution of Hammersley and Clifford, who developed a proof of the equivalence of Gibbs random fields and Markov random fields, provided the probabilities are strictly positive [<xref ref-type="bibr" rid="B10">10</xref>]. Although the authors never officially published this work&#x2013;they considered it incomplete given the, now known to be essential, requirement of strictly positive probabilities&#x2013;several published works have built on it, and alternative proofs have since appeared [<xref ref-type="bibr" rid="B11">11</xref>&#x2013;<xref ref-type="bibr" rid="B13">13</xref>].</p>
<p>Aside from the extensive use of the Ising model and other random fields in statistical mechanics&#x2013;too many contributions to mention here, but most of them comprehensively reviewed in the monographs by Baxter [<xref ref-type="bibr" rid="B14">14</xref>], Cipra [<xref ref-type="bibr" rid="B15">15</xref>], McCoy and Wu [<xref ref-type="bibr" rid="B16">16</xref>] and Thompson [<xref ref-type="bibr" rid="B17">17</xref>], and in the simulation-oriented book by Adler [<xref ref-type="bibr" rid="B18">18</xref>]&#x2013;there has also been deep interest in developing models in biophysics, computer science and other fields. The development of Hopfield networks as models of addressable memory in neurophysiology (and artificial neural networks) [<xref ref-type="bibr" rid="B19">19</xref>] is perhaps one of the earliest examples. The subsequent implementation of the so-called Boltzmann machines in artificial intelligence (AI) applications [<xref ref-type="bibr" rid="B20">20</xref>, <xref ref-type="bibr" rid="B21">21</xref>] paved the way to a plethora of theoretical, computational and representational applications of random fields.</p>
<p>In the rest of this review, we will present some general grounds of the theory of Markov random fields to serve as a framework for elaborating on many of its relevant applications inside and outside physics. Our emphasis will not be comprehensiveness, but rather the illustration of some relevant features that have made this quintessential model of statistical physics so pervasive in our discipline and in many others (<italic>Markov Random Fields: A Theoretical Framework</italic>). We will also discuss how methodological and computational advances in these areas may be implemented to improve the applications of random fields in physical models. We have chosen to focus on applications in Physics (<italic>Markov Random Fields in Physics</italic>), Biology (<italic>Markov Random Fields in Biology</italic>) and Data Science (<italic>Markov Random Fields in Data Science and Machine Learning</italic>). We are aware that, by necessity (finiteness), we are leaving out contributions in fields such as sociology (Axelrod models, for instance), finance (volatility maps, Markov switching models, etc.) and others. However, we believe this panoramic view will make it easier for the interested reader to look into those other applications. Finally, we close with some brief remarks in <italic>Concluding Remarks</italic>.</p>
</sec>
<sec id="s2">
<title>2 Markov Random Fields: A Theoretical Framework</title>
<p>Here we will define and describe Markov random fields [<xref ref-type="bibr" rid="B8">8</xref>, <xref ref-type="bibr" rid="B12">12</xref>] (MRFs) as an appropriate theoretical framework for systematic probabilistic analysis in various settings. An MRF represents, in this context, the joint probability distribution for a set (as large as desired) of real-valued random variables. There are several extensions of the general ideas discussed here; these will be introduced and briefly addressed as needed.</p>
<p>Let <inline-formula id="inf1">
<mml:math id="minf1">
<mml:mrow>
<mml:mi>X</mml:mi>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>&#x3b1;</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> be a vector of random variables (i.e.,&#x20;the features or characteristic functions used to describe a system of interest). An MRF may be represented as an undirected graph depicting the statistical dependency structure of <italic>X</italic>, as given by the joint probability distribution <inline-formula id="inf2">
<mml:math id="minf2">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>X</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>&#x20;[<xref ref-type="bibr" rid="B22">22</xref>].</p>
<p>Let this graph be embodied in the form of a duplex <inline-formula id="inf3">
<mml:math id="minf3">
<mml:mrow>
<mml:mi>G</mml:mi>
<mml:mo>&#x3d;</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>V</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>E</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> consisting of a set <italic>V</italic> of vertices or nodes (the random variables <inline-formula id="inf4">
<mml:math id="minf4">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>&#x2019;s) and a set <inline-formula id="inf5">
<mml:math id="minf5">
<mml:mrow>
<mml:mi>E</mml:mi>
<mml:mo>&#x2286;</mml:mo>
<mml:mi>V</mml:mi>
<mml:mo>&#xd7;</mml:mo>
<mml:mi>V</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula> of edges connecting the nodes (thus representing the statistical dependencies between random variables). <italic>E</italic> also represents a neighborhood law <italic>N</italic> stating which vertex is connected (i.e.,&#x20;dependent) to which other vertex in the graph. With this in mind, an MRF can be also represented as <inline-formula id="inf6">
<mml:math id="minf6">
<mml:mrow>
<mml:mi>G</mml:mi>
<mml:mo>&#x3d;</mml:mo>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>V</mml:mi>
<mml:mo>,</mml:mo>
<mml:mi>N</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>. The set of neighbors of a given point <inline-formula id="inf7">
<mml:math id="minf7">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> is denoted&#x20;<inline-formula id="inf8">
<mml:math id="minf8">
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>.</p>
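To make the graph representation concrete, the duplex G = (V, E) and its neighborhood law can be sketched in a few lines of Python (a hypothetical toy example; the vertex names and edges are illustrative, not taken from any particular model):

```python
# A toy undirected graph G = (V, E) for four random variables
# (all names illustrative). The neighborhood law N is derived from E.
V = ["X1", "X2", "X3", "X4"]
E = {frozenset(p) for p in [("X1", "X2"), ("X2", "X3"),
                            ("X3", "X4"), ("X4", "X1")]}

def neighbors(v):
    """Return N_v: every vertex sharing an edge with v."""
    return {u for e in E for u in e if v in e and u != v}
```

Storing edges as `frozenset` pairs makes the graph undirected by construction, matching the symmetry of statistical dependence.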
<sec id="s2-1">
<title>2.1 Configuration</title>
<p>We can assign to each point in the graph one of a finite set <italic>S</italic> of labels. Such an assignment is often called a <italic>configuration</italic>. We can then assign probability measures to the set <inline-formula id="inf9">
<mml:math id="minf9">
<mml:mtext>&#x3a9;</mml:mtext>
</mml:math>
</inline-formula> of all possible configurations &#x3c9;. Hence, <inline-formula id="inf10">
<mml:math id="minf10">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> represents the configuration &#x3c9; restricted to the subset <italic>A</italic> of <italic>V</italic>. We may think of <inline-formula id="inf11">
<mml:math id="minf11">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> as a configuration on the subgraph <inline-formula id="inf12">
<mml:math id="minf12">
<mml:mrow>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> restricting <italic>V</italic> to points of&#x20;<italic>A</italic>.</p>
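A configuration, and its restriction to a subset A of V, can likewise be sketched directly (a hypothetical Python toy; the label set and vertex names are illustrative):

```python
from itertools import product

S = [0, 1]                      # finite label set (illustrative)
V = ["X1", "X2", "X3"]          # vertices of the graph

# Omega: the set of all configurations, one label per vertex.
Omega = [dict(zip(V, labels)) for labels in product(S, repeat=len(V))]

def restrict(omega, A):
    """omega_A: the configuration omega restricted to the subset A of V."""
    return {v: omega[v] for v in A}
```

Note that |Omega| = |S|^|V|, which is why brute-force enumeration is only feasible for very small graphs.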
</sec>
<sec id="s2-2">
<title>2.2 Local Characteristics</title>
<p>We can define <italic>local characteristics</italic> on MRFs. The local characteristics of a probability measure <inline-formula id="inf13">
<mml:math id="minf13">
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
</mml:math>
</inline-formula> defined on <inline-formula id="inf14">
<mml:math id="minf14">
<mml:mtext>&#x3a9;</mml:mtext>
</mml:math>
</inline-formula> are the conditional probabilities:<disp-formula id="e1">
<mml:math id="me1">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>t</mml:mi>
</mml:msub>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:mi>T</mml:mi>
<mml:mi mathvariant="normal">&#x2216;</mml:mi>
<mml:mi>t</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>t</mml:mi>
</mml:msub>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mi>t</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
<label>(1)</label>
</disp-formula>
</p>
<p>This represents the probability that the point <italic>t</italic> is assigned the value <inline-formula id="inf15">
<mml:math id="minf15">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>t</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>, given the values at all other points of the graph. Let us re-write <xref ref-type="disp-formula" rid="e1">Eq. 1</xref>: the probability measure defines an MRF if the local characteristics depend only on the outcomes at neighboring points, i.e.,&#x20;if for every <italic>&#x3c9;</italic>
<disp-formula id="e2">
<mml:math id="me2">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:mi>G</mml:mi>
<mml:mi mathvariant="normal">&#x2216;</mml:mi>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>N</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
<label>(2)</label>
</disp-formula>
</p>
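The Markov property of Eq. 2 can be checked numerically on a small example. The sketch below uses an assumed Ising-style chain with an illustrative coupling J (not a model from the text), builds the joint distribution by brute force, and compares the two conditionals:

```python
import math
from itertools import product

# Hypothetical 4-site chain X0 - X1 - X2 - X3 with Ising-style pair
# energies; the joint distribution is computed by brute force.
J = 1.0  # illustrative coupling constant

def energy(x):
    # Sum of pair interactions along the chain, x a tuple of +/-1 spins.
    return -J * sum(x[i] * x[i + 1] for i in range(len(x) - 1))

states = list(product([-1, 1], repeat=4))
Z = sum(math.exp(-energy(x)) for x in states)
P = {x: math.exp(-energy(x)) / Z for x in states}

def cond(i, val, fixed):
    """P(X_i = val | X_j = fixed[j]), obtained by summing the joint."""
    den = sum(p for x, p in P.items()
              if all(x[j] == s for j, s in fixed.items()))
    num = sum(p for x, p in P.items()
              if x[i] == val and all(x[j] == s for j, s in fixed.items()))
    return num / den

# Eq. 2: conditioning X1 on all other sites equals conditioning on its
# neighbors X0 and X2 alone.
full = cond(1, 1, {0: 1, 2: -1, 3: 1})
local = cond(1, 1, {0: 1, 2: -1})
```

The two conditionals agree (up to floating-point rounding), since conditioning on the neighbors screens off the rest of the chain.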
</sec>
<sec id="s2-3">
<title>2.3 Cliques</title>
<p>Given an arbitrary graph, we may refer to a set of points <italic>C</italic> as a <italic>clique</italic> if every pair of points in <italic>C</italic> are neighbors. This includes the empty set as a clique. A clique is thus a set whose <italic>induced subgraph</italic> is complete; for this reason cliques are also called <italic>complete induced subgraphs</italic>. A clique that is not contained in any larger clique is called a <italic>maximal clique</italic>.</p>
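For instance, the cliques of a small graph can be enumerated directly from the definition (a hypothetical Python sketch; the graph is illustrative):

```python
from itertools import combinations

# Illustrative graph: a triangle {a, b, c} with a pendant vertex d.
V = ["a", "b", "c", "d"]
E = {frozenset(p) for p in [("a", "b"), ("b", "c"), ("a", "c"), ("c", "d")]}

def is_clique(C):
    # Every pair of points in C must be neighbors; vacuously true for
    # the empty set and for singletons.
    return all(frozenset(p) in E for p in combinations(C, 2))

cliques = [set(C) for r in range(len(V) + 1)
           for C in combinations(V, r) if is_clique(C)]
```

Here the cliques are the empty set, the four singletons, the four edges, and the triangle {a, b, c}, which is maximal.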
</sec>
<sec id="s2-4">
<title>2.4 Configuration Potentials</title>
<p>A <italic>potential &#x3b7;</italic> is an assignment of a number <inline-formula id="inf16">
<mml:math id="minf16">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3b7;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> to every subconfiguration <inline-formula id="inf17">
<mml:math id="minf17">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> of a configuration <italic>&#x3c9;</italic> in the graph <italic>G</italic>. A given <italic>&#x3b7;</italic>, induces an <italic>energy</italic> <inline-formula id="inf18">
<mml:math id="minf18">
<mml:mrow>
<mml:mi>U</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> on the set of all configurations <italic>&#x3c9;</italic> as follows:<disp-formula id="e3">
<mml:math id="me3">
<mml:mrow>
<mml:mi>U</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:munder>
<mml:mstyle displaystyle="true">
<mml:mo>&#x2211;</mml:mo>
</mml:mstyle>
<mml:mi>A</mml:mi>
</mml:munder>
<mml:msub>
<mml:mi>&#x3b7;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
<label>(3)</label>
</disp-formula>
</p>
<p>Here, for fixed <italic>&#x3c9;</italic>, the sum is taken over all subsets <inline-formula id="inf19">
<mml:math id="minf19">
<mml:mrow>
<mml:mi>A</mml:mi>
<mml:mo>&#x2286;</mml:mo>
<mml:mi>V</mml:mi>
</mml:mrow>
</mml:math>
</inline-formula> including the empty set. It is possible to define a probability measure, called the <italic>Gibbs measure induced</italic> by <italic>U</italic> as<disp-formula id="e4">
<mml:math id="me4">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msup>
<mml:mi>e</mml:mi>
<mml:mrow>
<mml:mo>&#x2212;</mml:mo>
<mml:mi>U</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:msup>
</mml:mrow>
<mml:mi>Z</mml:mi>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(4)</label>
</disp-formula>
</p>
<p>
<italic>Z</italic> (from the German word <italic>Zustandssumme</italic>, <italic>sum over states</italic>) is a normalization constant called the <italic>partition function</italic>. As is well known, the explicit computation of the partition function is in many cases a very challenging endeavor. A great deal of work has gone into developing methods and approaches to overcome some (but not all) of the challenges in this regard. Some of these approximations will be discussed later on.<disp-formula id="e5">
<mml:math id="me5">
<mml:mrow>
<mml:mi>Z</mml:mi>
<mml:mo>&#x3d;</mml:mo>
<mml:munder>
<mml:mstyle displaystyle="true">
<mml:mo>&#x2211;</mml:mo>
</mml:mstyle>
<mml:mi>&#x3c9;</mml:mi>
</mml:munder>
<mml:msup>
<mml:mi>e</mml:mi>
<mml:mrow>
<mml:mo>&#x2212;</mml:mo>
<mml:mi>U</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:msup>
</mml:mrow>
</mml:math>
<label>(5)</label>
</disp-formula>
</p>
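For very small systems, Eqs. 3-5 can be evaluated by brute-force enumeration. The following sketch (with an illustrative, hypothetical potential &#x3b7;) makes the definitions concrete:

```python
import math
from itertools import product

# Toy instance of Eqs. 3-5: two binary sites with an illustrative
# potential eta defined on the singletons and on the pair; every other
# subset A contributes eta_A = 0.
eta = {
    ("X1",): lambda w: 0.5 * w["X1"],
    ("X2",): lambda w: 0.5 * w["X2"],
    ("X1", "X2"): lambda w: -1.0 * w["X1"] * w["X2"],
}

def U(w):
    """Energy of a configuration: the sum of eta_A(omega) over subsets A."""
    return sum(f(w) for f in eta.values())

Omega = [{"X1": a, "X2": b} for a, b in product([0, 1], repeat=2)]
Z = sum(math.exp(-U(w)) for w in Omega)                       # Eq. 5
P = {(w["X1"], w["X2"]): math.exp(-U(w)) / Z for w in Omega}  # Eq. 4
```

With this choice of &#x3b7;, the configurations (0, 0) and (1, 1) have equal energy and hence equal Gibbs probability, reflecting the attractive pair term.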
<p>The term <italic>potential</italic> is often used in connection with potential energies. In this context <inline-formula id="inf20">
<mml:math id="minf20">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3b7;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> is commonly termed a <italic>potential energy</italic> in physics applications. <inline-formula id="inf21">
<mml:math id="minf21">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3d5;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:msup>
<mml:mi>e</mml:mi>
<mml:mrow>
<mml:mo>&#x2212;</mml:mo>
<mml:msub>
<mml:mi>&#x3b7;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula> is then called a potential.</p>
<p>
<xref ref-type="disp-formula" rid="e4">Equations 4</xref>, <xref ref-type="disp-formula" rid="e5">5</xref> can be thus rewritten as:<disp-formula id="e6">
<mml:math id="me6">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mstyle displaystyle="true">
<mml:mo>&#x220f;</mml:mo>
</mml:mstyle>
<mml:mi>A</mml:mi>
</mml:msub>
<mml:msub>
<mml:mi>&#x3d5;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mi>Z</mml:mi>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(6)</label>
</disp-formula>
<disp-formula id="e7">
<mml:math id="me7">
<mml:mrow>
<mml:mi>Z</mml:mi>
<mml:mo>&#x3d;</mml:mo>
<mml:munder>
<mml:mstyle displaystyle="true">
<mml:mo>&#x2211;</mml:mo>
</mml:mstyle>
<mml:mi>&#x3c9;</mml:mi>
</mml:munder>
<mml:munder>
<mml:mstyle displaystyle="true">
<mml:mo>&#x220f;</mml:mo>
</mml:mstyle>
<mml:mi>A</mml:mi>
</mml:munder>
<mml:msub>
<mml:mi>&#x3d5;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
<label>(7)</label>
</disp-formula>
</p>
<p>Since this latter use is more common in probability and graph theory, and is also used in theoretical physics, we will refer to <xref ref-type="disp-formula" rid="e6">Eqs. 6</xref>, <xref ref-type="disp-formula" rid="e7">7</xref> as the definitions of the Gibbs measure and the partition function (respectively), unless otherwise stated. This choice is further justified by the fact that <xref ref-type="disp-formula" rid="e6">Eq. 6</xref> is a form of probability factorization (in this case, a <italic>clique factorization</italic>)&#x20;[<xref ref-type="bibr" rid="B11">11</xref>].</p>
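As a quick sanity check, the factor form of Eqs. 6, 7 can be verified numerically against the energy form of Eqs. 4, 5 on a toy model (all potentials here are illustrative assumptions):

```python
import math
from itertools import product

# Numerical check, on a toy two-site model, that the factor form of
# Eq. 6 coincides with the energy form of Eq. 4 when phi_A = exp(-eta_A).
eta = {("X1",): lambda w: float(w[0]),
       ("X1", "X2"): lambda w: -float(w[0] * w[1])}
phi = {A: (lambda f: (lambda w: math.exp(-f(w))))(f) for A, f in eta.items()}

Omega = list(product([0, 1], repeat=2))
Z_energy = sum(math.exp(-sum(f(w) for f in eta.values())) for w in Omega)
Z_factor = sum(math.prod(g(w) for g in phi.values()) for w in Omega)
```

The two partition functions agree because the exponential turns the sum of potentials into a product of factors.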
</sec>
<sec id="s2-5">
<title>2.5 Gibbs Fields</title>
<p>A potential is termed a nearest neighbor Gibbs potential if <inline-formula id="inf22">
<mml:math id="minf22">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3d5;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula> whenever <italic>A</italic> is not a clique. Any regular measure induced by a nearest neighbor Gibbs potential is often called a <italic>Gibbs measure</italic>. However, more general Gibbs measures may be defined by considering different classes of potentials.</p>
<p>The inclusion of all cliques in the calculation of the Gibbs measure is needed to establish the equivalence between Gibbs random fields and Markov random fields. A nearest neighbor Gibbs measure on a graph determines an MRF as follows [<xref ref-type="bibr" rid="B22">22</xref>]:</p>
<p>Let <inline-formula id="inf23">
<mml:math id="minf23">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> be a probability measure determined on <inline-formula id="inf24">
<mml:math id="minf24">
<mml:mtext>&#x3a9;</mml:mtext>
</mml:math>
</inline-formula> by a nearest neighbor Gibbs potential <italic>&#x3d5;</italic>:<disp-formula id="e8">
<mml:math id="me8">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mstyle displaystyle="true">
<mml:mo>&#x220f;</mml:mo>
</mml:mstyle>
<mml:mi>C</mml:mi>
</mml:msub>
<mml:msub>
<mml:mi>&#x3d5;</mml:mi>
<mml:mi>C</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mi>Z</mml:mi>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(8)</label>
</disp-formula>
</p>
<p>Here the product is taken over all cliques <italic>C</italic> on the graph <italic>G</italic>. Then,<disp-formula id="e9">
<mml:math id="me9">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:mi>G</mml:mi>
<mml:mi mathvariant="normal">&#x2216;</mml:mi>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mstyle displaystyle="true">
<mml:mo>&#x2211;</mml:mo>
</mml:mstyle>
<mml:mrow>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>&#x2032;</mml:mo>
</mml:mrow>
</mml:msub>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msup>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>&#x2032;</mml:mi>
</mml:msup>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(9)</label>
</disp-formula>
</p>
<p>Here <inline-formula id="inf25">
<mml:math id="minf25">
<mml:mrow>
<mml:msup>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>&#x2032;</mml:mi>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula> is any configuration which agrees with <italic>&#x3c9;</italic> at all points except <inline-formula id="inf26">
<mml:math id="minf26">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>.<disp-formula id="e10">
<mml:math id="me10">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>&#x3c9;</mml:mi>
<mml:mrow>
<mml:mi>G</mml:mi>
<mml:mi mathvariant="normal">&#x2216;</mml:mi>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mstyle displaystyle="true">
<mml:mo>&#x220f;</mml:mo>
</mml:mstyle>
<mml:mi>C</mml:mi>
</mml:msub>
<mml:msub>
<mml:mi>&#x3d5;</mml:mi>
<mml:mi>C</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mstyle displaystyle="true">
<mml:mo>&#x2211;</mml:mo>
</mml:mstyle>
<mml:mrow>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>&#x2032;</mml:mo>
</mml:mrow>
</mml:msub>
<mml:msub>
<mml:mstyle displaystyle="true">
<mml:mo>&#x220f;</mml:mo>
</mml:mstyle>
<mml:mi>C</mml:mi>
</mml:msub>
<mml:msub>
<mml:mi>&#x3d5;</mml:mi>
<mml:mi>C</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msup>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>&#x2032;</mml:mi>
</mml:msup>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(10)</label>
</disp-formula>
</p>
<p>For any clique <italic>C</italic> that does not contain <inline-formula id="inf27">
<mml:math id="minf27">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>, <inline-formula id="inf28">
<mml:math id="minf28">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3d5;</mml:mi>
<mml:mi>C</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>&#x3c9;</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi>&#x3d5;</mml:mi>
<mml:mi>C</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msup>
<mml:mi>&#x3c9;</mml:mi>
<mml:mi>&#x2032;</mml:mi>
</mml:msup>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>, so that all the terms corresponding to the cliques that do not contain the point <inline-formula id="inf29">
<mml:math id="minf29">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> cancel from both the numerator and the denominator in <xref ref-type="disp-formula" rid="e10">Eq. 10</xref>; therefore, this probability depends only on the values <inline-formula id="inf30">
<mml:math id="minf30">
<mml:mrow>
<mml:msub>
<mml:mi>x</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> at <inline-formula id="inf31">
<mml:math id="minf31">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> and its neighbors. <inline-formula id="inf32">
<mml:math id="minf32">
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
</mml:math>
</inline-formula> thus defines an MRF. A more general proof of this equivalence is given by the Hammersley-Clifford theorem (see for instance [<xref ref-type="bibr" rid="B11">11</xref>]).</p>
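<p>To make the cancellation argument concrete, here is a minimal numerical sketch (illustrative, not from the article): a hypothetical four-node chain MRF with binary spins and pairwise clique potentials. Because the potential of the clique not containing the conditioned vertex appears in both the numerator and the denominator of Eq. 10, the conditional cannot depend on that clique's variables.</p>

```python
import itertools
import math

# Hypothetical 4-node chain MRF: X1 - X2 - X3 - X4, binary spins,
# with pairwise clique potentials phi(a, b) = exp(J * a * b).
J = 0.7
spins = [-1, 1]

def phi(a, b):
    return math.exp(J * a * b)

def joint_unnorm(x1, x2, x3, x4):
    # Gibbs factorization over the cliques (edges) of the chain;
    # the normalization constant cancels in any conditional.
    return phi(x1, x2) * phi(x2, x3) * phi(x3, x4)

def cond_x2(x2, x1, x3, x4):
    # P(X2 = x2 | X1 = x1, X3 = x3, X4 = x4) from the full joint
    num = joint_unnorm(x1, x2, x3, x4)
    den = sum(joint_unnorm(x1, s, x3, x4) for s in spins)
    return num / den

# phi(x3, x4) cancels between numerator and denominator, so the
# conditional is identical for every value of x4:
for x1, x3 in itertools.product(spins, repeat=2):
    p_plus = cond_x2(1, x1, x3, x4=1)
    p_minus = cond_x2(1, x1, x3, x4=-1)
    assert abs(p_plus - p_minus) < 1e-12
```

<p>The same cancellation holds for any clique factorization: only potentials over cliques containing the conditioned vertex survive.</p>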
<p>In essence, among the general class of random fields, Markov random fields are defined by obedience to the Markov neighborhood law. Gibbs fields are usually understood as Markov fields with strictly positive probability measures (in particular, a strictly positive joint probability density). These Markov-Gibbs fields are thus defined by the Markov property together with strictly positive probabilities, and are the ones to which the Hammersley-Clifford theorem applies. More general Gibbs fields can be defined by neighborhood laws other than the Markov property [<xref ref-type="bibr" rid="B23">23</xref>], but these will not be addressed in the present&#x20;work.</p>
</sec>
<sec id="s2-6">
<title>2.6 Conditional Independence in Markov Random Fields</title>
<p>To discuss the conditional independence structure induced by MRFs, let us consider the following: An adjacency matrix <inline-formula id="inf33">
<mml:math id="minf33">
<mml:mrow>
<mml:msub>
<mml:mi>A</mml:mi>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> represents the neighborhood law (as given by the Markov property) on the graph <italic>G</italic>. Every non-zero entry in this matrix represents a statistical dependency relation between two elements of <italic>X</italic>. The conditional dependence structure of MRFs is related not only to the <italic>local</italic> statistical independence conditions, but also to the dependency structure of the whole graph [<xref ref-type="bibr" rid="B11">11</xref>,&#x20;<xref ref-type="bibr" rid="B24">24</xref>].</p>
<p>A definition of conditional independence (CI) for the set of random variables can be given as follows:<disp-formula id="e11">
<mml:math id="me11">
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:mo>&#x22a5;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>&#x21d4;</mml:mo>
<mml:msub>
<mml:mi mathvariant="double-struck">F</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi mathvariant="double-struck">F</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x22c5;</mml:mo>
<mml:msub>
<mml:mi mathvariant="double-struck">F</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
<label>(11)</label>
</disp-formula>
<disp-formula id="equ1">
<mml:math id="mequ1">
<mml:mrow>
<mml:mo>&#x2200;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>&#x2208;</mml:mo>
<mml:mi>X</mml:mi>
</mml:mrow>
</mml:math>
</disp-formula>
</p>
<p>Here <inline-formula id="inf34">
<mml:math id="minf34">
<mml:mrow>
<mml:mo>&#x22a5;</mml:mo>
<mml:mo>&#x22a5;</mml:mo>
</mml:mrow>
</mml:math>
</inline-formula> refers to conditional independence between two random variables. <inline-formula id="inf35">
<mml:math id="minf35">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="double-struck">F</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mi>P</mml:mi>
<mml:mi>r</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x2264;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x2264;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> is the joint conditional cumulative distribution of <inline-formula id="inf36">
<mml:math id="minf36">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> and <inline-formula id="inf37">
<mml:math id="minf37">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> given <inline-formula id="inf38">
<mml:math id="minf38">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>. <inline-formula id="inf39">
<mml:math id="minf39">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
</mml:math>
</inline-formula>, <inline-formula id="inf40">
<mml:math id="minf40">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
</mml:math>
</inline-formula> and <inline-formula id="inf41">
<mml:math id="minf41">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mtext>&#x2a;</mml:mtext>
</mml:mrow>
</mml:math>
</inline-formula> are realizations of the corresponding random variables.</p>
<p>In the case of MRFs, CI is defined by means of graph separation: <inline-formula id="inf42">
<mml:math id="minf42">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:mi>G</mml:mi>
</mml:msub>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> iff <inline-formula id="inf43">
<mml:math id="minf43">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> separates <inline-formula id="inf44">
<mml:math id="minf44">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> from <inline-formula id="inf45">
<mml:math id="minf45">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> in <italic>G</italic>. This means that if we remove node <inline-formula id="inf46">
<mml:math id="minf46">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> there are no undirected paths from <inline-formula id="inf47">
<mml:math id="minf47">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> to <inline-formula id="inf48">
<mml:math id="minf48">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> in&#x20;<italic>G</italic>.</p>
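<p>Graph separation can be checked mechanically: remove the separating vertices and search for a surviving undirected path. A short sketch (the edge list below is hypothetical, chosen only for illustration):</p>

```python
from collections import deque

# Hypothetical undirected graph: vertices 0..3, edges as listed
edges = [(0, 1), (1, 2), (2, 3), (1, 3)]
adj = {v: set() for v in range(4)}
for u, v in edges:
    adj[u].add(v)
    adj[v].add(u)

def separated(i, j, sep, adj):
    """True iff every undirected path from i to j passes through sep."""
    if i in sep or j in sep:
        return True
    seen, queue = {i}, deque([i])
    while queue:  # breadth-first search avoiding vertices in sep
        u = queue.popleft()
        for w in adj[u]:
            if w == j:
                return False
            if w not in seen and w not in sep:
                seen.add(w)
                queue.append(w)
    return True

# Vertex 1 separates 0 from 3 (all paths 0-1-3 and 0-1-2-3 pass
# through 1), whereas vertex 2 does not (0-1-3 survives):
assert separated(0, 3, {1}, adj)
assert not separated(0, 3, {2}, adj)
```

<p>The same routine, applied with a separating <italic>set</italic>, implements the global Markov criterion for subsets of vertices.</p>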
<p>Conditional independence in random fields can be considered in terms of subsets of <italic>V</italic>. Let <italic>A</italic>, <italic>B</italic> and <italic>C</italic> be subsets of <italic>V</italic>. The statement <inline-formula id="inf49">
<mml:math id="minf49">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:mrow>
<mml:mover accent="true">
<mml:mi>G</mml:mi>
<mml:mo>&#x5e;</mml:mo>
</mml:mover>
</mml:mrow>
</mml:msub>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>B</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>C</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>, which holds iff <italic>C</italic> separates <italic>A</italic> from <italic>B</italic> in <italic>G</italic>, means that if we remove all vertices in <italic>C</italic> there will be no paths connecting any vertex in <italic>A</italic> to any vertex in <italic>B</italic>. This is customarily called the <italic>global Markov property</italic> of MRFs [<xref ref-type="bibr" rid="B11">11</xref>,&#x20;<xref ref-type="bibr" rid="B24">24</xref>].</p>
<p>The smallest set of vertices that renders a vertex <inline-formula id="inf50">
<mml:math id="minf50">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> conditionally independent of all other vertices in the graph is called its <italic>Markov blanket</italic>, denoted <inline-formula id="inf51">
<mml:math id="minf51">
<mml:mrow>
<mml:mi>m</mml:mi>
<mml:mi>b</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>. If we define the <italic>closure</italic> of a node <inline-formula id="inf52">
<mml:math id="minf52">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> as <inline-formula id="inf53">
<mml:math id="minf53">
<mml:mrow>
<mml:mi mathvariant="script">C</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> then <inline-formula id="inf54">
<mml:math id="minf54">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:mo>&#x22a5;</mml:mo>
<mml:mi>G</mml:mi>
<mml:mi mathvariant="script">&#x2216;C</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x7c;</mml:mo>
<mml:mi>m</mml:mi>
<mml:mi>b</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>.</p>
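<p>Since in an MRF the Markov blanket of a vertex is its set of first neighbors, both <italic>mb</italic> and the closure can be read directly off the adjacency matrix introduced above. A toy sketch with a hypothetical adjacency matrix:</p>

```python
import numpy as np

# Hypothetical adjacency matrix of an undirected graph on 4 vertices
A = np.array([[0, 1, 1, 0],
              [1, 0, 0, 1],
              [1, 0, 0, 0],
              [0, 1, 0, 0]])

def markov_blanket(i, A):
    # In an MRF, mb(X_i) is the set of first neighbors of X_i
    return set(int(j) for j in np.flatnonzero(A[i]))

def closure(i, A):
    # C(X_i): the vertex together with its Markov blanket
    return {i} | markov_blanket(i, A)

assert markov_blanket(0, A) == {1, 2}
assert closure(3, A) == {1, 3}
```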
<p>In an MRF, the Markov blanket of a vertex is its set of first neighbors. This statement is the so-called <italic>undirected local Markov property</italic>. Starting from the local Markov property, it is possible to show that two vertices <inline-formula id="inf55">
<mml:math id="minf55">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> and <inline-formula id="inf56">
<mml:math id="minf56">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> are conditionally independent given the rest if there is no direct edge between them. This is the <italic>pairwise Markov property</italic>.</p>
<p>If we denote by <inline-formula id="inf57">
<mml:math id="minf57">
<mml:mrow>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x2192;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> the set of undirected paths in the graph <italic>G</italic> connecting vertices <inline-formula id="inf58">
<mml:math id="minf58">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> and <inline-formula id="inf59">
<mml:math id="minf59">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>, then the pairwise Markov property of an MRF is given by:<disp-formula id="e12">
<mml:math id="me12">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:mo>&#x22a5;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:mi>G</mml:mi>
<mml:mi mathvariant="normal">&#x2216;</mml:mi>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x21d4;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mi>G</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x2192;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:mo>&#x2205;</mml:mo>
</mml:mrow>
</mml:math>
<label>(12)</label>
</disp-formula>
</p>
<p>Hence the global Markov property implies the local Markov property which, in turn, implies the pairwise Markov property. For systems with strictly positive probability densities, it has been proved that the pairwise Markov property actually implies the global Markov property (see [<xref ref-type="bibr" rid="B11">11</xref>], p. 119 for a proof). This is important for applications, since it is easier to assess pairwise conditional independence statements.</p>
<sec id="s2-6-1">
<title>2.6.1 Independence Maps</title>
<p>Let <inline-formula id="inf60">
<mml:math id="minf60">
<mml:mrow>
<mml:msub>
<mml:mi>I</mml:mi>
<mml:mi>G</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> denote the set of all conditional independence relations encoded by the graph <italic>G</italic> (i.e.,&#x20;those CI relations given by the global Markov property). Let <inline-formula id="inf61">
<mml:math id="minf61">
<mml:mrow>
<mml:msub>
<mml:mi>I</mml:mi>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> be the set of all CI relations implied by the probability distribution <inline-formula id="inf62">
<mml:math id="minf62">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>. A graph <italic>G</italic> will be called an <italic>independence map</italic> (<italic>I-map</italic>) for a probability distribution <inline-formula id="inf63">
<mml:math id="minf63">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>, if all CI relations implied by <italic>G</italic> hold for <inline-formula id="inf64">
<mml:math id="minf64">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula>, i.e.,&#x20;<inline-formula id="inf65">
<mml:math id="minf65">
<mml:mrow>
<mml:msub>
<mml:mi>I</mml:mi>
<mml:mi>G</mml:mi>
</mml:msub>
<mml:mo>&#x2286;</mml:mo>
<mml:msub>
<mml:mi>I</mml:mi>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>&#x20;[<xref ref-type="bibr" rid="B11">11</xref>].</p>
<p>The converse statement is, however, not necessarily true, i.e.,&#x20;there may be some CI relations implied by <inline-formula id="inf66">
<mml:math id="minf66">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> that are not coded in the graph <italic>G</italic>. We may often be interested in the so-called <italic>minimal I-maps</italic>, i.e.,&#x20;I-maps from which no edge can be removed without destroying their CI properties.</p>
<p>Every strictly positive distribution has a unique minimal I-map, constructed as follows. Let <inline-formula id="inf67">
<mml:math id="minf67">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3e;</mml:mo>
<mml:mn>0</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula>. Let <inline-formula id="inf68">
<mml:math id="minf68">
<mml:mrow>
<mml:msup>
<mml:mi>G</mml:mi>
<mml:mo>&#x2020;</mml:mo>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula> be the graph obtained by introducing edges between all pairs of vertices <inline-formula id="inf69">
<mml:math id="minf69">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>, <inline-formula id="inf70">
<mml:math id="minf70">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> such that <inline-formula id="inf71">
<mml:math id="minf71">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:mo>&#x22a5;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:mi>X</mml:mi>
<mml:mi mathvariant="normal">&#x2216;</mml:mi>
<mml:mrow>
<mml:mo>{</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>}</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
</inline-formula> does <italic>not</italic> hold, then <inline-formula id="inf72">
<mml:math id="minf72">
<mml:mrow>
<mml:msup>
<mml:mi>G</mml:mi>
<mml:mo>&#x2020;</mml:mo>
</mml:msup>
</mml:mrow>
</mml:math>
</inline-formula> is the unique minimal I-map. We call <italic>G</italic> a <italic>perfect map</italic> of <inline-formula id="inf73">
<mml:math id="minf73">
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
</mml:math>
</inline-formula> when there are no dependencies in <italic>G</italic> that are not indicated by <inline-formula id="inf74">
<mml:math id="minf74">
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
</mml:math>
</inline-formula>, i.e.,&#x20;<inline-formula id="inf75">
<mml:math id="minf75">
<mml:mrow>
<mml:msub>
<mml:mi>I</mml:mi>
<mml:mi>G</mml:mi>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi>I</mml:mi>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>&#x20;[<xref ref-type="bibr" rid="B11">11</xref>].</p>
</sec>
<sec id="s2-6-2">
<title>2.6.2 Conditional Independence Tests</title>
<p>Conditional independence tests are useful to evaluate whether CI conditions hold, either exactly or, in applications, within a certain bounded error [<xref ref-type="bibr" rid="B24">24</xref>]. In order to write down expressions for CI tests, let us introduce the following <italic>conditional kernels</italic> [<xref ref-type="bibr" rid="B25">25</xref>]:<disp-formula id="e13">
<mml:math id="me13">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="normal">&#x2102;</mml:mi>
<mml:mi>A</mml:mi>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>B</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>B</mml:mi>
<mml:mo>&#x7c;</mml:mo>
<mml:mi>A</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>A</mml:mi>
<mml:mi>B</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>A</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(13)</label>
</disp-formula>
</p>
<p>As well as their generalized recursive relations:<disp-formula id="e14">
<mml:math id="me14">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="normal">&#x2102;</mml:mi>
<mml:mrow>
<mml:mi>A</mml:mi>
<mml:mi>B</mml:mi>
<mml:mi>C</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>D</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi mathvariant="normal">&#x2102;</mml:mi>
<mml:mrow>
<mml:mi>A</mml:mi>
<mml:mi>B</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>&#x7c;</mml:mo>
<mml:mi>C</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="normal">&#x2102;</mml:mi>
<mml:mrow>
<mml:mi>A</mml:mi>
<mml:mi>B</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:mi>C</mml:mi>
<mml:mi>D</mml:mi>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="normal">&#x2102;</mml:mi>
<mml:mrow>
<mml:mi>A</mml:mi>
<mml:mi>B</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mi>C</mml:mi>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(14)</label>
</disp-formula>
</p>
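<p>A quick numerical check of Eq. 13 and of the recursion in Eq. 14, using exact rational arithmetic over a small, hypothetical finite sample space (the events and probabilities below are illustrative only):</p>

```python
from fractions import Fraction

# Hypothetical finite sample space with illustrative probabilities
P = {"a": Fraction(1, 2), "b": Fraction(1, 4),
     "c": Fraction(1, 8), "d": Fraction(1, 8)}

def prob(event):
    return sum(P[w] for w in event)

def kernel(A, B):
    # Eq. 13: C_A(B) = P(B | A) = P(A and B) / P(A)
    return prob(A & B) / prob(A)

A = {"a", "b", "c"}
B = {"b", "c", "d"}
C = {"a", "b", "c"}
D = {"b", "d"}

# Eq. 14 recursion: C_{AB}(D | C) = C_{AB}(C and D) / C_{AB}(C)
lhs = prob(A & B & C & D) / prob(A & B & C)
rhs = kernel(A & B, C & D) / kernel(A & B, C)
assert lhs == rhs == Fraction(2, 3)
```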
<p>The conditional probability of <inline-formula id="inf76">
<mml:math id="minf76">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> given <inline-formula id="inf77">
<mml:math id="minf77">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> can thus be written as:<disp-formula id="e15">
<mml:math id="me15">
<mml:mrow>
<mml:msub>
<mml:mi mathvariant="normal">&#x2102;</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(15)</label>
</disp-formula>
</p>
<p>We can then write down expressions for Markov conditional independence as follows:<disp-formula id="e16">
<mml:math id="me16">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:mo>&#x22a5;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>&#x21d2;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#xd7;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
<label>(16)</label>
</disp-formula>
</p>
<p>Following Bayes&#x2019; theorem, CI conditions&#x2013;in this case&#x2013;will be of the form:<disp-formula id="e17">
<mml:math id="me17">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
<mml:mo>&#xd7;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#xd7;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(17)</label>
</disp-formula>
</p>
<p>
<xref ref-type="disp-formula" rid="e17">Equation 17</xref> is useful since, in large-scale data applications, it is computationally cheaper to work with joint and marginal probabilities than with conditionals.</p>
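<p>A numerical sketch of this point (the conditional tables below are illustrative, not from the article): build a joint distribution in which the CI relation holds by construction, and verify the identity of Eq. 17 using only joint and marginal probabilities.</p>

```python
import itertools

# Hypothetical binary distribution factored as
# P(x_i, x_j, x_l) = P(x_l) P(x_i | x_l) P(x_j | x_l),
# so X_i and X_j are conditionally independent given X_l.
p_l = {0: 0.4, 1: 0.6}
p_i_given_l = {0: {0: 0.7, 1: 0.3}, 1: {0: 0.2, 1: 0.8}}
p_j_given_l = {0: {0: 0.5, 1: 0.5}, 1: {0: 0.9, 1: 0.1}}

def joint(i, j, l):
    return p_l[l] * p_i_given_l[l][i] * p_j_given_l[l][j]

for i, j, l in itertools.product([0, 1], repeat=3):
    p_il = sum(joint(i, jj, l) for jj in (0, 1))   # P(X_i, X_l)
    p_jl = sum(joint(ii, j, l) for ii in (0, 1))   # P(X_j, X_l)
    p_ll = sum(joint(ii, jj, l)                    # P(X_l)
               for ii in (0, 1) for jj in (0, 1))
    lhs = joint(i, j, l) / p_ll                    # P(X_i, X_j | X_l)
    rhs = (p_il * p_jl) / p_ll**2                  # right side of Eq. 17
    assert abs(lhs - rhs) < 1e-12
```

<p>Only joints and marginals enter the right-hand side, which is what makes this form attractive when conditionals are expensive to estimate.</p>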
<p>Now let us consider the case of conditional independence given several conditional variables. The case for CI given two variables could be written&#x2013;using conditional kernels&#x2013;as follows:<disp-formula id="e18">
<mml:math id="me18">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:mo>&#x22a5;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
<mml:mo>&#x21d2;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#xd7;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
<label>(18)</label>
</disp-formula>
</p>
<p>Hence,<disp-formula id="e19">
<mml:math id="me19">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:msub>
<mml:mi mathvariant="normal">&#x2102;</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#xd7;</mml:mo>
<mml:msub>
<mml:mi mathvariant="normal">&#x2102;</mml:mi>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
</mml:msub>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
<label>(19)</label>
</disp-formula>
</p>
<p>Using Bayes&#x2019; theorem,<disp-formula id="e20">
<mml:math id="me20">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
<mml:mo>&#xd7;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(20)</label>
</disp-formula>
</p>
<p>Or<disp-formula id="e21">
<mml:math id="me21">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>&#x7c;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#xd7;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>l</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>n</mml:mi>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(21)</label>
</disp-formula>
</p>
<p>In order to generalize the previous results to CI relations given an arbitrary set of conditionals, let us consider the following <italic>sigma-algebraic</italic> approach:</p>
<p>Let <inline-formula id="inf78">
<mml:math id="minf78">
<mml:mrow>
<mml:msub>
<mml:mtext>&#x3a3;</mml:mtext>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> be the &#x3c3;-algebra of all subsets of <italic>X</italic> that do not contain <inline-formula id="inf79">
<mml:math id="minf79">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> or <inline-formula id="inf80">
<mml:math id="minf80">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula>. A relevant problem for network reconstruction is that of establishing the more general Markov pairwise CI conditions, i.e.,&#x20;the CI relations for every edge not drawn on the graph. Two arbitrary nodes <inline-formula id="inf81">
<mml:math id="minf81">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> and <inline-formula id="inf82">
<mml:math id="minf82">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> are conditionally independent given the rest of the graph iff:<disp-formula id="e22">
<mml:math id="me22">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>&#x22a5;</mml:mo>
<mml:mo>&#x22a5;</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mtext>&#x3a3;</mml:mtext>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>&#x21d2;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mtext>&#x3a3;</mml:mtext>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mtext>&#x3a3;</mml:mtext>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#xd7;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mtext>&#x3a3;</mml:mtext>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
</mml:math>
<label>(22)</label>
</disp-formula>
</p>
<p>By using conditional kernels, the recursive relations and Bayes&#x2019; theorem it is possible to write down:<disp-formula id="e23">
<mml:math id="me23">
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mtext>&#x2009;</mml:mtext>
<mml:mo>&#x7c;</mml:mo>
<mml:mtext>&#x2009;</mml:mtext>
<mml:msub>
<mml:mtext>&#x3a3;</mml:mtext>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#x3d;</mml:mo>
<mml:mfrac>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mtext>&#x3a3;</mml:mtext>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
<mml:mo>&#xd7;</mml:mo>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
<mml:mo>,</mml:mo>
<mml:msub>
<mml:mtext>&#x3a3;</mml:mtext>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mrow>
<mml:mi mathvariant="normal">&#x2119;</mml:mi>
<mml:msup>
<mml:mrow>
<mml:mrow>
<mml:mo>(</mml:mo>
<mml:mrow>
<mml:msub>
<mml:mtext>&#x3a3;</mml:mtext>
<mml:mrow>
<mml:mi>i</mml:mi>
<mml:mi>j</mml:mi>
</mml:mrow>
</mml:msub>
</mml:mrow>
<mml:mo>)</mml:mo>
</mml:mrow>
</mml:mrow>
<mml:mn>2</mml:mn>
</mml:msup>
</mml:mrow>
</mml:mfrac>
</mml:mrow>
</mml:math>
<label>(23)</label>
</disp-formula>
</p>
<p>The family of <xref ref-type="disp-formula" rid="e23">Eq. 23</xref> represents the CI relations for all the non-existing edges in the graph <italic>G</italic>, i.e.,&#x20;every pair of nodes <inline-formula id="inf83">
<mml:math id="minf83">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>i</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> and <inline-formula id="inf84">
<mml:math id="minf84">
<mml:mrow>
<mml:msub>
<mml:mi>X</mml:mi>
<mml:mi>j</mml:mi>
</mml:msub>
</mml:mrow>
</mml:math>
</inline-formula> not connected in <italic>G</italic> must be conditionally independent given the rest of the nodes in the graph. This is perhaps the most important feature of MRFs in connection with potential applications as probabilistic graphical models. CI conditions often lead to simpler (or at least computationally tractable) ways to factorize the PDF or compute the partition function.</p>
<p>Establishing these CI relations <italic>in general</italic> is algorithmically prohibitive for a large number of variables/relationships, since the number of CI relations grows combinatorially with the size of the graph, in spite of recent advances on optimizing CI testing for discrete distributions in large dimensional spaces [<xref ref-type="bibr" rid="B26">26</xref>]. Herein lies the biggest advantage of the present approach: as long as one deals with strictly positive probabilities (which one can often attain <italic>via</italic> regularization) and the Hammersley-Clifford conditions apply, modeling with nearest neighbor Gibbs potentials ensures the CI conditions in the graph (recall that, under positivity, the global Markov property implies the pairwise Markov property and vice versa).</p>
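As a concrete illustration of how a Gibbs (clique-potential) factorization enforces the pairwise CI relations of Eq. 22, the following minimal sketch builds the joint distribution of a three-node binary chain MRF from two arbitrarily chosen strictly positive potentials (hypothetical values, not taken from any cited work) and verifies numerically that the two non-adjacent nodes are conditionally independent given the middle one:

```python
import itertools
import numpy as np

# Pairwise (clique) potentials on a 3-node chain X1 - X2 - X3; the values
# are arbitrary strictly positive numbers, as Hammersley-Clifford requires.
psi12 = np.array([[2.0, 0.5], [0.7, 1.3]])
psi23 = np.array([[1.1, 0.9], [0.4, 2.2]])

# Joint distribution from the Gibbs factorization P(x) ∝ ψ12(x1,x2) ψ23(x2,x3).
joint = np.zeros((2, 2, 2))
for x1, x2, x3 in itertools.product(range(2), repeat=3):
    joint[x1, x2, x3] = psi12[x1, x2] * psi23[x2, x3]
joint /= joint.sum()  # normalization plays the role of the partition function Z

# Pairwise Markov property: X1 and X3 (no edge between them) are
# conditionally independent given X2.
for x2 in range(2):
    cond = joint[:, x2, :] / joint[:, x2, :].sum()       # P(x1, x3 | x2)
    prod = np.outer(cond.sum(axis=1), cond.sum(axis=0))  # P(x1 | x2) P(x3 | x2)
    assert np.allclose(cond, prod)
```

The same check run over all pairs of non-adjacent nodes is what Eq. 23 encodes for a general graph; here the factorized form makes it hold automatically, with no combinatorial search over conditioning sets.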
<p>Now that we have presented the fundamentals of MRFs at an introductory level, we can discuss how these features bear on their wide range of applications as the basis for probabilistic graphical models. Let us start by considering some recent applications in physics.</p>
</sec>
</sec>
</sec>
<sec id="s3">
<title>3 Markov Random Fields in Physics</title>
<p>From the pioneering work of the Ehrenfests, to the foundational Ising models and their extensions (Potts, XY, etc.), MRFs have been thoroughly used and developed in many subdisciplines of physics, ranging from condensed matter and mathematical physics to geophysics, econophysics and more. There are numerous in-depth reviews and monographs summarizing research along these lines (see, for instance [<xref ref-type="bibr" rid="B27">27</xref>&#x2013;<xref ref-type="bibr" rid="B30">30</xref>]). Since the main goal here is to present some of the characteristic features behind the usefulness of MRFs as probabilistic graphical models, in terms of their mathematical properties and broad scope of applicability both within and outside physics, our discussion will be somewhat biased toward work showing one or more of such features.</p>
<sec id="s3-1">
<title>3.1 MRFs in Statistical Mechanics and Mathematical Physics</title>
<p>Due to their intrinsic simplicity and generality, MRFs have attracted the attention of mathematical physicists and probability theorists looking to extend their associated theoretical foundations. Important work has been done, for instance, to incorporate geometrical properties and generalized embeddings into the theory of random fields. Extremely relevant in this regard is the monumental work presented in the monograph by Adler and Taylor [<xref ref-type="bibr" rid="B31">31</xref>]. There, the authors expand the notion of a random field from a stochastic process in a metric space (discrete, Euclidean, etc.) to a stochastic mapping over manifolds. This extension is achieved by writing down differential-geometric characterizations of the fields based on a measure-theoretic definition of probability. Though this work may seem quite abstract, it was indeed born out of an idea for an application of random fields to neuroscience. Drawing on similar ideas, recent work by Ganchev [<xref ref-type="bibr" rid="B32">32</xref>] has expanded the notion of locality of MRFs and assimilated it to the geometric features present in lattice quantum gauge theories, to generate a <italic>gauge theory of Markov-Gibbs fields</italic>. Again, even if the setting seems quite theoretical, an application to the modeling of trading networks in finance is given.</p>
<p>Other mathematical extensions of Markov random fields are related to the nature of the graphical model considered. In general, probabilistic graphical models belong to one of two quite general classes: Markov networks (such as MRFs), which are <italic>undirected</italic> graphs, or Bayesian networks, which are <italic>directed</italic> graphs. The difference between undirected and directed graphical models has consequences for the fundamental mathematical objects of the theory: joint probabilities or conditional probabilities, loopy graphs or trees (directed acyclic graphs), clique factorization vs. conditional probability factorization <italic>via</italic> the chain rule, etc. Whether the model is undirected or directed also has modeling and computational consequences. To be fair, both models have pros and cons.</p>
<p>Trying to overcome the limitations of both general approaches, Freno and Trentin [<xref ref-type="bibr" rid="B33">33</xref>] developed a more general approach to random fields termed <italic>Hybrid random fields</italic> (HRFs). The purpose of HRFs is to allow the systems to present a wider variety of conditional independence structures. As we will discuss later, allowing for a systematic incorporation of more general classes of conditional independence structures is indeed one of the current <italic>hot topics</italic> in computational intelligence and machine learning. Actually, even though HRFs are theoretical constructs (much like MRFs), they were designed to be <italic>learning machines</italic>, i.e.,&#x20;to be supplemented with training algorithms to deal with high dimensional data. HRFs were developed for logical inference in the presence of partial information or noise. As in the case of MRFs and of their gauge extensions just mentioned, HRFs rely on a <italic>principle of locality</italic>, an extension of the Markov property that allows for sparse stochastic matrix representations amenable to computation in actual applications. Once a (graph) structure has been given (or inferred), HRFs are able (as is the case of MRFs) to learn the local (conditional or joint-partial) probability distributions from empirical data, a task commonly known in statistics as <italic>parameter learning</italic> [<xref ref-type="bibr" rid="B34">34</xref>]. Hence HRFs are theoretically founded, but developed with applications in mind. The scope of applicability of MRFs has also become broader through their extension to model tensor-valued quantities [<xref ref-type="bibr" rid="B35">35</xref>], giving rise to the so-called multilayer graphical models, also called multilayer networks [<xref ref-type="bibr" rid="B36">36</xref>&#x2013;<xref ref-type="bibr" rid="B39">39</xref>].</p>
<p>Aside from expanding the fundamental structure of MRFs, mathematical physics applications of Gibbs random fields are abundant. In particular, the so-called Random Field Ising Model (RFIM) has gained a lot of attention in recent years. By using the monotonicity properties of the associated stochastic field, Aizenman and Peled [<xref ref-type="bibr" rid="B40">40</xref>] were able to prove a power-law upper bound on the correlations in a two-dimensional Ising model supplemented with a quenched random magnetic field. The fact that, by combining random fields (the intrinsic Ising field and the quenched magnetic field), the nature of the phase transitions may drastically change has made the RFIM a current topic of discussion in mathematical statistical mechanics. The consequences of the induction of long range order in the RFIM, leading to the emergence of the so-called Imry-Ma phase or Imry-Ma states (named so since Imry and Ma were actually behind the first proposal of the RFIM [<xref ref-type="bibr" rid="B41">41</xref>]), have been the object of intense study recently. Berzin and co-workers [<xref ref-type="bibr" rid="B42">42</xref>] used MRFs to analyze the dynamic fluctuations of the order parameter in the Imry-Ma RFIM and their coupling with the static fluctuations of the structural random field (accounting for the defects). Interestingly, anisotropic coupling arises from two non-absolutely overlapping local fields [<xref ref-type="bibr" rid="B43">43</xref>]. The effects of the non-overlapping fields on anisotropy and disorder have been studied for several decades [<xref ref-type="bibr" rid="B44">44</xref>], but the actual relationship with non-locality was established relatively recently.
For instance, it was not until 2018 that Chatterjee was able to quantitatively describe the decay of correlations of the 2D RFIM [<xref ref-type="bibr" rid="B45">45</xref>] in a relevant paper that led Aizenman to re-analyze his former, mostly qualitative proposal [<xref ref-type="bibr" rid="B40">40</xref>,&#x20;<xref ref-type="bibr" rid="B46">46</xref>].</p>
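To make the RFIM setting concrete, the following is a minimal Metropolis Monte Carlo sketch of a 2D Ising lattice with a quenched Gaussian random field (a generic toy simulation, not the specific constructions of the works cited above); the lattice size, inverse temperature and disorder strength are illustrative choices:

```python
import numpy as np

rng = np.random.default_rng(0)
L, beta, h_strength = 16, 0.6, 0.5  # lattice size, inverse temperature, disorder strength
spins = rng.choice([-1, 1], size=(L, L))
# Quenched random field: drawn once and held fixed for the whole simulation.
h = h_strength * rng.standard_normal((L, L))

def local_field(s, i, j):
    """Sum of the four nearest-neighbor spins with periodic boundaries."""
    return (s[(i - 1) % L, j] + s[(i + 1) % L, j]
            + s[i, (j - 1) % L] + s[i, (j + 1) % L])

# Metropolis sweeps: flip a spin with probability min(1, exp(-beta * dE)),
# where dE includes both the exchange term and the quenched field term.
for _ in range(100):
    for i in range(L):
        for j in range(L):
            dE = 2.0 * spins[i, j] * (local_field(spins, i, j) + h[i, j])
            if dE <= 0 or rng.random() < np.exp(-beta * dE):
                spins[i, j] *= -1

m = abs(spins.mean())  # magnetization; the random field tends to suppress order
```

Averaging correlations over many independent draws of `h` is how the quenched-disorder statements (such as the power-law correlation bounds mentioned above) are probed numerically.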
<p>Local stochastic phenomena in non-homogeneous and disordered media in the context of the RFIM has also attracted attention in relation to critical exponents and scaling. Trying to expand on the origins of long range order from local interactions, Fytas and coworkers have studied the 4D RFIM and its hyperscaling coefficients [<xref ref-type="bibr" rid="B47">47</xref>]. This is particularly interesting since it has been shown, <italic>via</italic> perturbative renormalization group calculations, that the critical exponents of the RFIM in <italic>D</italic> dimensions are the same as the exponents of the pure Ising model in <inline-formula id="inf85">
<mml:math id="minf85">
<mml:mrow>
<mml:mi>D</mml:mi>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>2</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula> dimensions [<xref ref-type="bibr" rid="B48">48</xref>]. Related work has been carried out by Tarjus and Tissier, who instead resort to the so-called <italic>functional renormalization group</italic> approach in the multi-copy formalism setting [<xref ref-type="bibr" rid="B49">49</xref>]. Their work has extended the predictive capabilities of MRFs by incorporating ideas from symmetry breaking, allowing the characterization not just of long-range order (LRO) but also of intermediate states exhibiting quasi-long-range order (QLRO). The fact that QLRO may be attained from purely Markov statistics (localized interactions) is in itself appealing for statistical physics. The fact that local dependencies may suffice to account for LRO and QLRO, under conditions that do not violate the Markov property of the MRFs, has relevant consequences for the applications of MRFs outside physics, such as image reconstruction and pattern recognition in machine learning. We will come back to these ideas later on.</p>
<p>Locality as depicted in MRFs can also have important consequences for the theory of fluctuations in fields of interacting particles. Reconstructing Boltzmann statistics from local Gibbs fields (which, as we have repeatedly stated, are formally equivalent to MRFs, provided strictly positive probability measures) implies that under central limit scales the fluctuation field of local functions can be represented instead as a function of the density fluctuation field, in what is known as the Boltzmann-Gibbs principle (BGP). It has been shown that the BGP induces a duality whose origins are purely probabilistic, i.e.,&#x20;it is independent of the nature of the interactions provided they comply with the tenets of MRFs&#x20;[<xref ref-type="bibr" rid="B50">50</xref>].</p>
<p>It is worth noticing that these contemporary developments in the formal theory of MRFs are actually founded on seminal work by probability theorists and mathematical physicists such as Dobrushin, Ruelle, Gudder, Kessler and others. For instance, Dobrushin laid out the essential regularity conditions that make it possible to write the conditional probabilities in MRF models explicitly [<xref ref-type="bibr" rid="B8">8</xref>]. This work, further developed by Lanford and Ruelle [<xref ref-type="bibr" rid="B9">9</xref>], gives rise to the so-called Dobrushin-Lanford-Ruelle (DLR) equations, which established in a formal way the properties of general Gibbs measures. Later on, Dobrushin expanded on these ideas by applying perturbation methods to generalize Gibbs measures to even wider classes of interactions (i.e.,&#x20;to include other families of potentials) [<xref ref-type="bibr" rid="B51">51</xref>]. An application of these ideas in quantum field theory can be found in [<xref ref-type="bibr" rid="B52">52</xref>], within the context of (truncated) generalized Gibbs ensembles.</p>
<p>Aside from the measure-theoretical and algebraic foundations of MRFs, important developments were made by considering explicit dependency structures. In particular, the introduction of strong independence properties led to the formal definition of Gaussian random fields by Gudder [<xref ref-type="bibr" rid="B53">53</xref>]. Much of this earlier work has been summarized in the monograph by Kindermann and Laurie Snell [<xref ref-type="bibr" rid="B22">22</xref>]. The fact that MRFs are characterized by Gibbs measures even for many-body interactions (under special conditions), and not only for pair potentials, was already envisioned by Sherman [<xref ref-type="bibr" rid="B54">54</xref>], though it remained an unfinished task for decades. Many-body effects have actually been reported in the context of localization in the random field Heisenberg chain [<xref ref-type="bibr" rid="B55">55</xref>]. A further step toward generalizing MRFs consisted in exploring the equivalence of some properties of random fields in terms of <italic>sample functions</italic>. In this regard, Starodubov [<xref ref-type="bibr" rid="B56">56</xref>] proved that there are random fields stochastically equivalent to an MRF but defined on another probability triple, whose sample functions belong to a map associated with the original MRF. The existence of such mappings has relevant implications for applications, in particular in cases in which explicit computation of the partition function is intractable.</p>
</sec>
<sec id="s3-2">
<title>3.2 MRFs in Condensed Matter Physics and Materials Science</title>
<p>Discrete and continuous versions of random fields have been applied to model systems in condensed matter physics and materials science (CMP/MS). The relevance of MRFs and their extensions relies on their suitability to describe the onset of spatio-temporal phenomena from localized interactions. Acar and Sundararaghavan [<xref ref-type="bibr" rid="B57">57</xref>] have used MRFs to model the spatio-temporal evolution of microstructures, such as grain growth in polycrystalline microstructures as captured by videomicroscopy experiments. Experimental data are the foundation for explicit calculations of the (empirical) conditional probability distributions.</p>
<p>Gaussian random fields have been used to model quenched random potentials in fluids <italic>via</italic> mode-coupling by Konincks and Krakoviack [<xref ref-type="bibr" rid="B58">58</xref>], and to model beta-distributed material properties by Liu and coworkers [<xref ref-type="bibr" rid="B59">59</xref>]. These and other extensions in CMP/MS made use of continuous, piecewise continuous or lattice fluid extensions of Gibbs random fields. Such is also the case of the work of Chen and coworkers [<xref ref-type="bibr" rid="B60">60</xref>], who introduced stochastic harmonic potentials in random fields to account for the effects of local interactions on the properties of structured materials; of the work by Singh and Adhikari [<xref ref-type="bibr" rid="B61">61</xref>] on Brownian motion in confined active colloids; and of the work of Yamazaki [<xref ref-type="bibr" rid="B62">62</xref>] on stochastic Hall magnetohydrodynamics. A semi-continuous approach (called smoothed particle hydrodynamics, SPH), using discrete MRFs and extension theorems, was used by Ullah and collaborators [<xref ref-type="bibr" rid="B63">63</xref>] in their density dependent <italic>hydrodynamic</italic> model for crowd coherency detection in active matter.</p>
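For readers unfamiliar with Gaussian random fields, one simple way to draw samples of a stationary field (one of several standard methods, and not the specific constructions used in the works cited above) is spectral synthesis: filtering white noise in Fourier space with an assumed isotropic power-law amplitude spectrum. A minimal 2D sketch:

```python
import numpy as np

rng = np.random.default_rng(1)
n = 64
alpha = 1.5  # illustrative spectral slope; steeper alpha gives smoother fields

# Radial wavenumber grid for an n x n periodic domain.
kx = np.fft.fftfreq(n)[:, None]
ky = np.fft.fftfreq(n)[None, :]
k = np.sqrt(kx**2 + ky**2)
k[0, 0] = np.inf  # suppress the zero mode so the field has zero mean

# Filter complex white noise with the |k|^(-alpha) amplitude and invert.
white = np.fft.fft2(rng.standard_normal((n, n)))
field = np.real(np.fft.ifft2(white * k**(-alpha)))
field = (field - field.mean()) / field.std()  # standardize to zero mean, unit variance
```

Mapping such a standardized Gaussian field through a quantile transform is also a common trick to obtain fields with non-Gaussian (e.g. beta-distributed) marginals while keeping the spatial correlation structure.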
<p>Extending the ideas of the classic RFIM, Tadic and collaborators [<xref ref-type="bibr" rid="B64">64</xref>] were able to describe critical Barkhausen avalanches in quasi-2D ferromagnets with an open boundary. The use of MRFs with disordered field components has also allowed the characterization of embedded inhomogeneities in the spectral properties of Rayleigh waves, with application to the study of the Earth&#x2019;s microseismic field [<xref ref-type="bibr" rid="B65">65</xref>]. Geoacoustic measurements and their MRF modeling allowed these researchers to estimate the mechanical and structural properties of the Earth&#x2019;s crust and upper mantle. Accurate estimates of these properties are foundational to the development of seismic-resistant devices and structures.</p>
</sec>
<sec id="s3-3">
<title>3.3 Applications of MRFs in Other Areas of Physics</title>
<p>MRFs have also been used in areas of physics other than statistical mechanics and condensed matter. They have been applied, for instance, in geophysical models of marine climate patterns [<xref ref-type="bibr" rid="B66">66</xref>], and to study reservoir lithology [<xref ref-type="bibr" rid="B67">67</xref>] and subsurface soil patterns [<xref ref-type="bibr" rid="B68">68</xref>] from remote sensing data. Aside from geophysics, optics and acoustics have also incorporated MRF applications. In acoustics, for instance, an MRF formalism can be used for the isolation of selected signals [<xref ref-type="bibr" rid="B69">69</xref>] or for the segmentation of sonar pulses [<xref ref-type="bibr" rid="B70">70</xref>]. In chemical physics, MRFs are applied in the analysis of molecular structures [<xref ref-type="bibr" rid="B71">71</xref>], and in the implementation of quantum information algorithms for molecular physics modeling&#x20;[<xref ref-type="bibr" rid="B72">72</xref>].</p>
<p>Disparate as the applications of MRFs in the physical sciences just presented may be, they constitute neither a comprehensive nor even a representative list. However, we expect that the previous discussion has captured some of the essential aspects of their wide range of applicability and of the large room for theoretical development still available for these types of models. Moving on to applications and developments in other disciplines, such as Biology/Biomedicine and the Data Sciences, we will try to convey not just the usefulness of a quintessential model of statistical physics in other realms (which is huge, indeed); we also intend to show how some of the implementations and theoretical improvements made in other disciplines can be exported back to physics and may help to solve some of the many remaining conundrums of the theory and applications of random fields in the physical sciences.</p>
</sec>
</sec>
<sec id="s4">
<title>4 Markov Random Fields in Biology</title>
<p>Biology and Biomedicine are also disciplines in which MRFs have flourished in applications and theoretical development. The abundance of research problems and practical cases involving stochastic phenomena that depend on spatio-temporal localization is most surely behind this. From the reconstruction of complex imaging patterns (not far from applications in geophysics/astrophysics imaging), to the resolution of molecular maps in structural biology, to the disentangling of molecular interaction networks and ecological interactions, there are many outstanding advances involving random fields in biology. Again, we will discuss here just a few examples that will likely provide a panoramic view and perhaps spark interest and curiosity.</p>
<sec id="s4-1">
<title>4.1 Applications of MRFs in Biomedical Imaging</title>
<p>One somewhat natural application of MRFs is image de-noising or <italic>segmentation</italic>. This is a quite general problem in which one wishes to discern patterns in a <italic>blurred</italic> image. In particular, an MRF is built to discern which points in imaging space (pixels, voxels) are locally correlated with each other, pointing to their <italic>membership</italic> in the same object in the image. The Markov neighborhood structure of the MRF is hence used to <italic>un-blur</italic> patterns and enable accurate interpretation of the images. Often MRFs (or their associated conditional random fields) are used in conjunction with inference machines such as Convolutional Neural Networks (CNNs). This is the case of the work by Li and Ping [<xref ref-type="bibr" rid="B73">73</xref>], who used a neural conditional random field (NCRF) for metastasis detection from lymph node slide images. Their NCRF approach infers the spatial correlations among neighboring patches <italic>via</italic> a fully connected conditional MRF incorporated on top of a CNN feature extractor. Their modeling approach used the conditional distribution of an MRF with a Gibbs distribution. As is often the case, the <italic>energy function</italic> (i.e.,&#x20;the <italic>Hamiltonian</italic>) consists of two terms, one summarizing the contributions from unary potentials characteristic of each patch, and the other summing the pairwise potentials measuring the <italic>cost</italic> of jointly assigning two neighboring patches (i.e.,&#x20;the <italic>interaction</italic> potentials).</p>
<p>As is common in physics, estimating the marginals is an intractable problem. Li and Ping resorted to using a <italic>mean-field</italic> approach and then conditioning their results on these mean-field calculations. In order to do this, they trained a CNN with the empirical data. CNN-MRF approaches have also been recently applied to successfully segment computerized tomography images (CT scans) [<xref ref-type="bibr" rid="B74">74</xref>] of the prostate and other pelvic organs at risk. After processing the data with an encoder/decoder scheme, the output of the CNN was used as the unary potential of the MRF. Then, <italic>via</italic> an MRF block model based on local convolution layers, a global convolution layer, and a 3D max-pooling layer, the authors were able to calculate the pairwise potential. The maximum likelihood optimization problem was then solved <italic>via</italic> an adaptive loss function.</p>
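The unary-plus-pairwise energy and the mean-field update just described can be sketched in a heavily simplified form. In the toy example below, random unary scores stand in for CNN outputs, and a 4-neighbor Potts penalty replaces the fully connected pairwise term of the cited works; the grid size and coupling weight are arbitrary illustrative values:

```python
import numpy as np

rng = np.random.default_rng(2)
H, W, w = 8, 8, 1.0  # grid size and pairwise (Potts) weight; illustrative values

# Unary potentials, standing in for CNN scores: u[i, j, c] is the cost of
# assigning class c (e.g. tumor / non-tumor) to patch (i, j).
u = rng.standard_normal((H, W, 2))

# q[i, j, c]: mean-field approximate marginal, initialized from the unaries.
q = np.exp(-u)
q /= q.sum(axis=2, keepdims=True)

for _ in range(20):
    # Aggregate neighbor marginals from the four adjacent patches.
    nb = np.zeros_like(q)
    nb[1:, :] += q[:-1, :]
    nb[:-1, :] += q[1:, :]
    nb[:, 1:] += q[:, :-1]
    nb[:, :-1] += q[:, 1:]
    # Expected Potts penalty for class c = w * (mass neighbors put on the other class).
    q = np.exp(-u - w * (nb.sum(axis=2, keepdims=True) - nb))
    q /= q.sum(axis=2, keepdims=True)

labels = q.argmax(axis=2)  # spatially smoothed segmentation
```

The fixed-point iteration is the discrete analogue of the self-consistent mean-field equations of statistical mechanics: each patch relaxes against the average (rather than the fluctuating) configuration of its neighbors.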
<p>A similar approach was followed by Fu and collaborators [<xref ref-type="bibr" rid="B75">75</xref>] to solve the retinal vessel segmentation problem, fundamental in the diagnostics and surgery of ophthalmological diseases and, until quite recently, performed <italic>manually</italic> by an ocular pathologist. The authors also used a two-term energy function within a mean-field approach. To minimize the energy function subject to empirical constraints, they used a recurrent neural network based on Gaussian kernels on the feature vectors, applying standard gradient descent methods. Blood vessel segmentation was also studied using conditional MRFs by Orlando and coworkers [<xref ref-type="bibr" rid="B76">76</xref>]. However, instead of using a mean-field approach and inferring the marginals using neural networks, these authors chose to perform Maximum a Posteriori (MAP) labeling with likelihood functions optimized <italic>via</italic> Support Vector Machines (SVMs). Image segmentation <italic>via</italic> MRFs can be applied not only at the tissue level, but also at cellular (and even supramolecular) scales. Several blood diseases, for instance, are diagnosed by discerning the quantity, morphology and other aspects of leukocytes, as well as their nuclear and cytoplasmic structure. To this end, Reta and coworkers used unsupervised binary MRFs (i.e.,&#x20;classical Ising-like fields) to study leukocyte segmentation [<xref ref-type="bibr" rid="B77">77</xref>]. A Markov neighborhood and clique potential approach was followed. This <italic>classic</italic> approach sufficed since, from their high quality color imaging data, it was possible to define an energy function based on <italic>a priori</italic> Gaussian-distributed probabilities and then apply a maximum likelihood approach to calculate the posterior probability.
Related ideas were used to study microvasculature disorders in glioblastomas by the group of Kurz&#x20;[<xref ref-type="bibr" rid="B78">78</xref>].</p>
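<p>To make the flavor of these binary (Ising-like) segmentation schemes concrete, the following is a minimal sketch, not any of the cited authors&#x2019; actual pipelines: an iterated-conditional-modes (ICM) labeler combining a Gaussian data term with an Ising smoothness prior, with all parameter values chosen purely for illustration.</p>

```python
import numpy as np

def icm_segment(obs, beta=1.5, sigma=0.6, n_sweeps=5):
    """Binary (Ising-like) MRF segmentation by iterated conditional modes.

    obs   : 2D array of noisy grey-level observations
    beta  : coupling strength of the Ising smoothness prior
    sigma : standard deviation of the assumed Gaussian likelihood
    Labels are +1/-1; class means are fixed at +1 and -1 for illustration.
    """
    labels = np.where(obs > 0, 1, -1)            # initialize by thresholding
    H, W = obs.shape
    for _ in range(n_sweeps):
        for i in range(H):
            for j in range(W):
                s = 0                            # sum over the 4-neighborhood
                for di, dj in ((1, 0), (-1, 0), (0, 1), (0, -1)):
                    ni, nj = i + di, j + dj
                    if 0 <= ni < H and 0 <= nj < W:
                        s += labels[ni, nj]
                best, best_e = labels[i, j], np.inf
                for x in (-1, 1):                # Gaussian data term + Ising prior
                    e = (obs[i, j] - x) ** 2 / (2 * sigma ** 2) - beta * x * s
                    if e < best_e:
                        best, best_e = x, e
                labels[i, j] = best
    return labels
```

<p>Each site update lowers (or preserves) the total energy, so the sweeps converge to a local minimum of the posterior energy; this is also why ICM, unlike exact MAP solvers, is sensitive to initialization.</p>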
<sec id="s4-1-1">
<title>Application Box I: Metastasis Detection</title>
<p>
<list list-type="simple">
<list-item>
<p>
<bold>General problem statement:</bold> Accurate detection of metastatic events is key to proper diagnostics in cancer patients. Pathologists often resort to the analysis of whole slide images (WSI). Computational histopathology aims for the automated modeling and classification of WSI to distinguish between normal and tumor cells, thus alleviating the heavy burden of manual image classification. Li and Ping [<xref ref-type="bibr" rid="B73">73</xref>] used Conditional Random Fields together with deep convolutional neural networks to approach this problem.</p>
</list-item>
<list-item>
<p>
<bold>Theoretical/Methodological approach:</bold> The approach developed by the authors consisted of using a deep convolutional neural network (CNN) for the automated detection of the relevant variables (feature extraction or feature selection). Once these relevant variables had been determined, a conditional random field (CRF) was used to account for the spatial correlations between neighboring patches. The approach used to determine tumor and non-tumor regions is similar to the one used in the statistical physics of condensed matter for the determination of ferromagnetic/anti-ferromagnetic domains.</p>
</list-item>
<list-item>
<p>
<bold>Improvements/advantages:</bold> The use of CNNs to reduce the number of variables (and to find the optimal ones) is gaining relevance in computational biology and data analysis applications of random fields. It may prove useful in any setting in which there are no <italic>a priori</italic> determined relevant variables. By conditioning these variables on the spatial location, the authors turned the configuration problem into a classification problem, thus solving it.</p>
</list-item>
<list-item>
<p>
<bold>Limitations:</bold> Though not an actual limitation for their particular problem, the authors resorted to a mean-field approach to infer the marginals. This approximation could be strengthened by using approaches such as perturbative expansions or maximum entropy optimization with a suitable set of constraints.</p>
</list-item>
</list>
</p>
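<p>The mean-field strategy discussed in this box can be illustrated schematically. The sketch below is a generic naive mean-field update for a grid CRF with a Potts pairwise term (not the authors&#x2019; actual implementation): unary scores, such as those produced by a CNN, are iteratively smoothed by the marginals of neighboring sites.</p>

```python
import numpy as np

def mean_field_crf(unary, beta=2.0, n_iters=10):
    """Naive mean-field inference for a grid CRF with a Potts pairwise term.

    unary : (H, W, K) array of unary energies (e.g. negative CNN log-scores)
    beta  : strength of the Potts penalty for disagreeing 4-neighbors
    Returns an (H, W, K) array of approximate marginals q.
    """
    q = np.exp(-unary)
    q /= q.sum(axis=2, keepdims=True)            # initialize from the unaries
    for _ in range(n_iters):
        nb = np.zeros_like(q)                    # summed neighbor marginals
        nb[1:] += q[:-1]
        nb[:-1] += q[1:]
        nb[:, 1:] += q[:, :-1]
        nb[:, :-1] += q[:, 1:]
        # The label-independent part of the Potts energy cancels on
        # normalization, leaving logit = -unary + beta * (neighbor agreement).
        logits = -unary + beta * nb
        logits -= logits.max(axis=2, keepdims=True)
        q = np.exp(logits)
        q /= q.sum(axis=2, keepdims=True)
    return q
```

<p>With a strong enough coupling, an isolated pixel whose unary score disagrees with all of its neighbors is pulled to the consensus label, which is precisely the smoothing effect sought in WSI classification.</p>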
<p>MRFs have also been used in conjunction with deep learning approaches for the topographical reconstruction of colon structures from conventional endoscopy images. Since the colon is a highly complex anatomical structure, accurately reconstructing it to detect anomalies related to, for instance, colorectal cancer is of paramount importance. Mahmood and Durr [<xref ref-type="bibr" rid="B79">79</xref>] developed a deep convolutional neural network-conditional random field method which uses a two-term energy function whose parameters are optimized <italic>via</italic> stochastic-descent back-propagation. Several convolution maps were used, since their goal was also to estimate depth from photographic (2D) images <italic>via</italic> MAP (i.e.,&#x20;maximum a posteriori) optimization. This was possible because the authors trained their model with over 200,000 synthetic images of an anatomically realistic&#x20;colon.</p>
<p>To improve the automated evaluation of mammography, Sari and coworkers [<xref ref-type="bibr" rid="B80">80</xref>] developed an MRF approach supplemented with simulated annealing optimization (MRF/SA). Improved performance was attained by using pre-processing filters, leading to an AUC/ROC of up to 0.84, which is considered quite high since mammograms have proved especially hard to interpret with computer-aided diagnostics. MRFs have also helped improve the estimation of cardiac strain from magnetic resonance imaging data, a relatively non-invasive test to analyze cardiac muscle mechanics&#x20;[<xref ref-type="bibr" rid="B81">81</xref>].</p>
</sec>
</sec>
<sec id="s4-2">
<title>4.2 Applications of MRFs in Computational Biology and Bioinformatics</title>
<p>Computational biology and bioinformatics are also disciplines that have widely adopted the random field formalism as a relevant component of their toolkits. There are several instances in which MRFs can be adapted to solve problems in these domains: from structural biology problems, in which spatio-temporal locality is naturally mapped onto random fields, to molecular regulatory networks, in which the graph structure of the MRF mimics the underlying connectivity of the networks, to <italic>semantic</italic> and <italic>linguistic</italic> segmentation problems in genomic sequences or biomedical&#x20;texts.</p>
<p>Regarding computational models in structural biology, Rosenberg-Johansen and his group [<xref ref-type="bibr" rid="B82">82</xref>] used a combination of deep neural networks and conditional random fields to improve predictions of the secondary structure of proteins (i.e.,&#x20;the three-dimensional conformation of local protein segments: the formation of alpha helices, beta sheets and so on). The CRF approach was quite useful in this case since, in protein secondary structure, there is a high degree of crosstalk between neighboring elements (residues); the local dependency structure thus greatly shrinks the otherwise computationally intractable <italic>search space</italic>. Previously, Yanover and Fromer [<xref ref-type="bibr" rid="B83">83</xref>] applied an MRF formalism for the prediction of low energy protein side-chain configurations, a relevant problem for several aspects of structural biology such as <italic>de novo</italic> protein folding, homology modeling and protein-protein docking. The different types of local interactions among amino acid residues (hydrophobic, hydrophilic, charged, polar, etc.), modeled as pairwise potentials, led to semi-empirical expressions for the potential energies used in the MRF formalism. Once explicit expressions for the field had been written, the authors resorted to a belief-propagation algorithm to find the optimal solution to the MRF problem given the constraints. Several improvements to the message-passing algorithm allowed the authors to obtain the lowest-energy amino acid chain configurations. This kind of approach may also be relevant to improving solution methods for random fields in statistical physics, since it led to approximate explicit forms of the partition function.</p>
<p>Improved methods to discern the structural properties of proteins are also widely used in the context of protein homology, i.e.,&#x20;to investigate the functions of proteins through their structural similarity to other proteins, perhaps in different organisms. Local homology relationships can also be investigated by means of Markov random field methods. Xu and collaborators developed a method (or better, a <italic>family</italic> of methods) called MRFalign for protein homology detection based on the alignment of MRFs [<xref ref-type="bibr" rid="B84">84</xref>, <xref ref-type="bibr" rid="B85">85</xref>]. Aside from purely <italic>Ising</italic> approaches, other random field methods from statistical mechanics have been adopted in the computational biology community. One of them is the Potts model. Recently, Wilburn and Eddy used a Potts model with latent variables for the prediction of <italic>remote</italic> protein homology (involving changes such as insertions and deletions) [<xref ref-type="bibr" rid="B86">86</xref>]; importance sampling from extensive databases was used to perform MAP optimization, as is commonly done in computational biology and computer science.</p>
<p>A topic related to homology, but also involving space-dependent electrostatic interactions (protein-protein interactions, in particular) is protein function prediction. Networked models of protein function prediction have been developed: primitive models associate a function to a given protein from the functions of proteins in its interaction neighborhood, while probabilistic models may do this by weighting interactions with an associated probability. Gehrman and collaborators devised a CRF method for protein function prediction based on these premises [<xref ref-type="bibr" rid="B87">87</xref>]. To solve the CRF, they resorted to a factor graph approach [<xref ref-type="bibr" rid="B88">88</xref>] to write down explicit contributions to the cliques [<xref ref-type="bibr" rid="B89">89</xref>], and then used an approximate Gibbs measure calculated from this clique factorization. The approximation is based on another relevant feature of Markov random fields, which we will discuss later in the context of statistics and computer science: the use of the so-called <italic>Gibbs sampler</italic> or Gibbs sampling algorithm [<xref ref-type="bibr" rid="B90">90</xref>]. The Gibbs sampler is a Markov chain Monte Carlo (MCMC) method used to obtain a sequence of observations, approximated from a specified multivariate probability distribution, in those cases for which direct sampling is difficult or even impossible (e.g., NP-hard or super-combinatorial problems).</p>
<p>Perhaps not so well known as a relevant structural biology problem until recently is the determination of the three-dimensional chromosome structure inside the cell&#x2019;s nucleus. Long range chromosomal interactions are believed to be ultimately related to fundamental issues in global and local gene regulation phenomena. A recently devised experimental method for global <italic>chromosome conformation capture</italic> is known as Hi-C. Nuclear DNA is subjected to formaldehyde treatment to enhance covalent interactions <italic>gluing</italic> chromosome segments that are three-dimensionally adjacent. Then a battery of restriction enzymes is used to cut the DNA into pieces. Such pieces are sequenced, and the identities of the spatially adjacent regions are then discovered. The data is noisy and often incomplete. For these reasons, a team led by Yun Li developed a hidden Markov random field method to analyze Hi-C data to detect long range chromosomal interactions [<xref ref-type="bibr" rid="B91">91</xref>]. This method combines ideas from MRFs, Bayesian networks and hidden Markov models. In a nutshell, they assumed a mixture of negative binomials as an Ising prior [<xref ref-type="bibr" rid="B22">22</xref>] and supplemented it with Bayesian inference to calculate the joint probabilities <italic>via</italic> a Metropolis-Hastings pseudo-likelihood approach.</p>
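<p>The Metropolis-Hastings machinery underlying such pseudo-likelihood approaches can be sketched in a few lines. The following generic random-walk sampler (an illustration, not the cited authors&#x2019; code) draws from any target distribution given only its unnormalized log density:</p>

```python
import numpy as np

def metropolis_hastings(log_target, x0=0.0, n=50000, step=2.0, seed=0):
    """Random-walk Metropolis-Hastings: sample a target distribution
    knowing only its (unnormalized) log density."""
    rng = np.random.default_rng(seed)
    x, lp = x0, log_target(x0)
    out = np.empty(n)
    for i in range(n):
        prop = x + rng.normal(0.0, step)         # symmetric random-walk proposal
        lp_prop = log_target(prop)
        # accept with probability min(1, target(prop) / target(x))
        if np.log(rng.uniform()) < lp_prop - lp:
            x, lp = prop, lp_prop
        out[i] = x
    return out
```

<p>Because only ratios of the target density appear, the (often intractable) normalization constant of the Gibbs measure never needs to be computed, which is exactly why MCMC is so attractive for random field inference.</p>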
<sec id="s4-2-1">
<title>Application Box II: Prediction of Low Energy Protein Side Chain Configurations</title>
<p>
<list list-type="simple">
<list-item>
<p>
<bold>General problem statement:</bold> The prediction of energetically favorable amino acid side-chain configurations, constrained by the three-dimensional structure of a protein&#x2019;s principal chain, is a relevant problem in structural biology. Accurate side-chain configuration predictions are key to developing approaches to <italic>de novo</italic> protein folding, modeling protein homology and studying protein-protein docking. Yanover and Fromer [<xref ref-type="bibr" rid="B83">83</xref>] used a Markov Random Field with pairwise energy interactions supplemented with a belief propagation algorithm to bypass the mean field approximation.</p>
</list-item>
<list-item>
<p>
<bold>Theoretical/Methodological approach:</bold> The authors developed their approach by modeling energy levels (as obtained by simulation and calorimetric techniques) as the relevant variables in a pairwise Markov Random Field. Since local side-chain configurations have inhomogeneous contributions to the global energy landscape, a mean-field approach would not be accurate. To circumvent the other extreme of modeling all detailed molecular interactions, the authors used a belief propagation algorithm (BPA), a class of message-passing method that performs global optimization (in this case energy minimization) by iterative local calculations between neighboring&#x20;sites.</p>
</list-item>
<list-item>
<p>
<bold>Improvements/advantages:</bold> We can consider the use of the BPA on top of the MRF as a compromise between a mean-field approach (not accurate enough to solve the actual structural biology problem) and full-detail molecular interaction modeling (computationally intractable due to the large combinatorial search space involved).</p>
</list-item>
<list-item>
<p>
<bold>Limitations:</bold> Protein side chain prediction may in many cases be affected by subtle angular variations in the rotamer side chains. The authors have discussed that, to improve the accuracy of their predictions in such cases, it may be useful to resort to continuous-valued (Gaussian) MRFs with their associated BPAs as an avenue for further improvement within the current theoretical framework.</p>
</list-item>
</list>
</p>
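<p>The message-passing idea behind this kind of approach can be illustrated on the simplest non-trivial topology. The sketch below runs min-sum belief propagation on a chain-structured pairwise MRF, where it is exact; the energy tables are arbitrary placeholders rather than real side-chain energetics.</p>

```python
import numpy as np

def chain_min_energy_bp(unary, pair):
    """Min-sum belief propagation on a chain-structured pairwise MRF.

    unary : list of arrays, unary[i][x] = energy of state x at site i
    pair  : list of matrices, pair[i][x, y] = interaction energy between
            state x at site i and state y at site i + 1
    Returns (min_energy, argmin_configuration). Exact on chains and trees.
    """
    n = len(unary)
    msg = [np.zeros(len(u)) for u in unary]      # forward messages
    back = [None] * n                            # argmin back-pointers
    for i in range(1, n):
        tot = unary[i - 1] + msg[i - 1]          # accumulated cost at site i-1
        cand = tot[:, None] + pair[i - 1]        # add the pairwise term
        back[i] = np.argmin(cand, axis=0)
        msg[i] = np.min(cand, axis=0)
    final = unary[-1] + msg[-1]
    states = [int(np.argmin(final))]
    for i in range(n - 1, 0, -1):                # trace back the optimum
        states.append(int(back[i][states[-1]]))
    states.reverse()
    return float(final.min()), states
```

<p>On a chain, each forward pass touches every pairwise table once, so the cost is linear in the number of sites instead of exponential, which is the &#x2018;shrunken search space&#x2019; alluded to above.</p>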
<p>The spatial configuration of proteins within protein assemblies such as membranes is also relevant to understanding the functions of molecular machines in the cell. By applying a combination of deep recurrent neural networks and CRFs, it was possible to predict transmembrane topology and three-dimensional coupling in the important family of G-protein coupled receptors (GPCRs). These receptors detect molecules outside the cell and activate cellular responses, and are of paramount relevance in immune responses and intercellular signaling&#x20;[<xref ref-type="bibr" rid="B92">92</xref>].</p>
<p>As we have mentioned, molecular regulatory networks are models that map almost straightforwardly onto random fields. They already have a graph-theoretical structure, and their interactions are often so complex that modeling them as stochastic dependencies is somehow natural [<xref ref-type="bibr" rid="B93">93</xref>]. Depending on the nature of the regulatory interactions to be modeled, different approaches can be followed. Gitter and coworkers, for instance, used latent tree models combining an MRF with a set of hidden (or latent) variables, factorizing the joint probability on a Markov tree [<xref ref-type="bibr" rid="B94">94</xref>]. In this work, the action of transcription factors (TFs) was mapped to a set of latent variables, and the MRF was used to establish the relationships of conditional independence of groups of neighboring genes <italic>via</italic> their gene expression patterns obtained from experimental data. Zhong and colleagues [<xref ref-type="bibr" rid="B95">95</xref>] used a related approach to infer regulatory networks <italic>via</italic> a <italic>directed</italic> random field, giving rise to a tree structure known as a directed acyclic graph (DAG). In their work, all variables follow a pairwise Markov field with conditional dependencies following parametric Gaussian or multinomial distributions. Although they resorted to DAG modeling due to its ability to work with <italic>mixed</italic> data (for which common MRF approaches are usually underpowered), the limitation of these studies in accounting for regulatory loops has to be considered.</p>
</sec>
<sec id="s4-2-2">
<title>Application Box III: Inference of Tissue-specific Transcriptional Regulatory Networks</title>
<p>
<list list-type="simple">
<list-item>
<p>
<bold>General problem statement:</bold> Transcriptional regulatory programs determine how gene expression is regulated, thus determining cellular phenotypes and responses to external stimuli. Such gene regulatory programs involve a complex network of interactions among gene regulatory elements, RNA polymerase enzymes, protein complexes such as the Mediator and cohesin machineries, and sequence-specific transcription factors. Ma and coworkers [<xref ref-type="bibr" rid="B96">96</xref>] used a Markov Random Field approach to construct tissue-specific transcriptional regulatory networks integrating gene expression and regulatory site data from RNA-seq and DNase-seq experiments.</p>
</list-item>
<list-item>
<p>
<bold>Theoretical/Methodological approach:</bold> The authors developed an MRF approach with unary (node functions) and binary (edge functions, i.e.,&#x20;pairwise interactions) potentials for transcriptional interactions within a cell line and across cell lines, respectively. With these two potential functions a joint probability distribution (JPD) is written. To solve the problem, the JPD is mapped to a pseudo-energy optimization (PEO) problem <italic>via</italic> a logarithmic transformation. The PEO is in turn transformed into a network maximum flow problem and solved by a loopy&#x20;BPA.</p>
</list-item>
<list-item>
<p>
<bold>Improvements/advantages:</bold> An original contribution of this work is the use of belief propagation algorithms to solve a quadratic pseudo-energy function representation (with only unary and pairwise potentials) and then apply iterated conditional modes. This may open an interesting research path for other MRF applications.</p>
</list-item>
<list-item>
<p>
<bold>Limitations:</bold> One possible shortcoming of this approach is the use of linear correlation measures (Pearson coefficients) and linear classifiers (Singular Value Decomposition) for a problem with strong non-linearities (the complex biochemical kinetics associated with gene expression). The MRF structure would indeed allow for more general statistical dependency relationships, making the analysis even more robust.</p>
</list-item>
</list>
</p>
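<p>The logarithmic mapping from a joint probability distribution to a pseudo-energy, the first step of the pipeline described in this box, can be sketched as follows, with toy potentials in place of the real transcriptional data: minimizing the negative-log energy is equivalent to maximizing the unnormalized joint probability.</p>

```python
import numpy as np

def pseudo_energy(labels, node_pot, edge_pot, edges):
    """Negative-log pseudo-energy of a labeling of a pairwise MRF.

    node_pot : (N, K) array of (positive) node potentials
    edge_pot : dict mapping each edge (i, j) to a (K, K) potential matrix
    edges    : list of (i, j) pairs
    Minimizing this energy is equivalent to maximizing the unnormalized
    joint probability  prod_i phi_i(x_i) * prod_(i,j) psi_ij(x_i, x_j),
    since -log is monotonically decreasing.
    """
    e = -sum(np.log(node_pot[i, labels[i]]) for i in range(len(labels)))
    e -= sum(np.log(edge_pot[(i, j)][labels[i], labels[j]]) for i, j in edges)
    return e
```

<p>Because the partition function is a labeling-independent constant, it drops out of the optimization entirely; only the unary and pairwise terms remain, which is what makes the subsequent max-flow reformulation possible.</p>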
<p>Undirected graphical models, in the form of usual MRFs, have been used to construct tissue-specific transcriptional regulatory networks [<xref ref-type="bibr" rid="B96">96</xref>] in 110 cell lines and 13 different tissues, from an integrative analysis of RNA-seq and DNase-seq data. The authors used a method to minimize the pseudo-energy function by converting the problem to a maximum flow problem in networks and solving the latter <italic>via</italic> a loopy belief propagation algorithm&#x20;[<xref ref-type="bibr" rid="B97">97</xref>].</p>
<p>To improve the modeling capabilities of MRFs in describing gene regulatory networks (GRNs), it is becoming customary to include several data sources as a means to partially disambiguate the statistical dependency structures. Banf and Rhee applied a data integration strategy in their MRF modeling of GRNs, in an algorithm called GRACE, which exploits the energy function based on unary and binary terms that we previously described in the context of MRF modeling in biological imaging. Low confidence pairwise interactions were removed by mapping the problem to a classification task on imbalanced sets, following the tenets of Ridge-penalized regression&#x20;[<xref ref-type="bibr" rid="B98">98</xref>].</p>
<p>A somewhat related method was devised by Grimes, Potter and Datta, who integrated differential network analysis into their study of gene expression data [<xref ref-type="bibr" rid="B99">99</xref>]. Their study was based on the idea of using KEGG pathways to construct MRFs as a means to functionally improve differential expression profiling [<xref ref-type="bibr" rid="B100">100</xref>, <xref ref-type="bibr" rid="B101">101</xref>]. A similar MRF method was used to improve transcriptome analysis in model (mouse) systems for biomedical research [<xref ref-type="bibr" rid="B102">102</xref>]. Data integration can also be used to incorporate biological function information (from metabolic and signaling pathways) into the modeling of statistical Genome Wide Association Studies (GWAS) <italic>via</italic> MRFs [<xref ref-type="bibr" rid="B103">103</xref>]. The MRF was then solved by a combination of parametric (inverse gamma) distributed priors and MAP techniques to find the posterior probabilities. This is relevant since the important results of GWAS research in biomedicine (statistical in nature and often poorly informative in the biological sense) can be contextualized <italic>via</italic> pathway interactions as devised <italic>via</italic> this MRF approach.</p>
<p>Though not properly a molecular interaction network study, Long et&#x20;al. developed a method combining graph convolutional networks with conditional random fields to predict human microbe-drug associations [<xref ref-type="bibr" rid="B104">104</xref>]. Since there has been a growing emphasis on the ways in which the human microbiome may affect drug responses in the context of precision medicine [<xref ref-type="bibr" rid="B105">105</xref>], accurate methods to predict such associations are highly desirable for the design of tailor-made therapeutic interventions.</p>
<p>Since random fields are able to capture not only spatio-temporal and regulatory associations but also semantic or <italic>grammatical</italic> relationships, they have been thoroughly used in text analysis in biology, whether the underlying texts are genomic sequences or pieces of biomedical literature. The group led by Fariselli used hidden CRFs for the problem of biosequence labeling in the prediction of the topology of prokaryotic outer-membrane proteins. Their study was based on a grammatically restrained approach, using dynamic programming much in the tradition of the so-called Boltzmann machines in AI [<xref ref-type="bibr" rid="B106">106</xref>]. Poisson random fields over sequence spaces were studied by Zhang and coworkers to detect local genomic signals in large sequencing studies [<xref ref-type="bibr" rid="B107">107</xref>].</p>
<p>Moving on to data and literature mining methods based on MRFs, we can mention <italic>passage relevance models</italic> used for the integration of syntactic and semantic elements to analyze biomedical concepts and topics <italic>via</italic> a PGM. The semantic components such as topics, terms and document classes are represented as potential functions of an MRF [<xref ref-type="bibr" rid="B108">108</xref>]. Biomedical literature mining strategies using MRFs were also developed to study automated recognition of bacteria named entities [<xref ref-type="bibr" rid="B109">109</xref>] to curate experimental databases on microbial interactions. Related methods were previously used to identify gene and protein mentions in the literature using CRFs [<xref ref-type="bibr" rid="B110">110</xref>].</p>
</sec>
</sec>
<sec id="s4-3">
<title>4.3 Applications of MRFs in Ecology and Other Areas of Biology</title>
<p>Other applications of random fields in biology include demography and selection, to study weakly deleterious genetic variants in complex demographic environments [<xref ref-type="bibr" rid="B111">111</xref>] and for species clustering [<xref ref-type="bibr" rid="B112">112</xref>], in population genetics. MRFs have also been applied to understand species distribution patterns and endemism [<xref ref-type="bibr" rid="B113">113</xref>], to unveil interactions between co-occurring species in processes governing community assembly [<xref ref-type="bibr" rid="B114">114</xref>], and for spatially explicit community occupancy [<xref ref-type="bibr" rid="B115">115</xref>] in ecology.</p>
<p>Another group of disciplines in which MRFs have flourished is comprised of Data Science, Computer Science and Modern statistics. The next section will be devoted to presenting and discussing some developments of random fields in that setting.</p>
</sec>
</sec>
<sec id="s5">
<title>5 Markov Random Fields in Data Science and Machine Learning</title>
<p>The term <italic>Data Science</italic> refers to a multidisciplinary field devoted to extracting knowledge and insight from structured and unstructured data. It shares commonalities and differences with its parent fields: statistics, computer and information sciences and engineering. However, much of the emphasis is on the extraction of <italic>useful</italic> knowledge from data, putting accuracy and usability above formal mathematical structure if needed. Naturally, Markov random fields, as a theoretically powerful methodology that allows for the incorporation of <italic>educated intuition</italic> and has an intrinsic algorithmic nature, have attracted the attention of data scientists. We will present here but a handful of the many uses and implementations of MRFs in data science and computational intelligence settings. As we will see, these studies share a lot of commonalities with the applications in statistical physics and computational biology while, at the same time, incorporating elements that may cross-fertilize the modeling schemes in the natural sciences.</p>
<sec id="s5-1">
<title>5.1 Applications of MRFs in Computer Vision and Image Classification</title>
<p>As we already mentioned in the context of applications of random fields to biomedical imaging, segmentation and pattern identification to enhance the resolution of spatial and/or spatio-temporal maps is a common use of MRFs. From the many applications in the field of computerized image processing, we will discuss some that present peculiarities or distinctive features that may be of more general interest. For instance, to face the challenge of capturing three-dimensional structure from two-dimensional images, the so-called <italic>depth perception</italic> problem, Kozik used an MRF-based methodology [<xref ref-type="bibr" rid="B116">116</xref>] in which the energy function was modeled <italic>via</italic> a polynomial regression model and a depth estimation algorithm with correlated uncertainties (a sort of twofold autoregressive model). With these ingredients, Kozik then solved a MAP problem to obtain the most probable solution to the&#x20;MRF.</p>
<p>In the context of AI to enhance low-resolution images (the super-resolution problem), Stephenson and Chen devised an adaptive MRF method [<xref ref-type="bibr" rid="B117">117</xref>] based on message-passing optimization by a loopy propagation algorithm. Also in the context of AI approaches to image processing, Li and Wand developed a combination of MRFs as generative models and deep CNNs to discriminate two-dimensional images, trying to solve the so-called <italic>image synthesis problem</italic>, a relevant problem in computer vision with applications to both photo-editing and neuroscience [<xref ref-type="bibr" rid="B118">118</xref>]. A problem related to image synthesis is image classification, in which certain features of images are discerned and used to cluster images by similarities in these feature spaces. Applications in image recognition in security, forensics, and scientific microscopy and imaging, among others, abound. To improve the accuracy of image classification algorithms, Wen and coworkers developed a CRF method in which machine-learned feature functions took the place of the unary and binary terms in the potential energy [<xref ref-type="bibr" rid="B119">119</xref>]; as in previous cases, Gaussian priors and loopy belief propagation algorithms were used to solve the random&#x20;field.</p>
</sec>
<sec id="s5-2">
<title>5.2 Applications of MRFs in Statistics and Geostatistics</title>
<p>Geostatistics and geographical information systems are also quite amenable to modeling within the MRF paradigm, due to their natural spatio-temporal dependency structures. In the context of prediction of environmental risks and the effects of limited sampling, Bohorquez and colleagues developed an approach based on multivariate functional random fields for the spatial prediction of functional features at unsampled locations by resorting to covariates [<xref ref-type="bibr" rid="B120">120</xref>]. As in the case of random field hydrodynamics (mentioned in the physics section), an empirical approach based on continuous field estimators was chosen. Continuous spatio-temporal correlation structures <italic>via</italic> so-called Kriging methods, extending the ideas of discrete random fields, are commonly used in environmental analysis and risk assessment [<xref ref-type="bibr" rid="B121">121</xref>,&#x20;<xref ref-type="bibr" rid="B122">122</xref>].</p>
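<p>A minimal sketch of the kriging idea follows: simple kriging under a known mean and an assumed exponential covariance, both illustrative choices rather than a model fitted to data, predicts the field at an unsampled location as a covariance-weighted combination of the observations.</p>

```python
import numpy as np

def simple_kriging(coords, values, query, mean=0.0, sill=1.0, length=1.0):
    """Simple-kriging prediction at a query point under a known mean.

    Assumes a stationary exponential covariance C(h) = sill * exp(-h / length);
    real studies estimate the covariance (variogram) from the data instead.
    Returns the kriging estimate and the kriging variance.
    """
    coords = np.asarray(coords, float)
    d = np.linalg.norm(coords[:, None] - coords[None, :], axis=2)
    C = sill * np.exp(-d / length)                       # data-data covariance
    c0 = sill * np.exp(-np.linalg.norm(coords - query, axis=1) / length)
    w = np.linalg.solve(C, c0)                           # kriging weights
    est = mean + w @ (np.asarray(values) - mean)
    var = sill - w @ c0                                  # kriging variance
    return est, var
```

<p>The estimator is an exact interpolator: at a sampled location it returns the observed value with zero variance, while far from all samples it reverts to the prior mean with variance equal to the sill.</p>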
<p>Geological modeling is another field at the intersection of geostatistics and geophysics which has adopted the MRF formalism to deal with its problems. A segmentation approach was used for stochastic geological modeling with the use of hidden MRFs [<xref ref-type="bibr" rid="B123">123</xref>]. Using a methodological approximation similar to the one used in computer vision and biomedical imaging, latent variable MRFs are used to perform three-dimensional segmentation. The model is supplemented with finite Gaussian mixture models for the parameter calculations and a Gibbs sampling inference framework, following an approach similar to the one developed by the group of Li [<xref ref-type="bibr" rid="B124">124</xref>], based on the methods of Rue and Held [<xref ref-type="bibr" rid="B125">125</xref>] and of Solberg et&#x20;al. [<xref ref-type="bibr" rid="B126">126</xref>], and further developed by Toftaker and Tjelmeland [<xref ref-type="bibr" rid="B127">127</xref>]. More refined geostatistical methods have been based on a clever combination of several developments of Markov random field theory. Along these lines, the work by Reuschen, Xu and Nowak [<xref ref-type="bibr" rid="B128">128</xref>] is noteworthy, since they used Bayesian inversion (based on Markov conditional independence) to develop a random field approach to hierarchical geostatistical models and used Gibbs sampling MCMC to solve&#x20;them.</p>
<p>The combined use of ideas from Markov and Gibbs random fields in statistical learning and other approaches in modern statistics has indeed become a fruitful line of research, with important theoretical developments and a multitude of applications [<xref ref-type="bibr" rid="B24">24</xref>, <xref ref-type="bibr" rid="B34">34</xref>, <xref ref-type="bibr" rid="B129">129</xref>]. MRFs and CRFs have served as tools for statistical learning in a multitude of settings, in both generative and discriminative models [<xref ref-type="bibr" rid="B33">33</xref>]. Aside from Ising models and MRFs, perhaps the most widely used applications of random fields are the Gibbs sampling and Markov chain Monte Carlo methods that we already mentioned. Due to the generality and the relatively low computational complexity of these sampling/simulation methods, several methods have been developed based on&#x20;them.</p>
<p>Gibbs sampling is a form of Markov chain Monte Carlo (MCMC) algorithm. MCMC methods are used to obtain a sequence of <italic>observations</italic> of a random experiment by approximation from a given (specified) multivariate probability distribution when direct sampling is challenging (computationally or otherwise). The essence of the method is building a Markov chain whose equilibrium distribution is precisely the specified multivariate distribution. Then, a sample of such a distribution is just a sequence of states of the Markov chain. The Markov property of an MRF allows one to use Gibbs sampling as an MCMC method when the joint probability distribution is not known (or is very complex) but the conditional distributions are known (or easier to handle). Because of this, by using the pairwise Markov property, Gibbs sampling is particularly well suited to sampling the posterior distribution of Bayesian networks (understood as collections of conditional distributions), a quite relevant problem in both statistical learning and large computer simulation problems.</p>
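<p>A textbook illustration of this idea, not tied to any of the works cited above, is sampling a correlated bivariate normal using only its two full conditionals, never the joint density:</p>

```python
import numpy as np

def gibbs_bivariate_normal(rho, n_samples=20000, burn_in=1000, seed=0):
    """Gibbs sampling from a standard bivariate normal with correlation rho.

    Uses only the two full conditionals
        x | y ~ N(rho * y, 1 - rho**2),   y | x ~ N(rho * x, 1 - rho**2),
    alternately updating each coordinate: the situation (known conditionals,
    awkward joint) that Gibbs sampling is designed for.
    """
    rng = np.random.default_rng(seed)
    s = np.sqrt(1.0 - rho ** 2)                  # conditional std. deviation
    x = y = 0.0
    samples = np.empty((n_samples, 2))
    for t in range(burn_in + n_samples):
        x = rng.normal(rho * y, s)               # draw x from p(x | y)
        y = rng.normal(rho * x, s)               # draw y from p(y | x)
        if t >= burn_in:
            samples[t - burn_in] = (x, y)
    return samples
```

<p>After burn-in, the chain's equilibrium distribution is the target joint, so empirical means and correlations computed from the samples converge to those of the bivariate normal.</p>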
<p>Aside from these basic issues, Gibbs sampling has been extensively enhanced over the years. One important improvement has been the incorporation of adaptive rejection sampling [<xref ref-type="bibr" rid="B130">130</xref>, <xref ref-type="bibr" rid="B131">131</xref>], particularly useful for situations in which evaluation of the density distribution function is computationally expensive (e.g., non-conjugated Bayesian models). Adaptive rejection sampling can be even applied to modeling <italic>via</italic> non-linear mixed models [<xref ref-type="bibr" rid="B131">131</xref>]. To further minimize the computational burden of Gibbs sampling, Meyer and collaborators [<xref ref-type="bibr" rid="B132">132</xref>] developed an algorithm which samples <italic>via</italic> Lagrange interpolation polynomials, instead of exponential distributions. Convergence can be also improved by double-adaptive independent rejection sampling [<xref ref-type="bibr" rid="B133">133</xref>] which is based on a scheme of minimizing the correlation among samples. Gibbs sampling approaches also allow for the determination of dense distribution simulated sampling from sparse sampled data [<xref ref-type="bibr" rid="B134">134</xref>], even in high dimensional latent fields over large datasets [<xref ref-type="bibr" rid="B135">135</xref>].</p>
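<p>The non-adaptive ancestor of these schemes is plain rejection sampling, which the adaptive variants improve by building ever-tighter envelopes around the target density. A minimal generic sketch (an illustration of the baseline idea, not any of the cited algorithms):</p>

```python
import numpy as np

def rejection_sample(target_pdf, proposal_sample, proposal_pdf, M, n, seed=0):
    """Plain (non-adaptive) rejection sampling.

    Requires an envelope constant M with target_pdf(x) <= M * proposal_pdf(x)
    for all x; adaptive rejection schemes refine this envelope as samples
    accumulate, raising the acceptance rate.
    """
    rng = np.random.default_rng(seed)
    out = []
    while len(out) < n:
        x = proposal_sample(rng)
        # accept x with probability target(x) / (M * proposal(x))
        if rng.uniform() < target_pdf(x) / (M * proposal_pdf(x)):
            out.append(x)
    return np.array(out)
```

<p>The expected acceptance rate is 1/M, which is why a tight envelope (small M) matters so much when the target density is expensive to evaluate.</p>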
<p>Gao and Gormley implemented a Gibbs sampling scheme based on CRFs weighted <italic>via</italic> neural scoring factors (implemented as parameters in factor graphs), with applications to Natural Language Processing (NLP) [<xref ref-type="bibr" rid="B136">136</xref>]. MCMC has also been used with Gibbs random fields in data pre-processing to reduce the computational burden of data-intensive signal processing [<xref ref-type="bibr" rid="B137">137</xref>, <xref ref-type="bibr" rid="B138">138</xref>]. Gibbs sampling can also be applied in parallel within the context of Gaussian MRFs on large grids or lattice models [<xref ref-type="bibr" rid="B139">139</xref>]. Parallel Gibbs sampling methods have also been developed to accelerate sampling on structured graphs [<xref ref-type="bibr" rid="B140">140</xref>].</p>
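The decomposition that makes parallel Gibbs sampling possible on grids can be sketched with a checkerboard (chromatic) schedule: on a bipartite lattice, all sites of one color are conditionally independent given the other color, so each half-sweep could run fully in parallel. The sketch below emulates the two-color schedule sequentially on a small Ising grid; the lattice size and temperature are arbitrary illustrative choices.

```python
import math
import random

def checkerboard_gibbs_ising(L, beta, sweeps, seed=0):
    """Chromatic (checkerboard) Gibbs sampling of an L x L Ising model.

    Sites with (i + j) even and odd form the two colors of the bipartite
    grid; within one color all full conditionals depend only on the other
    color, so each half-sweep is embarrassingly parallel (emulated
    sequentially here for clarity). Periodic boundaries are used.
    """
    rng = random.Random(seed)
    s = [[1 for _ in range(L)] for _ in range(L)]
    for _ in range(sweeps):
        for color in (0, 1):
            for i in range(L):
                for j in range(L):
                    if (i + j) % 2 != color:
                        continue
                    # local field: sum of the four neighboring spins
                    h = (s[(i - 1) % L][j] + s[(i + 1) % L][j]
                         + s[i][(j - 1) % L] + s[i][(j + 1) % L])
                    # conditional P(s_ij = +1 | neighbors)
                    p_up = 1.0 / (1.0 + math.exp(-2.0 * beta * h))
                    s[i][j] = 1 if rng.random() < p_up else -1
    return s

L = 16
s = checkerboard_gibbs_ising(L=L, beta=0.2, sweeps=200)
# nearest-neighbor alignment should be positive for beta > 0
corr = sum(s[i][j] * (s[(i + 1) % L][j] + s[i][(j + 1) % L])
           for i in range(L) for j in range(L)) / (2.0 * L * L)
```

At this high temperature (well below the critical coupling) the chain mixes quickly and the nearest-neighbor correlation settles at a small positive value.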
<p>Markov random fields and their associated Gibbs measures can also be used to advance statistical methods in large deviation theory [<xref ref-type="bibr" rid="B141">141</xref>] and to develop methods of joint probability decomposition based on product measures [<xref ref-type="bibr" rid="B142">142</xref>]. Exact factorizability of joint probability distributions is a highly relevant question in modern probability [<xref ref-type="bibr" rid="B143">143</xref>&#x2013;<xref ref-type="bibr" rid="B146">146</xref>], with important applications in data analytics [<xref ref-type="bibr" rid="B147">147</xref>], applied mathematics [<xref ref-type="bibr" rid="B148">148</xref>], computational biology [<xref ref-type="bibr" rid="B149">149</xref>] and network science [<xref ref-type="bibr" rid="B150">150</xref>], among other fields. MRFs have also been applied to embed filtrations on high dimensional hyperparameter spaces. The main idea is to use random fields as hierarchical models <italic>projecting</italic> the relevant hyperparameter space onto a lower dimensional filtration [<xref ref-type="bibr" rid="B135">135</xref>]. This general problem is closely related to the <italic>feature selection problem</italic> in computer science and data analytics. We will discuss applications of the MRF formalism in that context in the next subsection.</p>
</sec>
<sec id="s5-3">
<title>5.3 Applications of MRFs in Feature Selection and AI</title>
<p>Feature selection (FS) refers to a quite general class of problems in computer science, data analysis and AI. Feature selection aims to find the minimum number of maximally relevant features needed to characterize a high dimensional data set. One outstanding family of feature selection methods is that of <italic>regression methods</italic>, in which a set of <italic>regression variables</italic> is used to predict one (or a few) dependent variables <italic>via</italic> functional relationships (commonly linear combinations with a distribution of weights). A subset of the whole set of regression variables is considered statistically significant; in that context, these are the <italic>selected features</italic>. FS is a more general problem than linear, multivariate or even non-linear regression, and MRFs can be used to generalize regression procedures to more complex situations. One notable method was developed by Stoehr, Marin and Pudlo [<xref ref-type="bibr" rid="B151">151</xref>], who used hidden Gibbs random fields to implement model selection <italic>via</italic> an information theoretical optimization criterion known as <italic>block likelihood information</italic>. Cilla and coworkers [<xref ref-type="bibr" rid="B152">152</xref>] developed a FS method for sequence classification based on hidden CRFs supplemented with a generalized group Lasso regularization method that, instead of the collinearity condition, employs L1-norm optimization of the parameters. The authors showed that FS outcomes with this method outperform standard conditional random field approaches.</p>
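As a minimal illustration of how L1 regularization performs feature selection (a generic Lasso coordinate-descent sketch, not the hidden-CRF method of Cilla and coworkers), the soft-thresholding step drives the coefficients of irrelevant regressors exactly to zero. The synthetic data and regularization strength below are illustrative assumptions.

```python
import random

def soft_threshold(z, t):
    """Proximal operator of the L1 norm."""
    if z > t:
        return z - t
    if z < -t:
        return z + t
    return 0.0

def lasso_cd(X, y, lam, n_iter=50):
    """L1-regularized least squares via cyclic coordinate descent.

    Minimizes 0.5 * ||y - Xw||^2 + lam * ||w||_1. The soft-thresholding
    update is what sets irrelevant coefficients exactly to zero, which
    is the feature-selection effect described in the text.
    """
    n, p = len(X), len(X[0])
    w = [0.0] * p
    for _ in range(n_iter):
        for j in range(p):
            # correlation of feature j with the partial residual
            rho = sum(X[i][j] * (y[i]
                                 - sum(w[k] * X[i][k] for k in range(p))
                                 + w[j] * X[i][j]) for i in range(n))
            norm = sum(X[i][j] ** 2 for i in range(n))
            w[j] = soft_threshold(rho, lam) / norm
    return w

rng = random.Random(42)
n, p = 200, 5
X = [[rng.gauss(0, 1) for _ in range(p)] for _ in range(n)]
# only features 0 and 1 carry signal; the rest are pure noise
y = [2.0 * X[i][0] - 1.5 * X[i][1] + rng.gauss(0, 0.1) for i in range(n)]
w = lasso_cd(X, y, lam=20.0)
selected = [j for j in range(p) if abs(w[j]) > 1e-6]
```

With this regularization level, only the two informative features survive with (slightly shrunken) coefficients, while the noise features are exactly zero.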
<p>The feature selection efficacy of MRFs is closely related to the actual structure of the underlying adjacency matrices. Especially relevant is the issue of <italic>separability</italic>. Although non-trivial separability does not preclude the use of MRFs in large datasets, as long as the positive definite nature of the measures is ensured, there may be computational complexity limitations for practical uses. Recently, Sain and Furrer [<xref ref-type="bibr" rid="B153">153</xref>] discussed some general properties of random fields (in particular for multivariate Gaussian MRFs) that need to be taken into account in the design of computationally efficient modeling strategies with such random fields. By designing FS schemes with MRFs based on the optimization of parameter estimation, for instance <italic>via structured learning</italic>, it is possible to improve substantially on the computational complexity of such algorithms [<xref ref-type="bibr" rid="B154">154</xref>&#x2013;<xref ref-type="bibr" rid="B158">158</xref>]. The graph structure of MRFs can also be optimized to enhance the FS capabilities of the algorithms [<xref ref-type="bibr" rid="B159">159</xref>&#x2013;<xref ref-type="bibr" rid="B163">163</xref>]. More information along these lines can be found in the comprehensive reviews by Adams and Beling [<xref ref-type="bibr" rid="B164">164</xref>] and by Vergara and Estevez [<xref ref-type="bibr" rid="B165">165</xref>].</p>
<p>As already mentioned, the structure of MRFs may prove advantageous for solving <italic>segmentation</italic> problems, i.e., the delimitation of statistical dependencies. These problems are extremely relevant in the context of computational linguistics and natural language processing applications. We will discuss them in the following subsection.</p>
</sec>
<sec id="s5-4">
<title>5.4 Applications of MRFs in Computational Linguistics and NLP</title>
<p>Automated textual identification and <italic>meaning discernment</italic> are extremely complex (and very useful) tasks in current artificial intelligence research and applications. The ability to detect text <italic>patches</italic> with semantic similarity is one of the founding steps toward the processing of natural language by a computer. By combining a deep learning approach (a convolutional neural network) with MRF models, Liu and collaborators [<xref ref-type="bibr" rid="B166">166</xref>] devised an effective algorithm for <italic>semantic segmentation</italic> [<xref ref-type="bibr" rid="B167">167</xref>], which they called a Deep Parsing Network (DPN). Within the DPN scheme, a CNN is used to calculate the unary terms of a two-term energy function, while the pairwise terms are approximated with a mean-field model. The mean-field contributions are iteratively optimized using a back-propagation algorithm able to generalize to higher order <italic>perturbative contributions</italic>. Although semantic segmentation was originally applied to image segmentation, its application to NLP is fairly straightforward [<xref ref-type="bibr" rid="B168">168</xref>, <xref ref-type="bibr" rid="B169">169</xref>].</p>
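The mean-field step used for the pairwise terms can be illustrated in miniature (a generic naive mean-field iteration on a toy binary MRF, not the DPN itself): each node's approximate marginal is repeatedly updated against the expected agreement with its neighbors. The unary scores, graph and coupling below are illustrative assumptions standing in for CNN-produced unary terms.

```python
import math

def mean_field_mrf(unary, edges, coupling, n_iter=50):
    """Naive mean-field inference for a pairwise binary-label MRF.

    Approximates the joint by a product of per-node marginals q_i and
    iterates q_i(x) ∝ exp(unary_i(x) + coupling * sum_j q_j(x)), i.e.,
    each node feels the *expected* agreement with its neighbors.
    """
    n = len(unary)
    q = [[0.5, 0.5] for _ in range(n)]
    nbrs = [[] for _ in range(n)]
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)
    for _ in range(n_iter):
        for i in range(n):
            logits = [unary[i][x]
                      + coupling * sum(q[j][x] for j in nbrs[i])
                      for x in (0, 1)]
            m = max(logits)  # subtract max for numerical stability
            ex = [math.exp(v - m) for v in logits]
            z = ex[0] + ex[1]
            q[i] = [ex[0] / z, ex[1] / z]
    return q

# three nodes in a chain; the ends prefer label 1, the middle is ambiguous
unary = [[0.0, 2.0], [0.0, 0.0], [0.0, 2.0]]
q = mean_field_mrf(unary, edges=[(0, 1), (1, 2)], coupling=1.5)
```

After convergence, the ambiguous middle node is pulled toward the label favored by its confident neighbors, which is exactly the smoothing role the pairwise terms play in segmentation.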
<p>A similar method was developed earlier by Mai, Wu and Cui and applied to improve word segmentation disambiguation in the Chinese language [<xref ref-type="bibr" rid="B170">170</xref>]. Mai and colleagues, however, decided to use a CRF on top of a bidirectional maximum matching algorithm. Parameter estimation for the CRF was performed <italic>via</italic> maximum likelihood estimates. These ideas were further advanced by Qiu et al. [<xref ref-type="bibr" rid="B171">171</xref>], who used CRFs for clinical entity recognition in Chinese. Speech tagging from voice recordings was performed using a CRF devised by Khan and collaborators [<xref ref-type="bibr" rid="B172">172</xref>]. Even computer assisted <italic>fake news</italic> detection [<xref ref-type="bibr" rid="B173">173</xref>] and headline prediction [<xref ref-type="bibr" rid="B174">174</xref>] can be achieved using CNNs and MRFs.</p>
</sec>
<sec id="s5-5">
<title>5.5 Applications of MRFs in the Analysis of Social Networks</title>
<p>Social network analysis, including online social networks, other forms of interpersonal interaction networks and even some social networks in non-human creatures, has become a relevant field of research in recent times (though the subject has been relevant in the contexts of sociology and animal behavior for decades) [<xref ref-type="bibr" rid="B175">175</xref>]. The analysis of social networks <italic>via</italic> MRFs is also becoming increasingly common. As an example, Jia and collaborators have used MRFs to infer attributes in online social network data [<xref ref-type="bibr" rid="B176">176</xref>]. Their model used the social network structure itself to develop a pairwise MRF. From empirical training data, the authors used individual behaviors to learn the probability that each user has a given attribute. These probabilities were then used as priors, posterior probabilities were computed by a loopy belief propagation algorithm over the MRF, and, finally, the belief propagation algorithm was optimized by a <italic>second neighbor</italic> criterion that sparsifies the adjacency matrix. Further optimization of similar ideas was obtained by using graph convolutional networks, i.e., CNNs over CRFs [<xref ref-type="bibr" rid="B177">177</xref>]. Attribute inference in social network data <italic>via</italic> MRFs can also be used to improve cybersecurity algorithms [<xref ref-type="bibr" rid="B178">178</xref>], to learn consumer intentions [<xref ref-type="bibr" rid="B179">179</xref>] and to study the epidemiology of depression [<xref ref-type="bibr" rid="B180">180</xref>], among other issues. Social networks, as well as some classes of molecular interaction and ecological networks, are also relevant to the development and improvement of MRF and CRF learning algorithms, since often a sketch (sometimes a detailed one) of the network dependency structure is known a priori [<xref ref-type="bibr" rid="B181">181</xref>, <xref ref-type="bibr" rid="B182">182</xref>]. This is yet another instance in which applications may feed back into the formal theory of random fields.</p>
<sec id="s5-5-1">
<title>Application Box IV: Inference of User Attributes in Online Social Networks</title>
<p>
<list list-type="simple">
<list-item>
<p>
<bold>General problem statement:</bold> The attribute inference problem (AIP), i.e., the discovery of personality traits from data on social networks, is a central question in computational social science. It is indeed an (unsupervised) extension of the personality analysis tests of classical psychology, with important applications ranging from sociological modeling to commercial and political marketing, and even national security issues. Jia and collaborators [<xref ref-type="bibr" rid="B176">176</xref>] developed an approach to the AIP from public data on online social networks using an MRF with pairwise interactions.</p>
</list-item>
<list-item>
<p>
<bold>Theoretical/Methodological approach:</bold> Given a training dataset, behaviors are used to learn the probability that each user (node) has the attribute under consideration; these are the prior probabilities. Based on the neighborhood structure of a pairwise Markov random field, posterior probabilities are computed <italic>via</italic> a loopy belief propagation algorithm. The MRF has a quadratic pseudo-energy function with node potentials (unary contributions) for each user and edge potentials (pairwise interactions) for every connected pair of nodes, as defined by node correlations. Edge potentials are defined by discrete-valued <italic>spin-like</italic> states <inline-formula id="inf90">
<mml:math id="minf90">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3bb;</mml:mi>
<mml:mrow>
<mml:mi>u</mml:mi>
<mml:mi>v</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula> if nodes <italic>u</italic> and <italic>v</italic> have the same attribute state and <inline-formula id="inf91">
<mml:math id="minf91">
<mml:mrow>
<mml:msub>
<mml:mi>&#x3bb;</mml:mi>
<mml:mrow>
<mml:mi>u</mml:mi>
<mml:mi>v</mml:mi>
</mml:mrow>
</mml:msub>
<mml:mo>&#x3d;</mml:mo>
<mml:mo>&#x2212;</mml:mo>
<mml:mn>1</mml:mn>
</mml:mrow>
</mml:math>
</inline-formula> if they do not. In this way, <italic>homophily</italic> in social networks mimics spin alignment in lattice models of magnetism.</p>
</list-item>
<list-item>
<p>
<bold>Improvements/advantages:</bold> To optimize computational performance in large networks, the authors modified the BPA by using a loop renormalization strategy. Hence, circular node correlations are locally computed for each pair of nodes <italic>prior to moving to another edge</italic>, followed by a linear optimization step. Thus, there is no need to allocate memory for all circular correlations (loops).</p>
</list-item>
<list-item>
<p>
<bold>Limitations:</bold> More than a limitation per se, an avenue for predictive improvement may lie in extending the MRF approach to allow multi-categorical (or even continuous) state variables. Doing so would make it possible to capture the fact that most behavioral attributes are not simply present or absent, but may occur over a range of possibilities.</p>
</list-item>
</list>
</p>
</sec>
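A generic sum-product loopy belief propagation sketch in the spirit of the box above (not the authors' loop-renormalized version): node priors act as unary potentials, and a homophily edge potential rewards agreeing neighbors. The graph, priors and coupling strength are illustrative assumptions.

```python
import math

def loopy_bp(priors, edges, J=0.5, n_iter=30):
    """Sum-product loopy belief propagation on a binary pairwise MRF.

    priors[i] is the prior probability that node i has the attribute;
    the edge potential exp(+J) / exp(-J) for agreeing / disagreeing
    neighbors encodes the homophily assumption described above.
    Returns the posterior probability of the attribute for each node.
    """
    psi = [[math.exp(J), math.exp(-J)],
           [math.exp(-J), math.exp(J)]]          # homophily edge potential
    phi = [[1.0 - p, p] for p in priors]         # unary (prior) potentials
    nbrs = {i: [] for i in range(len(priors))}
    msgs = {}
    for a, b in edges:
        nbrs[a].append(b)
        nbrs[b].append(a)
        msgs[(a, b)] = [0.5, 0.5]
        msgs[(b, a)] = [0.5, 0.5]
    for _ in range(n_iter):
        new = {}
        for (u, v) in msgs:
            out = [0.0, 0.0]
            for xv in (0, 1):
                for xu in (0, 1):
                    prod = phi[u][xu]
                    for w in nbrs[u]:
                        if w != v:
                            prod *= msgs[(w, u)][xu]
                    out[xv] += prod * psi[xu][xv]
            z = out[0] + out[1]
            new[(u, v)] = [out[0] / z, out[1] / z]
        msgs = new
    beliefs = []
    for v in range(len(priors)):
        b = [phi[v][0], phi[v][1]]
        for w in nbrs[v]:
            b[0] *= msgs[(w, v)][0]
            b[1] *= msgs[(w, v)][1]
        beliefs.append(b[1] / (b[0] + b[1]))
    return beliefs

# node 2 has an uninformative prior but two confident neighbors
beliefs = loopy_bp(priors=[0.9, 0.9, 0.5], edges=[(0, 2), (1, 2)])
```

The uninformative node is pulled well above its 0.5 prior by its confident neighbors, which is precisely the homophily-driven inference the box describes.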
</sec>
<sec id="s5-6">
<title>5.6 Random Fields and Graph Signal Theory</title>
<p>Graph signal theory, also called graph signal processing (GSP), is a field of signal analytics that deals with signals whose domain (as identified by a graph) is irregular [<xref ref-type="bibr" rid="B183">183</xref>&#x2013;<xref ref-type="bibr" rid="B185">185</xref>]. In the context of GSP, the vertices or nodes represent probes at which the signal has been evaluated or sensed, and the edges are relationships between these vertices. Data processing of the signals exploits the structure of the associated graph. GSP is often seen as an intermediate step between single channel signal processing and spatio-temporal signal analysis. The nature of the edges is determined by the relationship (spatial, contextual, relational, etc.) between the vertices. Whenever edges are defined <italic>via</italic> a statistical dependence structure, GSP can be mapped to either an MRF or a CRF, thus allowing the use of all the tools of random field theory to perform GSP [<xref ref-type="bibr" rid="B186">186</xref>, <xref ref-type="bibr" rid="B187">187</xref>]. The networked nature of the domain of signals embedded in a graph allows the use of spectral graph theoretical methods for signal processing [<xref ref-type="bibr" rid="B188">188</xref>&#x2013;<xref ref-type="bibr" rid="B190">190</xref>]. Conversely, correlations between features of the signals are also useful to identify the structure of the underlying graph [<xref ref-type="bibr" rid="B191">191</xref>, <xref ref-type="bibr" rid="B192">192</xref>].</p>
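A minimal sketch of the spectral-graph viewpoint (illustrative only): the combinatorial Laplacian L = D - A defines the graph frequencies, and repeated diffusion x <- (I - alpha*L)x acts as a polynomial low-pass filter attenuating the components associated with large Laplacian eigenvalues. The path graph, test signal and step size are illustrative assumptions.

```python
def graph_laplacian(n_nodes, edges):
    """Combinatorial Laplacian L = D - A as a dense list-of-lists."""
    L = [[0.0] * n_nodes for _ in range(n_nodes)]
    for a, b in edges:
        L[a][a] += 1.0
        L[b][b] += 1.0
        L[a][b] -= 1.0
        L[b][a] -= 1.0
    return L

def low_pass_filter(x, L, alpha=0.2, n_steps=20):
    """Polynomial low-pass graph filter: x <- (I - alpha*L)^k x.

    Repeated diffusion attenuates the high graph frequencies (large
    Laplacian eigenvalues); alpha must keep 1 - alpha*lambda_max
    inside (-1, 1] for stability.
    """
    n = len(x)
    for _ in range(n_steps):
        Lx = [sum(L[i][j] * x[j] for j in range(n)) for i in range(n)]
        x = [x[i] - alpha * Lx[i] for i in range(n)]
    return x

def total_variation(x, edges):
    """Graph total variation: sum of squared differences across edges."""
    return sum((x[a] - x[b]) ** 2 for a, b in edges)

# noisy signal on a path graph: smooth ramp plus alternating noise
n = 20
edges = [(i, i + 1) for i in range(n - 1)]
L = graph_laplacian(n, edges)
noisy = [i / (n - 1) + 0.3 * (-1) ** i for i in range(n)]
smooth = low_pass_filter(noisy, L)
```

The alternating component is the highest graph frequency on a path, so filtering removes it almost entirely while leaving the smooth ramp largely intact; the total signal mass is preserved because the Laplacian rows sum to zero.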
<p>GSP has a number of relevant applications, from the spatio-temporal analysis of brain data [<xref ref-type="bibr" rid="B193">193</xref>] and the analysis of vulnerabilities in power grid data [<xref ref-type="bibr" rid="B194">194</xref>], to topological data analysis [<xref ref-type="bibr" rid="B195">195</xref>], chemoinformatics [<xref ref-type="bibr" rid="B196">196</xref>] and single cell transcriptomic analysis [<xref ref-type="bibr" rid="B197">197</xref>], to mention but a few examples. Statistical learning techniques have also been founded on a combination of MRFs and GSP [<xref ref-type="bibr" rid="B198">198</xref>, <xref ref-type="bibr" rid="B199">199</xref>], taking advantage of the networked structure, the statistical dependence relationships and the temporal correlations of the signals [<xref ref-type="bibr" rid="B200">200</xref>&#x2013;<xref ref-type="bibr" rid="B202">202</xref>]. Random field approaches to GSP have also been applied in the context of deep convolutional networks [<xref ref-type="bibr" rid="B203">203</xref>, <xref ref-type="bibr" rid="B204">204</xref>], often invoking features of the underlying joint conditional probability distributions such as ergodicity [<xref ref-type="bibr" rid="B205">205</xref>] and stationarity [<xref ref-type="bibr" rid="B206">206</xref>].</p>
</sec>
</sec>
<sec id="s6">
<title>6 Concluding Remarks</title>
<p>As has been known in statistical physics for decades, random fields are a quite powerful and versatile theoretical framework. We have discussed here some fundamental ideas of the theory of Markov-Gibbs random fields, namely the notions of statistical dependency on neighborhoods, of potentials and local interactions, of conditional independence relationships, and so on. After that, we discussed a handful of (mostly recent) advances and applications of Markov random fields in different physics subdisciplines, as well as in several areas of biology and the data sciences. The main goal of this presentation was not to be comprehensive but to illustrate the many ways in which research on and applications of random fields may be advancing, both inside and outside traditional statistical physics.</p>
<p>On the theoretical and conceptual side, we mentioned how random fields may be embedded in general manifolds, and how, by incorporating quenched fields (or, somewhat equivalently, by adding quenching potentials) to the usual Ising random field, a whole new phenomenology can be discovered in RFIMs. We also discussed how Markov and Bayesian networks may be combined in HRFs, and how gauge symmetries and other extended fields may broaden the scope of MRFs.</p>
<p>By examining the applications in physics and in other disciplines, we discover (or often re-discover) methodological and computational improvements to the inference, analysis and solution of problems within the MRF/GRF/CRF settings. In this regard, we can mention the use of CNNs as feature extractors on top of random fields, to refine hypotheses about marginals and (<italic>via</italic> convolution) to improve the accuracy of pairwise potential terms. We re-examined how to extend beyond mean-field approaches, either <italic>via</italic> MAP optimization, <italic>via</italic> higher order perturbations solved by neural networks, or <italic>via</italic> maximum likelihood approaches (depending on data availability). We also saw how, under certain circumstances (still dictated by physical intuition and data constraints), factorization of the partition function may be attained <italic>via</italic> clique potentials obtained from Gaussian (or other multivariate parametric) distributions, or even from empirical distributions.</p>
<p>We also analyzed how simulations of random fields may be supplemented with methods well known within the statistical physics community, such as simulated annealing, Markov chain Monte Carlo and importance sampling, but also with methods of wide use in other fields, such as stochastic descent back-propagation, factor graph approaches, Gibbs sampling, pseudo-likelihood methods, latent models and loopy belief propagation algorithms, to name a few. Under some circumstances, parameter estimation (fundamental in applications involving non-trivial partition functions) can be reframed as a regression problem and benefit from the use of Ridge and Lasso optimization techniques, dynamic programming and autoregressive modeling.</p>
<p>We want to highlight that, despite being a formalism developed in statistical physics over more than a hundred years, the theory of Markov-Gibbs random fields is indeed a flourishing one, with many theoretical advances and applications both within and outside physics.</p>
</sec>
</body>
<back>
<sec id="s7">
<title>Author Contributions</title>
<p>EH performed research and wrote the manuscript.</p>
</sec>
<sec id="s8">
<title>Funding</title>
<p>This work was supported by the Consejo Nacional de Ciencia y Tecnolog&#xed;a (SEP-CONACYT-2016-285544 and FRONTERAS-2017-2115), and the National Institute of Genomic Medicine, M&#xe9;xico. Additional support has been granted by the Laboratorio Nacional de Ciencias de la Complejidad, from the Universidad Nacional Aut&#xf3;noma de M&#xe9;xico. EH is recipient of the 2016 Marcos Moshinsky Fellowship in the Physical Sciences.</p>
</sec>
<sec sec-type="COI-statement" id="s9">
<title>Conflict of Interest</title>
<p>The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.</p>
</sec>
<ack>
<p>The author is grateful to the lively and brilliant academic community that has been behind the Winter Meeting on Statistical Physics for five decades&#x20;now.</p>
</ack>
<ref-list>
<title>References</title>
<ref id="B1">
<label>1.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ising</surname>
<given-names>E</given-names>
</name>
</person-group>. <article-title>Beitrag zur theorie des ferromagnetismus</article-title>. <source>Z Physik</source> (<year>1925</year>) <volume>31</volume>:<fpage>253</fpage>&#x2013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1007/bf02980577</pub-id> </citation>
</ref>
<ref id="B2">
<label>2.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Averintsev</surname>
<given-names>MB</given-names>
</name>
</person-group>. <article-title>Description of Markovian random fields by gibbsian conditional probabilities</article-title>. <source>Theor Probab Appl</source> (<year>1972</year>) <volume>17</volume>:<fpage>20</fpage>&#x2013;<lpage>33</lpage>. <pub-id pub-id-type="doi">10.1137/1117002</pub-id> </citation>
</ref>
<ref id="B3">
<label>3.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Averintsev</surname>
<given-names>M</given-names>
</name>
</person-group>. <article-title>Gibbsian distribution of random fields whose conditional probabilities may vanish</article-title>. <source>Problemy Peredachi Informatsii</source> (<year>1975</year>) <volume>11</volume>:<fpage>86</fpage>&#x2013;<lpage>96</lpage>. </citation>
</ref>
<ref id="B4">
<label>4.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Dobrushin</surname>
<given-names>RL</given-names>
</name>
<name>
<surname>Kryukov</surname>
<given-names>V</given-names>
</name>
<name>
<surname>Toom</surname>
<given-names>AL</given-names>
</name>
</person-group>. <source>Locally interacting systems and their application in biology</source>. <publisher-name>Springer</publisher-name> (<year>1978</year>).</citation>
</ref>
<ref id="B5">
<label>5.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Stavskaya</surname>
<given-names>ON</given-names>
</name>
</person-group>. <article-title>Markov fields as invariant states for local processes</article-title>. <conf-name>Locally Interacting Systems and Their Application in Biology</conf-name>. <publisher-name>Springer</publisher-name> (<year>1978</year>). <fpage>113</fpage>&#x2013;<lpage>121</lpage>. <pub-id pub-id-type="doi">10.1007/bfb0070088</pub-id> </citation>
</ref>
<ref id="B6">
<label>6.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stavskaya</surname>
<given-names>ON</given-names>
</name>
</person-group>. <article-title>Sufficient conditions for the uniqueness of a probability field and estimates for correlations</article-title>. <source>Math Notes Acad Sci USSR</source> (<year>1975</year>) <volume>18</volume>:<fpage>950</fpage>&#x2013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1007/bf01153051</pub-id> </citation>
</ref>
<ref id="B7">
<label>7.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Vasilyev</surname>
<given-names>NB</given-names>
</name>
</person-group>. <article-title>Bernoulli and Markov stationary measures in discrete local interactions</article-title>. <conf-name>Locally interacting systems and their application in biology</conf-name>. <publisher-name>Springer</publisher-name> (<year>1978</year>). <fpage>99</fpage>&#x2013;<lpage>112</lpage>. <pub-id pub-id-type="doi">10.1007/bfb0070087</pub-id> </citation>
</ref>
<ref id="B8">
<label>8.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dobruschin</surname>
<given-names>PL</given-names>
</name>
</person-group>. <article-title>The description of a random field by means of conditional probabilities and conditions of its regularity</article-title>. <source>Theor Probab Appl</source> (<year>1968</year>) <volume>13</volume>:<fpage>197</fpage>&#x2013;<lpage>224</lpage>. <pub-id pub-id-type="doi">10.1137/1113026</pub-id> </citation>
</ref>
<ref id="B9">
<label>9.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lanford</surname>
<given-names>OE</given-names>
</name>
<name>
<surname>Ruelle</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Observables at infinity and states with short range correlations in statistical mechanics</article-title>. <source>Commun Math Phys</source> (<year>1969</year>) <volume>13</volume>:<fpage>194</fpage>&#x2013;<lpage>215</lpage>. <pub-id pub-id-type="doi">10.1007/bf01645487</pub-id> </citation>
</ref>
<ref id="B10">
<label>10.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Hammersley</surname>
<given-names>JM</given-names>
</name>
<name>
<surname>Clifford</surname>
<given-names>P</given-names>
</name>
</person-group>. <source>Markov fields on finite graphs and lattices</source>. <publisher-name>Unpublished manuscript</publisher-name> (<year>1971</year>).</citation>
</ref>
<ref id="B11">
<label>11.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Koller</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Friedman</surname>
<given-names>N</given-names>
</name>
</person-group>. <source>Probabilistic graphical models: principles and techniques (adaptive computation and machine learning series)</source>. <publisher-name>MIT Press</publisher-name> (<year>2009</year>).</citation>
</ref>
<ref id="B12">
<label>12.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grimmett</surname>
<given-names>GR</given-names>
</name>
</person-group>. <article-title>A theorem about random fields</article-title>. <source>Bull Lond Math Soc</source> (<year>1973</year>) <volume>5</volume>:<fpage>81</fpage>&#x2013;<lpage>84</lpage>. <pub-id pub-id-type="doi">10.1112/blms/5.1.81</pub-id> </citation>
</ref>
<ref id="B13">
<label>13.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Besag</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>Spatial interaction and the statistical analysis of lattice systems</article-title>. <source>J&#x20;R Stat Soc Ser B (Methodological)</source> (<year>1974</year>) <volume>36</volume>:<fpage>192</fpage>&#x2013;<lpage>225</lpage>. <pub-id pub-id-type="doi">10.1111/j.2517-6161.1974.tb00999.x</pub-id> </citation>
</ref>
<ref id="B14">
<label>14.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Baxter</surname>
<given-names>RJ</given-names>
</name>
</person-group>. <source>Exactly solved models in statistical mechanics</source>. <publisher-name>Elsevier</publisher-name> (<year>2016</year>).</citation>
</ref>
<ref id="B15">
<label>15.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cipra</surname>
<given-names>BA</given-names>
</name>
</person-group>. <article-title>An introduction to the Ising model</article-title>. <source>The Am Math Monthly</source> (<year>1987</year>) <volume>94</volume>:<fpage>937</fpage>&#x2013;<lpage>59</lpage>. <pub-id pub-id-type="doi">10.1080/00029890.1987.12000742</pub-id> </citation>
</ref>
<ref id="B16">
<label>16.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>McCoy</surname>
<given-names>BM</given-names>
</name>
<name>
<surname>Wu</surname>
<given-names>TT</given-names>
</name>
</person-group>. <source>The two-dimensional Ising model</source>. <publisher-loc>North Chelmsford, MA</publisher-loc>: <publisher-name>Courier Corporation</publisher-name> (<year>2014</year>).</citation>
</ref>
<ref id="B17">
<label>17.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Thompson</surname>
<given-names>CJ</given-names>
</name>
</person-group>. <source>Mathematical statistical mechanics</source>. <publisher-name>Princeton University Press</publisher-name> (<year>2015</year>).</citation>
</ref>
<ref id="B18">
<label>18.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Adler</surname>
<given-names>M</given-names>
</name>
</person-group>. <source>Monte Carlo simulations of the Ising model</source>. <publisher-name>Anchor Academic Publishing</publisher-name> (<year>2016</year>).</citation>
</ref>
<ref id="B19">
<label>19.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hopfield</surname>
<given-names>JJ</given-names>
</name>
</person-group>. <article-title>Neural networks and physical systems with emergent collective computational abilities</article-title>. <source>Proc Natl Acad Sci</source> (<year>1982</year>) <volume>79</volume>:<fpage>2554</fpage>&#x2013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1073/pnas.79.8.2554</pub-id> </citation>
</ref>
<ref id="B20">
<label>20.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ackley</surname>
<given-names>DH</given-names>
</name>
<name>
<surname>Hinton</surname>
<given-names>GE</given-names>
</name>
<name>
<surname>Sejnowski</surname>
<given-names>TJ</given-names>
</name>
</person-group>. <article-title>A learning algorithm for Boltzmann machines&#x2a;</article-title>. <source>Cogn Sci</source> (<year>1985</year>) <volume>9</volume>:<fpage>147</fpage>&#x2013;<lpage>69</lpage>. <pub-id pub-id-type="doi">10.1207/s15516709cog0901_7</pub-id> </citation>
</ref>
<ref id="B21">
<label>21.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Salakhutdinov</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Larochelle</surname>
<given-names>H</given-names>
</name>
</person-group>. <article-title>Efficient learning of deep Boltzmann machines</article-title>. <conf-name>Proceedings of the thirteenth international conference on artificial intelligence and statistics</conf-name>. <publisher-loc>Sardinia, Italy</publisher-loc>: <publisher-name>DBLP</publisher-name> (<year>2010</year>) <fpage>693</fpage>&#x2013;<lpage>700</lpage>. </citation>
</ref>
<ref id="B22">
<label>22.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Ross</surname>
<given-names>KJ</given-names>
</name>
<name>
<surname>Snell</surname>
<given-names>L</given-names>
</name>
</person-group>. <source>Markov random fields and their applications</source>. <publisher-name>American Mathematical Society</publisher-name> (<year>1980</year>).</citation>
</ref>
<ref id="B23">
<label>23.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Essler</surname>
<given-names>FHL</given-names>
</name>
<name>
<surname>Mussardo</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Panfil</surname>
<given-names>M</given-names>
</name>
</person-group>. <article-title>Generalized Gibbs ensembles for quantum field theories</article-title>. <source>Phys Rev A</source> (<year>2015</year>) <volume>91</volume>:<fpage>051602</fpage>. <pub-id pub-id-type="doi">10.1103/physreva.91.051602</pub-id> </citation>
</ref>
<ref id="B24">
<label>24.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Murphy</surname>
<given-names>K</given-names>
</name>
</person-group>. <source>Machine learning: a probabilistic perspective</source>. <publisher-name>MIT Press</publisher-name> (<year>2012</year>).</citation>
</ref>
<ref id="B25">
<label>25.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Williams</surname>
<given-names>D</given-names>
</name>
</person-group>. <source>Probability with martingales</source>. <publisher-name>Cambridge University Press</publisher-name> (<year>1991</year>). </citation>
</ref>
<ref id="B26">
<label>26.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Canonne</surname>
<given-names>CL</given-names>
</name>
<name>
<surname>Diakonikolas</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Kane</surname>
<given-names>DM</given-names>
</name>
<name>
<surname>Stewart</surname>
<given-names>A</given-names>
</name>
</person-group>. <source>Testing conditional independence of discrete distributions</source>. <publisher-name>Information Theory and Applications Workshop (ITA) (IEEE)</publisher-name> (<year>2018</year>). <fpage>1</fpage>&#x2013;<lpage>57</lpage>.</citation>
</ref>
<ref id="B27">
<label>27.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Schultz</surname>
<given-names>TD</given-names>
</name>
<name>
<surname>Mattis</surname>
<given-names>DC</given-names>
</name>
<name>
<surname>Lieb</surname>
<given-names>EH</given-names>
</name>
</person-group>. <article-title>Two-dimensional Ising model as a soluble problem of many fermions</article-title>. <source>Rev Mod Phys</source> (<year>1964</year>) <volume>36</volume>:<fpage>856</fpage>. <pub-id pub-id-type="doi">10.1103/revmodphys.36.856</pub-id> </citation>
</ref>
<ref id="B28">
<label>28.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Brush</surname>
<given-names>SG</given-names>
</name>
</person-group>. <article-title>History of the Lenz-Ising model</article-title>. <source>Rev Mod Phys</source> (<year>1967</year>) <volume>39</volume>:<fpage>883</fpage>. <pub-id pub-id-type="doi">10.1103/revmodphys.39.883</pub-id> </citation>
</ref>
<ref id="B29">
<label>29.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Isichenko</surname>
<given-names>MB</given-names>
</name>
</person-group>. <article-title>Percolation, statistical topography, and transport in random media</article-title>. <source>Rev Mod Phys</source> (<year>1992</year>) <volume>64</volume>:<fpage>961</fpage>. <pub-id pub-id-type="doi">10.1103/revmodphys.64.961</pub-id> </citation>
</ref>
<ref id="B30">
<label>30.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sornette</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Physics and financial economics (1776-2014): puzzles, Ising and agent-based models</article-title>. <source>Rep Prog Phys</source> (<year>2014</year>) <volume>77</volume>:<fpage>062001</fpage>. <pub-id pub-id-type="doi">10.1088/0034-4885/77/6/062001</pub-id> </citation>
</ref>
<ref id="B31">
<label>31.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Adler</surname>
<given-names>RJ</given-names>
</name>
<name>
<surname>Taylor</surname>
<given-names>JE</given-names>
</name>
</person-group>. <source>Random fields and geometry</source>. <publisher-name>Springer Science &#x26; Business Media</publisher-name> (<year>2009</year>).</citation>
</ref>
<ref id="B32">
<label>32.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Ganchev</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>About Markov, Gibbs,&#x2026; gauge theory&#x2026; finance</article-title>. <conf-name>Quantum Theory And Symmetries</conf-name>. <publisher-name>Springer</publisher-name> (<year>2017</year>) <fpage>403</fpage>&#x2013;<lpage>12</lpage>. </citation>
</ref>
<ref id="B33">
<label>33.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Freno</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Trentin</surname>
<given-names>E</given-names>
</name>
</person-group>. <source>Hybrid random fields</source>. <publisher-name>Springer</publisher-name> (<year>2011</year>).</citation>
</ref>
<ref id="B34">
<label>34.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Friedman</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Hastie</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Tibshirani</surname>
<given-names>R</given-names>
</name>
</person-group>. <source>The elements of statistical learning</source>. <publisher-loc>New York</publisher-loc>: <publisher-name>Springer Series in Statistics</publisher-name> (<year>2001</year>).</citation>
</ref>
<ref id="B35">
<label>35.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hern&#xe1;ndez-Lemus</surname>
<given-names>E</given-names>
</name>
</person-group>. <article-title>On a class of tensor Markov fields</article-title>. <source>Entropy</source> (<year>2020</year>) <volume>22</volume>:<fpage>451</fpage>. <pub-id pub-id-type="doi">10.3390/e22040451</pub-id> </citation>
</ref>
<ref id="B36">
<label>36.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Hern&#xe1;ndez-Lemus</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Espinal-Enr&#xed;quez</surname>
<given-names>J</given-names>
</name>
<name>
<surname>de Anda-J&#xe1;uregui</surname>
<given-names>G</given-names>
</name>
</person-group>. <source>Probabilistic multilayer networks</source>. <publisher-loc>Ithaca, NY</publisher-loc>: <publisher-name>arXiv:1808.07857</publisher-name> (<year>2018</year>).</citation>
</ref>
<ref id="B37">
<label>37.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>De Domenico</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Sol&#xe9;-Ribalta</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Cozzo</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Kivel&#xe4;</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Moreno</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Porter</surname>
<given-names>MA</given-names>
</name>
<etal/>
</person-group> <article-title>Mathematical formulation of multilayer networks</article-title>. <source>Phys Rev X</source> (<year>2013</year>) <volume>3</volume>:<fpage>041022</fpage>. <pub-id pub-id-type="doi">10.1103/physrevx.3.041022</pub-id> </citation>
</ref>
<ref id="B38">
<label>38.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kivel&#xe4;</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Arenas</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Barthelemy</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Gleeson</surname>
<given-names>JP</given-names>
</name>
<name>
<surname>Moreno</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Porter</surname>
<given-names>MA</given-names>
</name>
</person-group>. <article-title>Multilayer networks</article-title>. <source>J&#x20;Complex Networks</source> (<year>2014</year>) <volume>2</volume>:<fpage>203</fpage>&#x2013;<lpage>71</lpage>. <pub-id pub-id-type="doi">10.1093/comnet/cnu016</pub-id> </citation>
</ref>
<ref id="B39">
<label>39.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Boccaletti</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Bianconi</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Criado</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Del Genio</surname>
<given-names>CI</given-names>
</name>
<name>
<surname>G&#xf3;mez-Garde&#xf1;es</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Romance</surname>
<given-names>M</given-names>
</name>
<etal/>
</person-group> <article-title>The structure and dynamics of multilayer networks</article-title>. <source>Phys Rep</source> (<year>2014</year>) <volume>544</volume>:<fpage>1</fpage>&#x2013;<lpage>122</lpage>. <pub-id pub-id-type="doi">10.1016/j.physrep.2014.07.001</pub-id> </citation>
</ref>
<ref id="B40">
<label>40.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Aizenman</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Peled</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>A power-law upper bound on the correlations in the 2d random field Ising model</article-title>. <source>Commun Math Phys</source> (<year>2019</year>) <volume>372</volume>:<fpage>865</fpage>&#x2013;<lpage>92</lpage>. <pub-id pub-id-type="doi">10.1007/s00220-019-03450-3</pub-id> </citation>
</ref>
<ref id="B41">
<label>41.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Imry</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Ma</surname>
<given-names>S-K.</given-names>
</name>
</person-group>. <article-title>Random-field instability of the ordered state of continuous symmetry</article-title>. <source>Phys Rev Lett</source> (<year>1975</year>) <volume>35</volume>:<fpage>1399</fpage>. <pub-id pub-id-type="doi">10.1103/physrevlett.35.1399</pub-id> </citation>
</ref>
<ref id="B42">
<label>42.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Berzin</surname>
<given-names>AA</given-names>
</name>
<name>
<surname>Morosov</surname>
<given-names>AI</given-names>
</name>
<name>
<surname>Sigov</surname>
<given-names>AS</given-names>
</name>
</person-group>. <article-title>Long-range order induced by random fields in two-dimensional O(n) models, and the Imry-Ma state</article-title>. <source>Phys Solid State</source> (<year>2020</year>) <volume>62</volume>:<fpage>332</fpage>&#x2013;<lpage>7</lpage>. <pub-id pub-id-type="doi">10.1134/s1063783420020055</pub-id> </citation>
</ref>
<ref id="B43">
<label>43.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Berzin</surname>
<given-names>AA</given-names>
</name>
<name>
<surname>Morosov</surname>
<given-names>AI</given-names>
</name>
<name>
<surname>Sigov</surname>
<given-names>AS</given-names>
</name>
</person-group>. <article-title>A mechanism of long-range order induced by random fields: effective anisotropy created by defects</article-title>. <source>Phys Solid State</source> (<year>2016</year>) <volume>58</volume>:<fpage>1846</fpage>&#x2013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1134/s1063783416090109</pub-id> </citation>
</ref>
<ref id="B44">
<label>44.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bunde</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Havlin</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Roman</surname>
<given-names>HE</given-names>
</name>
<name>
<surname>Schildt</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Stanley</surname>
<given-names>HE</given-names>
</name>
</person-group>. <article-title>On the field dependence of random walks in the presence of random fields</article-title>. <source>J&#x20;Stat Phys</source> (<year>1988</year>) <volume>50</volume>:<fpage>1271</fpage>&#x2013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1007/bf01019166</pub-id> </citation>
</ref>
<ref id="B45">
<label>45.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chatterjee</surname>
<given-names>S</given-names>
</name>
</person-group>. <article-title>On the decay of correlations in the random field Ising model</article-title>. <source>Commun Math Phys</source> (<year>2018</year>) <volume>362</volume>:<fpage>253</fpage>&#x2013;<lpage>67</lpage>. <pub-id pub-id-type="doi">10.1007/s00220-018-3085-0</pub-id> </citation>
</ref>
<ref id="B46">
<label>46.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Aizenman</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Wehr</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>Rounding of first-order phase transitions in systems with quenched disorder</article-title>. <source>Phys Rev Lett</source> (<year>1989</year>) <volume>62</volume>:<fpage>2503</fpage>. <pub-id pub-id-type="doi">10.1103/physrevlett.62.2503</pub-id> </citation>
</ref>
<ref id="B47">
<label>47.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fytas</surname>
<given-names>NG</given-names>
</name>
<name>
<surname>Mart&#xed;n-Mayor</surname>
<given-names>V</given-names>
</name>
<name>
<surname>Picco</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Sourlas</surname>
<given-names>N</given-names>
</name>
</person-group>. <article-title>Specific-heat exponent and modified hyperscaling in the 4D random-field Ising model</article-title>. <source>J&#x20;Stat Mech</source> (<year>2017</year>) <volume>2017</volume>:<fpage>033302</fpage>. <pub-id pub-id-type="doi">10.1088/1742-5468/aa5dc3</pub-id> </citation>
</ref>
<ref id="B48">
<label>48.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fytas</surname>
<given-names>NG</given-names>
</name>
<name>
<surname>Mart&#xed;n-Mayor</surname>
<given-names>V</given-names>
</name>
<name>
<surname>Picco</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Sourlas</surname>
<given-names>N</given-names>
</name>
</person-group>. <article-title>Review of recent developments in the random-field Ising model</article-title>. <source>J&#x20;Stat Phys</source> (<year>2018</year>) <volume>172</volume>:<fpage>665</fpage>&#x2013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.1007/s10955-018-1955-7</pub-id> </citation>
</ref>
<ref id="B49">
<label>49.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tarjus</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Tissier</surname>
<given-names>M</given-names>
</name>
</person-group>. <article-title>Random-field Ising and O(n) models: theoretical description through the functional renormalization group</article-title>. <source>Eur Phys J&#x20;B</source> (<year>2020</year>) <volume>93</volume>:<fpage>1</fpage>&#x2013;<lpage>19</lpage>. <pub-id pub-id-type="doi">10.1140/epjb/e2020-100489-1</pub-id> </citation>
</ref>
<ref id="B50">
<label>50.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ayala</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Carinci</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Redig</surname>
<given-names>F</given-names>
</name>
</person-group>. <article-title>Quantitative Boltzmann-Gibbs principles via orthogonal polynomial duality</article-title>. <source>J&#x20;Stat Phys</source> (<year>2018</year>) <volume>171</volume>:<fpage>980</fpage>&#x2013;<lpage>99</lpage>. <pub-id pub-id-type="doi">10.1007/s10955-018-2060-7</pub-id> </citation>
</ref>
<ref id="B51">
<label>51.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Dobrushin</surname>
<given-names>RL</given-names>
</name>
</person-group>. <article-title>Perturbation methods of the theory of Gibbsian fields</article-title>. <conf-name>Lectures on probability theory and statistics</conf-name>. <publisher-name>Springer</publisher-name> (<year>1996</year>) <fpage>1</fpage>&#x2013;<lpage>66</lpage>. <pub-id pub-id-type="doi">10.1007/bfb0095674</pub-id> </citation>
</ref>
<ref id="B52">
<label>52.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Essler</surname>
<given-names>FHL</given-names>
</name>
<name>
<surname>Mussardo</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Panfil</surname>
<given-names>M</given-names>
</name>
</person-group>. <article-title>On truncated generalized Gibbs ensembles in the Ising field theory</article-title>. <source>J&#x20;Stat Mech</source> (<year>2017</year>) <volume>2017</volume>:<fpage>013103</fpage>. <pub-id pub-id-type="doi">10.1088/1742-5468/aa53f4</pub-id> </citation>
</ref>
<ref id="B53">
<label>53.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gudder</surname>
<given-names>SP</given-names>
</name>
</person-group>. <article-title>Gaussian random fields</article-title>. <source>Found Phys</source> (<year>1978</year>) <volume>8</volume>:<fpage>295</fpage>&#x2013;<lpage>302</lpage>. <pub-id pub-id-type="doi">10.1007/bf00715214</pub-id> </citation>
</ref>
<ref id="B54">
<label>54.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sherman</surname>
<given-names>S</given-names>
</name>
</person-group>. <article-title>Markov random fields and Gibbs random fields</article-title>. <source>Isr J&#x20;Math</source> (<year>1973</year>) <volume>14</volume>:<fpage>92</fpage>&#x2013;<lpage>103</lpage>. <pub-id pub-id-type="doi">10.1007/bf02761538</pub-id> </citation>
</ref>
<ref id="B55">
<label>55.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Luitz</surname>
<given-names>DJ</given-names>
</name>
<name>
<surname>Laflorencie</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Alet</surname>
<given-names>F</given-names>
</name>
</person-group>. <article-title>Many-body localization edge in the random-field Heisenberg chain</article-title>. <source>Phys Rev B</source> (<year>2015</year>) <volume>91</volume>:<fpage>081103</fpage>. <pub-id pub-id-type="doi">10.1103/physrevb.91.081103</pub-id> </citation>
</ref>
<ref id="B56">
<label>56.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Starodubov</surname>
<given-names>SL</given-names>
</name>
</person-group>. <source>A theorem on properties of sample functions of a random field and generalized random fields</source>. <publisher-loc>Moscow, Russia</publisher-loc>: <publisher-name>Izvestiya Vysshikh Uchebnykh Zavedenii. Matematika</publisher-name> (<year>2011</year>) <fpage>48</fpage>&#x2013;<lpage>56</lpage>.</citation>
</ref>
<ref id="B57">
<label>57.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Acar</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Sundararaghavan</surname>
<given-names>V</given-names>
</name>
</person-group>. <article-title>A Markov random field approach for modeling spatio-temporal evolution of microstructures</article-title>. <source>Model Simul Mater Sci Eng</source> (<year>2016</year>) <volume>24</volume>:<fpage>075005</fpage>. <pub-id pub-id-type="doi">10.1088/0965-0393/24/7/075005</pub-id> </citation>
</ref>
<ref id="B58">
<label>58.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Konincks</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Krakoviack</surname>
<given-names>V</given-names>
</name>
</person-group>. <article-title>Dynamics of fluids in quenched-random potential energy landscapes: a mode-coupling theory approach</article-title>. <source>Soft Matter</source> (<year>2017</year>) <volume>13</volume>:<fpage>5283</fpage>&#x2013;<lpage>97</lpage>. <pub-id pub-id-type="doi">10.1039/c7sm00984d</pub-id> </citation>
</ref>
<ref id="B59">
<label>59.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Hu</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Wei</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Saw</surname>
<given-names>A-L</given-names>
</name>
</person-group>. <article-title>A direct simulation algorithm for a class of beta random fields in modelling material properties</article-title>. <source>Comput Methods Appl Mech Eng</source> (<year>2017</year>) <volume>326</volume>:<fpage>642</fpage>&#x2013;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1016/j.cma.2017.08.001</pub-id> </citation>
</ref>
<ref id="B60">
<label>60.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chen</surname>
<given-names>J</given-names>
</name>
<name>
<surname>He</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Ren</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>Stochastic harmonic function representation of random fields for material properties of structures</article-title>. <source>J&#x20;Eng Mech</source> (<year>2018</year>) <volume>144</volume>:<fpage>04018049</fpage>. <pub-id pub-id-type="doi">10.1061/(asce)em.1943-7889.0001469</pub-id> </citation>
</ref>
<ref id="B61">
<label>61.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Singh</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Adhikari</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>Fluctuating hydrodynamics and the Brownian motion of an active colloid near a wall</article-title>. <source>Eur J&#x20;Comput Mech</source> (<year>2017</year>) <volume>26</volume>:<fpage>78</fpage>&#x2013;<lpage>97</lpage>. <pub-id pub-id-type="doi">10.1080/17797179.2017.1294829</pub-id> </citation>
</ref>
<ref id="B62">
<label>62.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yamazaki</surname>
<given-names>K</given-names>
</name>
</person-group>. <article-title>Stochastic Hall-magneto-hydrodynamics system in three and two and a half dimensions</article-title>. <source>J&#x20;Stat Phys</source> (<year>2017</year>) <volume>166</volume>:<fpage>368</fpage>&#x2013;<lpage>97</lpage>. <pub-id pub-id-type="doi">10.1007/s10955-016-1683-9</pub-id> </citation>
</ref>
<ref id="B63">
<label>63.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ullah</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Uzair</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Ullah</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Khan</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Ahmad</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Khan</surname>
<given-names>W</given-names>
</name>
</person-group>. <article-title>Density independent hydrodynamics model for crowd coherency detection</article-title>. <source>Neurocomputing</source> (<year>2017</year>) <volume>242</volume>:<fpage>28</fpage>&#x2013;<lpage>39</lpage>. <pub-id pub-id-type="doi">10.1016/j.neucom.2017.02.023</pub-id> </citation>
</ref>
<ref id="B64">
<label>64.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tadi&#x107;</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Mijatovi&#x107;</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Jani&#x107;evi&#x107;</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Spasojevi&#x107;</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Rodgers</surname>
<given-names>GJ</given-names>
</name>
</person-group>. <article-title>The critical Barkhausen avalanches in thin random-field ferromagnets with an open boundary</article-title>. <source>Sci Rep</source> (<year>2019</year>) <volume>9</volume>:<fpage>1</fpage>&#x2013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1038/s41598-019-42802-w</pub-id> </citation>
</ref>
<ref id="B65">
<label>65.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Tsukanov</surname>
<given-names>AA</given-names>
</name>
<name>
<surname>Gorbatnikov</surname>
<given-names>AV</given-names>
</name>
</person-group>. <article-title>Influence of embedded inhomogeneities on the spectral ratio of the horizontal components of a random field of&#x20;Rayleigh waves</article-title>. <source>Acoust Phys</source> (<year>2018</year>) <volume>64</volume>:<fpage>70</fpage>&#x2013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1134/s1063771018010189</pub-id> </citation>
</ref>
<ref id="B66">
<label>66.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Shadaydeh</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Guanche</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Denzler</surname>
<given-names>J</given-names>
</name>
</person-group>. <source>Classification of spatiotemporal marine climate patterns using wavelet coherence and Markov random field</source>. <publisher-name>American Geophysical Union</publisher-name> (<year>2018</year>). <comment>Fall Meeting 2018, IN31C&#x2013;0824</comment>.</citation>
</ref>
<ref id="B67">
<label>67.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Feng</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Luthi</surname>
<given-names>SM</given-names>
</name>
<name>
<surname>Gisolf</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Angerer</surname>
<given-names>E</given-names>
</name>
</person-group>. <article-title>Reservoir lithology determination by hidden Markov random fields based on a Gaussian mixture model</article-title>. <source>IEEE Trans Geosci Remote Sensing</source> (<year>2018</year>) <volume>56</volume>:<fpage>6663</fpage>&#x2013;<lpage>73</lpage>. <pub-id pub-id-type="doi">10.1109/tgrs.2018.2841059</pub-id> </citation>
</ref>
<ref id="B68">
<label>68.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Wang</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Wellmann</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Verweij</surname>
<given-names>E</given-names>
</name>
<name>
<surname>von Hebel</surname>
<given-names>C</given-names>
</name>
<name>
<surname>van der Kruk</surname>
<given-names>J</given-names>
</name>
</person-group>. <source>Identification and simulation of subsurface soil patterns using hidden Markov random fields and remote sensing and geophysical EMI data sets</source>. <publisher-loc>Vienna, Austria</publisher-loc>: <publisher-name>EGUGA</publisher-name> (<year>2017</year>) <fpage>6530</fpage>.</citation>
</ref>
<ref id="B69">
<label>69.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Ko</surname>
<given-names>GG</given-names>
</name>
<name>
<surname>Rutenbar</surname>
<given-names>RA</given-names>
</name>
</person-group>. <article-title>A case study of machine learning hardware: real-time source separation using Markov random fields via sampling-based inference</article-title>. <conf-name>IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2017</year>) <fpage>2477</fpage>&#x2013;<lpage>81</lpage>. </citation>
</ref>
<ref id="B70">
<label>70.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Li</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Jiang</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Zhu</surname>
<given-names>H</given-names>
</name>
</person-group>. <source>A local region-based level set method with Markov random field for side-scan sonar image multi-level segmentation</source>. <publisher-name>IEEE Sensors Journal</publisher-name> (<year>2020</year>).</citation>
</ref>
<ref id="B71">
<label>71.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ziatdinov</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Maksov</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Kalinin</surname>
<given-names>SV</given-names>
</name>
</person-group>. <article-title>Learning surface molecular structures via machine vision</article-title>. <source>npj&#x20;Comput Mater</source> (<year>2017</year>) <volume>3</volume>:<fpage>1</fpage>&#x2013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1038/s41524-017-0038-7</pub-id> </citation>
</ref>
<ref id="B72">
<label>72.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ciliberto</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Herbster</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Ialongo</surname>
<given-names>AD</given-names>
</name>
<name>
<surname>Pontil</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Rocchetto</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Severini</surname>
<given-names>S</given-names>
</name>
<etal/>
</person-group> <article-title>Quantum machine learning: a classical perspective</article-title>. <source>Proc R Soc A</source> (<year>2018</year>) <volume>474</volume>:<fpage>20170551</fpage>. <pub-id pub-id-type="doi">10.1098/rspa.2017.0551</pub-id> </citation>
</ref>
<ref id="B73">
<label>73.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Li</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Ping</surname>
<given-names>W</given-names>
</name>
</person-group>. <source>Cancer metastasis detection with neural conditional random field</source>. <publisher-loc>Ithaca, NY</publisher-loc>: <publisher-name>arXiv:1806.07064</publisher-name> (<year>2018</year>). </citation>
</ref>
<ref id="B74">
<label>74.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Zhao</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Gay</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Sun</surname>
<given-names>B</given-names>
</name>
</person-group>. <article-title>ARPM-net: a novel CNN-based adversarial method with Markov random field enhancement for prostate and organs at risk segmentation in pelvic CT images</article-title>. <source>Med Phys</source> (<year>2020</year>). <pub-id pub-id-type="doi">10.1002/mp.14580</pub-id> </citation>
</ref>
<ref id="B75">
<label>75.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Fu</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Lin</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Kee Wong</surname>
<given-names>DW</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>DeepVessel: retinal vessel segmentation via deep learning and conditional random field</article-title>. <conf-name>International conference on medical image computing and computer-assisted intervention</conf-name>. <publisher-name>Springer</publisher-name> (<year>2016</year>) <fpage>132</fpage>&#x2013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-319-46723-8_16</pub-id> </citation>
</ref>
<ref id="B76">
<label>76.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Orlando</surname>
<given-names>JI</given-names>
</name>
<name>
<surname>Prokofyeva</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Blaschko</surname>
<given-names>MB</given-names>
</name>
</person-group>. <article-title>A discriminatively trained fully connected conditional random field model for blood vessel segmentation in fundus images</article-title>. <source>IEEE Trans Biomed Eng</source> (<year>2016</year>) <volume>64</volume>:<fpage>16</fpage>&#x2013;<lpage>27</lpage>. <pub-id pub-id-type="doi">10.1109/TBME.2016.2535311</pub-id> </citation>
</ref>
<ref id="B77">
<label>77.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Reta</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Gonzalez</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Diaz</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Guichard</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>Leukocytes segmentation using Markov random fields</article-title>. <source>Software Tools and Algorithms for Biological Systems</source>. <publisher-name>Springer</publisher-name> (<year>2011</year>) <fpage>345</fpage>&#x2013;<lpage>53</lpage>. </citation>
</ref>
<ref id="B78">
<label>78.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Hahn</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Bode</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Kr&#xfc;wel</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Kampf</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Buschle</surname>
<given-names>LR</given-names>
</name>
<name>
<surname>Sturm</surname>
<given-names>VJF</given-names>
</name>
<etal/>
</person-group>. <article-title>Gibbs point field model quantifies disorder in microvasculature of U87-glioblastoma</article-title>. <source>J&#x20;Theor Biol</source> (<year>2020</year>) <volume>494</volume>:<fpage>110230</fpage>. <pub-id pub-id-type="doi">10.1016/j.jtbi.2020.110230</pub-id> </citation>
</ref>
<ref id="B79">
<label>79.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mahmood</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Durr</surname>
<given-names>NJ</given-names>
</name>
</person-group>. <article-title>Deep learning and conditional random fields-based depth estimation and topographical reconstruction from conventional endoscopy</article-title>. <source>Med Image Anal</source> (<year>2018</year>) <volume>48</volume>:<fpage>230</fpage>&#x2013;<lpage>43</lpage>. <pub-id pub-id-type="doi">10.1016/j.media.2018.06.005</pub-id> </citation>
</ref>
<ref id="B80">
<label>80.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sari</surname>
<given-names>NLK</given-names>
</name>
<name>
<surname>Prajitno</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Lubis</surname>
<given-names>LE</given-names>
</name>
<name>
<surname>Soejoko</surname>
<given-names>DS</given-names>
</name>
</person-group>. <article-title>Computer aided diagnosis (CAD) for mammography with Markov random field method with simulated annealing optimization</article-title>. <source>J&#x20;Med Phys Biophys</source> (<year>2017</year>) <volume>4</volume>:<fpage>84</fpage>&#x2013;<lpage>93</lpage>. </citation>
</ref>
<ref id="B81">
<label>81.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Nitzken</surname>
<given-names>MJ</given-names>
</name>
<name>
<surname>El-Baz</surname>
<given-names>AS</given-names>
</name>
<name>
<surname>Beache</surname>
<given-names>GM</given-names>
</name>
</person-group>. <article-title>Markov-Gibbs random field model for improved full-cardiac cycle strain estimation from tagged CMR</article-title>. <source>J&#x20;Cardiovasc Magn Reson</source> (<year>2012</year>) <volume>14</volume>:<fpage>1</fpage>&#x2013;<lpage>2</lpage>. <pub-id pub-id-type="doi">10.1186/1532-429x-14-s1-p258</pub-id> </citation>
</ref>
<ref id="B82">
<label>82.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Johansen</surname>
<given-names>AR</given-names>
</name>
<name>
<surname>S&#xf8;nderby</surname>
<given-names>CK</given-names>
</name>
<name>
<surname>S&#xf8;nderby</surname>
<given-names>SK</given-names>
</name>
<name>
<surname>Winther</surname>
<given-names>O</given-names>
</name>
</person-group>. <article-title>Deep recurrent conditional random field network for protein secondary prediction</article-title>. <conf-name>Proceedings of the 8th ACM international conference on bioinformatics, computational biology, and health informatics</conf-name>, <publisher-loc>Boston, MA</publisher-loc>: <publisher-name>ACM-BCB &#x27;17</publisher-name> (<year>2017</year>) <fpage>73</fpage>&#x2013;<lpage>8</lpage>. </citation>
</ref>
<ref id="B83">
<label>83.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Yanover</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Fromer</surname>
<given-names>M</given-names>
</name>
</person-group>. <article-title>Prediction of low energy protein side chain configurations using Markov random fields</article-title>. <conf-name>Bayesian Methods in Structural Bioinformatics</conf-name>. <publisher-name>Springer</publisher-name> (<year>2012</year>) <fpage>255</fpage>&#x2013;<lpage>84</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-642-27225-7_11</pub-id> </citation>
</ref>
<ref id="B84">
<label>84.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Xu</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Ma</surname>
<given-names>J</given-names>
</name>
</person-group>. <source>Protein homology detection through alignment of Markov random fields: using MRFalign</source>. <publisher-name>Springer</publisher-name> (<year>2015</year>). </citation>
</ref>
<ref id="B85">
<label>85.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ma</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>MRFalign: protein homology detection through alignment of Markov random fields</article-title>. <source>Plos Comput Biol</source> (<year>2014</year>) <volume>10</volume>:<fpage>e1003500</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pcbi.1003500</pub-id> </citation>
</ref>
<ref id="B86">
<label>86.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wilburn</surname>
<given-names>GW</given-names>
</name>
<name>
<surname>Eddy</surname>
<given-names>SR</given-names>
</name>
</person-group>. <article-title>Remote homology search with hidden Potts models</article-title>. <source>Plos Comput Biol</source> (<year>2020</year>) <volume>16</volume>:<fpage>e1008085</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pcbi.1008085</pub-id> </citation>
</ref>
<ref id="B87">
<label>87.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Gehrmann</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Loog</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Reinders</surname>
<given-names>MJT</given-names>
</name>
<name>
<surname>de Ridder</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Conditional random fields for protein function prediction</article-title>. <conf-name>IAPR International Conference on Pattern Recognition in Bioinformatics</conf-name>. <publisher-name>Springer</publisher-name> (<year>2013</year>) <fpage>184</fpage>&#x2013;<lpage>95</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-642-39159-0_17</pub-id> </citation>
</ref>
<ref id="B88">
<label>88.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Loeliger</surname>
<given-names>H-A</given-names>
</name>
<name>
<surname>Dauwels</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Hu</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Korl</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Ping</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Kschischang</surname>
<given-names>FR</given-names>
</name>
</person-group>. <article-title>The factor graph approach to model-based signal processing</article-title>. <source>Proc IEEE</source> (<year>2007</year>) <volume>95</volume>:<fpage>1295</fpage>&#x2013;<lpage>322</lpage>. <pub-id pub-id-type="doi">10.1109/jproc.2007.896497</pub-id> </citation>
</ref>
<ref id="B89">
<label>89.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ray</surname>
<given-names>WC</given-names>
</name>
<name>
<surname>Wolock</surname>
<given-names>SL</given-names>
</name>
<name>
<surname>Callahan</surname>
<given-names>NW</given-names>
</name>
<name>
<surname>Dong</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>QQ</given-names>
</name>
<name>
<surname>Liang</surname>
<given-names>C</given-names>
</name>
<etal/>
</person-group>. <article-title>Addressing the unmet need for visualizing conditional random fields in biological data</article-title>. <source>BMC Bioinformatics</source> (<year>2014</year>) <volume>15</volume>:<fpage>202</fpage>. <pub-id pub-id-type="doi">10.1186/1471-2105-15-202</pub-id> </citation>
</ref>
<ref id="B90">
<label>90.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Geman</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Geman</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Stochastic relaxation, Gibbs distributions, and the Bayesian restoration of images</article-title>. <source>IEEE Trans Pattern Anal Machine Intell</source> (<year>1984</year>) <volume>6</volume>:<fpage>721</fpage>&#x2013;<lpage>41</lpage>. </citation>
</ref>
<ref id="B91">
<label>91.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Xu</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Jin</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Furey</surname>
<given-names>TS</given-names>
</name>
<name>
<surname>Sullivan</surname>
<given-names>PF</given-names>
</name>
<etal/>
</person-group>. <article-title>A hidden Markov random field-based Bayesian method for the detection of long-range chromosomal interactions in Hi-C data</article-title>. <source>Bioinformatics</source> (<year>2016</year>) <volume>32</volume>:<fpage>650</fpage>&#x2013;<lpage>6</lpage>. <pub-id pub-id-type="doi">10.1093/bioinformatics/btv650</pub-id> </citation>
</ref>
<ref id="B92">
<label>92.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wu</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Lu</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Xue</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Lyu</surname>
<given-names>Q</given-names>
</name>
<name>
<surname>Jiang</surname>
<given-names>M</given-names>
</name>
</person-group>. <article-title>Deep conditional random field approach to transmembrane topology prediction and application to GPCR three-dimensional structure modeling</article-title>. <source>IEEE/ACM Trans Comput Biol Bioinform</source> (<year>2016</year>) <volume>14</volume>:<fpage>1106</fpage>&#x2013;<lpage>14</lpage>. <pub-id pub-id-type="doi">10.1109/TCBB.2016.2602872</pub-id> </citation>
</ref>
<ref id="B93">
<label>93.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kordmahalleh</surname>
<given-names>MM</given-names>
</name>
<name>
<surname>Sefidmazgi</surname>
<given-names>MG</given-names>
</name>
<name>
<surname>Harrison</surname>
<given-names>SH</given-names>
</name>
<name>
<surname>Homaifar</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>Identifying time-delayed gene regulatory networks via an evolvable hierarchical recurrent neural network</article-title>. <source>BioData Mining</source> (<year>2017</year>) <volume>10</volume>:<fpage>29</fpage>. <pub-id pub-id-type="doi">10.1186/s13040-017-0146-4</pub-id> </citation>
</ref>
<ref id="B94">
<label>94.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Gitter</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Huang</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Valluvan</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Fraenkel</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Anandkumar</surname>
<given-names>A</given-names>
</name>
</person-group>. <source>Unsupervised learning of transcriptional regulatory networks via latent tree graphical models</source>. <publisher-loc>Ithaca, NY</publisher-loc>: <publisher-name>arXiv:1609.06335</publisher-name> (<year>2016</year>).</citation>
</ref>
<ref id="B95">
<label>95.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhong</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Dong</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Poston</surname>
<given-names>TB</given-names>
</name>
<name>
<surname>Darville</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Spracklen</surname>
<given-names>CN</given-names>
</name>
<name>
<surname>Wu</surname>
<given-names>D</given-names>
</name>
<etal/>
</person-group>. <article-title>Inferring regulatory networks from mixed observational data using directed acyclic graphs</article-title>. <source>Front Genet</source> (<year>2020</year>) <volume>11</volume>:<fpage>8</fpage>. <pub-id pub-id-type="doi">10.3389/fgene.2020.00008</pub-id> </citation>
</ref>
<ref id="B96">
<label>96.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ma</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Jiang</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Jiang</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>Constructing tissue-specific transcriptional regulatory networks via a Markov random field</article-title>. <source>BMC Genomics</source> (<year>2018</year>) <volume>19</volume>:<fpage>65</fpage>&#x2013;<lpage>77</lpage>. <pub-id pub-id-type="doi">10.1186/s12864-018-5277-6</pub-id> </citation>
</ref>
<ref id="B97">
<label>97.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kolmogorov</surname>
<given-names>V</given-names>
</name>
<name>
<surname>Zabih</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>What energy functions can be minimized via graph cuts?</article-title>. <source>IEEE Trans Pattern Anal Machine Intell</source> (<year>2004</year>) <volume>26</volume>:<fpage>147</fpage>&#x2013;<lpage>59</lpage>. <pub-id pub-id-type="doi">10.1109/tpami.2004.1262177</pub-id> </citation>
</ref>
<ref id="B98">
<label>98.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Banf</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Rhee</surname>
<given-names>SY</given-names>
</name>
</person-group>. <article-title>Enhancing gene regulatory network inference through data integration with Markov random fields</article-title>. <source>Sci Rep</source> (<year>2017</year>) <volume>7</volume>:<fpage>1</fpage>&#x2013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1038/srep41174</pub-id> </citation>
</ref>
<ref id="B99">
<label>99.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Grimes</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Potter</surname>
<given-names>SS</given-names>
</name>
<name>
<surname>Datta</surname>
<given-names>S</given-names>
</name>
</person-group>. <article-title>Integrating gene regulatory pathways into differential network analysis of gene expression data</article-title>. <source>Sci Rep</source> (<year>2019</year>) <volume>9</volume>:<fpage>1</fpage>&#x2013;<lpage>12</lpage>. <pub-id pub-id-type="doi">10.1038/s41598-019-41918-3</pub-id> </citation>
</ref>
<ref id="B100">
<label>100.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wei</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>H</given-names>
</name>
</person-group>. <article-title>A Markov random field model for network-based analysis of&#x20;genomic data</article-title>. <source>Bioinformatics</source> (<year>2007</year>) <volume>23</volume>:<fpage>1537</fpage>&#x2013;<lpage>44</lpage>. <pub-id pub-id-type="doi">10.1093/bioinformatics/btm129</pub-id> </citation>
</ref>
<ref id="B101">
<label>101.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gomez-Romero</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Lopez-Reyes</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Hernandez-Lemus</surname>
<given-names>E</given-names>
</name>
</person-group>. <article-title>The large scale structure of human metabolism reveals resilience via extensive signaling crosstalk</article-title>. <source>Front Physiol</source> (<year>2020</year>) <volume>11</volume>:<fpage>1667</fpage>. <pub-id pub-id-type="doi">10.3389/fphys.2020.588012</pub-id> </citation>
</ref>
<ref id="B102">
<label>102.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Lin</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Sestan</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Zhao</surname>
<given-names>H</given-names>
</name>
</person-group>. <article-title>A Markov random field-based approach for joint estimation of differentially expressed genes in mouse transcriptome data</article-title>. <source>Stat Appl Genet Mol Biol</source> (<year>2016</year>) <volume>15</volume>:<fpage>139</fpage>&#x2013;<lpage>50</lpage>. <pub-id pub-id-type="doi">10.1515/sagmb-2015-0070</pub-id> </citation>
</ref>
<ref id="B103">
<label>103.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Chen</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Cho</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Zhao</surname>
<given-names>H</given-names>
</name>
</person-group>. <article-title>Incorporating biological pathways via a Markov random field model in genome-wide association studies</article-title>. <source>Plos Genet</source> (<year>2011</year>) <volume>7</volume>:<fpage>e1001353</fpage>. <pub-id pub-id-type="doi">10.1371/journal.pgen.1001353</pub-id> </citation>
</ref>
<ref id="B104">
<label>104.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Long</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Wu</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Kwoh</surname>
<given-names>CK</given-names>
</name>
<name>
<surname>Luo</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>X</given-names>
</name>
</person-group>. <article-title>Predicting human microbe-drug associations via graph convolutional network with conditional random field</article-title>. <source>Bioinformatics</source> (<year>2020</year>) <volume>36</volume>:<fpage>4918</fpage>&#x2013;<lpage>27</lpage>. <pub-id pub-id-type="doi">10.1093/bioinformatics/btaa598</pub-id> </citation>
</ref>
<ref id="B105">
<label>105.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Xu</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Xue</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Sharma</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Sanchez-Martin</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>F</given-names>
</name>
<etal/>
</person-group>. <article-title>Translating cancer genomics into precision medicine with artificial intelligence: applications, challenges and future perspectives</article-title>. <source>Hum Genet</source> (<year>2019</year>) <volume>138</volume>:<fpage>109</fpage>&#x2013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.1007/s00439-019-01970-5</pub-id> </citation>
</ref>
<ref id="B106">
<label>106.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fariselli</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Savojardo</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Martelli</surname>
<given-names>PL</given-names>
</name>
<name>
<surname>Casadio</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>Grammatical-restrained hidden conditional random fields for bioinformatics applications</article-title>. <source>Algorithms Mol Biol</source> (<year>2009</year>) <volume>4</volume>:<fpage>13</fpage>. <pub-id pub-id-type="doi">10.1186/1748-7188-4-13</pub-id> </citation>
</ref>
<ref id="B107">
<label>107.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname>
<given-names>NR</given-names>
</name>
<name>
<surname>Yakir</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Xia</surname>
<given-names>LC</given-names>
</name>
<name>
<surname>Siegmund</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Scan statistics on Poisson random fields with applications in genomics</article-title>. <source>Ann Appl Stat</source> (<year>2016</year>) <volume>10</volume>:<fpage>726</fpage>&#x2013;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1214/15-aoas892</pub-id> </citation>
</ref>
<ref id="B108">
<label>108.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Urbain</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Frieder</surname>
<given-names>O</given-names>
</name>
<name>
<surname>Goharian</surname>
<given-names>N</given-names>
</name>
</person-group>. <article-title>Passage relevance models for genomics search</article-title>. <conf-name>Proceedings of the 2nd international workshop on Data and text mining in bioinformatics</conf-name>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>DTMBIO &#x27;08</publisher-name> (<year>2008</year>) <fpage>45</fpage>&#x2013;<lpage>52</lpage>. </citation>
</ref>
<ref id="B109">
<label>109.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wang</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>He</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Jiang</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Hu</surname>
<given-names>X</given-names>
</name>
</person-group>. <article-title>Recognition of bacteria named entity using conditional random fields in spark</article-title>. <source>BMC Syst Biol</source> (<year>2018</year>) <volume>12</volume>:<fpage>106</fpage>. <pub-id pub-id-type="doi">10.1186/s12918-018-0625-3</pub-id> </citation>
</ref>
<ref id="B110">
<label>110.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>McDonald</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Pereira</surname>
<given-names>F</given-names>
</name>
</person-group>. <article-title>Identifying gene and protein mentions in text using conditional random fields</article-title>. <source>BMC Bioinformatics</source> (<year>2005</year>) <volume>6</volume>:<fpage>S6</fpage>. <pub-id pub-id-type="doi">10.1186/1471-2105-6-s1-s6</pub-id> </citation>
</ref>
<ref id="B111">
<label>111.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vecchyo</surname>
<given-names>OD</given-names>
</name>
<name>
<surname>Marsden</surname>
<given-names>CD</given-names>
</name>
<name>
<surname>Lohmueller</surname>
<given-names>KE</given-names>
</name>
</person-group>. <article-title>PReFerSim: fast simulation of demography and selection under the Poisson random field model</article-title>. <source>Bioinformatics</source> (<year>2016</year>) <volume>32</volume>:<fpage>3516</fpage>&#x2013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1093/bioinformatics/btw478</pub-id> </citation>
</ref>
<ref id="B112">
<label>112.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Fran&#xe7;ois</surname>
<given-names>O</given-names>
</name>
<name>
<surname>Ancelet</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Guillot</surname>
<given-names>G</given-names>
</name>
</person-group>. <article-title>Bayesian clustering using hidden Markov random fields in spatial population genetics</article-title>. <source>Genetics</source> (<year>2006</year>) <volume>174</volume>:<fpage>805</fpage>&#x2013;<lpage>16</lpage>. <pub-id pub-id-type="doi">10.1534/genetics.106.059923</pub-id> </citation>
</ref>
<ref id="B113">
<label>113.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Clark</surname>
<given-names>NJ</given-names>
</name>
<name>
<surname>Wells</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Lindberg</surname>
<given-names>O</given-names>
</name>
</person-group>. <article-title>Unravelling changing interspecific interactions across environmental gradients using Markov random fields</article-title>. <source>Ecology</source> (<year>2018</year>) <volume>99</volume>:<fpage>1277</fpage>&#x2013;<lpage>83</lpage>. <pub-id pub-id-type="doi">10.1002/ecy.2221</pub-id> </citation>
</ref>
<ref id="B114">
<label>114.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Salinas</surname>
<given-names>NR</given-names>
</name>
<name>
<surname>Wheeler</surname>
<given-names>WC</given-names>
</name>
</person-group>. <article-title>Statistical modeling of distribution patterns: a Markov random field implementation and its application on areas of endemism</article-title>. <source>Syst Biol</source> (<year>2020</year>) <volume>69</volume>:<fpage>76</fpage>&#x2013;<lpage>90</lpage>. <pub-id pub-id-type="doi">10.1093/sysbio/syz033</pub-id> </citation>
</ref>
<ref id="B115">
<label>115.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Shen</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Van Deelen</surname>
<given-names>TR</given-names>
</name>
</person-group>. <source>Spatially explicit modeling of community occupancy using Markov random field models with imperfect observation: mesocarnivores in Apostle Islands National Lakeshore</source>. <publisher-loc>Cold Spring Harbor, NY</publisher-loc>: <publisher-name>bioRxiv</publisher-name> (<year>2020</year>).</citation>
</ref>
<ref id="B116">
<label>116.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Kozik</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>Improving depth map quality with Markov random fields</article-title>. <conf-name>Image Processing and Communications Challenges</conf-name>. <publisher-name>Springer</publisher-name> (<year>2011</year>) <fpage>149</fpage>&#x2013;<lpage>56</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-642-23154-4_17</pub-id> </citation>
</ref>
<ref id="B117">
<label>117.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stephenson</surname>
<given-names>TA</given-names>
</name>
<name>
<surname>Chen</surname>
<given-names>T</given-names>
</name>
</person-group>. <article-title>Adaptive Markov random fields for example-based super-resolution of faces</article-title>. <source>EURASIP J&#x20;Adv Signal Process</source> (<year>2006</year>) <volume>2006</volume>:<fpage>031062</fpage>. <pub-id pub-id-type="doi">10.1155/asp/2006/31062</pub-id> </citation>
</ref>
<ref id="B118">
<label>118.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Li</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Wand</surname>
<given-names>M</given-names>
</name>
</person-group>. <article-title>Combining Markov random fields and convolutional neural networks for image synthesis</article-title>. <conf-name>Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition</conf-name>. <publisher-loc>Ithaca, NY</publisher-loc>: <publisher-name>arXiv:1601.04589</publisher-name> (<year>2016</year>) <fpage>2479</fpage>&#x2013;<lpage>86</lpage>. </citation>
</ref>
<ref id="B119">
<label>119.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Wen</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Han</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>W</given-names>
</name>
</person-group>. <article-title>2d conditional random fields for image classification</article-title>. <conf-name>International Conference on Intelligent Information Processing</conf-name>. <publisher-name>Springer</publisher-name> (<year>2006</year>) <fpage>383</fpage>&#x2013;<lpage>90</lpage>. </citation>
</ref>
<ref id="B120">
<label>120.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bohorquez</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Giraldo</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Mateu</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>Multivariate functional random fields: prediction and optimal sampling</article-title>. <source>Stoch Environ Res Risk Assess</source> (<year>2017</year>) <volume>31</volume>:<fpage>53</fpage>&#x2013;<lpage>70</lpage>. <pub-id pub-id-type="doi">10.1007/s00477-016-1266-y</pub-id> </citation>
</ref>
<ref id="B121">
<label>121.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Baca-Lopez</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Fresno</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Espinal-Enr&#xed;quez</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Martinez-Garcia</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Camacho-Lopez</surname>
<given-names>MA</given-names>
</name>
<name>
<surname>Flores-Merino</surname>
<given-names>MV</given-names>
</name>
<etal/>
</person-group>. <article-title>Spatio-temporal representativeness of air quality monitoring stations in Mexico City: implications for public health</article-title>. <source>Front Public Health</source> (<year>2020</year>) <volume>8</volume>:<fpage>849</fpage>. <pub-id pub-id-type="doi">10.3389/fpubh.2020.536174</pub-id> </citation>
</ref>
<ref id="B122">
<label>122.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Baca-Lopez</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Fresno</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Espinal-Enriquez</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Flores-Merino</surname>
<given-names>MV</given-names>
</name>
<name>
<surname>Camacho-Lopez</surname>
<given-names>MA</given-names>
</name>
<name>
<surname>Hernandez-Lemus</surname>
<given-names>E</given-names>
</name>
</person-group>. <source>Metropolitan age-specific mortality trends at borough and neighbourhood level: the case of Mexico city</source> (<year>2020</year>).</citation>
</ref>
<ref id="B123">
<label>123.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wang</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Wellmann</surname>
<given-names>JF</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Liang</surname>
<given-names>RY</given-names>
</name>
</person-group>. <article-title>A segmentation approach for stochastic geological modeling using hidden Markov random fields</article-title>. <source>Math Geosci</source> (<year>2017</year>) <volume>49</volume>:<fpage>145</fpage>&#x2013;<lpage>77</lpage>. <pub-id pub-id-type="doi">10.1007/s11004-016-9663-9</pub-id> </citation>
</ref>
<ref id="B124">
<label>124.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Li</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Liang</surname>
<given-names>RY</given-names>
</name>
</person-group>. <article-title>Quantifying stratigraphic uncertainties by stochastic simulation techniques based on Markov random field</article-title>. <source>Eng Geol</source> (<year>2016</year>) <volume>201</volume>:<fpage>106</fpage>&#x2013;<lpage>22</lpage>. <pub-id pub-id-type="doi">10.1016/j.enggeo.2015.12.017</pub-id> </citation>
</ref>
<ref id="B125">
<label>125.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Rue</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Held</surname>
<given-names>L</given-names>
</name>
</person-group>. <source>Gaussian Markov random fields: theory and applications</source>. <publisher-loc>Boca Raton, FL</publisher-loc>: <publisher-name>CRC Press</publisher-name> (<year>2005</year>).</citation>
</ref>
<ref id="B126">
<label>126.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Solberg</surname>
<given-names>AHS</given-names>
</name>
<name>
<surname>Taxt</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Jain</surname>
<given-names>AK</given-names>
</name>
</person-group>. <article-title>A Markov random field model for classification of multisource satellite imagery</article-title>. <source>IEEE Trans Geosci Remote Sensing</source> (<year>1996</year>) <volume>34</volume>:<fpage>100</fpage>&#x2013;<lpage>13</lpage>. <pub-id pub-id-type="doi">10.1109/36.481897</pub-id> </citation>
</ref>
<ref id="B127">
<label>127.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Toftaker</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Tjelmeland</surname>
<given-names>H</given-names>
</name>
</person-group>. <article-title>Construction of binary multi-grid Markov random field prior models from training images</article-title>. <source>Math Geosci</source> (<year>2013</year>) <volume>45</volume>:<fpage>383</fpage>&#x2013;<lpage>409</lpage>. <pub-id pub-id-type="doi">10.1007/s11004-013-9456-3</pub-id> </citation>
</ref>
<ref id="B128">
<label>128.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Reuschen</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Nowak</surname>
<given-names>W</given-names>
</name>
</person-group>. <article-title>Bayesian inversion of hierarchical geostatistical models using a parallel-tempering sequential Gibbs MCMC</article-title>. <source>Adv Water Resour</source> (<year>2020</year>) <volume>141</volume>:<fpage>103614</fpage>. <pub-id pub-id-type="doi">10.1016/j.advwatres.2020.103614</pub-id> </citation>
</ref>
<ref id="B129">
<label>129.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sutton</surname>
<given-names>C</given-names>
</name>
<name>
<surname>McCallum</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>An introduction to conditional random fields for relational learning</article-title>. <source>Introduction to Statistical Relational Learning</source> (<year>2006</year>) <volume>2</volume>:<fpage>93</fpage>&#x2013;<lpage>128</lpage>. </citation>
</ref>
<ref id="B130">
<label>130.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gilks</surname>
<given-names>WR</given-names>
</name>
<name>
<surname>Wild</surname>
<given-names>P</given-names>
</name>
</person-group>. <article-title>Adaptive rejection sampling for Gibbs sampling</article-title>. <source>Appl Stat</source> (<year>1992</year>) <volume>41</volume>:<fpage>337</fpage>&#x2013;<lpage>48</lpage>. <pub-id pub-id-type="doi">10.2307/2347565</pub-id> </citation>
</ref>
<ref id="B131">
<label>131.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gilks</surname>
<given-names>WR</given-names>
</name>
<name>
<surname>Best</surname>
<given-names>NG</given-names>
</name>
<name>
<surname>Tan</surname>
<given-names>KKC</given-names>
</name>
</person-group>. <article-title>Adaptive rejection metropolis sampling within Gibbs sampling</article-title>. <source>Appl Stat</source> (<year>1995</year>) <volume>44</volume>:<fpage>455</fpage>&#x2013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.2307/2986138</pub-id> </citation>
</ref>
<ref id="B132">
<label>132.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Meyer</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Cai</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Perron</surname>
<given-names>F</given-names>
</name>
</person-group>. <article-title>Adaptive rejection metropolis sampling using Lagrange interpolation polynomials of degree 2</article-title>. <source>Comput Stat Data Anal</source> (<year>2008</year>) <volume>52</volume>:<fpage>3408</fpage>&#x2013;<lpage>23</lpage>. <pub-id pub-id-type="doi">10.1016/j.csda.2008.01.005</pub-id> </citation>
</ref>
<ref id="B133">
<label>133.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Martino</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Read</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Luengo</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Independent doubly adaptive rejection metropolis sampling within Gibbs sampling</article-title>. <source>IEEE Trans Signal Process</source> (<year>2015</year>) <volume>63</volume>:<fpage>3123</fpage>&#x2013;<lpage>38</lpage>. <pub-id pub-id-type="doi">10.1109/tsp.2015.2420537</pub-id> </citation>
</ref>
<ref id="B134">
<label>134.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Papanikolaou</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Foulds</surname>
<given-names>JR</given-names>
</name>
<name>
<surname>Rubin</surname>
<given-names>TN</given-names>
</name>
<name>
<surname>Tsoumakas</surname>
<given-names>G</given-names>
</name>
</person-group>. <article-title>Dense distributions from sparse samples: improved Gibbs sampling parameter estimators for LDA</article-title>. <source>J&#x20;Machine Learn Res</source> (<year>2017</year>) <volume>18</volume>:<fpage>2058</fpage>&#x2013;<lpage>115</lpage>. </citation>
</ref>
<ref id="B135">
<label>135.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Norton</surname>
<given-names>RA</given-names>
</name>
<name>
<surname>Christen</surname>
<given-names>JA</given-names>
</name>
<name>
<surname>Fox</surname>
<given-names>C</given-names>
</name>
</person-group>. <article-title>Sampling hyperparameters in hierarchical models: improving on Gibbs for high-dimensional latent fields and large datasets</article-title>. <source>Commun Stat - Simulation Comput</source> (<year>2018</year>) <volume>47</volume>:<fpage>2639</fpage>&#x2013;<lpage>55</lpage>. <pub-id pub-id-type="doi">10.1080/03610918.2017.1353618</pub-id> </citation>
</ref>
<ref id="B136">
<label>136.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Gao</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Gormley</surname>
<given-names>MR</given-names>
</name>
</person-group>. <article-title>Training for Gibbs sampling on conditional random fields with neural scoring factors</article-title>. <conf-name>Proceedings of the 2020 Conference on Empirical Methods in Natural Language Processing (EMNLP)</conf-name>. <publisher-name>Association for Computational Linguistics</publisher-name> (<year>2020</year>) <fpage>4999</fpage>&#x2013;<lpage>5011</lpage>. </citation>
</ref>
<ref id="B137">
<label>137.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Boland</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Friel</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Maire</surname>
<given-names>F</given-names>
</name>
</person-group>. <article-title>Efficient MCMC for Gibbs random fields using pre-computation</article-title>. <source>Electron J&#x20;Statist</source> (<year>2018</year>) <volume>12</volume>:<fpage>4138</fpage>&#x2013;<lpage>79</lpage>. <pub-id pub-id-type="doi">10.1214/18-ejs1504</pub-id> </citation>
</ref>
<ref id="B138">
<label>138.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kaplan</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Kaiser</surname>
<given-names>MS</given-names>
</name>
<name>
<surname>Lahiri</surname>
<given-names>SN</given-names>
</name>
<name>
<surname>Nordman</surname>
<given-names>DJ</given-names>
</name>
</person-group>. <article-title>Simulating Markov random fields with a conclique-based Gibbs sampler</article-title>. <source>J&#x20;Comput Graphical Stat</source> (<year>2020</year>) <volume>29</volume>:<fpage>286</fpage>&#x2013;<lpage>96</lpage>. <pub-id pub-id-type="doi">10.1080/10618600.2019.1668800</pub-id> </citation>
</ref>
<ref id="B139">
<label>139.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Marcotte</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Allard</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Gibbs sampling on large lattice with GMRF</article-title>. <source>Comput Geosci</source> (<year>2018</year>) <volume>111</volume>:<fpage>190</fpage>&#x2013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1016/j.cageo.2017.11.012</pub-id> </citation>
</ref>
<ref id="B140">
<label>140.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Ko</surname>
<given-names>GG</given-names>
</name>
<name>
<surname>Chai</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Rutenbar</surname>
<given-names>RA</given-names>
</name>
<name>
<surname>Brooks</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Wei</surname>
<given-names>GY</given-names>
</name>
</person-group>. <article-title>FlexGibbs: reconfigurable parallel Gibbs sampling accelerator for structured graphs</article-title>. <conf-name>IEEE 27th Annual International Symposium on Field-Programmable Custom Computing Machines (FCCM)</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2019</year>) <fpage>334</fpage>. </citation>
</ref>
<ref id="B141">
<label>141.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Wu</surname>
<given-names>L</given-names>
</name>
</person-group>. <article-title>Large deviations for empirical measures of mean-field Gibbs measures</article-title>. <source>Stochastic Process Appl</source> (<year>2020</year>) <volume>130</volume>:<fpage>503</fpage>&#x2013;<lpage>20</lpage>. <pub-id pub-id-type="doi">10.1016/j.spa.2019.01.008</pub-id> </citation>
</ref>
<ref id="B142">
<label>142.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Eldan</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Gross</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>Decomposition of mean-field Gibbs distributions into product measures</article-title>. <source>Electron J&#x20;Probab</source> (<year>2018</year>) <volume>23</volume>. <pub-id pub-id-type="doi">10.1214/18-ejp159</pub-id> </citation>
</ref>
<ref id="B143">
<label>143.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Shafer</surname>
<given-names>GR</given-names>
</name>
<name>
<surname>Shenoy</surname>
<given-names>PP</given-names>
</name>
</person-group>. <article-title>Probability propagation</article-title>. <source>Ann Math Artif Intell</source> (<year>1990</year>) <volume>2</volume>:<fpage>327</fpage>&#x2013;<lpage>51</lpage>. <pub-id pub-id-type="doi">10.1007/bf01531015</pub-id> </citation>
</ref>
<ref id="B144">
<label>144.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname>
<given-names>NL</given-names>
</name>
<name>
<surname>Poole</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>Intercausal independence and heterogeneous factorization</article-title>. <conf-name>Uncertainty Proceedings</conf-name>. <publisher-name>Elsevier</publisher-name> (<year>1994</year>) <fpage>606</fpage>&#x2013;<lpage>14</lpage>. <pub-id pub-id-type="doi">10.1016/b978-1-55860-332-5.50082-1</pub-id> </citation>
</ref>
<ref id="B145">
<label>145.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Kompass</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>A generalized divergence measure for nonnegative matrix factorization</article-title>. <source>Neural Comput</source> (<year>2007</year>) <volume>19</volume>:<fpage>780</fpage>&#x2013;<lpage>91</lpage>. <pub-id pub-id-type="doi">10.1162/neco.2007.19.3.780</pub-id> </citation>
</ref>
<ref id="B146">
<label>146.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cichocki</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Lee</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Kim</surname>
<given-names>Y-D</given-names>
</name>
<name>
<surname>Choi</surname>
<given-names>S</given-names>
</name>
</person-group>. <article-title>Non-negative matrix factorization with &#x3b1;-divergence</article-title>. <source>Pattern Recognition Lett</source> (<year>2008</year>) <volume>29</volume>:<fpage>1433</fpage>&#x2013;<lpage>40</lpage>. <pub-id pub-id-type="doi">10.1016/j.patrec.2008.02.016</pub-id> </citation>
</ref>
<ref id="B147">
<label>147.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ding</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Peng</surname>
<given-names>W</given-names>
</name>
</person-group>. <article-title>On the equivalence between non-negative matrix factorization and probabilistic latent semantic indexing</article-title>. <source>Comput Stat Data Anal</source> (<year>2008</year>) <volume>52</volume>:<fpage>3913</fpage>&#x2013;<lpage>27</lpage>. <pub-id pub-id-type="doi">10.1016/j.csda.2008.01.011</pub-id> </citation>
</ref>
<ref id="B148">
<label>148.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Xie</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Berkowitz</surname>
<given-names>CM</given-names>
</name>
</person-group>. <article-title>The use of positive matrix factorization with conditional probability functions in air quality studies: an application to hydrocarbon emissions in Houston, Texas</article-title>. <source>Atmos Environ</source> (<year>2006</year>) <volume>40</volume>:<fpage>3070</fpage>&#x2013;<lpage>91</lpage>. <pub-id pub-id-type="doi">10.1016/j.atmosenv.2005.12.065</pub-id> </citation>
</ref>
<ref id="B149">
<label>149.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Xu</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Cai</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Liao</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Zhu</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Meng</surname>
<given-names>Y</given-names>
</name>
<etal/>
</person-group>. <article-title>Identifying potential miRNA-disease associations with probability matrix factorization</article-title>. <source>Front Genet</source> (<year>2019</year>) <volume>10</volume>:<fpage>1234</fpage>. <pub-id pub-id-type="doi">10.3389/fgene.2019.01234</pub-id> </citation>
</ref>
<ref id="B150">
<label>150.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Wang</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Liang</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>A fusion probability matrix factorization framework for link prediction</article-title>. <source>Knowledge-Based Syst</source> (<year>2018</year>) <volume>159</volume>:<fpage>72</fpage>&#x2013;<lpage>85</lpage>. <pub-id pub-id-type="doi">10.1016/j.knosys.2018.06.005</pub-id> </citation>
</ref>
<ref id="B151">
<label>151.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stoehr</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Marin</surname>
<given-names>J-M</given-names>
</name>
<name>
<surname>Pudlo</surname>
<given-names>P</given-names>
</name>
</person-group>. <article-title>Hidden Gibbs random fields model selection using block likelihood information criterion</article-title>. <source>Stat</source> (<year>2016</year>) <volume>5</volume>:<fpage>158</fpage>&#x2013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.1002/sta4.112</pub-id> </citation>
</ref>
<ref id="B152">
<label>152.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Cilla</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Patricio</surname>
<given-names>MA</given-names>
</name>
<name>
<surname>Berlanga</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Molina</surname>
<given-names>JM</given-names>
</name>
</person-group>. <article-title>Model and feature selection in hidden conditional random fields with group regularization</article-title>. <conf-name>International Conference on Hybrid Artificial Intelligence Systems</conf-name>. <publisher-name>Springer</publisher-name> (<year>2013</year>) <fpage>140</fpage>&#x2013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-642-40846-5_15</pub-id> </citation>
</ref>
<ref id="B153">
<label>153.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Sain</surname>
<given-names>SR</given-names>
</name>
<name>
<surname>Furrer</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>Comments on: some recent work on multivariate Gaussian Markov random fields</article-title>. <source>Test</source> (<year>2018</year>) <volume>27</volume>:<fpage>545</fpage>&#x2013;<lpage>8</lpage>. <pub-id pub-id-type="doi">10.1007/s11749-018-0609-z</pub-id> </citation>
</ref>
<ref id="B154">
<label>154.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Zhu</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Lao</surname>
<given-names>N</given-names>
</name>
<name>
<surname>Xing</surname>
<given-names>E</given-names>
</name>
</person-group>. <article-title>Grafting-light: fast, incremental feature selection and structure learning of Markov random fields</article-title>. <conf-name>Proceedings of the 16th ACM SIGKDD international conference on Knowledge discovery and data mining</conf-name> (<year>2010</year>) <fpage>303</fpage>&#x2013;<lpage>12</lpage>. </citation>
</ref>
<ref id="B155">
<label>155.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liao</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Choudhury</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Fox</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Kautz</surname>
<given-names>HA</given-names>
</name>
</person-group>. <article-title>Training conditional random fields using virtual evidence boosting</article-title>. <source>IJCAI</source> (<year>2007</year>) <volume>7</volume>:<fpage>2530</fpage>&#x2013;<lpage>5</lpage>. </citation>
</ref>
<ref id="B156">
<label>156.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Lafferty</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Zhu</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>Y</given-names>
</name>
</person-group>. <article-title>Kernel conditional random fields: representation and clique selection</article-title>. <conf-name>Proceedings of the twenty-first international conference on Machine learning</conf-name>. <publisher-loc>New York, NY</publisher-loc>: <publisher-name>ICML &#x27;04</publisher-name> (<year>2004</year>) <fpage>64</fpage>. </citation>
</ref>
<ref id="B157">
<label>157.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Zhu</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Mao</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>Sentiment classification using genetic algorithm and conditional random fields</article-title>. <conf-name>IEEE international conference on information management and engineering</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2010</year>) <fpage>193</fpage>&#x2013;<lpage>6</lpage>. </citation>
</ref>
<ref id="B158">
<label>158.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Metzler</surname>
<given-names>DA</given-names>
</name>
</person-group>. <article-title>Automatic feature selection in the Markov random field model for information retrieval</article-title>. <conf-name>Proceedings of the sixteenth ACM conference on Conference on information and knowledge management</conf-name> (<year>2007</year>) <fpage>253</fpage>&#x2013;<lpage>62</lpage>. </citation>
</ref>
<ref id="B159">
<label>159.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Aliferis</surname>
<given-names>CF</given-names>
</name>
<name>
<surname>Statnikov</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Tsamardinos</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Mani</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Koutsoukos</surname>
<given-names>XD</given-names>
</name>
</person-group>. <article-title>Local causal and Markov blanket induction for causal discovery and feature selection for classification part i: algorithms and empirical evaluation</article-title>. <source>J&#x20;Machine Learn Res</source> (<year>2010</year>) <volume>11</volume>. </citation>
</ref>
<ref id="B160">
<label>160.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Adams</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Beling</surname>
<given-names>PA</given-names>
</name>
<name>
<surname>Cogill</surname>
<given-names>R</given-names>
</name>
</person-group>. <article-title>Feature selection for hidden Markov models and hidden semi-Markov models</article-title>. <source>IEEE Access</source> (<year>2016</year>) <volume>4</volume>:<fpage>1642</fpage>&#x2013;<lpage>57</lpage>. <pub-id pub-id-type="doi">10.1109/access.2016.2552478</pub-id> </citation>
</ref>
<ref id="B161">
<label>161.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Brownlee</surname>
<given-names>AEI</given-names>
</name>
<name>
<surname>Regnier-Coudert</surname>
<given-names>O</given-names>
</name>
<name>
<surname>McCall</surname>
<given-names>JAW</given-names>
</name>
<name>
<surname>Massie</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Stulajter</surname>
<given-names>S</given-names>
</name>
</person-group>. <article-title>An application of a GA with Markov network surrogate to feature selection</article-title>. <source>Int J&#x20;Syst Sci</source> (<year>2013</year>) <volume>44</volume>:<fpage>2039</fpage>&#x2013;<lpage>56</lpage>. <pub-id pub-id-type="doi">10.1080/00207721.2012.684449</pub-id> </citation>
</ref>
<ref id="B162">
<label>162.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yu</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>H</given-names>
</name>
</person-group>. <article-title>Efficient feature selection via analysis of relevance and redundancy</article-title>. <source>J&#x20;machine Learn Res</source> (<year>2004</year>) <volume>5</volume>:<fpage>1205</fpage>&#x2013;<lpage>24</lpage>. </citation>
</ref>
<ref id="B163">
<label>163.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Slawski</surname>
<given-names>M</given-names>
</name>
<name>
<surname>zu Castell</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Tutz</surname>
<given-names>G</given-names>
</name>
</person-group>. <article-title>Feature selection guided by structural information</article-title>. <source>Ann Appl Stat</source> (<year>2010</year>) <volume>4</volume>:<fpage>1056</fpage>&#x2013;<lpage>80</lpage>. <pub-id pub-id-type="doi">10.1214/09-aoas302</pub-id> </citation>
</ref>
<ref id="B164">
<label>164.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Adams</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Beling</surname>
<given-names>PA</given-names>
</name>
</person-group>. <article-title>A survey of feature selection methods for Gaussian mixture models and hidden Markov models</article-title>. <source>Artif Intell Rev</source> (<year>2019</year>) <volume>52</volume>:<fpage>1739</fpage>&#x2013;<lpage>79</lpage>. <pub-id pub-id-type="doi">10.1007/s10462-017-9581-3</pub-id> </citation>
</ref>
<ref id="B165">
<label>165.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Vergara</surname>
<given-names>JR</given-names>
</name>
<name>
<surname>Est&#xe9;vez</surname>
<given-names>PA</given-names>
</name>
</person-group>. <article-title>A review of feature selection methods based on mutual information</article-title>. <source>Neural Comput Applic</source> (<year>2014</year>) <volume>24</volume>:<fpage>175</fpage>&#x2013;<lpage>86</lpage>. <pub-id pub-id-type="doi">10.1007/s00521-013-1368-0</pub-id> </citation>
</ref>
<ref id="B166">
<label>166.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Liu</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Luo</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Change Loy</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Tang</surname>
<given-names>X</given-names>
</name>
</person-group>. <article-title>Deep learning Markov random field for semantic segmentation</article-title>. <source>IEEE Trans Pattern Anal Mach Intell</source> (<year>2017</year>) <volume>40</volume>:<fpage>1814</fpage>&#x2013;<lpage>28</lpage>. <pub-id pub-id-type="doi">10.1109/TPAMI.2017.2737535</pub-id> </citation>
</ref>
<ref id="B167">
<label>167.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Hu</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Rohrbach</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Darrell</surname>
<given-names>T</given-names>
</name>
</person-group>. <article-title>Segmentation from natural language expressions</article-title>. <conf-name>European Conference on Computer Vision</conf-name>. <publisher-name>Springer</publisher-name> (<year>2016</year>) <fpage>108</fpage>&#x2013;<lpage>24</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-319-46448-0_7</pub-id> </citation>
</ref>
<ref id="B168">
<label>168.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Guo</surname>
<given-names>J</given-names>
</name>
<name>
<surname>He</surname>
<given-names>H</given-names>
</name>
<name>
<surname>He</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Lausen</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Lin</surname>
<given-names>H</given-names>
</name>
<etal/>
</person-group>. <article-title>GluonCV and GluonNLP: deep learning in computer vision and natural language processing</article-title>. <source>J&#x20;Machine Learn Res</source> (<year>2020</year>) <volume>21</volume>:<fpage>1</fpage>&#x2013;<lpage>7</lpage>. </citation>
</ref>
<ref id="B169">
<label>169.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Zhang</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>H</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Xie</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>Co-occurrent features in semantic segmentation</article-title>. <conf-name>Proceedings of the IEEE Conference on Computer Vision and Pattern Recognition</conf-name> (<year>2019</year>) <fpage>548</fpage>&#x2013;<lpage>57</lpage>. </citation>
</ref>
<ref id="B170">
<label>170.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Mai</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Wu</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Cui</surname>
<given-names>T</given-names>
</name>
</person-group>. <article-title>Improved Chinese word segmentation disambiguation model based on conditional random fields</article-title>. <conf-name>Proceedings of the 4th International Conference on Computer Engineering and Networks</conf-name>. <publisher-name>Springer</publisher-name> (<year>2015</year>) <fpage>599</fpage>&#x2013;<lpage>605</lpage>. <pub-id pub-id-type="doi">10.1007/978-3-319-11104-9_70</pub-id> </citation>
</ref>
<ref id="B171">
<label>171.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Qiu</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Zhou</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>Q</given-names>
</name>
<name>
<surname>Ruan</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Gao</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>Chinese clinical named entity recognition using residual dilated convolutional neural network with conditional random field</article-title>. <source>IEEE Trans Nanobioscience</source> (<year>2019</year>) <volume>18</volume>:<fpage>306</fpage>&#x2013;<lpage>15</lpage>. <pub-id pub-id-type="doi">10.1109/tnb.2019.2908678</pub-id> </citation>
</ref>
<ref id="B172">
<label>172.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Khan</surname>
<given-names>W</given-names>
</name>
<name>
<surname>Daud</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Nasir</surname>
<given-names>JA</given-names>
</name>
<name>
<surname>Amjad</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Arafat</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Aljohani</surname>
<given-names>N</given-names>
</name>
<etal/>
</person-group> <article-title>Urdu part of speech tagging using conditional random fields</article-title>. <source>Lang Resour Eval</source> (<year>2019</year>) <volume>53</volume>:<fpage>331</fpage>&#x2013;<lpage>62</lpage>. <pub-id pub-id-type="doi">10.1007/s10579-018-9439-6</pub-id> </citation>
</ref>
<ref id="B173">
<label>173.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Nguyen</surname>
<given-names>DM</given-names>
</name>
<name>
<surname>Do</surname>
<given-names>TH</given-names>
</name>
<name>
<surname>Calderbank</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Deligiannis</surname>
<given-names>N</given-names>
</name>
</person-group>. <article-title>Fake news detection using deep Markov random fields</article-title>. <conf-name>Proceedings of the 2019 Conference of the North American Chapter of the Association for Computational Linguistics: Human Language Technologies, Volume 1 (Long and Short Papers)</conf-name>. <publisher-loc>Minneapolis, MN</publisher-loc>: <publisher-name>Association for Computational Linguistics</publisher-name> (<year>2019</year>) <fpage>1391</fpage>&#x2013;<lpage>400</lpage>. </citation>
</ref>
<ref id="B174">
<label>174.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Colmenares</surname>
<given-names>CA</given-names>
</name>
<name>
<surname>Litvak</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Mantrach</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Silvestri</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Rodr&#xed;guez</surname>
<given-names>H</given-names>
</name>
</person-group>. <article-title>Headline generation as a sequence prediction with conditional random fields</article-title>. <source>Multilingual text analysis: challenges, models, and approaches</source>. <publisher-loc>Singapore</publisher-loc>: <publisher-name>World Scientific</publisher-name> (<year>2019</year>) <fpage>201</fpage>. </citation>
</ref>
<ref id="B175">
<label>175.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Knoke</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Yang</surname>
<given-names>S</given-names>
</name>
</person-group>. <source>Social network analysis</source>. <publisher-loc>Thousand Oaks, CA</publisher-loc>: <publisher-name>Sage Publications</publisher-name> (<year>2019</year>).</citation>
</ref>
<ref id="B176">
<label>176.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jia</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Gong</surname>
<given-names>NZ</given-names>
</name>
</person-group>. <article-title>Attriinfer: inferring user attributes in online social networks using Markov random fields</article-title>. <conf-name>Proceedings of the 26th International Conference on World Wide Web</conf-name> (<year>2017</year>) <fpage>1561</fpage>&#x2013;<lpage>9</lpage>. </citation>
</ref>
<ref id="B177">
<label>177.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Jin</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>W</given-names>
</name>
<name>
<surname>He</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>W</given-names>
</name>
</person-group>. <article-title>Graph convolutional networks meet Markov random fields: semi-supervised community detection in attribute networks</article-title>. <source>Proc AAAI Conf Artif Intell</source> (<year>2019</year>) <volume>33</volume>:<fpage>152</fpage>&#x2013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1609/aaai.v33i01.3301152</pub-id> </citation>
</ref>
<ref id="B178">
<label>178.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Feng</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>Q</given-names>
</name>
<name>
<surname>Ji</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Guo</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Meng</surname>
<given-names>X</given-names>
</name>
</person-group>. <source>Stopping the cyberattack in the early stage: assessing the security risks of social network users</source>. <publisher-name>Security and Communication Networks</publisher-name> (<year>2019</year>).</citation>
</ref>
<ref id="B179">
<label>179.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Zhou</surname>
<given-names>Q</given-names>
</name>
<name>
<surname>Xu</surname>
<given-names>Z</given-names>
</name>
<name>
<surname>Yen</surname>
<given-names>NY</given-names>
</name>
</person-group>. <article-title>User sentiment analysis based on social network information and its application in consumer reconstruction intention</article-title>. <source>Comput Hum Behav</source> (<year>2019</year>) <volume>100</volume>:<fpage>177</fpage>&#x2013;<lpage>83</lpage>. <pub-id pub-id-type="doi">10.1016/j.chb.2018.07.006</pub-id> </citation>
</ref>
<ref id="B180">
<label>180.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Yoon</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Kleinman</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Mertz</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Brannick</surname>
<given-names>M</given-names>
</name>
</person-group>. <article-title>Is social network site usage related to depression? A meta-analysis of Facebook-depression relations</article-title>. <source>J Affect Disord</source> (<year>2019</year>) <volume>248</volume>:<fpage>65</fpage>&#x2013;<lpage>72</lpage>. <pub-id pub-id-type="doi">10.1016/j.jad.2019.01.026</pub-id> </citation>
</ref>
<ref id="B181">
<label>181.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Bodin</surname>
<given-names>&#xd6;</given-names>
</name>
<name>
<surname>Alexander</surname>
<given-names>SM</given-names>
</name>
<name>
<surname>Baggio</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Barnes</surname>
<given-names>ML</given-names>
</name>
<name>
<surname>Berardo</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Cumming</surname>
<given-names>GS</given-names>
</name>
<etal/>
</person-group> <article-title>Improving network approaches to the study of complex social&#x2013;ecological interdependencies</article-title>. <source>Nat Sustainability</source> (<year>2019</year>) <volume>2</volume>:<fpage>551</fpage>&#x2013;<lpage>9</lpage>. <pub-id pub-id-type="doi">10.1038/s41893-019-0308-0</pub-id> </citation>
</ref>
<ref id="B182">
<label>182.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Bhattacharya</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Malinsky</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Shpitser</surname>
<given-names>I</given-names>
</name>
</person-group>. <article-title>Causal inference under interference and network uncertainty</article-title>. <conf-name>Uncertainty in Artificial Intelligence (PMLR)</conf-name>. <publisher-loc>Ithaca, NY</publisher-loc>: <publisher-name>arXiv:1907.00221</publisher-name> (<year>2020</year>) <fpage>1028</fpage>&#x2013;<lpage>38</lpage>. </citation>
</ref>
<ref id="B183">
<label>183.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Stankovi&#x107;</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Dakovi&#x107;</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Sejdi&#x107;</surname>
<given-names>E</given-names>
</name>
</person-group>. <article-title>Introduction to graph signal processing</article-title>. <source>Vertex-frequency analysis of graph signals</source>. <publisher-name>Springer</publisher-name> (<year>2019</year>) <fpage>3</fpage>&#x2013;<lpage>108</lpage>. </citation>
</ref>
<ref id="B184">
<label>184.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Stankovic</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Mandic</surname>
<given-names>DP</given-names>
</name>
<name>
<surname>Dakovic</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Kisil</surname>
<given-names>I</given-names>
</name>
<name>
<surname>Sejdic</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Constantinides</surname>
<given-names>AG</given-names>
</name>
</person-group>. <article-title>Understanding the basis of graph signal processing via an intuitive example-driven approach [lecture notes]</article-title>. <source>IEEE Signal Process Mag</source> (<year>2019</year>) <volume>36</volume>:<fpage>133</fpage>&#x2013;<lpage>45</lpage>. <pub-id pub-id-type="doi">10.1109/msp.2019.2929832</pub-id> </citation>
</ref>
<ref id="B185">
<label>185.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ortega</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Frossard</surname>
<given-names>P</given-names>
</name>
<name>
<surname>Kovacevic</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Moura</surname>
<given-names>JMF</given-names>
</name>
<name>
<surname>Vandergheynst</surname>
<given-names>P</given-names>
</name>
</person-group>. <article-title>Graph signal processing: overview, challenges, and applications</article-title>. <source>Proc IEEE</source> (<year>2018</year>) <volume>106</volume>:<fpage>808</fpage>&#x2013;<lpage>28</lpage>. <pub-id pub-id-type="doi">10.1109/jproc.2018.2820126</pub-id> </citation>
</ref>
<ref id="B186">
<label>186.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Gadde</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Ortega</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>A probabilistic interpretation of sampling theory of graph signals</article-title>. <conf-name>IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2015</year>) <fpage>3257</fpage>&#x2013;<lpage>61</lpage>. </citation>
</ref>
<ref id="B187">
<label>187.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Chen</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Sandryhaila</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Kova&#x10d;evi&#x107;</surname>
<given-names>J</given-names>
</name>
</person-group>. <article-title>Sampling theory for graph signals</article-title>. <conf-name>IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2015</year>) <fpage>3392</fpage>&#x2013;<lpage>6</lpage>. </citation>
</ref>
<ref id="B188">
<label>188.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Stankovi&#x107;</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Sejdi&#x107;</surname>
<given-names>E</given-names>
</name>
</person-group>. <source>Vertex-frequency analysis of graph signals</source>. <publisher-name>Springer</publisher-name> (<year>2019</year>).</citation>
</ref>
<ref id="B189">
<label>189.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Pavez</surname>
<given-names>E</given-names>
</name>
<name>
<surname>Ortega</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>Generalized Laplacian precision matrix estimation for graph signal processing</article-title>. <conf-name>IEEE International Conference on Acoustics, Speech and Signal Processing (ICASSP)</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2016</year>) <fpage>6350</fpage>&#x2013;<lpage>4</lpage>. </citation>
</ref>
<ref id="B190">
<label>190.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Sandryhaila</surname>
<given-names>A</given-names>
</name>
<name>
<surname>Moura</surname>
<given-names>JM</given-names>
</name>
</person-group>. <article-title>Discrete signal processing on graphs: graph Fourier transform</article-title>. <conf-name>IEEE International Conference on Acoustics, Speech and Signal Processing</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2013</year>) <fpage>6167</fpage>&#x2013;<lpage>70</lpage>. </citation>
</ref>
<ref id="B191">
<label>191.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Mateos</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Segarra</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Marques</surname>
<given-names>AG</given-names>
</name>
<name>
<surname>Ribeiro</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>Connecting the dots: identifying network structure via graph signal processing</article-title>. <source>IEEE Signal Process Mag</source> (<year>2019</year>) <volume>36</volume>:<fpage>16</fpage>&#x2013;<lpage>43</lpage>. <pub-id pub-id-type="doi">10.1109/msp.2018.2890143</pub-id> </citation>
</ref>
<ref id="B192">
<label>192.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Ji</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Tay</surname>
<given-names>WP</given-names>
</name>
</person-group>. <article-title>A Hilbert space theory of generalized graph signal processing</article-title>. <source>IEEE Trans Signal Process</source> (<year>2019</year>) <volume>67</volume>:<fpage>6188</fpage>&#x2013;<lpage>203</lpage>. <pub-id pub-id-type="doi">10.1109/tsp.2019.2952055</pub-id> </citation>
</ref>
<ref id="B193">
<label>193.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Itani</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Thanou</surname>
<given-names>D</given-names>
</name>
</person-group>. <article-title>A graph signal processing framework for the classification of temporal brain data</article-title>. <conf-name>28th European Signal Processing Conference (EUSIPCO)</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2021</year>) <fpage>1180</fpage>&#x2013;<lpage>4</lpage>. </citation>
</ref>
<ref id="B194">
<label>194.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Ramakrishna</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Scaglione</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>Detection of false data injection attack using graph signal processing for the power grid</article-title>. <conf-name>IEEE Global Conference on Signal and Information Processing (GlobalSIP)</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2019</year>) <fpage>1</fpage>&#x2013;<lpage>5</lpage>. </citation>
</ref>
<ref id="B195">
<label>195.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Stankovic</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Mandic</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Dakovic</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Brajovic</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Scalzo</surname>
<given-names>B</given-names>
</name>
<name>
<surname>Li</surname>
<given-names>S</given-names>
</name>
<etal/>
</person-group> <source>Graph signal processing&#x2013;Part III: machine learning on graphs, from graph topology to applications</source>. <publisher-loc>Ithaca, NY</publisher-loc>: <publisher-name>arXiv:2001.00426</publisher-name> (<year>2020</year>).</citation>
</ref>
<ref id="B196">
<label>196.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Song</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Chai</surname>
<given-names>L</given-names>
</name>
<name>
<surname>Zhang</surname>
<given-names>J</given-names>
</name>
</person-group>. <source>Graph signal processing approach to QSAR/QSPR model learning of compounds</source>. <publisher-name>IEEE Transactions on Pattern Analysis and Machine Intelligence</publisher-name> (<year>2020</year>).</citation>
</ref>
<ref id="B197">
<label>197.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Burkhardt</surname>
<given-names>DB</given-names>
</name>
<name>
<surname>Stanley</surname>
<given-names>JS</given-names>
</name>
<name>
<surname>Perdigoto</surname>
<given-names>AL</given-names>
</name>
<name>
<surname>Gigante</surname>
<given-names>SA</given-names>
</name>
<name>
<surname>Herold</surname>
<given-names>KC</given-names>
</name>
<name>
<surname>Wolf</surname>
<given-names>G</given-names>
</name>
<etal/>
</person-group> <source>Quantifying the effect of experimental perturbations in single-cell RNA-sequencing data using graph signal processing</source>. <publisher-loc>Cold Spring Harbor, NY</publisher-loc>: <publisher-name>bioRxiv</publisher-name> (<year>2019</year>) <fpage>532846</fpage>.</citation>
</ref>
<ref id="B198">
<label>198.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Colonnese</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Pagliari</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Biagi</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Cusani</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Scarano</surname>
<given-names>G</given-names>
</name>
</person-group>. <article-title>Compound Markov random field model of signals on graph: an application to graph learning</article-title>. <conf-name>7th European Workshop on Visual Information Processing (EUVIP)</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2018</year>) <fpage>1</fpage>&#x2013;<lpage>5</lpage>. </citation>
</ref>
<ref id="B199">
<label>199.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Torkamani</surname>
<given-names>R</given-names>
</name>
<name>
<surname>Zayyani</surname>
<given-names>H</given-names>
</name>
</person-group>. <source>Statistical graph signal recovery using variational Bayes</source>. <publisher-name>IEEE Transactions on Circuits and Systems II: Express Briefs</publisher-name> (<year>2020</year>).</citation>
</ref>
<ref id="B200">
<label>200.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Ramezani-Mayiami</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Hajimirsadeghi</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Skretting</surname>
<given-names>K</given-names>
</name>
<name>
<surname>Blum</surname>
<given-names>RS</given-names>
</name>
<name>
<surname>Poor</surname>
<given-names>HV</given-names>
</name>
</person-group>. <source>Graph topology learning and signal recovery via Bayesian inference</source>. <publisher-name>IEEE Data Science Workshop (DSW)</publisher-name> (<year>2019</year>) <fpage>52</fpage>&#x2013;<lpage>6</lpage>.</citation>
</ref>
<ref id="B201">
<label>201.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Colonnese</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Lorenzo</surname>
<given-names>PD</given-names>
</name>
<name>
<surname>Cattai</surname>
<given-names>T</given-names>
</name>
<name>
<surname>Scarano</surname>
<given-names>G</given-names>
</name>
<name>
<surname>Fallani</surname>
<given-names>FDV</given-names>
</name>
</person-group>. <article-title>A joint Markov model for communities, connectivity and signals defined over graphs</article-title>. <source>IEEE Signal Process Lett</source> (<year>2020</year>) <volume>27</volume>:<fpage>1160</fpage>&#x2013;<lpage>4</lpage>. <pub-id pub-id-type="doi">10.1109/lsp.2020.3005053</pub-id> </citation>
</ref>
<ref id="B202">
<label>202.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Dong</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Thanou</surname>
<given-names>D</given-names>
</name>
<name>
<surname>Rabbat</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Frossard</surname>
<given-names>P</given-names>
</name>
</person-group>. <article-title>Learning graphs from data: a signal representation perspective</article-title>. <source>IEEE Signal Process Mag</source> (<year>2019</year>) <volume>36</volume>:<fpage>44</fpage>&#x2013;<lpage>63</lpage>. <pub-id pub-id-type="doi">10.1109/msp.2018.2887284</pub-id> </citation>
</ref>
<ref id="B203">
<label>203.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Cheung</surname>
<given-names>M</given-names>
</name>
<name>
<surname>Shi</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Wright</surname>
<given-names>O</given-names>
</name>
<name>
<surname>Jiang</surname>
<given-names>LY</given-names>
</name>
<name>
<surname>Liu</surname>
<given-names>X</given-names>
</name>
<name>
<surname>Moura</surname>
<given-names>JMF</given-names>
</name>
</person-group>. <article-title>Graph signal processing and deep learning: convolution, pooling, and topology</article-title>. <source>IEEE Signal Process Mag</source> (<year>2020</year>) <volume>37</volume>:<fpage>139</fpage>&#x2013;<lpage>49</lpage>. <pub-id pub-id-type="doi">10.1109/msp.2020.3014594</pub-id> </citation>
</ref>
<ref id="B204">
<label>204.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Jia</surname>
<given-names>J</given-names>
</name>
<name>
<surname>Benson</surname>
<given-names>AR</given-names>
</name>
</person-group>. <source>A unifying generative model for graph learning algorithms: label propagation, graph convolutions, and combinations</source>. <publisher-loc>Ithaca, NY</publisher-loc>: <publisher-name>arXiv:2101.07730</publisher-name> (<year>2021</year>).</citation>
</ref>
<ref id="B205">
<label>205.</label>
<citation citation-type="journal">
<person-group person-group-type="author">
<name>
<surname>Gama</surname>
<given-names>F</given-names>
</name>
<name>
<surname>Ribeiro</surname>
<given-names>A</given-names>
</name>
</person-group>. <article-title>Ergodicity in stationary graph processes: a weak law of large numbers</article-title>. <source>IEEE Trans Signal Process</source> (<year>2019</year>) <volume>67</volume>:<fpage>2761</fpage>&#x2013;<lpage>74</lpage>. <pub-id pub-id-type="doi">10.1109/tsp.2019.2908909</pub-id> </citation>
</ref>
<ref id="B206">
<label>206.</label>
<citation citation-type="book">
<person-group person-group-type="author">
<name>
<surname>Segarra</surname>
<given-names>S</given-names>
</name>
<name>
<surname>Wang</surname>
<given-names>Y</given-names>
</name>
<name>
<surname>Uhler</surname>
<given-names>C</given-names>
</name>
<name>
<surname>Marques</surname>
<given-names>AG</given-names>
</name>
</person-group>. <article-title>Joint inference of networks from stationary graph signals</article-title>. <conf-name>51st Asilomar Conference on Signals, Systems, and Computers</conf-name>. <publisher-name>IEEE</publisher-name> (<year>2017</year>) <fpage>975</fpage>&#x2013;<lpage>9</lpage>. </citation>
</ref>
</ref-list>
</back>
</article>