\chapter{Probabilistic representation and reasoning (and burglars)}
\section{Formal description}
In our representation of the model we introduced a \textit{Noisy-OR} to represent the causal independence of \textit{Burglar} and \textit{Earthquake} on \textit{Alarm}. The representation of the network is displayed in Figure~\ref{bnetwork21}.
\begin{figure}[h]
\centering
\includegraphics[scale=0.5]{d1.eps}
\caption{Bayesian network of the alarm system}
\label{bnetwork21}
\end{figure}
Days were chosen as the unit to model the story. The probability of a \textit{Burglar} event happening on a given day is then (assuming a Gregorian calendar with leap days):
$$\frac{1}{365 + 0.25 - 0.01 + 0.0025} = \frac{1}{365.2425}$$
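As a quick sanity check, this arithmetic can be reproduced in a few lines of plain Python (a sketch outside the AILog model):

```python
# Average length of a Gregorian year in days: a leap day every 4 years,
# skipped every 100 years, restored every 400 years.
year_length = 365 + 1/4 - 1/100 + 1/400
assert abs(year_length - 365.2425) < 1e-9

# Per-day probability of an event that happens once a year on average.
p_per_day = 1 / year_length
assert abs(p_per_day - 0.0027379) < 1e-6
```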
The resulting probability distributions can be found in Table~\ref{probdist} rather than in the figure, in order to keep the graph readable.
\begin{tabular}{|l|l|}
\begin{tabular}{|l|l|}
\begin{tabular}{|l|ll|}
& \multicolumn{2}{c|}{$I_1$}\\
\begin{tabular}{|l|ll|}
& \multicolumn{2}{c|}{$I_2$}\\
\begin{tabular}{|ll|ll|}
&& \multicolumn{2}{c|}{Alarm}\\
$I_1$ & $I_2$ & T & F\\
\begin{tabular}{|l|ll|}
& \multicolumn{2}{c|}{Watson}\\
\begin{tabular}{|l|ll|}
& \multicolumn{2}{c|}{Gibbons}\\
\begin{tabular}{|l|ll|}
& \multicolumn{2}{c|}{Radio}\\
T & $0.9998$ & $0.0002$\\
F & $0.0002$ & $0.9998$\\
\textit{If there is a burglar present (which could happen once every ten years), the alarm is known to go off 95\% of the time.} We modelled this by setting the value for Burglar True and $I_2$ True to $0.95$.\\
\textit{There's a 40\% chance that Watson is joking and the alarm is in fact off.} This is modelled by setting the value for Watson True and Alarm False to $0.4$. As Holmes expects Watson to call in 80\% of the time, the value for Alarm True and Watson True is set to $0.8$. Because the rows have to sum to $1$, the other values are easily calculated.\\
\textit{She may not have heard the alarm in 1\% of the cases and is thought to erroneously report an alarm when it is in fact off in 4\% of the cases.} We modelled this by assuming that when Mrs. Gibbons hears the alarm, she calls Holmes. This means that the value for Gibbons False and Alarm True is $0.01$. As she reports when the alarm is in fact off in 4\% of the cases, the value for Gibbons True and Alarm False is $0.04$.
\section{Implementation}
We implemented the distributions in \textit{AILog}; see Listing~\ref{alarm.ail}.
\inputminted[linenos,fontsize=\footnotesize]{prolog}{./src/alarm.ail}
Now that we have modelled the story with the corresponding probabilities, we can have AILog calculate other probabilities given some observations. Below we list some probabilities and the associated AILog output.\\
The chance that a burglary happens given that Watson calls is greater than the chance that a burglary happens without this observation, as can be seen from the difference between a and b. This makes sense, as Watson calls rightly 80\% of the time, so when Holmes receives a call from Watson, the chance that the alarm went off increases.\\
When we compare b to c, the same mechanism holds: there are more observations that give evidence for a burglary, as both Watson and Gibbons have called in the latter case.\\
In the last case, d, the probability has decreased compared to c. This can be explained by the observation that is added on top of those of c: the radio. The variable Radio means that the newscast reports that there was an earthquake. As an earthquake is also a reason why the alarm could go off, but has nothing to do with a burglary, it decreases the probability of a burglary (the earthquake ``explains away'' the alarm).
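This explaining-away pattern can be reproduced outside AILog by brute-force enumeration of the joint distribution. The following plain-Python sketch uses the probabilities as we read them from the story and the tables above; the Noisy-OR weights ($0.95$ for the burglar, $0.2$ for the earthquake) and the per-day priors ($1/365.2425$ for both causes) are our interpretation where the text is ambiguous, so treat the exact numbers as indicative.

```python
from itertools import product

P_B = P_E = 1 / 365.2425   # per-day priors for Burglar and Earthquake (assumed)

def p_alarm(b, e):
    # Noisy-OR: the alarm stays off only if every present cause fails to set it off
    return 1 - (1 - 0.95 * b) * (1 - 0.2 * e)

def posterior_burglar(watson=None, gibbons=None, radio=None):
    """P(Burglar | evidence) by enumerating burglar, earthquake and alarm."""
    num = den = 0.0
    for b, e, a in product([0, 1], repeat=3):
        w = (P_B if b else 1 - P_B) * (P_E if e else 1 - P_E)
        w *= p_alarm(b, e) if a else 1 - p_alarm(b, e)
        if watson is not None:   # Watson calls: 0.8 if the alarm is on, 0.4 if joking
            w *= (0.8 if a else 0.4) if watson else (0.2 if a else 0.6)
        if gibbons is not None:  # Gibbons: misses the alarm 1%, false report 4%
            w *= (0.99 if a else 0.04) if gibbons else (0.01 if a else 0.96)
        if radio is not None:    # Radio reports an earthquake almost surely iff one happened
            w *= (0.9998 if e else 0.0002) if radio else (0.0002 if e else 0.9998)
        den += w
        num += w * b
    return num / den

pa = posterior_burglar()
pb = posterior_burglar(watson=True)
pc = posterior_burglar(watson=True, gibbons=True)
pd = posterior_burglar(watson=True, gibbons=True, radio=True)
assert pa < pb < pc   # every extra call adds evidence for a burglary
assert pd < pc        # the radio report explains the alarm away
```

With these values the enumeration gives roughly $0.0027$, $0.0053$, $0.112$ and $0.013$ for a--d, in line with the AILog output below.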
\begin{enumerate}[a)]
\item $P(\text{Burglary})=0.002737757092501968$
\item $P(\text{Burglary}|\text{Watson called})=0.005321803679438259$
\item $P(\text{Burglary}|\text{Watson called},\text{Gibbons called})=0.11180941544755249$
\item $P(\text{Burglary}|\text{Watson called},\text{Gibbons called},\text{Radio})=0.01179672476662423$
\end{enumerate}
\begin{minted}[fontsize=\footnotesize]{prolog}
ailog: predict burglar.
Answer: P(burglar|Obs)=0.002737757092501968.
[ok,more,explanations,worlds,help]: ok.

ailog: observe watson.
Answer: P(watson|Obs)=0.4012587986186947.
[ok,more,explanations,worlds,help]: ok.

ailog: predict burglar.
Answer: P(burglar|Obs)=[0.005321803679438259,0.005321953115441623].
[ok,more,explanations,worlds,help]: ok.

ailog: observe gibbons.
Answer: P(gibbons|Obs)=[0.04596053565368094,0.045962328885721306].
[ok,more,explanations,worlds,help]: ok.

ailog: predict burglar.
Answer: P(burglar|Obs)=[0.11180941544755249,0.1118516494624678].
[ok,more,explanations,worlds,help]: ok.

ailog: observe radio.
Answer: P(radio|Obs)=[0.02582105837443645,0.025915745316785182].
[ok,more,explanations,worlds,help]: ok.

ailog: predict burglar.
Answer: P(burglar|Obs)=[0.01179672476662423,0.015584580594335082].
[ok,more,explanations,worlds,help]: ok.
\end{minted}
\section{Comparison with manual calculation}
We let AILog calculate the probability of the \textit{Alarm} variable. Querying it gives the following answer:
\begin{minted}{prolog}
ailog: predict alarm.
Answer: P(alarm|Obs)=0.0031469965467367292.
[ok,more,explanations,worlds,help]: ok.
\end{minted}
Using the formula for causal independence with a logical OR,\\
$P(\text{Alarm}|C_1, C_2) = P(i_1|C_1)+P(i_2|C_2)(1-P(i_1|C_1))$,\\
we can calculate the probability of the \textit{Alarm} variable using variable elimination, weighting each branch by the prior of its cause. This results in the following answer:\\
$P(\text{alarm}) = P(i_1|\text{burglar})P(\text{burglar})+P(i_2|\text{earthquake})P(\text{earthquake})\,(1-P(i_1|\text{burglar})P(\text{burglar})) = 0.2\cdot 0.0027+0.95\cdot 0.0027\cdot(1-0.2\cdot 0.0027)= 0.00314699654673673$,\\
where $0.0027$ is the rounded display of the full-precision prior $0.002737757092501968$ used in the calculation.\\
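This hand calculation can be checked with a few lines of plain Python (a sketch; the prior $0.002737757\ldots$ is the per-day cause probability reported by AILog above, and we assume the same value for both causes):

```python
prior = 0.002737757092501968   # per-day probability of each cause (from AILog)
p_i1 = 0.2 * prior             # P(i_1): burglar branch of the Noisy-OR
p_i2 = 0.95 * prior            # P(i_2): earthquake branch of the Noisy-OR

# Noisy-OR combination: the alarm fires if either independent branch fires
p_alarm = p_i1 + p_i2 * (1 - p_i1)
assert abs(p_alarm - 0.0031469965467367292) < 1e-9
```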
Comparing the output of AILog with the manual variable elimination, we see that the results agree.
\section{Burglary problem with extended information}
Extending the problem with multiple houses, dependencies and a cold night, we get the following AILog representation:
\inputminted[linenos,fontsize=\footnotesize]{prolog}{./src/burglary.ail}
When thinking about the dependencies and successful burglaries, we found that there are only four combinations of burglars that lead to a successful burglary. In the model we abstracted from the dependency layer and implemented the model in three layers. The first layer is the initial probability of every burglar. The second layer is the possible groups that lead to a successful burglary. The chance that Holmes' house is hit forms the third layer. This results in the following probability of a burglary in Holmes' house:
$$P(\text{successful burglary})\cdot\left(
P(\text{first house Holmes'})+
P(\text{second house Holmes'})+
P(\text{third house Holmes'})\right)=$$
$$0.655976676\cdot\left(
\frac{1}{10000}+
\frac{9999}{10000}\cdot\frac{1}{9999}+
\frac{9999}{10000}\cdot\frac{9998}{9999}\cdot\frac{1}{9998}\right)
\approx 0.000197$$
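The house terms can be checked mechanically; a small plain-Python sketch, using the value $0.655976676$ from the group layer:

```python
from fractions import Fraction

# Layer 2: combined probability of a successful burglary (value from the text).
p_success = 0.655976676

# Layer 3: Holmes' house is the first, second or third house to be hit.
p_first  = Fraction(1, 10000)
p_second = Fraction(9999, 10000) * Fraction(1, 9999)
p_third  = Fraction(9999, 10000) * Fraction(9998, 9999) * Fraction(1, 9998)
# Each term telescopes to exactly 1/10000.
assert p_first == p_second == p_third == Fraction(1, 10000)

p_holmes = p_success * float(p_first + p_second + p_third)
assert abs(p_holmes - 0.000196793) < 1e-8
```

Each house term reduces to exactly $\nicefrac{1}{10000}$, so the total is $0.655976676\cdot\nicefrac{3}{10000}\approx 0.000197$.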
\section{Bayesian networks}
A Bayesian network representation of the burglary problem with a multitude of houses and burglars is possible, but it would be very big and tedious to construct, because all the constraints on the burglars must be incorporated in the network. The network would look something like Figure~\ref{bnnetworkhouses}.
\begin{tabular}{|l|l|}
T & $\nicefrac{5}{7}$\\
F & $\nicefrac{2}{7}$\\
\begin{tabular}{|l|l|}
T & $\nicefrac{5}{7}$\\
F & $\nicefrac{2}{7}$\\
\begin{tabular}{|l|l|}
T & $\nicefrac{5}{7}$\\
F & $\nicefrac{2}{7}$\\
\begin{tabular}{|l|l|}
T & $\nicefrac{5}{7}$\\
F & $\nicefrac{2}{7}$\\
\begin{tabular}{|llll|ll|}
Joe & William & Jack & Averall & T & F\\
F& F& F& F & $0$ & $1$\\
F& F& F& T & $0$ & $1$\\
F& F& T& F & $0$ & $1$\\
F& F& T& T & $0$ & $1$\\
F& T& F& F & $0$ & $1$\\
F& T& F& T & $0$ & $1$\\
F& T& T& F & $0$ & $1$\\
F& T& T& T & $0$ & $1$\\
T& F& F& F & $0$ & $1$\\
T& F& F& T & $0$ & $1$\\
T& F& T& F & $1$ & $0$\\
T& F& T& T & $0$ & $1$\\
T& T& F& F & $1$ & $0$\\
T& T& F& T & $0$ & $1$\\
T& T& T& F & $1$ & $0$\\
T& T& T& T & $1$ & $0$\\
\begin{tabular}{|lll|}
T & $0.000153$ & $0.999847$\\
\begin{figure}[h]
\centering
\includegraphics[scale=0.5]{d2.eps}
\caption{Bayesian network of burglars and houses}
\label{bnnetworkhouses}
\end{figure}