From: Mart Lubbers
Date: Tue, 3 Feb 2015 22:13:36 +0000 (+0100)
Subject: ass11ding
X-Git-Url: https://git.martlubbers.net/?a=commitdiff_plain;h=0c11b4aa656c14fb10cd6d9e448033ff34e327bf;p=ker2014-2.git

ass11ding
---

diff --git a/report/ass2-1.tex b/report/ass2-1.tex
index 6a07cda..fff6138 100644
--- a/report/ass2-1.tex
+++ b/report/ass2-1.tex
@@ -186,11 +186,7 @@ Answer: P(burglar|Obs)=[0.01179672476662423,0.015584580594335082].
 \end{minted}
 \end{listing}
 
-ToDO write down the most probable explanation for the observed evidence
-
 \section{Comparison with manual calculation}
-ToDO: english.
-When we let ailog calculate the probability of alarm. %What is Obs here??
 Querying the \textit{Alarm} variable gives the following answer:
 \begin{minted}{prolog}
 ailog: predict alarm.
@@ -206,10 +202,13 @@ results in the following answer:\\
 $P(Alarm|burglar, earthquake) =
 P(i_1|burglar)+P(i_2|earthquake)(1-P(i_1|burglar)) =
 0.2*0.0027+0.95*0.0027*(1-0.2*0.0027)=0.00314699654673673$ \\
-TODOOOOOOOOOOO %I do not know whether i_1 and i_2 still need to be replaced
-% by something else.
+
 When you compare the output of AILog and of the variable elimination, you see
-that they are exactly the same.
+that they are exactly the same. The method by which AILog calculates the
+probability is also nearly identical to ours, mainly because we only used
+techniques that are available in AILog as well. Had we solved the same
+task with a Bayesian network and Bayes' rule, the method would have been
+different.
 
 \newpage
 \section{Burglary problem with extended information}
@@ -334,5 +333,3 @@ A Bayesian network representation of the extended story is possible, but could
 F & $0$ & $1$\\
 \hline
 \end{tabular}
-
-
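
The step compressed in the equation above is the noisy-or combination of two
independent causes. A short derivation, writing $p_1$ and $p_2$ for the
probabilities of the two triggering events $i_1$ and $i_2$ and assuming their
independence, as the model does:

\begin{align*}
P(i_1 \vee i_2) &= 1 - P(\neg i_1)\,P(\neg i_2)\\
                &= 1 - (1 - p_1)(1 - p_2)\\
                &= p_1 + p_2(1 - p_1),
\end{align*}

which is exactly the form $P(i_1|burglar)+P(i_2|earthquake)(1-P(i_1|burglar))$
used in the second hunk.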
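
For reference, a minimal AILog sketch of the two-cause alarm model that the
calculation assumes. The noise atoms f_burglar and f_earthquake are
hypothetical names, and the priors are read off the 0.2*0.0027 and
0.95*0.0027 terms above rather than taken from the report's source:

\begin{minted}{prolog}
% Hypothetical AILog encoding of the noisy-or alarm model.
% Atom names and parameters are illustrative, not from the report.
prob burglar: 0.0027.     % assumed prior for a burglary
prob earthquake: 0.0027.  % assumed prior for an earthquake
prob f_burglar: 0.2.      % a burglar triggers the alarm with probability 0.2
prob f_earthquake: 0.95.  % an earthquake triggers it with probability 0.95
alarm <- burglar & f_burglar.
alarm <- earthquake & f_earthquake.
\end{minted}

Loading this knowledge base and running ailog: predict alarm. should then
reproduce the noisy-or value computed by hand, since AILog sums the two
explanations burglar & f_burglar and earthquake & f_earthquake and corrects
for their overlap.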