-In order to be allow learnlib to learn the TCP model it was necessary to have a deterministic model.\r
-We accomplished this by modifying the adapter so it can reach a \emph{ERROR} or \emph{CLOSED} state. In these states all inputs are discarded and a default output is returned.\r
-In the case of a state where an input results in a non-deterministic output we jump to the \emph{ERROR} state for additional this given input. When the connection is successfully closed using a \emph{FIN} packet we move the adapter to the \emph{CLOSED} state.\r
+In order to allow learnlib to learn the TCP model it was necessary to have a
+deterministic model. We accomplished this by modifying the adapter so it can
+reach an \texttt{ERROR} or \texttt{CLOSED} state. In these states all inputs
+are discarded and a default output is returned. When an input results in a
+non-deterministic output in some state, we jump to the \texttt{ERROR} state
+for that input. When the connection is successfully closed using a
+\texttt{FIN} packet we move the adapter to the \texttt{CLOSED} state.
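+
+The sketch below illustrates this determinisation idea in Java; the class,
+method, and output names are illustrative assumptions and not taken from our
+actual adapter code.
+
+\begin{verbatim}
+// Determinising wrapper around the adapter (illustrative sketch).
+// A non-deterministic response traps us in ERROR; a successful close on
+// FIN moves us to CLOSED. Both sink states discard every further input
+// and answer with a fixed default output.
+enum AdapterState { ACTIVE, ERROR, CLOSED }
+
+class DeterministicAdapter {
+    private AdapterState state = AdapterState.ACTIVE;
+    private static final String DEFAULT_OUTPUT = "TIMEOUT"; // assumed default
+
+    String step(String input) {
+        if (state != AdapterState.ACTIVE) {
+            return DEFAULT_OUTPUT;          // sink states swallow all inputs
+        }
+        String output = sendToImplementation(input);
+        if (isNonDeterministic(input, output)) {
+            state = AdapterState.ERROR;     // trap this input from now on
+        } else if (input.equals("FIN") && output.contains("ACK")) {
+            state = AdapterState.CLOSED;    // connection closed cleanly
+        }
+        return output;
+    }
+
+    // Placeholders for the real network interaction and for the bookkeeping
+    // that detects a response differing from earlier runs.
+    private String sendToImplementation(String input) { return "ACK"; }
+    private boolean isNonDeterministic(String input, String output) {
+        return false;
+    }
+}
+\end{verbatim}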
\r
-We divided the input alphabet into three sets, this way we can control the size of the model learned by learnlib.\r
+We divided the input alphabet into three sets; this way we can control the
+size of the model learned by learnlib.
\r
-\begin{longtable}{|c|l|}\r
- \caption{Different input alphabets used during learning.} \\\hline\r
- Alphabet & Inputs \\\hline \hline\r
- small & SYN, ACK \\\hline\r
- partial & SYN, ACK, DATA \\\hline\r
- full & SYN, ACK, DATA, RST, FIN \\\hline\r
-\end{longtable}\r
+\begin{table}[H]\r
+ \begin{tabular}{cl}\r
+ \toprule\r
+ Alphabet & Inputs \\\r
+ \midrule\r
+ small & \texttt{SYN}, \texttt{ACK} \\\r
+ partial & \texttt{SYN}, \texttt{ACK}, \texttt{DATA} \\\r
+ full & \texttt{SYN}, \texttt{ACK}, \texttt{DATA}, \texttt{RST},\r
+ \texttt{FIN} \\\r
+ \bottomrule\r
+ \end{tabular}\r
+ \caption{Different input alphabets used during learning.}\r
+\end{table}\r
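+
+For illustration, these alphabets can be represented as plain lists of input
+symbols; the sketch below uses hypothetical names and is only a sketch, not
+our actual configuration.
+
+\begin{verbatim}
+import java.util.List;
+
+// The three input alphabets as plain symbol lists (illustrative only).
+class InputAlphabets {
+    static final List<String> SMALL   = List.of("SYN", "ACK");
+    static final List<String> PARTIAL = List.of("SYN", "ACK", "DATA");
+    static final List<String> FULL    =
+        List.of("SYN", "ACK", "DATA", "RST", "FIN");
+}
+\end{verbatim}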
\r
-Just as in our previous assignment the \emph{DATA} packet is actually a \emph{ACK} with an user data payload and the \emph{push} flag set. \r
-These input alphabets will influence the size of the model produced. \emph{small} will result in a 2 state model, \emph{partial} will be the full model without the \emph{CLOSED} state and \emph{full} should result in the full model as used in the previous assignment.\r
+Just as in our previous assignment, the \texttt{DATA} packet is actually an
+\texttt{ACK} with a user data payload and the \emph{push} flag set. These
+input alphabets influence the size of the model produced: \emph{small} will
+result in a 2-state model, \emph{partial} in the full model without the
+\texttt{CLOSED} state, and \emph{full} should result in the full model as
+used in the previous assignment.
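+
+Concretely, the only difference between \texttt{DATA} and a plain
+\texttt{ACK} is the payload and the push bit in the TCP flags field. The
+snippet below shows the standard flag values involved; it is an illustrative
+sketch, not the packet construction code of our adapter.
+
+\begin{verbatim}
+class TcpFlags {
+    // Standard TCP flag bits (RFC 793).
+    static final int FIN = 0x01;
+    static final int SYN = 0x02;
+    static final int RST = 0x04;
+    static final int PSH = 0x08;
+    static final int ACK = 0x10;
+
+    public static void main(String[] args) {
+        int ackFlags  = ACK;        // plain ACK input
+        int dataFlags = ACK | PSH;  // DATA: ACK carrying a payload, push set
+        System.out.printf("ACK=0x%02x DATA=0x%02x%n", ackFlags, dataFlags);
+        // prints ACK=0x10 DATA=0x18
+    }
+}
+\end{verbatim}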
\r
\paragraph{Model learned with small input alphabet}\r
-\includegraphics{model.small.LStar.rand.eps}\r
-\r
+%\includegraphics{model.small.LStar.rand.eps}\r
\r
\paragraph{Model learned with partial input alphabet}\r
-\includegraphics{model.partial.LStar.rand.eps}\r
+%\includegraphics{model.partial.LStar.rand.eps}\r
\r
\paragraph{Model learned with full input alphabet}\r
-\includegraphics{model.full.LStar.rand.eps}
\ No newline at end of file
+%\includegraphics{model.full.LStar.rand.eps}\r
-The table below contains some statistics about all the different parameter configurations we ran learnlib with.\r
-All except \emph{RivestSchapire} using the Random test method result in the correct model being learned. \r
-When \emph{WMethod} is selected as the testing method \emph{RivestSchapire} is also able to learn the correct model.\r
-\emph{WMethod} does however increase the time needed to learn the model significantly, when a different learner is used there is no reason not to use the Random testing method.\r
-\r
-\begin{longtable}{| l | l | l | c | c | c |}\r
- \caption{Learning parameters and resulting model size.} \\\hline\r
- Alphabet & Method & Test method & States & Time \\\hline \hline\r
- small & LStar & Random & 2 & 12 sec \\\hline\r
- small & TTT & Random & 2 & 5 sec \\\hline\r
- small & RivestSchapire & Random & 2 & 6 sec \\\hline\r
- small & KearnsVazirani & Random & 2 & 5 sec \\\hline\r
- small & LStar & WMethod & 2 & 35 sec \\\hline\r
- small & TTT & WMethod & 2 & 32 sec \\\hline\r
- small & RivestSchapire & WMethod & 2 & 33 sec \\\hline\r
- small & KearnsVazirani & WMethod & 2 & 33 sec \\\hline\r
- \r
- partial & LStar & Random & 4 & 18 sec \\\hline\r
- partial & TTT & Random & 4 & 16 sec \\\hline\r
- partial & RivestSchapire & Random & 4 & 13 sec \\\hline\r
- partial & KearnsVazirani & Random & 4 & 22 sec \\\hline\r
- partial & LStar & WMethod & 4 & 384 sec \\\hline\r
- partial & TTT & WMethod & 4 & 390 sec \\\hline\r
- partial & RivestSchapire & WMethod & 4 & 384 sec \\\hline\r
- partial & KearnsVazirani & WMethod & 4 & 383 sec \\\hline\r
- \r
- full & LStar & Random & 5 & 44 sec \\\hline\r
- full & TTT & Random & 5 & 25 sec \\\hline\r
- full & RivestSchapire & Random & 4 & 12 sec \\\hline\r
- full & KearnsVazirani & Random & 5 & 19 sec \\\hline\r
- full & LStar & WMethod & 5 & 2666 sec \\\hline\r
- full & TTT & WMethod & 5 & 2632 sec \\\hline\r
- full & RivestSchapire & WMethod & 5 & 2638 sec \\\hline\r
- full & KearnsVazirani & WMethod & - & - \\\hline\r
-\end{longtable}\r
+The table below contains some statistics about all the different parameter
+configurations we ran learnlib with. All configurations except
+\emph{RivestSchapire} with the Random test method result in the correct model
+being learned. When \emph{WMethod} is selected as the testing method,
+\emph{RivestSchapire} is also able to learn the correct model. \emph{WMethod}
+does, however, increase the time needed to learn the model significantly,
+since it generates a test suite that is exhaustive up to an assumed bound on
+the number of states; when a different learner is used there is no reason not
+to use the Random testing method.
\r
+\begin{table}[H]\r
+  \begin{tabular}{lllcc}
+ \toprule\r
+ Alphabet & Method & Test method & States & Time \\\r
+ \midrule\r
+ small & LStar & Random & 2 & 12 sec \\\r
+ small & TTT & Random & 2 & 5 sec \\\r
+ small & RivestSchapire & Random & 2 & 6 sec \\\r
+ small & KearnsVazirani & Random & 2 & 5 sec \\\r
+ small & LStar & WMethod & 2 & 35 sec \\\r
+ small & TTT & WMethod & 2 & 32 sec \\\r
+ small & RivestSchapire & WMethod & 2 & 33 sec \\\r
+ small & KearnsVazirani & WMethod & 2 & 33 sec \\\r
+    \addlinespace
+ partial & LStar & Random & 4 & 18 sec \\\r
+ partial & TTT & Random & 4 & 16 sec \\\r
+ partial & RivestSchapire & Random & 4 & 13 sec \\\r
+ partial & KearnsVazirani & Random & 4 & 22 sec \\\r
+ partial & LStar & WMethod & 4 & 384 sec \\\r
+ partial & TTT & WMethod & 4 & 390 sec \\\r
+ partial & RivestSchapire & WMethod & 4 & 384 sec \\\r
+ partial & KearnsVazirani & WMethod & 4 & 383 sec \\\r
+    \addlinespace
+ full & LStar & Random & 5 & 44 sec \\\r
+ full & TTT & Random & 5 & 25 sec \\\r
+ full & RivestSchapire & Random & 4 & 12 sec \\\r
+ full & KearnsVazirani & Random & 5 & 19 sec \\\r
+ full & LStar & WMethod & 5 & 2666 sec \\\r
+ full & TTT & WMethod & 5 & 2632 sec \\\r
+ full & RivestSchapire & WMethod & 5 & 2638 sec \\\r
+ full & KearnsVazirani & WMethod & - & - \\\r
+ \bottomrule\r
+ \end{tabular}\r
+ \caption{Learning parameters and resulting model size.}\r
+\end{table}\r