\section{Application overview}
The backend consists of several processing steps before a crawler specification
is ready.

\begin{figure}[H]
	\centering
	\includegraphics[width=\linewidth]{backend.eps}
	\caption{Backend overview}
	\label{dawg1}
\end{figure}

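As a rough illustration of such a staged pipeline, the stages could be chained as below. This is a minimal sketch: the stage names (\texttt{parse\_markers}, \texttt{build\_patterns}, \texttt{emit\_specification}) and the marker format are assumptions for illustration, not the project's actual modules.

```python
# Hypothetical sketch of a staged backend pipeline; stage names and the
# marker format are illustrative assumptions, not the actual modules.
def parse_markers(raw_html: str) -> list[str]:
    # Split the submitted HTML on marker comments (assumed format).
    return [part for part in raw_html.split("<!--marker-->") if part]

def build_patterns(rows: list[str]) -> list[str]:
    # Derive a simple extraction pattern per table row (placeholder logic).
    return [f"pattern:{len(row)}" for row in rows]

def emit_specification(patterns: list[str]) -> dict:
    # Combine the derived patterns into a crawler specification.
    return {"version": 1, "patterns": patterns}

def process(raw_html: str) -> dict:
    # The processing steps run strictly in sequence, one after the other.
    return emit_specification(build_patterns(parse_markers(raw_html)))

spec = process("<tr>a</tr><!--marker--><tr>bb</tr>")
print(spec)  # {'version': 1, 'patterns': ['pattern:10', 'pattern:11']}
```

The point of the sketch is only that each stage consumes the previous stage's output, so the specification is ready only after every step has run.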
\section{Internals of the crawler generation module}
Data marked by the user is in principle just raw HTML that contains a
table with one row per RSS feed entry. Within the HTML, several
markers are placed to simplify parsing and remove the need
for a full HTML parser. When the user presses submit, an HTTP POST request is
prepared that sends all the gathered data to the backend for processing. The
sending does not happen asynchronously, so the user has to wait for the data to