\subsubsection{Add new crawler}
\label{addcrawler}
-TODOOO
+The addition, or generation, of crawlers is the key feature of the program and
+the smartest part of the whole system, as it includes the graph optimization
+algorithm that recognizes user-specified patterns in the data. The user first
+assigns a name and an RSS feed in the input boxes; when the user presses
+submit, the RSS feed is downloaded and prepared for display in the interactive
+editor. The editor consists of two components. The topmost component lets the
+user enter several fields of data about the venue, such as the address, the
+crawling frequency and the website. Below it is a table containing the
+processed RSS feed entries and a row of buttons that allow the user to mark
+parts of the entries as belonging to certain categories. The user selects a
+piece of an entry and then presses the appropriate category button, and the
+text becomes highlighted. By doing this for several entries the user gives
+the program enough information to crawl the feed, as shown in
+Figure~\ref{addcrawl}.
+
+\begin{figure}
+ \caption{Example of a pattern in the add crawler component}
+ \label{addcrawl}
+\end{figure}
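To make the marking step concrete, the following is a minimal sketch of how user-highlighted spans in a feed entry could be recorded as labeled examples for the pattern learner. All class, method and category names here are assumptions for illustration, not the program's actual API.

```python
# Hypothetical sketch: storing highlighted spans of an RSS entry as
# labeled examples. Names and categories are assumptions, not the
# project's real interface.
from dataclasses import dataclass, field


@dataclass
class Annotation:
    category: str   # e.g. "date" or "time", chosen via a category button
    start: int      # character offset where the highlight begins
    end: int        # character offset where the highlight ends


@dataclass
class EntryAnnotations:
    text: str
    spans: list = field(default_factory=list)

    def mark(self, category, start, end):
        """Record a highlighted span, mirroring a category-button press."""
        self.spans.append(Annotation(category, start, end))

    def labeled(self):
        """Return (category, substring) pairs for the pattern learner."""
        return [(a.category, self.text[a.start:a.end]) for a in self.spans]


entry = EntryAnnotations("2014-03-01 Concert: The Band, 20:00")
entry.mark("date", 0, 10)
entry.mark("time", 30, 35)
print(entry.labeled())  # [('date', '2014-03-01'), ('time', '20:00')]
```

Repeating this over several entries yields the labeled substrings from which a pattern-recognition step could generalize.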
+
\subsubsection{Test crawler}
+The test crawler component is a simple, non-interactive component that lets
+the user verify that a crawler functions properly without having to access
+the database or the command-line utilities. The user selects a crawler from a
+dropdown menu; when submit is pressed, the backend generates a results page
+showing a short log of the crawl, a summary of the results and, most
+importantly, the results themselves. In this way the user can see at a glance
+whether the crawler functions properly: humans are quick to spot patterns, so
+this visual check is fast. Because the log of the crawl operation is shown,
+the page can also provide diagnostic information about the backend's crawling
+system. The logging is fairly detailed and includes any exceptions that were
+raised, so it is also useful for the developers when diagnosing problems.
\subsection{Backend}
\subsubsection{Program description}
\subsubsection{Libraries}
The libraries are called by the main program and take care of all the hard
-work. Basically the libaries are a group of python scripts that for example
+work. Essentially, the libraries are a group of Python scripts that, for example,
minimize the graphs, transform the user data into machine readable data, export
the crawled data to XML and much more.
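As one concrete example of such a library routine, the snippet below sketches how crawled events could be serialized to XML with the standard library. The element and field names are assumptions for illustration, not the project's actual export schema.

```python
# Sketch of an XML-export helper such as the libraries might provide.
# The <events>/<event> element names are assumed, not the real schema.
import xml.etree.ElementTree as ET


def events_to_xml(events):
    """Serialize a list of event dicts to an XML string."""
    root = ET.Element("events")
    for event in events:
        node = ET.SubElement(root, "event")
        for key, value in event.items():
            ET.SubElement(node, key).text = value
    return ET.tostring(root, encoding="unicode")


doc = events_to_xml([{"title": "Concert", "date": "2014-03-01"}])
print(doc)
```

Using `xml.etree.ElementTree` keeps the export free of third-party dependencies and guarantees well-formed output, since escaping is handled by the serializer.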