From defc0b8b1d6cab9643d5f7ee78b76228f730e3e6 Mon Sep 17 00:00:00 2001
From: Mart Lubbers
Date: Tue, 9 Dec 2014 13:03:22 +0100
Subject: [PATCH] small frontend update

---
 thesis2/2.requirementsanddesign.tex | 37 +++++++++++++++++++++++++++++++++++--
 1 file changed, 35 insertions(+), 2 deletions(-)

diff --git a/thesis2/2.requirementsanddesign.tex b/thesis2/2.requirementsanddesign.tex
index 57af052..94a7c28 100644
--- a/thesis2/2.requirementsanddesign.tex
+++ b/thesis2/2.requirementsanddesign.tex
@@ -137,8 +137,41 @@
 thus be adapted to change the crawler for possible source changes for example.
 
 \subsubsection{Add new crawler}
 \label{addcrawler}
-TODOOO
+The addition or generation of crawlers is the key feature of the program
+and the smartest part of the whole system, as it includes the graph
+optimization algorithm that recognizes user-specified patterns in the
+data. The user assigns a name to an RSS feed in the input boxes; when
+submit is pressed, the RSS feed is downloaded and prepared to be shown
+in the interactive editor. The editor consists of two components. The
+topmost component allows the user to enter several fields of data
+concerning the venue, such as the address, the crawling frequency and
+the website. Below it is a table containing the processed RSS feed
+entries and a row of buttons that allows the user to mark parts of the
+entries as belonging to certain categories. The user selects a piece of
+an entry and presses the appropriate category button; the selected text
+becomes highlighted. By doing this for several entries the user gives
+the program enough information to crawl the feed, as shown in
+Figure~\ref{addcrawl}.
+
+\begin{figure}
+	\caption{Example of a pattern in the add crawler component}
+	\label{addcrawl}
+\end{figure}
+
 \subsubsection{Test crawler}
+The test crawler component is a simple, non-interactive component that
+allows the user to verify that a crawler functions properly without
+needing to access the database or the command line utilities. The user
+selects the crawler via a dropdown menu and, when submit is pressed, the
+backend generates a results page showing a small log of the crawl, a
+summary and, most importantly, the results themselves, so the user can
+see at a glance whether the crawler functions properly. Humans are very
+good at detecting patterns, which makes this visual check fast. Because
+the log of the crawl operation is shown, the page can also be used to
+obtain diagnostic information about the backend's crawling system. The
+logging is quite detailed and also shows possible exceptions, so it is
+useful for developers to diagnose problems.
+
 \subsection{Backend}
 \subsubsection{Program description}
@@ -164,7 +197,7 @@
 function without intervention of a programmer that needs to adapt the source.
 
 \subsubsection{Libraries}
 The libraries are called by the main program and take care of all the hard
-work. Basically the libaries are a group of python scripts that for example
+work. Essentially, the libraries are a group of Python scripts that, for example,
 minimize the graphs, transform the user data into machine readable data,
 export the crawled data to XML and much more.
-- 
2.20.1