%
% Written by dsprenkels
%
% This document describes our security analysis of the (?) CMS according to the OWASP ASVS (v3.0.1), as well as an analysis of Fortify's results compared to our findings \& the OWASP ASVS categories.
%
% Outline: first our analysis, then a summary of Fortify's analysis and how it compares to ours etc., then some reflection.

% E.g. did you split the work by files in the code, or by category of security requirements? Did you double-check important findings? Or did you use several pairs of eyeballs on the same code/security requirement, in the hope that more eyeballs spot more problems? (How) did you integrate using the static code analysis tools into this process? Did you use other tools and methods?
% Have you tried to run the application? (If so, was this useful, and did you find that running the application was helpful to then review the code, understanding its functionality better? But you might want to discuss this in the Reflection section.)

% Running the application
Each of us initially set up a running instance of the \CMS{} to become familiar
with it. This was easy, because one of us had written a \code{Dockerfile} for
the others to use, which made installing and running the application trivial.
Running the application helped us understand the outline and components of the
application, and it also pointed us to some spots where easy-to-find
vulnerabilities could be expected. However, reading the source code was more
effective, especially when verifying that the \CMS{} \emph{passes} a
requirement. Buggy code is easy to find. Bugless code is not.

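To give an idea of such a setup, the following is a minimal sketch of what a
\code{Dockerfile} for a PHP-based web application could look like; the base
image, the installed extension and the paths are assumptions made purely for
this illustration and do not necessarily match the actual file in our
repository.
\begin{verbatim}
# Illustrative sketch only; the base image, extension and paths
# below are assumptions, not those of our actual Dockerfile.
FROM php:7.0-apache
# Copy the CMS source into the Apache web root
COPY . /var/www/html/
# Install a database driver that the CMS may need
RUN docker-php-ext-install pdo_mysql
EXPOSE 80
\end{verbatim}
With a file of this kind, a single \code{docker build} and \code{docker run}
suffice to get a local instance running, which is what made the setup step
trivial for everyone.
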
We chose to split the work by category of security requirements in the OWASP
Application Security Verification Standard (ASVS). We set ourselves the goal of
performing a sound level 2 audit of the software.

% Initial approach
We were set up quickly and each started on our own part of the audit by hand.
For each OWASP ASVS item that is specific to a certain mechanism (like login or
input validation), we took the source code of the \CMS{} and followed the
control flow to see whether the application satisfies the security requirement.
For more general requirements, we could just look at the code that is
responsible for that requirement (like the \code{Response} class in the case of
\HTTP{} security). When we found that a requirement was not satisfied, we
elaborated briefly and moved on.

This went well, because with five people the individual workload is just not
that big. Furthermore, finding vulnerabilities is in many cases a lot easier
than verifying security. This sped up the auditing process, because the \CMS{}
turned out not to satisfy the ASVS in most cases.

% Use of Fortify
Because we were on track early, most of the audit was already done by the time
we were introduced to the Fortify tool. Nonetheless, we used it to verify our
own verdicts. Some of us installed and ran the Fortify tool themselves; these
students exported a PDF report, which the others could then use.

% Double-checking process
When we finished the report, each of us reread the others' parts to check
whether anything had been missed or reported incorrectly. This check may not
have been exhaustive, but because in the end five pairs of eyes have read all
verdicts, we trust that they are sufficiently checked.