From 199d6822a8770cc557ae577f05f39816f6c0e6bb Mon Sep 17 00:00:00 2001
From: Mart Lubbers
Date: Wed, 28 Dec 2016 16:48:43 +0100
Subject: [PATCH] up

---
 assignment2/Makefile |  2 +-
 assignment2/a.tex    | 86 +++++++++++++++++++++++++++++++++-----------
 2 files changed, 66 insertions(+), 22 deletions(-)

diff --git a/assignment2/Makefile b/assignment2/Makefile
index 9ddd8ec..08b2acf 100644
--- a/assignment2/Makefile
+++ b/assignment2/Makefile
@@ -17,7 +17,7 @@ all: $(DOC).pdf
 	$(LATEX) $(LATEXFLAGS) $<
 	grep -q '^\\bibdata{' $(basename $<).aux && $(BIBTEX) $(basename $<) || true
 	$(LATEX) $(LATEXFLAGS) $< | tee $(basename $<).mlog
-	grep -qF 'Please rerun LaTeX.' $(basename $<).mlog &&\
+	grep -iqF 'rerun' $(basename $<).mlog &&\
 		$(LATEX) $(LATEXFLAGS) $< || true
 
 clean:
diff --git a/assignment2/a.tex b/assignment2/a.tex
index fcfac6e..28b3e31 100644
--- a/assignment2/a.tex
+++ b/assignment2/a.tex
@@ -2,7 +2,7 @@
 \begin{document}
 \maketitleru[authorstext={Author:},
 	course={Philosophy and Ethics of Computer and Information Sciences}]
-\section{Grey Hat Cracking Should Be Legalized}
+\section{Grey Hat Cracking Should Be Legalized}\label{sec:grey}
 In the digital world the notion of property is significantly different than
 the notion of property in the real world. Property in the digital world can be
 interchanged, duplicated and changed without physical intervening. This means
@@ -34,28 +34,72 @@
 might view the case a grey hat because he informed the administrators and did
 not touch the data. Especially while the current view on hacking ever so much
 is changing it very quickly becomes a slippery slope.
 
-The best solution in my opinion would be to tolerate, not actively chasing them,
-grey hat hackers and keep testing the boundaries using the court of law. In
-this way true grey hat hackers will possibly be bothered and not punished and
-true black hat hackers will be punished true the court of law. Society will
-change and with it the view on hacking. Police forces will have to acquire
-trained IT professionals to keep up the pace with the black hat hackers.
+The best solution at this point in time is, in my opinion, to tolerate grey
+hat hackers, not actively chasing them, and to keep testing the boundaries in
+the court of law. In this way true grey hat hackers will possibly be bothered
+but not punished, and true black hat hackers will be punished through the
+court of law. Society will change and with it the view on hacking. Police
+forces will have to acquire trained IT professionals to keep pace with the
+black hat hackers.
 
 \section{Web Scrapers and Robot Denial Files}
+The current state of practice for \verb|robots.txt| files is that they are
+read, parsed and acted upon. However, accessing data that is explicitly denied
+by the \verb|robots.txt| file is not illegal in the current state of the
+internet. When information is placed on the web it is known to be visible to
+everyone. When you do not want your information to be visible you can place it
+behind portals in such a way that only authorized visitors, and thus no
+robots, can see the content. Regulating content access and enforcing such
+regulations requires a fundamental change in the way we view the internet: the
+internet could no longer be the anarchistic place it is, and would require
+laws and control over the content.
+
+The current state of the internet does not require such an extension to the
+\verb|robots.txt| protocol. When you want information not to be seen by others
+you should not put it online, and if you do want to put it online you can hide
+it behind a digital wall. Therefore I think the change would be bad and would
+not improve the internet as it is. When for example stores think they are
+scraped in a misleading way (prices shown without taxes, postage, etc.) they
+can simply improve their scraping interface. Web store aggregators just want
+to sell as much as possible and will probably be willing to help improve the
+websites, for example through dedicated \emph{API}s.
+
+Changing the internet to facilitate this behaviour has too many bad
+consequences. All traffic would have to be monitored to determine whether it
+comes from a robot or a normal user. The openness and freedom of the internet
+would come to an end, which is not desirable.
 
 \section{An Immune System for the Internet}
-The idea of benevolent viruses patching security holes is a outright terrible
-idea because of a set of reasons.
-
-Firstly this breaches the privacy of the user. Analogous, a burglar that breaks
-into a house to install better locks would be considered wrong as well. The
-system is property of the user and the internet has been an anarchistic place
-that had no rules and regulations. If the user chooses not to protect their
-system then that is their own loss. However, when the system also attacks other
-systems it should be fixed. This is not always the case and depending on the
-view on the internet of the user breaking into the system is worse then a
-possible attack on the system. This case is a living example of the security
-versus privacy duality. Unprotected users are unwanted invaded by, apparently
-benevolent, worms. Who can proof these worms are benevolent, who develops them,
-those questions become very relevant.
+This case is very similar to the first case discussed in
+Section~\ref{sec:grey}. However, the nature of this discussion point is even
+more extreme. Patching systems by breaking in can be seen as grey hat hacking:
+the methods used are questionable but the consequences are positive. However,
+this matter differs from the previous ones in the sense that it proposes not
+merely to allow such behaviour, but to encourage breaching systems in order to
+patch them.
+
+From a purely consequentialist point of view there is nothing wrong with it.
+The victim's machine will be patched and no harm is done besides walking
+through an already open door. However, from a deontological point of view this
+is wrong because you have to resort to inherently bad techniques, such as
+breaking into systems and changing them without authorization.
+
+The same problem as in the similar discussion point arises, namely the thin
+line between good and bad. Opinions on what constitutes a well-patched system
+might change and there might even be a schism in the population on certain
+patches. Take for example the \emph{phone-home} patch introduced in
+\emph{Windows 7} that adds \emph{phone-home} functionality similar to that
+present in its successor, \emph{Windows 10}. This functionality allows the
+system to call back to the servers of \emph{Microsoft} to send information
+such as statistics. Some users might deliberately ignore this patch because it
+infringes on their privacy. \emph{Microsoft} on the other hand will probably
+state that it is necessary to guarantee a good user experience and possibly
+even safety. Obviously this example is contrived and artificial but similar
+problems may arise.
+
+Therefore the solution to this problem is to disallow placing ``good'' malware
+on systems. Following the same principles as the previous case it might be
+legitimate to notify users in some way of their unpatched systems. But again,
+this is a very thin line and has to be tested continuously in the court of
+law.
 \end{document}
-- 
2.20.1
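The claim in the Web Scrapers section that robots.txt files are "read, parsed
and acted upon" by well-behaved crawlers can be sketched with Python's
standard-library urllib.robotparser. The robots.txt content and the
"MyScraper" agent name below are hypothetical examples; note that nothing but
convention obliges a scraper to honour the result of can_fetch().

```python
# Minimal sketch of how a well-behaved scraper consults robots.txt before
# fetching a page, using Python's standard-library parser. The robots.txt
# content below is a hypothetical example.
from urllib.robotparser import RobotFileParser

robots_txt = """\
User-agent: *
Disallow: /private/
"""

parser = RobotFileParser()
parser.parse(robots_txt.splitlines())

# Compliance is purely voluntary: can_fetch() only reports what the file
# denies; nothing technically prevents fetching the path anyway, which is
# exactly the essay's point about robots.txt not being legally binding.
print(parser.can_fetch("MyScraper", "/private/data.html"))  # False
print(parser.can_fetch("MyScraper", "/index.html"))         # True
```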