In this case \haskelllhstexinline{newtype}s are used instead of regular \haskelllhstexinline{data} declarations.
A \haskelllhstexinline{newtype} is a special data type with a single constructor containing a single value, to which it is isomorphic.
It allows the programmer to define class instances separate from the instances of the isomorphic type, without any overhead.
    During compilation the constructor is completely removed~\citep[\citesection{4.2.3}]{peyton_jones_haskell_2003}.
}
\begin{lstHaskellLhstex}
The recursive knot is left untied and as a result, \haskelllhstexinline{Sub_1} can never be reached from an \haskelllhstexinline{Expr_1}.
Luckily, we can reconnect them by adding a special constructor to the \haskelllhstexinline{Expr_1} data type for housing extensions.
It contains an existentially quantified~\citep{mitchell_abstract_1988} type with type class constraints~\citep{laufer_combining_1994,laufer_type_1996} for all semantics type classes~\citep[\citesection{6.4.6}]{ghc_team_ghc_2021} to allow it to house not just subtraction but any future extension.
\begin{lstHaskellLhstex}
data Expr_2
In our example this means that the programmer can write\footnotemark{}:
\footnotetext{%
    Backticks allow functions or constructors to be used in an infix fashion~\citep[\citesection{4.3.3}]{peyton_jones_haskell_2003}.
}
\begin{lstHaskellLhstex}
e2 :: Expr_2
\section{\texorpdfstring{\Acrlongpl{GADT}}{Generalised algebraic data types}}%
\Glspl{GADT} are enriched data types that allow the type instantiation of the constructor to be explicitly defined~\citep{cheney_first-class_2003,hinze_fun_2003}.
Leveraging \glspl{GADT}, deeply embedded \glspl{DSL} can be made statically type safe even when different value types are supported.
Even when \glspl{GADT} are not supported natively in the language, they can be simulated using embedding-projection pairs or equivalence types~\citep[\citesection{2.2}]{cheney_lightweight_2002}.
Where some solutions to the expression problem do not easily generalise to \glspl{GADT} (see \cref{sec:cde:related}), classy deep embedding does.
Generalising the data structure of our \gls{DSL} is fairly straightforward and, to spice things up a bit, we add equality and boolean negation language constructs.
To make the existing \gls{DSL} constructs more general, we relax the types of those constructors.
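As a rough sketch (the exact constraints, constructor names, and the extension case may differ from the definitions used later in this chapter), the relaxed data type with the new constructs could look as follows, using the \GHCmod{GADTs} extension:

\begin{lstHaskellLhstex}
data Expr_g a where
    -- Existing constructors with relaxed, polymorphic types
    Lit_g :: Show a => a -> Expr_g a
    Add_g :: Num a => Expr_g a -> Expr_g a -> Expr_g a
    -- Newly added constructs: equality and boolean negation
    Eq_g  :: Eq a => Expr_g a -> Expr_g a -> Expr_g Bool
    Not_g :: Expr_g Bool -> Expr_g Bool
\end{lstHaskellLhstex}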
Now that the shape of the type classes has changed, the dictionary data types and the type classes need to be adapted as well.
The introduced type variable \haskelllhstexinline{a} is not an argument to the type class, so it should not be an argument to the dictionary data type.
To represent this type class function, a rank-2 polymorphic function is needed~\citep[\citesection{6.4.15}]{ghc_team_ghc_2021}\citep{odersky_putting_1996}.
Concretely, for the evaluator this results in the following definitions:
\begin{lstHaskellLhstex}
Finally, the abstract syntax tree remains observable, which makes it suitable for intensional analyses, albeit using occasional dynamic typing for truly cross-extensional transformations.
Defining reusable expressions overloaded in semantics, or using multiple semantics on a single expression, still requires some boilerplate; getting around this remains future work.
\Cref{sec:classy_reprise} shows how the boilerplate can be minimised using advanced type system extensions.
\section{Related work}%
\label{sec:cde:related}
In Swierstra's approach, semantics are lifted to type classes similarly to classy deep embedding.
Each language construct is its own data type, parametrised by a type parameter.
This parameter contains some type level representation of language constructs that are in use.
In classy deep embedding, extensions only have to be enumerated at the type level when the term is required to be overloaded; in all other cases they are captured in the extension case.
Because all the constructs are expressed in the type system, nifty type system tricks need to be employed to convince the compiler that everything is type safe and the class constraints can be solved.
Furthermore, it requires some boilerplate code such as functor instances for the data types.
In return, pattern matching is easier and does not require dynamic typing.
Classy deep embedding only strains the programmer with writing the extension case for the main data type and the occasional loopback constructor.
-L\"oh and Hinze proposed a language extension that allows open data types and open functions, i.e.\ functions and data types that can be extended with more cases later on~\citep{loh_open_2006}.
+\Citet{loh_open_2006} proposed a language extension that allows open data types and open functions, i.e.\ functions and data types that can be extended with more cases later on.
They hinted at the possibility of using type classes for open functions but had serious concerns that pattern matching would be crippled because constructors would become types, ultimately making the terms impossible to type.
In contrast, this paper shows that pattern matching is easily attainable---albeit using dynamic types---and that the terms can be typed without complicated type system extensions.
A technique similar to classy deep embedding was proposed by \citet{najd_trees_2017} to tackle a slightly different problem, namely that of reusing a data type for multiple purposes in a slightly different form.
For example, to decorate the abstract syntax tree of a compiler differently for each phase of the compiler.
They propose to add an extension descriptor as a type variable to a data type and a type family that can be used to decorate constructors with extra information and add additional constructors to the data type using an extension constructor.
Classy deep embedding works similarly but uses existentially quantified type variables to describe possible extensions instead of type variables and type families.
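As a rough illustration of the extension-descriptor approach (the names below are ours and not taken from the cited paper), each constructor carries a decoration field drawn from a type family indexed by the extension descriptor, and an extra constructor houses additional, phase-specific constructs; this requires the \GHCmod{TypeFamilies} extension:

\begin{lstHaskellLhstex}
data Exp ph                             -- ph is the extension descriptor
    = Lit (XLit ph) Int                 -- decorated constructor
    | Add (XAdd ph) (Exp ph) (Exp ph)   -- decorated constructor
    | ExpX (XExp ph)                    -- extension constructor

type family XLit ph                     -- per-phase decoration for Lit
type family XAdd ph                     -- per-phase decoration for Add
type family XExp ph                     -- per-phase additional constructors
\end{lstHaskellLhstex}
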
Classy deep embedding was organically grown from observing the evolution of tagless-final embedding.
The main difference between tagless-final embedding and classy deep embedding---and in general between shallow and deep embedding---is that in shallow embedding intensional analyses of the abstract syntax tree are more difficult because there is no tangible abstract syntax tree data structure.
In classy deep embedding, it is possible to define transformations even across extensions.
Furthermore, in classy deep embedding, defining (mutually) dependent interpretations is automatically supported, whereas in tagless-final embedding this requires some amount of code duplication~\citep{sun_compositional_2022}.
Hybrid approaches between deep and shallow embedding exist as well.
For example, \citet{svenningsson_combining_2013} show that by expressing the deeply embedded language in a shallowly embedded core language, extensions can be made orthogonally as well.
This paper differs from those approaches in the sense that it does not require a core language in which all extensions need to be expressible.
\section*{Acknowledgements}
Furthermore, I would like to thank Pieter and Rinus for the fruitful discussions, Ralf for inspiring me to write a functional pearl, and the anonymous reviewers for their valuable and honest comments.
\begin{subappendices}
\section{Reprise: reducing boilerplate}\label{sec:classy_reprise}
\todo{Improve text}
One of the unique selling points of this novel \gls{DSL} embedding technique is that, in its basic form, it requires neither advanced type system extensions nor much boilerplate.
However, generalising the technique to \glspl{GADT} arguably unleashes a cesspool of \emph{unsafe} compiler extensions.
If we are willing to work with extensions, almost all of the boilerplate can be inferred or generated\footnote{The source code for this extension can be found here: \url{https://gitlab.com/mlubbers/classydeepembedding}.}.

The \gls{DSL} data type is parametrised by a type variable providing a witness of the interpretation of the language.
When using multiple interpretations, these witnesses need to be bundled in a data type.
Using \gls{GHC}'s \GHCmod{ConstraintKinds} extension, we can make these witnesses explicit, tying them into \gls{HASKELL}'s type system immediately.
Furthermore, this constraint does not necessarily have to be a single constraint; after enabling \GHCmod{DataKinds} and \GHCmod{TypeOperators}, we can encode lists of witnesses instead.
The data type for this list of witnesses is \haskelllhstexinline{Record}.
The \haskelllhstexinline{Record} \gls{GADT} is parametrised by two type variables: the first (\haskelllhstexinline{dt}) is the data type to which the constraints can be applied; the second (\haskelllhstexinline{clist}) is the list of constraints itself.
It is not just a list of \haskelllhstexinline{Constraint}s but a list of constraint constructors that, when given a type of the polymorphic kind \haskelllhstexinline{k}, produce a constraint.
This means that when \haskelllhstexinline{Cons} is pattern matched, the type class constraint for \haskelllhstexinline{c dt} can be solved by the compiler.
\GHCmod{KindSignatures} is used to force the kinds of the type parameters, and the kind of the data type is polymorphic (\GHCmod{PolyKinds}) so that the \haskelllhstexinline{Record} data type can be used for \glspl{DSL} based on type classes as well as on type constructor classes (e.g.\ when using \glspl{GADT}).

\begin{lstHaskellLhstex}[caption={Data type for a list of constraints}]
data Record (dt :: k) (clist :: [k -> Constraint]) where
    Nil  :: Record dt '[]
    Cons :: c dt => Record dt cs -> Record dt (c ': cs)
\end{lstHaskellLhstex}

To incorporate this type in the \haskelllhstexinline{Expr} type, the \haskelllhstexinline{Ext} constructor changes as follows:

\begin{lstHaskellLhstex}[caption={The main data type with the adapted extension constructor}]
data Expr c
    = Lit Int
    | Add (Expr c) (Expr c)
    | forall x. Ext (Record x c) x
\end{lstHaskellLhstex}

Furthermore, we define a type class that allows us to extract explicit dictionaries (\haskelllhstexinline{Dict}) from these records if the constraint is present in the list.

\begin{lstHaskellLhstex}[caption={Membership class for constraints}]
class c `In` cs where
    project :: Record dt cs -> Dict (c dt)
instance {-# OVERLAPPING #-} c `In` (c ': cs) where
    project (Cons _) = Dict
instance {-# OVERLAPPING #-} c `In` cs => c `In` (b ': cs) where
    project (Cons xs) = project xs
\end{lstHaskellLhstex}

Finally, creating these \haskelllhstexinline{Record} witnesses is a chore, so this can be automated as well using a \haskelllhstexinline{CreateRecord} multi-parameter type class (requiring the \GHCmod{MultiParamTypeClasses} and \GHCmod{FlexibleInstances} extensions).
This type class creates a record structure cons by cons if and only if all type class constraints are available in the list of constraints.

\begin{lstHaskellLhstex}[caption={Record creation class}]
class CreateRecord dt c where
    createRecord :: Record dt c
instance CreateRecord d '[] where
    createRecord = Nil
instance (c (d c0), CreateRecord (d c0) cs) =>
        CreateRecord (d c0) (c ': cs) where
    createRecord = Cons createRecord
\end{lstHaskellLhstex}

The class constraints for the interpretation instances can now be greatly simplified, as shown in the evaluation instance for \haskelllhstexinline{Expr}.
The implementation remains the same, except that for the extension case a trick needs to be applied to convince the compiler of the correct instances.
Using \haskelllhstexinline{`In`}'s \haskelllhstexinline{project} function, a dictionary can be brought into scope.
This dictionary can then be used to apply the type class function to the extension using the \haskelllhstexinline{withDict} function from the \haskelllhstexinline{Data.Constraint} library\footnote{\haskelllhstexinline{withDict :: Dict c -> (c => r) -> r}}.
The \GHCmod{ScopedTypeVariables} extension is used to make sure the existentially quantified type variable for the extension is matched to the type of the dictionary.
Furthermore, because the class constraint is seemingly not smaller than the instance head, \GHCmod{UndecidableInstances} needs to be enabled.

\begin{lstHaskellLhstex}[caption={Evaluation instance for the main data type}]
class Eval v where
    eval :: v -> Int

instance Eval `In` s => Eval (Expr s) where
    eval (Lit i)          = i
    eval (Add l r)        = eval l + eval r
    eval (Ext r (e :: x)) = withDict (project r :: Dict (Eval x)) eval e
\end{lstHaskellLhstex}

Smart constructors need to be adapted as well, as can be seen from the subtraction smart constructor \haskelllhstexinline{subt}.

\begin{lstHaskellLhstex}[caption={Subtraction smart constructor}]
subt :: (Typeable c, CreateRecord (Subt c) c) => Expr c -> Expr c -> Expr c
subt l r = Ext createRecord (l `Subt` r)
\end{lstHaskellLhstex}

Finally, defining terms in the language can be done directly if the interpretations are known.
For example, if we want to print and/or optimise the term $\sim(\sim(42+(38-4)))$, we can define it as follows.

\begin{lstHaskellLhstex}[caption={A term with a concrete list of interpretations}]
e0 :: Expr '[Print, Opt]
e0 = neg (neg (Lit 42 `Add` (Lit 38 `subt` Lit 4)))
\end{lstHaskellLhstex}

It is also possible to define terms in the \gls{DSL} that are overloaded in the interpretation.
This does require enumerating the \haskelllhstexinline{CreateRecord} constraints for every extension that is used.
At the call site, the concrete list of constraints must be known.

\begin{lstHaskellLhstex}[caption={A term overloaded in the interpretation}]
e1 :: ( Typeable c
      , CreateRecord (Neg c) c
      , CreateRecord (Subt c) c
      ) => Expr c
e1 = neg (neg (Lit 42 `Add` (Lit 38 `subt` Lit 4)))
\end{lstHaskellLhstex}

Finally, using the \GHCmod{TypeFamilies} extension, type families can be created for bundling \haskelllhstexinline{CreateRecord} constraints (\haskelllhstexinline{UsingExt}) and \haskelllhstexinline{`In`} constraints (\haskelllhstexinline{DependsOn}), making the syntax even more descriptive.
E.g.\ \haskelllhstexinline{UsingExt '[A, B, C] c} expands to \haskelllhstexinline{(CreateRecord (A c) c, CreateRecord (B c) c, CreateRecord (C c) c)} and \haskelllhstexinline{DependsOn '[A, B, C] s} expands to \haskelllhstexinline{(A `In` s, B `In` s, C `In` s)}.

\begin{lstHaskellLhstex}
type family UsingExt cs c :: Constraint where
    UsingExt '[]       c = ()
    UsingExt (d ': cs) c = (CreateRecord (d c) c, UsingExt cs c)

type family DependsOn cs c :: Constraint where
    DependsOn '[]       c = ()
    DependsOn (d ': cs) c = (d `In` c, DependsOn cs c)
\end{lstHaskellLhstex}

Defining the previous expression can now be done with the following shortened type that describes the semantics better:

\begin{lstHaskellLhstex}
e1 :: (Typeable c, UsingExt '[Neg, Subt] c) => Expr c
\end{lstHaskellLhstex}

Giving an \haskelllhstexinline{Interp} instance for a data type \haskelllhstexinline{DataType} that uses the extensions \haskelllhstexinline{e_1, e_2, ...} and depends on the interpretations \haskelllhstexinline{i_1, i_2, ...} is done as follows:

\begin{lstHaskellLhstex}
instance ( UsingExt  '[e_1, e_2, ...] s
         , DependsOn '[i_1, i_2, ...] s
         ) => Interp (DataType s) where
    ...
\end{lstHaskellLhstex}

\section{Data types and definitions}%
\label{sec:cde:appendix}
\begin{lstHaskellLhstex}[caption={Data type definitions.}]
opt_g (EqLoop_g e) = EqLoop_g (opt_g e)
\end{lstHaskellLhstex}
\end{subappendices}
\input{subfilepostamble}
\input{subfilepreamble}
\begin{document}
\chapter{Prelude}%
\label{chp:introduction}
\begin{chapterabstract}
\begin{figure}[ht]
\centering
\includestandalone{hyponymy_of_dsls}
  \caption{Hyponymy of \glspl{DSL} (adapted from \citet[\citepage{2}]{mernik_extensible_2013})}%
\label{fig:hyponymy_of_dsls}
\end{figure}
A dichotomous approach is embedding the \gls{DSL} in a host language, i.e.\ \glspl{EDSL}~\citep{hudak_modular_1998}.
By defining the language as constructs in the host language, much of the machinery is inherited and the cost of creating embedded languages is very low.
There is more linguistic reuse \citep{krishnamurthi_linguistic_2001}.
There are, however, two sides to this coin.
If the syntax of the host language is not very flexible, the syntax of the \gls{DSL} may become clumsy.
Furthermore, errors shown to the programmer may be larded with host language errors, making it difficult for a non-expert of the host language to work with the \gls{DSL}.
\includestandalone{tosd}
\caption{\Gls{TOSD} approach.}
\end{subfigure}
  \caption{Separation of concerns in a traditional setting and in \gls{TOSD} (adapted from \citet[\citesection{1}]{wang_maintaining_2018}).}%
\label{fig:tosd}
\end{figure}
\citet{piers_task-oriented_2016} created \textmu{}Task, a \gls{TOP} language for specifying non-interruptible embedded systems implemented as an \gls{EDSL} in \gls{HASKELL}.
\citet{van_gemert_task_2022} created LTasks, a \gls{TOP} language for interactive terminal applications implemented in LUA, a dynamically typed imperative language.
\citet{lijnse_toppyt_2022} created Toppyt, a \gls{TOP} language based on \gls{ITASK}, implemented in \gls{PYTHON}, but designed to be simpler and smaller.
Finally there is \gls{MTASK}, a \gls{TOP} language designed for defining workflows for \gls{IOT} devices \citep{koopman_task-based_2018}.
It is written in \gls{CLEAN} as an \gls{EDSL} fully integrated with \gls{ITASK} and allows the programmer to define all layers of an \gls{IOT} system from a single source.
\section{Outline}
Wikipedia defines a rhapsody as follows \citep{wikipedia_contributors_rhapsody_2022}:
\begin{quote}
A \textbf{rhapsody} in music is a one-movement work that is episodic yet integrated, free-flowing in structure, featuring a range of highly contrasted moods, colour, and tonality. An air of spontaneous inspiration and a sense of improvisation make it freer in form than a set of variations.
\end{quote}
After reading the first chapter, subsequent chapters in this movement are readable independently.
\subsubsection*{\fullref{chp:dsl_embedding_techniques}}
This chapter outlines the basic \gls{DSL} embedding techniques and compares the properties of several embedding methods.
By example, it provides intuition for shallow embedding (including tagless-final embedding) and for deep embedding (including deep embedding with \acrshortpl{GADT}).
It is not based on a paper but written as gentle background material for the subsequent chapters in the movement.
\subsubsection*{\fullref{chp:classy_deep_embedding}}
This chapter is based on the paper: \citeentry{lubbers_deep_2022}\todo{change in-press when published}
While supervising \citeauthor{amazonas_cabral_de_andrade_developing_2018}'s \citeyear{amazonas_cabral_de_andrade_developing_2018} Master's thesis, focussing on an early version of \gls{MTASK}, a seed was planted for a novel deep embedding technique for \glspl{DSL} where the resulting language is extendible both in constructs and in interpretation using type classes and existential data types.
Slowly the ideas organically grew to form the technique shown in the paper.
The research for this paper and the writing were performed solely by me.
\Cref{sec:classy_reprise} was added after publication and contains a (yet) unpublished extension of the embedding technique.
The related work section (\cref{sec:cde:related}) is also brought up to date.\todo{remove if this is not the case}
\subsubsection*{\fullref{chp:first-class_datatypes}}
This chapter is based on the paper: \citeentry{lubbers_first-class_2022}\todo{change when accepted}
When embedding \glspl{DSL}, many features of the host language can be inherited.
However, data types from the host language are not first-class citizens; to use them, access functions have to be created in the \gls{DSL}, resulting in boilerplate.
This paper shows how to inherit data types from the host language in \glspl{EDSL} using metaprogramming, by generating the required boilerplate.
The research in this paper and writing the paper was performed by me, though there were weekly meetings with Pieter Koopman and Rinus Plasmeijer in which we discussed and refined the ideas.
This part is a monograph focussing on \gls{TOP} for the \gls{IOT}; hence, the chapters are best read in order.
The monograph is compiled from the following papers and revised lecture notes.
\begin{itemize}
\item \citeentry{koopman_task-based_2018}
  While an imperative predecessor of \gls{MTASK} was conceived in 2017 \citep{plasmeijer_shallow_2016}, this paper showed the first \gls{TOP} version of \gls{MTASK}.
  It shows the design of the language and three interpretations: pretty printing, simulation using \gls{ITASK}, and \gls{C} code generation.
  Pieter Koopman wrote the paper; I helped with the software and research.
\item \citeentry{lubbers_task_2018}
This paper was an extension of my Master's thesis~\citep{lubbers_task_2017}.