Merge.
ci/woodpecker/push/build Pipeline was successful
Details
This commit is contained in:
commit
4393042f2c
@@ -184,7 +184,7 @@ scenarios from the point of view of the ontology modeling. In @{text_section (un
 we discuss the user-interaction generated from the ontological definitions. Finally, we draw
 conclusions and discuss related work in @{text_section (unchecked) \<open>conclusion\<close>}. \<close>
 
-section*[bgrnd::text_section,main_author="Some(@{docitem ''bu''}::author)"]
+section*[bgrnd::text_section,main_author="Some(@{author ''bu''}::author)"]
 \<open> Background: The Isabelle System \<close>
 text*[background::introduction, level="Some 1"]\<open>
 While Isabelle is widely perceived as an interactive theorem prover for HOL
@@ -246,7 +246,7 @@ can be type-checked before being displayed and can be used for calculations befo
 typeset. When editing, Isabelle's PIDE offers auto-completion and error-messages while typing the
 above \<^emph>\<open>semi-formal\<close> content.\<close>
 
-section*[isadof::technical,main_author="Some(@{docitem ''adb''}::author)"]\<open> \<^isadof> \<close>
+section*[isadof::technical,main_author="Some(@{author ''adb''}::author)"]\<open> \<^isadof> \<close>
 
 text\<open> An \<^isadof> document consists of three components:
 \<^item> the \<^emph>\<open>ontology definition\<close> which is an Isabelle theory file with definitions
@@ -635,20 +635,20 @@ text\<open> We present a selection of interaction scenarios @{example \<open>sc
 and @{example \<open>cenelec_onto\<close>} with Isabelle/PIDE instrumented by \<^isadof>. \<close>
 
 (*<*)
-declare_reference*["text-elements"::float]
+declare_reference*["text_elements"::float]
 declare_reference*["hyperlinks"::float]
 (*>*)
 
 subsection*[scholar_pide::example]\<open> A Scholarly Paper \<close>
-text\<open> In @{float (unchecked) "text-elements"}~(a)
-and @{float (unchecked) "text-elements"}~(b)we show how
+text\<open> In @{float (unchecked) "text_elements"}~(a)
+and @{float (unchecked) "text_elements"}~(b)we show how
 hovering over links permits to explore its meta-information.
 Clicking on a document class identifier permits to hyperlink into the corresponding
 class definition (@{float (unchecked) "hyperlinks"}~(a)); hovering over an attribute-definition
 (which is qualified in order to disambiguate; @{float (unchecked) "hyperlinks"}~(b)).
 \<close>
 
-text*["text-elements"::float,
+text*["text_elements"::float,
       main_caption="\<open>Exploring text elements.\<close>"]
 \<open>
 @{fig_content (width=53, height=5, caption="Exploring a reference of a text element.") "figures/Dogfood-II-bgnd1.png"
@@ -54,7 +54,7 @@ abstract*[abs, keywordlist="[\<open>Shallow Embedding\<close>,\<open>Process-Alg
 If you consider citing this paper, please refer to @{cite "HOL-CSP-iFM2020"}.
 \<close>
 text\<open>\<close>
-section*[introheader::introduction,main_author="Some(@{docitem ''bu''}::author)"]\<open> Introduction \<close>
+section*[introheader::introduction,main_author="Some(@{author ''bu''}::author)"]\<open> Introduction \<close>
 text*[introtext::introduction, level="Some 1"]\<open>
 Communicating Sequential Processes (\<^csp>) is a language to specify and verify patterns of
 interaction of concurrent systems. Together with CCS and LOTOS, it belongs to the family of
@@ -126,10 +126,10 @@ attempt to formalize denotational \<^csp> semantics covering a part of Bill Rosc
 omitted.\<close>}.
 \<close>
 
-section*["pre"::tc,main_author="Some(@{author \<open>bu\<close>}::author)"]
+section*["pre"::technical,main_author="Some(@{author \<open>bu\<close>}::author)"]
 \<open>Preliminaries\<close>
 
-subsection*[cspsemantics::tc, main_author="Some(@{author ''bu''})"]\<open>Denotational \<^csp> Semantics\<close>
+subsection*[cspsemantics::technical, main_author="Some(@{author ''bu''})"]\<open>Denotational \<^csp> Semantics\<close>
 
 text\<open> The denotational semantics (following @{cite "roscoe:csp:1998"}) comes in three layers:
 the \<^emph>\<open>trace model\<close>, the \<^emph>\<open>(stable) failures model\<close> and the \<^emph>\<open>failure/divergence model\<close>.
@@ -189,7 +189,7 @@ of @{cite "IsobeRoggenbach2010"} is restricted to a variant of the failures mode
 
 \<close>
 
-subsection*["isabelleHol"::tc, main_author="Some(@{author ''bu''})"]\<open>Isabelle/HOL\<close>
+subsection*["isabelleHol"::technical, main_author="Some(@{author ''bu''})"]\<open>Isabelle/HOL\<close>
 text\<open> Nowadays, Isabelle/HOL is one of the major interactive theory development environments
 @{cite "nipkow.ea:isabelle:2002"}. HOL stands for Higher-Order Logic, a logic based on simply-typed
 \<open>\<lambda>\<close>-calculus extended by parametric polymorphism and Haskell-like type-classes.
@@ -218,10 +218,10 @@ domain theory for a particular type-class \<open>\<alpha>::pcpo\<close>, \<^ie>
 fixed-point induction and other (automated) proof infrastructure. Isabelle's type-inference can
 automatically infer, for example, that if \<open>\<alpha>::pcpo\<close>, then \<open>(\<beta> \<Rightarrow> \<alpha>)::pcpo\<close>. \<close>
 
-section*["csphol"::tc,main_author="Some(@{author ''bu''}::author)", level="Some 2"]
+section*["csphol"::technical,main_author="Some(@{author ''bu''}::author)", level="Some 2"]
 \<open>Formalising Denotational \<^csp> Semantics in HOL \<close>
 
-subsection*["processinv"::tc, main_author="Some(@{author ''bu''})"]
+subsection*["processinv"::technical, main_author="Some(@{author ''bu''})"]
 \<open>Process Invariant and Process Type\<close>
 text\<open> First, we need a slight revision of the concept
 of \<^emph>\<open>trace\<close>: if \<open>\<Sigma>\<close> is the type of the atomic events (represented by a type variable), then
@@ -272,7 +272,7 @@ but this can be constructed in a straight-forward manner. Suitable definitions f
 \<open>\<T>\<close>, \<open>\<F>\<close> and \<open>\<D>\<close> lifting \<open>fst\<close> and \<open>snd\<close> on the new \<open>'\<alpha> process\<close>-type allows to derive
 the above properties for any \<open>P::'\<alpha> process\<close>. \<close>
 
-subsection*["operator"::tc, main_author="Some(@{author ''lina''})"]
+subsection*["operator"::technical, main_author="Some(@{author ''lina''})"]
 \<open>\<^csp> Operators over the Process Type\<close>
 text\<open> Now, the operators of \<^csp> \<open>Skip\<close>, \<open>Stop\<close>, \<open>_\<sqinter>_\<close>, \<open>_\<box>_\<close>, \<open>_\<rightarrow>_\<close>,\<open>_\<lbrakk>_\<rbrakk>_\<close> etc.
 for internal choice, external choice, prefix and parallel composition, can
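Editorial note on the operators touched by this hunk: the trace-model reading of a few of these \<^csp> operators can be illustrated by a toy, finite analogue. This is a sketch only (traces-as-sets, no failures or divergences, and the helper names `stop`, `prefix`, `internal_choice` are invented here), not the HOL-CSP definitions.

```python
# Toy trace semantics for a few CSP operators: a "process" is modeled as
# its prefix-closed set of finite traces (tuples of event names).
def stop():
    return {()}                            # Stop: only the empty trace

def prefix(a, p):
    return {()} | {(a,) + t for t in p}    # a -> P: prepend event a

def internal_choice(p, q):
    return p | q                           # P |~| Q: union of the trace sets

# A two-step process: left -> right -> Stop
copy = prefix("left", prefix("right", stop()))
print(sorted(copy, key=len))
```

In the trace model, internal and external choice coincide, which is exactly why the finer failures and failure/divergence models of the paper are needed.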
@@ -297,7 +297,7 @@ The definitional presentation of the \<^csp> process operators according to @{ci
 follows always this scheme. This part of the theory comprises around 2000 loc.
 \<close>
 
-subsection*["orderings"::tc, main_author="Some(@{author ''bu''})"]
+subsection*["orderings"::technical, main_author="Some(@{author ''bu''})"]
 \<open>Refinement Orderings\<close>
 
 text\<open> \<^csp> is centered around the idea of process refinement; many critical properties,
@@ -327,7 +327,7 @@ states, from which no internal progress is possible.
 \<close>
 
 
-subsection*["fixpoint"::tc, main_author="Some(@{author ''lina''})"]
+subsection*["fixpoint"::technical, main_author="Some(@{author ''lina''})"]
 \<open>Process Ordering and HOLCF\<close>
 text\<open> For any denotational semantics, the fixed point theory giving semantics to systems
 of recursive equations is considered as keystone. Its prerequisite is a complete partial ordering
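Editorial note on the fixed-point machinery this hunk refers to: the Kleene-iteration idea behind giving meaning to recursive equations can be sketched on a finite lattice. This is an analogue under stated assumptions (a monotone function on finite sets, so the ascending chain stabilizes), not the HOLCF construction over `pcpo`.

```python
# Kleene iteration: bottom, f(bottom), f(f(bottom)), ... until a fixed point.
# Requires f monotone and the chain to stabilize (guaranteed here: the
# universe is finite).
def lfp(f, bottom=frozenset()):
    x = bottom
    while True:
        nxt = f(x)
        if nxt == x:
            return x
        x = nxt

# A recursive "equation" X = {0} UNION step(X), cut off at a finite universe:
step = lambda s: frozenset({0} | {n + 2 for n in s if n + 2 < 20})
print(sorted(lfp(step)))
```

The least solution collects exactly the even numbers below 20; in the paper the same iteration scheme, phrased over the process cpo, underlies fixed-point induction.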
@@ -394,7 +394,7 @@ Fixed-point inductions are the main proof weapon in verifications, together with
 and the \<^csp> laws. Denotational arguments can be hidden as they are not needed in practical
 verifications. \<close>
 
-subsection*["law"::tc, main_author="Some(@{author ''lina''})"]
+subsection*["law"::technical, main_author="Some(@{author ''lina''})"]
 \<open>\<^csp> Rules: Improved Proofs and New Results\<close>
 
 
@@ -436,11 +436,11 @@ cases to be considered as well as their complexity makes pen and paper proofs
 practically infeasible.
 \<close>
 
-section*["newResults"::tc,main_author="Some(@{author ''safouan''}::author)",
+section*["newResults"::technical,main_author="Some(@{author ''safouan''}::author)",
          main_author="Some(@{author ''lina''}::author)", level= "Some 3"]
 \<open>Theoretical Results on Refinement\<close>
 text\<open>\<close>
-subsection*["adm"::tc,main_author="Some(@{author ''safouan''}::author)",
+subsection*["adm"::technical,main_author="Some(@{author ''safouan''}::author)",
             main_author="Some(@{author ''lina''}::author)"]
 \<open>Decomposition Rules\<close>
 text\<open>
@@ -476,7 +476,7 @@ The failure and divergence projections of this operator are also interdependent,
 sequence operator. Hence, this operator is not monotonic with \<open>\<sqsubseteq>\<^sub>\<F>\<close>, \<open>\<sqsubseteq>\<^sub>\<D>\<close> and \<open>\<sqsubseteq>\<^sub>\<T>\<close>, but monotonic
 when their combinations are considered. \<close>
 
-subsection*["processes"::tc,main_author="Some(@{author ''safouan''}::author)",
+subsection*["processes"::technical,main_author="Some(@{author ''safouan''}::author)",
             main_author="Some(@{author ''lina''}::author)"]
 \<open>Reference Processes and their Properties\<close>
 text\<open>
@@ -597,7 +597,7 @@ then it may still be livelock-free. % This makes sense since livelocks are worse
 
 \<close>
 
-section*["advanced"::tc,main_author="Some(@{author ''safouan''}::author)",level="Some 3"]
+section*["advanced"::technical,main_author="Some(@{author ''safouan''}::author)",level="Some 3"]
 \<open>Advanced Verification Techniques\<close>
 
 text\<open>
@@ -612,7 +612,7 @@ verification. In the latter case, we present an approach to a verification of a
 architecture, in this case a ring-structure of arbitrary size.
 \<close>
 
-subsection*["illustration"::tc,main_author="Some(@{author ''safouan''}::author)", level="Some 3"]
+subsection*["illustration"::technical,main_author="Some(@{author ''safouan''}::author)", level="Some 3"]
 \<open>The General CopyBuffer Example\<close>
 text\<open>
 We consider the paradigmatic copy buffer example @{cite "Hoare:1985:CSP:3921" and "Roscoe:UCS:2010"}
@@ -677,7 +677,7 @@ corollary deadlock_free COPY
 \<close>
 
 
-subsection*["inductions"::tc,main_author="Some(@{author ''safouan''}::author)"]
+subsection*["inductions"::technical,main_author="Some(@{author ''safouan''}::author)"]
 \<open>New Fixed-Point Inductions\<close>
 
 text\<open>
@@ -727,7 +727,7 @@ The astute reader may notice here that if the induction step is weakened (having
 the base steps require enforcement.
 \<close>
 
-subsection*["norm"::tc,main_author="Some(@{author ''safouan''}::author)"]
+subsection*["norm"::technical,main_author="Some(@{author ''safouan''}::author)"]
 \<open>Normalization\<close>
 text\<open>
 Our framework can reason not only over infinite alphabets, but also over processes parameterized
@@ -787,7 +787,7 @@ Summing up, our method consists of four stages:
 
 \<close>
 
-subsection*["dining_philosophers"::tc,main_author="Some(@{author ''safouan''}::author)",level="Some 3"]
+subsection*["dining_philosophers"::technical,main_author="Some(@{author ''safouan''}::author)",level="Some 3"]
 \<open>Generalized Dining Philosophers\<close>
 
 text\<open> The dining philosophers problem is another paradigmatic example in the \<^csp> literature
@@ -879,7 +879,7 @@ for a dozen of philosophers (on a usual machine) due to the exponential combinat
 Furthermore, our proof is fairly stable against modifications like adding non synchronized events like
 thinking or sitting down in contrast to model-checking techniques. \<close>
 
-section*["relatedwork"::tc,main_author="Some(@{author ''lina''}::author)",level="Some 3"]
+section*["relatedwork"::technical,main_author="Some(@{author ''lina''}::author)",level="Some 3"]
 \<open>Related work\<close>
 
 text\<open>
@@ -102,13 +102,13 @@ text\<open>
 functioning of the system and for its integration into the system as a whole. In
 particular, we need to make the following assumptions explicit: \<^vs>\<open>-0.3cm\<close>\<close>
 
-text*["perfect-wheel"::assumption]
+text*["perfect_wheel"::assumption]
 \<open>\<^item> the wheel is perfectly circular with a given, constant radius. \<^vs>\<open>-0.3cm\<close>\<close>
-text*["no-slip"::assumption]
+text*["no_slip"::assumption]
 \<open>\<^item> the slip between the trains wheel and the track negligible. \<^vs>\<open>-0.3cm\<close>\<close>
-text*["constant-teeth-dist"::assumption]
+text*["constant_teeth_dist"::assumption]
 \<open>\<^item> the distance between all teeth of a wheel is the same and constant, and \<^vs>\<open>-0.3cm\<close>\<close>
-text*["constant-sampling-rate"::assumption]
+text*["constant_sampling_rate"::assumption]
 \<open>\<^item> the sampling rate of positions is a given constant.\<close>
 
 text\<open>
@@ -126,13 +126,13 @@ text\<open>
 
 subsection\<open>Capturing ``System Architecture.''\<close>
 
-figure*["three-phase"::figure,relative_width="70",file_src="''figures/three-phase-odo.pdf''"]
+figure*["three_phase"::figure,relative_width="70",file_src="''figures/three-phase-odo.pdf''"]
 \<open>An odometer with three sensors \<open>C1\<close>, \<open>C2\<close>, and \<open>C3\<close>.\<close>
 
 text\<open>
 The requirements analysis also contains a document \<^doc_class>\<open>SYSAD\<close>
 (\<^typ>\<open>system_architecture_description\<close>) that contains technical drawing of the odometer,
-a timing diagram (see \<^figure>\<open>three-phase\<close>), and tables describing the encoding of the position
+a timing diagram (see \<^figure>\<open>three_phase\<close>), and tables describing the encoding of the position
 for the possible signal transitions of the sensors \<open>C1\<close>, \<open>C2\<close>, and \<open>C3\<close>.
 \<close>
 
@@ -146,7 +146,7 @@ text\<open>
 sub-system configuration. \<close>
 
 (*<*)
-declare_reference*["df-numerics-encshaft"::figure]
+declare_reference*["df_numerics_encshaft"::figure]
 (*>*)
 subsection\<open>Capturing ``Required Performances.''\<close>
 text\<open>
@@ -160,9 +160,9 @@ text\<open>
 
 The requirement analysis document describes the physical environment, the architecture
 of the measuring device, and the required format and precision of the measurements of the odometry
-function as represented (see @{figure (unchecked) "df-numerics-encshaft"}).\<close>
+function as represented (see @{figure (unchecked) "df_numerics_encshaft"}).\<close>
 
-figure*["df-numerics-encshaft"::figure,relative_width="76",file_src="''figures/df-numerics-encshaft.png''"]
+figure*["df_numerics_encshaft"::figure,relative_width="76",file_src="''figures/df-numerics-encshaft.png''"]
 \<open>Real distance vs. discrete distance vs. shaft-encoder sequence\<close>
 
@@ -215,7 +215,7 @@ text\<open>
 concepts such as Cauchy Sequences, limits, differentiability, and a very substantial part of
 classical Calculus. \<open>SOME\<close> is the Hilbert choice operator from HOL; the definitions of the
 model parameters admit all possible positive values as uninterpreted constants. Our
-\<^assumption>\<open>perfect-wheel\<close> is translated into a calculation of the circumference of the
+\<^assumption>\<open>perfect_wheel\<close> is translated into a calculation of the circumference of the
 wheel, while \<open>\<delta>s\<^sub>r\<^sub>e\<^sub>s\<close>, the resolution of the odometer, can be calculated
 from the these parameters. HOL-Analysis permits to formalize the fundamental physical observables:
 \<close>
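Editorial note on the calculation this hunk mentions: the `perfect_wheel` assumption reduces to a circumference computation, from which the odometer resolution follows. A minimal numeric sketch, with hypothetical parameter values (the theory deliberately leaves them as uninterpreted positive constants chosen via `SOME`):

```python
import math

# Hypothetical parameters; any positive values are admissible in the model.
wheel_radius_m = 0.46       # assumed radius, from the perfect_wheel assumption
teeth_per_turn = 100        # assumed teeth count, from constant_teeth_dist

circumference = 2 * math.pi * wheel_radius_m   # perfectly circular wheel
delta_s_res = circumference / teeth_per_turn   # distance resolution per tooth
print(f"resolution = {delta_s_res:.4f} m")
```

The point is only the dependency structure: the resolution is derived from the assumptions, never postulated independently.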
@@ -30,6 +30,10 @@ text\<open>
 \vfill
 \<close>
 
+text\<open>
+@{block (title = "\<open>Title\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close>") "\<open>Block content\<^sub>t\<^sub>e\<^sub>s\<^sub>t\<close>"}
+\<close>
+
 (*<*)
 end
 (*>*)
@@ -807,7 +807,7 @@ text\<open> They reflect the Pure logic depicted in a number of presentations s
 Notated as logical inference rules, these operations were presented as follows:
 \<close>
 
-text*["text-elements"::float,
+text*["text_elements"::float,
       main_caption="\<open>Kernel Inference Rules.\<close>"]
 \<open>
 @{fig_content (width=48, caption="Pure Kernel Inference Rules I.") "figures/pure-inferences-I.pdf"
@@ -131,7 +131,7 @@ type_synonym XX = B
 
 section\<open>Examples of inheritance \<close>
 
-doc_class C = XX +
+doc_class C = B +
    z :: "A option" <= None (* A LINK, i.e. an attribute that has a type
                               referring to a document class. Mathematical
                               relations over document items can be modeled. *)
@@ -160,7 +160,7 @@ ML\<open> @{docitem_attribute a2::omega};
 
 type_synonym ALFACENTAURI = E
 
-update_instance*[omega::ALFACENTAURI, x+="''inition''"]
+update_instance*[omega::E, x+="''inition''"]
 
 ML\<open> val s = HOLogic.dest_string ( @{docitem_attribute x::omega}) \<close>
 
@@ -144,8 +144,8 @@ update_instance*[f::F,r:="[@{thm ''Concept_OntoReferencing.some_proof''}]"]
 text\<open> ..., mauris amet, id elit aliquam aptent id, ... @{docitem \<open>a\<close>} \<close>
 (*>*)
 text\<open>Here we add and maintain a link that is actually modeled as m-to-n relation ...\<close>
-update_instance*[f::F,b:="{(@{docitem \<open>a\<close>}::A,@{docitem \<open>c1\<close>}::C),
-                           (@{docitem \<open>a\<close>}, @{docitem \<open>c2\<close>})}"]
+update_instance*[f::F,b:="{(@{A \<open>a\<close>}::A,@{C \<open>c1\<close>}::C),
+                           (@{A \<open>a\<close>}, @{C \<open>c2\<close>})}"]
 
 section\<open>Closing the Monitor and testing the Results.\<close>
 
@@ -116,20 +116,22 @@ section\<open>Putting everything together\<close>
 
 text\<open>Major sample: test-item of doc-class \<open>F\<close> with a relational link between class instances,
 and links to formal Isabelle items like \<open>typ\<close>, \<open>term\<close> and \<open>thm\<close>. \<close>
 declare[[ML_print_depth = 10000]]
 text*[xcv4::F, r="[@{thm ''HOL.refl''},
                    @{thm \<open>Concept_TermAntiquotations.local_sample_lemma\<close>}]", (* long names required *)
-       b="{(@{docitem ''xcv1''},@{docitem \<open>xcv2\<close>})}", (* notations \<open>...\<close> vs. ''...'' *)
+       b="{(@{A ''xcv1''},@{C \<open>xcv2\<close>})}", (* notations \<open>...\<close> vs. ''...'' *)
        s="[@{typ \<open>int list\<close>}]",
        properties = "[@{term \<open>H \<longrightarrow> H\<close>}]" (* notation \<open>...\<close> required for UTF8*)
        ]\<open>Lorem ipsum ...\<close>
 
 declare[[ML_print_depth = 20]]
 text*[xcv5::G, g="@{thm \<open>HOL.sym\<close>}"]\<open>Lorem ipsum ...\<close>
 
 text\<open>... and here we add a relation between @{docitem \<open>xcv3\<close>} and @{docitem \<open>xcv2\<close>}
 into the relation \verb+b+ of @{docitem \<open>xcv5\<close>}. Note that in the link-relation,
-a @{typ "C"}-type is required, but a @{typ "G"}-type is offered which is legal in
-\verb+Isa_DOF+ because of the sub-class relation between those classes: \<close>
-update_instance*[xcv4::F, b+="{(@{docitem ''xcv3''},@{docitem ''xcv5''})}"]
+a @{typ "C"}-type is required, so if a @{typ "G"}-type is offered, it is considered illegal
+in \verb+Isa_DOF+ despite the sub-class relation between those classes: \<close>
+update_instance-assert-error[xcv4::F, b+="{(@{docitem ''xcv3''},@{docitem ''xcv5''})}"]
+\<open>Type unification failed\<close>
 
 text\<open>And here is the results of some ML-term antiquotations:\<close>
 ML\<open> @{docitem_attribute b::xcv4} \<close>
@@ -187,7 +187,7 @@ to update the instance @{docitem \<open>xcv4\<close>}:
 \<close>
 
 update_instance-assert-error[xcv4::F, b+="{(@{A ''xcv3''},@{G ''xcv5''})}"]
-\<open>type of attribute: Conceptual.F.b does not fit to term\<close>
+\<open>Type unification failed: Clash of types\<close>
 
 
 section\<open>\<^theory_text>\<open>assert*\<close>-Annotated assertion-commands\<close>
@@ -225,11 +225,11 @@ text\<open>... and here we reference @{A \<open>assertionA\<close>}.\<close>
 (*>*)
 assert*\<open>evidence @{result \<open>resultProof\<close>} = evidence @{result \<open>resultProof2\<close>}\<close>
 
-text\<open>The optional evaluator of \<open>value*\<close> and \<open>assert*\<close> must be specified after the meta arguments:\<close>
-value* [optional_test_A::A, x=6] [nbe] \<open>filter (\<lambda>\<sigma>. A.x \<sigma> > 5) @{A_instances}\<close>
+text\<open>The optional evaluator of \<open>value*\<close> and \<open>assert*\<close> must be specified before the meta arguments:\<close>
+value* [nbe] [optional_test_A::A, x=6] \<open>filter (\<lambda>\<sigma>. A.x \<sigma> > 5) @{A_instances}\<close>
 
-assert* [resultProof3::result, evidence = "proof", property="[@{thm \<open>HOL.sym\<close>}]"] [nbe]
-        \<open>evidence @{result \<open>resultProof3\<close>} = evidence @{result \<open>resultProof2\<close>}\<close>
+assert* [nbe] [resultProof3::result, evidence = "proof", property="[@{thm \<open>HOL.sym\<close>}]"]
+        \<open>evidence @{result \<open>resultProof3\<close>} = evidence @{result \<open>resultProof2\<close>}\<close>
 
 text\<open>
 The evaluation of @{command "assert*"} can be disabled
@@ -390,11 +390,11 @@ text-latex\<open>
 ML\<open>
 
 fun gen_enriched_document_command3 name {body} cid_transform attr_transform markdown
-            (((((oid,pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t,
+            ((((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t,
              xstring_opt:(xstring * Position.T) option),
              toks:Input.source list)
    = gen_enriched_document_command2 name {body=body} cid_transform attr_transform markdown
-            (((((oid,pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t,
+            ((((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t,
              xstring_opt:(xstring * Position.T) option),
              toks) \<comment> \<open>Hack : drop second and thrd args.\<close>
@@ -382,11 +382,11 @@ text-latex\<open>
 ML\<open>
 
 fun gen_enriched_document_command3 name {body} cid_transform attr_transform markdown
-            (((((oid,pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t,
+            ((((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t,
             xstring_opt:(xstring * Position.T) option),
             toks:Input.source list)
    = gen_enriched_document_command2 name {body=body} cid_transform attr_transform markdown
-            (((((oid,pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t,
+            ((((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t,
             xstring_opt:(xstring * Position.T) option),
             toks) \<comment> \<open>Hack : drop second and thrd args.\<close>
@@ -16,6 +16,7 @@ session "Isabelle_DOF-Unit-Tests" = "Isabelle_DOF-Ontologies" +
     "Cenelec_Test"
     "OutOfOrderPresntn"
     "COL_Test"
+    "Test_Polymorphic_Classes"
   document_files
     "root.bib"
     "figures/A.png"
@@ -1,463 +0,0 @@
-(*  Title:      HOL/Record.thy
-    Author:     Wolfgang Naraschewski, TU Muenchen
-    Author:     Markus Wenzel, TU Muenchen
-    Author:     Norbert Schirmer, TU Muenchen
-    Author:     Thomas Sewell, NICTA
-    Author:     Florian Haftmann, TU Muenchen
-*)
-
-section \<open>Extensible records with structural subtyping\<close>
-
-theory Test
-  imports HOL.Quickcheck_Exhaustive
-  keywords
-    "record*" :: thy_defn and
-    "print_record*" :: diag
-begin
-
-subsection \<open>Introduction\<close>
-
-text \<open>
-  Records are isomorphic to compound tuple types. To implement
-  efficient records, we make this isomorphism explicit. Consider the
-  record access/update simplification \<open>alpha (beta_update f
-  rec) = alpha rec\<close> for distinct fields alpha and beta of some record
-  rec with n fields. There are \<open>n ^ 2\<close> such theorems, which
-  prohibits storage of all of them for large n. The rules can be
-  proved on the fly by case decomposition and simplification in O(n)
-  time. By creating O(n) isomorphic-tuple types while defining the
-  record, however, we can prove the access/update simplification in
-  \<open>O(log(n)^2)\<close> time.
-
-  The O(n) cost of case decomposition is not because O(n) steps are
-  taken, but rather because the resulting rule must contain O(n) new
-  variables and an O(n) size concrete record construction. To sidestep
-  this cost, we would like to avoid case decomposition in proving
-  access/update theorems.
-
-  Record types are defined as isomorphic to tuple types. For instance,
-  a record type with fields \<open>'a\<close>, \<open>'b\<close>, \<open>'c\<close>
-  and \<open>'d\<close> might be introduced as isomorphic to \<open>'a \<times>
-  ('b \<times> ('c \<times> 'd))\<close>. If we balance the tuple tree to \<open>('a \<times>
-  'b) \<times> ('c \<times> 'd)\<close> then accessors can be defined by converting to the
-  underlying type then using O(log(n)) fst or snd operations.
-  Updators can be defined similarly, if we introduce a \<open>fst_update\<close> and \<open>snd_update\<close> function. Furthermore, we can
-  prove the access/update theorem in O(log(n)) steps by using simple
-  rewrites on fst, snd, \<open>fst_update\<close> and \<open>snd_update\<close>.
-
-  The catch is that, although O(log(n)) steps were taken, the
-  underlying type we converted to is a tuple tree of size
-  O(n). Processing this term type wastes performance. We avoid this
-  for large n by taking each subtree of size K and defining a new type
-  isomorphic to that tuple subtree. A record can now be defined as
-  isomorphic to a tuple tree of these O(n/K) new types, or, if \<open>n > K*K\<close>, we can repeat the process, until the record can be
-  defined in terms of a tuple tree of complexity less than the
-  constant K.
-
-  If we prove the access/update theorem on this type with the
-  analogous steps to the tuple tree, we consume \<open>O(log(n)^2)\<close>
-  time as the intermediate terms are \<open>O(log(n))\<close> in size and
-  the types needed have size bounded by K. To enable this analogous
-  traversal, we define the functions seen below: \<open>iso_tuple_fst\<close>, \<open>iso_tuple_snd\<close>, \<open>iso_tuple_fst_update\<close>
-  and \<open>iso_tuple_snd_update\<close>. These functions generalise tuple
-  operations by taking a parameter that encapsulates a tuple
-  isomorphism. The rewrites needed on these functions now need an
-  additional assumption which is that the isomorphism works.
-
-  These rewrites are typically used in a structured way. They are here
-  presented as the introduction rule \<open>isomorphic_tuple.intros\<close>
-  rather than as a rewrite rule set. The introduction form is an
-  optimisation, as net matching can be performed at one term location
-  for each step rather than the simplifier searching the term for
-  possible pattern matches. The rule set is used as it is viewed
-  outside the locale, with the locale assumption (that the isomorphism
-  is valid) left as a rule assumption. All rules are structured to aid
-  net matching, using either a point-free form or an encapsulating
-  predicate.
-\<close>
-
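Editorial note on the balanced-tuple argument in the deleted `Record.thy` text above: the claim that balancing the nested pair tree makes a field access cost O(log n) `fst`/`snd` steps instead of O(n) can be checked with a small sketch. This is an illustration only (the `build`/`access` helpers are invented here, not part of the theory):

```python
# Store n fields as a balanced nested-pair tree; accessing field i then
# walks O(log n) fst/snd steps rather than O(n) in a right-nested chain.
def build(fields):
    if len(fields) == 1:
        return fields[0]
    mid = len(fields) // 2
    return (build(fields[:mid]), build(fields[mid:]))

def access(tree, i, n, steps=0):
    """Return (field i, number of fst/snd steps taken)."""
    if n == 1:
        return tree, steps
    mid = n // 2
    if i < mid:
        return access(tree[0], i, mid, steps + 1)       # take fst
    return access(tree[1], i - mid, n - mid, steps + 1)  # take snd

tree = build(list(range(1024)))
value, steps = access(tree, 777, 1024)
print(value, steps)  # field 777 reached in log2(1024) = 10 steps
```

The theory goes further by naming subtrees of bounded size K as fresh types, which is what keeps the intermediate terms O(log n) rather than O(n).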
-subsection \<open>Operators and lemmas for types isomorphic to tuples\<close>
-
-datatype (dead 'a, dead 'b, dead 'c) tuple_isomorphism =
-  Tuple_Isomorphism "'a \<Rightarrow> 'b \<times> 'c" "'b \<times> 'c \<Rightarrow> 'a"
-
-primrec
-  repr :: "('a, 'b, 'c) tuple_isomorphism \<Rightarrow> 'a \<Rightarrow> 'b \<times> 'c" where
-  "repr (Tuple_Isomorphism r a) = r"
-
-primrec
-  abst :: "('a, 'b, 'c) tuple_isomorphism \<Rightarrow> 'b \<times> 'c \<Rightarrow> 'a" where
-  "abst (Tuple_Isomorphism r a) = a"
-
-definition
-  iso_tuple_fst :: "('a, 'b, 'c) tuple_isomorphism \<Rightarrow> 'a \<Rightarrow> 'b" where
-  "iso_tuple_fst isom = fst \<circ> repr isom"
-
-definition
-  iso_tuple_snd :: "('a, 'b, 'c) tuple_isomorphism \<Rightarrow> 'a \<Rightarrow> 'c" where
-  "iso_tuple_snd isom = snd \<circ> repr isom"
-
-definition
-  iso_tuple_fst_update ::
-    "('a, 'b, 'c) tuple_isomorphism \<Rightarrow> ('b \<Rightarrow> 'b) \<Rightarrow> ('a \<Rightarrow> 'a)" where
-  "iso_tuple_fst_update isom f = abst isom \<circ> apfst f \<circ> repr isom"
-
-definition
-  iso_tuple_snd_update ::
-    "('a, 'b, 'c) tuple_isomorphism \<Rightarrow> ('c \<Rightarrow> 'c) \<Rightarrow> ('a \<Rightarrow> 'a)" where
-  "iso_tuple_snd_update isom f = abst isom \<circ> apsnd f \<circ> repr isom"
-
-definition
-  iso_tuple_cons ::
-    "('a, 'b, 'c) tuple_isomorphism \<Rightarrow> 'b \<Rightarrow> 'c \<Rightarrow> 'a" where
-  "iso_tuple_cons isom = curry (abst isom)"
-
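Editorial note on the `tuple_isomorphism` interface in the deleted definitions above: `repr`/`abst` convert between a record-like value and a pair, and the `iso_tuple_*` operations are plain compositions over them. A Python sketch of the same shape (the dict-based "record" and its isomorphism are invented for illustration):

```python
from typing import Any, Callable, NamedTuple, Tuple

class TupleIsomorphism(NamedTuple):
    repr: Callable[[Any], Tuple[Any, Any]]   # 'a => 'b * 'c
    abst: Callable[[Tuple[Any, Any]], Any]   # 'b * 'c => 'a

def iso_tuple_fst(isom, x):            # fst o repr isom
    return isom.repr(x)[0]

def iso_tuple_snd(isom, x):            # snd o repr isom
    return isom.repr(x)[1]

def iso_tuple_fst_update(isom, f, x):  # abst isom o apfst f o repr isom
    a, b = isom.repr(x)
    return isom.abst((f(a), b))

def iso_tuple_cons(isom, a, b):        # curry (abst isom)
    return isom.abst((a, b))

# A two-field "record" represented as a dict, with an explicit isomorphism:
isom = TupleIsomorphism(
    repr=lambda r: (r["alpha"], r["beta"]),
    abst=lambda p: {"alpha": p[0], "beta": p[1]},
)
rec = iso_tuple_cons(isom, 1, 2)
print(rec)
```

Making the isomorphism a first-class parameter is exactly what lets one set of rewrite rules serve every generated record type.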
subsection \<open>Logical infrastructure for records\<close>
|
||||
|
||||
definition
|
||||
iso_tuple_surjective_proof_assist :: "'a \<Rightarrow> 'b \<Rightarrow> ('a \<Rightarrow> 'b) \<Rightarrow> bool" where
|
||||
"iso_tuple_surjective_proof_assist x y f \<longleftrightarrow> f x = y"
|
||||
|
||||
definition
|
||||
iso_tuple_update_accessor_cong_assist ::
|
||||
"(('b \<Rightarrow> 'b) \<Rightarrow> ('a \<Rightarrow> 'a)) \<Rightarrow> ('a \<Rightarrow> 'b) \<Rightarrow> bool" where
|
||||
"iso_tuple_update_accessor_cong_assist upd ac \<longleftrightarrow>
|
||||
(\<forall>f v. upd (\<lambda>x. f (ac v)) v = upd f v) \<and> (\<forall>v. upd id v = v)"

definition
  iso_tuple_update_accessor_eq_assist ::
    "(('b \<Rightarrow> 'b) \<Rightarrow> ('a \<Rightarrow> 'a)) \<Rightarrow> ('a \<Rightarrow> 'b) \<Rightarrow> 'a \<Rightarrow> ('b \<Rightarrow> 'b) \<Rightarrow> 'a \<Rightarrow> 'b \<Rightarrow> bool" where
  "iso_tuple_update_accessor_eq_assist upd ac v f v' x \<longleftrightarrow>
    upd f v = v' \<and> ac v = x \<and> iso_tuple_update_accessor_cong_assist upd ac"

lemma update_accessor_congruence_foldE:
  assumes uac: "iso_tuple_update_accessor_cong_assist upd ac"
    and r: "r = r'" and v: "ac r' = v'"
    and f: "\<And>v. v' = v \<Longrightarrow> f v = f' v"
  shows "upd f r = upd f' r'"
  using uac r v [symmetric]
  apply (subgoal_tac "upd (\<lambda>x. f (ac r')) r' = upd (\<lambda>x. f' (ac r')) r'")
   apply (simp add: iso_tuple_update_accessor_cong_assist_def)
  apply (simp add: f)
  done

lemma update_accessor_congruence_unfoldE:
  "iso_tuple_update_accessor_cong_assist upd ac \<Longrightarrow>
    r = r' \<Longrightarrow> ac r' = v' \<Longrightarrow> (\<And>v. v = v' \<Longrightarrow> f v = f' v) \<Longrightarrow>
    upd f r = upd f' r'"
  apply (erule (2) update_accessor_congruence_foldE)
  apply simp
  done

lemma iso_tuple_update_accessor_cong_assist_id:
  "iso_tuple_update_accessor_cong_assist upd ac \<Longrightarrow> upd id = id"
  by rule (simp add: iso_tuple_update_accessor_cong_assist_def)

lemma update_accessor_noopE:
  assumes uac: "iso_tuple_update_accessor_cong_assist upd ac"
    and ac: "f (ac x) = ac x"
  shows "upd f x = x"
  using uac
  by (simp add: ac iso_tuple_update_accessor_cong_assist_id [OF uac, unfolded id_def]
    cong: update_accessor_congruence_unfoldE [OF uac])

lemma update_accessor_noop_compE:
  assumes uac: "iso_tuple_update_accessor_cong_assist upd ac"
    and ac: "f (ac x) = ac x"
  shows "upd (g \<circ> f) x = upd g x"
  by (simp add: ac cong: update_accessor_congruence_unfoldE [OF uac])

lemma update_accessor_cong_assist_idI:
  "iso_tuple_update_accessor_cong_assist id id"
  by (simp add: iso_tuple_update_accessor_cong_assist_def)

lemma update_accessor_cong_assist_triv:
  "iso_tuple_update_accessor_cong_assist upd ac \<Longrightarrow>
    iso_tuple_update_accessor_cong_assist upd ac"
  by assumption

lemma update_accessor_accessor_eqE:
  "iso_tuple_update_accessor_eq_assist upd ac v f v' x \<Longrightarrow> ac v = x"
  by (simp add: iso_tuple_update_accessor_eq_assist_def)

lemma update_accessor_updator_eqE:
  "iso_tuple_update_accessor_eq_assist upd ac v f v' x \<Longrightarrow> upd f v = v'"
  by (simp add: iso_tuple_update_accessor_eq_assist_def)

lemma iso_tuple_update_accessor_eq_assist_idI:
  "v' = f v \<Longrightarrow> iso_tuple_update_accessor_eq_assist id id v f v' v"
  by (simp add: iso_tuple_update_accessor_eq_assist_def update_accessor_cong_assist_idI)

lemma iso_tuple_update_accessor_eq_assist_triv:
  "iso_tuple_update_accessor_eq_assist upd ac v f v' x \<Longrightarrow>
    iso_tuple_update_accessor_eq_assist upd ac v f v' x"
  by assumption

lemma iso_tuple_update_accessor_cong_from_eq:
  "iso_tuple_update_accessor_eq_assist upd ac v f v' x \<Longrightarrow>
    iso_tuple_update_accessor_cong_assist upd ac"
  by (simp add: iso_tuple_update_accessor_eq_assist_def)

lemma iso_tuple_surjective_proof_assistI:
  "f x = y \<Longrightarrow> iso_tuple_surjective_proof_assist x y f"
  by (simp add: iso_tuple_surjective_proof_assist_def)

lemma iso_tuple_surjective_proof_assist_idE:
  "iso_tuple_surjective_proof_assist x y id \<Longrightarrow> x = y"
  by (simp add: iso_tuple_surjective_proof_assist_def)

locale isomorphic_tuple =
  fixes isom :: "('a, 'b, 'c) tuple_isomorphism"
  assumes repr_inv: "\<And>x. abst isom (repr isom x) = x"
    and abst_inv: "\<And>y. repr isom (abst isom y) = y"
begin

lemma repr_inj: "repr isom x = repr isom y \<longleftrightarrow> x = y"
  by (auto dest: arg_cong [of "repr isom x" "repr isom y" "abst isom"]
    simp add: repr_inv)

lemma abst_inj: "abst isom x = abst isom y \<longleftrightarrow> x = y"
  by (auto dest: arg_cong [of "abst isom x" "abst isom y" "repr isom"]
    simp add: abst_inv)

lemmas simps = Let_def repr_inv abst_inv repr_inj abst_inj

lemma iso_tuple_access_update_fst_fst:
  "f \<circ> h g = j \<circ> f \<Longrightarrow>
    (f \<circ> iso_tuple_fst isom) \<circ> (iso_tuple_fst_update isom \<circ> h) g =
      j \<circ> (f \<circ> iso_tuple_fst isom)"
  by (clarsimp simp: iso_tuple_fst_update_def iso_tuple_fst_def simps
    fun_eq_iff)

lemma iso_tuple_access_update_snd_snd:
  "f \<circ> h g = j \<circ> f \<Longrightarrow>
    (f \<circ> iso_tuple_snd isom) \<circ> (iso_tuple_snd_update isom \<circ> h) g =
      j \<circ> (f \<circ> iso_tuple_snd isom)"
  by (clarsimp simp: iso_tuple_snd_update_def iso_tuple_snd_def simps
    fun_eq_iff)

lemma iso_tuple_access_update_fst_snd:
  "(f \<circ> iso_tuple_fst isom) \<circ> (iso_tuple_snd_update isom \<circ> h) g =
    id \<circ> (f \<circ> iso_tuple_fst isom)"
  by (clarsimp simp: iso_tuple_snd_update_def iso_tuple_fst_def simps
    fun_eq_iff)

lemma iso_tuple_access_update_snd_fst:
  "(f \<circ> iso_tuple_snd isom) \<circ> (iso_tuple_fst_update isom \<circ> h) g =
    id \<circ> (f \<circ> iso_tuple_snd isom)"
  by (clarsimp simp: iso_tuple_fst_update_def iso_tuple_snd_def simps
    fun_eq_iff)

lemma iso_tuple_update_swap_fst_fst:
  "h f \<circ> j g = j g \<circ> h f \<Longrightarrow>
    (iso_tuple_fst_update isom \<circ> h) f \<circ> (iso_tuple_fst_update isom \<circ> j) g =
      (iso_tuple_fst_update isom \<circ> j) g \<circ> (iso_tuple_fst_update isom \<circ> h) f"
  by (clarsimp simp: iso_tuple_fst_update_def simps apfst_compose fun_eq_iff)

lemma iso_tuple_update_swap_snd_snd:
  "h f \<circ> j g = j g \<circ> h f \<Longrightarrow>
    (iso_tuple_snd_update isom \<circ> h) f \<circ> (iso_tuple_snd_update isom \<circ> j) g =
      (iso_tuple_snd_update isom \<circ> j) g \<circ> (iso_tuple_snd_update isom \<circ> h) f"
  by (clarsimp simp: iso_tuple_snd_update_def simps apsnd_compose fun_eq_iff)

lemma iso_tuple_update_swap_fst_snd:
  "(iso_tuple_snd_update isom \<circ> h) f \<circ> (iso_tuple_fst_update isom \<circ> j) g =
    (iso_tuple_fst_update isom \<circ> j) g \<circ> (iso_tuple_snd_update isom \<circ> h) f"
  by (clarsimp simp: iso_tuple_fst_update_def iso_tuple_snd_update_def
    simps fun_eq_iff)

lemma iso_tuple_update_swap_snd_fst:
  "(iso_tuple_fst_update isom \<circ> h) f \<circ> (iso_tuple_snd_update isom \<circ> j) g =
    (iso_tuple_snd_update isom \<circ> j) g \<circ> (iso_tuple_fst_update isom \<circ> h) f"
  by (clarsimp simp: iso_tuple_fst_update_def iso_tuple_snd_update_def simps
    fun_eq_iff)

lemma iso_tuple_update_compose_fst_fst:
  "h f \<circ> j g = k (f \<circ> g) \<Longrightarrow>
    (iso_tuple_fst_update isom \<circ> h) f \<circ> (iso_tuple_fst_update isom \<circ> j) g =
      (iso_tuple_fst_update isom \<circ> k) (f \<circ> g)"
  by (clarsimp simp: iso_tuple_fst_update_def simps apfst_compose fun_eq_iff)

lemma iso_tuple_update_compose_snd_snd:
  "h f \<circ> j g = k (f \<circ> g) \<Longrightarrow>
    (iso_tuple_snd_update isom \<circ> h) f \<circ> (iso_tuple_snd_update isom \<circ> j) g =
      (iso_tuple_snd_update isom \<circ> k) (f \<circ> g)"
  by (clarsimp simp: iso_tuple_snd_update_def simps apsnd_compose fun_eq_iff)

lemma iso_tuple_surjective_proof_assist_step:
  "iso_tuple_surjective_proof_assist v a (iso_tuple_fst isom \<circ> f) \<Longrightarrow>
    iso_tuple_surjective_proof_assist v b (iso_tuple_snd isom \<circ> f) \<Longrightarrow>
    iso_tuple_surjective_proof_assist v (iso_tuple_cons isom a b) f"
  by (clarsimp simp: iso_tuple_surjective_proof_assist_def simps
    iso_tuple_fst_def iso_tuple_snd_def iso_tuple_cons_def)

lemma iso_tuple_fst_update_accessor_cong_assist:
  assumes "iso_tuple_update_accessor_cong_assist f g"
  shows "iso_tuple_update_accessor_cong_assist
    (iso_tuple_fst_update isom \<circ> f) (g \<circ> iso_tuple_fst isom)"
proof -
  from assms have "f id = id"
    by (rule iso_tuple_update_accessor_cong_assist_id)
  with assms show ?thesis
    by (clarsimp simp: iso_tuple_update_accessor_cong_assist_def simps
      iso_tuple_fst_update_def iso_tuple_fst_def)
qed

lemma iso_tuple_snd_update_accessor_cong_assist:
  assumes "iso_tuple_update_accessor_cong_assist f g"
  shows "iso_tuple_update_accessor_cong_assist
    (iso_tuple_snd_update isom \<circ> f) (g \<circ> iso_tuple_snd isom)"
proof -
  from assms have "f id = id"
    by (rule iso_tuple_update_accessor_cong_assist_id)
  with assms show ?thesis
    by (clarsimp simp: iso_tuple_update_accessor_cong_assist_def simps
      iso_tuple_snd_update_def iso_tuple_snd_def)
qed

lemma iso_tuple_fst_update_accessor_eq_assist:
  assumes "iso_tuple_update_accessor_eq_assist f g a u a' v"
  shows "iso_tuple_update_accessor_eq_assist
    (iso_tuple_fst_update isom \<circ> f) (g \<circ> iso_tuple_fst isom)
    (iso_tuple_cons isom a b) u (iso_tuple_cons isom a' b) v"
proof -
  from assms have "f id = id"
    by (auto simp add: iso_tuple_update_accessor_eq_assist_def
      intro: iso_tuple_update_accessor_cong_assist_id)
  with assms show ?thesis
    by (clarsimp simp: iso_tuple_update_accessor_eq_assist_def
      iso_tuple_fst_update_def iso_tuple_fst_def
      iso_tuple_update_accessor_cong_assist_def iso_tuple_cons_def simps)
qed

lemma iso_tuple_snd_update_accessor_eq_assist:
  assumes "iso_tuple_update_accessor_eq_assist f g b u b' v"
  shows "iso_tuple_update_accessor_eq_assist
    (iso_tuple_snd_update isom \<circ> f) (g \<circ> iso_tuple_snd isom)
    (iso_tuple_cons isom a b) u (iso_tuple_cons isom a b') v"
proof -
  from assms have "f id = id"
    by (auto simp add: iso_tuple_update_accessor_eq_assist_def
      intro: iso_tuple_update_accessor_cong_assist_id)
  with assms show ?thesis
    by (clarsimp simp: iso_tuple_update_accessor_eq_assist_def
      iso_tuple_snd_update_def iso_tuple_snd_def
      iso_tuple_update_accessor_cong_assist_def iso_tuple_cons_def simps)
qed

lemma iso_tuple_cons_conj_eqI:
  "a = c \<and> b = d \<and> P \<longleftrightarrow> Q \<Longrightarrow>
    iso_tuple_cons isom a b = iso_tuple_cons isom c d \<and> P \<longleftrightarrow> Q"
  by (clarsimp simp: iso_tuple_cons_def simps)

lemmas intros =
  iso_tuple_access_update_fst_fst
  iso_tuple_access_update_snd_snd
  iso_tuple_access_update_fst_snd
  iso_tuple_access_update_snd_fst
  iso_tuple_update_swap_fst_fst
  iso_tuple_update_swap_snd_snd
  iso_tuple_update_swap_fst_snd
  iso_tuple_update_swap_snd_fst
  iso_tuple_update_compose_fst_fst
  iso_tuple_update_compose_snd_snd
  iso_tuple_surjective_proof_assist_step
  iso_tuple_fst_update_accessor_eq_assist
  iso_tuple_snd_update_accessor_eq_assist
  iso_tuple_fst_update_accessor_cong_assist
  iso_tuple_snd_update_accessor_cong_assist
  iso_tuple_cons_conj_eqI

end

lemma isomorphic_tuple_intro:
  fixes repr abst
  assumes repr_inj: "\<And>x y. repr x = repr y \<longleftrightarrow> x = y"
    and abst_inv: "\<And>z. repr (abst z) = z"
    and v: "v \<equiv> Tuple_Isomorphism repr abst"
  shows "isomorphic_tuple v"
proof
  fix x
  have "repr (abst (repr x)) = repr x"
    by (simp add: abst_inv)
  then show "Test.abst v (Test.repr v x) = x"
    by (simp add: v repr_inj)
next
  fix y
  show "Test.repr v (Test.abst v y) = y"
    by (simp add: v) (fact abst_inv)
qed

definition
  "tuple_iso_tuple \<equiv> Tuple_Isomorphism id id"

lemma tuple_iso_tuple:
  "isomorphic_tuple tuple_iso_tuple"
  by (simp add: isomorphic_tuple_intro [OF _ _ reflexive] tuple_iso_tuple_def)
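The intro lemma derives both inverse laws from injectivity of `repr` plus the single law `repr (abst z) = z`: since `repr (abst (repr x)) = repr x`, injectivity gives `abst (repr x) = x`. A Python sketch of that reasoning on a toy isomorphism (illustrative analogue only; the int/string pairing is hypothetical and only checked on samples):

```python
# A toy isomorphism between ints and their decimal string form.
# repr_ is injective on ints, and repr_(abst(z)) == z on canonical strings.

repr_ = str
abst = int

samples = [0, 7, 42]

# assumed law (abst_inv): repr (abst z) = z, checked on representations
assert all(repr_(abst(z)) == z for z in map(repr_, samples))

# derived law (repr_inv): abst (repr x) = x, which follows by injectivity
# because repr_(abst(repr_(x))) == repr_(x)
assert all(abst(repr_(x)) == x for x in samples)
```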

lemma refl_conj_eq: "Q = R \<Longrightarrow> P \<and> Q \<longleftrightarrow> P \<and> R"
  by simp

lemma iso_tuple_UNIV_I: "x \<in> UNIV \<equiv> True"
  by simp

lemma iso_tuple_True_simp: "(True \<Longrightarrow> PROP P) \<equiv> PROP P"
  by simp

lemma prop_subst: "s = t \<Longrightarrow> PROP P t \<Longrightarrow> PROP P s"
  by simp

lemma K_record_comp: "(\<lambda>x. c) \<circ> f = (\<lambda>x. c)"
  by (simp add: comp_def)


subsection \<open>Concrete record syntax\<close>

nonterminal
  ident and
  field_type and
  field_types and
  field and
  fields and
  field_update and
  field_updates

syntax
  "_constify" :: "id => ident" ("_")
  "_constify" :: "longid => ident" ("_")

  "_field_type" :: "ident => type => field_type" ("(2_ ::/ _)")
  "" :: "field_type => field_types" ("_")
  "_field_types" :: "field_type => field_types => field_types" ("_,/ _")
  "_record_type" :: "field_types => type" ("(3\<lparr>_\<rparr>)")
  "_record_type_scheme" :: "field_types => type => type" ("(3\<lparr>_,/ (2\<dots> ::/ _)\<rparr>)")

  "_field" :: "ident => 'a => field" ("(2_ =/ _)")
  "" :: "field => fields" ("_")
  "_fields" :: "field => fields => fields" ("_,/ _")
  "_record" :: "fields => 'a" ("(3\<lparr>_\<rparr>)")
  "_record_scheme" :: "fields => 'a => 'a" ("(3\<lparr>_,/ (2\<dots> =/ _)\<rparr>)")

  "_field_update" :: "ident => 'a => field_update" ("(2_ :=/ _)")
  "" :: "field_update => field_updates" ("_")
  "_field_updates" :: "field_update => field_updates => field_updates" ("_,/ _")
  "_record_update" :: "'a => field_updates => 'b" ("_/(3\<lparr>_\<rparr>)" [900, 0] 900)

syntax (ASCII)
  "_record_type" :: "field_types => type" ("(3'(| _ |'))")
  "_record_type_scheme" :: "field_types => type => type" ("(3'(| _,/ (2... ::/ _) |'))")
  "_record" :: "fields => 'a" ("(3'(| _ |'))")
  "_record_scheme" :: "fields => 'a => 'a" ("(3'(| _,/ (2... =/ _) |'))")
  "_record_update" :: "'a => field_updates => 'b" ("_/(3'(| _ |'))" [900, 0] 900)


subsection \<open>Record package\<close>

ML_file "test.ML"

hide_const (open) Tuple_Isomorphism repr abst iso_tuple_fst iso_tuple_snd
  iso_tuple_fst_update iso_tuple_snd_update iso_tuple_cons
  iso_tuple_surjective_proof_assist iso_tuple_update_accessor_cong_assist
  iso_tuple_update_accessor_eq_assist tuple_iso_tuple

end

@@ -23,6 +23,8 @@ keywords "text-" "text-latex" :: document_body
  and "update_instance-assert-error" :: document_body
  and "declare_reference-assert-error" :: document_body
  and "value-assert-error" :: document_body
  and "definition-assert-error" :: document_body
  and "doc_class-assert-error" :: document_body

begin

@@ -35,7 +37,8 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark
      xstring_opt:(xstring * Position.T) option),
      toks_list:Input.source list)
    : theory -> theory =
  let val (((oid,pos),cid_pos), doc_attrs) = meta_args
  let val ((binding,cid_pos), doc_attrs) = meta_args
      val oid = Binding.name_of binding
      val oid' = if meta_args = ODL_Meta_Args_Parser.empty_meta_args
                 then "output"
                 else oid

@@ -71,7 +74,7 @@ fun gen_enriched_document_command2 name {body} cid_transform attr_transform mark
      else
        Value_Command.Docitem_Parser.create_and_check_docitem
          {is_monitor = false} {is_inline = false} {define = true}
          oid pos (cid_transform cid_pos) (attr_transform doc_attrs))
          binding (cid_transform cid_pos) (attr_transform doc_attrs))
      (* ... generating the level-attribute syntax *)
  in handle_margs_opt #> (fn thy => (app (check_n_tex_text thy) toks_list; thy))
  end;
@@ -136,10 +139,10 @@ val _ =
        >> (Toplevel.theory o update_instance_command));

val _ =
  let fun create_and_check_docitem ((((oid, pos),cid_pos),doc_attrs),src) thy =
  let fun create_and_check_docitem (((binding,cid_pos),doc_attrs),src) thy =
        (Value_Command.Docitem_Parser.create_and_check_docitem
          {is_monitor = false} {is_inline=true}
          {define = false} oid pos (cid_pos) (doc_attrs) thy)
          {define = false} binding (cid_pos) (doc_attrs) thy)
        handle ERROR msg => (if error_match src msg
                             then (writeln ("Correct error: "^msg^": reported."); thy)
                             else error "Wrong error reported")

@@ -151,20 +154,55 @@ val _ =

val _ =
  let fun pass_trans_to_value_cmd (args, (((name, modes), t),src)) trans =
        (Value_Command.value_cmd {assert=false} args name modes t @{here} trans
         handle ERROR msg => (if error_match src msg
                              then (writeln ("Correct error: "^msg^": reported."); trans)
                              else error "Wrong error reported"))
  let fun pass_trans_to_value_cmd (args, (((name, modes), t),src)) trans =
        let val pos = Toplevel.pos_of trans
        in trans |> Toplevel.theory
             (fn thy => Value_Command.value_cmd {assert=false} args name modes t pos thy
              handle ERROR msg => (if error_match src msg
                                   then (writeln ("Correct error: "^msg^": reported."); thy)
                                   else error "Wrong error reported"))
        end
  in Outer_Syntax.command \<^command_keyword>\<open>value-assert-error\<close> "evaluate and print term"
       (ODL_Meta_Args_Parser.opt_attributes --
         (Value_Command.opt_evaluator
          -- Value_Command.opt_modes
          -- Parse.term
          -- Parse.document_source)
        >> (Toplevel.theory o pass_trans_to_value_cmd))
        >> (pass_trans_to_value_cmd))
  end;
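Each of these `*-assert-error` commands follows the same testing pattern: run the command, demand that it fails with the expected message, and treat a missing or different error as a test failure. A Python sketch of that pattern (illustrative analogue only; `assert_error` is a hypothetical helper, not part of the Isabelle/ML sources):

```python
# Run an action, require that it raises an error whose message contains the
# expectation, and turn a missing or wrong error into a test failure.

def assert_error(action, expected_substring):
    try:
        action()
    except Exception as e:
        if expected_substring in str(e):
            print(f"Correct error: {e}: reported.")
            return
        raise AssertionError("Wrong error reported")
    # the action succeeded although an error was expected
    raise AssertionError("Wrong error reported")

assert_error(lambda: 1 / 0, "division by zero")
```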

val _ =
  let fun definition_cmd' meta_args_opt decl params prems spec src bool ctxt =
        Local_Theory.background_theory (Value_Command.meta_args_exec meta_args_opt) ctxt
        |> (fn ctxt => Definition_Star_Command.definition_cmd decl params prems spec bool ctxt
            handle ERROR msg => if error_match src msg
                                then (writeln ("Correct error: "^msg^": reported.")
                                      ; pair "Bound 0" @{thm refl}
                                        |> pair (Bound 0)
                                        |> rpair ctxt)
                                else error "Wrong error reported")
  in
    Outer_Syntax.local_theory' \<^command_keyword>\<open>definition-assert-error\<close> "constant definition"
      (ODL_Meta_Args_Parser.opt_attributes --
        (Scan.option Parse_Spec.constdecl -- (Parse_Spec.opt_thm_name ":" -- Parse.prop) --
          Parse_Spec.if_assumes -- Parse.for_fixes -- Parse.document_source)
       >> (fn (meta_args_opt, ((((decl, spec), prems), params), src)) =>
             #2 oo definition_cmd' meta_args_opt decl params prems spec src))
  end;

val _ =
  let fun add_doc_class_cmd' ((((overloaded, hdr), (parent, attrs)),((rejects,accept_rex),invars)), src) =
        (fn thy => OntoParser.add_doc_class_cmd {overloaded = overloaded} hdr parent attrs rejects accept_rex invars thy
         handle ERROR msg => (if error_match src msg
                              then (writeln ("Correct error: "^msg^": reported."); thy)
                              else error "Wrong error reported"))
  in
    Outer_Syntax.command \<^command_keyword>\<open>doc_class-assert-error\<close>
      "define document class"
      ((OntoParser.parse_doc_class -- Parse.document_source)
       >> (Toplevel.theory o add_doc_class_cmd'))
  end

val _ =
  Outer_Syntax.command ("text-latex", \<^here>) "formal comment (primary style)"

@@ -0,0 +1,691 @@
theory Test_Polymorphic_Classes
  imports Isabelle_DOF.Isa_DOF
          TestKit
begin

doc_class title =
  short_title :: "string option" <= "None"

doc_class Author =
  email :: "string" <= "''''"

datatype classification = SIL0 | SIL1 | SIL2 | SIL3 | SIL4

doc_class abstract =
  keywordlist :: "string list" <= "[]"
  safety_level :: "classification" <= "SIL3"

doc_class text_section =
  authored_by :: "Author set" <= "{}"
  level :: "int option" <= "None"

doc_class ('a::one, 'b, 'c) test0 = text_section +
  testa :: "'a list"
  testb :: "'b list"
  testc :: "'c list"

typ\<open>('a, 'b, 'c) test0\<close>
typ\<open>('a, 'b, 'c, 'd) test0_scheme\<close>

find_consts name:"test0"
find_theorems name:"test0"


doc_class 'a test1 = text_section +
  test1 :: "'a list"
  invariant author_finite_test :: "finite (authored_by \<sigma>)"
  invariant force_level_test :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 1"

find_consts name:"test1*inv"
find_theorems name:"test1*inv"

text*[church::Author, email="\<open>b\<close>"]\<open>\<close>
text\<open>@{Author "church"}\<close>
value*\<open>@{Author \<open>church\<close>}\<close>

text\<open>\<^value_>\<open>@{Author \<open>church\<close>}\<close>\<close>

doc_class ('a, 'b) test2 = "'a test1" +
  test2 :: "'b list"
type_synonym ('a, 'b) test2_syn = "('a, 'b) test2"

find_theorems name:"test2"

declare [[invariants_checking_with_tactics]]
text*[testtest::"('a, int) test2", level = "Some 2", authored_by = "{@{Author \<open>church\<close>}}", test2 = "[1]"]\<open>\<close>
value*\<open>test2 @{test2 \<open>testtest\<close>}\<close>

text*[testtest2''::"(nat, int) test2", test1 = "[2::nat, 3]", test2 = "[4::int, 5]", level = "Some (2::int)"]\<open>\<close>
value*\<open>test1 @{test2 \<open>testtest2''\<close>}\<close>
declare [[invariants_checking_with_tactics = false]]

ML\<open>
val t = Syntax.parse_term \<^context> "@{test2 \<open>testtest\<close>}"
\<close>
ML\<open>
val t = \<^term>\<open>test2.make 8142730 Test_Parametric_Classes_2_test2_authored_by_Attribute_Not_Initialized Test_Parametric_Classes_2_test2_level_Attribute_Not_Initialized Test_Parametric_Classes_2_test2_test1_Attribute_Not_Initialized
          Test_Parametric_Classes_2_test2_test2_Attribute_Not_Initialized
          \<lparr>authored_by := bot, level := None\<rparr> \<close>
\<close>

text\<open>test2 = "[1::'a::one]" should be test2 = "[1::int]" because the type of testtest4 is ('a::one, int) test2:\<close>
text-assert-error[testtest4::"('a::one, int) test2", level = "Some 2", authored_by = "{@{Author \<open>church\<close>}}", test2 = "[1::'a::one]"]\<open>\<close>
\<open>Type unification failed\<close>
text\<open>Indeed this definition fails:\<close>
definition-assert-error testtest2::"('a::one, int) test2" where "testtest2 \<equiv>
  test2.make 11953346
    {@{Author \<open>church\<close>}}
    (Some 2)
    []
    []
    \<lparr>authored_by := bot
    , level := None, level := Some 2
    , authored_by := insert \<lparr>Author.tag_attribute = 11953164, email = []\<rparr> bot
    , test2.test2 := [1::('a::one)]\<rparr> "
\<open>Type unification failed\<close>

text\<open>For now, type synonyms are no longer supported as parent classes:\<close>
doc_class ('a, 'b, 'c) A =
  a :: "'a list"
  b :: "'b list"
  c :: "'c list"
type_synonym ('a, 'b, 'c) A_syn = "('a, 'b, 'c) A"

doc_class-assert-error ('a, 'b, 'c, 'd) B = "('b, 'c, 'd) A_syn" +
  d ::"'a::one list" <= "[1]"
\<open>Undefined onto class: "A_syn"\<close>


declare[[invariants_checking_with_tactics]]
definition* testauthor0 where "testauthor0 \<equiv> \<lparr>Author.tag_attribute = 5, email = \<open>test_author_email\<close>\<rparr>"
definition* testauthor :: "Author" where "testauthor \<equiv> \<lparr>Author.tag_attribute = 5, email = \<open>test_author_email\<close>\<rparr>"
definition* testauthor2 :: "Author" where "testauthor2 \<equiv> \<lparr>Author.tag_attribute = 5, email = \<open>test_author_email\<close>\<rparr> \<lparr>email := \<open>test_author_email_2\<close> \<rparr>"
definition* testauthor3 :: "Author" where "testauthor3 \<equiv> testauthor \<lparr>email := \<open>test_author_email_2\<close> \<rparr>"

ML\<open>
val ctxt = \<^context>
val input0 = Syntax.read_input "@{Author \<open>church\<close>}"
val source = Syntax.read_input "\<^term_>\<open>@{Author \<open>church\<close>}\<close>"
val input = source
val tt = Document_Output.output_document ctxt {markdown = false} input
\<close>

doc_class ('a, 'b) elaborate1 =
  a :: "'a list"
  b :: "'b list"

doc_class ('a, 'b) elaborate2 =
  c :: "('a, 'b) elaborate1 list"

doc_class ('a, 'b) elaborate3 =
  d :: "('a, 'b) elaborate2 list"

text*[test_elaborate1::"('a::one, 'b) elaborate1", a = "[1]"]\<open>\<close>

term*\<open>@{elaborate1 \<open>test_elaborate1\<close>}\<close>
value* [nbe]\<open>@{elaborate1 \<open>test_elaborate1\<close>}\<close>


text*[test_elaborate2::"('a::one, 'b) elaborate2", c = "[@{elaborate1 \<open>test_elaborate1\<close>}]"]\<open>\<close>

text*[test_elaborate3::"('a::one, 'b) elaborate3", d = "[@{elaborate2 \<open>test_elaborate2\<close>}]"]\<open>\<close>

term*\<open>(concat o concat) ((map o map) a (map c (elaborate3.d @{elaborate3 \<open>test_elaborate3\<close>})))\<close>
value*\<open>(concat o concat) ((map o map) a (map c (elaborate3.d @{elaborate3 \<open>test_elaborate3\<close>})))\<close>


text\<open>
The term antiquotation is considered a ground term.
Then its type here is \<^typ>\<open>'a::one list\<close> with \<open>'a\<close> a fixed type variable.
So the following definition only works because the parameter of the class is also \<open>'a\<close>.\<close>
declare[[ML_print_depth = 10000]]
doc_class 'a elaborate4 =
  d :: "'a::one list" <= "(concat o concat) ((map o map) a (map c (elaborate3.d @{elaborate3 \<open>test_elaborate3\<close>})))"
declare[[ML_print_depth = 20]]

declare[[ML_print_depth = 10000]]
text*[test_elaborate4::"'a::one elaborate4"]\<open>\<close>
declare[[ML_print_depth = 20]]


text\<open>Bug:
As the term antiquotation is considered a ground term,
its type \<^typ>\<open>'a::one list\<close> conflicts with the type of the attribute \<^typ>\<open>int list\<close>.
To support the instantiation of the term antiquotation as an \<^typ>\<open>int list\<close>,
the term antiquotation should behave like a constant definition,
which is not the case for now.\<close>
declare[[ML_print_depth = 10000]]
doc_class-assert-error elaborate4' =
  d :: "int list" <= "(concat o concat) ((map o map) a (map c (elaborate3.d @{elaborate3 \<open>test_elaborate3\<close>})))"
\<open>Type unification failed\<close>
declare[[ML_print_depth = 20]]

text\<open>The behavior we want to support:\<close>

definition one_list :: "'a::one list" where "one_list \<equiv> [1]"

text\<open>The constant \<^const>\<open>one_list\<close> can be instantiated as an \<^typ>\<open>int list\<close>:\<close>
doc_class elaborate4'' =
  d :: "int list" <= "one_list"

declare[[ML_print_depth = 10000]]
text*[test_elaborate4''::"elaborate4''"]\<open>\<close>
declare[[ML_print_depth = 20]]


term*\<open>concat (map a (elaborate2.c @{elaborate2 \<open>test_elaborate2\<close>}))\<close>
value*\<open>concat (map a (elaborate2.c @{elaborate2 \<open>test_elaborate2\<close>}))\<close>

text\<open>
The term antiquotation is considered a ground term.
Then its type here is \<^typ>\<open>'a::one list\<close> with \<open>'a\<close> a fixed type variable.
So the following definition only works because the parameter of the class is also \<open>'a\<close>.\<close>
declare[[ML_print_depth = 10000]]
doc_class 'a elaborate5 =
  d :: "'a::one list" <= "concat (map a (elaborate2.c @{elaborate2 \<open>test_elaborate2\<close>}))"
declare[[ML_print_depth = 20]]

text\<open>Bug: But when defining an instance, if we use a \<open>'b\<close> variable to specify the type
of the instance (\<^typ>\<open>'b::one elaborate5\<close>), the unification fails:\<close>
declare[[ML_print_depth = 10000]]
text-assert-error[test_elaborate5::"'b::one elaborate5"]\<open>\<close>
\<open>Inconsistent sort constraints for type variable "'b"\<close>
declare[[ML_print_depth = 20]]

text\<open>Bug:
The term antiquotation is considered a ground term.
Then its type here is \<^typ>\<open>'a::one list\<close> with \<open>'a\<close> a fixed type variable.
So it is not compatible with the type of the attribute \<^typ>\<open>'a::numeral list\<close>.\<close>
doc_class-assert-error 'a elaborate5' =
  d :: "'a::numeral list" <= "concat (map a (elaborate2.c @{elaborate2 \<open>test_elaborate2\<close>}))"
\<open>Sort constraint\<close>

text\<open>The behavior we want to support:\<close>

text\<open>The constant \<^const>\<open>one_list\<close> can be instantiated as an \<^typ>\<open>'a::numeral list\<close>:\<close>
doc_class 'a elaborate5'' =
  d :: "'a::numeral list" <= "one_list"

text*[test_elaborate1a::"('a::one, int) elaborate1", a = "[1]", b = "[2]"]\<open>\<close>
|
||||
|
||||
term*\<open>@{elaborate1 \<open>test_elaborate1a\<close>}\<close>
|
||||
value* [nbe]\<open>@{elaborate1 \<open>test_elaborate1a\<close>}\<close>
|
||||
|
||||
text*[test_elaborate2a::"('a::one, int) elaborate2", c = "[@{elaborate1 \<open>test_elaborate1a\<close>}]"]\<open>\<close>
|
||||
|
||||
text*[test_elaborate3a::"('a::one, int) elaborate3", d = "[@{elaborate2 \<open>test_elaborate2a\<close>}]"]\<open>\<close>
|
||||
|
||||
text\<open>
|
||||
The term antiquotation is considered a ground term.
|
||||
Then its type here is \<^typ>\<open>'a::one list\<close> with \<open>'a\<close> a fixed-type variable.
|
||||
So the following definition only works because the parameter of the class is also \<open>'a\<close>.\<close>
|
||||
definition* test_elaborate3_embedding ::"'a::one list"
|
||||
where "test_elaborate3_embedding \<equiv> (concat o concat) ((map o map) elaborate1.a (map elaborate2.c (elaborate3.d @{elaborate3 \<open>test_elaborate3a\<close>})))"
|
||||
|
||||
text\<open>Bug:
|
||||
The term antiquotation is considered a ground term.
|
||||
Then its type here is \<^typ>\<open>'a::one list\<close> with \<open>'a\<close> a fixed-type variable.
|
||||
So it is not compatible with the specified type of the definition \<^typ>\<open>int list\<close>:\<close>
|
||||
definition-assert-error test_elaborate3_embedding'::"int list"
|
||||
where "test_elaborate3_embedding' \<equiv> (concat o concat) ((map o map) elaborate1.a (map elaborate2.c (elaborate3.d @{elaborate3 \<open>test_elaborate3a\<close>})))"
|
||||
\<open>Type unification failed\<close>
|
||||
|
||||
term*\<open>@{elaborate1 \<open>test_elaborate1a\<close>}\<close>
|
||||
value* [nbe]\<open>@{elaborate1 \<open>test_elaborate1a\<close>}\<close>
|
||||
|
||||
|
||||
record ('a, 'b) elaborate1' =
a :: "'a list"
b :: "'b list"

record ('a, 'b) elaborate2' =
c :: "('a, 'b) elaborate1' list"

record ('a, 'b) elaborate3' =
d :: "('a, 'b) elaborate2' list"

doc_class 'a one =
a::"'a list"

text*[test_one::"'a::one one", a = "[1]"]\<open>\<close>

value* [nbe] \<open>@{one \<open>test_one\<close>}\<close>

term*\<open>a @{one \<open>test_one\<close>}\<close>

text\<open>Bug:
The term antiquotation is considered a ground term.
Then its type here is \<^typ>\<open>'a::one list\<close> with \<open>'a\<close> a fixed-type variable.
So it is not compatible with the specified type of the definition \<^typ>\<open>('b::one, 'a::numeral) elaborate1'\<close>,
because the term antiquotation cannot be instantiated as a \<^typ>\<open>'b::one list\<close>
and the \<open>'a\<close> is checked against the \<open>'a::numeral\<close> instance type parameter:\<close>
definition-assert-error test_elaborate1'::"('b::one, 'a::numeral) elaborate1'"
where "test_elaborate1' \<equiv> \<lparr> elaborate1'.a = a @{one \<open>test_one\<close>}, b = [2] \<rparr>"
\<open>Sort constraint\<close>

text\<open>This is the same behavior as the following:\<close>
definition-assert-error test_elaborate10::"('b::one, 'a::numeral) elaborate1'"
where "test_elaborate10 \<equiv> \<lparr> elaborate1'.a = [1::'a::one], b = [2] \<rparr>"
\<open>Sort constraint\<close>

definition-assert-error test_elaborate11::"('b::one, 'c::numeral) elaborate1'"
where "test_elaborate11 \<equiv> \<lparr> elaborate1'.a = [1::'a::one], b = [2] \<rparr>"
\<open>Type unification failed\<close>

text\<open>So this works:\<close>
definition* test_elaborate1''::"('a::one, 'b::numeral) elaborate1'"
where "test_elaborate1'' \<equiv> \<lparr> elaborate1'.a = a @{one \<open>test_one\<close>}, b = [2] \<rparr>"

term \<open>elaborate1'.a test_elaborate1''\<close>
value [nbe] \<open>elaborate1'.a test_elaborate1''\<close>

text\<open>But if we embed the term antiquotation in a definition,
the type unification works:\<close>
definition* onedef where "onedef \<equiv> @{one \<open>test_one\<close>}"

definition test_elaborate1'''::"('b::one, 'a::numeral) elaborate1'"
where "test_elaborate1''' \<equiv> \<lparr> elaborate1'.a = a onedef, b = [2] \<rparr>"

value [nbe] \<open>elaborate1'.a test_elaborate1'''\<close>


definition test_elaborate2'::"(int, 'b::numeral) elaborate2'"
where "test_elaborate2' \<equiv> \<lparr> c = [test_elaborate1''] \<rparr>"

definition test_elaborate3'::"(int, 'b::numeral) elaborate3'"
where "test_elaborate3' \<equiv> \<lparr> d = [test_elaborate2'] \<rparr>"


doc_class 'a test3' =
test3 :: "int"
test3' :: "'a list"

text*[testtest30::"'a::one test3'", test3'="[1]"]\<open>\<close>
text-assert-error[testtest30::"'a test3'", test3'="[1]"]\<open>\<close>
\<open>Type unification failed: Variable\<close>

find_consts name:"test3'.test3"
definition testone :: "'a::one test3'" where "testone \<equiv> \<lparr>tag_attribute = 5, test3 = 3, test3' = [1] \<rparr>"
definition* testtwo :: "'a::one test3'" where "testtwo \<equiv> \<lparr>tag_attribute = 5, test3 = 1, test3' = [1] \<rparr>\<lparr> test3 := 1\<rparr>"

text*[testtest3'::"'a test3'", test3 = "1"]\<open>\<close>

declare [[show_sorts = false]]
definition* testtest30 :: "'a test3'" where "testtest30 \<equiv> \<lparr>tag_attribute = 12, test3 = 2, test3' = [] \<rparr>"
update_instance*[testtest3'::"'a test3'", test3 := "2"]

ML\<open>
val t = @{value_ [nbe] \<open>test3 @{test3' \<open>testtest3'\<close>}\<close>}
val tt = HOLogic.dest_number t
\<close>

text\<open>@{value_ [] [nbe] \<open>test3 @{test3' \<open>testtest3'\<close>}\<close>}\<close>

update_instance*[testtest3'::"'a test3'", test3 += "2"]

ML\<open>
val t = @{value_ [nbe] \<open>test3 @{test3' \<open>testtest3'\<close>}\<close>}
val tt = HOLogic.dest_number t
\<close>

value\<open>test3 \<lparr> tag_attribute = 1, test3 = 2, test3' = [2::int, 3] \<rparr>\<close>
value\<open>test3 \<lparr> tag_attribute = 1, test3 = 2, test3' = [2::int, 3] \<rparr>\<close>
find_consts name:"test3'.test3"

ML\<open>
val test_value = @{value_ \<open>@{test3' \<open>testtest3'\<close>}\<close>}

\<close>
declare [[show_sorts = false]]
update_instance*[testtest3'::"'a test3'", test3 += "3"]
declare [[show_sorts = false]]
value*\<open>test3 @{test3' \<open>testtest3'\<close>}\<close>
value\<open>test3 \<lparr> tag_attribute = 12, test3 = 5, test3' = AAAAAA\<rparr>\<close>

find_consts name:"test3'.test3"

text*[testtest3''::"int test3'", test3 = "1"]\<open>\<close>

update_instance*[testtest3''::"int test3'", test3' += "[3]"]

value*\<open>test3' @{test3' \<open>testtest3''\<close>}\<close>

update_instance*[testtest3''::"int test3'", test3' := "[3]"]

value*\<open>test3' @{test3' \<open>testtest3''\<close>}\<close>

update_instance*[testtest3''::"int test3'", test3' += "[2,5]"]

value*\<open>test3' @{test3' \<open>testtest3''\<close>}\<close>

definition testeq where "testeq \<equiv> \<lambda>x. x"
find_consts name:"test3'.ma"

text-assert-error[testtest3''::"int test3'", test3 = "1", test3' = "[3::'a::numeral]"]\<open>\<close>
\<open>Type unification failed\<close>

text-assert-error[testtest3''::"int test3'", test3 = "1", test3' = "[3]"]\<open>\<close>
\<open>Duplicate instance declaration\<close>


declare[[ML_print_depth = 10000]]
definition-assert-error testest3''' :: "int test3'"
where "testest3''' \<equiv> \<lparr> tag_attribute = 12, test3 = 1, test3' = [2]\<rparr>\<lparr> test3' := [3::'a::numeral]\<rparr>"
\<open>Type unification failed\<close>
declare[[ML_print_depth = 20]]

value* \<open>test3 @{test3' \<open>testtest3''\<close>}\<close>
value* \<open>\<lparr> tag_attribute = 12, test3 = 1, test3' = [2]\<rparr>\<lparr> test3' := [3::int]\<rparr>\<close>
value* \<open>test3 (\<lparr> tag_attribute = 12, test3 = 1, test3' = [2]\<rparr>\<lparr> test3' := [3::int]\<rparr>)\<close>
term*\<open>@{test3' \<open>testtest3''\<close>}\<close>

ML\<open>val t = \<^term_>\<open>test3 @{test3' \<open>testtest3''\<close>}\<close>\<close>

value\<open>test3 \<lparr> tag_attribute = 12, test3 = 2, test3' = [2::int ,3]\<rparr>\<close>

find_consts name:"test3'.test3"
find_consts name:"Isabelle_DOF_doc_class_test3'"
update_instance*[testtest3''::"int test3'", test3 := "2"]
ML\<open>
val t = @{value_ [nbe] \<open>test3 @{test3' \<open>testtest3''\<close>}\<close>}
val tt = HOLogic.dest_number t |> snd
\<close>

doc_class 'a testa =
a:: "'a set"
b:: "int set"

text*[testtesta::"'a testa", b = "{2}"]\<open>\<close>
update_instance*[testtesta::"'a testa", b += "{3}"]

ML\<open>
val t = @{value_ [nbe] \<open>b @{testa \<open>testtesta\<close>}\<close>}
val tt = HOLogic.dest_set t |> map (HOLogic.dest_number #> snd)
\<close>

update_instance-assert-error[testtesta::"'a::numeral testa", a := "{2::'a::numeral}"]
\<open>incompatible classes:'a Test_Polymorphic_Classes.testa:'a::numeral Test_Polymorphic_Classes.testa\<close>

text*[testtesta'::"'a::numeral testa", a = "{2}"]\<open>\<close>

update_instance*[testtesta'::"'a::numeral testa", a += "{3}"]

ML\<open>
val t = @{value_ [nbe] \<open>a @{testa \<open>testtesta'\<close>}\<close>}
\<close>

update_instance-assert-error[testtesta'::"'a::numeral testa", a += "{3::int}"]
\<open>Type unification failed\<close>

definition-assert-error testtesta'' :: "'a::numeral testa"
where "testtesta'' \<equiv> \<lparr>tag_attribute = 5, a = {1}, b = {1} \<rparr>\<lparr> a := {1::int}\<rparr>"
\<open>Type unification failed\<close>

update_instance*[testtesta'::"'a::numeral testa", b := "{3::int}"]
ML\<open>
val t = @{value_ [nbe] \<open>b @{testa \<open>testtesta'\<close>}\<close>}
\<close>

value* [nbe] \<open>b @{testa \<open>testtesta'\<close>}\<close>

definition testtesta'' :: "'a::numeral testa"
where "testtesta'' \<equiv> \<lparr>tag_attribute = 5, a = {1}, b = {1} \<rparr>\<lparr> b := {2::int}\<rparr>"

value [nbe]\<open>b testtesta''\<close>

doc_class 'a test3 =
test3 :: "'a list"
type_synonym 'a test3_syn = "'a test3"

text*[testtest3::"int test3", test3 = "[1]"]\<open>\<close>
update_instance*[testtest3::"int test3", test3 := "[2]"]
ML\<open>
val t = \<^term_>\<open>test3 @{test3 \<open>testtest3\<close>}\<close>
val tt = \<^value_>\<open>test3 @{test3 \<open>testtest3\<close>}\<close> |> HOLogic.dest_list |> map HOLogic.dest_number
\<close>

update_instance*[testtest3::"int test3", test3 += "[3]"]
value*\<open>test3 @{test3 \<open>testtest3\<close>}\<close>


doc_class ('a, 'b) test4 = "'a test3" +
test4 :: "'b list"

definition-assert-error testtest0'::"('a::one, int) test4" where "testtest0' \<equiv>
test4.make 11953346
[] [1::('a::one)]"
\<open>Type unification failed\<close>

definition-assert-error testtest0''::"('a, int) test4" where "testtest0'' \<equiv>
test4.make 11953346
[1] Test_Parametric_Classes_2_test4_test4_Attribute_Not_Initialized"
\<open>Type unification failed\<close>

text\<open>Must fail, because the equivalent definition
\<open>testtest0'\<close> fails:
due to the constraint in the \<open>where\<close> clause, \<open>[1::('a::one)]\<close> is not an \<^typ>\<open>int list\<close>
but an \<^typ>\<open>'a::one list\<close>.\<close>
text-assert-error[testtest0::"('a::one, int) test4", test4 = "[1::'a::one]"]\<open>\<close>
\<open>Type unification failed\<close>
update_instance-assert-error[testtest0::"('a::one, int) test4"]
\<open>Undefined instance: "testtest0"\<close>

value-assert-error\<open>@{test4 \<open>testtest0\<close>}\<close>\<open>Undefined instance: "testtest0"\<close>

definition testtest0''::"('a, int) test4" where "testtest0'' \<equiv>
\<lparr> tag_attribute = 11953346, test3 = [], test4 = [1]\<rparr>\<lparr>test4 := [2]\<rparr>"

definition testtest0'''::"('a, int) test4" where "testtest0''' \<equiv>
\<lparr> tag_attribute = 11953346, test3 = [], test4 = [1]\<rparr>\<lparr>test4 := [2]\<rparr>"


value [nbe] \<open>test3 testtest0''\<close>

type_synonym notion = string

doc_class Introduction = text_section +
authored_by :: "Author set" <= "UNIV"
uses :: "notion set"
invariant author_finite :: "finite (authored_by \<sigma>)"
and force_level :: "(level \<sigma>) \<noteq> None \<and> the (level \<sigma>) > 1"

doc_class claim = Introduction +
based_on :: "notion list"

doc_class technical = text_section +
formal_results :: "thm list"

doc_class "definition" = technical +
is_formal :: "bool"
property :: "term list" <= "[]"

datatype kind = expert_opinion | argument | "proof"

doc_class result = technical +
evidence :: kind
property :: "thm list" <= "[]"
invariant has_property :: "evidence \<sigma> = proof \<longleftrightarrow> property \<sigma> \<noteq> []"

doc_class example = technical +
referring_to :: "(notion + definition) set" <= "{}"

doc_class conclusion = text_section +
establish :: "(claim \<times> result) set"
invariant establish_defined :: "\<forall> x. x \<in> Domain (establish \<sigma>)
\<longrightarrow> (\<exists> y \<in> Range (establish \<sigma>). (x, y) \<in> establish \<sigma>)"

text\<open>Next we define some instances (docitems): \<close>

declare[[invariants_checking_with_tactics = true]]

text*[church1::Author, email="\<open>church@lambda.org\<close>"]\<open>\<close>

text*[resultProof::result, evidence = "proof", property="[@{thm \<open>HOL.refl\<close>}]"]\<open>\<close>
text*[resultArgument::result, evidence = "argument"]\<open>\<close>

text\<open>The invariants \<^theory_text>\<open>author_finite\<close> and \<^theory_text>\<open>establish_defined\<close> cannot be checked directly
and need a little help.
We can set the \<open>invariants_checking_with_tactics\<close> theory attribute to help the checking.
It will enable a basic tactic using unfold and auto:\<close>

declare[[invariants_checking_with_tactics = true]]

text*[curry::Author, email="\<open>curry@lambda.org\<close>"]\<open>\<close>
text*[introduction2::Introduction, authored_by = "{@{Author \<open>church\<close>}}", level = "Some 2"]\<open>\<close>
(* When not commented out, this should violate the invariant:
update_instance*[introduction2::Introduction
, authored_by := "{@{Author \<open>church\<close>}}"
, level := "Some 1"]
*)

text*[introduction_test_parsed_elaborate::Introduction, authored_by = "authored_by @{Introduction \<open>introduction2\<close>}", level = "Some 2"]\<open>\<close>
term*\<open>authored_by @{Introduction \<open>introduction_test_parsed_elaborate\<close>}\<close>
value*\<open>authored_by @{Introduction \<open>introduction_test_parsed_elaborate\<close>}\<close>
text*[introduction3::Introduction, authored_by = "{@{Author \<open>church\<close>}}", level = "Some 2"]\<open>\<close>
text*[introduction4::Introduction, authored_by = "{@{Author \<open>curry\<close>}}", level = "Some 4"]\<open>\<close>

text*[resultProof2::result, evidence = "proof", property="[@{thm \<open>HOL.sym\<close>}]"]\<open>\<close>

text\<open>Then we can evaluate expressions with instances:\<close>

term*\<open>authored_by @{Introduction \<open>introduction2\<close>} = authored_by @{Introduction \<open>introduction3\<close>}\<close>
value*\<open>authored_by @{Introduction \<open>introduction2\<close>} = authored_by @{Introduction \<open>introduction3\<close>}\<close>
value*\<open>authored_by @{Introduction \<open>introduction2\<close>} = authored_by @{Introduction \<open>introduction4\<close>}\<close>

value*\<open>@{Introduction \<open>introduction2\<close>}\<close>

value*\<open>{@{Author \<open>curry\<close>}} = {@{Author \<open>church\<close>}}\<close>

term*\<open>property @{result \<open>resultProof\<close>} = property @{result \<open>resultProof2\<close>}\<close>
value*\<open>property @{result \<open>resultProof\<close>} = property @{result \<open>resultProof2\<close>}\<close>

value*\<open>evidence @{result \<open>resultProof\<close>} = evidence @{result \<open>resultProof2\<close>}\<close>

declare[[invariants_checking_with_tactics = false]]

declare[[invariants_strict_checking = false]]

doc_class test_A =
level :: "int option"
x :: int

doc_class test_B =
level :: "int option"
x :: "string" (* attributes live in their own name-space *)
y :: "string list" <= "[]" (* and can have arbitrary type constructors *)
(* LaTeX may have problems with this, though *)

text\<open>We may even use type-synonyms for class synonyms ...\<close>
type_synonym test_XX = test_B

doc_class test_C0 = test_B +
z :: "test_A option" <= None (* A LINK, i.e. an attribute that has a type
referring to a document class. Mathematical
relations over document items can be modeled. *)
g :: "thm" (* a reference to the proxy-type 'thm' allowing

to denote references to theorems inside attributes *)


doc_class test_C = test_B +
z :: "test_A option" <= None (* A LINK, i.e. an attribute that has a type
referring to a document class. Mathematical
relations over document items can be modeled. *)
g :: "thm" (* a reference to the proxy-type 'thm' allowing

to denote references to theorems inside attributes *)

datatype enum = X1 | X2 | X3 (* we add an enumeration type ... *)


doc_class test_D = test_B +
x :: "string" <= "\<open>def \<longrightarrow>\<close>" (* overriding default *)
a1 :: enum <= "X2" (* class - definitions may be mixed
with arbitrary HOL-commands, thus
also local definitions of enumerations *)
a2 :: int <= 0

doc_class test_E = test_D +
x :: "string" <= "''qed''" (* overriding default *)

doc_class test_G = test_C +
g :: "thm" <= "@{thm \<open>HOL.refl\<close>}" (* warning overriding attribute expected*)

doc_class 'a test_F =
properties :: "term list"
r :: "thm list"
u :: "file"
s :: "typ list"
b :: "(test_A \<times> 'a test_C_scheme) set" <= "{}" (* This is a relation link, roughly corresponding
to an association class. It can be used to track
claims to result - relations, for example.*)
b' :: "(test_A \<times> 'a test_C_scheme) list" <= "[]"
invariant br :: "r \<sigma> \<noteq> [] \<and> card(b \<sigma>) \<ge> 3"
and br':: "r \<sigma> \<noteq> [] \<and> length(b' \<sigma>) \<ge> 3"
and cr :: "properties \<sigma> \<noteq> []"

lemma*[l::test_E] local_sample_lemma :
"@{thm \<open>refl\<close>} = @{thm ''refl''}" by simp
\<comment> \<open>un-evaluated references are similar to
uninterpreted constants. Not much is known
about them, but that doesn't mean that we
can't prove some basics over them...\<close>

text*[xcv1::test_A, x=5]\<open>Lorem ipsum ...\<close>
text*[xcv2::test_C, g="@{thm ''HOL.refl''}"]\<open>Lorem ipsum ...\<close>
text*[xcv3::test_A, x=7]\<open>Lorem ipsum ...\<close>

text\<open>Bug: For now, the implementation is no longer compatible with the docitem term-antiquotation:\<close>
text-assert-error[xcv10::"unit test_F", r="[@{thm ''HOL.refl''},
@{thm \<open>local_sample_lemma\<close>}]", (* long names required *)
b="{(@{docitem ''xcv1''},@{docitem \<open>xcv2\<close>})}", (* notations \<open>...\<close> vs. ''...'' *)
s="[@{typ \<open>int list\<close>}]",
properties = "[@{term \<open>H \<longrightarrow> H\<close>}]" (* notation \<open>...\<close> required for UTF8*)
]\<open>Lorem ipsum ...\<close>\<open>Type unification failed\<close>

text*[xcv11::"unit test_F", r="[@{thm ''HOL.refl''},
@{thm \<open>local_sample_lemma\<close>}]", (* long names required *)
b="{(@{test_A ''xcv1''},@{test_C \<open>xcv2\<close>})}", (* notations \<open>...\<close> vs. ''...'' *)
s="[@{typ \<open>int list\<close>}]",
properties = "[@{term \<open>H \<longrightarrow> H\<close>}]" (* notation \<open>...\<close> required for UTF8*)
]\<open>Lorem ipsum ...\<close>

value*\<open>b @{test_F \<open>xcv11\<close>}\<close>

typ\<open>unit test_F\<close>

text*[xcv4::"unit test_F", r="[@{thm ''HOL.refl''},
@{thm \<open>local_sample_lemma\<close>}]", (* long names required *)
b="{(@{test_A ''xcv1''},@{test_C \<open>xcv2\<close>})}", (* notations \<open>...\<close> vs. ''...'' *)
s="[@{typ \<open>int list\<close>}]",
properties = "[@{term \<open>H \<longrightarrow> H\<close>}]" (* notation \<open>...\<close> required for UTF8*)
]\<open>Lorem ipsum ...\<close>

value*\<open>b @{test_F \<open>xcv4\<close>}\<close>

text*[xcv5::test_G, g="@{thm \<open>HOL.sym\<close>}"]\<open>Lorem ipsum ...\<close>

update_instance*[xcv4::"unit test_F", b+="{(@{test_A ''xcv3''},@{test_C ''xcv2''})}"]

update_instance-assert-error[xcv4::"unit test_F", b+="{(@{test_A ''xcv3''},@{test_G ''xcv5''})}"]
\<open>Type unification failed: Clash of types\<close>



typ\<open>unit test_G_ext\<close>
typ\<open>\<lparr>test_G.tag_attribute :: int\<rparr>\<close>
text*[xcv6::"\<lparr>test_G.tag_attribute :: int\<rparr> test_F", b="{(@{test_A ''xcv3''},@{test_G ''xcv5''})}"]\<open>\<close>


text\<open>\<open>lemma*\<close>, etc., do not yet support term antiquotations over polymorphic classes well.
For now, only a term antiquotation embedded in a definition works:\<close>
definition* testtest_level where "testtest_level \<equiv> the (text_section.level @{test2 \<open>testtest2''\<close>})"
lemma*[e5::E] testtest : "xx + testtest_level = yy + testtest_level \<Longrightarrow> xx = yy" by simp

text\<open>Indeed this fails:\<close>
(*lemma*[e6::E] testtest : "xx + the (level @{test2 \<open>testtest2''\<close>}) = yy + the (level @{test2 \<open>testtest2''\<close>}) \<Longrightarrow> xx = yy" by simp*)

end
File diff suppressed because it is too large
@ -165,11 +165,11 @@ text\<open>The intended use for the \<open>doc_class\<close>es \<^verbatim>\<ope
\<^verbatim>\<open>math_example\<close> (or \<^verbatim>\<open>math_ex\<close> for short)
are \<^emph>\<open>informal\<close> descriptions of semi-formal definitions (by inheritance).
Math-Examples can be made referentiable, triggering explicit, numbered presentations.\<close>
doc_class math_motivation = tc +
doc_class math_motivation = technical +
referentiable :: bool <= False
type_synonym math_mtv = math_motivation

doc_class math_explanation = tc +
doc_class math_explanation = technical +
referentiable :: bool <= False
type_synonym math_exp = math_explanation

@ -207,7 +207,7 @@ datatype math_content_class =
text\<open>Instances of the \<open>doc_class\<close> \<^verbatim>\<open>math_content\<close> are by definition @{term "semiformal"}; they may
be non-referential, but in this case they will not have a @{term "short_name"}.\<close>

doc_class math_content = tc +
doc_class math_content = technical +
referentiable :: bool <= False
short_name :: string <= "''''"
status :: status <= "semiformal"
@ -516,34 +516,34 @@ subsection\<open>Content in Engineering/Tech Papers \<close>
text\<open>This section is currently experimental and not supported by the documentation
generation backend.\<close>

doc_class engineering_content = tc +
doc_class engineering_content = technical +
short_name :: string <= "''''"
status :: status
type_synonym eng_content = engineering_content


doc_class "experiment" = eng_content +
doc_class "experiment" = engineering_content +
tag :: "string" <= "''''"

doc_class "evaluation" = eng_content +
doc_class "evaluation" = engineering_content +
tag :: "string" <= "''''"

doc_class "data" = eng_content +
doc_class "data" = engineering_content +
tag :: "string" <= "''''"

doc_class tech_definition = eng_content +
doc_class tech_definition = engineering_content +
referentiable :: bool <= True
tag :: "string" <= "''''"

doc_class tech_code = eng_content +
doc_class tech_code = engineering_content +
referentiable :: bool <= True
tag :: "string" <= "''''"

doc_class tech_example = eng_content +
doc_class tech_example = engineering_content +
referentiable :: bool <= True
tag :: "string" <= "''''"

doc_class eng_example = eng_content +
doc_class eng_example = engineering_content +
referentiable :: bool <= True
tag :: "string" <= "''''"

@ -376,7 +376,9 @@ fun fig_content_antiquotation name scan =


fun figure_content ctxt (cfg_trans,file:Input.source) =
let val (wdth_val_s, ht_s, caption) = process_args cfg_trans
let val _ = Resources.check_file ctxt (SOME (get_document_dir ctxt)) file
(* ToDo: must be declared source of type png or jpeg or pdf, ... *)
val (wdth_val_s, ht_s, caption) = process_args cfg_trans
val args = ["keepaspectratio","width=" ^ wdth_val_s, ht_s]
|> commas
|> enclose "[" "]"

@ -427,11 +429,11 @@ fun convert_src_from_margs ctxt (X, (((str,_),value)::R)) =
fun float_command (name, pos) descr cid =
let fun set_default_class NONE = SOME(cid,pos)
|set_default_class (SOME X) = SOME X
fun create_instance ((((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
fun create_instance (((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
Value_Command.Docitem_Parser.create_and_check_docitem
{is_monitor = false}
{is_inline = true}
{define = true} oid pos (set_default_class cid_pos) doc_attrs
{define = true} binding (set_default_class cid_pos) doc_attrs
fun generate_fig_ltx_ctxt ctxt cap_src oid body =
Latex.macro0 "centering"
@ body

@ -439,25 +441,31 @@ fun float_command (name, pos) descr cid =
@ Latex.macro "label" (DOF_core.get_instance_name_global oid (Proof_Context.theory_of ctxt)
|> DOF_core.output_name
|> Latex.string)
fun parse_and_tex (margs as (((oid, _),_), _), cap_src) ctxt =
(convert_src_from_margs ctxt margs)
|> pair (upd_caption (K Input.empty) #> convert_meta_args ctxt margs)
|> fig_content ctxt
|> generate_fig_ltx_ctxt ctxt cap_src oid
|> (Latex.environment ("figure") )
fun parse_and_tex (margs as ((binding,_), _), cap_src) ctxt =
let val oid = Binding.name_of binding
in
(convert_src_from_margs ctxt margs)
|> pair (upd_caption (K Input.empty) #> convert_meta_args ctxt margs)
|> fig_content ctxt
|> generate_fig_ltx_ctxt ctxt cap_src oid
|> (Latex.environment ("figure") )
end
in Monitor_Command_Parser.onto_macro_cmd_command (name, pos) descr create_instance parse_and_tex
end

fun listing_command (name, pos) descr cid =
let fun set_default_class NONE = SOME(cid,pos)
|set_default_class (SOME X) = SOME X
fun create_instance ((((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
fun create_instance (((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
Value_Command.Docitem_Parser.create_and_check_docitem
{is_monitor = false}
{is_inline = true}
{define = true} oid pos (set_default_class cid_pos) doc_attrs
fun parse_and_tex (margs as (((_, pos),_), _), _) _ =
ISA_core.err ("Not yet implemented.\n Please use text*[oid::listing]\<open>\<close> instead.") pos
{define = true} binding (set_default_class cid_pos) doc_attrs
fun parse_and_tex (margs as ((binding,_), _), _) _ =
let val pos = Binding.pos_of binding
in
ISA_core.err ("Not yet implemented.\n Please use text*[oid::listing]\<open>\<close> instead.") pos
end
in Monitor_Command_Parser.onto_macro_cmd_command (name, pos) descr create_instance parse_and_tex
end

@ -752,7 +760,7 @@ text\<open> @{table_inline [display] (cell_placing = center,cell_height =\<open

(*>*)

text\<open>beamer frame environment\<close>
text\<open>beamer support\<close>
(* Under development *)

doc_class frame =
@ -784,6 +792,18 @@ fun upd_frametitle f =
fun upd_framesubtitle f =
upd_frame (fn (options, frametitle, framesubtitle) => (options, frametitle, f framesubtitle))

type block = {title: Input.source}

val empty_block = {title = Input.empty}

fun make_block title = {title = title}

fun upd_block f =
fn {title} => make_block (f title)

fun upd_block_title f =
upd_block (fn title => f title)

val unenclose_end = unenclose
val unenclose_string = unenclose o unenclose o unenclose_end

@ -794,6 +814,42 @@ fun read_string s =
else unenclose_string s |> Syntax.read_input
end

val block_titleN = "title"

fun block_modes (ctxt, toks) =
let val (y, toks') = ((((Scan.optional
(Args.parens
(Parse.list1
((Args.$$$ block_titleN |-- Args.$$$ "=" -- Parse.document_source
>> (fn (_, k) => upd_block_title (K k)))
))) [K empty_block])
: (block -> block) list parser)
>> (foldl1 (op #>)))
: (block -> block) parser)
(toks)
in (y, (ctxt, toks')) end

fun process_args cfg_trans =
let val {title} = cfg_trans empty_block
in title end

fun block ctxt (cfg_trans,src) =
let val title = process_args cfg_trans
in Latex.string "{"
@ (title |> Document_Output.output_document ctxt {markdown = false})
@ Latex.string "}"
@ (src |> Document_Output.output_document ctxt {markdown = false})
|> (Latex.environment "block")
end

fun block_antiquotation name scan =
(Document_Output.antiquotation_raw_embedded name
(scan : ((block -> block) * Input.source) context_parser)
(block: Proof.context -> (block -> block) * Input.source -> Latex.text));

val _ = block_antiquotation \<^binding>\<open>block\<close> (block_modes -- Scan.lift Parse.document_source)
|> Theory.setup

fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
(case YXML.content_of str of
"frametitle" => upd_frametitle (K(YXML.content_of value |> read_string))
@ -808,18 +864,19 @@ fun convert_meta_args ctxt (X, (((str,_),value) :: R)) =
|
|||
fun frame_command (name, pos) descr cid =
|
||||
let fun set_default_class NONE = SOME(cid,pos)
|
||||
|set_default_class (SOME X) = SOME X
|
||||
fun create_instance ((((oid, pos),cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
|
||||
fun create_instance (((binding,cid_pos), doc_attrs) : ODL_Meta_Args_Parser.meta_args_t) =
|
||||
Value_Command.Docitem_Parser.create_and_check_docitem
|
||||
{is_monitor = false}
|
||||
{is_inline = true}
|
||||
{define = true} oid pos (set_default_class cid_pos) doc_attrs
|
||||
fun titles_src ctxt frametitle framesubtitle src = Latex.string "{"
|
||||
@ Document_Output.output_document ctxt {markdown = false} frametitle
|
||||
@ Latex.string "}"
|
||||
@ Latex.string "{"
|
||||
@ (Document_Output.output_document ctxt {markdown = false} framesubtitle)
|
||||
@ Latex.string "}"
|
||||
@ Document_Output.output_document ctxt {markdown = true} src
|
||||
{define = true} binding (set_default_class cid_pos) doc_attrs
|
||||
fun titles_src ctxt frametitle framesubtitle src =
|
||||
Latex.string "{"
|
||||
@ Document_Output.output_document ctxt {markdown = false} frametitle
|
||||
@ Latex.string "}"
|
||||
@ Latex.string "{"
|
||||
@ (Document_Output.output_document ctxt {markdown = false} framesubtitle)
|
||||
@ Latex.string "}"
|
||||
@ Document_Output.output_document ctxt {markdown = true} src
|
||||
fun generate_src_ltx_ctxt ctxt src cfg_trans =
|
||||
let val {options, frametitle, framesubtitle} = cfg_trans empty_frame
|
||||
in
|
||||
|
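The parser above folds a list of `block -> block` configuration transformers into a single one with `foldl1 (op #>)`. A minimal standalone sketch of that composition pattern (the `block` record and names here are hypothetical, not the actual ISA-DOF sources; plain `List.foldl` stands in for Isabelle's `foldl1`):

```sml
(* Isabelle/ML's reverse composition: (f #> g) applies f, then g. *)
infix #>
fun (f #> g) x = g (f x)

type block = {title : string}
val empty_block : block = {title = ""}
fun upd_block_title f ({title} : block) = {title = f title}

(* Fold a list of transformers into one, starting from the identity. *)
val compose = List.foldl (op #>) (fn (b : block) => b)
val b = compose [upd_block_title (fn _ => "Intro")] empty_block
```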
File diff suppressed because it is too large
@@ -51,7 +51,13 @@ definition rep1 :: "'a rexp \<Rightarrow> 'a rexp" ("\<lbrace>(_)\<rbrace>\<^sup

definition opt :: "'a rexp \<Rightarrow> 'a rexp" ("\<lbrakk>(_)\<rbrakk>")
where "\<lbrakk>A\<rbrakk> \<equiv> A || One"


find_consts name:"RegExpI*"

ML\<open>
val t = Sign.syn_of \<^theory>
\<close>
print_syntax
value "Star (Conc(Alt (Atom(CHR ''a'')) (Atom(CHR ''b''))) (Atom(CHR ''c'')))"
text\<open>or better equivalently:\<close>
value "\<lbrace>(\<lfloor>CHR ''a''\<rfloor> || \<lfloor>CHR ''b''\<rfloor>) ~~ \<lfloor>CHR ''c''\<rfloor>\<rbrace>\<^sup>*"
@@ -240,6 +246,8 @@ no_notation Star ("\<lbrace>(_)\<rbrace>\<^sup>*" [0]100)
no_notation Plus (infixr "||" 55)
no_notation Times (infixr "~~" 60)
no_notation Atom ("\<lfloor>_\<rfloor>" 65)
no_notation rep1 ("\<lbrace>(_)\<rbrace>\<^sup>+")
no_notation opt ("\<lbrakk>(_)\<rbrakk>")

ML\<open>
structure RegExpInterface_Notations =

@@ -248,7 +256,9 @@ val Star = (\<^term>\<open>Regular_Exp.Star\<close>, Mixfix (Syntax.read_input "
val Plus = (\<^term>\<open>Regular_Exp.Plus\<close>, Infixr (Syntax.read_input "||", 55, Position.no_range))
val Times = (\<^term>\<open>Regular_Exp.Times\<close>, Infixr (Syntax.read_input "~~", 60, Position.no_range))
val Atom = (\<^term>\<open>Regular_Exp.Atom\<close>, Mixfix (Syntax.read_input "\<lfloor>_\<rfloor>", [], 65, Position.no_range))
val notations = [Star, Plus, Times, Atom]
val opt = (\<^term>\<open>RegExpInterface.opt\<close>, Mixfix (Syntax.read_input "\<lbrakk>(_)\<rbrakk>", [], 1000, Position.no_range))
val rep1 = (\<^term>\<open>RegExpInterface.rep1\<close>, Mixfix (Syntax.read_input "\<lbrace>(_)\<rbrace>\<^sup>+", [], 1000, Position.no_range))
val notations = [Star, Plus, Times, Atom, rep1, opt]
end
\<close>
@@ -208,13 +208,13 @@ text\<open>
in many features over-accomplishes the required features of \<^dof>.
\<close>

figure*["fig:dof-ide",relative_width="95",file_src="''figures/cicm2018-combined.png''"]\<open>
figure*["fig_dof_ide",relative_width="95",file_src="''figures/cicm2018-combined.png''"]\<open>
The \<^isadof> IDE (left) and the corresponding PDF (right), showing the first page
of~@{cite "brucker.ea:isabelle-ontologies:2018"}.\<close>

text\<open>
We call the present implementation of \<^dof> on the Isabelle platform \<^isadof> .
@{figure "fig:dof-ide"} shows a screen-shot of an introductory paper on
@{figure "fig_dof_ide"} shows a screen-shot of an introductory paper on
\<^isadof>~@{cite "brucker.ea:isabelle-ontologies:2018"}: the \<^isadof> PIDE can be seen on the left,
while the generated presentation in PDF is shown on the right.
@@ -477,7 +477,7 @@ on the level of generated \<^verbatim>\<open>.aux\<close>-files, which are not n
error-message and compiling it with a consistent bibtex usually makes disappear this behavior.
\<close>

subsection*["using-term-aq"::technical, main_author = "Some @{author ''bu''}"]
subsection*["using_term_aq"::technical, main_author = "Some @{author ''bu''}"]
\<open>Using Term-Antiquotations\<close>

text\<open>The present version of \<^isadof> is the first version that supports the novel feature of

@@ -577,11 +577,11 @@ term antiquotations:
\<close>

(*<*)
declare_reference*["subsec:onto-term-ctxt"::technical]
declare_reference*["subsec_onto_term_ctxt"::technical]
(*>*)

text\<open>They are text-contexts equivalents to the \<^theory_text>\<open>term*\<close> and \<^theory_text>\<open>value*\<close> commands
for term-contexts introduced in @{technical (unchecked) \<open>subsec:onto-term-ctxt\<close>}\<close>
for term-contexts introduced in @{technical (unchecked) \<open>subsec_onto_term_ctxt\<close>}\<close>

subsection\<open>A Technical Report with Tight Checking\<close>
text\<open>An example of tight checking is a small programming manual to document programming trick
@@ -164,7 +164,7 @@ text\<open>
to and between ontological concepts.
\<close>

subsection*["odl-manual0"::technical]\<open>Some Isabelle/HOL Specification Constructs Revisited\<close>
subsection*["odl_manual0"::technical]\<open>Some Isabelle/HOL Specification Constructs Revisited\<close>
text\<open>
As ODL is an extension of Isabelle/HOL, document class definitions can therefore be arbitrarily
mixed with standard HOL specification constructs. To make this manual self-contained, we present

@@ -231,7 +231,7 @@ corresponding type-name \<^boxed_theory_text>\<open>0.foo\<close> is not. For th
definition of a \<^boxed_theory_text>\<open>doc_class\<close> reject problematic lexical overlaps.\<close>


subsection*["odl-manual1"::technical]\<open>Defining Document Classes\<close>
subsection*["odl_manual1"::technical]\<open>Defining Document Classes\<close>
text\<open>
A document class\<^bindex>\<open>document class\<close> can be defined using the @{command "doc_class"} keyword:
\<^item> \<open>class_id\<close>:\<^bindex>\<open>class\_id@\<open>class_id\<close>\<close> a type-\<open>name\<close> that has been introduced

@@ -350,7 +350,7 @@ layout; these commands have to be wrapped into
text\<open>

\<^item> \<open>obj_id\<close>:\<^index>\<open>obj\_id@\<open>obj_id\<close>\<close> (or \<^emph>\<open>oid\<close>\<^index>\<open>oid!oid@\<open>see obj_id\<close>\<close> for short) a \<^emph>\<open>name\<close>
as specified in @{technical \<open>odl-manual0\<close>}.
as specified in @{technical \<open>odl_manual0\<close>}.
\<^item> \<open>meta_args\<close> :
\<^rail>\<open>obj_id ('::' class_id) ((',' attribute '=' HOL_term) *) \<close>
\<^item> \<^emph>\<open>evaluator\<close>: from @{cite "wenzel:isabelle-isar:2020"}, evaluation is tried first using ML,
@@ -465,16 +465,16 @@ text*[b::B'_test']\<open>\<close>

term*\<open>@{B'_test' \<open>b\<close>}\<close>

declare_reference*["text-elements-expls"::technical]
declare_reference*["text_elements_expls"::technical]
(*>*)

subsection*["subsec:onto-term-ctxt"::technical]\<open>Ontological Term-Contexts and their Management\<close>
subsection*["subsec_onto_term_ctxt"::technical]\<open>Ontological Term-Contexts and their Management\<close>
text\<open>
\<^item> \<open>annotated_term_element\<close>
\<^rail>\<open>
(@@{command "term*"} ('[' meta_args ']')? '\<open>' HOL_term '\<close>'
| (@@{command "value*"}
| @@{command "assert*"}) \<newline> ('[' meta_args ']')? ('[' evaluator ']')? '\<open>' HOL_term '\<close>'
| @@{command "assert*"}) \<newline> ('[' evaluator ']')? ('[' meta_args ']')? '\<open>' HOL_term '\<close>'
| (@@{command "definition*"}) ('[' meta_args ']')?
('... see ref manual')
| (@@{command "lemma*"} | @@{command "theorem*"} | @@{command "corollary*"}

@@ -503,9 +503,9 @@ for example). With the exception of the @{command "term*"}-command, the term-ant
This expansion happens \<^emph>\<open>before\<close> evaluation of the term, thus permitting
executable HOL-functions to interact with meta-objects.
The @{command "assert*"}-command allows for logical statements to be checked in the global context
(see @{technical (unchecked) \<open>text-elements-expls\<close>}).
(see @{technical (unchecked) \<open>text_elements_expls\<close>}).
% TODO:
% Section reference @{docitem (unchecked) \<open>text-elements-expls\<close>} has not the right number
% Section reference @{docitem (unchecked) \<open>text_elements_expls\<close>} has not the right number
This is particularly useful to explore formal definitions wrt. their border cases.
For @{command "assert*"}, the evaluation of the term can be disabled
with the \<^boxed_theory_text>\<open>disable_assert_evaluation\<close> theory attribute:
@@ -558,7 +558,7 @@ of this meta-object. The latter leads to a failure of the entire command.
\<close>

(*<*)
declare_reference*["sec:advanced"::technical]
declare_reference*["sec_advanced"::technical]
(*>*)

subsection\<open>Status and Query Commands\<close>

@@ -586,7 +586,7 @@ text\<open>
The raw term will be available in the \<open>input_term\<close> field of \<^theory_text>\<open>print_doc_items\<close> output and,
\<^item> \<^theory_text>\<open>check_doc_global\<close> checks if all declared object references have been
defined, all monitors are in a final state, and checks the final invariant
on all objects (cf. @{technical (unchecked) \<open>sec:advanced\<close>})
on all objects (cf. @{technical (unchecked) \<open>sec_advanced\<close>})
\<close>

subsection\<open>Macros\<close>
@@ -738,7 +738,7 @@ text\<open>The command syntax follows the implicit convention to add a ``*''
to distinguish them from the (similar) standard Isabelle text-commands
which are not ontology-aware.\<close>

subsection*["text-elements"::technical]\<open>The Ontology \<^verbatim>\<open>scholarly_paper\<close>\<close>
subsection*["text_elements"::technical]\<open>The Ontology \<^verbatim>\<open>scholarly_paper\<close>\<close>
(*<*)
ML\<open>val toLaTeX = String.translate (fn c => if c = #"_" then "\\_" else String.implode[c])\<close>
ML\<open>writeln (DOF_core.print_doc_class_tree

@@ -821,9 +821,9 @@ or
\<open>text*[\<dots>::example, main_author = "Some(@{author \<open>bu\<close>})"] \<open> \<dots> \<close>\<close>}

where \<^boxed_theory_text>\<open>"''bu''"\<close> is a string presentation of the reference to the author
text element (see below in @{docitem (unchecked) \<open>text-elements-expls\<close>}).
text element (see below in @{docitem (unchecked) \<open>text_elements_expls\<close>}).
% TODO:
% Section reference @{docitem (unchecked) \<open>text-elements-expls\<close>} has not the right number
% Section reference @{docitem (unchecked) \<open>text_elements_expls\<close>} has not the right number
\<close>

text\<open>Some of these concepts were supported as command-abbreviations leading to the extension
@@ -866,7 +866,7 @@ of Isabelle is its ability to handle both, and to establish links between both w
Therefore, the formal assertion command has been integrated to capture some form of formal content.\<close>


subsubsection*["text-elements-expls"::example]\<open>Examples\<close>
subsubsection*["text_elements_expls"::example]\<open>Examples\<close>

text\<open>
While the default user interface for class definitions via the
@@ -1018,9 +1018,9 @@ schemata:



section*["sec:advanced"::technical]\<open>Advanced ODL Concepts\<close>
section*["sec_advanced"::technical]\<open>Advanced ODL Concepts\<close>

subsection*["sec:example"::technical]\<open>Example\<close>
subsection*["sec_example"::technical]\<open>Example\<close>
text\<open>We assume in this section the following local ontology:

@{boxed_theory_text [display]\<open>
@@ -1089,11 +1089,11 @@ text\<open>
\<close>

(*<*)
declare_reference*["sec:monitors"::technical]
declare_reference*["sec:low_level_inv"::technical]
declare_reference*["sec_monitors"::technical]
declare_reference*["sec_low_level_inv"::technical]
(*>*)

subsection*["sec:class_inv"::technical]\<open>ODL Class Invariants\<close>
subsection*["sec_class_inv"::technical]\<open>ODL Class Invariants\<close>

text\<open>
Ontological classes as described so far are too liberal in many situations.

@@ -1144,7 +1144,7 @@ text\<open>
Hence, the \<^boxed_theory_text>\<open>inv1\<close> invariant is checked
when the instance \<^boxed_theory_text>\<open>testinv2\<close> is defined.

Now let's add some invariants to our example in \<^technical>\<open>sec:example\<close>.
Now let's add some invariants to our example in \<^technical>\<open>sec_example\<close>.
For example, one
would like to express that any instance of a \<^boxed_theory_text>\<open>result\<close> class finally has
a non-empty property list, if its \<^boxed_theory_text>\<open>kind\<close> is \<^boxed_theory_text>\<open>proof\<close>, or that
@@ -1178,22 +1178,22 @@ text\<open>
declare[[invariants_checking_with_tactics = true]]\<close>}
There are still some limitations with this high-level syntax.
For now, the high-level syntax does not support the checking of
specific monitor behaviors (see @{technical (unchecked) "sec:monitors"}).
specific monitor behaviors (see @{technical (unchecked) "sec_monitors"}).
For example, one would like to delay a final error message till the
closing of a monitor.
For this use-case you can use low-level class invariants
(see @{technical (unchecked) "sec:low_level_inv"}).
(see @{technical (unchecked) "sec_low_level_inv"}).
Also, for now, term-antiquotations can not be used in an invariant formula.
\<close>


subsection*["sec:low_level_inv"::technical]\<open>ODL Low-level Class Invariants\<close>
subsection*["sec_low_level_inv"::technical]\<open>ODL Low-level Class Invariants\<close>

text\<open>
If one want to go over the limitations of the actual high-level syntax of the invariant,
one can define a function using SML.
A formulation, in SML, of the class-invariant \<^boxed_theory_text>\<open>has_property\<close>
in \<^technical>\<open>sec:class_inv\<close>, defined in the supposedly \<open>Low_Level_Syntax_Invariants\<close> theory
in \<^technical>\<open>sec_class_inv\<close>, defined in the supposedly \<open>Low_Level_Syntax_Invariants\<close> theory
(note the long name of the class),
is straight-forward:
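At its core, a low-level invariant is just an SML predicate over a class instance. A minimal standalone sketch of that shape, with a hypothetical record type standing in for the generated instance type (this is not the actual ISA-DOF code, which works on the generic instance representation):

```sml
(* Hypothetical instance record: a result with a kind and a property list. *)
type instance = {kind : string, property : string list}

(* Invariant: if the kind is "proof", the property list must be non-empty. *)
fun check_result_inv ({kind, property} : instance) =
  kind <> "proof" orelse not (List.null property)
```

Such a predicate would then be registered in the theory context, as the `DOF_core.make_ml_invariant` setup in the diff below illustrates.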
@@ -1222,7 +1222,7 @@ val _ = Theory.setup (DOF_core.make_ml_invariant (check_result_inv, cid_long)
\<^boxed_theory_text>\<open>oid\<close> is bound to a variable here and can therefore not be statically expanded.
\<close>

subsection*["sec:monitors"::technical]\<open>ODL Monitors\<close>
subsection*["sec_monitors"::technical]\<open>ODL Monitors\<close>
text\<open>
We call a document class with an \<open>accepts_clause\<close> a \<^emph>\<open>monitor\<close>.\<^bindex>\<open>monitor\<close> Syntactically, an
\<open>accepts_clause\<close>\<^index>\<open>accepts\_clause@\<open>accepts_clause\<close>\<close> contains a regular expression over class identifiers.
@@ -1291,18 +1291,18 @@ text\<open>
sections.
For now, the high-level syntax of invariants does not support the checking of
specific monitor behaviors like the one just described and you must use
the low-level class invariants (see @{technical "sec:low_level_inv"}).
the low-level class invariants (see @{technical "sec_low_level_inv"}).

Low-level invariants checking can be set up to be triggered
when opening a monitor, when closing a monitor, or both
by using the \<^ML>\<open>DOF_core.add_opening_ml_invariant\<close>,
\<^ML>\<open>DOF_core.add_closing_ml_invariant\<close>, or \<^ML>\<open>DOF_core.add_ml_invariant\<close> commands
respectively, to add the invariants to the theory context
(See @{technical "sec:low_level_inv"} for an example).
(See @{technical "sec_low_level_inv"} for an example).
\<close>


subsection*["sec:queries_on_instances"::technical]\<open>Queries On Instances\<close>
subsection*["sec_queries_on_instances"::technical]\<open>Queries On Instances\<close>

text\<open>
Any class definition generates term antiquotations checking a class instance or
@@ -1315,19 +1315,19 @@ text\<open>
or to get the list of the authors of the instances of \<open>introduction\<close>,
it suffices to treat this meta-data as usual:
@{theory_text [display,indent=5, margin=70] \<open>
value*\<open>map (result.property) @{result-instances}\<close>
value*\<open>map (text_section.authored_by) @{introduction-instances}\<close>
value*\<open>map (result.property) @{result_instances}\<close>
value*\<open>map (text_section.authored_by) @{introduction_instances}\<close>
\<close>}
In order to get the list of the instances of the class \<open>myresult\<close>
whose \<open>evidence\<close> is a \<open>proof\<close>, one can use the command:
@{theory_text [display,indent=5, margin=70] \<open>
value*\<open>filter (\<lambda>\<sigma>. result.evidence \<sigma> = proof) @{result-instances}\<close>
value*\<open>filter (\<lambda>\<sigma>. result.evidence \<sigma> = proof) @{result_instances}\<close>
\<close>}
The list of the instances of the class \<open>introduction\<close> whose \<open>level\<close> > 1,
can be filtered by:
@{theory_text [display,indent=5, margin=70] \<open>
value*\<open>filter (\<lambda>\<sigma>. the (text_section.level \<sigma>) > 1)
@{introduction-instances}\<close>
@{introduction_instances}\<close>
\<close>}
\<close>
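The `value*` queries in the hunk above treat instance lists as ordinary functional data. A hedged SML model of the same idea, with a hypothetical `result` record and sample data (not the actual generated HOL record types):

```sml
(* Hypothetical model of @{result_instances}: a list of instance records,
   queried with plain map/filter, as the value* commands do in HOL. *)
datatype evidence = proof | argument
type result = {evidence : evidence, property : string list}

val instances : result list =
  [{evidence = proof,    property = ["thm:main"]},
   {evidence = argument, property = []}]

(* map (result.property) @{result_instances} *)
val properties = map (fn (r : result) => #property r) instances

(* filter (fn s => result.evidence s = proof) @{result_instances} *)
val proof_results =
  List.filter (fn (r : result) => #evidence r = proof) instances
```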
@@ -1414,7 +1414,7 @@ text\<open>
\<close>


section*["document-templates"::technical]\<open>Defining Document Templates\<close>
section*["document_templates"::technical]\<open>Defining Document Templates\<close>
subsection\<open>The Core Template\<close>

text\<open>
@@ -188,7 +188,7 @@ text\<open>

section\<open>Programming Class Invariants\<close>
text\<open>
See \<^technical>\<open>sec:low_level_inv\<close>.
See \<^technical>\<open>sec_low_level_inv\<close>.
\<close>

section\<open>Implementing Monitors\<close>
@@ -203,7 +203,7 @@ text\<open>
val next : automaton -> env -> cid -> automaton\<close>}
where \<^boxed_sml>\<open>env\<close> is basically a map between internal automaton states and class-id's
(\<^boxed_sml>\<open>cid\<close>'s). An automaton is said to be \<^emph>\<open>enabled\<close> for a class-id,
iff it either occurs in its accept-set or its reject-set (see @{docitem "sec:monitors"}). During
iff it either occurs in its accept-set or its reject-set (see @{docitem "sec_monitors"}). During
top-down document validation, whenever a text-element is encountered, it is checked if a monitor
is \emph{enabled} for this class; in this case, the \<^boxed_sml>\<open>next\<close>-operation is executed. The
transformed automaton recognizing the suffix is stored in \<^boxed_sml>\<open>docobj_tab\<close> if

@@ -228,7 +228,7 @@ text\<open>
\expandafter\providekeycommand\csname isaDof.#1\endcsname}%\<close>}

The \<^LaTeX>-generator of \<^isadof> maps each \<^boxed_theory_text>\<open>doc_item\<close> to an \<^LaTeX>-environment (recall
@{docitem "text-elements"}). As generic \<^boxed_theory_text>\<open>doc_item\<close>s are derived from the text element,
@{docitem "text_elements"}). As generic \<^boxed_theory_text>\<open>doc_item\<close>s are derived from the text element,
the environment \inlineltx|isamarkuptext*| builds the core of \<^isadof>'s \<^LaTeX> implementation.
\<close>
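The monitor text above describes an `enabled`/`next` interface over class-ids. A toy SML sketch of that interface, under stated simplifications: the real `next` also takes an `env` mapping automaton states to class-ids, which is omitted here, and the `automaton` record and names are hypothetical, not the ISA-DOF implementation:

```sml
(* Toy monitor automaton over class-ids (simplified: no env parameter). *)
type cid = string
type automaton = {accept_set : cid list, reject_set : cid list, state : int}

fun member xs x = List.exists (fn y => y = x) xs

(* enabled: the class-id occurs in the accept-set or the reject-set. *)
fun enabled ({accept_set, reject_set, ...} : automaton) (c : cid) =
  member accept_set c orelse member reject_set c

(* next: step the automaton; a reject-set hit fails validation. *)
fun next ({accept_set, reject_set, state} : automaton) (c : cid) : automaton =
  if member reject_set c then raise Fail ("rejected class-id: " ^ c)
  else if member accept_set c
  then {accept_set = accept_set, reject_set = reject_set, state = state + 1}
  else {accept_set = accept_set, reject_set = reject_set, state = state}
```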